
What was lacking GPU-wise at this year's CES

All three GPU manufacturers left out major information during CES this year:

Intel - released no GPU-related news or products whatsoever
AMD - strangely announced RDNA4 with no pricing, specs, or benchmarks (just a vague graphic with boxes showing relative performance)
Nvidia - released no rasterization benchmarks (meaning without AI, RT, and DLSS) and generally avoided a straight comparison with previous models and the competition. AI TOPS was the top-line spec.

It's almost like none of these companies really want to be here in the discrete consumer GPU space.

PS. This is my first forum post so I hope it follows the rules and is in the right channel.
 
AMD wanted to see Nvidia's pricing first? Intel probably doesn't have much else to push this gen. Can you actually find a $250 B580?

As for Nvidia... it's still on a 5nm-class node, and outside the 5090, the CUDA core count between 40-series models and their 50-series replacements isn't hugely different. Power draw seems up. There is the move from G6X to G7 memory. It seems likely traditional rendering improvements gen over gen are going to be minimal, while the largest improvements will be in RT/AI/DLSS/Frame Gen, so that's what they are pushing to show it in the best light.
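For context on the G6X-to-G7 jump, here's a back-of-the-envelope bandwidth calculation. The 23 Gbps / 30 Gbps data rates and the 256-bit bus below are illustrative figures for a 4080 Super-class card versus a 5080-class card, not confirmed specs:

```python
# GDDR bandwidth rule of thumb: GB/s = (bus width in bits / 8) * data rate in Gbps
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

# Illustrative: 256-bit GDDR6X at 23 Gbps vs 256-bit GDDR7 at 30 Gbps
g6x = mem_bandwidth_gbs(256, 23)  # 736.0 GB/s
g7 = mem_bandwidth_gbs(256, 30)   # 960.0 GB/s
print(f"G6X: {g6x:.0f} GB/s, G7: {g7:.0f} GB/s, uplift: {(g7 / g6x - 1) * 100:.1f}%")
# G6X: 736 GB/s, G7: 960 GB/s, uplift: 30.4%
```

So even with near-identical shader counts, the memory subsystem alone could move the needle a fair bit.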

EDIT: Looking at simple CUDA core improvements (a quick script to reproduce these percentages follows the list):

4090 (16,384) -> 5090 (21,760) = 32.81% increase
4080 Super (10,240) -> 5080 (10,752) = 5% increase
4070 Ti Super (8,448) -> 5070 Ti (8,960) = 6.06% increase
4070 Super (7,168) -> 5070 (6,144) = 14.29% decrease
4070 (5,888) -> 5070 (6,144) = 4.35% increase
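For anyone who wants to check or extend these numbers, here's a small Python sketch that reproduces the percentages above from the core counts:

```python
# (old model, old CUDA cores, new model, new CUDA cores) from the list above
pairs = [
    ("4090", 16384, "5090", 21760),
    ("4080 Super", 10240, "5080", 10752),
    ("4070 Ti Super", 8448, "5070 Ti", 8960),
    ("4070 Super", 7168, "5070", 6144),
    ("4070", 5888, "5070", 6144),
]

for old_name, old, new_name, new in pairs:
    change = (new - old) / old * 100  # signed percent change
    word = "increase" if change >= 0 else "decrease"
    print(f"{old_name} ({old:,}) -> {new_name} ({new:,}) = {abs(change):.2f}% {word}")
```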
 
CES is more pep rally than a show for consumer information. Nvidia is not going to release rasterization benchmarks unless they truly blow the RTX 4XXX series out of the water, and even if they did, it would stop consumers from buying the RTX 4XXX series right now.

Nvidia, AMD, and everyone else are simply setting the message for the products so the media can regurgitate that exact information to the fanboys, who will go all over Twitter, YouTube, TikTok, Discord, Reddit, message boards, etc., about how their brand is killer leet while the other brands all suck. How is this new to anyone?
 

Don’t forget that the 50-series pricing foreshadows very minimal improvements gen over gen, with the exception being their 90-series card yet again. Going to be near-zero value improvement.
 
And the highest-end card improved, but it's also 25% more expensive and uses 25% more power, and it's probably about 30-35% faster when not using 3x-4x fake frames, so it's really not a huge uplift in the grand scheme of things.

In comparison, the 4090 was 40-60% faster than the 3090 Ti, cheaper, and at the same power.
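Putting rough numbers on that, using the figures quoted above (the ~25% price/power increases and the 30-35% raster estimate are ballpark assumptions, not benchmarks):

```python
# Ballpark 4090 -> 5090 value math; all inputs are the rough figures from the post above
price_increase = 0.25   # ~25% higher MSRP
power_increase = 0.25   # ~25% higher board power
perf_uplift = 0.325     # midpoint of the 30-35% non-frame-gen estimate

perf_per_dollar = (1 + perf_uplift) / (1 + price_increase) - 1
perf_per_watt = (1 + perf_uplift) / (1 + power_increase) - 1
print(f"perf/$ change: {perf_per_dollar:+.1%}")  # roughly +6%
print(f"perf/W change: {perf_per_watt:+.1%}")    # roughly +6%
```

In other words, if those estimates hold, you pay and burn almost exactly as much more as you get back.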
 

Yep, very disappointing CES from all camps.
 
Nvidia, AMD, and everyone else are simply setting the message for the products so the media can regurgitate that exact information to the fanboys...
Exactly. It wasn't a hard sell after owning a 3090 (eBay, scalped), and it will be even easier to sell me a 5090 after owning the 4090.
Call me a fanboy, or call me an experienced user.
 
Don’t forget that the 50-series pricing foreshadows very minimal improvements gen over gen, with the exception being their 90-series card yet again. Going to be near-zero value improvement.

In lockstep with their brethren, the Core Ultra.
 
It's almost like none of these companies really want to be here in the discrete consumer GPU space.

Nvidia is making bank on AI/ML; zero interest in the consumer space other than retaining and growing "mindshare".

AMD is trying to break into the (enterprise) share Nvidia left over (MI300, etc.).
[Note: AMD plans to re-integrate its 'compute' and 'graphics' uArchs after RDNA4, putting things back to the way they were before the 5700 XT/RDNA 1.0.]

Intel is focusing on the budget/high-value consumer segment and specialty enterprise segments.
 
Yeah, the Intel B780 isn't coming yet, and the B580 is great but a year late to market, so it's going to become irrelevant as the RX 7600 and RTX 4060 fall off the bottom of the GPU hierarchy and get deep discounts below $250 as their supply dries up to make way for the RX 9060 and RTX 5060.

As for Nvidia, the only claims are "twice as fast" across several 40-series to 50-series direct model comparisons, but only in games that support MFG (Multi Frame Generation). I understand that as Nvidia just doubling the number of fake frames to make the "FPS" score seem higher, but it's all smoke and mirrors because the underlying hardware hasn't really had any improvements. They're just tweaking frame gen to falsely inflate the size of their bars in the FPS charts even more than last gen.
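To illustrate the chart effect, here's a minimal sketch of how generated frames inflate a displayed FPS figure; the 2x/4x multipliers mirror frame gen versus MFG as marketed, and the 30 FPS base rate is a made-up example:

```python
# Displayed FPS under frame generation: real frames times frames shown per real frame
def displayed_fps(rendered_fps: float, frames_per_rendered: int) -> float:
    # frames_per_rendered: 2 for classic frame gen (1 real + 1 generated),
    # 4 for multi frame generation (1 real + 3 generated)
    return rendered_fps * frames_per_rendered

base = 30.0  # hypothetical real rendered FPS
print(displayed_fps(base, 2))  # 60.0 "FPS"
print(displayed_fps(base, 4))  # 120.0 "FPS", still only 30 real frames per second
```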
 
Holy crap. I didn't actually run the CUDA percent increase numbers. Other than the 5090, there might not be any rasterization difference down the stack. This is a huge margin win for Nvidia as they are leaving the hardware mostly untouched and really only changing the software.

My biggest disappointment was AMD. Their CPU/APU launches at CES were excellent, but they should have just left out RDNA4 instead of whatever the hell that was.

Intel's new co-CEOs already said they are going to slow down, so I wasn't expecting anything after their December Battlemage launch, but it still would have been nice to at least have a Battlemage recap and a plug for the upcoming B570.
 
I don't see the problem. Ballers get a nice 5090. Poor people can buy a 5080.

It's all great for everyone. The entire market is met.
 
AMD got wind of the 5070 launch price and couldn't get their own slides updated in time :peace:

I kid.

The whole thing kinda sucked all around 'cause there wasn't really any major hype or performance improvement. If you played the AI drinking game, your liver is shot and you have 30 minutes to live.

It feels more and more like Pascal might have been the last "The Undertaker threw Mankind off Hell In A Cell, and plummeted 16 ft through an announcer's table." moment we're going to see in the dGPU space.
 
The 5090 looks great, MSI have decided that Intel and Nvidia are their future, and AMD is either fumbling or sandbagging in the dGPU space. Too bad Acer lost the plot and forgot they were the value option, as their handhelds are too expensive. I did like the Strix laptop with just an APU inside, and it looks like the B650E-E Strix is a great replacement for the B550-XE. Now we have way more choice for AM5 too, but this was the first CES in a while where I watched no keynotes.

I don't see the problem. Ballers get a nice 5090. Poor people can buy a 5080.
$1100 US is for poor people?
 
$1100 US is for poor people?

- It's a banana, Michael. How much could it cost, $10?!
 
As I see it:

Intel: they praise our card; the less we say about it the better, or they'll start talking about driver overhead and crap
AMD: just wait to see what Nvidia does and take $50 off whatever card seems similar. We have higher numbers and X's in the name, so all should be good
Nvidia: whatever, they will buy all our cards anyway; just talk about the jacket instead
 
There was good information if you looked hard enough.

At least on the Nvidia side, a lot of the core DLSS technologies are improving all the way back to the 20 series, which was nice to see.

Other than that, Nvidia wants to shove frame gen down our throats, and AMD was being shy with their GPUs...

I would say actual apples-to-apples benchmarks were lacking, but even had they shown them, I would have taken them with a huge grain of salt. Even last generation, the 40 series delivered mostly decent uplifts, but Nvidia still claimed 2x-3x BS with frame generation, and AMD didn't even come close to matching its 7900 XTX performance claims. So regardless, we need actual reviewers testing these new cards.
 

No one is happy with the fake frames, and latency is getting worse from what I've seen from DF (Digital Foundry).
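A rough sketch of why the latency complaint follows: in a simplified model, responsiveness tracks the real render rate, and interpolation has to hold back at least one real frame before it can display anything. Real pipelines are more complicated than this, so treat it as an illustration only:

```python
# Simplified input-latency model: response time is tied to REAL frames, not displayed ones
def input_latency_ms(rendered_fps: float, held_real_frames: int = 1) -> float:
    frame_time = 1000.0 / rendered_fps
    # interpolating frame gen must buffer ~1 real frame before showing results
    return frame_time * (1 + held_real_frames)

print(input_latency_ms(30))   # ~66.7 ms at 30 real FPS, even if the screen shows 120 "FPS"
print(input_latency_ms(120))  # ~16.7 ms if you actually rendered 120 real frames
```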
 
Why is Nvidia spending more time and resources trying to generate guess-frames than just rendering more actual frames? Maybe this makes sense to other people, lol; I just don't get this focus.
 

I don't even dislike it, and in some games it's OK, but it's troubling that Nvidia fanboys think it's actual extra performance... There is no hope for humanity, I guess...
 
Why is Nvidia spending more time and resources trying to generate guess-frames than just rendering more actual frames? Maybe this makes sense to other people, lol; I just don't get this focus.
To differentiate themselves from the competition; it's something they provide that the others (AMD and Intel) don't do as well. That said, I agree. I'm paying for native rasterization performance, and Nvidia can paint themselves into a corner by constantly asking for a premium on something many people don't seem to care for.
 
While I think those of us in the 1% who actually understand what the technology is doing may or may not care about it, the average buyer (probably 80% of the market, if not more) is going to see these crazy frame generation numbers and think it's actual performance.
 
Part of me wonders if AMD is now kicking themselves for not going big with RDNA4.

This was the gen where they presumably could have caught up to Nvidia, or at least had another RDNA2 moment where they were within striking distance of the halo card with a couple of asterisks...

Maybe AMD is in the back trying to glue two N48's together to take the crown :banghead:
 
While I think those of us in the 1% who actually understand what the technology is doing may or may not care about it, the average buyer (probably 80% of the market, if not more) is going to see these crazy frame generation numbers and think it's actual performance.
I agree with this entirely. The only game I've even tried it on is CP2077 on a 4090, and it is not a pleasant experience for me. It also doesn't double the FPS or even give a huge uplift for me (it never felt like it works the way they describe), and I'm not sure if just putting more estimated information in the middle is really going to make it a better experience.

Edit: I also just look at this whole lineup as pretty small improvements (if any) in raster. There may be a decent RT uplift (which I do use and really enjoy), so that's good, but most of that comes with a pretty hefty power increase as well. So with the 5090, for example, you might get a ~30% uplift for 25% more money and 28% more power.
 