
What was lacking GPU-wise at this year's CES

I read through all the posts in this thread and much of the CES coverage and one thing comes to my mind…I’m more confused about these GPUs after CES than before CES.

That being said I highly anticipate W1zzard’s reviews!
 
At the AMD presentation they said AI 150+ times, and at the Nvidia presentation they said AI 200+ times... So I know we're getting some AI...
 
Why is Nvidia spending more time and resources trying to generate guess-frames than just rendering more actual frames? Maybe this makes sense to other people lol, I just don't get this focus.
Because Nvidia's current ("universal") tech is designed for prosumers and professional workloads: ML and AI stuff.
It is just brought down to the consumer level.

Exactly the same with Ryzen, and exactly the same with what AMD will aim for with UDNA.
It's still hard for some people to see; they think gaming is a priority for these multi-billion-dollar companies when it's actually last on their list.

To be fair, going this way is the only way if you want big profits and to keep growing.
Not with $500-2000 cards, but with $30k-40k products that other corporations will buy by the case...

Gamers can keep dreaming that the GPU world revolves around them.
Nope... the Earth goes around the Sun...
 
100%... GPUs are no longer meant for gaming, they just can be used for gaming... If you look at the 50 series, their AI improvements are 2x+ vs what will likely be modest gaming increases.

Us gamers are just at the bottom of the totem pole in 2025.
 
It still sucks... but that is literally only an issue in path tracing; it's the only thing I even feel is worth using frame gen with, although in some games it's pretty terrible. Silent Hill 2 has an awful implementation of frame gen.
Well, if you are one of the people saying "I'm already at 80-100 fps, why do I need more?", then sure, FG is pointless for you, and MFG even more so. Personally, I don't mind enabling it and going from 100 to 150+, and at that framerate latency isn't an issue.
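To put rough numbers on that (a back-of-envelope sketch, assuming 2x frame gen, an FG overhead that pulls the base render rate down to around 78 fps, and roughly one held-back frame of extra delay; the real overhead and queuing vary per game and implementation):

# Back-of-envelope 2x frame generation math from a ~100 fps starting point.
# The 78 fps base rate and the one-frame hold are assumptions, not measurements.
native_fps = 100                      # render rate with FG off
base_fps_with_fg = 78                 # assumed render rate once FG overhead is paid
displayed_fps = base_fps_with_fg * 2  # each rendered frame gets one generated frame

native_frame_time = 1000 / native_fps      # ms per frame, FG off
base_frame_time = 1000 / base_fps_with_fg  # ms per rendered frame, FG on

print(f"FG off: {native_fps} fps shown, {native_frame_time:.1f} ms per frame")
print(f"FG on : {displayed_fps} fps shown, input still paced at {base_frame_time:.1f} ms")
print(f"        plus roughly one held frame (~{base_frame_time:.1f} ms) of added delay")

So motion looks like ~150+ fps while responsiveness stays around, or a bit below, the original feel, which is why it mostly makes sense once the base framerate is already comfortable.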
 
At the AMD presentation they said AI 150+ times, and at the Nvidia presentation they said AI 200+ times... So I know we're getting some AI...
They're pushing it so hard that it sounds like the typical "say it multiple times to make sure it sticks" marketing technique rather than something actually worth talking about at this point. If AI was really that great, then we wouldn't need to hear the word so many times. It's like they're begging us on their knees to buy into it.
 
I just hope the actual quality of the generated frames, especially with UI, is substantially improved now that it's going to be 131313 vs 111111.
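For anyone wondering about the 111111 vs 131313 shorthand, here's a small sketch, assuming it means one rendered frame alternating with one generated frame (2x FG) versus one rendered frame followed by three generated frames (4x MFG):

# 'R' = rendered frame, 'G' = generated frame.
def frame_pattern(multiplier: int, rendered: int) -> str:
    # one rendered frame followed by (multiplier - 1) generated frames, repeated
    return " ".join("R" + "G" * (multiplier - 1) for _ in range(rendered))

base_fps = 60
for mult in (2, 4):
    print(f"{mult}x: {frame_pattern(mult, 3)} ...  ~{base_fps * mult} fps shown, "
          f"but the UI is only truly drawn {base_fps} times per second")

Which is presumably why the HUD is where artifacts tend to show up: with 4x, three out of every four frames on screen are reconstructions of the UI rather than real renders of it.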
 
I'm waiting for new reviews that will have new stats - how many false frames each card can generate :roll:
 
100% of what we are seeing is fake, I just want to see who can fake it better at the lowest latency possible lol... :laugh: :toast:
 
Well, if you are one of the people saying "I'm already at 80-100 fps, why do I need more?", then sure, FG is pointless for you, and MFG even more so. Personally, I don't mind enabling it and going from 100 to 150+, and at that framerate latency isn't an issue.
It's fine if you like staring at a framerate counter, because so far tech YouTubers are saying it will still feel like 100 fps since the latency won't be any lower. I personally don't care about frame gen, and IMO it's silly for brand-new cards to be marketed with frame gen and no numbers provided for native rendered performance, and for any new card to need frame gen in the first place.
Because Nvidia's current ("universal") tech is designed for prosumers and professional workloads: ML and AI stuff.
It is just brought down to the consumer level.

Exactly the same with Ryzen, and exactly the same with what AMD will aim for with UDNA.
It's still hard for some people to see; they think gaming is a priority for these multi-billion-dollar companies when it's actually last on their list.

To be fair, going this way is the only way if you want big profits and to keep growing.
Not with $500-2000 cards, but with $30k-40k products that other corporations will buy by the case...

Gamers can keep dreaming that the GPU world revolves around them.
Nope... the Earth goes around the Sun...
Unfortunately, gaming isn't going to be the focus unless an AI crash happens, or general purpose hardware can efficiently run AI without the need for $30-40k compute cards.
Although these companies are marketing AI so much and consumers don't seem to care, there has to be a point where people say enough and just won't buy overpriced gaming cards with little improvement over the previous gen.
 
It's fine if you like staring at a framerate counter, because so far tech YouTubers are saying it will still feel like 100 fps since the latency won't be any lower. I personally don't care about frame gen, and IMO it's silly for brand-new cards to be marketed with frame gen and no numbers provided for native rendered performance, and for any new card to need frame gen in the first place.
If youtubers said it, must be true. 100%.
 
It's very easy to capture what was missing on GPUs in one word:

Progress

Part of me wonders if AMD is now kicking themselves for not going big with RDNA4.

This was the gen where they presumably could have caught up to Nvidia, or at least had another RDNA2 moment where they were within striking distance of the halo card, with a couple of asterisks...

Maybe AMD is in the back trying to glue two N48s together to take the crown :banghead:
I'm still fully expecting a surprise comeback from Raja, and a month later the announcement that they'll go full-on CrossFire and we need to buy two 9070 XTs now.

I'm only half joking; given AMD's track record, they might actually consider that a selling point.
 
I just hope the actual quality of the generated frames, especially with UI, is substantially improved now that it's going to be 131313 vs 111111.

Digital Foundry talked about 141 in their new video about their hands-on experience with the 5000 series.
 
Single-fan GPU, missed!?
 
They are not spending more time. But it's the only way to saturate modern 4K 240 Hz monitors. You can't do that with hardware alone; the 4090 is down to the 20 fps range without those technologies, and no matter how huge a boost the 50 series was, it would still be lacking. Also, CPUs: they are a major bottleneck in AAA titles, and FG fixes that as well.
Fair. It just seems like they've thrown so many cores and so much power at these that they could have done better at render (including RT) without the smoke and mirrors. Thinking a little bit more critically than I was in that comment, I'd answer my own question by saying it's more valuable to them to include the hardware that improves AI performance these days (for other users), and they've just put a bunch of time into coming up with ways to make that hardware do something for gamers too.
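To illustrate that 4K 240 Hz point with some very rough placeholder numbers (the ~20 fps native figure is from the comment above; the ~3x upscaler gain and the 4x MFG multiplier are ballpark assumptions, not benchmarks):

# Rough arithmetic on the gap between native path tracing and a 240 Hz panel.
native_pt_fps = 20        # assumed native 4K path-traced frame rate
upscaler_gain = 3.0       # assumed speedup from rendering at a lower internal res
mfg_multiplier = 4        # one rendered frame plus three generated frames
target_hz = 240

rendered_fps = native_pt_fps * upscaler_gain
displayed_fps = rendered_fps * mfg_multiplier

print(f"native:       {native_pt_fps} fps")
print(f"with upscale: {rendered_fps:.0f} fps actually rendered")
print(f"with MFG:     {displayed_fps:.0f} fps displayed vs a {target_hz} Hz target")
print(f"raster/RT alone would need a ~{target_hz / native_pt_fps:.0f}x uplift to get there")

Under those assumptions, no single-generation hardware jump closes a ~12x gap on its own, which is basically the argument being made.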

Because Nvidia's current ("universal") tech is designed for prosumers and professional workloads: ML and AI stuff.
It is just brought down to the consumer level.

Exactly the same with Ryzen, and exactly the same with what AMD will aim for with UDNA.
It's still hard for some people to see; they think gaming is a priority for these multi-billion-dollar companies when it's actually last on their list.

To be fair, going this way is the only way if you want big profits and to keep growing.
Not with $500-2000 cards, but with $30k-40k products that other corporations will buy by the case...

Gamers can keep dreaming that the GPU world revolves around them.
Nope... the Earth goes around the Sun...
Yeah...it's a fact I've known for a long time, I just tend to forget the outcome of that sometimes lol. So they've put all this other hardware in and this is a way they've come up with to make it seem like a completely usable feature for gamers, even though we're no longer the biggest revenue stream.

We are definitely still a valuable revenue stream for them though, because they do still spend a lot of time marketing these, making consumer gaming versions of everything, and pricing them accordingly (and a lot of time on this software, drivers, etc.). I do think we're headed in a direction where they could decide we're not worth the effort anymore and stop putting resources into gaming. This sounds insane, because it's still a massive market, but Nvidia is getting so big and so focused on AI and supercomputers that our impact on their revenue gets less and less important to them as the years go on.
 
100% of what we are seeing is fake, I just want to see who can fake it better at the lowest latency possible lol... :laugh: :toast:
Video games are fake if you think about it. It's all just graphics on a monitor, none of it is real. :laugh:

I mean, we could also play cards or chess or tabletop games with realistic graphics, infinite FPS and no input lag, but we play video games instead. :D
 
Even if we (as gamers) are not big revenue for them, they still need the word, the hype and the fame to go around everywhere. At least for now.

After seeing how many frames the new DLSS/AI can generate, I'm going to predict that more and more new AAA games in the near future (within 1-2 years) will start to have stupidly high requirements for GPUs that lack AI tech, in order to have the full "eye candy" experience.
Path tracing is a good start.
AI tech won't be replaceable, not even by 5x SLI/CF with "simple" and humble rasterization.
So let's forget it once and for all.

It's inevitable from where I stand. And since AI debuted for good in the last couple of years, the web has already started to fill with fake generated videos.
There will be a moment in the future when you won't be able to believe anything you see online.
Wait and see.
 
Even if we (as gamers) are not big revenue for them, they still need the word, the hype and the fame to go around everywhere. At least for now.

After seeing how many frames the new DLSS/AI can generate, I'm going to predict that more and more new AAA games in the near future (within 1-2 years) will start to have stupidly high requirements for GPUs that lack AI tech, in order to have the full "eye candy" experience.
Path tracing is a good start.
AI tech won't be replaceable, not even by 5x SLI/CF with "simple" and humble rasterization.
So let's forget it once and for all.
Let's wait to see the quality. I'm guessing that the more fake frames you insert, the more fake the game will look and feel.

It's inevitable from where I stand. And since AI debuted for good in the last couple of years, the web has already started to fill with fake generated videos.
There will be a moment in the future when you won't be able to believe anything you see online.
Wait and see.
There was a test where they drew 200 or so pictures of elephants, then asked AI to do the same. Then again. Then again. The more the picture pool got saturated with AI-made images, the more errors appeared, such as two-headed or three-legged elephants. AI works well with a large amount of human-made data, but it doesn't work well with other AI-generated drivel. What it lacks is judgement and control over quality.
 
What was lacking is AMD; the rest sounds just fine.
 
Not arguing, but it's improving day by day. It is inevitable, IMO, that one day it will come to what I said.

Like autonomous driving for vehicles. Today it's miles ahead of what it used to be when first introduced, and it keeps getting better and better, dropping the chance of collision compared with actual human driving.

We can deny it all we want, but the day when AI will match and surpass humans in certain tasks, and later in most of them, is coming.

I bet that 20-30 years from now, if not sooner, human driving will be illegal.

A bit off topic, sorry.
 
Not arguing, but it's improving day by day. It is inevitable, IMO, that one day it will come to what I said.
Based on my comment above, I have my doubts.

Like autonomous driving for vehicles. Today it's miles ahead of what it used to be when first introduced, and it keeps getting better and better, dropping the chance of collision compared with actual human driving.
It drops the fun factor, too.

We can deny it all we want, but the day when AI will match and surpass humans in certain tasks, and later in most of them, is coming.
And what will humans do? Lie on the sofa, watch TV and get fat? Or work for AI-run companies? Quite unrealistic images, both of them.

Don't get me wrong, if AI could take over my job so I could get paid for nothing, I'm all in. But 1. I don't see that happening anytime soon, and 2. creative jobs, like art, should always be in human hands (I don't work in art, unfortunately).

I bet that 20-30 years from now, if not sooner, human driving will be illegal.
Machines break. Relying 100% on a machine is just as idiotic as Tesla's fully electric door handle.
 
Good watch about the COD benchmark and IGN results.

If IGN was accurate, the 9070 would be almost 50% faster than the 7900 XT.
 
Huh? They positioned it just below the XTX, which isn't 50% faster than the XT.
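Quick sanity check on that positioning argument, using a placeholder ~18% XTX-over-XT gap and a ~3% "just below" margin (the real gap varies by game and resolution):

# If the 9070 lands "just below the XTX", it cannot also be +50% over the XT.
# The 18% XTX lead and the 3% margin are placeholder assumptions, not measurements.
xt = 100.0                   # 7900 XT as the baseline
xtx = xt * 1.18              # assumed XTX lead over the XT
just_below_xtx = xtx * 0.97  # "positioned just below the XTX"

print(f"implied 9070 vs 7900 XT: +{(just_below_xtx / xt - 1) * 100:.0f}%")
print(f"+50% over the XT would be {xt * 1.5:.0f}, far above the XTX at {xtx:.0f}")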
 
He ran both the 7900 XT and 7900 XTX at actual extreme settings in that video; they were significantly lower than the IGN benchmark.
 
Ah, fairs. We'd better wait for proper benchmarks, I guess.
 