
NVIDIA 2025 International CES Keynote: Liveblog

Ah so you arguing your point to death is fine, but me replying is boring. Sure, then, have it your way, everything Nvidia does is for the consumer, frame generation is amazing and lying is not a bad thing if you make money on the hype generated by it. Better?
Don't put words in my mouth, I know you're better than that.
 
Again, I’m comparing features that were announced in the Nvidia keynote. Just because you don’t like that AMD doesn’t have them doesn’t mean the differences shouldn’t be pointed out.
No, I don't like the features, and I don't like that they're being used to compare performance to last gen that doesn't have them.

Of course people review with FG. I hate to tell you but it’s a fundamental gaming graphics technology now. UE5 is literally built around it and image reconstruction. Sorry if your team is behind, but those are the facts.
Nobody reviews with FG on. Period. And I don't have a team - I have had lots of AMD and Nvidia (and even ATi) cards through the years and loved them all. I still have way more Nvidia and Intel hardware than AMD. This is not a battlefield.
 
So DLSS4 means that there will be new DLLs for Super Res, Ray Reconstruction, Reflex 2.0 and Frame Generation to improve visuals and performance for previous RTX GPUs.

Meanwhile, RTX 5000 takes it up a notch with Multi Frame Generation.

Still can't find info about games that will feature Neural rendering, though.
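Since those updates ship as drop-in DLLs, here's a minimal sketch of how you could back up and swap them into a game's install folder. The DLL names and paths are my assumptions for illustration (the keynote didn't spell them out), so check against your own game before trying it:

import shutil
from pathlib import Path

# Assumed DLSS-family DLL names (Super Res, Frame Gen, Ray Reconstruction);
# verify the exact filenames shipped with your game.
DLSS_DLLS = ["nvngx_dlss.dll", "nvngx_dlssg.dll", "nvngx_dlssd.dll"]

def swap_dlss_dlls(game_dir: str, new_dll_dir: str) -> None:
    """Back up existing DLSS DLLs in game_dir, then copy in newer versions."""
    game, source = Path(game_dir), Path(new_dll_dir)
    for name in DLSS_DLLS:
        target, replacement = game / name, source / name
        if not replacement.exists():
            continue  # not every game uses every DLL
        if target.exists():
            shutil.copy2(target, target.with_name(target.name + ".bak"))  # keep a backup
        shutil.copy2(replacement, target)

# Hypothetical usage:
# swap_dlss_dlls(r"C:\Games\SomeGame", r"C:\Downloads\dlss4_dlls")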

Neural rendering, I think, is a long way out. Here's Microsoft's post about it.

 
Don't put words in my mouth, I know you're better than that.
I'm not doing that. I'm just saying "your argument is getting boring" is not a very elegant way to close a discussion.
 
Again, I’m comparing features that were announced in the Nvidia keynote. Just because you don’t like that AMD doesn’t have them doesn’t mean the differences shouldn’t be pointed out.

Of course people review with FG. I hate to tell you but it’s a fundamental gaming graphics technology now. UE5 is literally built around it and image reconstruction. Sorry if your team is behind, but those are the facts.

Regardless of how anyone feels about frame generation, that is all Nvidia showed for the most part. Throwing in a couple of apples-to-apples comparisons, to show what we are actually getting generation to generation with actual natively rendered frames, wouldn't have been the end of the world...

I hate using frame generation; its flaws are way too obvious on a monitor for me. If it's massively improved, awesome, but until native 240 fps has the same latency as 240 fps with frame generation and no artifacts, benchmarks should only show natively rendered frames.
 
No, I don't like the features, and I don't like that they're being used to compare performance to last gen that doesn't have them.


Nobody reviews with FG on. Period. And I don't have a team - I have had lots of AMD and Nvidia (and even ATi) cards through the years and loved them all. I still have way more Nvidia and Intel hardware than AMD. This is not a battlefield.

Digital Foundry for one.

There goes your nobody argument.

Regardless of how anyone feels about frame generation, that is all Nvidia showed for the most part. Throwing in a couple of apples-to-apples comparisons, to show what we are actually getting generation to generation with actual natively rendered frames, wouldn't have been the end of the world...

I hate using frame generation; its flaws are way too obvious on a monitor for me. If it's massively improved, awesome, but until native 240 fps has the same latency as 240 fps with frame generation and no artifacts, benchmarks should only show natively rendered frames.

I’ve rarely seen a flaw with FG. I guess I’m too busy playing games to try to peep a flickering pixel somewhere.
 
Digital Foundry for one.

There goes your nobody argument.
Ah one. Perfect!

I'd still argue that data is irrelevant because it's not valid for comparison.
 
Digital Foundry for one.

There goes your nobody argument.



I’ve rarely seen a flaw with FG. I guess I’m too busy playing games to try to peep a flickering pixel somewhere.

They weren't too bad on my LG G2, but on my Samsung G8 ultrawide, even with a high-ish base frame rate, they are easy to see in every game... Maybe as I get older and my eyesight worsens, they will be less noticeable...
 
I’ve rarely seen a flaw with FG. I guess I’m too busy playing games to try to peep a flickering pixel somewhere.
Have you tried using it on a game that ran at 20 FPS without it?
 
Have you tried using it on a game that ran at 20 FPS without it?

To be fair, both AMD and Nvidia recommend a 60 fps+ base, and the Cyberpunk results they showed were at 71 fps with DLSS SR, so right around where you want to be at a minimum.
 
Ah one. Perfect!

I'd still argue that data is irrelevant because it's not valid for comparison.

Lol, I got whiplash from trying to watch how fast those goal posts moved.

Anyway, meetings in the morning. Good night.
 
To be fair, both AMD and Nvidia recommend a 60 fps+ base, and the Cyberpunk results they showed were at 71 fps with DLSS SR, so right around where you want to be at a minimum.
That's why I'm saying it's pointless. 60 FPS is pretty smooth in my book; I don't need to make 100 out of it.

Lol, I got whiplash from trying to watch how fast those goal posts moved.

Anyway, meetings in the morning. Good night.
My goal post never moved. Frame generation is useless, and frame generation enabled data is not valid for comparison. That's what I've been saying all along.
 
They weren't too bad on my LG G2, but on my Samsung G8 ultrawide, even with a high-ish base frame rate, they are easy to see in every game... Maybe as I get older and my eyesight worsens, they will be less noticeable...

I game on LG Ultragear. 2019 and 2022 vintage I think.
 
That's why I'm saying it's pointless. 60 FPS is pretty smooth in my book; I don't need to make 100 out of it.

I hear where you're coming from, and we agree Nvidia needs to show the FG data alongside the native data, but it's a feature people seem to like, and if Nvidia sells a ton because of it, it is what it is...

I game on LG Ultragear. 2019 and 2022 vintage I think.

Don't get me wrong, man, I like the idea of frame generation and want to see it get better and better, but I just would have loved to see Cyberpunk path traced at DLSS Quality vs. DLSS Quality at 4K, or Alan Wake, or even Indiana Jones, with the FG numbers on the side showing its improvements.
 
That's why I'm saying it's pointless. 60 FPS is pretty smooth in my book; I don't need to make 100 out of it.


My goal post never moved. Frame generation is useless, and frame generation enabled data is not valid for comparison. That's what I've been saying all along.
Hmm, I mean, that's what makes FG appealing: you can squeeze more performance out of it at the expense of image quality (and latency). Even though I am a strict native-rendering-only user, I can see why these image-enhancing and upscaling (and downscaling) techniques are appealing to gamers.
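To put rough numbers on that trade-off, here's a back-of-the-envelope sketch (the formula is simplified and the figures are just illustrations; real frame pacing, buffering and Reflex change the details): frame generation multiplies the presented frame rate, but input is still sampled on rendered frames, so responsiveness tracks the base frame time.

def fg_estimate(base_fps: float, generated_per_rendered: int) -> tuple[float, float]:
    """Return (presented fps, rough lower bound on latency in ms)."""
    presented_fps = base_fps * (1 + generated_per_rendered)
    # Input is only sampled on rendered frames, so latency can't drop much
    # below one base frame time (in practice it's higher due to buffering).
    min_latency_ms = 1000.0 / base_fps
    return presented_fps, round(min_latency_ms, 1)

print(fg_estimate(71, 3))  # ~71 fps base with 3 generated frames -> ~284 fps shown, still ~14 ms+ per input
print(fg_estimate(20, 3))  # 20 fps base -> ~80 fps shown, but still ~50 ms+ between sampled inputs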
 
I hear where you're coming from, and we agree Nvidia needs to show the FG data alongside the native data, but it's a feature people seem to like, and if Nvidia sells a ton because of it, it is what it is...
I don't mind if it sells because people love it. I just wanted to see an apples-to-apples comparison with the 40 series to make it fair.

Are you trying to convince me or yourself?
I don't need to convince myself. I've seen it at work, and it was either pointless or crap.
 
I don't mind if it sells because people love it. I just wanted to see an apples-to-apples comparison with the 40 series to make it fair.
Same

Hmm, I mean, that's what makes FG appealing: you can squeeze more performance out of it at the expense of image quality (and latency). Even though I am a strict native-rendering-only user, I can see why these image-enhancing and upscaling (and downscaling) techniques are appealing to gamers.

Back when I was gaming at 4K on a 65-inch G2 from about 10 feet away with a controller in single-player games, I thought it was pretty great, honestly, and if I still gamed like that, I would use it. But since swapping to a Samsung G8 UW OLED, I only want to use DLAA... It kinda depends on how you game.
 
Forget the 5090, I really want that Digits thingie now


If it goes for $2k or less I'll insta buy one (even though I doubt it goes for that cheap)

$3k, and there go my hopes and dreams.
Honestly it's not a bad price given that it's cheaper than an equivalent Apple product, but it's still too much for what would basically be a toy for me.
 

If I'm interpreting that correctly, a new drop-in DLL will enable the new "Transformer" model, giving considerable IQ improvements. Neat that they also lowered the memory footprint of FG and made it give a bigger uplift too.
 
The 5070 is equal to the 4090, but the DLSS4 chart shows the 5090 at native 4K in Alan Wake to be around 30 FPS, which is exactly what the 4090 is doing. So I guess the 5070 is equal to the 5090? lol
 
This feels exactly like the 3000 series release after the lackluster 2000 series.
Making up for the last generation's (2000) terrible price points.
Splashing on another DLSS and some more AI bullcrap.
It's almost exactly like that one guy said:

2000: bad prices
3000: good prices
4000: bad prices
5000: good prices

The game for Nvidia is faking the supply limitation to drive prices to insanity.
 
This is a small tweak to the 40 series with GDDR7 and lower-accuracy DLSS for 2x output. The 3080 was at least 50% faster gen on gen; not accounting for FG, the 5080 is only 20% faster.
That's another 25% missing. The 5090 would need to shrink to N3 and become the 5080 for that to hold true.
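A quick sanity check on that "missing 25%", taking the 50% and 20% figures in the post at face value:

# If a 3080-style generational jump is ~1.5x and the 5080's raw (no-FG)
# gain is ~1.2x, the extra uplift still needed is the ratio of the two.
target_uplift = 1.50
raw_uplift = 1.20
missing = target_uplift / raw_uplift - 1
print(f"extra uplift needed: {missing:.0%}")  # ~25%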
 
This feels exactly like the 3000 series release after the lackluster 2000 series.
Making up for the last generation's (2000) terrible price points.
Splashing on another DLSS and some more AI bullcrap.
It's almost exactly like that one guy said:

2000: bad prices
3000: good prices
4000: bad prices
5000: good prices

The game for Nvidia is faking the supply limitation to drive prices to insanity.

Until we actually see the gen-on-gen performance gains in a wide variety of games, it's hard to know how good the prices are. It's looking like 20-30% across the board, but we only have one benchmark to actually compare.
 
It's more of a DLSS4 announcement than an RTX 50 announcement. Looking at the official specs of the 5080, for example, the only major upgrade is GDDR7 and the bandwidth it brings.
So we are stagnant on shader performance (minor upgrades), only working on the RT cores and, more importantly, the Tensor cores.
At least they didn't price it at $1,500. So the pricing strategy works: ask $1,200 for the 4080, then lower the 5080 to $999, and suddenly you get praise that the prices are good this time around.
We will see if the prices will really be $999; the FEs will probably go out of stock in 5 minutes, and then it's a rodeo for AIBs and sellers to price them at whatever they want.
 