
NVIDIA 2025 International CES Keynote: Liveblog

Goodness Gracious Me

I like reading people's comments and then playing catch-up with each new comment via TPU notifications. But 11 pages already!! F-that!

50-series:

They've got us by the balls again! It's the same old game - toss out rumors and leaks to make us think prices are gonna be through the roof, so when launch day hits and it's still expensive, but not as expensive as our rumour-fuelled predictions, we're somehow sitting there like "Oh thank goodness, what a bargain!". Classic hustle.

Not gonna lie, I'm kinda relieved though. Definitely ready to upgrade and finally break free from my current GPU bottleneck. A $750 5070 Ti might just do the job. If the 5080 actually delivers 4090-level or better performance, it's definitely up for consideration. The least I'm expecting, without FG/DLSS, at 1440p, is a 50% increase in performance over my 3080 - if the 5070 Ti can do that, it's a BUY.

Come on Whizzy, jump over them NDAs and drop them reviews already!! :clap:
 
Yeah, I don't understand how people do math either. Ever since we knew the 5080 was half the 5090, it was a given it would never be faster than the 4090. It's also the best outcome for Nvidia, because now they just position the 4090 between the new line-up, and it sits there just fine. Effectively, nothing happened between Ada and Blackwell if you think about it. The 4090 is what, $1,499? It slots perfectly in the middle there. The perf/$ metric has moved exactly zero that way. You're just paying for the extra performance with a higher power target = power consumption. It's not on Nvidia's bill at all. It is complete stagnation. But hey, here's DLSS4! hahaha And look at my leather jacket.

And here we have people saying 'muh, good prices'. :roll::roll: what the fck

The 5080 is also a big nothing burger if you know the 4080 Super exists. Same shader count. Same VRAM, just slightly faster. Similar price. Every last bit of extra perf is probably paid for by having to buy a new PSU, as this fantastic x80 is the first one to consume power like a flagship card on an OC.
5070 = 4090 bro, what are you talking about

/s
 
FG doesn't necessarily double your framerate, therefore I assume the new FG isn't going to quadruple it either, so whatever comparison you are trying to make from that screenshot is wrong.
Yes, it does double the framerate - that's the whole point: it inserts a frame between every two rendered ones, thus doubling it. The scaling isn't 100% because there is a cost per interpolated frame.

I think we're entering the realm of some serious coping right here. 2X FG had ~90-95% scaling, and clearly 4X scales similarly on top of 2X, perhaps ever so slightly worse, 85-90%. The 5080 with no FG is barely any faster than a 4080 Super in this game; the writing is on the wall.
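To put rough numbers on that "cost per interpolated frame" point, here's a toy calculation. The per-frame times are made-up placeholders, not measurements of DLSS FG; the point is just that a fixed interpolation cost keeps the scaling under a clean 2x/4x.

```python
# Toy model: why FG scaling lands below a clean 2x/4x.
# render_ms and interp_ms are illustrative placeholders, not measured values.

def fg_scaling(render_ms: float, interp_ms: float, factor: int) -> float:
    """Displayed-FPS multiplier for an FG factor (2x, 3x, 4x...)."""
    native_fps = 1000.0 / render_ms
    # Each real frame is followed by (factor - 1) interpolated frames,
    # and every interpolated frame adds its own fixed cost.
    group_ms = render_ms + (factor - 1) * interp_ms
    displayed_fps = factor * 1000.0 / group_ms
    return displayed_fps / native_fps

# Example: 10 ms per rendered frame (100 fps native), 0.5 ms per generated frame.
print(fg_scaling(10.0, 0.5, 2))  # ~1.90x instead of 2.00x
print(fg_scaling(10.0, 0.5, 4))  # ~3.48x instead of 4.00x
```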
 
Yes, it does double the framerate - that's the whole point: it inserts a frame between every two rendered ones, thus doubling it. The scaling isn't 100% because there is a cost per interpolated frame.

I think we're entering the realm of some serious coping right here. 2X FG had ~90-95% scaling, and clearly 4X scales similarly on top of 2X, perhaps ever so slightly worse, 85-90%. The 5080 with no FG is barely any faster than a 4080 Super in this game; the writing is on the wall.
Clearly you haven't used it. It doesn't always translate to 90-95% scaling. Especially at 4K I'm usually seeing 40-70% scaling, e.g. Ghost of Tsushima, which was the most recent game I played with FG.

What would I be coping about? I'm just explaining to you how the thing works. Whatever
 
Here's a more general question. Is there a point of GPU price inflation at which console makers start reexamining going outside, or developing their own custom graphics? They don't have to account for backwards compatibility after all, and that's a big barrier to entry in discrete graphics that they wouldn't have to deal with. Or are the designs simply too complex for such a project to have any hope of succeeding?

I know it sounds like craziness, but Apple isn't doing all that bad with their Imagination Technologies-derived graphics silicon, after all.
 
Here's a more general question. Is there a point of GPU price inflation at which console makers start reexamining going outside, or developing their own custom graphics? They don't have to account for backwards compatibility after all, and that's a big barrier to entry in discrete graphics that they wouldn't have to deal with. Or are the designs simply too complex for such a project to have any hope of succeeding?

I know it sounds like craziness, but Apple isn't doing all that bad with their Imagination Technologies-derived graphics silicon, after all.
It's us average Joes who go to local Micro Centers, Best Buys, Amazons, eBays, AliExpresses and the sort and pay $20 for a GPU, $30 for the cooling, $600 for NV tax, $50 for ASUS tax and $100 in other taxes, so that a die that cost $20 for NVIDIA to print ends up an 8-hunnit retail price GPU. Console makers buy chips in bulk on much merrier terms. There is no way they start inventing their own GPUs because, first off, reaching RTX 2000 series levels of performance is already extremely problematic if not impossible for a complete dGPU market newbie, and the chips are coming in cheap anyway.

Apple are way more experienced in this regard than any other "non-GPU" player out here.

On topic: I expected crystal clear vast nothingness from this Blackwell generation but it slightly proved me wrong as prices aren't THAT insane. I assume 5070 Ti will make short work of 4080 in virtually every scenario. 15% IPC gains will be enough for this GPU to become my likely purchase as it'll crawl dangerously close to 8 times the RT performance I have now. And since my pure raster needs are basically covered by whatever GPU beefier than 3080 it's totally a hmmmmmmmm.
 
It's us average Joes who go to local Micro Centers, Best Buys, Amazons, eBays, AliExpresses and the sort and pay $20 for a GPU, $30 for the cooling, $600 for NV tax, $50 for ASUS tax and $100 in other taxes, so that a die that cost $20 for NVIDIA to print ends up an 8-hunnit retail price GPU. Console makers buy chips in bulk on much merrier terms. There is no way they start inventing their own GPUs because, first off, reaching RTX 2000 series levels of performance is already extremely problematic if not impossible for a complete dGPU market newbie, and the chips are coming in cheap anyway.

Apple are way more experienced in this regard than any other "non-GPU" player out here.

On topic: I expected crystal clear vast nothingness from this Blackwell generation but it slightly proved me wrong as prices aren't THAT insane. I assume 5070 Ti will make short work of 4080 in virtually every scenario. 15% IPC gains will be enough for this GPU to become my likely purchase as it'll crawl dangerously close to 8 times the RT performance I have now. And since my pure raster needs are basically covered by whatever GPU beefier than 3080 it's totally a hmmmmmmmm.
Short work? I wouldn't bet on that, I think the 5070 Ti will be close to the 4080
 
Short work? I wouldn't bet on that, I think the 5070 Ti will be close to the 4080
70 vs 76 SM (slight disadvantage) but higher clocks, possibly higher IPC and significantly higher VRAM bandwidth at a lower price and lower TGP will mean it's an overall better GPU. Of course it's very subtle and sometimes you'll need a microscope to see a performance difference, but all in all, it's more interesting. Especially considering overbuilt coolers, my 1 kW PSU and an absolute crap ton of cold days per year. If it's possible to OC beyond 3200 MHz on air, then it's awesome. Expensive but awesome.
 
Clearly you haven't used it. It doesn't always translate to 90-95% scaling. Especially at 4K I'm usually seeing 40-70% scaling, e.g. Ghost of Tsushima, which was the most recent game I played with FG.

What would I be coping about? I'm just explaining to you how the thing works. Whatever
I think the problem is most of you just don't know math, that's why everybody is so mystified. In reality we have all the information we need.

[attached screenshot]


This is from the same video. 580% to 1000% is a ~72% increase - that's the scaling from 2X to 4X. From the previous screenshot I posted, in order to reach a final percentage of 185% on 4X, the starting value must be around ~108%.

This puts the 5080 at a meager ~8-10% faster with 2X FG vs the 4080 Super.
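For anyone who wants to check that arithmetic, here it is spelled out. This assumes, as above, that the 185% figure is the 5080 with 4X FG relative to the 4080 Super; the inputs are just the values read off the slides.

```python
# Back-calculating the 5080's 2X-FG position from the slide numbers quoted above.
# No new data here, only the arithmetic from the post.

pct_2x = 580.0              # slide bar with 2X frame generation
pct_4x = 1000.0             # slide bar with 4X frame generation
final_4x_vs_4080s = 185.0   # 5080 @ 4X FG vs 4080 Super, from the other screenshot

scaling_2x_to_4x = pct_4x / pct_2x                          # ~1.72x, i.e. ~72% more frames
implied_2x_vs_4080s = final_4x_vs_4080s / scaling_2x_to_4x  # ~107-108%

print(f"2X -> 4X scaling: {scaling_2x_to_4x:.2f}x")
print(f"Implied 5080 @ 2X FG vs 4080 Super: ~{implied_2x_vs_4080s:.0f}%")
```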
 
I think the problem is most of you just don't know math, that's why everybody is so mystified. In reality we have all the information we need.

[attachment 378843]

This is from the same video. 580% to 1000% is a ~72% increase - that's the scaling from 2X to 4X. From the previous screenshot I posted, in order to reach a final percentage of 185% on 4X, the starting value must be around ~108%.

This puts the 5080 at a meager ~8-10% faster with 2X FG vs the 4080 Super.
Ok bud
 
I think the problem is most of you just don't know math, that's why everybody is so mystified. In reality we have all the information we need.

[attachment 378843]

This is from the same video. 580% to 1000% is a ~72% increase - that's the scaling from 2X to 4X. From the previous screenshot I posted, in order to reach a final percentage of 185% on 4X, the starting value must be around ~108%.

This puts the 5080 at a meager ~8-10% faster with 2X FG vs the 4080 Super.
And that's in a game that heavily utilizes RT cores. What will happen in games that don't have RT, or only a light implementation?
 
In the meantime, DF released a video essentially confirming that basically all of that performance comes from the 4X FG.

[attachment 378809]
It's slower. Twice as many fake frames, not twice as many frames per second.

Net result, input lag gets even worse, and input lag is the main reason people don't like fake frames in the first place. Not the only reason, but definitely the main one.
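A toy model of that latency point, with made-up numbers (illustrative only, not a measurement of NVIDIA's pipeline): input is only sampled on real rendered frames, and interpolation has to hold frame N until frame N+1 exists before it can display anything in between, so the input-to-photon delay doesn't drop even as the displayed fps climbs.

```python
# Toy model: multi-frame generation raises displayed fps, not the rate at which
# input is sampled, so the input-to-photon delay doesn't improve.
# All values are illustrative placeholders.

def input_lag_ms(rendered_fps: float, fg_factor: int, overhead_ms: float = 0.5) -> float:
    """Rough input-to-photon delay for interpolation-based frame generation."""
    frame_ms = 1000.0 / rendered_fps
    if fg_factor <= 1:
        return frame_ms  # no FG: your input shows up on the next rendered frame
    # With FG, frame N is held back until frame N+1 is rendered so the in-between
    # frames can be interpolated, adding roughly one more rendered-frame interval.
    return 2 * frame_ms + overhead_ms * (fg_factor - 1)

for factor in (1, 2, 4):
    print(f"{factor}x: ~{input_lag_ms(60, factor):.1f} ms at 60 rendered fps")
```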

More seriously, of the thousand or so demanding titles from the last half decade, only a tiny tiny handful (under 50) actually even support Nvidia's frame-gen.
 
More seriously, of the thousand or so demanding titles from the last half decade, only a tiny tiny handful (under 50) actually even support Nvidia's frame-gen.
DLSS4 MFG will be a driver-level toggle, won't it? Will it only apply to whatever game already has DLSS3 FG enabled, or will it work over anything like AFMF does?
 
DLSS4 MFG will be a driver-level toggle, won't it? Will it only apply to whatever game already has DLSS3 FG enabled, or will it work over anything like AFMF does?
It's based on DLSS3 frame gen, it's not universal.
 
Architecture-level improvements. And 40 series are not being deprived of any feature, they just won't support frame generation at factors above 2x...
That's the thing - architectural-level improvements are non-existent. Blackwell is shrunk Ada on steroids. Brute force. Nvidia added as many new compute units as possible and tried to balance it power-wise.

You can tell from the 5090's specs that efficiency is also a problem now. The 4090 has 5,000 fewer shaders but also a much lower TGP than the 5090. Were there any significant architectural changes, it would not end up like that. Nvidia brute-forced everything towards so-called AI features (DLSS, FG). Jensen has already stated before that this is the only way to a new stage of gaming. I have my doubts, though.

As for the new DLSS, please, don't say that the 4000 series will be deprived of nothing and basically they just won't support something here, something there, there and also there and God knows where else as well. The RTX 4090 is surely capable (hardware-wise) of the new DLSS tech when the slower 5080 is capable (and anything below the 5080 as well). Or change my mind - give me one real reason why the 4090 would not be capable.

RTX 5080 will not beat RTX 4090 in native. Because:
- not enough computing power
- that would negatively affect 4090 sales which is Jensen's golden goose, they can't just release something more powerful and price it 20-30% less, or else they would cripple their own sales
- there will be RTX 5080 Ti with around 14k shaders and this one maybe will be on par with 4090

Performance-wise, from best:
RTX 5090
RTX 4090
RTX 5080 Ti (Super) with around 400W TGP
RTX 5080
RTX 5070 Ti (Super)
RTX 5070
 
FG doesn't necessarily double your framerate, therefore I assume the new FG isn't going to quadruple it either, so whatever comparison you are trying to make from that screenshot is wrong.

I agree, though - it is looking like, apples to apples, the 5080 probably isn't much faster than the 4080.
 
Here's a more general question. Is there a point of GPU price inflation at which console makers start reexamining going outside, or developing their own custom graphics? They don't have to account for backwards compatibility after all, and that's a big barrier to entry in discrete graphics that they wouldn't have to deal with. Or are the designs simply too complex for such a project to have any hope of succeeding?

I know it sounds like craziness, but Apple isn't doing all that bad with their Imagination Technologies-derived graphics silicon, after all.
The console makers are not paying GPU inflation. Console chips are, famously, very low-margin designs for chip makers, part of why Nvidia is happy to let AMD have it.

It's also monstrously expensive; if Sony/MS had to design their own, neither one would be making any profit from consoles, even with software sales. Apple gets away with it because they sell more iPhones in 6 months than the Xbox Series X/S and PS5/Pro have sold combined the ENTIRE generation, and that tech is also used on all their iPads and Macs.
 
Are you taking this as a personal offence? Why?
It's AMD, not you! Don't get your feelings hurt if someone says something bad about a tech company.
I'm not taking it personally. I'd just like to stay on topic. I am equally disappointed in the AMD keynote, but there is a place to discuss that, which isn't here. I have expectations on certain products, but I do not have feelings for either company. It rather looks like you have feelings for Nvidia which you're trying to justify by convincing me. Believe me, it's pointless.

60 FPS is not smooth at all when you've been playing at 100+ FPS for years.
Of course you can't see the difference if you're using a 60 Hz monitor.
I'm on 144 Hz. Our perceptions are different. What's smooth for you might not be for me and vice versa. I want stability in my performance, and I want low input lag with no graphical glitches. Whether it's at 60 or 100 or 200 FPS, I don't care. But unfortunately, when I only have 30 FPS, I can't make 60 out of it with FG without introducing other problems, that's why I don't like the tech.

But can you just cool off and wait for reviews? We got your point already. OK?
You asked for it.

Because Nvidia has the best GPUs and also the best features.
No need to use FG, but it's still there when needed.
AI is the future.
Believe that if it makes you feel better. Personally, I see the same games running on Nvidia and AMD cards (yes, I have both). The colour of the box doesn't matter. Price and performance do.

The console makers are not paying GPU inflation. Console chips are, famously, very low-margin designs for chip makers, part of why Nvidia is happy to let AMD have it.
More because Nvidia doesn't do APUs (if you don't count low-performance designs with ARM cores like in the Nintendo Switch).
 
It's slower. Twice as many fake frames, not twice as many frames per second.

Net result, input lag gets even worse, and input lag is the main reason people don't like fake frames in the first place. Not the only reason, but definitely the main one.

More seriously, of the thousand or so demanding titles from the last half decade, only a tiny tiny handful (under 50) actually even support Nvidia's frame-gen.

I'm annoyed with how Nvidia portrayed the 50 series, but honestly the best announcement is that DLSS RR, DLSS SR, and DLAA are getting meaningful improvements, and that's coming to all RTX owners.
 
Nvidia didn't show any raster gains and AMD didn't even show their GPUs. I was telling people that AMD exited the GPU business. They'll argue that UDNA is coming. They killed off RDNA. They don't want to pour any money into graphics. You'll be gaming on their compute units. It might work, it might not, but they don't care about graphics and they made it very clear.

I was expecting the 5080 on down to be similar to the Super update. Single-digit improvements. They focused on everything other than raster. Can't even buy any old stuff - shelves are clear at Micro Center. 5080 here I come. I can't believe how hard it is to replace my 6950 XT. The 7900 XTX is only a 50% gain, and the 4080 is close to the same. I prefer to at least double my fps when I upgrade. This will be the saddest upgrade ever for me. I previously went from a 1060 to a 6950 XT - that's a 4-5x fps improvement. Are we reaching diminishing returns? The good news is that we won't have to upgrade as often with such measly gains.
 
You're not being serious when you're saying that you also believe a 5080 is going to be 30% faster than a 4090?

We'll have to wait and see but I personally don't think it's impossible
 
I've read comments similar to yours a hundred times this past week. You are not doing your side any favors. Honestly, on the list of why I'm not buying AMD GPUs, "obnoxious comments by the company's fans" is at the top.
If I limited my choices by a few idiots and blind fans on an online forum, then I wouldn't have a PC at all (let alone three). There's plenty of them in every camp.
 
That's the thing - architectural-level improvements are non-existent. Blackwell is shrunk Ada on steroids. Brute force. Nvidia added as many new compute units as possible and tried to balance it power-wise.

You can tell from the 5090's specs that efficiency is also a problem now. The 4090 has 5,000 fewer shaders but also a much lower TGP than the 5090. Were there any significant architectural changes, it would not end up like that. Nvidia brute-forced everything towards so-called AI features (DLSS, FG). Jensen has already stated before that this is the only way to a new stage of gaming. I have my doubts, though.

As for the new DLSS, please, don't say that the 4000 series will be deprived of nothing and basically they just won't support something here, something there, there and also there and God knows where else as well. The RTX 4090 is surely capable (hardware-wise) of the new DLSS tech when the slower 5080 is capable (and anything below the 5080 as well). Or change my mind - give me one real reason why the 4090 would not be capable.

RTX 5080 will not beat RTX 4090 in native. Because:
- not enough computing power
- that would negatively affect 4090 sales which is Jensen's golden goose, they can't just release something more powerful and price it 20-30% less, or else they would cripple their own sales
- there will be RTX 5080 Ti with around 14k shaders and this one maybe will be on par with 4090

Performance-wise, from best:
RTX 5090
RTX 4090
RTX 5080 Ti (Super) with around 400W TGP
RTX 5080
RTX 5070 Ti (Super)
RTX 5070

Following this logic, there would never be a generational uplift over the previous halo part, yet that's exactly what has happened for the past few generations. It is possible, but personally I'm optimistic about at least a match. We'll have to wait and see. After all, it's pretty much what AMD is proposing with the 9070 XT: a leaner and meaner chip that will go toe to toe with their previous-generation flagship with fewer raw hardware resources.
 
As much as I don't get all the hate about DLSS 4 etc., the fact of the matter is that they are not actual frames (they do not affect the game engine) and therefore they just shouldn't be on a framerate graph/slide from Nvidia's marketing or from reviewers.

On the other hand, it's really hard to demonstrate what FG actually does so what other way do you actually have besides putting them on a graph?

I have no issues with them showing how they've improved frame generation - personally, I think it's awesome that they are. I'm just not a fan of them omitting the actual apples-to-apples performance difference, especially when turning frame generation on increases latency at each step from 2x to 3x to 4x. I will say I'm impressed it isn't much higher after the first step, but until there are no noticeable artifacts and latency goes down at each step, it shouldn't be sold as extra performance.

I am happy that the core DLSS technologies are improving for all RTX owners - probably the best announcement, period.
 