
NVIDIA 2025 International CES Keynote: Liveblog

The "AI TOPS", of say the 5070, looks to be 988 INT4, instead of 4070's 466 INT8. 988 INT4 / 2 = 494 INT8, 494 INT8 [5070] / 466 INT8 [4070] = 1.06 -> 6% improvement, which reminds me of these 6%.
No, TOPS is INT4 for both products.
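A quick sketch of the arithmetic under both readings, using only the figures quoted above (treating the 4070's 466 figure as INT8 versus as INT4):

```python
# Rough TOPS comparison using only the figures quoted above.
tops_5070 = 988   # RTX 5070 "AI TOPS" (INT4 per the spec sheet)
tops_4070 = 466   # RTX 4070 "AI TOPS" (precision is what's disputed above)

# Reading 1: the 466 is INT8, so halve the 5070's INT4 figure before comparing.
print(f"If 466 is INT8: {(tops_5070 / 2) / tops_4070:.2f}x")  # ~1.06x
# Reading 2: both figures are INT4 (as the reply says); compare directly.
print(f"If both are INT4: {tops_5070 / tops_4070:.2f}x")      # ~2.12x
```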
 
I like free FPS when gaming at 4K, so that's fine for me if it looks good.
So it's perfectly fine for you when 75% of these FPS show something inaccurate, unreal, approximated?
You are okay that you paid so much for your GPU and this is what you get?
5080 is 20-30% faster than 4090
Is this a joke? How on Earth could a 5080 with 60% of the 4090's processing units be 20-30% faster in native?
Only with FG. Now imagine what would happen if the RTX 4090 supported the newest generation of FG.

 
$2000 for a 5090 is a bit painful, but at least $1000 for a 5080 is "reasonable" compared to $1200 last gen. I really wish the 5090 got the full die. It really bothers me when they don't release a product like the 3090 Ti that uses the full GPU. My OCD, I guess.
I still miss the days of a further cut-down big-die part, i.e. a 5080 Ti for ~$1200. One can dream.
5070 can only hope to fake its way to 4090 horsepower with AI tricks.
I hope DLSS tech gets better this gen. Fake frames are nice to use in a pinch, but they definitely have noticeable and distracting artifacts. Not an end-game solution, just a way to distract the public with more performance while leveraging the baked-in AI tech of the GPUs.
 
but at least $1000 for a 5080 is "reasonable" compared to $1200 last gen.
lol no

People have truly done lost their minds. You are paying $1000 for half of the flagship.
 
It's actually a shame they didn't release on the same day they were announced, that would have been something. I was hoping for that December launch. :)
 
lol no

People have truly done lost their minds. You are paying $1000 for half of the flagship.
Compared to last gen, it is. It would be nice if it were at least 20% faster than the 4080S for the same price (native), but I have some doubts.
 
lol no

People have truly done lost their minds. You are paying $1000 for half of the flagship.
Well, $1000 for half makes perfect sense when the flagship is $2000 :laugh:
 
@igormp
A bit too gaudy for my taste, a bit too nouveau riche, too “Vegas pimp”. Worn saddle leather would be classier.
His jacket is made of the missing VRAM chips from the 40 series cards, it’s soo shiny lol
 
How on Earth could a 5080 with 60% of the 4090's processing units be 20-30% faster in native?


Architecture-level improvements. And the 40 series is not being deprived of any feature; it just won't support frame generation at factors above 2x...
 
Architecture-level improvements.
You're not being serious when you're saying that you also believe a 5080 is going to be 30% faster than a 4090?
 
You're not being serious when you're saying that you also believe a 5080 is going to be 30% faster than a 4090?
Apples to oranges comparison. The 50 and 40 series are based on different architectures, so the cores are not the same; you can't take the core-count percentage and draw a conclusion about performance, because it will give you a wildly nonsensical answer. If it's anything like 40 vs 30 series, I would expect the 5070 to match a 4090 in certain cases but mostly be slightly inferior, and as you go up to the Ti and the 5080 you will see larger differences compared to a 4090. In cases where the 4090's memory buffer plays a bigger role, you will see the 4090 getting close to or even matching a 5080, but for most cases, unless you need that large memory buffer of the 4090, I would disregard the 40 series.

We’ve been through this in the past with NVIDIA's releases.
 
The only difference in my book is that if you turn on DLSS 4 you will get the % they claim, but as I said above, showing one thing a GPU does that a previous generation doesn't do makes it look like they are hiding the real performance gains.



It's technically not false if you turn it on and it does what it shows, though. False would be them showing 1000 FPS but the product maxing out at 200 FPS when reviewers get it...
As much as I don't get all the hate about DLSS 4 etc., the fact of the matter is that they are not actual frames (they do not affect the game engine), and therefore they just shouldn't be on a framerate graph/slide, whether from NVIDIA's marketing or from reviewers.

On the other hand, it's really hard to demonstrate what FG actually does, so what other way do you have besides putting them on a graph?

Yeah, we know how it ends: NVIDIA buyers buy NVIDIA anyway while the fans gloat about market share.
I've read comments similar to yours a hundred times this past week. You are not doing your side any favors. Honestly, on the list of reasons why I'm not buying AMD GPUs, "obnoxious comments by the company's fans" is at the top.
 
NVIDIA very sneakily teased a Windows AI PC chip.
Can anyone tell me what/how they teased it? Was there any actual info? I know it's an Arm-based SoC, but was there any more?
 
the cores are not the same; you can't take the core-count percentage and draw a conclusion
Yes you can; shader count has been one of the most reliable ways to estimate performance (rough sketch below).

You are not doing your side any favors. Honestly, on the list of reasons why I'm not buying AMD GPUs, "obnoxious comments by the company's fans" is at the top.
"Actually I don't buy AMD because I don't like what their fans have to say" is certainly a take.
 
I've been busy today, but from what I can tell, AMD have launched nothing, just announced that the 9000 series is coming soon but not available yet, and that it's four models spanning roughly 7600 XT to 7900 XT territory.

Nvidia have said "AI" about 9000 times and we have no benchmarks, just vague handwavium to say that the 5070 is as fast as a 4090, with no details on how that claim was obtained.

It's unlikely that the 5070 can match the 4090 without assistance, so presumably DLSS4 AI Neural AI Frame-Gen AI can AI deliver some AI fake frames at the same rate the 4090 can render frames natively without as much AI?

The real question is when we get the first independent review (of the 5080 FE, I think, right?)
 
So it's perfectly fine for you when 75% of these FPS show something inaccurate, unreal, approximated?
You are okay that you paid so much for your GPU and this is what you get?

Is this a joke? How on Earth could a 5080 with 60% of the 4090's processing units be 20-30% faster in native?
Only with FG. Now imagine what would happen if the RTX 4090 supported the newest generation of FG.

Those are cheap prices, and DLSS looks better than native, so yes, I'm OK if I can get more FPS for free.
FG is also really good, finally high FPS in 4K.

lol no

People have truly done lost their minds. You are paying $1000 for half of the flagship.
Just buy the flagship at $2000
Or half the price, half the GPU: the 5080

Or just buy AMD
 
Just buy the flagship at $2000
Or half the price, half the GPU: the 5080
Nah, buy 10 of them: 10x 5090 = $2000 * 10 = $20,000 = 10 times the performance. Pro tip.

I am very smart.
 
100%... although raster performance on the 4090 vs 5090, according to the slide, is about 50% better, and so probably 70-80% better RT... so it's still hard to say the 5080 has half the hardware.
They're going to be similar at best, is my estimate, but I still think the 5080 will land just below the 4090.

It's a good 5000 shader units short of that GPU. There is no way it's going to surpass it, honestly, even (or especially not) at 360W.
 
A Plague Tale has frame generation 3.5 on and is a heavy-ish game, and the 5080 is 30% faster...
In the meantime, DF released a video essentially confirming that all of that performance comes from the 4x FG.

[attached screenshot: performance breakdown from the DF video]
 
So it's perfectly fine for you when 75% of these FPS show something inaccurate, unreal, approximated?
You are okay that you paid so much for your GPU and this is what you get?

Is this a joke? How on Earth could a 5080 with 60% of the 4090's processing units be 20-30% faster in native?
Only with FG. Now imagine what would happen if the RTX 4090 supported the newest generation of FG.

Yeah, I don't understand how people do math either. Ever since we knew the 5080 was half the 5090, it was a given it would never be faster than the 4090. It's also the best outcome for Nvidia, because now they just position the 4090 between the new lineup and it sits there just fine. Effectively, nothing happened between Ada and Blackwell if you think about it. The 4090 is what, $1599? It slots perfectly in the middle there. The perf/$ metric has moved exactly zero that way. You're just paying for the extra performance with a higher power target = power consumption. It's not on Nvidia's bill at all. It is complete stagnation. But hey, here's DLSS 4! hahaha And look at my leather jacket.

And here we have people saying 'muh, good prices'. :roll::roll: what the fck

The 5080 is also a big nothingburger if you know the 4080 Super exists. Same shader count. Same VRAM, just a slight bit faster. Similar price. Every last bit of extra perf is probably paid for by having to buy a new PSU, as this fantastic x80 is the first one to consume power like a flagship card on an OC.
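To put the perf/$ argument in numbers, here is a toy calculation; the relative-performance values are assumptions picked to match the speculation in this thread, not benchmarks:

```python
# Toy perf-per-dollar comparison. Relative performance values are assumed
# placeholders (RTX 4080 Super = 1.0), not measured results.
cards = {
    "RTX 4080 Super": (999,  1.00),
    "RTX 4090":       (1599, 1.35),
    "RTX 5080":       (999,  1.10),   # assumption: a bit over the 4080 Super
    "RTX 5090":       (1999, 1.90),   # assumption: ~40% over the 4090
}

for name, (price_usd, perf) in cards.items():
    print(f"{name}: {perf / price_usd * 1000:.2f} perf per $1000")
```

Under those assumed uplifts, the per-dollar figures move only modestly between generations, which is roughly the stagnation point being made above.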
 
In the meantime, DF released a video essentially confirming that all of that performance comes from the 4x FG.

View attachment 378809
FG doesn't necessarily double your framerate, so I assume the new FG isn't going to quadruple it either; whatever comparison you are trying to make from that screenshot is wrong.
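One back-of-the-envelope way to model that: FG adds generated frames on top of a slightly reduced render rate, so the nominal multiplier is never fully reached. The 10% overhead figure below is a made-up placeholder, not a measured value:

```python
# Simple FG model: k generated frames per rendered frame, with the generation
# work also eating a bit of render time. The 10% overhead is a placeholder.
def fg_fps(base_fps: float, k: int, overhead: float = 0.10) -> float:
    """Estimated output FPS with k generated frames per rendered frame."""
    rendered_fps = base_fps * (1 - overhead)  # render rate drops slightly
    return rendered_fps * (1 + k)             # plus k generated frames each

print(fg_fps(60, k=1))  # "2x" FG: 108.0, not 120
print(fg_fps(60, k=3))  # "4x" FG: 216.0, not 240
```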
 