
NVIDIA GeForce RTX 50 Technical Deep Dive

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
28,753 (3.75/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
In this article we cover everything NVIDIA revealed about the GeForce RTX 50 Series: the new graphics card models and their pricing, Blackwell architecture, DLSS 4 updates, Neural Rendering, Reflex 2 for faster headshots, improved AI performance and creator-focused tools.

 
Very modest real gains from 4080 to 5080, putting aside the marketing drivel of DLSS4. We knew Moore's Law would end but who knew the empty space would be replaced with this.

4080 seemed a bad buy at £1k at launch but doesn't look so bad now!
 
Why do I get the feeling that introduction of Cooperative Vectors API for DirectX and Slang will end up being much more significant than we now realize?
 
Is there any indication that Nvidia will force certain features such as DLSS4 to be always on?
No. Quite the opposite: they are adding extra driver-level settings, so you can toggle the transformer model on and off, enable DLAA and Ultra Performance even in games that don't support them, and set DLSS 4 frame generation to either behave like DLSS 3 (2x only) or use full DLSS 4 multi-frame generation
 
No. Quite the opposite: they are adding extra driver-level settings, so you can toggle the transformer model on and off, enable DLAA and Ultra Performance even in games that don't support them, and set DLSS 4 frame generation to either behave like DLSS 3 (2x only) or use full DLSS 4 multi-frame generation
Okay cool. Then from what you are saying, I can assume DLSS can be turned off altogether.
 
@W1zzard
Did NV mention any particular cut-off date for the DLSS override function? Or can it potentially be used to refresh and improve DLSS even in the earliest titles, like Final Fantasy XV? Pretty big if so; it would make it much less of a bother if a dev never updated the DLSS version in their particular title.
 
That dual cooler slide does not convince me

On my last two mainboards, the CPU tower cooler sat quite close to the graphics card.

I'll wait for the reviews and compare the noise results to my PowerColor 7800 XT Hellhound. My main criterion is fan noise.
 
Okay cool. Then from what you are saying, I can assume DLSS can be turned off altogether.
No, but that's not an NV choice. Alan Wake 2, for example, requires you to pick an upscaler.
 
Is the 5090 worse at pure raster performance than the 4090? In other words, did they gimp it to fit in all the extra AI cores?
 
Is the 5090 worse at pure raster performance than the 4090? In other words, did they gimp it to fit in all the extra AI cores?
"The number of GPU cores is 21,760, which is a +33% increase vs RTX 4090, which has 16,384 cores" — these are shader cores, i.e. the ones used for pure raster
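For what it's worth, the arithmetic in that quote checks out. A quick sanity check, using only the core counts quoted above:

```python
# Shader (CUDA) core counts as quoted above from NVIDIA's specs
cores_5090 = 21_760
cores_4090 = 16_384

increase = cores_5090 / cores_4090 - 1
print(f"RTX 5090 has {increase:.1%} more shader cores than RTX 4090")
# → 32.8%, which rounds to the +33% figure in the post
```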
 
No. Quite the opposite, they are adding extra driver-level settings, so you can toggle transformers on and off, enable DLAA and ultra performance even in games that don't support it, and control DLSS 4 framegen to either be DLSS 3 just double, or enable DLSS 4
Yeah, that stuff gives the app a reason to exist; hopefully nvidia profile tweaker adds it as well.

Still waiting for this stuff to work on DLSS1 games.
 
I guess we'll see what it's really like when independent reviewers do their tests.

There are plenty of falsehoods in that set of slides from Nvidia, such as the graphs/charts comparing improvements over Ada, which also show unrealistic claims of Ada over Ampere.

So if the chart is provably inaccurate in the Ampere > Ada jump, why should we believe the claimed jump from Ada > Blackwell?
 
so I was “pushed” over the edge to buy a PS5 Pro because my games look better…

Will the 5090 make my games look better than a 7900 XTX? Or a 5080? It is faster, but do I really need the speed of the 5090 when I can pay half for a 5080 and get everything…
 
I wish I understood the technical side, so it would be more apparent how you can have 28% more power consumption, 33% more cores and 78% more memory bandwidth and end up less than 30% faster, when it's also a new (meaning better?) architecture. What an underwhelming couple of years for hardware, lol.
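One way to think about it: a frame's speedup is capped by whichever resource is the bottleneck, and some per-frame work (CPU, driver, serial portions) doesn't scale at all. A toy Amdahl/roofline-style sketch — the 5% serial fraction is my assumption, purely illustrative, not measured data:

```python
# Toy model: only the bottlenecked, parallel part of the frame speeds up.

def speedup(compute_gain, bandwidth_gain, serial_fraction):
    """Amdahl-style estimate: the serial fraction never gets faster."""
    parallel_gain = min(compute_gain, bandwidth_gain)  # limited by the tighter resource
    return 1 / (serial_fraction + (1 - serial_fraction) / parallel_gain)

# 33% more cores, 78% more bandwidth, assume ~5% of frame time is fixed cost
print(f"{speedup(1.33, 1.78, 0.05):.2f}x")  # → 1.31x, close to the ~30% observed
```

Under these assumptions the extra bandwidth mostly goes unused, and even a small fixed cost eats into the core-count gain.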
 
Well, rasterization could only take us so far, I guess; AI has to do the heavy lifting from here on out. DLSS 4 looks very promising, I like what I see, and if they fixed the input lag with MFG, well... gg, gg.
 
Just watched the interview that was posted on TPU.
So we will end up with the 5080 being a bigger card than the 5090 then? It looks like this innovation is exclusive to the 5090. OK, correcting myself: the pics on the TPU link suggest all the cards have the new thermal design.
 
So we will end up with the 5080 being a bigger card than the 5090 then? It looks like this innovation is exclusive to the 5090. OK, correcting myself: the pics on the TPU link suggest all the cards have the new thermal design.
It does look like almost the same hardware is being released down the product stack below the 5090 level. This is why I worry about always-on features like DLSS being used to cover up little-to-no gains from actual hardware changes. Some games, according to W1zzard, do not allow disabling upscaling.

Well, rasterization could only take us so far, I guess; AI has to do the heavy lifting from here on out. DLSS 4 looks very promising, I like what I see, and if they fixed the input lag with MFG, well... gg, gg.
There is still upgrading to higher SKUs if you are at a lower performance point in your current build. Anyone on Ryzen 3 and 5 from last gen or earlier can upgrade to Ryzen 7 or 9 current gen (3D cache versions even better). Same goes for GPUs. If you are on a X050 or X060 from previous gen or earlier then upgrading to current gen X070 or X080 will give you a huge performance boost.

For those who are already at the top of the performance stack with a 3D cache chip or 4080/4090 then you probably won't need to upgrade this generation.
 
If you own a 4080 or a 4090, hang on to it. I know people who went and sold their 4080 and 4090 in expectation of the upcoming cards. I imagine they heard about the claimed uplift of the 5070 to 4090-like performance in games, but did not pay attention to the small print :laugh:. As I understand it, you will need help from AI features, like DLSS, to achieve performance similar to a 4090. But I imagine the same could be said of the 4090 versus the 5090 with all the AI help it can get. Too early to speculate, but it is a trend from previous generations.
 

So, the RTX 5090 has up to 33% better rasterization performance than the RTX 4090, with 33% more processing units and a 33% higher MSRP. Maybe fair, but barely any progress.

The RTX 5080 has 11% more processing units and up to 15% better raster performance than the RTX 4080. So much for the 5080 beating the 4090 with just 10k shaders and all that new architecture magic on top.
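The performance-per-dollar point can be made explicit using only the percentages from this post (assuming the claimed +33% figures hold up in independent reviews):

```python
# Perf-per-dollar sanity check using the percentages stated above
perf_gain  = 1.33   # up to +33% raster vs RTX 4090 (claimed)
price_gain = 1.33   # +33% MSRP (as stated in the post)

value_change = perf_gain / price_gain - 1
print(f"Change in performance per dollar: {value_change:+.0%}")
# → +0%: at these numbers, value at the top of the stack is flat
```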

How do they achieve 35 ms latency with the AI stuff in the pipeline, when at native it is 70 ms? Please, someone explain. EDIT: Already explained.
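For anyone else wondering: the usual explanation is that the "native 70 ms" figure is measured without Reflex, which trims the render queue. A toy model with made-up numbers (my assumptions, not NVIDIA's published data) shows how the two figures can coexist:

```python
# Toy latency model: input latency ≈ frames queued ahead of the GPU
# plus the render time of the frame itself. Reflex shrinks the queue;
# frame generation doesn't change how often real input is sampled.

def latency_ms(render_ms, queued_frames):
    return queued_frames * render_ms + render_ms

native_no_reflex = latency_ms(render_ms=23, queued_frames=2)    # deep queue → ~70 ms
with_reflex      = latency_ms(render_ms=23, queued_frames=0.5)  # avg half a frame queued → ~35 ms
print(native_no_reflex, with_reflex)
```

The point is only that the latency win comes from the shortened queue, not from the generated frames themselves.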
 
That dual cooler slide does not convince me

On my last two mainboards, the CPU tower cooler sat quite close to the graphics card.

I'll wait for the reviews and compare the noise results to my PowerColor 7800 XT Hellhound. My main criterion is fan noise.

I'm 100% concerned by that as well. RIP CPU temps for large air tower cooler owners: the card will literally pour heat into the heatsink and into the fresh air being pulled in by the CPU fan. Terrible design. Their previous design was better; at least it ejected some of the heat blower-style, which mostly mitigated the issue. I bet the double whammy will ruin air towers. Better go with an AIO for the CPU if you plan to run a 5090 FE.

 