
NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

$549 for upgrading my 1080, I'm in.
 
It will be the same frame interpolation trick that's used in FSR frame gen anyway.
 
No man, there are very serious and complex reasons why they couldn't generate the extra frames on older cards. You know, despite the fact that Lossless Scaling introduced 4x frame generation six months ago and it works on everything.
Not saying that there is a hard requirement for the RTX 5000 series, but there is very likely a soft one.

Why would you think there isn't? DLSS relies on specific hardware bits. Nvidia primarily points to the Optical Flow Accelerator, along with whatever new features (or performance) the new generation has that enable more advanced processing. It is entirely plausible that the previous generation of cards would have problems with it. There is a lot of annoying space between "works" and "does not work". For example, read the posts in this thread: latency is a real concern here, and that is just the first one that comes to everyone's mind.

And as @Dr. Dro pointed out - looking back at previous DLSS updates, Nvidia has brought some new DLSS features to older generations as well.

Want a more extreme example from the other camp? Remember when RTX came out, AMD went pshaw and said they'd bring DXR support to their existing cards, running on shaders basically... until they didn't. Nvidia had the same problem with the 1000 series but actually did roll out DXR support. Turns out it worked fine - just very slowly - and was thus useless.
 
I don't consider it progress having to spend $1000+ on a new GPU to get the newest features, only to lower the resolution while adding AI-generated frames.
Also, wait for real benchmarks; we have yet to see how it looks or how the latency is.
Think of it this way: I’m running at a lower res but I still get a higher quality image than native.
I’m running FG, but the latency is the same as not using DLSS 4.
240FPS in 4K in CP2077 with PATH TRACING.
Yes, that’s progress.
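For context on "lower res": these are the internal render resolutions DLSS typically works from at 4K output (a quick sketch; the per-axis scale factors below are the commonly cited preset defaults, not figures from this thread):

```python
# Internal render resolution per DLSS preset at 4K output.
# Per-axis scale factors are the commonly cited DLSS preset defaults.
presets = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}
out_w, out_h = 3840, 2160
for name, scale in presets.items():
    w, h = round(out_w * scale), round(out_h * scale)
    print(f"{name:17s} {w}x{h}  ({scale**2:.0%} of output pixels)")
```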

So DLSS will look better than DLAA, which is native resolution. Got it.
Yes. DLSS 4 will look better than current DLAA 3.
 
And as @Dr. Dro pointed out - looking back at previous DLSS updates, Nvidia has brought some new DLSS features to older generations as well.

Want a more extreme example from the other camp? Remember when RTX came out, AMD went pshaw and said they'd bring DXR support to their existing cards, running on shaders basically... until they didn't. Nvidia had the same problem with the 1000 series but actually did roll out DXR support. Turns out it worked fine - just very slowly - and was thus useless.


Decent enough to get 3DMark Solar Bay to 60-ish fps on a 1070; as you can see, my 1070 Ti there does 55-80 fps, so even in full software emulation you could still get phone-level RT on midrange Pascal cards at 1080p. My biggest argument for software DXR (and I asked AMD multiple times to re-evaluate and ship this back then) is that it allowed developers to acclimate and experiment with ray tracing, even if the performance was not good enough to ship an RT AAA title on pretty much anything below a Titan Xp.

Yet people still act shocked that NV managed to dominate this segment from day one. RT has effectively become RTX, because in the earliest days of DX raytracing... AMD just decided not to bother with it. It's a self-inflicted wound which originates from a dismissive, arrogant mindset.

Huang won't give you an extra discount.

That's OK. I am proud to tell you I have fully paid off all of my bills and am starting 2025 with no debt whatsoever :D

I only wish it were with the shill money some posters said I take. I wish I did; imagine whaling for C6R5 Mavuika on Jensen Huang's dime!? You thought I'd buy a 5090 with that money? :eek::laugh:
 
But if it has the same latency, what's the point of having 400 fps instead of 100?

Before you guys make stupid comments about latency: the latency is actually the same (Nvidia has a video comparing DLSS FG vs Multi-Frame DLSS FG). They also just released Reflex 2, which further reduces latency by 50%.
So more generated frames but LOWER latency at the same time.


It's not lower, it's more generated frames with the SAME latency.
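Quick sanity check on the math (a minimal sketch; the 100 fps base render rate is an assumed example, and real FG adds a small queuing delay on top):

```python
# Frame generation multiplies *displayed* frames, but your input is only
# reflected in real (rendered) frames, so responsiveness tracks the base rate.
base_fps = 100  # assumed rendered-frame rate for illustration
for factor in (1, 2, 4):  # FG off, 2x FG, 4x MFG
    display_fps = base_fps * factor
    input_interval_ms = 1000 / base_fps  # input reflected once per real frame
    print(f"{factor}x: {display_fps} fps displayed, "
          f"~{input_interval_ms:.0f} ms between input-reflecting frames")
```

Displayed fps scales with the factor; the ~10 ms input interval does not.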
 
Yes. DLSS 4 will look better than current DLAA 3.
So upscaling is not better than native res when the same process is applied to both.
 
GPUs pulling console-level performance, using tricks to get more fps in games left unoptimized by people who don't know how to optimize. Fricking amazing :kookoo:
 
By the way, am I the only one who has the feeling that the 5090 is ultra-slow garbage, barely doing 30 fps in 4K when DLSS is disabled? It's not me, it's Nvidia's own claim.
Cyberpunk 2077 runs between 15-18 fps on a 4090 in 4K with max settings, native (using path tracing, as mentioned on the Nvidia picture).
On this chart the 5090 runs it at a bit over 30 fps, which is close to 2x.
So I wouldn't call 2x perf slow garbage.
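The arithmetic, using the figures quoted above (a sketch; the exact chart value is eyeballed):

```python
# Rough native path-tracing speedup, 4090 -> 5090, from the numbers above.
fps_4090 = (15 + 18) / 2  # midpoint of the quoted 15-18 fps range
fps_5090 = 31             # "a bit over 30 fps" read off the chart
print(f"~{fps_5090 / fps_4090:.2f}x")  # ~1.88x, i.e. close to 2x
```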
 
But if it has the same latency, what's the point of having 400 fps instead of 100?



It's not lower, it's more generated frames with the SAME latency.

Not every scene will run at 400 FPS, but if you lock the FPS to your monitor's maximum refresh rate, with G-Sync enabled, you will ensure that you get the smoothest frame-locked gaming experience possible.

I am so glad this is finally possible.

A lot of competitive gamers like to say "yeah, I get 500 FPS", "muh 600Hz", but it's not just the high FPS that counts, it's the CONSISTENCY: when you move, aim & learn at a constant framerate, you will always aim the same way, no matter the scene rendered and, in most cases, the game, so you as the gamer will remain consistent across the board according to your abilities.

Fluctuations cause inconsistency, and inconsistency causes bad gameplay that could cost you the match: aiming half a mm too short, or too far, just because of a framerate fluctuation.
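Conceptually, a frame cap is just a fixed per-frame time budget. A minimal sketch of the idea (render_frame is a hypothetical stand-in, not any real engine API):

```python
import time

TARGET_HZ = 165
FRAME_BUDGET = 1.0 / TARGET_HZ  # ~6.06 ms per frame at 165 Hz

def render_frame():
    pass  # hypothetical stand-in for the actual game/render work

for _ in range(1000):  # game loop
    start = time.perf_counter()
    render_frame()
    # Sleep away whatever is left of the budget so every frame takes the
    # same wall-clock time -> consistent pacing, consistent aim feel.
    leftover = FRAME_BUDGET - (time.perf_counter() - start)
    if leftover > 0:
        time.sleep(leftover)
```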
 
If you knew anything about computer graphics, you would know that raster as a whole is 100% fake. Raster techniques are just TERRIBLE approximations of how real lighting works.

Progress is running things at a lower resolution but still looking better than traditional native resolution.
DLSS 4 SR has done it.

So you're saying raster is terrible, but interpolating it with upscaling and frame generation isn't?

Or did I misread what you're saying?
 
So upscaling is not better than native res when the same process is applied to both.
Upscaling is better than native when native uses previous AA technologies like older DLSS and especially TAA.

So you're saying raster is terrible, but interpolating it with upscaling and frame generation isn't?

Or did I misread what you're saying?
DLSS 4 looks better than native = Progress.
DLSS FG doubles performance for the same latency (vs native with no DLSS SR and no Reflex) = Progress.
DLSS MFG triples performance for the same latency = Progress.

This is “fake” progress only in the eyes of AMD fanboys or low-IQ individuals.
 
Not every scene will run at 400 FPS, but if you lock the FPS to your monitor's maximum refresh rate, with G-Sync enabled, you will ensure you get the smoothest frame-locked gaming experience possible. So glad this is finally possible.

A lot of competitive gamers like to say "yeah, I get 500 FPS", "muh 600Hz", but it's not just the high FPS that counts, it's the CONSISTENCY: when you move, aim & learn at a constant framerate, you will always aim the same way, no matter the scene rendered and, in most cases, the game, so you as the gamer will remain consistent across the board according to your abilities.
If I have the same input latency, the game will feel the same to me regardless of the fps number. If I have 20 ms at 60 fps, and the same 20 ms with ultra-cool generated 200 fps, it will feel the same, because 20 ms is 20 ms. It's probably cool for goofy OLED displays that go brrrr flickering at low refresh rates.
 
If I have the same input latency, the game will feel the same to me regardless of the fps number. If I have 20 ms at 60 fps, and the same 20 ms with ultra-cool generated 200 fps, it will feel the same, because 20 ms is 20 ms. It's probably cool for goofy OLED displays that go brrrr flickering at low refresh rates.

Correct. So get a 165 Hz monitor, lock it to that Hz, and enjoy 6 ms latency. :D

This is why I can never get used to older systems anymore; the high-refresh locks ruined me for life, and I NEED a better GPU to keep it near that 165 Hz in AAA titles. These technologies from nVidia will ensure it, although perhaps at a little graphical loss; but from what I saw, it doesn't seem that way anymore.

Anyways, we will see from W1zzard's testing. :)
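For reference, frame time at common refresh rates (frame time only; the full input-to-photon chain adds more on top):

```python
# The "6 ms" above is simply 1000 / 165.
for hz in (60, 120, 165, 240):
    print(f"{hz:3d} Hz -> {1000 / hz:.2f} ms per frame")
```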
 
If this keeps latency the same as normal FG (which it should, I guess, since it's still using a frame before and after), then it's super cool. If it increases latency, meeeh.
 
OK, if we go down this road:
I think the 5070 has an unacceptable VRAM size.


I wish it worked like that xd
12GB is indeed low.
But it seems like Nvidia is trying to “force” devs to use Neural Textures.
Same or even better quality textures for up to 7x less VRAM usage.
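To put that 7x in context, back-of-the-envelope texture sizes (a sketch; the RGBA8 and BC7 sizes are standard texture math, and the 7x is Nvidia's own figure applied hypothetically):

```python
# VRAM for one 4096x4096 texture under different storage schemes.
pixels = 4096 * 4096
sizes = {
    "RGBA8 (uncompressed)": pixels * 4,   # 4 bytes per pixel
    "BC7 (block-compressed)": pixels * 1, # 16-byte 4x4 blocks = 1 byte/pixel
}
sizes["Neural (7x vs BC7, hypothetical)"] = sizes["BC7 (block-compressed)"] / 7
for name, size in sizes.items():
    print(f"{name:32s} {size / 2**20:6.1f} MiB")
```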
 

Probably, but image quality is going to be dire and there'll be other trade-offs, e.g. input lag, screen tearing, artifacting, shimmering and the rest.
 
Before you guys make stupid comments about latency: the latency is actually the same (Nvidia has a video comparing DLSS FG vs Multi-Frame DLSS FG). They also just released Reflex 2, which further reduces latency by 50%.
So more generated frames but LOWER latency at the same time.
You get so many frames at such low latency that you barely see what the games were meant to look like. Three generated frames per one real frame; rendering accuracy my ass.
 
image quality is going to be dire and there'll be other trade-offs, e.g. input lag, screen tearing, artifacting, shimmering and the rest
That reminds me of native TAA. Smearing, ghosting, input lag (without Reflex, input lag is horrible), etc. :roll:
 
Yep, called it many months ago: a gazillion interpolated frames. Screw it, just modify the driver so that it always reports 99999999 FPS. Why keep doing this? That's the endgame anyway.
 
12GB is indeed low.
But it seems like Nvidia is trying to “force” devs to use Neural Textures.
Same or even better quality textures for up to 7x less VRAM usage.
So does the game need to support Neural Rendering? If so, I'm SOL; in most of the games I play the devs are lazy, and most of the time there's only FSR 1 or DLSS 2.0 (which does almost nothing due to crap dev implementation).
 
I don't care about fake frames or fake resolution, show me pure raster performance.
Yep, more blurriness for unoptimized games??
Sounds great.

By the way, am I the only one who has the feeling that the 5090 is ultra-slow garbage, barely doing 30 fps in 4K when DLSS is disabled? It's not me, it's Nvidia's own claim.
Alan Wake: UE5 under 50 fps at 4K.
YAY :rockout:
 