
NVIDIA Introduces DLSS 4 with Multi-Frame Generation for up to 8X Framerate Uplifts

my body is ready
 
I guess the summary is that Nvidia has moved on from traditional rasterized rendering to AI/RT/DLSS rendering. If you buy a 5000 series card, expect to embrace this new render method or fail to get your money's worth.

For the rest of us who just want to game old school on an old school budget, Intel and AMD will do just fine.
Inb4 Intel/AMD introduce similar tech. Even in the wake of the RX 9000 slide, FSR4, and XeSS 2, people still seem to watch the GPU market with half rose-tinted glasses and really believe that AMD/Intel won't follow in Nvidia's footsteps. RDNA4 is going to be the last gaming-focused arch from AMD; UDNA will merge the HPC side with the consumer side, with ever-better performance in AI tasks, even if gamers don't care. Same pattern as Nvidia.

Battlemage also has its own set of issues: if you don't use it with a recent, fast CPU, the driver overhead is massive, which can diminish its price/performance ratio in some games.


All I'm seeing are companies offering a slightly better price-to-performance ratio because they struggle to take the crown. Even with their focus on software trickery, Nvidia is somehow still selling the fastest GPU in rasterisation (when the others are supposedly rasterisation specialists).
 
The difference in detail between the new transformer-based models and the old CNN-based models seems huge.

Old vs New:
[Screenshot attachments: old CNN model vs. new transformer model comparison]


At 4:26 he says that the new models require four times more compute during inference. Inference takes only a small part of the whole frame time, so the final performance impact won't be nearly so dramatic.
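
A rough back-of-the-envelope sketch of why (the 10 ms frame time and 0.5 ms inference cost below are made-up illustration numbers, not figures from the video):

Code:
#include <stdio.h>

int main(void)
{
    /* Hypothetical numbers, purely for illustration */
    double frame_ms     = 10.0; /* total frame time (100 FPS)        */
    double inference_ms = 0.5;  /* assumed cost of the old CNN model */

    /* New transformer model: roughly 4x the inference compute */
    double new_frame_ms = frame_ms - inference_ms + 4.0 * inference_ms;

    printf("Old model: %.1f ms/frame (%.0f FPS)\n", frame_ms, 1000.0 / frame_ms);
    printf("New model: %.1f ms/frame (%.0f FPS)\n", new_frame_ms, 1000.0 / new_frame_ms);
    /* ~11.5 ms vs 10.0 ms: roughly a 13% FPS drop, nowhere near a 4x slowdown */
    return 0;
}

Obviously the real numbers depend on resolution, GPU, and game; the point is just that inference is only a slice of the frame.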

Interested to see a review of how the quality/performance of the new models compares to the old models.
 
Even with their focus on software trickery, Nvidia is somehow still selling the fastest GPU in rasterisation (when the others are supposedly rasterisation specialists).
That's crazy, right? People complaining about generated frames and whatnot when the company that has those features also has the fastest cards in both RT and raster.
 
As usual, a good and sober take from HUB:

 
Something's not right. Not a single slide showing pure rasterization performance of the new GPUs vs. the 40 series. I wonder why? :rolleyes: Is it because it sucks, as predicted from the shader counts and bus-width numbers (with the exception of the 5090)?
And this 4:1 frame generation: if it's anything like Nvidia's "optical flow" used with OpenXR in VR for reprojection, where extrapolating 30 Hz to 90 Hz mostly sucks, it's just a big gimmick to mask the poor rasterization advances in the Blackwell series.
 
That's crazy, right? People complaining about generated frames and whatnot when the company that has those features also has the fastest cards in both RT and raster.
I don't see what those two things have to do with one another. Complainers want GPU manufacturers to focus less on frame insertion and more on actual raw performance gains from one generation to the next.
It's a completely valid complaint.
I mean, look at those "performance" comparisons from the 5000 series marketing: comparing the 4000 series against a card that inserts 3x more frames. That's absurd at best.
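
To put made-up numbers on that objection (neither figure is a real benchmark result): if both cards were to render the same base framerate, the frame-generation multiplier alone would produce the headline gap.

Code:
#include <stdio.h>

int main(void)
{
    /* Hypothetical: assume both GPUs render the same base framerate */
    double base_fps = 30.0;

    double fps_40_series = base_fps * 2.0; /* 2x frame generation       */
    double fps_50_series = base_fps * 4.0; /* 4x multi-frame generation */

    printf("40 series (2x FG):  %.0f FPS\n", fps_40_series);
    printf("50 series (4x MFG): %.0f FPS\n", fps_50_series);
    /* Headline reads "2x faster", yet in this hypothetical the number
       of actually rendered frames per second is identical.            */
    return 0;
}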
 
Code:
DLSS4 = gpu.modelNumber > 5000

Sorry. Couldn't resist. :p
i like it, i like it a lot… but,
where's the declaration that the variable DLSS4 is a boolean?… and the declaration that gpu.modelNumber is an int variable? LOL,

/possibly-wrong-have-not-programmed-in-20-years… so i am not sure what happens when variables are not declared in the latest "fancy" programming languages… maybe AI compilers will clean that up… (lol)
but in any case, i like it when people can refactor code into something shorter… it means that they can debug code, which is more important than writing code, IMO.
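
For what it's worth, a fully declared C version of the joke (the struct and field are just made up to match the pseudocode; in most modern languages the bool type would simply be inferred):

Code:
#include <stdbool.h>
#include <stdio.h>

struct gpu_info {
    int modelNumber;            /* explicitly an int */
};

int main(void)
{
    struct gpu_info gpu = { .modelNumber = 5070 };

    /* explicitly a bool, no inference required */
    bool DLSS4 = gpu.modelNumber > 5000;

    printf("DLSS4: %s\n", DLSS4 ? "true" : "false");
    return 0;
}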
 
I don't consider faking things progress, especially when it comes with noticeable impacts on quality and latency.
If you like it and you can't notice the impacts yourself, I won't stop you, but what I'm interested in is raster.
Everything about rendering games is fake. You do know that right? It's just different methods of fakery.
 
Everything about rendering games is fake. You do know that right? It's just different methods of fakery.
I don't care if it's fake or not; all I care about is the image quality I see on screen and the performance. All I know is that native rasterization at high resolutions (I mainly game at 3160x3160p*2 in VR) looks much better than DLSS, FSR, XeSS, Valve's reprojection method, or Nvidia's "optical flow" reprojection. DLSS frame generation not being available in VR says a lot about the quality of inserted frames. You can fool your brain when looking at a monitor, but it's hard to do the same on 10-megapixel panels inches away from your eyes.
 
Something's not right. Not a single slide showing pure rasterization performance of the new GPUs vs. the 40 series. I wonder why? :rolleyes: Is it because it sucks, as predicted from the shader counts and bus-width numbers (with the exception of the 5090)?
And this 4:1 frame generation: if it's anything like Nvidia's "optical flow" used with OpenXR in VR for reprojection, where extrapolating 30 Hz to 90 Hz mostly sucks, it's just a big gimmick to mask the poor rasterization advances in the Blackwell series.
Rasterization is dead. You just don't know it yet.

Rasterization is just a way of painting something. That's it. There is more than one way. With CUDA, Nvidia turned into an AI company, so these are AI cards, period. However, AI is another way to paint a game if you want to. As Nvidia is the market leader, they are going to drag the entire industry to this. In the future, AI and all these tricks will be how you render a game, and rasterization will be as dead as the horse and buggy. You don't have a choice in this; it's already happening. Rasterization is on the way out and will be gone.

Once that's done even the engine and other things are going to move to AI. You just don't realize it yet. Everything is moving to AI and the cloud and PC gamers still have their heads in the sand about what's been going on for years now even though the companies involved have been talking about it openly.
 
Rasterization is dead. You just don't know it yet.

Rasterization is just a way of painting something. That's it. There is more than one way. With CUDA, Nvidia turned into an AI company, so these are AI cards, period. However, AI is another way to paint a game if you want to. As Nvidia is the market leader, they are going to drag the entire industry to this. In the future, AI and all these tricks will be how you render a game, and rasterization will be as dead as the horse and buggy. You don't have a choice in this; it's already happening. Rasterization is on the way out and will be gone.

Once that's done even the engine and other things are going to move to AI. You just don't realize it yet. Everything is moving to AI and the cloud and PC gamers still have their heads in the sand about what's been going on for years now even though the companies involved have been talking about it openly.
I'll believe it when I see it. Sure, maybe 10 or 15 years down the road rasterization will be dead; until then, I'm still gonna buy GPUs based on how fast they can rasterize. Software companies don't move as fast as hardware devs want them to, if at all. It took 10 years for Nvidia to push CUDA into general software. We've had hyperthreading in hardware for ages, and it's only today that parallel computing has really been implemented to some extent, and in most cases still poorly, I may add.
 
I'll believe it when I see it. Sure, maybe 10 or 15 years down the road rasterization will be dead; until then, I'm still gonna buy GPUs based on how fast they can rasterize. Software companies don't move as fast as hardware devs want them to, if at all. It took 10 years for Nvidia to push CUDA into general software. We've had hyperthreading in hardware for ages, and it's only today that parallel computing has really been implemented to some extent, and in most cases still poorly, I may add.
Then you'll get run over. There's a reason all the performance is TOPS now. You don't have a choice in this. You're seeing it now. Clinging to rasterization is like humping a dead pig now. Sure you can do it, but it doesn't mean you aren't humping a dead pig.
 
Then you'll get run over. There's a reason all the performance is TOPS now. You don't have a choice in this. You're seeing it now. Clinging to rasterization is like humping a dead pig now. Sure you can do it, but it doesn't mean you aren't humping a dead pig.
I've been in this hobby since the Amiga/Commodore 64 days and I'm still here. New tech comes and goes, lots of hype, some changes from time to time once the dust settles as the game goes on. No need to rush.
 
I've been in this hobby since the Amiga/Commodore 64 days and I'm still here. New tech comes and goes, lots of hype, some changes from time to time once the dust settles as the game goes on. No need to rush.
Same, and I remember when people threw fits about Steam and digital distribution, and here we are. Nvidia is an AI company; they keep saying it. These are AI cards that are moving more and more of painting the game over to AI. The industry is following them. You may not like it, but rasterization is dead. The move is already in process. And gamers do not get a vote in it. The only vote is to stop gaming on the PC. Do you game on PC? Then you're voting for AI, no rasterization, and cloud gaming.
 
Before you guys make stupid comments about latency, the latency is actually the same (Nvidia has a video comparing DLSS FG vs Multi-Frame DLSS FG). They also just released Reflex 2 which further reduces latency by 50%.
So more generated frames but LOWER latency at the same time.

Bold choice to take Nvidia's marketing at face value given it's common knowledge that their numbers are always fanciful.

Wait for reviews, period.
 
i like it, i like it a lot… but,
where's the declaration that the variable DLSS4 is a boolean?… and the declaration that gpu.modelNumber is an int variable? LOL,

/possibly-wrong-have-not-programmed-in-20-years… so i am not sure what happens when variables are not declared in the latest "fancy" programming languages… maybe AI compilers will clean that up… (lol)
but in any case, i like it when people can refactor code into something shorter… it means that they can debug code, which is more important than writing code, IMO.
You'll have to direct those questions to the OP, because the only thing I did was a very little (and off-topic) refactoring based on the assumption that the original code was written in a sane language. By sane I mean one where true is true and not something like #define true whatever. You see, as a regular reader of The Daily WTF, I have Opinions(TM) regarding the best ways to write code. :laugh:
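
For anyone who hasn't seen that particular horror, a minimal and entirely hypothetical C example of the kind of thing being ruled out; this is not from the original snippet, just the classic Daily WTF pattern:

Code:
#include <stdio.h>

/* The forbidden move (pre-C23 C, without <stdbool.h>): redefine truth itself */
#define true 0

int main(void)
{
    int DLSS4 = true;   /* reads as "enabled", is actually 0 */

    if (DLSS4)
        printf("DLSS4 enabled\n");
    else
        printf("DLSS4 is 'true'... and yet disabled\n"); /* this branch runs */
    return 0;
}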
 
Same, and I remember when people threw fits about Steam and digital distribution, and here we are. Nvidia is an AI company; they keep saying it. These are AI cards that are moving more and more of painting the game over to AI. The industry is following them. You may not like it, but rasterization is dead. The move is already in process. And gamers do not get a vote in it. The only vote is to stop gaming on the PC. Do you game on PC? Then you're voting for AI, no rasterization, and cloud gaming.
Cloud gaming's gonna be a thing when latency, bandwidth, and stability aren't problems anymore. I can see most PC and console gamers move to the cloud in 10 or 20 years' time, but not just yet. The proof is MSFS 2024, a nearly 100% cloud-based game. If a flight simulator on Intel servers can't stream fast and stably enough, we're still far away from achieving satisfactory results for the spoiled PC gaming crowd. I believe it will be a combo of streaming and locally stored data when it comes to open-world games for some time, before moving everything into the cloud.
 
Yep, called it many months ago: a gazillion interpolated frames. Screw it, just modify the driver so that it always reports 99999999 FPS. Why keep doing this? That's the end game anyway.

Might as well just have the AI completely simulate the game at that point, no need to buy or install it. According to Nvidia you can have your AI simulated game upscaled with AI enabled DLSS with AI FG, AI textures, AI Animations, AI compression, AI, AI, and more AI. All the while the pasta you are eating was designed by AI, manufactured by AI, the grain grown and picked by AI, and even the factory operation optimized by AI. AI can be used to teach other AI and AI can be used to check the quality of work done by AI. That's the future Nvidia is pushing. I have to wonder where the humans come into the equation. What they are describing are AIs to replace humans, not supplement them. The highly specialized agents are designed to replace professionals. All this tech costs money of course and seeing as the rich are not the generous type I have a pretty good idea of who the primary beneficiaries are.
 
The difference in detail between the new transformer-based models and the old CNN-based models seems huge.

Old vs New:
[Screenshot attachments: old CNN model vs. new transformer model comparison]

At 4:26 he says that the new models require four times more compute during inference. Inference takes only a small part of the whole frame time, so the final performance impact won't be nearly so dramatic.

Interested to see a review of how the quality/performance of the new models compares to the old models.

If the second image is the upscaled version, it's very oversharpened, the aliasing is far worse, and "details" that don't exist are being added. Easily a worse end result.
 
Cloud gaming's gonna be a thing when latency, bandwidth, and stability aren't problems anymore. I can see most PC and console gamers move to the cloud in 10 or 20 years' time, but not just yet. The proof is MSFS 2024, a nearly 100% cloud-based game. If a flight simulator on Intel servers can't stream fast and stably enough, we're still far away from achieving satisfactory results for the spoiled PC gaming crowd. I believe it will be a combo of streaming and locally stored data when it comes to open-world games for some time, before moving everything into the cloud.
The catch is what the PC gaming crowd wants doesn't matter one damn bit. PC crowd wants raster cards and free nvidia. The 5090 is an AI, ML, DL, NN card you can run games on and priced as such because gamers do not matter. All the other cards are now AI cards as well because gamers do not matter. Rasterization is being tossed out now for AI and other ways to render because gamers do not matter.

Gamers can toddler stomp footsies in the corner all they want and it doesn't change squat.

Something's not right. Not a single slide showing pure rasterization performance of the new GPUs vs. the 40 series. I wonder why? :rolleyes: Is it because it sucks, as predicted from the shader counts and bus-width numbers (with the exception of the 5090)?
And this 4:1 frame generation: if it's anything like Nvidia's "optical flow" used with OpenXR in VR for reprojection, where extrapolating 30 Hz to 90 Hz mostly sucks, it's just a big gimmick to mask the poor rasterization advances in the Blackwell series.
Let me put it this way. You yourself see it happening here. And then you spin around and say it won't happen and rasterization will stay. You see it with your own eyes and talk about it and then deny reality because you don't want it to be true. But that's the issue. It is true. And if you want to game on your PC you have to eat it now. And if you don't want to eat it you have to get off the PC. In the end, nvidia won and did exactly what they have been telling you they would do, were doing, and now did do!
 
The catch is what the PC gaming crowd wants doesn't matter one damn bit. PC crowd wants raster cards and free nvidia. The 5090 is an AI, ML, DL, NN card you can run games on and priced as such because gamers do not matter. All the other cards are now AI cards as well because gamers do not matter. Rasterization is being tossed out now for AI and other ways to render because gamers do not matter.

Gamers can toddler stomp footsies in the corner all they want and it doesn't change squat.


Let me put it this way. You yourself see it happening here. And then you spin around and say it won't happen and rasterization will stay. You see it with your own eyes and talk about it and then deny reality because you don't want it to be true. But that's the issue. It is true. And if you want to game on your PC you have to eat it now. And if you don't want to eat it you have to get off the PC. In the end, nvidia won and did exactly what they have been telling you they would do, were doing, and now did do!

This is an interesting way to state that you don't understand that rasterization or ray tracing has to exist for AI/frame gen/upscalers to do what they do. A game has to be rendered via rasterization or ray/path tracing in order to interpolate additional frames. And seeing as no card is truly capable of ray/path tracing in real time without the assistance of denoisers and upscalers, rasterization is going nowhere.
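
A toy sketch of that dependency (nothing like NVIDIA's actual optical-flow/AI pipeline, just a naive per-pixel blend to make the point): the generated frame is derived entirely from frames the traditional pipeline already rendered.

Code:
#include <stdint.h>
#include <stdio.h>

/* Toy illustration: a "generated" frame can only be built from frames the
   rasterizer / ray tracer already produced. Real frame generation uses
   motion vectors and a neural network rather than a plain average, but the
   dependency on real rendered frames is the same.                          */
static void interpolate_frame(const uint8_t *rendered_a,
                              const uint8_t *rendered_b,
                              uint8_t *generated, size_t n)
{
    for (size_t i = 0; i < n; ++i)
        generated[i] = (uint8_t)(((unsigned)rendered_a[i] + rendered_b[i]) / 2);
}

int main(void)
{
    uint8_t frame_a[4] = {   0,  64, 128, 255 };  /* rendered frame N   */
    uint8_t frame_b[4] = { 255,  64,   0, 255 };  /* rendered frame N+1 */
    uint8_t generated[4];

    interpolate_frame(frame_a, frame_b, generated, 4);
    printf("generated pixel 0: %u\n", (unsigned)generated[0]);  /* 127 */
    return 0;
}

Take away the two rendered frames and there is nothing left to generate from.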
 
Might as well just have the AI completely simulate the game at that point, no need to buy or install it. According to Nvidia you can have your AI simulated game upscaled with AI enabled DLSS with AI FG, AI textures, AI Animations, AI compression, AI, AI, and more AI. All the while the pasta you are eating was designed by AI, manufactured by AI, the grain grown and picked by AI, and even the factory operation optimized by AI. AI can be used to teach other AI and AI can be used to check the quality of work done by AI. That's the future Nvidia is pushing. I have to wonder where the humans come into the equation. What they are describing are AIs to replace humans, not supplement them. The highly specialized agents are designed to replace professionals. All this tech costs money of course and seeing as the rich are not the generous type I have a pretty good idea of who the primary beneficiaries are.
The end game is games designed by prompt and then created in AI, with an engine done by AI, then rendered by AI and served from the cloud. The cloud part is further out. Games are already being rendered more and more by AI. Parts of games are already being designed by AI. This isn't the distant future; parts of it are already in place, and it has been speeding up. People just refuse to admit it because they think they are special snowflakes because of their gaming PC, and so it won't happen, despite the fact that PC gaming is what's leading the way to this future, dragging everything else with it.
 