
Editorial NVIDIA: Image Quality for DLSS in Metro Exodus to Be Improved in Further Updates, and the Nature of the Beast

Right again. Nobody is saying it will be perfect. But they are saying it will get better.

I was simply trying to convey that, over time, things should improve. 3DMark is clearly a best-case scenario due to its static and limited FPS in the benchmark, but I don't feel it is misleading/FUD/a PR stunt. Time will tell how much improvement we will see.
 
Right again. Nobody is saying it will be perfect. But they are saying it will get better.
There is an upper bound to how good it can get though, that's my point. That's the nature of ML. Accuracy will always be the bane of DLSS because it relies on ML.
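To illustrate the point, here is a minimal sketch (my own illustration, not NVIDIA's pipeline): once a frame is rendered at a lower resolution, some detail is simply gone, and any reconstruction, learned or not, is an educated guess with a non-zero error floor.

```python
# Minimal sketch, assuming nothing about NVIDIA's actual method: take a
# detailed "ground truth" signal, keep only 1/4 of the samples (a lower-res
# render), then reconstruct. The residual is what the upscaler has to guess.
import numpy as np

x_full = np.linspace(0.0, 1.0, 256)
truth = np.sin(40 * np.pi * x_full)      # fine detail at full resolution

x_low = x_full[::4]                      # render at a quarter of the samples
low_res = truth[::4]

# Linear interpolation stands in for the upscaler; a trained network
# guesses better, but it is still guessing at discarded information.
reconstructed = np.interp(x_full, x_low, low_res)

print(f"mean reconstruction error: {np.abs(truth - reconstructed).mean():.3f}")
```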
 
Right again. Nobody is saying it will be perfect. But they are saying it will get better.

Imagine if they'd said it would get worse over time... :rolleyes:

There is one certainty: it will not be predictable and thus not quite as consistent as many people would want or expect. Keep in mind we're just talking about an alternative to AA, but it's being sold as a 'performance improvement with unnoticeable IQ loss'. Well, it's as unnoticeable as a 192kbps MP3 is lossy - sounds like absolute shit.

There is just no way this investment will ever pay off, neither for gamers nor for Nvidia. It is exactly as Aquinus says: you've got the hardware, why not utilize it. Which once again supports my stance on RTX: afterthought material. I'm not sure how many big fat warning signs people need to realize that...
 
Well, it's as unnoticeable as a 192kbps MP3 is lossy - sounds like absolute shit.
Here's the flaw in your logic. Your example depends on several things.
1. That you use the correct encoder and not one sloppily put together by nitwits.
2. That the audio being encoded has enough dynamic range to need more than 192kbps.
3. That the listener can actually hear the difference between 192kbps and 256kbps or 320kbps.
See, if you use a crap encoder it doesn't matter what bitrate you use, because it's going to sound like crap anyway. If you use a good encoder but will be playing the music back on a low-quality device or in a car, the encoder/bitrate still won't matter. And if the music is hard rock or dialog, there will be very little benefit to using much over 192kbps, because the differences will be hard to make out. If you're encoding high dynamic range audio (such as classical music), will be playing it back on high-quality equipment, and have a good encoder, then yes, you'll hear the difference between 192kbps and 320kbps.
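For what it's worth, the listening test being described is easy to set up. A rough sketch, assuming the LAME encoder is installed and `input.wav` is a stand-in for whatever source file you have:

```python
# Encode one source at several constant bitrates with LAME and compare.
# `input.wav` is a hypothetical source file; `lame` must be on the PATH.
import subprocess
from pathlib import Path

source = Path("input.wav")
for bitrate in (192, 256, 320):
    out = Path(f"encoded_{bitrate}kbps.mp3")
    # -b sets a constant bitrate in kbps. Using a well-regarded encoder
    # matters more than the number itself (point 1 in the list above).
    subprocess.run(["lame", "-b", str(bitrate), str(source), str(out)], check=True)
    print(out, out.stat().st_size, "bytes")
```

Blind-comparing the three outputs on your own playback gear settles points 2 and 3 for your ears rather than anyone else's.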

There is just no way this investment will ever pay off, neither for gamers nor for Nvidia. It is exactly as Aquinus says: you've got the hardware, why not utilize it. Which once again supports my stance on RTX: afterthought material. I'm not sure how many big fat warning signs people need to realize that...
My point is: the reason your logic concerning RTX is flawed and incorrect is that you fail to see the points that count and matter. RTRT is going to replace the currently used imperfect and unrealistic lighting/rendering methods because ray-tracing is far more realistic. It's also got decades of proven use and learning behind it. The current lighting methods have reached the limit of what they can do and how far they can be taken before they literally become a form of ray-tracing. RTRT is the natural and logical progression development needs to take. The nay-saying you and others are continuing to regurgitate falls flat on its face because it's based on feelings instead of fact and merit-based logic. RTRT is the future; the writing is on the wall. NVidia knows it, AMD knows it, Intel knows it, why don't you?
 
Here's the flaw in your logic. Your example depends on several things.
1. That you use the correct encoder and not one sloppily put together by nitwits.
2. That the audio being encoded has enough dynamic range to need more than 192kbps.
3. That the listener can actually hear the difference between 192kbps and 256kbps or 320kbps.
See, if you use a crap encoder it doesn't matter what bitrate you use, because it's going to sound like crap anyway. If you use a good encoder but will be playing the music back on a low-quality device or in a car, the encoder/bitrate still won't matter. And if the music is hard rock or dialog, there will be very little benefit to using much over 192kbps, because the differences will be hard to make out. If you're encoding high dynamic range audio (such as classical music), will be playing it back on high-quality equipment, and have a good encoder, then yes, you'll hear the difference between 192kbps and 320kbps.


My point is: the reason your logic concerning RTX is flawed and incorrect is that you fail to see the points that count and matter. RTRT is going to replace the currently used imperfect and unrealistic lighting/rendering methods because ray-tracing is far more realistic. It's also got decades of proven use and learning behind it. The current lighting methods have reached the limit of what they can do and how far they can be taken before they literally become a form of ray-tracing. RTRT is the natural and logical progression development needs to take. The nay-saying you and others are continuing to regurgitate falls flat on its face because it's based on feelings instead of fact and merit-based logic. RTRT is the future; the writing is on the wall. NVidia knows it, AMD knows it, Intel knows it, why don't you?

Ah, the merit based logic argument.

My logic is the market, and so far it's not moving, despite Nvidia's constant push. There is no buzz. People don't give a shit. And neither do I. Games look fine, and games with RTRT are absolutely not objectively better looking; in fact, in terms of playability they are occasionally objectively worse. AMD knows it? Nah, AMD is waiting for the dust to settle - a very wise choice, albeit one out of pure necessity. Don't mistake a push from the industry for popularity. There have been many innovations that simply didn't get picked up and are now, at the very best, a niche, if they even still exist. You can look at VR as a recent example.

I understand your stance and I recognize it; it's the same enthusiasm as we saw with VR. 'This is the future of gaming,' people said. Both VR and RTRT are technologies that require a very high penetration rate to actually gain traction, because the initial expense is high and the competition (regular games) is extremely strong and can make a competitive product at a much lower price.

Don't mistake 'points that count and matter' - to you - as points that are applicable to everyone. The market determines what technologies live or die, and we all represent an equal, tiny portion of that market.
 
Right, yep. I get it. Makes sense to use the hardware they put on the card.

I was simply trying to convey that, over time, things should improve. 3DMark is clearly a best-case scenario due to its static and limited FPS in the benchmark, but I don't feel it is misleading/FUD/a PR stunt. Time will tell how much improvement we will see.

I still think there is something else to it. People have posted comparisons of Port Royal on Reddit (ew). The original non-TAA, non-DLSS textures are much sharper and have greater detail, but somehow the modeling and the fit look off - overly grainy, and something is up with the intersections of models and textures that benefits from the blur when TAA or DLSS is applied. Without TAA or DLSS, the edges and intersections seem reflective.

[Port Royal screenshot comparisons]

The vaseline effect is there, along with the loss of texture detail that the games are experiencing in FF, BFV, and Metro.
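If anyone wants to go beyond eyeballing the screenshots, a quick way to put a number on the blur is the variance of the Laplacian, which drops as an image gets softer. A sketch, assuming two hypothetical captures saved from the benchmark:

```python
# Variance of the Laplacian as a rough sharpness score: blurrier images
# have fewer strong edges, so the score drops. File names are hypothetical.
import cv2

def sharpness(path: str) -> float:
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    return cv2.Laplacian(img, cv2.CV_64F).var()

print("native:", sharpness("port_royal_native.png"))
print("dlss:  ", sharpness("port_royal_dlss.png"))
# A clearly lower score for the DLSS capture would support the
# texture-detail complaint; similar scores would not.
```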
 
Something else like what?

If I hadn't had a 9900K die on me and spent today testing through 3 boards and endless configs, I would test it.
 
Something else like what?

If I hadn't had a 9900K die on me and spent today testing through 3 boards and endless configs, I would test it.

Aside from the vaseline effect and texture loss, I would call that the norm for how DLSS behaves.

The something else (unless I'm missing the explanation) is a kind of edge reflection and grain you don't see in any of the games. It's like the RT roughness setting is highlighted on the edges, causing them to be reflective. Maybe that's the intent, or a by-product of RT object roughness and reflectivity, but it just looks off.
 
No clue, bud. All I know is that TPU wouldn't intentionally do anything like what was referred to earlier.
 
No clue, bud. All I know is that TPU wouldn't intentionally do anything like what was referred to earlier.

Not saying they did. They only showed the promo stuff, TAA vs DLSS. No one's done much of a comparison outside of that.

[screenshot comparison images]
 
Exodus received a patch today. One thing it addressed is DLSS fixes and improvements to sharpness. I don't have the game, so I can't say if it made much of a difference yet.
 
AMD knows it? Nah, AMD is waiting for the dust to settle - a very wise choice, albeit one out of pure necessity.
Or the reason Navi is taking so long is that they are working to add RTRT to it.
I understand your stance and I recognize it; it's the same enthusiasm as we saw with VR.
No it isn't. Not by a long shot. RTRT is something that can improve a gaming experience right away and without cumbersome headgear. VR might someday gain traction if they can make it less cumbersome to use and make a "killer app" for it. Raytracing is already a well-used and proven technology, and now it can be done in real time. RTRT is still in its "growing pains" stage and will continue to improve.
'This is the future of gaming' very few people said.
Fixed that for ya.
Both VR and RTRT are technologies that require a very high penetration rate to actually gain traction
ALL new tech requires market adoption to succeed. Business 101. Hardly a startling revelation.
Don't mistake 'points that count and matter' - to you - as points that are applicable to everyone.
Ah, but they are. Every time you go to the movies or watch TV, in one form or another, the good SFX you see are ray-traced. The points that count and matter already affect everyone, whether they know it or not. Now it has come to gaming in real time. Arguing against it is a waste of time and energy. The efforts made by you and others nay-saying RTRT are tantamount to trying to stop a tsunami with a foam cup. You can whine and complain all you want, but you cannot and will not stop it.

Raytracing is the past, present, and future. RTRT is the present and future. Why? Because it mimics nature and looks great when done right. And for similar reasons, DLSS is going to be the future of pixel-edge blending. Anti-aliasing was good for its time, but its time has passed.
 
Or the reason Navi is taking so long is that they are working to add RTRT to it.

No it isn't. Not by a long shot. RTRT is something that can improve a gaming experience right away and without cumbersome headgear. VR might someday gain traction if they can make it less cumbersome to use and make a "killer app" for it. Raytracing is already a well-used and proven technology, and now it can be done in real time. RTRT is still in its "growing pains" stage and will continue to improve.

Fixed that for ya.

ALL new tech requires market adoption to succeed. Business 101. Hardly a startling revelation.

Ah, but they are. Every time you go to the movies or watch TV, in one form or another, the good SFX you see are ray-traced. The points that count and matter already affect everyone, whether they know it or not. Now it has come to gaming in real time. Arguing against it is a waste of time and energy. The efforts made by you and others nay-saying RTRT are tantamount to trying to stop a tsunami with a foam cup. You can whine and complain all you want, but you cannot and will not stop it.

Raytracing is the past, present, and future. RTRT is the present and future. Why? Because it mimics nature and looks great when done right. And for similar reasons, DLSS is going to be the future of pixel-edge blending. Anti-aliasing was good for its time, but its time has passed.

It's really simple: seeing is believing. Not seeing all that much yet. VR had a very similar problem: content & killer apps.

Well, check this thread on resetera; there are some comparison pics after the patch. The Vaseline effect is gone, so at least it's more usable now. They still have to fix HDR and some nasty water shimmering.

It's a sharpening filter and creates different artifacts. Not really much to write home about...
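For context on why a sharpening pass produces its own artifacts instead of restoring detail, here is a generic unsharp-mask sketch (my illustration, not Metro's actual post-process): it amplifies the difference between the image and a blurred copy, which exaggerates existing edges and can introduce halos and grain, but cannot bring back information the upscale lost.

```python
# Generic unsharp mask: sharpened = image + amount * (image - blurred).
# This illustrates sharpening in general, not Metro's actual shader.
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(img: np.ndarray, sigma: float = 1.5, amount: float = 0.8) -> np.ndarray:
    img_f = img.astype(np.float64)
    blurred = gaussian_filter(img_f, sigma=sigma)
    sharpened = img_f + amount * (img_f - blurred)   # boost local contrast at edges
    return np.clip(sharpened, 0, 255).astype(np.uint8)

# Hypothetical usage on a grayscale frame; a random array stands in for a screenshot.
frame = (np.random.rand(1080, 1920) * 255).astype(np.uint8)
print(unsharp_mask(frame).dtype, unsharp_mask(frame).shape)
```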
 
It's really simple: seeing is believing.
Yes, and what we've seen so far is impressive and cool.
Not seeing all that much yet.
Gotta give it time. The market didn't see benefits from Intel's MMX and SSE tech for over a year. AMD's 3DNow took about the same time. Developers need time to utilize the new tech, to refine it and make it shine. Cars don't instantly clean, buff, and wax themselves. It takes time and hard work, even when you have tools. The nay-sayers need to get off their "instant gratification" high horse and be patient.
 
Yes, and what we've seen so far is impressive and cool.

Gotta give it time. The market didn't see benefits from Intel's MMX and SSE tech for over a year. AMD's 3DNow took about the same time. Developers need time to utilize the new tech, to refine it and make it shine. Cars don't instantly clean, buff, and wax themselves. It takes time and hard work, even when you have tools. The nay-sayers need to get off their "instant gratification" high horse and be patient.

That is just it. I'm very patient, and I think it's too early. That is why I am not buying into Turing.

I still think there is something else to it. People have posted comparisons of Port Royal on Reddit (ew). The original non-TAA, non-DLSS textures are much sharper and have greater detail, but somehow the modeling and the fit look off - overly grainy, and something is up with the intersections of models and textures that benefits from the blur when TAA or DLSS is applied. Without TAA or DLSS, the edges and intersections seem reflective.

[Port Royal screenshot comparisons]

The vaseline effect is there, along with the loss of texture detail that the games are experiencing in FF, BFV, and Metro.

Wow, I can notice the difference even at this picture size...
 
Yes, and what we've seen so far is impressive and cool.

Gotta give it time. The market didn't see benefits from Intel's MMX and SSE tech for over a year. AMD's 3DNow took about the same time. Developers need time to utilize the new tech, to refine it and make it shine. Cars don't instantly clean, buff, and wax themselves. It takes time and hard work, even when you have tools. The nay-sayers need to get off their "instant gratification" high horse and be patient.


Shitty performance that can be matched with better visuals at 60-70% rendering is impressive and cool? My "instant gratification" didn't happen for other tech like AF until we had perfect AF that now costs essentially nothing, thanks to it being slowly integrated into the hardware pipeline with perfect angle-independent performance. This class of Turing card will never offer that same level of "AA" by nature of it being new and highly untested, and low market adoption makes it the PhysX card of the year. It's the equal of a houseboat: sure, it's a house, it's a boat, but it does neither very well and altogether someone is all wet.
 
There is just no way this investment will ever pay off, neither for gamers nor for Nvidia.
It matters for things like game consoles and mobile gaming, because you want to squeeze every bit of performance out of them just to claim things like "runs games at 4K". The reality is that it's just "smart" resolution scaling, using the tensor cores to scale it more realistically.

I want to make it clear that I don't think that this is bad, it's just disingenuous to call it in any way anti-aliasing, because it's not. It's really just smart scaling. It's the opposite of nVidia's DSR.
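The "opposite of DSR" framing is easy to see in the pixel arithmetic. A minimal sketch, with resolutions chosen as illustrative assumptions rather than published figures:

```python
# DSR renders above the display resolution and downscales for quality;
# DLSS-style scaling renders below it and reconstructs upward for speed.
# Resolutions here are illustrative assumptions, not published figures.
def pixels(width: int, height: int) -> int:
    return width * height

native = pixels(3840, 2160)                 # 4K display target

dsr_render = pixels(3840 * 2, 2160 * 2)     # DSR: render 4x the pixels, then downscale
dlss_render = pixels(2560, 1440)            # DLSS-style: render ~44% of the pixels, then upscale

print(f"DSR renders  {dsr_render / native:.2f}x the native pixel count")
print(f"DLSS renders {dlss_render / native:.2f}x the native pixel count")
# One spends extra work to gain quality; the other saves work and leans on
# reconstruction to justify the "runs games at 4K" claim.
```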
 
It matters for things like game consoles and mobile gaming, because you want to squeeze every bit of performance out of them just to claim things like "runs games at 4K". The reality is that it's just "smart" resolution scaling, using the tensor cores to scale it more realistically.

I want to make it clear that I don't think that this is bad, it's just disingenuous to call it in any way anti-aliasing, because it's not. It's really just smart scaling. It's the opposite of nVidia's DSR.

Fully agreed.
 
Shitty performance that can be matched with better visuals at 60-70% rendering is impressive and cool?
Have you actually seen the effect in question? 80-90fps is not "Shitty". BF5 is not that great a game, but it does show what can be done, to great effect, with raytracing. Maybe you're not seeing the subtleties?
My "Instant Gratification" didn't happen for other tech like AF until we had perfect AF that now costs essentially nothing due to it being slowly integrated into the hardware pipeline with perfect angle independent performance for example
But that illustrates perfectly the point I was making, AF took time to perfect. That doesn't mean early implementations were perfect, but they were good enough to show the benefit on offer.
this class of Turing card will never offer that same level of "AA" by nature of it being new
That is not logical as stated.
and low market adoption makes it the PhysX card of the year
Except that 2080 Tis are still selling out and the RTX series is one of the fastest-selling lines of GPUs in history. And "PhysX" functionality is literally built into every GPU NVidia makes, making it one of the most adopted feature sets in the world.
 
Have you actually seen the effect in question? 80-90fps is not "Shitty". BF5 is not that great a game, but it does show what can be done, to great effect, with raytracing. Maybe you're not seeing the subtleties?

But that illustrates perfectly the point I was making, AF took time to perfect. That doesn't mean early implementations were perfect, but they were good enough to show the benefit on offer.

That is not logical as stated.

Except that 2080 Tis are still selling out and the RTX series is one of the fastest-selling lines of GPUs in history. And "PhysX" functionality is literally built into every GPU NVidia makes, making it one of the most adopted feature sets in the world.

Did we miss something you saw?... Just asking, since Nvidia's gaming revenue was down 46% since the intro of RTX. Unless we all missed something in their financial report.

The CEO is telling us it will take 3x longer to sell the cards, and here you're saying they are the fastest sellers in Nvidia's GPU history.
 
The CEO is telling us it will take 3x longer to sell the cards, and here you're saying they are the fastest sellers in Nvidia's GPU history.
Can you remember a time when cards were sold out consistently for a 6-month period? That's never happened before, IIRC, even with the GTX 1000-series cards. I have done more upgrades to RTX this generation than I've ever done before, with the exception of the release of the Radeon 9700. Now maybe, just maybe, it's something weird going on in my part of the country. I doubt it, but maybe.
 
Can you remember a time when cards were sold out consistently for a 6-month period? That's never happened before, IIRC, even with the GTX 1000-series cards. I have done more upgrades to RTX this generation than I've ever done before, with the exception of the release of the Radeon 9700. Now maybe, just maybe, it's something weird going on in my part of the country. I doubt it, but maybe.

Well, the CEO shouldn't lie unless he wants to get sued by the SEC and possibly be booted out of his own company. It's hard to believe someone who makes a statement contradicting what we just heard in a financial report.
 