
AMD Doesn't Believe in NVIDIA's DLSS, Stands for Open SMAA and TAA Solutions

What do you mean there are fewer cores doing the rasterization - that without RT cores they could have fit more shaders onto the chip, or something else?
RT is processed concurrently with rasterization work. There are some prerequisites - generally the G-buffer - but largely it happens at the same time.
Are you sure it happens at the same time? You need the objects to be there before the ray tracing can process them. Maybe it works alongside, but there must be a gap between the rasterization and the ray tracing. How can you ray trace an object and add reflections, shadows etc. when the object isn't there yet?
I think yes. Without RT cores there would be more room for rasterization hardware (unless the RT cores do the rasterization as well? I don't think so). I'm not so sure now, since my statement surprised you, although it makes sense: RT cores take up die space, so without them the 2080 Ti could have been faster at rasterization.
 
The guy in the video gets some things wrong.
- DXR is not limited to reflections, shadows and AO. Nvidia simply provides more-or-less complete solutions for these three. You are able to do any other RT effect you want with DXR. While not DX12 - and not DXR - the ray-traced Q2 on Vulkan clearly shows that full-on raytracing can be done somewhat easily.
- While RTX/DXR are named after ray tracing, they also accelerate path tracing.
- OctaneRender is awesome but does not perform miracles. Where it is implemented, Turing does speed up non-realtime path-/raytracing software by at least 2x, often 3-5x, compared to Pascal. And these are early implementations.

Are you sure it happens at the same time? You need the objects to be there before the ray tracing can process them. Maybe it works alongside, but there must be a gap between the rasterization and the ray tracing. How can you ray trace an object and add reflections, shadows etc. when the object isn't there yet?
Nvidia as well as various developers have said RT can start right after generating the G-buffer. Optimizing that start point was one of the bigger boosts in the BFV patch.
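To picture what "starts after the G-buffer, then overlaps" means, here is a toy CPU-side sketch using std::async - this is not real D3D12/DXR code, and the pass names are made up - where the ray-tracing work and the remaining raster work run at the same time once the G-buffer dependency is met:

```cpp
// Toy illustration only (not real D3D12/DXR code): once the G-buffer pass has
// finished, ray-tracing work and the remaining raster work can overlap in time
// instead of running strictly one after the other.
// Build with -pthread on gcc/clang.
#include <chrono>
#include <cstdio>
#include <future>
#include <thread>

static void sleep_ms(int ms) { std::this_thread::sleep_for(std::chrono::milliseconds(ms)); }

// Hypothetical stand-ins for GPU passes.
static void gbuffer_pass()    { sleep_ms(4); std::puts("G-buffer ready"); }
static void raytrace_pass()   { sleep_ms(6); std::puts("RT reflections/shadows done"); }
static void raster_lighting() { sleep_ms(6); std::puts("raster lighting/post done"); }

int main() {
    gbuffer_pass();  // prerequisite: the rays need the G-buffer's positions/normals

    // From here on, both kinds of work run concurrently rather than
    // ray tracing waiting for all rasterization to finish first.
    auto rt = std::async(std::launch::async, raytrace_pass);
    raster_lighting();
    rt.wait();

    std::puts("frame composited");
}
```

On real hardware the scheduling is of course done by the GPU and driver; the sketch only shows the dependency shape being described above.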
 
If the RT cores are doing the ray tracing, and that's how it has been shown, then there are fewer cores doing the rasterization.
The RT cores accelerate ray intersection calculations, but the RT cores themselves are not rendering a complete image.
Rendering is a pipelined process; you have the various stages of geometry processing (vertex shaders, geometry shaders and tessellation shaders), then fragment ("pixel shader" in Direct3D terms) processing and finally post-processing. In rasterized rendering, the rasterization itself is technically only the transition between geometry and fragment shading; it converts vertices from 3D space to 2D space and performs depth sorting and culling, before the fragment shader starts putting in textures etc.
In fully raytraced rendering, all the geometry work will still be the same, but the rasterization step between geometry and fragments is gone, and the fragment processing has to be rewritten to interface with the RT cores. All the existing hardware is still needed, except for the tiny part which does the rasterization. So all the "cores"/shader processors, TMUs etc. are still used during raytracing (a rough sketch of that split is below).

So in conclusion, he is 100% wrong in claiming a "true" raytraced GPU wouldn't be able to do rasterization. He thinks the RT cores do the entire rendering, which of course is completely wrong. The next generations of GPUs will continue to increase the number of "shader processors"/cores, as they are still very much needed for both rasterization and raytracing, and it's not a legacy thing like he claims.
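To make that split concrete, here is a rough, self-contained C++ sketch (the function names and values are mine, purely for illustration) of the kind of arithmetic the RT cores are dedicated to: testing a ray against a triangle, Möller-Trumbore style. Traversing the BVH and running enormous numbers of these tests is what the fixed-function RT units take over; deciding what colour the hit point ends up as is still ordinary shader-core work.

```cpp
// Rough illustration of the work RT cores accelerate: testing a ray against a
// triangle (Möller-Trumbore). BVH traversal plus millions of these tests per
// frame is what the fixed-function RT units handle; shading the hit point
// still runs on the regular shader cores.
#include <array>
#include <cmath>
#include <cstdio>
#include <optional>

using Vec3 = std::array<float, 3>;

static Vec3 sub(Vec3 a, Vec3 b)   { return {a[0]-b[0], a[1]-b[1], a[2]-b[2]}; }
static Vec3 cross(Vec3 a, Vec3 b) { return {a[1]*b[2]-a[2]*b[1], a[2]*b[0]-a[0]*b[2], a[0]*b[1]-a[1]*b[0]}; }
static float dot(Vec3 a, Vec3 b)  { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }

// Returns the distance t along the ray if it hits the triangle (v0, v1, v2).
static std::optional<float> intersect(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2) {
    const Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    const Vec3 p  = cross(dir, e2);
    const float det = dot(e1, p);
    if (std::fabs(det) < 1e-8f) return std::nullopt;   // ray parallel to triangle
    const float inv = 1.0f / det;
    const Vec3 tv = sub(orig, v0);
    const float u = dot(tv, p) * inv;
    if (u < 0.0f || u > 1.0f) return std::nullopt;
    const Vec3 q = cross(tv, e1);
    const float v = dot(dir, q) * inv;
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    const float t = dot(e2, q) * inv;
    return t > 0.0f ? std::optional<float>(t) : std::nullopt;
}

int main() {
    // A ray straight down the -Z axis against a triangle facing the camera.
    auto hit = intersect({0, 0, 0}, {0, 0, -1}, {-1, -1, -2}, {1, -1, -2}, {0, 1, -2});
    if (hit) std::printf("hit at t = %.2f -> shading that point is normal shader work\n", *hit);
}
```

The point is only that the intersection test is one narrow, well-defined piece of the frame; everything before and after it still runs on the regular shader hardware.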
 

I understand how the rasterization process works, and since it works in a pipeline, one thing is done after another. If you add ray tracing to the pipeline it takes longer, so the speed drops. That's my point.
A "true" raytraced GPU? You mean one full of RT cores only?
 
Once again, I refer you back to the quote from the video. He was talking as if the RT cores are the only thing used during raytracing and the rest is only used during rasterized rendering, which is not true at all.

Think of it more like this: the RT cores accelerate one specific task, kind of like the AVX extensions to x86. Code using AVX will not be AVX-only, and while AVX will be doing a lot of the "heavy lifting", the majority of the code will still be "normal" x86 code.
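To illustrate that analogy with something compilable (a generic example, nothing Nvidia-specific; build with e.g. -mavx on an AVX-capable CPU): the intrinsics below do the bulk arithmetic eight floats at a time, but the loop control, pointer math and the leftover tail are plain x86 code around them.

```cpp
// The AVX analogy in code: the intrinsics do the "heavy lifting" eight floats
// per instruction, but the surrounding loop control, pointer math and the
// scalar tail are ordinary x86. Build with e.g. g++ -mavx avx_demo.cpp
#include <immintrin.h>
#include <cstddef>
#include <cstdio>
#include <vector>

static void add_arrays(const float* a, const float* b, float* out, std::size_t n) {
    std::size_t i = 0;
    for (; i + 8 <= n; i += 8) {                        // AVX path: 8 elements per iteration
        __m256 va = _mm256_loadu_ps(a + i);
        __m256 vb = _mm256_loadu_ps(b + i);
        _mm256_storeu_ps(out + i, _mm256_add_ps(va, vb));
    }
    for (; i < n; ++i) out[i] = a[i] + b[i];            // plain scalar x86 for the remainder
}

int main() {
    std::vector<float> a(19, 1.5f), b(19, 2.0f), out(19);
    add_arrays(a.data(), b.data(), out.data(), out.size());
    std::printf("out[0] = %.1f, out[18] = %.1f\n", out[0], out[18]);  // both 3.5
}
```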
 
I understand how the rasterization process works, and since it works in a pipeline, one thing is done after another. If you add ray tracing to the pipeline it takes longer, so the speed drops. That's my point.
RT Cores work concurrently.
Wasn't this slide already posted in one of the threads?
 
Why does TPU news often sound as if it was written by NV's CEO?

DLSS actually looks sharper
An algorithm that sharpens looks sharper.
Sounds logical.
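For what it's worth, "looks sharper" is cheap to manufacture after the fact. Here's a toy, made-up example (nothing to do with Nvidia's actual pipeline) of a plain 3x3 sharpen kernel: run over any upscaled image, it pushes values apart at edges and so reads as "sharper" whether or not real detail was reconstructed.

```cpp
// Toy 3x3 sharpen pass on a single-channel image: output = 5*center minus the
// four neighbours. Any upscaler followed by a pass like this will "look
// sharper", whether or not extra detail was actually reconstructed.
#include <algorithm>
#include <cstdio>
#include <vector>

static std::vector<float> sharpen(const std::vector<float>& img, int w, int h) {
    std::vector<float> out(img);
    for (int y = 1; y < h - 1; ++y)
        for (int x = 1; x < w - 1; ++x) {
            float v = 5.0f * img[y*w + x]
                    - img[(y-1)*w + x] - img[(y+1)*w + x]
                    - img[y*w + x - 1] - img[y*w + x + 1];
            out[y*w + x] = std::clamp(v, 0.0f, 1.0f);
        }
    return out;
}

int main() {
    // A soft vertical edge: 0.2 on the left half, 0.8 on the right.
    const int w = 6, h = 3;
    std::vector<float> img(w * h);
    for (int y = 0; y < h; ++y)
        for (int x = 0; x < w; ++x)
            img[y*w + x] = (x < w / 2) ? 0.2f : 0.8f;

    auto s = sharpen(img, w, h);
    // Values right at the edge get pushed apart -> higher perceived sharpness.
    std::printf("before: %.2f | %.2f   after: %.2f | %.2f\n",
                img[1*w + 2], img[1*w + 3], s[1*w + 2], s[1*w + 3]);
}
```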
 
Here's some more DLSS testing.

It doesn't look great. I really can't see the improvement like you guys say.
 
Heh, and yet G-Sync spawned FreeSync™.
I always understood you to be more "in tune" with technologies than to purport such a false statement.

Variable refresh rate - the VESA Adaptive-Sync standard, which is part of DisplayPort 1.2a - was always a working open standard. Nvidia just wanted to use/push their proprietary (added-hardware) implementation to jump out in front before VESA (an open coalition of monitor and GPU representatives) ultimately finalized and agreed upon a standard that requires no big additional hardware or cost, just enabling features already developed in DisplayPort 1.2a.

AMD and the industry had been working toward that open standard; Nvidia just saw it wasn't progressing to their liking and threw their weight behind licensing their proprietary "add-in" work-around to monitor manufacturers earlier than the VESA coalition was considering the roll-out. Now that those early monitor manufacturers are seeing sales of G-Sync monitors aren't as lucrative as they once were with the early-adopter community, Nvidia sees itself on the losing end and has decided to slip back into the fold, and unfortunately there are no repercussions for the lack of support they delivered to the VESA coalition.

Much like the Donald with the "birther" crap: it was "OK" to do it and spout the lies, until one day it finally could no longer pass muster and wasn't expedient to his campaign, and he declared he would not talk about it again! I think that's how Jen-Hsun Huang hopes G-Sync will just pass...
 
When G-Sync was shown, VESA Adaptive-Sync in DP 1.2a did not exist. It did exist in eDP, which AMD quickly picked up on when they wanted a VRR solution in response; FreeSync was first demoed on laptop screens. VESA Adaptive-Sync took a while to take off - a couple of years, actually, until there was a decent selection of monitors available. The industry had not considered VRR important until then.

G-Sync was announced in October 2013. G-Sync monitors started to sell early 2014.
Freesync was announced in January 2014. VESA Adaptive Sync was added to DP 1.2a in mid-2014. Freesync monitors started to sell early 2015.
 
I know it is, but it would seem that some people think it is really great. I just can't understand that. We have moved from a smooth and sharp image to a blurry one, which people call an improvement and say looks better.
RTX on or RTX off lol

The sad part is that it's still easily noticeable in the comparisons despite the 1080p YouTube quality.
I looked over a few videos on YouTube (Hardware Unboxed) where they show RTX on and off. It does improve the lighting, reflections etc., especially in Metro. It looks nicer and more realistic, though it eats a lot of resources. The effect of RTX shows up most in outside scenery, but also indoors when a lot of objects are moving and there are different light sources. In some other scenes there is no change except an FPS drop. Anyway, it's a nice feature, but when will we be playing with RTX on at 60+ FPS at 4K? I'd say there's a long way to go.
 
RTX & DLSS will surely meet their glory later on
Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor

Every Nvidia innovation is guaranteed to be a success
 
Funny enough, I assumed Nvidia would improve the driver AF levels now that more texture read requests could be issued in newer hardware iterations.
 
RTX & DLSS will surely meet their glory later on
Just like how PhysX & HairWorks are still being used today, and how everybody has a G-Sync monitor

Every Nvidia innovation is guaranteed to be a success
:roll: Oh my sides!! Thanks for the laugh. All of those things have faded away....
 
Heh yeah, all those TressFX games certainly taught them a lesson.
 
Exactly, all just flash-in-the-pan “shiny things”; both sides are equal with their attempts... My new one I’m “enjoying” is FreeSync 2 HDR in FC5, FC New Dawn and RE2. TBH it’s not that impressive or noticeable.
 

Yeah, kinda confirms open doesn't mean good or popular either; TressFX has gone nowhere. Regardless, Nvidia users get to enjoy both, plus get a choice of G-Sync and FreeSync displays today. Swings and roundabouts, eh.
 
Well that just proves G-Sync was unnecessary and the open standard works for everyone.
 

Not really. Looking at their G-Sync Compatible list, the FreeSync working ranges are more often than not awful, and the standards varied from monitor to monitor... in short, there was no standard. G-Sync displays are expensive, but at least they have to achieve minimum requirements. FreeSync took the machine-gun approach.
 
Yeah, I get that, but those monitors are the “cheap and easy” 75Hz FreeSync-range stuff where FreeSync is literally “free”. I don’t really put any stock into those.
 
Just what are you expecting out of a 60Hz LCD? 9FPS VRR? The upselling sickens me.
 
I got a 60Hz 4K FreeSync screen and couldn't be happier :) It's just great. Buying something like that with G-Sync would cost a lot more.
Now NV is joining the club with FreeSync (this "compatible" stuff is so damn funny) since they have seen they are going nowhere with the price. :)
 