
Ray Tracing and Variable-Rate Shading Design Goals for AMD RDNA2

There's "require" and then there's "require".
Look what an 11 TFLOPS GPU with no dedicated RT hardware (1080 Ti) does against a 6.5 TFLOPS RTX 2060 with an RT ASIC


You'd need about 2.5x the 1080 Ti's power to match an RTX 2060.


IMO, RT will stay an optional technology for at least a decade, but I can't see any GPU manufacturer not supporting it from mid-range cards upwards.

The alternative to buying a $400 card and enjoying ray-traced shadows/reflections in a few games is paying the same for the same performance and not having it. Really.
You do realize that Q2 RTX doesn't even use the RT cores for ray tracing that NV is so fond of? What you are doing is quoting what NV has planned all along. Nice marketing for RTX cards, stating that you need RT cores. Since NV is asking so much for RTX, it would have been stupid if Quake 2 worked well on a 1080 Ti, now would it? The price needs to be justified, and you fall for it.
Do you know why I know this?
Because https://www.cryengine.com/news/view...-time-ray-tracing-demonstration-for-cryengine
doesn't need RT cores to get this done, works with sufficient performance, and it is ray tracing just as in Quake 2. (Actually it looks even better than in Quake.) They have added RTX to make it believable that the RT cores are actually necessary, and it is also great marketing for NVidia cards with ray tracing. Cripple the driver for the 1080 Ti so that it doesn't work properly and there you have great evidence.
Did you expect NV to release RTX cards with RT cores without giving any rational reason or justification for the price, even if NV has to forge that reason, which is Quake 2 RTX?
 
AMD can cut prices; can NV do that too? With its fancy, expensive RT cores? I really doubt it.

Nvidia's margins are much higher than AMD's. They have already cut prices somewhat at the launch of the Super cards, by offering better chips for the same value.

Nvidia doesn't cut prices even more because it doesn't have to, really. They currently have about 73% market share in dedicated GPUs. It is AMD that needs to gain market share and so has to accept earning less.
 
Nvidia's margins are much higher than AMD's. They have already cut prices somewhat at the launch of the Super cards, by offering better chips for the same value.

Nvidia doesn't cut prices even more because it doesn't have to, really. They currently have about 73% market share in dedicated GPUs. It is AMD that needs to gain market share and so has to accept earning less.
Sure, but I don't see AMD actually being scared of NV's price cuts. Actually, the 5700 series is selling pretty well.
Huuh. Cut prices to offer better chips for the same value? I thought that was just the natural way new graphics card generations evolve, but I guess it has now been attributed to the goodness of NVidia.
You are missing the rest of the conversation if you say NV doesn't have to cut prices. It will have to cut prices, and you will soon see why. AMD, my dear foreign friend, doesn't need to do anything right now, and that is clearly evident today. It is NV that is running around town like a boogeyman, trying to scare people with not having RT cores.
 
You do realize that Q2 RTX doesn't even use the RT cores for Ray tracing that NV is so fond of?
Yes, it does. RT cores are why Turing is that much faster in Q2 RTX over GTX cards. The problem with Q2 RTX is different - it is not a good representation for hybrid RTRT solutions because it is not one. Q2 RTX is completely pathtraced.

Do you know why I know this?
Because https://www.cryengine.com/news/view...-time-ray-tracing-demonstration-for-cryengine
doesn't need RT cores to get this done, works with sufficient performance, and it is ray tracing just as in Quake 2.
First, it is not raytracing just as in Q2 RTX. Neon Noir is hybrid RTRT solution with only reflections being raytraced. Out of RTX games, Battlefield V is its closest analogue.
Second, Neon Noir runs at 1080p 30FPS on Vega 56 and about the same on GTX 1080. For comparison, Battlefield V with DXR on (High, not Ultra) can be run on GTX 1080 at very similar 1080p 30FPS.
 
You do realize that Q2 RTX doesn't even use the RT cores for ray tracing that NV is so fond of? What you are doing is quoting what NV has planned all along. Nice marketing for RTX cards, stating that you need RT cores. Since NV is asking so much for RTX, it would have been stupid if Quake 2 worked well on a 1080 Ti, now would it? The price needs to be justified, and you fall for it.
What? Of course it does.
Jesus, the red-based fans and their theories :rolleyes:

Because https://www.cryengine.com/news/view...-time-ray-tracing-demonstration-for-cryengine
doesn't need RT cores to get this done, works with sufficient performance, and it is ray tracing just as in Quake 2. (Actually it looks even better than in Quake.) They have added RTX to make it believable that the RT cores are actually necessary, and it is also great marketing for NVidia cards with ray tracing. Cripple the driver for the 1080 Ti so that it doesn't work properly and there you have great evidence.
Did you expect NV to release RTX cards with RT cores without giving any rational reason or justification for the price, even if NV has to forge that reason, which is Quake 2 RTX?
Neon Noir only has reflections at 1 ray per 4 pixels and it already suffers immensely. This is worse than RTX low doing 1 ray per 2 pixels in the worst-case scenario, and it's a synthetic benchmark, not a game.
lol, you're in a big bubble, sir.

Huuh. Cut prices to offer better chips for the same value? I thought that was just the natural way new graphics card generations evolve, but I guess it has now been attributed to the goodness of NVidia.
lel, just like the 5500 XT
 
Yes, it does. RT cores are why Turing is that much faster in Q2 RTX over GTX cards. The problem with Q2 RTX is different - it is not a good representation for hybrid RTRT solutions because it is not one. Q2 RTX is completely pathtraced.

First, it is not raytracing just as in Q2 RTX. Neon Noir is hybrid RTRT solution with only reflections being raytraced. Out of RTX games, Battlefield V is its closest analogue.
Second, Neon Noir runs at 1080p 30FPS on Vega 56 and about the same on GTX 1080. For comparison, Battlefield V with DXR on (High, not Ultra) can be run on GTX 1080 at very similar 1080p 30FPS.
There is no distinct difference between ray tracing and path tracing. Path tracing is supposed to be a faster form of ray tracing, and that is basically it.
what?of course it does.
Jesus the red base fans their theories :rolleyes:
No it doesn't :) That is the funny part :D You think that RT cores speed up ray tracing, and that is not the case.
Besides, I'm not a red-based fan, so quit that. Who's being a prick now? :p

Neon Noir only has reflections at 1 ray per 4 pixels and it already suffers immensely. This is worse than RTX low doing 1 ray per 2 pixels in the worst-case scenario, and it's a synthetic benchmark, not a game.
lol, you're in a big bubble, sir.
We will see who is in a big bubble (whatever that means) in time. The new engine will be available to its full extent soon, and there will definitely be games using it. This will be a good indication of what is actually needed. Those rays per pixel can be increased, you know. It is a demo showcase to show what it can do, like a CPU sample. It is not the released product, so be patient. You just don't see it yet, and if I'm supposed to be a red-based fan with theories, then you are a blind green fan without any theories or reasoning for that matter. :)
 
No it doesn't :) That is the funny part :D You think that RT cores speed up ray tracing, and that is not the case.
Absolutely. They built 750 mm² dies just to cripple the 1080 Ti in the end.

 
Absolutely. They built 750 mm² dies just to cripple the 1080 Ti in the end.
They have built it because it is a graphics company and "leather jacket" must have something to brag about and this time around it was RT cores. Let's see what he will come up with next year.
The difference in performance between the 2080 and the 1080 is more or less the same in ray tracing scenarios and in non-ray-tracing environments. So how are the RT cores supposed to speed things up for ray tracing?
This means that the 2080S is simply a faster graphics card.

I will follow up on this a bit more to evaluate it and see if it is true for sure. I suggest you do the same.
 
You think that RT cores speed up ray tracing, and that is not the case.
Why would you claim this? Do you have any reference or proof?
The difference in performance between the 2080 and the 1080 is more or less the same in ray tracing scenarios and in non-ray-tracing environments. So how are the RT cores supposed to speed things up for ray tracing?
This means that the 2080S is simply a faster graphics card.
RTX2080 is about on par with GTX1080Ti, if a little bit above it. Super variant is a few more percent ahead. There are improvements other than RT cores that allow Turing cards to get a performance lead over Pascal if used properly (and Neon Noir seems to be a good example of that). When RT Cores are used in games that have DXR effects or the Vulkan counterparts - Quake 2 RTX, BF V, Metro Exodus, SoTR - RTX cards blow GTX cards out of the water.
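To make it concrete what the RT cores actually take off the shaders' hands: the core of any ray-traced effect is testing rays against a bounding volume hierarchy (BVH) before testing triangles. Below is a toy, CPU-side sketch of that traversal loop; on GTX cards this kind of work runs as regular shader code, while Turing's RT cores do the equivalent BVH traversal and triangle tests in fixed-function hardware. This is an illustrative simplification, not NVIDIA's actual traversal algorithm.

```cpp
// Toy sketch of the ray-vs-BVH traversal loop at the heart of any ray-traced
// effect. On GTX (Pascal) cards this kind of loop runs as ordinary shader
// code; Turing's RT cores perform the equivalent BVH traversal and triangle
// tests in fixed-function hardware, which is where the big RT speedup comes
// from. Illustrative only; not NVIDIA's actual traversal algorithm.
#include <algorithm>
#include <cstdio>
#include <vector>

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 origin, invDir; };  // invDir = 1 / direction, precomputed
struct Node { Vec3 boundsMin, boundsMax; int left, right; bool leaf; };

// Classic slab test: does the ray intersect the node's bounding box?
bool hitAABB(const Ray& r, const Node& n) {
    const float o[3]   = { r.origin.x, r.origin.y, r.origin.z };
    const float inv[3] = { r.invDir.x, r.invDir.y, r.invDir.z };
    const float lo[3]  = { n.boundsMin.x, n.boundsMin.y, n.boundsMin.z };
    const float hi[3]  = { n.boundsMax.x, n.boundsMax.y, n.boundsMax.z };
    float tmin = 0.0f, tmax = 1e30f;
    for (int axis = 0; axis < 3; ++axis) {
        float t0 = (lo[axis] - o[axis]) * inv[axis];
        float t1 = (hi[axis] - o[axis]) * inv[axis];
        if (t0 > t1) std::swap(t0, t1);
        tmin = std::max(tmin, t0);
        tmax = std::min(tmax, t1);
        if (tmax < tmin) return false;
    }
    return true;
}

// Stack-based traversal: descend into children only when the parent box is hit.
int countLeafHits(const std::vector<Node>& bvh, const Ray& ray) {
    int hits = 0;
    std::vector<int> stack = {0};                 // start at the root node
    while (!stack.empty()) {
        int idx = stack.back(); stack.pop_back();
        const Node& n = bvh[idx];
        if (!hitAABB(ray, n)) continue;           // prune the whole subtree
        if (n.leaf) { ++hits; continue; }         // real code would test triangles here
        stack.push_back(n.left);
        stack.push_back(n.right);
    }
    return hits;
}

int main() {
    // Two leaf boxes under one root box, and a ray shooting along +X through both.
    std::vector<Node> bvh = {
        { {0, 0, 0}, {4, 1, 1},  1,  2, false },  // root
        { {0, 0, 0}, {2, 1, 1}, -1, -1, true  },  // leaf A
        { {2, 0, 0}, {4, 1, 1}, -1, -1, true  },  // leaf B
    };
    Ray ray{ {-1.0f, 0.5f, 0.5f}, {1.0f, 1e30f, 1e30f} };   // direction ~ (1, 0, 0)
    printf("leaf boxes hit: %d\n", countLeafHits(bvh, ray)); // expect 2
    return 0;
}
```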
 
Why would you claim this? Do you have any reference or proof?
Good one, haha.


I don't think he gets rasterized vs ray traced.


[Image: raytracingsm.jpg]

[Image: raytracingacceleration.jpg]

2080 Ti with 13.5 TFLOPS and RT + tensor cores: 40 FPS
Titan V with 15 TFLOPS and no RT cores: 28 FPS
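For perspective, here is a quick per-TFLOP comparison of those two data points (a rough back-of-envelope sketch; it assumes the 40 FPS and 28 FPS figures above come from the same Q2 RTX scene and settings):

```cpp
// Per-TFLOP comparison of the two Q2 RTX data points quoted above.
// Assumes the 40 FPS (2080 Ti, RT cores) and 28 FPS (Titan V, shader-only)
// figures come from the same scene and settings.
#include <cstdio>

int main() {
    const double fps_2080ti = 40.0, tflops_2080ti = 13.5;  // with RT cores
    const double fps_titanv = 28.0, tflops_titanv = 15.0;  // no RT cores

    const double per_tflop_rt = fps_2080ti / tflops_2080ti;  // ~2.96 FPS/TFLOP
    const double per_tflop_sw = fps_titanv / tflops_titanv;  // ~1.87 FPS/TFLOP

    printf("2080 Ti: %.2f FPS/TFLOP, Titan V: %.2f FPS/TFLOP\n", per_tflop_rt, per_tflop_sw);
    printf("Advantage per unit of raw compute: ~%.1fx\n", per_tflop_rt / per_tflop_sw); // ~1.6x
    return 0;
}
```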

Yes, it does. RT cores are why Turing is that much faster in Q2 RTX over GTX cards. The problem with Q2 RTX is different - it is not a good representation for hybrid RTRT solutions because it is not one. Q2 RTX is completely pathtraced.

First, it is not raytracing just as in Q2 RTX. Neon Noir is hybrid RTRT solution with only reflections being raytraced. Out of RTX games, Battlefield V is its closest analogue.
Second, Neon Noir runs at 1080p 30FPS on Vega 56 and about the same on GTX 1080. For comparison, Battlefield V with DXR on (High, not Ultra) can be run on GTX 1080 at very similar 1080p 30FPS.
When using simpler forms of RT, like shadows only, the 1080 Ti is closer to the 2060, but still loses by 40%.


Interestingly, the performance penalty is over 100% on the 1080 Ti and 80% on the 1660 Ti (tensor), but only 19% on the RTX 2060.
The 1080 Ti tends to produce a noisier image too.
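"Performance penalty" here means the relative frame-rate drop from turning the RT effect on, expressed so that a value above 100% is possible (i.e. FPS more than halved). A minimal sketch of that calculation; the FPS values below are invented purely for illustration, not measurements from any of the cards mentioned:

```cpp
// How an "RT-on performance penalty" percentage is computed. The FPS values
// below are hypothetical placeholders, not benchmark data from these cards.
#include <cstdio>

// Penalty = (fps_rt_off / fps_rt_on - 1) * 100.
// A 100% penalty means enabling the effect halves the frame rate.
double rt_penalty_percent(double fps_rt_off, double fps_rt_on) {
    return (fps_rt_off / fps_rt_on - 1.0) * 100.0;
}

int main() {
    printf("%.0f%%\n", rt_penalty_percent(60.0, 30.0));  // 100% -> frame rate halved
    printf("%.0f%%\n", rt_penalty_percent(60.0, 50.0));  // 20%  -> mild hit
    return 0;
}
```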
 
Sure, but I don't see AMD actually being scared of NV's price cuts. Actually, the 5700 series is selling pretty well.
I guess that depends on your definition of well.
As can be seen in the Steam Hardware Survey, it has done little to impact AMD's market share and is still outsold by Nvidia's comparable products;

AMD Radeon RX 5700 XT 0.22% (+0.07%)

NVIDIA GeForce RTX 2060 1.95% (+0.41%)
NVIDIA GeForce RTX 2070 1.60% (+0.19%)
NVIDIA GeForce RTX 2070 SUPER 0.42% (+0.17%)
NVIDIA GeForce RTX 2060 SUPER 0.25% (+0.10%)

As you can see, in this segment Nvidia is outselling them ~10:1.

They have built it because it is a graphics company and "leather jacket" must have something to brag about and this time around it was RT cores. Let's see what he will come up with next year.
You're not even trying to be serious. Grow up or go play somewhere else!

Anyone with a basic understanding of 3D graphics knows ray tracing to be necessary to get good lighting.
 
I guess that depends on your definition of well.
As can be seen in the Steam Hardware Survey, it has done little to impact AMD's market share and is still outsold by Nvidia's comparable products;

AMD Radeon RX 5700 XT 0.22% (+0.07%)

NVIDIA GeForce RTX 2060 1.95% (+0.41%)
NVIDIA GeForce RTX 2070 1.60% (+0.19%)
NVIDIA GeForce RTX 2070 SUPER 0.42% (+0.17%)
NVIDIA GeForce RTX 2060 SUPER 0.25% (+0.10%)

As you can see, in this segment Nvidia is outselling them ~10:1.


You're not even trying to be serious. Grow up or go play somewhere else!

Anyone with a basic understanding of 3D graphics knows ray tracing to be necessary to get good lighting.
Market share is different from sales since we are not talking in general but about one segment? Which market are you talking about here? I remember you claimed that MindFactory.de is not relevant, and yet Steam is? Anyway, NV is making a lot of noise around RT, is it not? I don't see that from AMD's side, and yet, as you said, AMD is the one that should be trying harder.

I am serious the same way I see you being serious.
 
@efikkan a better comparison is probably Super cards as both RTX2060 and RTX2070 have been on the market for about a year more than Navi cards while RTX2060 Super/RTX2070 Super were released right before RX5700/RX5700XT. RX5700 does not seem to be listed separately in the Steam HW survey, meaning it is either rolled into 5700XT or more likely is <0.15%. There is still a twofold difference but Navi is doing quite well.
 
lol, show us the "data" you and your colleague managed to obtain while you were doing your "research", wink wink ;);)
 
Market share is different from sales since we are not talking in general but about one segment? Which market are you talking about here? I remember you claimed that MindFactory.de is not relevant, and yet Steam is?
I'm talking about market share in the gaming market, which is a subset of the entire PC market.
The fact is that AMD's market share among gamers has stayed stagnant at 15%, and that also includes APUs from AMD. For the past three years AMD has not been present in the high-end and has stayed at ~10% or less of the mid-range, while many have been touting Polaris, Vega and now Navi as "great successes". In general sales AMD has about 20-25% of discrete GPUs, but most people keep forgetting that a lot of this is from OEM sales of low-end GPUs that are not used for gaming. Steam is the most dominant platform among PC gamers and is very much representative of the PC gaming market; anyone who understands representative samples would understand this. There is nothing more representative than the Steam statistics at this point.

Anyway, NV is making a lot of noise around RT, is it not? I don't see that from AMD's side, and yet, as you said, AMD is the one that should be trying harder.
Over the past five years AMD has been making way more noise over "their stuff" than anyone else, including Mantle, the myth of "better" Direct3D 12 performance, FreeSync being "free", etc.

While RT may not be super useful yet, it will be at some point. All hardware support has to start somewhere, and hardware support has to come before software support.

@efikkan a better comparison is probably Super cards as both RTX2060 and RTX2070 have been on the market for about a year more than Navi cards while RTX2060 Super/RTX2070 Super were released right before RX5700/RX5700XT.
Just in the past month, nearly twice as many RTX 2060s have been added as there are RX 5700 XTs in total. If you add up the percentage-point gains for these Nvidia cards, it is 0.87% compared to the RX 5700 XT's 0.07% gain.
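A quick sanity check of that addition, and of the "~10:1" figure earlier in the thread (a small sketch using the survey numbers quoted above, treating the month-over-month gains as a rough proxy for current sales):

```cpp
// Sums the month-over-month gains from the Steam survey figures quoted above
// and compares the Nvidia total against the RX 5700 XT's gain.
#include <cstdio>

int main() {
    const double rx5700xt_gain  = 0.07;                     // percentage points
    const double nvidia_gains[] = {0.41, 0.19, 0.17, 0.10}; // 2060, 2070, 2070S, 2060S

    double nvidia_total = 0.0;
    for (double g : nvidia_gains) nvidia_total += g;        // 0.87 pp

    printf("Nvidia combined gain: %.2f pp, RX 5700 XT gain: %.2f pp\n", nvidia_total, rx5700xt_gain);
    printf("Ratio: ~%.0f:1\n", nvidia_total / rx5700xt_gain); // ~12:1
    return 0;
}
```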
 
They have built it because it is a graphics company and "leather jacket" must have something to brag about and this time around it was RT cores. Let's see what he will come up with next year.

Yeah, Nvidia thought, "let's release cards with bigger and more expensive dies, with RT cores that do nothing, just to brag about it".

You must think that the people who work at Nvidia are all stupid and make business decisions that involve millions and millions of dollars, just for the bragging rights.
 
First, it is not raytracing just as in Q2 RTX. Neon Noir is hybrid RTRT solution with only reflections being raytraced. Out of RTX games, Battlefield V is its closest analogue.
Second, Neon Noir runs at 1080p 30FPS on Vega 56 and about the same on GTX 1080. For comparison, Battlefield V with DXR on (High, not Ultra) can be run on GTX 1080 at very similar 1080p 30FPS.

Eh what? I run that bench at 60-80 FPS on my 1080. Did you try it yet? Add your score :)

The problem Neon Noir has is accuracy, but RT isn't all that accurate yet either, it just resolves the lack of detail differently. I'll take the software box of tricks in Neon Noir over BFV's RT implementation any day of the week.

Really the debate is still ongoing on what is the best solution. Some hardware for it, sure. Large sections of a die? Not so sure, this will probably get integrated in a way and Turing is just an early PoC.
 
Eh what? I run that bench at 60-80 FPS on my 1080. Did you try it yet?

The problem Neon Noir has is accuracy, but RT isn't all that accurate yet either, it just resolves the lack of detail differently. I'll take the software box of tricks in Neon Noir over BFV's RT implementation any day of the week.

Really the debate is still ongoing on what is the best solution. Some hardware for it, sure. Large sections of a die? Not so sure, this will probably get integrated in a way and Turing is just an early PoC.
Exactly. Have we seen an example of DXR yet? The agnostic solution where the field is "level"?
 
Yeah, Nvidia thought, "let's release cards with bigger and more expensive dies, with RT cores that do nothing, just to brag about it".

You must think that the people who work at Nvidia are all stupid and make business decisions that involve millions and millions of dollars, just for the bragging rights.

It's very clear what Nvidia is looking at: the 4K adoption rate is not really going places, and those who do have it tend to lower their res anyway. But Nvidia also has trouble making cards much faster than the 1080 Ti; I mean, the 2080s are baby steps and the 2080 Ti is way too large to be economical, hence its price. At the same time, there is good growth in demand for high refresh rates, but even that is very feasible on the current crop of cards for most games, especially competitive ones.

Essentially, Nvidia was looking for a new buyer's incentive/upgrade incentive and found it in RT. Marketing then made us believe the world is ready for it. That is how these things go :)

So really, lacking the content, Nvidia surely released Turing cards with the idea to brag about it. It is what Jensen has been doing since day one. It just works, right? We were going to buy more to save more because dev work was going to become so easy: if you winked at the GPU, it'd do the work for you. Or something vague like that. And then there is reality: a handful of titles with so-so implementations at a massive FPS hit ;)

This also explains why AMD cares a lot less, and just now starts to push it to console. Their target market doesn't really care, and represents the midrange. AMD has no urge to push this forward other than telling the world they still play along.
 
Eh what? I run that bench at 60-80 FPS on my 1080. Did you try it yet? Add your score :)

The problem Neon Noir has is accuracy, but RT isn't all that accurate yet either, it just resolves the lack of detail differently. I'll take the software box of tricks in Neon Noir over BFV's RT implementation any day of the week.

Really the debate is still ongoing on what is the best solution. Some hardware for it, sure. Large sections of a die? Not so sure, this will probably get integrated in a way and Turing is just an early PoC.
Sorry, my bad. 1080p@30FPS was the initial claim from CryTek. This seems to be the 99% low result for Vega56 and it actually runs faster. GTX1080 is in the same ballpark. Comparison to Battlefield is off, you are right about that.

Neon Noir has cool optimizations that benefit performance, things like only doing RT at short range and falling back to voxels when that is beneficial.
By the way, CryTek should (and plans to) use assistance from DXR or Vulkan's RT extensions in their engine.
In comparison to Neon Noir, what exactly makes you dislike BFV's RT implementation?

The best solution is relative. RT cores are not an RT solution; they are a hardware assist for casting rays. The exact algorithm and optimizations are up to the developer.
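To make the short-range-RT-with-voxel-fallback idea above concrete, here is a minimal sketch of how a per-pixel reflection pass might pick between a traced ray and a cheaper voxel-cone fallback based on distance. This is a hypothetical illustration of the general technique, not CryTek's actual implementation; the function names and the 10-metre cutoff are invented for the example.

```cpp
// Hypothetical sketch of a hybrid reflection pass: trace "real" rays only for
// nearby hits and fall back to a cheaper approximation (e.g. voxel cone
// tracing) beyond a distance budget. Not CryTek's actual code; names and
// constants are invented for the example.
#include <cstdio>

struct Color { float r, g, b; };

// Stand-ins for the two reflection paths an engine might implement.
Color traceReflectionRay(float /*hitDistance*/)   { return {1.0f, 0.9f, 0.8f}; } // accurate, expensive
Color sampleVoxelConeTrace(float /*hitDistance*/) { return {0.8f, 0.8f, 0.8f}; } // approximate, cheap

// One pixel of the reflection pass: pick the technique by distance and
// cross-fade near the boundary so the transition is not a visible seam.
Color shadeReflection(float hitDistance) {
    const float kRayTraceMaxDist = 10.0f;  // invented cutoff, in metres
    const float kBlendBand       = 2.0f;   // metres over which to cross-fade

    if (hitDistance <= kRayTraceMaxDist - kBlendBand) return traceReflectionRay(hitDistance);
    if (hitDistance >= kRayTraceMaxDist)              return sampleVoxelConeTrace(hitDistance);

    const float t = (hitDistance - (kRayTraceMaxDist - kBlendBand)) / kBlendBand;
    const Color a = traceReflectionRay(hitDistance);
    const Color b = sampleVoxelConeTrace(hitDistance);
    return { a.r + (b.r - a.r) * t, a.g + (b.g - a.g) * t, a.b + (b.b - a.b) * t };
}

int main() {
    const float distances[] = {2.0f, 9.0f, 15.0f};
    for (float d : distances) {
        const Color c = shadeReflection(d);
        printf("hit at %.1f m -> reflection (%.2f, %.2f, %.2f)\n", d, c.r, c.g, c.b);
    }
    return 0;
}
```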

Exactly. Have we seen an example of DXR yet? The agnostic solution where the field is "level"?
On the Nvidia side of things, any DXR game will give an idea of what the RT performance differences are between Pascal, Turing, and Turing with RT cores.
If we want to compare AMD vs Nvidia, we cannot: AMD's cards/drivers have no DXR implementation.
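For reference, this is roughly what the capability check looks like on the application side: a D3D12 feature query for the ray tracing tier, which is how a game or benchmark decides whether to expose DXR options at all. A minimal sketch; on cards/drivers without a DXR implementation, the query simply reports the tier as not supported.

```cpp
// Minimal DXR capability check (Windows-only; link with d3d12.lib).
// On hardware/drivers without a DXR implementation (e.g. Radeon cards at the
// time of this thread) the reported tier is D3D12_RAYTRACING_TIER_NOT_SUPPORTED.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

int main() {
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device),
                                 reinterpret_cast<void**>(&device)))) {
        printf("No D3D12 device available.\n");
        return 1;
    }

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    const HRESULT hr = device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                                   &options5, sizeof(options5));
    if (SUCCEEDED(hr) && options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0)
        printf("DXR supported (raytracing tier %d).\n", static_cast<int>(options5.RaytracingTier));
    else
        printf("DXR not supported on this device/driver.\n");

    device->Release();
    return 0;
}
```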
 
Sorry, my bad. 1080p@30FPS was the initial claim from CryTek. This seems to be the 99% low result for Vega56 and it actually runs faster. GTX1080 is in the same ballpark. Comparison to Battlefield is off, you are right about that.

Neon Noir has cool optimizations that benefit performance, things like only doing RT at short range and falling back to voxels when that is beneficial.
By the way, CryTek should (and plans to) use assistance from DXR or Vulkan's RT extensions in their engine.
In comparison to Neon Noir, what exactly makes you dislike BFV's RT implementation?

The best solution is relative. RT cores are not an RT solution; they are a hardware assist for casting rays. The exact algorithm and optimizations are up to the developer.

Not so much dislike, I just fancy the hybrid solution better because it will help adoption more. BFV is a proof of concept; CryEngine makes it marketable for a mainstream audience.

Also, I don't believe games need the high accuracy at all. Especially in motion, the cost of that detail just isn't worth it. On top of that, games are an artistic product, even those that say they want to 'look real'. It's still a scene and it still has its limitations, and therefore still needs tweaking, because RT lighting alone makes lots of stuff unplayable.
 
Not so much dislike, I just fancy the hybrid solution better because it will help adoption more. BFV is a proof of concept; CryEngine makes it marketable for a mainstream audience.
All of these are hybrid solutions. It is just a question of LOD, falloff distances, and what it falls back to. Neon Noir is not a good representation of a game; it is a fixed tech demo, meaning it is no doubt very well optimized.

RT effects, including DXR support, are there or coming to large engines. Unreal has those, Unity has those (not sure if still in preview or production build), CryEngine has RT but no DXR support yet. Others will not be far behind.
 
It's very clear what Nvidia is looking at: the 4K adoption rate is not really going places, and those who do have it tend to lower their res anyway.…
Essentially, Nvidia was looking for a new buyer's incentive/upgrade incentive and found it in RT…
So really, lacking the content, Nvidia surely released Turing cards with the idea to brag about it.
I'm seriously concerned if you believe your own words, because all of that is a truckload worth of ox manure.

Ray tracing has been requested by graphics developers for over a decade. Every new GPU generation has given us more performance and memory, easily allowing developers to throw in larger meshes, finer-grained animations and higher-detailed textures, which is easy since most assets are modeled in higher detail anyway. But lighting and shadows have been a continuous problem. Simple stencil shadows and pre-rendered shadow maps are not cutting it any more as the other details of games keep increasing. Pretty much every lighting effect you see in games is just a cheap, clever trick to simulate the real thing; quite often it only "works well" under certain conditions and may result in unwanted side-effects. Programming all these effects is also quite challenging, and they may have to be adapted to all the various scenes of a game.

Simply put: developers want RT more than Nvidia does. But we are only in the infant stages of RT thus far; it's still too slow to be used to the extent developers want. So for now, it has to be used in a limited fashion.
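As a small illustration of the "cheap clever tricks" point: below is a toy sketch contrasting the shadow-map style test (compare a stored depth against the light-space depth, with a bias constant that has to be hand-tuned per scene to avoid shadow acne or peter-panning) with a direct shadow-ray visibility test. Names and numbers are invented for the example; neither snippet comes from any particular engine.

```cpp
// Toy contrast between a rasterized shadow-map test and a ray-traced shadow
// query. The shadow-map path needs a hand-tuned bias and can still be wrong
// at the resolution limit of the map; the ray path asks the geometry directly.
#include <cmath>
#include <cstdio>

// --- Rasterized-style approximation ------------------------------------------
// depthFromMap: depth of the closest occluder stored in the shadow-map texel.
// depthOfPoint: depth of the shaded point in light space.
// bias: fudge factor tuned per scene/light to hide self-shadowing ("acne")
// without detaching shadows from objects ("peter-panning").
bool inShadow_shadowMap(float depthFromMap, float depthOfPoint, float bias) {
    return depthOfPoint - bias > depthFromMap;
}

// --- Ray-traced equivalent ----------------------------------------------------
// A shadow ray simply asks: is there any geometry between the point and the
// light? Here the "scene" is a single sphere so the example stays runnable.
struct Vec3 { float x, y, z; };
bool inShadow_shadowRay(Vec3 point, Vec3 lightPos, Vec3 sphereCenter, float radius) {
    Vec3 d{lightPos.x - point.x, lightPos.y - point.y, lightPos.z - point.z};
    Vec3 m{point.x - sphereCenter.x, point.y - sphereCenter.y, point.z - sphereCenter.z};
    float a = d.x*d.x + d.y*d.y + d.z*d.z;
    float b = 2.0f * (m.x*d.x + m.y*d.y + m.z*d.z);
    float c = m.x*m.x + m.y*m.y + m.z*m.z - radius*radius;
    float disc = b*b - 4.0f*a*c;
    if (disc < 0.0f) return false;                  // ray misses the sphere
    float t = (-b - std::sqrt(disc)) / (2.0f * a);  // nearest intersection
    return t > 0.0f && t < 1.0f;                    // occluder sits between point and light
}

int main() {
    // Shadow-map test: a too-small bias flags a lit point as shadowed (acne).
    printf("shadow map, bias 0.000: %d\n", inShadow_shadowMap(0.500f, 0.501f, 0.000f));
    printf("shadow map, bias 0.005: %d\n", inShadow_shadowMap(0.500f, 0.501f, 0.005f));

    // Shadow ray: a unit sphere at the origin blocks the light for this point.
    printf("shadow ray: %d\n",
           inShadow_shadowRay({0, -2, 0}, {0, 3, 0}, {0, 0, 0}, 1.0f));
    return 0;
}
```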
 
I'm seriously concerned if you believe your own words, because all of that is a truckload worth of ox manure.

Ray tracing has been requested by graphics developers for over a decade. Every new GPU generation has given us more performance and memory, easily allowing developers to throw in larger meshes, finer-grained animations and higher-detailed textures, which is easy since most assets are modeled in higher detail anyway. But lighting and shadows have been a continuous problem. Simple stencil shadows and pre-rendered shadow maps are not cutting it any more as the other details of games keep increasing. Pretty much every lighting effect you see in games is just a cheap, clever trick to simulate the real thing; quite often it only "works well" under certain conditions and may result in unwanted side-effects. Programming all these effects is also quite challenging, and they may have to be adapted to all the various scenes of a game.

Simply put: developers want RT more than Nvidia does. But we are only in the infant stages of RT thus far; it's still too slow to be used to the extent developers want. So for now, it has to be used in a limited fashion.

Source, please. And not an Nvidia-branded or affiliated one, if you wouldn't mind.

Even if just for sanity-check purposes... because when I hear 'developers have wanted this for a decade', all I really hear is 'we've been working on this for 10 years, and finally, here it is' (Huang himself @ SIGGRAPH). I've seen too much spin in my life to take this at face value. There is always an agenda, and it's always about money.
 
On Nvidia side of things, any DXR game will give an idea what RT performance differences are between Pascal, Turing and Turing with RT cores.
If we want to compare AMD vs Nvidia, we cannot. AMD cards/drivers have no DXR implementation.
Yet, but DXR is where "the rubber meets the road", where both sides are using the same API and where we see how devs will focus the new tech. Vulkan RT would also apply. We know AMD has their answer ready with the new Xbox announcement, just waiting on their dGPU answer.
 