
Ray Tracing and Variable-Rate Shading Design Goals for AMD RDNA2

@efikkan, @Vayra86, I think neither of you is wrong. 4K adoption is not going all that well, and at the generally expected rate of ~30% more performance per generation there are a couple of generations to go until 4K 60 FPS is easy. GPU vendors do need a new incentive of some sort to push the envelope.
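A rough back-of-envelope on that "couple of generations" claim (pure arithmetic, assuming a steady +30% per generation and performance scaling linearly with pixel count, both simplifications):

```python
import math

# Pixel counts for a common high-end target today vs 4K.
pixels_1440p = 2560 * 1440
pixels_4k = 3840 * 2160

scaling = pixels_4k / pixels_1440p   # ~2.25x the pixel work
per_gen = 1.30                       # assumed +30% per generation

# Generations of compounding +30% needed to cover the gap.
generations = math.log(scaling) / math.log(per_gen)
print(f"~{generations:.1f} generations to cover {scaling:.2f}x the pixels")
```

So a card that does 1440p 60 FPS today needs roughly three generations of +30% each before 4K 60 FPS comes for free, which matches the "couple of generations" estimate.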

Ray tracing has been coming for a while. The theory is there, the research is there, but the performance simply has not been there for anything real-time. Now Nvidia has pushed the issue to the brink of usability. Someone has to push new things for them to become implemented and widespread enough, especially when it comes to hardware features.

@Vayra86 just look at how lighting and shadowing methods have evolved: shadow maps, dynamic stencil shadows, soft/hard shadows, and ever more complex variants of these. Similarly and closely related: GI methods. The latest wave of GI methods, SVOGI (which CryEngine uses as the fallback in Neon Noir and which their RT solution evolved from) and Nvidia's VXGI, are both voxel-based and come with a very noticeable performance hit. In principle, both get closer and closer to ray tracing. Also keep in mind that rasterization will stack several different methods on top of each other, complicating things.

If you think Nvidia is doing this out of the blue, they definitely are not. A decade of experience with OptiX gives them a good idea of where the performance issues are. Of course, the same applies to AMD and Intel.
 
Pushing 4K cards would be way more profitable for Nvidia than RTX, dunno what you're talking about, Vayra
 
@efikkan, @Vayra86, I think neither of you is wrong. 4K adoption is not going all that well, and at the generally expected rate of ~30% more performance per generation there are a couple of generations to go until 4K 60 FPS is easy. GPU vendors do need a new incentive of some sort to push the envelope.

Ray tracing has been coming for a while. The theory is there, the research is there, but the performance simply has not been there for anything real-time. Now Nvidia has pushed the issue to the brink of usability. Someone has to push new things for them to become implemented and widespread enough, especially when it comes to hardware features.

@Vayra86 just look at how lighting and shadowing methods have evolved: shadow maps, dynamic stencil shadows, soft/hard shadows, and ever more complex variants of these. Similarly and closely related: GI methods. The latest wave of GI methods, SVOGI (which CryEngine uses as the fallback in Neon Noir and which their RT solution evolved from) and Nvidia's VXGI, are both voxel-based and come with a very noticeable performance hit. In principle, both get closer and closer to ray tracing. Also keep in mind that rasterization will stack several different methods on top of each other, complicating things.

Thank you once again for your nuanced and informative posts :respect:
 
Pushing 4K cards would be way more profitable for Nvidia than RTX, dunno what you're talking about, Vayra
Yet ASUS just released a 27" 1080p 265 Hz IPS. I don't disagree that 4K is the more "reasonable" goal to shoot for, but adoption is still low. Eventually we'll be at the point where 4K and RT are going to have to be a thing for the "bleeding edge" crowd. I mean, I'm part of the HDR minority, another tech that's still hit and miss.
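For perspective, the raw pixel throughput of those two targets is surprisingly close (simple arithmetic, using the 265 Hz figure from the post above and ignoring per-pixel shading cost):

```python
# Raw pixels per second pushed at each display target; a crude
# comparison that only looks at fill, not shading work per pixel.
res_1080p = 1920 * 1080
res_4k = 3840 * 2160

rate_1080p_hfr = res_1080p * 265   # the 27" high-refresh panel
rate_4k_60 = res_4k * 60           # 4K at 60 Hz

print(f"1080p @ 265 Hz: {rate_1080p_hfr / 1e6:.0f} Mpix/s")
print(f"4K    @  60 Hz: {rate_4k_60 / 1e6:.0f} Mpix/s")
```

Both land around 500 Mpix/s, so the high-refresh vs high-resolution split is less about sheer pixel rate and more about where people want to spend it.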
 
Yet ASUS just released a 27" 1080p 265 Hz IPS. I don't disagree that 4K is the more "reasonable" goal to shoot for, but adoption is still low. Eventually we'll be at the point where 4K and RT are going to have to be a thing for the "bleeding edge" crowd. I mean, I'm part of the HDR minority, another tech that's still hit and miss.
4K is waaaaaay more popular. You've got 50 4K options for every 240 Hz one.
 
4K is waaaaaay more popular. You've got 50 4K options for every 240 Hz one.
I'm not saying there are no options for 4K, but the entry barrier is still high when you factor in the GPU needed for the 4K 60 Hz experience. 1080p is still the "norm" when you consider the market overall, with the 1060 being the "median" of the GPU user base.
 
Pushing 4K cards would be way more profitable for Nvidia than RTX
Imagine if Nvidia had used all* the die space of the RT cores for just more SMs; it would obviously be more profitable for Nvidia, and people would of course get more excited, probably boosting demand higher than supply. Instead Nvidia opted for what it considered a more "balanced" approach: adding some more performance in general while also adding RT acceleration to move the technology forward. In the short term not a financially "smart" move, but it certainly is in the long term.

*) They would have to scale it back a little to avoid too high a power consumption.
 
Imagine if Nvidia had used all* the die space of the RT cores for just more SMs; it would obviously be more profitable for Nvidia, and people would of course get more excited, probably boosting demand higher than supply. Instead Nvidia opted for what it considered a more "balanced" approach: adding some more performance in general while also adding RT acceleration to move the technology forward. In the short term not a financially "smart" move, but it certainly is in the long term.
RT Cores and Tensor cores take up ~10% of the die space, and RT Cores are the smaller part of that, in the range of 3%. That would not be enough for a noticeable general performance boost. There is a case to be made about Tensor cores, but a large part of their area seems to be double-used for FP16 and possibly the concurrent FP+INT execution. Power concerns come on top of that.
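To put numbers on it, here is a toy calculation using the area figures above. It treats the freed area as if it could all become extra compute and performance as if it scaled linearly with that area, so it is an upper bound, not an estimate:

```python
# Upper-bound estimate: if the RT (or RT+Tensor) area went straight
# into more SMs and performance scaled linearly with that area.
# Real scaling would be worse: power limits, memory bandwidth, and
# non-SM logic (memory controllers, display, video) also occupy the die.
areas = {"RT cores only": 0.03, "RT + Tensor": 0.10}

uplift = {}
for label, frac in areas.items():
    # Freed fraction relative to the remaining die area.
    uplift[label] = frac / (1.0 - frac)
    print(f"{label}: at most ~{uplift[label] * 100:.1f}% more compute area")
```

Even under the optimistic "RT + Tensor" case, the ceiling is around 11% more compute area, which supports the point that reclaiming the RT Cores alone would not buy a noticeable general performance boost.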

AMD's RDNA2 seems to follow the same general idea as Nvidia: a small specialized ASIC added to the existing pipeline, in RDNA2 reportedly next to or inside the TMUs.
 
I hope they improve the RT/Tensor core count. Either go big or go home, Nvidia.
Control was beautiful but still played like a demo with DLSS at 55-62 FPS. I want a steady 70 FPS at native with a $600 price tag max.
 
I hope they improve the RT/Tensor core count. Either go big or go home, Nvidia.
Control was beautiful but still played like a demo with DLSS at 55-62 FPS. I want a steady 70 FPS at native with a $600 price tag max.
Of course it will get better and more viable as newer generations of cards move forward, like all new tech: SM3.0, PhysX, the evolution of DX, tessellation. All things that needed a generation to become usable and another one to become features we now take for granted, standard settings we don't even think about anymore affecting our performance.
 
Of course it will get better and more viable as newer generations of cards move forward, like all new tech: SM3.0, PhysX, the evolution of DX, tessellation. All things that needed a generation to become usable and another one to become features we now take for granted, standard settings we don't even think about anymore affecting our performance.
Yup.
Remember how expensive the 8800 GTX was?
It was a breakthrough card nevertheless.
Now we've got the RTX 2060 running circles around the 1080 Ti in ray tracing.
 
Yup.
Remember how expensive the 8800 GTX was?
It was a breakthrough card nevertheless.
Now we've got the RTX 2060 running circles around the 1080 Ti in ray tracing.
Ray tracing will improve a lot within a couple of generations.
The GTX 1080 Ti received a lot of harsh criticism when it launched; then Turing came along, and all of a sudden the GTX 1080 Ti was excellent and Turing was bad.
Also, AMD criticized ray tracing when Turing launched, and now they may be launching something similar next year (if the rumors are to be believed).
As always, the perspective changes to fit the ever-changing narrative.
 
Ray tracing will improve a lot within a couple of generations.
The GTX 1080 Ti received a lot of harsh criticism when it launched; then Turing came along, and all of a sudden the GTX 1080 Ti was excellent and Turing was bad.
Also, AMD criticized ray tracing when Turing launched, and now they may be launching something similar next year (if the rumors are to be believed).
As always, the perspective changes to fit the ever-changing narrative.
I don't recall AMD as a company criticizing ray tracing. Maybe they criticized Turing for its RT performance, although I can't recall that either.
 
Also, AMD criticized ray tracing when Turing launched, and now they may be launching something similar next year (if the rumors are to be believed).
AMD didn't really criticize RT. They were pretty quiet about it, and the prominent (or maybe only) real statement they made is that they will add RT capability in hardware when it makes sense across the entire range.

From rumors and leaks, RDNA2 places the RT ASIC into the TMU. This could lead to an interesting situation where AMD has more RT units, assuming their unit has capabilities similar to Nvidia's RT Cores (which, based on AMD's patent, is very likely). The RX 5700 XT has 160 TMUs, while the RTX 2080 Super has 48 SMs/RT Cores (Turing has one RT Core per SM).
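Putting those unit counts side by side (counts from the post; whether an RDNA2 TMU-attached RT block would match a Turing RT Core per clock is pure speculation):

```python
# Raw unit-count comparison only; per-unit throughput and clock
# speeds are unknown for RDNA2's rumored TMU-based RT blocks.
tmus_5700xt = 160      # RX 5700 XT TMU count
rt_cores_2080s = 48    # RTX 2080 Super SM/RT Core count

ratio = tmus_5700xt / rt_cores_2080s
print(f"Unit-count ratio (TMU-attached blocks vs RT Cores): {ratio:.2f}x")
```

A better-than-3x unit count would only translate into a real RT advantage if the per-unit capability is in the same ballpark, which nobody outside AMD knows yet.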
 
From rumors and leaks, RDNA2 places the RT ASIC into the TMU. This could lead to an interesting situation where AMD has more RT units, assuming their unit has capabilities similar to Nvidia's RT Cores (which, based on AMD's patent, is very likely). The RX 5700 XT has 160 TMUs, while the RTX 2080 Super has 48 SMs/RT Cores (Turing has one RT Core per SM).
That will be very hard to assess accurately with nothing but speculation to go on. AMD's solution here will be more "improvised"; it may end up successful, but it could just as easily go the other way. So I don't dare claim it will be more capable than Turing at this point.

Also keep in mind that "RDNA2" will mostly compete with the next generation from Nvidia.
 
In the meantime, another game, "AMID EVIL", has entered RTX beta. Follow the instructions on Steam for how to access and download it. You need to own the base game to access the RTX features.
 
In the meantime, another game, "AMID EVIL", has entered RTX beta. Follow the instructions on Steam for how to access and download it. You need to own the base game to access the RTX features.
Elaborate? :confused:
 
 
Thank you. I read it as a typo for "AMD EVIL" :roll:
 