
David Wang From AMD Confirms That There Will Eventually Be an Answer to DirectX Raytracing

No, there aren't. Where did you get that from?
I think he meant RTX, which is not only RTRT, but also DLSS and advanced shading.
I don't know what the total number is, but 25 titles were looking to add DLSS alone when it was unveiled. Granted, one of those titles was FF XV, which has kicked the bucket since.
 
A year later, AMD eventually answers DirectX Raytracing...
2 years later = eventually...
5 years later = eventually....
10 years later = eventually....
.
.
.
one eternity later = eventually....
How long did it take Nvidia to get tessellation going?
 
How long did it take Nvidia to get tessellation going?
I believe they supported tessellation the moment it made it into DirectX.
 
It's quite obvious why AMD won't have RT in their offerings anytime soon: they don't have GPUs with enough raw performance that, after enabling RT, the frame rate still lands at "acceptable" levels. That's @ least 2070-class performance, and AMD is nowhere near that.

Until AMD has GPUs with that kind of performance, they might as well not even bother giving the GPUs the capability. Unless ofc they come up with a way to have RT @ a far, far lower performance cost which, IMHO, is highly unlikely.
 
How long did it take Nvidia to get tessellation going?

Silly question, it took them as long as was necessary in order to get developers to add obscene amounts of it to cripple performance on AMD hardware in early titles. This time around it will be interesting, because Nvidia is first to market and they don't know what sort of solution AMD will have.

It's funny that AMD's interest in new technology is questioned when historically it was Nvidia who were always reluctant to change their hardware according to the direction the industry was moving. Remember unified shaders? Nvidia argued long and hard against them, and how did that turn out? They obviously knew very well that was the future, but they didn't have the hardware ready, so they questioned its use. Their behavior was always distasteful in this regard, to say the least.
 
As long as the gaming consoles (the main source of revenue for game developers) are running on AMD GPUs (RX 470/480 equivalent), there will be no rush to develop games with RT.
 
We heard that even Nvidia's flagship cards buckle hard under their advertised ray tracing ability (30 fps at 1080p...).
So it is going to be nothing more than a silly gimmick (like PhysX) until there is proper support for it.
And these cards can do 4K 60 fps rasterizing the same scene, which looks 80% as good. NVIDIA has to pay developers to implement that degree of stupid. There's absolutely no reason for them to consider it otherwise.
 
I believe they supported tessellation the moment it made it into DirectX.

Made it into DirectX indeed... Nvidia reportedly pushed Microsoft to delay DX tessellation by one or two generations because their hardware at the time had no hardware-level tessellation. AMD had their hardware implementation ready on time, but were left unable to use it until Microsoft published the DX extensions for it. Nvidia then overshot the need for tessellation, so you ended up with sub-pixel-sized triangles wasting GPU resources, on top of possibly being hidden from view entirely.

That one was 100% on Microsoft, with Nvidia taking huge advantage of it.
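For anyone wondering what "sub-pixel tessellation" means in practice, here is a minimal, purely illustrative C++ sketch (the function name and the quality knob are my own invention, not from any of the games in question) of the kind of screen-space clamp engines use so tessellation never produces triangles smaller than a pixel:

```cpp
#include <algorithm>

// Illustrative only: pick a tessellation factor for one patch edge from its
// on-screen length, so detail scales with distance instead of being fixed.
// targetPixelsPerTriangle is an assumed quality knob, not a real engine setting.
float EdgeTessFactor(float edgeScreenLengthPx, float targetPixelsPerTriangle = 8.0f)
{
    // More on-screen pixels -> more subdivision...
    float factor = edgeScreenLengthPx / targetPixelsPerTriangle;

    // ...but clamp it: never below 1 (no pointless subdivision) and never above
    // the hardware limit of 64. Without a clamp like this you get the sub-pixel
    // triangles described above, which cost GPU time and add nothing visible.
    return std::clamp(factor, 1.0f, 64.0f);
}
```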
 
Silly question, it took them as long as was necessary in order to get developers to add obscene amounts of it to cripple performance on AMD hardware in early titles. This time around it will be interesting, because Nvidia is first to market and they don't know what sort of solution AMD will have.

It's funny that AMD's interest in new technology is questioned when historically it was Nvidia who were always reluctant to change their hardware according to the direction the industry was moving. Remember unified shaders? Nvidia argued long and hard against them, and how did that turn out? They obviously knew very well that was the future, but they didn't have the hardware ready, so they questioned its use. Their behavior was always distasteful in this regard, to say the least.
Don't forget DX 10.1, which Nvidia eventually got shelved because they couldn't do it and ATI could.
 
A year later, AMD eventually answers DirectX Raytracing...
2 years later = eventually...
5 years later = eventually....
10 years later = eventually....
.
.
.
one eternity later = eventually....

Like Intel's 10nm actually working... eventually.
 
It's quite obvious why AMD won't have RT in their offerings anytime soon: they don't have GPUs with enough raw performance that, after enabling RT, the frame rate still lands at "acceptable" levels. That's @ least 2070-class performance, and AMD is nowhere near that.

Until AMD has GPUs with that kind of performance, they might as well not even bother giving the GPUs the capability. Unless ofc they come up with a way to have RT @ a far, far lower performance cost which, IMHO, is highly unlikely.
AMD's Vega 64 gets equal or even higher FPS than the 2070 in Battlefield V, a title with raytracing. Your argument is invalid.
Not to mention the upcoming 7nm Navi, which will be at least 25% faster thanks to the new 7nm process alone (official figures on the process's performance uplift provided by AMD), plus whatever architectural improvements might come.
 
AMD's Vega 64 gets equal or even higher FPS than the 2070 in Battlefield V, a title with raytracing. Your argument is invalid.
Not to mention the upcoming 7nm Navi, which will be at least 25% faster thanks to the new 7nm process alone, plus whatever architectural improvements might come.
Exactly. I've got lots of compute units not really doing anything that could be leveraged for DXR, but that's up to AMD, and I'm sure it would gimp performance the same as on Nvidia.
 
AMD's Vega 64 gets equal or even higher FPS than the 2070 in Battlefield V, a title with raytracing. Your argument is invalid.
Not to mention the upcoming 7nm Navi, which will be at least 25% faster thanks to the new 7nm process alone (official figures on the process's performance uplift provided by AMD), plus whatever architectural improvements might come.

Nope:

[attached chart: Battlefield V average FPS at 3840x2160]

Vega 64 is slightly behind the 2070 in this title @ 4K resolution. However, notice how big the drop is with RT fully enabled on the 2070: Vega 64 has nearly 296% more performance than a fully RT-enabled 2070 (i.e. roughly 4x the frame rate), because fully enabling RT tanks performance that badly.

Quite simply: unless AMD / nVidia come up with a much less costly way to enable RT, there's little point in enabling it @ this time.

@W1zzard: how did enabling RT in any of their configurations affect power consumption? I'm guessing the increase is substantial now that the whole chip is being utilized, instead of just a significant portion of it. IMO, power consumption figures (with / without RT) should have been included in the RT review.
 
Nope:

[attachment 110526: Battlefield V average FPS at 3840x2160]

Vega 64 is slightly behind the 2070 in this title @ 4K resolution. However, notice how big the drop is with RT fully enabled on the 2070: Vega 64 has nearly 296% more performance than a fully RT-enabled 2070 (i.e. roughly 4x the frame rate), because fully enabling RT tanks performance that badly.

Quite simply: unless AMD / nVidia come up with a much less costly way to enable RT, there's little point in enabling it @ this time.

@W1zzard: how did enabling RT in any of their configurations affect power consumption? I'm guessing the increase is substantial now that the whole chip is being utilized, instead of just a significant portion of it. IMO, power consumption figures (with / without RT) should have been included in this review.
Vega 64 is above the RTX 2070 at 1080p and roughly equal at 1440p. Plus, with the performance tanking, the 4K result is completely irrelevant as far as raytracing goes, so I don't understand why you would even bring it up. Raytracing will mainly be used at 1440p and below.
[attached chart: Battlefield V average FPS at 1920x1080]
[attached chart: Battlefield V average FPS at 2560x1440]
 
Vega 64 is above the RTX 2070 at 1080p and roughly equal at 1440p. Plus, with the performance tanking, the 4K result is completely irrelevant as far as raytracing goes, so I don't understand why you would even bring it up. Raytracing will mainly be used at 1440p and below.
[attached chart: Battlefield V average FPS at 1920x1080]
[attached chart: Battlefield V average FPS at 2560x1440]

When you need a 2080 Ti just to get over 60 FPS average @ 1080p with RT enabled (no idea of the minimums thus far):

[attached chart: Battlefield V average FPS at 1920x1080]

@ 1440p, the performance tanks from 90.9 to 33.8 FPS, and @ 1080p it tanks from 112.3 to 47 FPS, and all these numbers are averages: minimum framerates would be more important here, IMO. RT's cost is simply far too great for "this round" of graphics cards.
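For what it's worth, a quick back-of-the-envelope on those averages (using only the figures quoted above) puts the hit in percentage terms:

\[
\text{1440p: } 1 - \frac{33.8}{90.9} \approx 0.63 \;(\approx 63\%\ \text{drop}), \qquad
\text{1080p: } 1 - \frac{47}{112.3} \approx 0.58 \;(\approx 58\%\ \text{drop})
\]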
 
When you need a 2080 Ti just to get over 60 FPS average @ 1080p with RT enabled (no idea of the minimums thus far):

[attachment 110529: Battlefield V average FPS at 1920x1080]

@ 1440p, the performance tanks from 90.9 to 33.8 FPS, and @ 1080p it tanks from 112.3 to 47 FPS, and all these numbers are averages: minimum framerates would be more important here, IMO. RT's cost is simply far too great for "this round" of graphics cards.
Not only this round's graphics cards. Just the damn reflections are enough to reduce performance by 50%, and they aren't any major graphical improvement. It will take a few more GPU generations until this is actually common.
If there's anything good about this, it's that Nvidia will have to invest in DX12 performance optimization now, because raytracing is in DX12.
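(For context on "raytracing is in DX12": DXR is exposed through the D3D12 API, and a game typically runs a feature check before even offering the option. A minimal, purely illustrative C++ sketch is below; it assumes the D3D12 device has already been created and claims nothing about how any particular engine does it.)

```cpp
#include <d3d12.h>

// Illustrative sketch: ask an already-created D3D12 device whether the
// runtime/driver exposes DXR at all before offering ray tracing settings.
bool SupportsDXR(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 options5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5,
                                           &options5, sizeof(options5))))
        return false;

    // Anything below TIER_1_0 means no DXR; Turing-class cards report 1_0 or higher.
    return options5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_0;
}
```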
 
Not only this round's graphics cards. Just the damn reflections are enough to reduce performance by 50%, and they aren't any major graphical improvement. It will take a few more GPU generations until this is actually common.
If there's anything good about this, it's that Nvidia will have to invest in DX12 performance optimization now, because raytracing is in DX12.

Agreed.
 
My mistake, I didn't pay attention to who I was replying to.

You were replying to the guy who is right 90% of the time - unlike you, who is defending a complete joke of a "new technology." Now we have results in from BFV, and it's hilariously underwhelming.

Even "Raytraced Reflections" have a mountain of graphical glitches just like Rasterized reflections have. In otherwords, you are cutting your framerate down by 75% just so you can have incorrect and glitchy godrays show up in puddles instead of shadowy reflections that are at least consistent.

What an amazing new graphical feature /s
 
I believe they supported tessellation the moment it made it into DirectX.
They did, in fact.
In other words, you are cutting your framerate down by 75%
There have been a number of videos on Twitch and YouTube showing a performance difference of between 25% and 35% with RTRT (DXR) on full and settings on Ultra vs. RTRT (DXR) off. However, at no time did any of the framerates shown go below 60 FPS.
 
You were replying to the guy who is right 90% of the time

I wish there was a way to pin this.
Actually, I think it can go in my sig.

Edit: what do you know, it works!
 