
NVIDIA to Enable DXR Ray Tracing on GTX (10- and 16-series) GPUs in April Drivers Update

My bad, it's roughly a 40% performance drop (I was thinking of Shadow of the Tomb Raider, which was just a press-only tech demo, I guess). Still, the question remains the same. DXR has a high price to pay, not just in terms of hardware, but in lost framerate and added visual noise.
 
if you think nvidia is generous enough to enable DXR on 10- and 16-series cards, you are sorely mistaken
they enabled it because Crytek showed off ray tracing on an AMD GPU

How convenient for nvidia to turn some feature on/off via a driver update for billions of GPUs worldwide
from some cubicle in an office
 
I do hope that this is a positive bridge, long term, for ray tracing moving forward, given the visual benefits that raytraced reflections and lighting have over traditional rasterization methods. Enabling it for the 'masses', albeit at drastically reduced visual fidelity (compared to cards that have the dedicated hardware and can accelerate it much more effectively), should serve to increase developer uptake and bring us all better image quality in the long run. I mean, it would be cool if games launched such that virtually everyone could turn on the feature if they choose / have the graphics horsepower, and then, as usual, those with more horsepower could turn the dials up further.
 
The 1660/Ti should easily beat the old Pascals. That said, it'll be unplayable on either.
 
Whatever Nvidia does, this ray tracing implementation is up to the developers to implement properly so as not to destroy performance, and it's probably not what some people think. It's only possible to implement it in small amounts, even with the fastest GPUs. If real-time ray tracing were fully implemented in games, we wouldn't be talking about frames per second; we would be talking about frames per day.

I'm not saying RTRT is a gimmick. It definitely adds to the quality of the graphics, but what ships is a mixture of ray tracing and rasterization.
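To put "frames per day" in perspective, here's a back-of-the-envelope ray-budget estimate. Every number below is an assumption for illustration (the sample count, bounce depth, and gigarays figure are ballpark guesses, not measurements):

```python
# Back-of-the-envelope: cost of FULLY path tracing a frame.
# All figures below are illustrative assumptions, not benchmarks.

pixels = 1920 * 1080            # one 1080p frame
samples_per_pixel = 1000        # offline/film-quality sample count
bounces = 5                     # average path length per sample

rays_per_frame = pixels * samples_per_pixel * bounces

gpu_rays_per_sec = 10e9         # ~10 gigarays/s, roughly top-end Turing territory

seconds_per_frame = rays_per_frame / gpu_rays_per_sec
print(f"{rays_per_frame:.2e} rays per frame")
print(f"{seconds_per_frame:.1f} s/frame -> {1 / seconds_per_frame:.2f} fps")
```

Even with dedicated RT hardware, a film-quality frame lands around one frame per second under these assumptions, which is exactly why shipped games trace only one or two samples per pixel for a few effects and denoise the result, i.e. the ray tracing/rasterization mixture described above.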
 
Wish they'd add DXR to GTA 5. Reflections of dynamic objects are non-existent in the rain, in the city. For such a visually detailed game...
Just fix the shitty anti-aliasing and then I would have no complaints.
 
Thanks for the pic. I saw the demo in a video, but looking at that still frame, you can easily see those casing reflections are hexagonal, as opposed to rounded like the original casing models.
I take it the geometry had to be reduced drastically in order to do ray tracing without hardware support.
I remember someone mentioning that these RTRT algorithms generally do not handle tessellation.
if you think nvidia is generous enough to enable DXR on 10- and 16-series cards, you are sorely mistaken
they enabled it because Crytek showed off ray tracing on an AMD GPU
AMD showing up with RTRT definitely has something to do with the decision, but the timing is not likely related to AMD or CryTek. Right now is GDC - the Game Developers Conference. Generous is relative; performance on Pascal will be quite awful. Nvidia would like DXR to gain some traction, which would not be a bad thing.

The reason for enabling DXR is simple, it reminds me of something @Vayra86 said in another thread:
If I can see a poorly running glimpse of my RT future on a low-end GPU, this will catch on, and it will be an immense incentive for people to upgrade, and keep upgrading. Thát is viable on a marketplace.

Edit:
Just look at that Metro comparison slide:
https://www.nvidia.com/content/dam/...eforce-rtx-gtx-dxr-one-metro-exodus-frame.png
The Turing RTX result is clearly a marketing stunt on that slide - it's not directly comparable due to the lower resolution - but the RT cores are very clearly a huge boost.
 
All of these raytracing implementations are just layering on top of ye olde rasterization techniques. Why does that matter? It's relatively easy to scale what they're doing to the hardware they're doing it on. The only impact is noise.

If you're like me and can't stand noise (crawling surfaces) at all, then these implementations of raytracing are all downgrades.

We're a very, very, very long way from taking AAA games released today, deleting all of the old lighting models, and replacing them completely with real-time raytracing from light sources. In twenty years, maybe, we'll be real-time raytracing games released a decade earlier (like Crysis) - not doing low-resolution raytraced shadows and then drowning them out with anti-aliasing like this garbage does.
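The noise complaint has a simple statistical root: these implementations estimate lighting by Monte Carlo sampling, and the error of such an estimate shrinks only with the square root of the sample count. A minimal sketch (pure Python, with a toy integrand standing in for a pixel's incoming light):

```python
import random
import statistics

random.seed(42)

def render_pixel(spp):
    # Toy "pixel": average of spp random samples of the incoming light
    # (uniform noise here, standing in for a real lighting integrand).
    return sum(random.random() for _ in range(spp)) / spp

def pixel_noise(spp, trials=2000):
    # Standard deviation of the pixel estimate across many renders:
    # this spread is what shows up on screen as crawling/grainy surfaces.
    return statistics.pstdev([render_pixel(spp) for _ in range(trials)])

for spp in (1, 4, 16, 64):
    print(f"{spp:>2} spp -> noise {pixel_noise(spp):.4f}")
```

Quadrupling the samples per pixel only halves the noise (error ~ 1/sqrt(spp)), which is why real-time budgets of 1-2 spp lean so heavily on denoisers and temporal anti-aliasing - the "drowning it out" step.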
 
I didn't use the AI-based AA from Nvidia in Metro Exodus and ran the RT (lighting effects only) at the highest setting. It was well done. It should only take a couple of generations to bring acceptable RT effects to a larger audience.
 
CryTek have already shown we can get highly realistic reflections on a Vega 56, so this is great news all round:

edge2arkm3.png

7gDlpVF.png
 
These hexagonal problems are not AMD-specific; they are due to the way this raytracing solution is implemented. Yes, optimization is part of it, but the result is not likely to be different on other (Nvidia) hardware.

Something interesting from Nvidia's article on this: https://www.nvidia.com/en-us/geforce/news/geforce-gtx-ray-tracing-coming-soon/
Note the FPS changes from using RT cores. Port Royal obviously gets a huge boost, Metro gets a really large boost, Shadow of the Tomb Raider gets a noticeable boost, and Battlefield's boost is... small. It is not surprising, but it's good to see actual data on this. The difference comes down to two things: how much of the frame time goes to ray tracing, and what is needed to set up the ray tracing and how it is done. I wish we had detailed data for each of them, at least in the same form as the Metro frame-time slide.
geforce-rtx-gtx-dxr-port-royal-performance-850px.png

geforce-rtx-gtx-dxr-metro-exodus-performancev.png

geforce-rtx-gtx-dxr-shadow-of-the-tomb-raider-performance-850px.png

geforce-rtx-gtx-dxr-battlefield-v-performance-850px.png
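That spread in gains is consistent with a simple Amdahl's-law style estimate: accelerating only the ray-tracing slice of the frame time helps more the bigger that slice is. A quick sketch with made-up fractions (the per-title shares and the 3x factor below are hypothetical, not from NVIDIA's data):

```python
def frame_speedup(rt_fraction, rt_speedup):
    """Overall frame-rate gain when only the ray-tracing share of the
    frame time is accelerated by rt_speedup (Amdahl's law)."""
    new_frame_time = (1 - rt_fraction) + rt_fraction / rt_speedup
    return 1 / new_frame_time

# Hypothetical share of frame time spent tracing rays per workload:
workloads = [("RT-heavy (Port Royal-like)", 0.7),
             ("RT-medium (Metro-like)",     0.5),
             ("RT-light (BF V-like)",       0.2)]

for name, rt_fraction in workloads:
    # Assume RT cores speed the tracing slice up ~3x (assumption).
    print(f"{name}: {frame_speedup(rt_fraction, 3.0):.2f}x")
```

With the same 3x acceleration of the tracing work, an RT-heavy workload gains almost 1.9x overall while an RT-light one gains barely 1.15x - the same qualitative pattern as the four charts.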
 
Given that it came barely days after the Crytek demo, they must have had this "amazing" tech somewhere in their pockets already.
And, thank you so much, Crytek.

Considering RT performance is already poor on RTX cards, I guess this will be a slideshow on Pascal?
Hehe, decisions, decisions, right?

Huang's choices at this point:
1) Pretty fast DXR on non-RTX cards => leads to "WTF NVDA, did you fool us?"
2) Pretty slow DXR on non-RTX cards => leads to "WTF NVDA, who the f*ck needs this shit?"

#poorhuang
#leatherdudes



I think what this is about is NVIDIA showing commitment to DXR, because there aren't many developers actually interested in deploying it when so few people can actually use it. But because the cards they are adding support to are so crappy at it, this news may not help their cause.
Aren't most developers using existing engines anyhow?
 
Given that it came barely days after the Crytek demo, they must have had this "amazing" tech somewhere in their pockets already.
And, thank you so much, Crytek.
This has much more to do with GDC than CryTek. Sure, CryTek will give their presentation at the same event, but Nvidia's announcement is not a reaction to CryTek.
The tech has been in their pockets for a while. Pascal Quadros do raytracing in various forms (OptiX, mostly). Titan V had DXR support from the beginning and was the primary development vehicle for it. They needed to move the tech to consumer drivers, test it, and wait for a good time to release. Big event, a fairly decent DXR implementation in Metro - seems like the time was now.
 
So now all Nvidia cards can have the RTX suffix, LOL. The best cheating company so far.
 
So now all Nvidia cards can have the RTX suffix, LOL. The best cheating company so far.
Honest question - are you just trolling or do you really not see a difference in GTX and RTX series GPUs?
 
Aren't most developers using existing engines anyhow?
Indeed, and most are still rendering with DirectX 11, never mind DirectX 12 or DXR. Unity and Unreal Engine 4 both have only prototype support for Vulkan and DirectX 12. Electronic Arts, Square Enix, and Crystal Dynamics likely only implemented RTX because NVIDIA paid them to (likely in the form of graphics card gifts). Crytek most likely only implemented DXR as a means to generate some buzz / stay relevant (AMD/Microsoft could have incentivized them as well).

TL;DR: DXR strikes me as groundwork for engine developers to dabble with. It's not something game developers should waste their time on. And most won't, unless there are gifts.
 
Doesn't all that apply to DX12 and Vulkan as well?
 
TL;DR: DXR strikes me as groundwork for engine developers to dabble with. It's not something game developers should waste their time on. And most won't, unless there are gifts.

Quite the opposite. DXR promises to shave thousands of development hours by removing the need to paint lightmaps. The biggest thing currently holding back DX12, though, is the Windows 7 install base with no DX12 support - hence Microsoft just opened up DX12 to Windows 7 on a case-by-case basis, so developers will start moving to DX12 and fully taking advantage of it, rather than having to ensure their games work well under both.
 
So many brainless idiots in this topic, it's actually cringe-worthy. They have absolutely no clue as to what ray tracing is, how it works, why RTX hardware is essential for it, and why this news means you can try it on RTX-less cards but you won't really be able to enjoy it.

Also, DXR is a D3D12 extension, which means it should work on any D3D12-capable GPU via shader emulation. NVIDIA has not enabled RTX on RTX-less cards; they have enabled the software emulation of a D3D12 feature, DXR.

Crytek most likely only implemented DXR as a means to generate some buzz/stay relevant (AMD/Microsoft could have incentivized them to as well).

Crytek has not implemented DXR - they have a custom algorithm for ray-traced reflections.

if you think nvidia is generous enough to enable DXR on 10- and 16-series cards, you are sorely mistaken
they enabled it because Crytek showed off ray tracing on an AMD GPU

How convenient for nvidia to turn some feature on/off via a driver update for billions of GPUs worldwide
from some cubicle in an office

This has nothing to do with Crytek. Crytek (speaking of the demo) has nothing to do with DXR. This is all about enabling a feature which should work by definition, either accelerated in hardware or emulated via shaders.
 
Well, it was to be expected after the Crytek demo on an AMD GPU...
Yeah... because these decisions are made in a few days based on media news, with no planning whatsoever...
 
I wouldn’t be surprised if it is a ploy for them to say later “See, we told you. You need our RTX cards to do this right.”

I think that is obvious, isn't it? Nvidia already offered a carefully prepared slide showing the massive performance gap. But that doesn't matter. Content is still lacking, and the incentive to get Turing is still low. But in the long run this is that 'glimpse' people need to get if you want them to spend on it. I'll tell you, that Quake 2 RTX demo tickled me more than any RTX content to date. It's a much clearer showcase of what can be done. And I think that's one example of a game you could run at somewhat playable FPS on Pascal.

Anyway, this is a great way to tease the technology and boost adoption rates. Overall, what's coming out now looks a whole lot more like an actual industry effort: broad adoption and multiple ways to attack the performance problem. The one-sided RTX approach carried only by a near-monopolist wasn't healthy. This, however, looks promising.
 
Thank you Crytek
 
With the cost of RTX cards at each performance point, this very well could be the beginning of the end of my gaming. A $500 minimum for a GPU (RTX 2070) that can actually do RT at halfway acceptable frame rates is just not worth the investment to me. Today's computer games are just not worth purchasing a $500 video card for, IMHO.
 
This has much more to do with GDC than CryTek.
I see.

So nVidia spent years upon years "doing something with RT" to come up with the glorious (stock-market-crashing :))))) RTX stuff that absolutely had to use RTX cards, because, for those too stupid to get it: it has that R in it! It stands for "raytracing". Unlike GTX, where G stands for something else (gravity, perhaps).

So they came up with that ultimate plan to bend customers over in 360°+ fashion, with shiny expensive RTX cards absolutely needed to experience that ungodly goodness brought to you by the #leatherman.

Did I say "years"? Right, so years of development to bring you that RTX thing.

But then, there is a twist. Somewhere in January, the #leatherman smokes something unusual and asks his peasants whether it wouldn't be even cooler if that whole R thing somehow worked on G things.
And the engineers kinda say "kinda yes", but the finance officers say "kinda no, because less monez".

Surely, Crytek RT stealing RTX's thunder => "no no no, wait, we can also do it on non-RTX cards" is just a coincidence. It is more than obvious.



DXR promises to shave thousands of development hours by removing the need to paint lightmaps.
Yeah, but the problem here is the #leatherman.
For said savings to materialize, one should not have to make two versions of the game.
RT must become a mass-market feature for devs to embrace it en masse, and not just for "gifts".
 