
Crytek Shows Off Neon Noir, A Real-Time Ray Tracing Demo For CRYENGINE

You're the one that wants to make this about bias. They are equal to or more expensive than the VII for the MAJORITY of available cards, full stop. 2-3 budget-line examples don't change the fact that the VII is reference only, so one size/price fits all.
I wasn't the one that made up numbers, I just checked them, because 40% more for a card with the same MSRP seemed way off.
So now a card that doesn't fit your 9K-10K class is too budget, eh?
Maybe do some research on those 7K models, they're cooler and quieter than the VII.
I'm having fun.
 
So now a card that doesn't fit your 9K-10K class is too budget, eh?
Maybe do some research on those 7K models, they're cooler and quieter than the VII.
A budget card is a budget card, not a hard concept to grasp. You get what you pay for. Still, it doesn't change the fact that the majority of cards are more expensive than the VII, no matter how you're trying to split hairs now.
Let's get back to ray tracing, shall we? I don't feel like arguing with the resident Nvidia cheerleader anymore.
 
A budget card is a budget card, not a hard concept to grasp. You get what you pay for. Still, it doesn't change the fact that the majority of cards are more expensive than the VII, no matter how you're trying to split hairs now.
Let's get back to ray tracing, shall we? I don't feel like arguing with the resident Nvidia cheerleader anymore.
Sorry, but you're the one that brought up pricing in the first place and is now resorting to name-calling.
 
Nice. So now that it has been done on AMD hardware, we have three pages of comments and nobody calls RTRT a "gimmick" or a "fad" anymore. Who could have seen that one coming? :D

Well, there are at least two people who still don't believe in it. @me and @Vayra86. Can't speak for the others but TPU has a lot of bandwagoners.
 
Sorry, but you're the one that brought up pricing in the first place.
And you, being the resident NV cheerleader, couldn't help but latch onto it to support your team. The bottom line is still that most 2080s are more expensive than the VII, so let's just stop there.
 
And you, being the resident NV cheerleader, couldn't help but latch onto it to support your team. The bottom line is still that most 2080s are more expensive than the VII, so let's just stop there.
If you count correcting your statements as "supporting my team"...
 
Okay big guy, there's no need for that. I told you it wouldn't matter if you said you prefer the Radeon VII to the 2080 anyway.

PS: for other users, you can use your mouse to scroll the list down.
 
Okay big guy, there's no need for that. I told you it wouldn't matter if you said you prefer the Radeon VII to the 2080 anyway.

PS: for other users, you can use your mouse to scroll the list down.
And you prefer the 2080 to the VII, but the numbers are still in my favour. Let's just stop the back and forth.
 
And you prefer the 2080 to the VII, but the numbers are still in my favour. Let's just stop the back and forth.
I think you're either blind or obnoxiously lying.
and why "in your favor" ? are you,by any chance, an AMD cheerleader ?
 
I think you're either blind or obnoxiously lying.
Sorry, but the price list is lying? Give it up, now you're just being intentionally argumentative...
 
Sorry, but the price list is lying? Give it up, now you're just being intentionally argumentative...
It isn't lying, and that's the whole thing.
 
33 2080s, 27 of them more expensive than the VII. Where's the lie?
What, am I going to have to repeat myself five times for you to understand? :laugh:
 
Can we please move back on topic? I find this software ray tracing done by Crytek to be fantastic! It really does make me think there may be more ways to crack this nut.

If they can gain a little traction and it finds interest with a developer or two, I also think we could have a VHS/Beta-type competition that will take consumers a few years to decide which will be the way forward for ray tracing in games. I, for one, am happy I can sit this out and watch for now.
 
Sorry, but the price list is lying? Give it up, now you're just being intentionally argumentative...
The price list isn't lying, but you comparing a video card that's only available in reference design to another card that, in its reference design, costs the same, and then bringing custom models into the discussion, was either you trying to convince yourself the Radeon VII was the better pick (it could be, I don't know your needs) or just obvious flame bait.

Can we please move back on topic? I find this software ray tracing done by Crytek to be fantastic! It really does make me think there may be more ways to crack this nut.

If they can gain a little traction and it finds interest with a developer or two, I also think we could have a VHS/Beta-type competition that will take consumers a few years to decide which will be the way forward for ray tracing in games. I, for one, am happy I can sit this out and watch for now.
Oh, sure, there are many ways to go about it. And it obviously won't take off without more widespread support. I'm going to make a rather out-of-place comparison here: Turing is like Gagarin's flight. It didn't mean that's how we were going to be flying from then on, but it was the moment when the idea was no longer something that would happen at some point; all of a sudden, it was there.
 
No DXR or fallback layer, so this "ray tracing" method will be locked to CRYENGINE, while the rest will use the industry-standard DXR.

Crytek convulsions...
 
33 2080s, 27 of them more expensive than the VII. Where's the lie?

If you can comfortably find a 2080 at the same price as a Radeon VII, why would you even consider the higher prices?
Heck, even the shittiest 2080s are usually better than Radeon VII in terms of noise and cooling.
 
If you can comfortably find a 2080 at the same price as a Radeon VII, why would you even consider the higher prices?
Heck, even the shittiest 2080s are usually better than Radeon VII in terms of noise and cooling.
Horses for courses. Let’s just get back to CryEngine bringing Ray Tracing to the masses.
 
Oh, sure, there are many ways to go about it. And it obviously won't take off without more widespread support. I'm going to make a rather out-of-place comparison here: Turing is like Gagarin's flight. It didn't mean that's how we were going to be flying from then on, but it was the moment when the idea was no longer something that would happen at some point; all of a sudden, it was there.

I can totally agree with that view on RTX, for sure!

No DXR or fallback layer, so this "ray tracing" method will be locked to CRYENGINE, while the rest will use the industry-standard DXR.

Crytek convulsions...

Why would you need a software fallback for a software solution? Also, "industry standard" is a bit of a stretch when it's tied to an API with low adoption and is implemented in half a handful of games...
 
I just thought I'd copy something I posted in another thread, as it's relevant to this discussion regarding the CRYENGINE RT approach.

While that is nice, honestly I don't expect it to perform anywhere near what the Turing 20-series can do with the ASIC RT cores. It runs well on Vega because those GPUs have a lot of shader processors that aren't doing a whole lot when gaming (Vega SPs are underutilised because they're waiting for other parts of the pipeline to finish, i.e. geometry). Filling them up with RT ops while they wait will result in Vega doing quite well with GPGPU RT on the shaders. Vega can also potentially use Rapid Packed Math FP16x2 to accelerate that process. I don't think it will run amazingly well on Pascal or the baby Turings (TU116) because these GPUs are already close to peak shader utilisation, I think.
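
To put rough numbers on that (back-of-the-envelope, from Vega 64's public specs): 4096 shaders x 2 FLOPs x ~1.5 GHz ≈ 12.5 TFLOPS FP32, and Rapid Packed Math roughly doubles that to ~25 TFLOPS for whatever parts of the RT math can tolerate FP16.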

Well-optimised RT code running on GPGPU shaders is great of course, as it's vendor- and API-agnostic, but dedicated HW is going to perform better at the same IQ, or have superior IQ at the same frame rate, I think. NVIDIA will be working pretty hard to optimise the driver and code for the RT cores, too. Also: NVIDIA bet a lot on dedicated fixed-function units to do the BVH part of ray tracing. I think if the same or better performance could be achieved by throwing more GPGPU CUDA cores at the problem and running that code on those instead, they would have done it. Just my thoughts.
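
To make the "BVH part" concrete, here is a minimal C++ sketch of the ray/box slab test that BVH traversal repeats at every node a ray visits (my own illustration, not Crytek's or NVIDIA's actual code). RT cores run this sort of loop in fixed-function hardware, while a shader-based approach like Crytek's pays for it in ordinary ALU cycles instead.

#include <algorithm>

// Ray stored with the reciprocal of its direction, as traversal code usually keeps it.
struct Ray  { float ox, oy, oz; float idx, idy, idz; };
struct AABB { float minx, miny, minz, maxx, maxy, maxz; };

// "Slab test": intersect the ray with the three pairs of box planes and check
// whether the surviving t-interval is non-empty and in front of the ray origin.
bool hitAABB(const Ray& r, const AABB& b, float tMax)
{
    float t1 = (b.minx - r.ox) * r.idx, t2 = (b.maxx - r.ox) * r.idx;
    float tNear = std::min(t1, t2), tFar = std::max(t1, t2);

    t1 = (b.miny - r.oy) * r.idy; t2 = (b.maxy - r.oy) * r.idy;
    tNear = std::max(tNear, std::min(t1, t2));
    tFar  = std::min(tFar,  std::max(t1, t2));

    t1 = (b.minz - r.oz) * r.idz; t2 = (b.maxz - r.oz) * r.idz;
    tNear = std::max(tNear, std::min(t1, t2));
    tFar  = std::min(tFar,  std::max(t1, t2));

    return tFar >= std::max(tNear, 0.0f) && tNear <= tMax;
}

A traversal runs this test (plus a triangle test at the leaves) for every node a ray touches, so moving exactly that work into dedicated hardware is where the fixed-function bet pays off.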

These are just my 2 cents on the ray tracing. Maybe I'm wrong and the big Turing implementation is innately inferior to a GPGPU shader approach... Time will tell.

I didn't, and won't, invest in a 20-series GPU because I feel it isn't worth it yet. But I do think Nvidia will improve the dedicated RTX cores.
 
We'll see the results and we'll judge it then.
So far, all we can do with old cards in real time is tray racing.

[Image: Japanese waiter Daisuke Sonoda carrying stacked trays, Tokyo, Japan, 22nd May 2016]
 
OMG!!! What the hell are you doing? This Crytek demo is not the big gun you want to use to defend RTX! GET OVER IT! Two pages wasted on nonsense posts! MOD, please remove all non-related posts, thanks.
 