Friday, March 15th 2019

Crytek Shows Off Neon Noir, A Real-Time Ray Tracing Demo For CRYENGINE

Crytek has released a new video demonstrating the results of a CRYENGINE research and development project. Neon Noir shows how real-time mesh ray-traced reflections and refractions can deliver highly realistic visuals for games. The Neon Noir demo was created with a new, advanced version of CRYENGINE's Total Illumination showcasing real-time ray tracing. The feature will be added to the CRYENGINE release roadmap in 2019, enabling developers around the world to build more immersive scenes, more easily, with a production-ready version of the feature.


Neon Noir follows the journey of a police drone investigating a crime scene. As the drone descends into the streets of a futuristic city illuminated by neon lights, we see its reflection accurately displayed in the windows it passes by, or scattered across the shards of a broken mirror, while its red and blue lights bounce off the different surfaces using CRYENGINE's advanced Total Illumination feature. Demonstrating further how ray tracing can deliver a lifelike environment, neon lights are reflected in the puddles below them, street lights flicker on wet surfaces, and windows accurately reflect the scene opposite them.
Neon Noir was developed on a bespoke version of CRYENGINE 5.5, and the experimental ray tracing feature based on CRYENGINE's Total Illumination used to create the demo is both API and hardware agnostic, enabling ray tracing to run on most mainstream, contemporary AMD and NVIDIA GPUs. However, the future integration of this new CRYENGINE technology will be optimized to benefit from performance enhancements delivered by the latest generation of graphics cards and supported APIs like Vulkan and DX12.

Ray tracing is a rendering technique that simulates complex lighting behaviors. Realism is achieved by simulating the propagation of discrete fractions of energy and their interaction with surfaces. With contemporary GPUs, ray tracing has become more widely adopted by real-time applications like video games, in combination with traditionally less resource-hungry rendering techniques like cube maps, which are used where applicable.
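The description above can be made concrete with a small sketch. The snippet below is purely illustrative (plain Python, not CRYENGINE code): it shows the two core operations behind ray-traced reflections, intersecting a ray with a surface (a sphere here) and mirror-reflecting its direction about the surface normal.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sub(a, b):
    return tuple(x - y for x, y in zip(a, b))

def scale(v, s):
    return tuple(x * s for x in v)

def reflect(d, n):
    """Mirror-reflect direction d about unit surface normal n: r = d - 2(d.n)n."""
    return sub(d, scale(n, 2.0 * dot(d, n)))

def intersect_sphere(origin, direction, center, radius):
    """Return distance t to the nearest hit along a unit-length ray, or None on a miss."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4.0 * c  # direction is assumed unit length, so a == 1
    if disc < 0:
        return None  # the ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None
```

A real-time renderer repeats these two steps for huge numbers of rays per frame, spawning new rays at each bounce; the hardware and API debate around ray tracing is essentially about accelerating this loop.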
The experimental ray tracing feature simplifies and automates the rendering and content creation process, ensuring that animated objects and changes in lighting are correctly reflected with a high level of detail in real time. This eliminates the known limitations of pre-baked cube maps and local screen space reflections when creating smooth surfaces like mirrors, and allows developers to create more realistic, consistent scenes. To showcase the benefits of real-time ray tracing, screen space reflections were not used in this demo.
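For contrast with ray tracing, here is a hedged sketch of the cube-map approach the article mentions: a pre-baked cube map approximates reflections by sampling six static textures, and the per-pixel work amounts to picking which face a reflection direction points at (standard dominant-axis face selection; illustrative Python, not any engine's actual code).

```python
def cube_map_face(direction):
    """Pick the cube-map face a reflection direction samples,
    using the conventional dominant-axis (+X/-X/+Y/-Y/+Z/-Z) selection."""
    x, y, z = direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:
        return "+X" if x > 0 else "-X"
    if ay >= az:
        return "+Y" if y > 0 else "-Y"
    return "+Z" if z > 0 else "-Z"
```

Because the six faces are baked ahead of time, the lookup is cheap, but it cannot show moving objects or the viewer itself, which is exactly the limitation the demo's ray-traced mirrors avoid.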

139 Comments on Crytek Shows Off Neon Noir, A Real-Time Ray Tracing Demo For CRYENGINE

#1
Ferrum Master
RT and made on Vega...

Prepare popcorn everyone.
Posted on Reply
#3
mtcn77
Ferrum Master said:
RT and made on Vega...

Prepare popcorn everyone.
Once again, we will be able to ask the obvious question...
Posted on Reply
#4
Tomgang
That is all great for developers and such. But come on Crytek, give us the game that started it all and showed us what the CryEngine could really do. I am of course talking about Crysis. I want a Crysis 4 that shows the CryEngine's newest advances with ray tracing, DX12 and of course the glorious graphics the games are so known for, and makes the GPUs sweat once again.

Please Crytek. I'm tired of demo after demo. I want Crysis :banghead:
Posted on Reply
#5
jmcslob
Really really like the no RTX needed...
And as soon as I saw Crytek I thought "Yes, Crysis 4"... But it's just a demo...
It's cool and all tho.
Posted on Reply
#6
Warsaw
Tomgang said:
That is all great for developers and such. But come on Crytek, give us the game that started it all and showed us what the CryEngine could really do. I am of course talking about Crysis. I want a Crysis 4 that shows the CryEngine's newest advances with ray tracing, DX12 and of course the glorious graphics the games are so known for, and makes the GPUs sweat once again.

Please Crytek. I'm tired of demo after demo. I want Crysis :banghead:
++++++++1 I agree 99999%! I want graphics to push hardware again and to be in awe of it! I even wrote a thank-you message to Crytek when I noticed they responded to one of the comments on the YouTube video. I want a new Crysis and SWEET SWEET graphics to make our machines crawl. Never before or since has a game made me just want to look at and appreciate the visuals the way Crysis did. It just blew me away.
Posted on Reply
#7
JB_Gamer
Ok..., then what about RTX, will it be redundant???
Posted on Reply
#8
Nxodus
JB_Gamer said:
Ok..., then what about RTX, will it be redundant???
Don't expect Crytek raytracing to be easy on hardware.
The extra RT cores on Nvidia cards will give a serious edge over non-RT cards
Posted on Reply
#9
yakk
(hybrid)Ray Tracing & Vulkan

All good then!

nvidia will probably use their dedicated cores for it, and AMD can leverage their superior compute for it. Interesting to see how this progresses.

The only hardware left out seems to be non-RTX Nvidia cards, which also don't have the necessary compute performance.
Posted on Reply
#10
dinmaster
Does no one remember this company almost folding multiple times because they couldn't pay their employees? They are selling a game engine, and the demos show the improvements to it. The Hunt game is small and worth it for them. If they come out with a battle royale game that makes them some good money, they will be able to make those games again. I don't see them making another Crysis for some time, unfortunately...
Posted on Reply
#11
Mistral
JB_Gamer said:
Ok..., then what about RTX, will it be redundant???
Well, there's still a chance Nvidia muscles its way in with Crytek like they did with the Crysis 2 tessellation back in the day...
Posted on Reply
#12
natr0n
Would be cool if they released the demo for us to check out.
Posted on Reply
#13
FreedomEclipse
~Technological Technocrat~
Ubisoft: DOWNGRADING INTENSIFIES
Posted on Reply
#14
XiGMAKiD
Nice to see that they're still afloat, and the demo is pretty nice too
Posted on Reply
#15
Tomorrow
Tomgang said:
That is all great for developers and such. But come on Crytek, give us the game that started it all and showed us what the CryEngine could really do. I am of course talking about Crysis. I want a Crysis 4 that shows the CryEngine's newest advances with ray tracing, DX12 and of course the glorious graphics the games are so known for, and makes the GPUs sweat once again.

Please Crytek. I'm tired of demo after demo. I want Crysis :banghead:
Rather Vulkan. That way Crysis 4, if it ever comes, would not be limited to Win10 and could potentially even run on Linux. We can dream tho...
Posted on Reply
#16
lexluthermiester
Nxodus said:
The extra RT cores on Nvidia cards will give a serious edge over non-RT cards
Very likely. There's no reason not to use them.
Posted on Reply
#17
jmcslob
yakk said:
(hybrid)Ray Tracing & Vulkan

All good then!

nvidia will probably use their dedicated cores for it, and AMD can leverage their superior compute for it. Interesting to see how this progresses.

The only hardware left out seems to be non-RTX Nvidia cards, which also don't have the necessary compute performance.
https://www.anandtech.com/show/11987/the-nvidia-geforce-gtx-1070-ti-founders-edition-review/13
Yes AMD is better but it really just sucks for 1060 owners.
Posted on Reply
#20
Imsochobo
Nxodus said:
Don't expect Crytek raytracing to be easy on hardware.
The extra RT cores on Nvidia cards will give a serious edge over non-RT cards
No and yes.
They're just more efficient cores; they have absolutely no performance benefit per se if you look past energy use, i.e. an RTX 2060 may be slower than the VII in RT games despite the VII having no RT cores.
It's a big misconception, pushed by Nvidia's marketing team, that there is some special sauce in every nook of their arch, kind of like Apple does.

They're just lower-precision cores that aren't as fat, and thus efficient in one way, but it makes the chips very inefficient in performance per die area. I doubt they can do this on smaller nodes at all, so MCM is likely where we'll see RT actually take off; I feel the RTX 2xxx is just too early!

Not to take anything away from Nvidia's highly efficient arch, superior performance etc., but new magical stuff it is not... it's the same old design choice, and for the first time since Kepler I think they've made the wrong choice. Titan and 2080 Ti as RTX, with the rest non-RT, would actually not be so dumb.
The arch and technology are all capable; it is just too early, and that is absolutely not something Nvidia does often :)

Edit: rephrased quite a bit
Posted on Reply
#21
cdawall
where the hell are my stars
Ferrum Master said:
RT and made on Vega...

Prepare popcorn everyone.
Yeah, AMD has talked about ray tracing since Vega first released; they had gobs of PR about it for the Vega FE.
Posted on Reply
#22
Zubasa
JB_Gamer said:
Ok..., then what about RTX, will it be redundant???
That's the thing: in BF5 the RTRT option is actually labeled DXR.
RTX is just Nvidia's branding of DXR + DLSS etc. in this case.
Posted on Reply
#23
cucker tarlson
Imsochobo said:
No and yes.
They're just more efficient cores; they have absolutely no performance benefit per se looking away from energy use
Imsochobo said:

an RTX 2060 may be slower than the VII in RT games
Fanboy fantasies.
No performance benefit, yeah, right.

The 2080 Ti is 1.55x faster than the Titan V in BF5 RTX, so a 2060 would be around Titan V performance in RTRT.
If you think the RVII can beat a 15 TFLOP Nvidia card with 640 dedicated tensor cores (more than the 2080 Ti), then yeah, good luck with that.

https://www.youtube.com/watch?v=yHfP82FwXio


I already heard fanboys say that the Fury X would be faster in RTRT than Pascal; now the RVII would be faster than RTX cards at doing RTX.
Posted on Reply
#24
Vayra86
JB_Gamer said:
Ok..., then what about RTX, will it be redundant???
Yes. Been saying so since day one. A hardware implementation that takes such a massive amount of die space is so grossly inefficient that simple economics will destroy it, if not with Turing then later down the line. It's just not viable, and sales numbers currently only underline that sentiment. I'm not the only one frowning at this; already with the first gen and a meagre implementation we're looking at a major price bump because the die is simply bigger. The market isn't paying it, and devs will not spend time on it as a result. Another aspect: I'm not looking to sell my soul to Nvidia's overpriced proprietary bullshit, and I'm not paying for inefficiency. Efficiency has been the reason I've bought Nvidia the past few generations: their wizardry with VRAM, for example, and balancing out (most) GPUs in the stack so well is quite something. Turing is like a 180 degree turn.

This, however... yes. Simply yes. Attack the performance problem from the angle of a software-based implementation that can scale across the entire GPU instead of just a part of it, while the entire GPU also stays available should you want the performance elsewhere. Even if this runs at 5 FPS in real time today on a Vega 56, it's already more promising than dedicated hardware. This is the only way to avoid a PhysX situation. RT needs widespread adoption to get the content to go along with it. If I can see a poorly running glimpse of my RT future on a low-end GPU, this will catch on, and it will be an immense incentive for people to upgrade, and keep upgrading. That is viable in a marketplace.

Another striking difference, I feel, is the quality of this demo compared to what Nvidia has put out with RTX. This feels like a next step in graphics in every way; the fidelity, the atmosphere simply feels right. With every RTX demo thus far, even in Metro Exodus, I don't get that same feeling. It truly feels like some weird overlay that doesn't come out quite right. Which, in reality, it also is. The cinematically badly lit scenes of Metro only emphasize that when you put them side by side with non-RT scenes. The latter may not always be 'correct', but it sure is a whole lot more playable.

cucker tarlson said:
Fanboy fantasies.
No performance benefit, yeah, right.

The 2080 Ti is 1.55x faster than the Titan V in BF5 RTX, so a 2060 would be around Titan V performance in RTRT.
If you think the RVII can beat a 15 TFLOP Nvidia card with 640 dedicated tensor cores (more than the 2080 Ti), then yeah, good luck with that.

https://www.youtube.com/watch?v=yHfP82FwXio


I already heard fanboys say that the Fury X would be faster in RTRT than Pascal; now the RVII would be faster than RTX cards at doing RTX.
*DXR. In the end, Nvidia is using a customized setup that works for them; it remains to be seen how well AMD can plug into DXR with their solution, or how Crytek does it now, and/or whether they even want or need to. The DX12 requirement sure doesn't help, and DXR will be bogged down by rasterization as well, as it sits within the same API. There is a chance the overall trend will move away from DXR altogether, leaving RTX in the dust or out to find a new point of entry.
Posted on Reply
#25
GinoLatino
lexluthermiester said:
Very likely. There's no reason not to use them.
Considering there are only two games with RT support, I think that with a "runs everywhere" engine those RT cores will end up sitting there watching... unless the engine has an RT implementation from the get-go, of course.

I think it will be interesting to have this demo as a benchmark, make it so Crytek! :)
Posted on Reply