
NVIDIA Recreates Lunar Landing with RTX Technology and Ray Tracing

Because any showcase so far has shown that actual performance is terrible when it’s on.
Citation?
Not sure why you're being such a cheerleader on this...
Maybe because raytracing is not new, only the real-time aspect of it, and because I see the big picture and the writing on the wall. Raytracing is not a fad that will disappear. It is a proven technology that has been around for more than two decades. Devs have been looking for ways to do it in real time for just as long, and now we have that way. It will only improve.

If you don't see the big picture, that's a thing. However, such a narrow view will stop no one from moving forward with progress in this area.

The new cards offer performance gains over the previous gen until you turn on the “novelty”; then it’s crippled.
Again, citation?
 
The Metro demo was done using 3 spp, which overly cripples performance to boost visuals, and it still had noise.

During GTC 18, Nvidia specifically said 2 spp on certain effects would be preferable, but the performance wasn't there. That means going down to 2 spp or 1 spp would introduce more noise, and the demo was environment-only.

We should wait until the final versions of the implementations are done. Going off these demos, done at higher outputs that won't be feasible in an actual game, isn't good for expectations.
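For what it's worth, the spp-vs-noise relationship being argued here is just Monte Carlo statistics: averaging more samples per pixel shrinks the error roughly as 1/sqrt(spp). A toy Python sketch (purely illustrative random sampling, not renderer code) shows why dropping from 3 spp to 1 spp makes the image visibly grainier before denoising:

```python
import random

def pixel_estimate(spp, seed=0):
    """Average `spp` noisy samples of a pixel whose true brightness is 0.5."""
    rng = random.Random(seed)
    samples = [rng.uniform(0.0, 1.0) for _ in range(spp)]
    return sum(samples) / spp

def mean_error(spp, trials=2000):
    """Average absolute error over many independent pixels."""
    return sum(abs(pixel_estimate(spp, seed=s) - 0.5) for s in range(trials)) / trials

# More samples per pixel -> less noise, roughly as 1/sqrt(spp)
for spp in (1, 2, 3, 16):
    print(f"{spp:>2} spp -> mean error {mean_error(spp):.3f}")
```

The error at 1 spp comes out roughly 1.7x the error at 3 spp, which is the grain the denoiser then has to hide.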
 
Citation?

Maybe because raytracing is not new, only the real-time aspect of it, and because I see the big picture and the writing on the wall. Raytracing is not a fad that will disappear. It is a proven technology that has been around for more than two decades. Devs have been looking for ways to do it in real time for just as long, and now we have that way. It will only improve.

If you don't see the big picture, that's a thing. However, such a narrow view will stop no one from moving forward with progress in this area.


Again, citation?
Again, not denying the tech; I'm saying the hardware isn't ready. You want RTRT right now? Expect a hard performance hit to get it.
 
You want RTRT right now? Expect a hard performance hit to get it.
The point was, where are you getting that from? There are no comparisons to make that conclusion yet. When there are current benchmarks/games that have known performance metrics and are then patched with RTRT we can make comparisons and conclusions. We're not there yet and there is only speculation as to real world performance.
 
Here's a better question; Why the skepticism? This is an exciting new technology that has great and far-reaching potential for the future of consumer computing. Why do so many people make an effort to tear it down? What do they fear?

That is one hell of a fanboy defense if I ever saw one.
You should ALWAYS be skeptical.
No, this is not exciting at all. It's, I guess, OK (some progress in visual fidelity), but please cool it a bit.
"Far reaching potential" yeah ok.

We fear nothing, we doubt the marketing that Nvidia put forth.
 
Here's a better question; Why the skepticism? This is an exciting new technology that has great and far-reaching potential for the future of consumer computing. Why do so many people make an effort to tear it down? What do they fear?

I am not tearing down RTRT, just the laughable marketing. While this is unarguably a large step forward, as RTRT has never been a factor in PC gaming, it isn't a game changer...yet. For all intents and purposes, RTRT is useless right now because enthusiasts are not going to be playing at 1080p@60. They are going to be at 4K, 21:9, or 1440p high refresh. Next gen is when it will be a game changer.

When every major dev (and even many of the smaller ones) is chomping at the bit to take advantage of RTRT, that will be something to get the warm fuzzies over.

I have yet to see them even accomplish multi-GPU in DX12. I don't have high hopes that they'll get around to an efficient implementation of RTRT for many years either. Besides, what good is a pretty game if the underlying story line is trash and the game play is a rendition of either Assassin's Creed, COD, or Battlefield?
 
I have yet to see them even accomplish multi-GPU in DX12. I don't have high hopes that they'll get around to an efficient implementation of RTRT for many years either. Besides, what good is a pretty game if the underlying story line is trash and the game play is a rendition of either Assassin's Creed, COD, or Battlefield?

I have yet to hear from anyone outside of those in partnership with Nvidia. Until games not backed by Nvidia jump on the bandwagon, it's going to be dependent on Nvidia bank-rolling them to include those features.

Nvidia RT is going to be part of GameWorks, so there is your answer
 
Using 60fps capture when the content only runs at 24fps :roll:
 
That is one hell of a fanboy defense if I ever saw one.
Or I'm just excited for the future. And if you had paid attention, you would have read that I'm also excited to see AMD's answer to RTRT. You were saying?
You should ALWAYS be skeptical.
Ok, I'm going to be skeptical of that statement because it seems needlessly narrow-minded. Thanks for the advice! :rockout:
No, this is not exciting at all. It's, I guess, OK (some progress in visual fidelity), but please cool it a bit.
You guess eh? Terrible guess.
 
So, how well does this run on Vega?
I'd bet this particular demo is running directly in OptiX.
The other RTRT demos supposedly running on DXR cannot be tested either as AMD has yet to release DXR drivers for Vega.

Nvidia RT is going to be part of GameWorks, so there is your answer
The implementation is part of GameWorks. All the games showing off RTRT are running in DX12 with DXR. Vulkan got NV extensions, and while it is unclear whether it will get generic RTRT support, there is a good chance it might. This is pretty good as far as standardization goes.
 
The point was, where are you getting that from? There are no comparisons to make that conclusion yet. When there are current benchmarks/games that have known performance metrics and are then patched with RTRT we can make comparisons and conclusions. We're not there yet and there is only speculation as to real world performance.

Everything we have seen so far is clearly low-FPS material; there is no denying that. So there is that, for measurable performance. On top of that, we see low-poly models left and right (this Lunar Landing video even has them, and they are quite noticeable), and that is happening in *tech demos*. And on top of all that, we know many of these demos require multiple GPUs and a fixed, highly optimized run/scenario. Gaming is none of that: SLI is no longer interesting (NVLink won't change that; it's more of a niche than it ever was since midrange no longer has it), and games are naturally not perfectly optimized and are much more dynamic than a demo.

You say you see the writing on the wall; THAT is the writing on the wall. Performance is abysmal, and that is underlined every time we see or hear of RTRT. And if it's not, it's blurry junk that, while 'dynamic', is decidedly uglier than the tried-and-tested approach, while still performing worse than that approach.

The quality we need in RTRT is still far beyond reach for most games in any sort of playable setting.

Last but not least, and this is very important to keep in the back of your head:

'RT cores' and the current state of Turing are just a hand-me-down from Volta. The intention was never to go all-out on ray tracing; it's just something the tensor hardware is quite capable of. But almost nothing in the Turing GPU was specifically built for RTRT in games. It's still mostly meant for development work. This tech was meant for Quadro- and Tesla-class GPUs. That also explains why we get cut-down chips, it suggests that yields are not fantastic and that the cost/risk is rather high, and the performance confirms that none of this was purpose-built for real-time RT. It just works 'okay-ish' on Turing as well. That is a somewhat different, but far more realistic, perspective on the 'why' behind this release, and I daresay a tad more believable than Nvidia saying they've worked on it for 10 years.

So, bottom line, for me:
- this can easily turn out to be PhysX 2.0
- industry 'support' says almost nothing. Remember DX12's mGPU support? If you leave it to devs to implement, it will be scarce.
- Nvidia is destroying all incentive to saturate the market with RT-capable cards (only high-end parts, no content to trigger buyers at release, and killing SLI at midrange all contribute to this)
- Implementation today means spending resources and time for a tiny niche of the gaming market
- Ray tracing isn't new, and trying to get it done in real time isn't new either. If it's such a holy grail, why approach it with such a sad attempt, and for such a tiny subgroup of buyers? If Nvidia truly believed in it, the approach would be different.

Turing is a test vehicle for RTRT. Mark my words. If it doesn't stick, it'll remain an afterthought that will barely see implementation.
 
Oh goodie, the conspiracy theorists will use this as more evidence that the moon landings were faked

You can find a particular picture yourself from NASA's website, adjust some values and easily see photoshop in play. There are many reasons why it never happened. Stay asleep though. Sleep is good for you.

Personally, I really like Nvidia's ray tracing tech

I'm certainly not opposed to it. I welcome all endeavors to enhance realistic lighting. I would prefer a solution that could become an industry standard though.
 
There are many reasons why it never happened.
The most insurmountable evidence that it DID happen is that the sworn enemy of the U.S. during the Cold War, the USSR, who would have JUMPED at the chance to make the U.S. look bad, agreed it happened, several times.
 
You can find a particular picture yourself from NASA's website, adjust some values and easily see photoshop in play. There are many reasons why it never happened. Stay asleep though. Sleep is good for you.

Jesus hernandez christ, people this stupid really do exist on a tech forum.
 
You can find a particular picture yourself from NASA's website, adjust some values and easily see photoshop in play.

Not to suggest in either direction as I don't care to get involved in this one but I don't believe Photoshop existed in '69.
 
The point was, where are you getting that from? There are no comparisons to make that conclusion yet. When there are current benchmarks/games that have known performance metrics and are then patched with RTRT we can make comparisons and conclusions. We're not there yet and there is only speculation as to real world performance.

GTC 2018 Europe



However, the costs are high: the contact and sun shadows, calculated with two rays per pixel including noise rejection, together require 2.3 ms per frame, and the reflections a whopping 4.4 ms. Denoising of the global illumination extends the rendering process by another 2.5 ms.

This is a total of 9.2 ms per frame, and thus an almost one-third higher computational overhead if we take 30 frames per second as the basis (42.2 ms instead of 33 ms per frame).

This is the bare minimum (1 rpp/1 spp), and the hit is significant enough to exclude 100 Hz at 1080p by this process alone, never mind anything else the game needs.
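Running the quoted figures as a quick sanity check (the 33 ms in the quote is 30 fps rounded; strictly 1000/30 ≈ 33.3 ms), the arithmetic looks like this:

```python
# Frame-time arithmetic from the GTC 2018 figures quoted above.
shadows_ms = 2.3       # contact + sun shadows, incl. noise rejection
reflections_ms = 4.4   # ray-traced reflections
gi_denoise_ms = 2.5    # denoising of the global illumination
rt_total_ms = shadows_ms + reflections_ms + gi_denoise_ms   # 9.2 ms

base_ms = 1000.0 / 30.0                 # ~33.3 ms per frame at 30 fps
new_fps = 1000.0 / (base_ms + rt_total_ms)

# A 100 Hz target allows only 10 ms per frame, so the 9.2 ms of
# ray-tracing work alone nearly exhausts that budget.
budget_100hz_ms = 1000.0 / 100.0

print(f"RT adds {rt_total_ms:.1f} ms; 30 fps drops to {new_fps:.1f} fps")
print(f"100 Hz frame budget: {budget_100hz_ms:.1f} ms")
```

On these numbers a 30 fps baseline drops to roughly 23.5 fps, which is where the "one-third higher overhead" figure comes from.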


My turn: citation?

:toast:

Nvidia said:
The upcoming GameWorks SDK — which will support Volta and future-generation GPU architectures — will enable ray-traced area shadows, ray-traced glossy reflections and ray-traced ambient occlusion.
 
All of those who think this is an RTX demo, I can assure you that it is NOT! This demo was released on 7 April 2015, when W10 was not even released... if you don't believe me, check this out!

https://www.nvidia.com/coolstuff/demos#!/apollo-11

I was running this just like the video with 8 GB DDR3-1600 + an i7 4770 & a GTX 970 STRIX... so Nvidia is a huge fuckup nowadays...
 
All of those who think this is an RTX demo, I can assure you that it is NOT! This demo was released on 7 April 2015, when W10 was not even released... if you don't believe me, check this out!

https://www.nvidia.com/coolstuff/demos#!/apollo-11

I was running this just like the video with 8 GB DDR3-1600 + an i7 4770 & a GTX 970 STRIX... so Nvidia is a huge fuckup nowadays...
It’s the same demo, yes, but they’ve added the ray tracing.
 