
Performance Slide of RTX 3090 Ampere Leaks, 100% RTX Performance Gain Over Turing

bug

Joined
May 22, 2015
Messages
13,214 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
The "why" is quite telling too (and applies to any game developer, not just game engine developers)
The fact that they didn't need any "RTRT" yet delivered light effects that impressive was what I was referring to.
Stop that. Rasterization has many tricks and can look really, really good. But it can't do everything RT can (e.g. free ambient occlusion, off-screen reflections).
 
Joined
Sep 17, 2014
Messages
20,906 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
That's not how it works. Nvidia spends time to train some neural networks, the developer only has to make some specific API calls.
For DLSS 1.0, Nvidia had to train on a per-title basis. Past a certain point that was no longer necessary, and now we have DLSS 2.0. I've even heard DLSS 3.0 may do away with the API-specific calls, but I'll believe that when I see it (I don't doubt that would be the ideal implementation, but I have no idea how close/far Nvidia is from it).

DLSS still requires support per title. Whether that involves extensive training is not very relevant for the end-user. You need an update.

The same story as with VR. And where is VR now?

You're spot on this time...

VR and RT are rather comparable. It's not even a weak comparison like @lexluthermiester thinks. They are both add-ons to the base gaming experience. They both require additional hardware. They both require a specific performance floor to be enjoyable. They both require additional developer effort while adding no gameplay (length, content) - in fact it actually encourages shorter playtime, as more time is lost on other things.

Effectively, while it was touted as a massive time saver, RTRT does impact dev time to market, and VR does the exact same thing. It adds base cost to any game that is to be developed for a broad market and multiple input devices. It's a step harder than the ones we had until now, too. Somehow Jensen is selling the idea that this will at some point pay off for devs, but there is no proof of that yet. Games with RT don't sell more because of it; the vast majority can't use the features. I believe BF V was one of the least played Battlefields in recent times, for example. RT didn't help it. Metro Exodus... same thing. Cyberpunk - it will again be more of the same. None of these games sold on their RT features, and none of them gained playtime from them either. I've not seen a single person say he'd replay Control just to gaze at the reflections again.

The chicken/egg situation is similar between these technologies: they both apply to gaming, and they both, still, are only relevant to a niche of the PC gaming market.

Anyone thinking this will gain traction now or in the next two years is still deluding himself. It wasn't that way with Turing, and it hasn't changed since. Not one bit. HL: Alyx didn't become the catalyst for mass VR adoption either, did it?

Bottom line... the RT hype train should not be boarded just yet, if you ask me. Especially not when the hardware apparently forces so many sacrifices from us: financially, in heat and size, and in raw raster performance growth. I'll take a software solution any day of the week, and we know they exist. CryEngine has a neat implementation; give me that instead. It's cheap, simple, and can 100% utilize existing hardware while the visual difference is minimal at best.
 

bug

Joined
May 22, 2015
Messages
13,214 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
DLSS still requires support per title. Whether that involves extensive training is not very relevant for the end-user. You need an update.
The post I replied to was not about the end user, but about how hard it is to implement DLSS for developers. I was just pointing out that devs' lives are much easier now than they were with DLSS 1.0. They still have to make some specific API calls, but they don't need to wait for Nvidia to train specifically for their title before doing their part.
I'm sure Nvidia still refines its training, but the results of that can be delivered in a driver update; they're not blocking anything anymore.
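To make that concrete: the integration burden nowadays is basically wiring up a few per-frame calls. The sketch below is illustrative Python pseudocode with made-up names - the real NGX/DLSS SDK is a C/C++ API and its entry points differ - but the shape of the work is this:

```python
# Illustrative sketch only: names are invented, not the real DLSS SDK.
# The point is what the engine must supply each frame, and why Nvidia's
# training improvements can ship in a driver rather than a game patch.

class TemporalUpscaler:
    """Stand-in for a vendor-provided upscaler handle."""
    def __init__(self, render_size, output_size):
        self.render_size = render_size    # internal (low) resolution
        self.output_size = output_size    # target (display) resolution

    def evaluate(self, color, depth, motion_vectors, jitter):
        # The vendor runtime consumes the low-res frame plus per-pixel
        # motion vectors and this frame's sub-pixel jitter, and returns
        # an upscaled image. The trained network lives in the driver,
        # so a driver update can improve results on its own.
        raise NotImplementedError("vendor runtime goes here")

def render_frame(engine, upscaler):
    jitter = engine.next_jitter()                 # sub-pixel camera offset
    frame = engine.rasterize(upscaler.render_size, jitter)
    return upscaler.evaluate(frame.color, frame.depth,
                             frame.motion_vectors, jitter)
```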
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.11/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
You see how people's minds work? I couldn't resist mentioning Ratchet and Clank since it is the first next-gen-only game coming. All those visual improvements are in rasterization, but because there is some inconsequential amount of RT in the game, somehow it is already an RT showcase for silly people. nVidia has already won the marketing battle for the silly people, that is for sure. I believe there will be an RT-off setting for 60fps, then we can compare after launch. Prepare to prefer the non-RT version...



Yeah that is another good example :)

Imagine being stupid enough to think not having RT is a plus point and having it means the game developer has to cripple the visuals to make RT look good.
TPU's comment section feels like a joke sometimes.
We all know RT is only used for part of the rendering, but you're making a joke out of yourself by acting like RT is a bad thing and developers should stick to rasterization forever because it somehow gives better results in your ignorant mind.
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
Imagine being stupid enough to think not having RT is a plus point and having it means the game developer has to cripple the visuals to make RT look good.

Adding RT means wasting effort on something only a fraction of the market will use; surely there are better things to do with a game.
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.11/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
VR and RT are rather comparable.

Have you tried VR though?
Try playing a racing game with VR and suddenly you can't go back anymore...
 

bug

Joined
May 22, 2015
Messages
13,214 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
Adding RT means wasting effort on something only a fraction of the market will use; surely there are better things to do with a game.
Wasn't that the case with every gfx advancement already?

I've said it before, I'll say it again: bolting RT on top of rasterization is like a hybrid car: you still have to source the materials for everything and you get more things that can go wrong. That doesn't mean an electric car (or going just RT) is a bad idea.
 

M2B

Joined
Jun 2, 2017
Messages
284 (0.11/day)
Location
Iran
Processor Intel Core i5-8600K @4.9GHz
Motherboard MSI Z370 Gaming Pro Carbon
Cooling Cooler Master MasterLiquid ML240L RGB
Memory XPG 8GBx2 - 3200MHz CL16
Video Card(s) Asus Strix GTX 1080 OC Edition 8G 11Gbps
Storage 2x Samsung 850 EVO 1TB
Display(s) BenQ PD3200U
Case Thermaltake View 71 Tempered Glass RGB Edition
Power Supply EVGA 650 P2
Adding RT means wasting effort on something only a fraction of the market will use; surely there are better things to do with a game.
Yeah, you'd be better off wasting your development time tinkering and tweaking the SSR implementation and hoping to get it right, and even if you get it right, it will still be flawed because of the flawed nature of screen-space effects.
What a time to be alive...
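For anyone wondering what that flawed nature actually is, here is a minimal toy sketch of a screen-space ray march (illustrative code, tied to no particular engine). The early-out is the whole problem: the moment the reflected ray leaves the viewport, the data it needs simply doesn't exist.

```python
import numpy as np

def ssr_trace(uv, dir_ss, depth_buffer, color_buffer, steps=64):
    """March a reflected ray in screen space; p = (u, v, depth)."""
    h, w = depth_buffer.shape
    p = np.array([uv[0], uv[1],
                  depth_buffer[int(uv[1] * h), int(uv[0] * w)]], dtype=float)
    step = np.asarray(dir_ss, dtype=float) / steps   # (du, dv, ddepth)
    for _ in range(steps):
        p += step
        # The fundamental limitation: anything off-screen (or hidden
        # behind foreground geometry) was never shaded, so it cannot be
        # reflected. Engines fade the reflection out or fall back to a
        # cubemap here - hence the telltale SSR artifacts.
        if not (0.0 <= p[0] < 1.0 and 0.0 <= p[1] < 1.0):
            return None                       # miss: ray left the screen
        x, y = int(p[0] * w), int(p[1] * h)
        if depth_buffer[y, x] < p[2]:         # ray passed behind a surface
            return color_buffer[y, x]         # "hit": reuse on-screen color
    return None
```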
 
Joined
Sep 17, 2014
Messages
20,906 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Wasn't that the case with every gfx advancement already?

I've said it before, I'll say it again: bolting RT on top of rasterization is like a hybrid car: you still have to source the materials for everything and you get more things that can go wrong. That doesn't mean an electric car (or going just RT) is a bad idea.

As time and research progress, in fact, electric cars are increasingly looking like a bad idea. The investment likely won't outweigh the benefits compared to 'riding out petrol' for a few more decades and then moving straight to other fuel types. There are lots of similarities, too. Essential resources for producing batteries, for example, are scarce; the production processes are highly polluting; and it requires a massive, costly and painful ecosystem shift. All of that while knowing that the overall TCO and carbon footprint is still lower, so there is a benefit.

RT is no different. I think many can see the benefit, and so do I, but they question whether it is feasible. I'm on a similar train of thought when it comes to the current implementation (brute forcing) of RT. The whole rasterized approach was based on the idea that brute-forcing everything was never going to work out well. I don't see why this has changed. If you zoom out a bit, we see all sorts of metrics are strained: die size and cost; wafer/production capacity versus demand; market adoption problems... and, since recently, the state of the world economy, the climate, and whether growth is always a good thing. None of that supports making bigger dies to do mostly the same things.

But yeah, we've been over this previously... let's see that momentum in the delivery of working RT in products. So far... it's too quiet. A silence that is similar to that of an electric car ;)

Yeah, you'd be better off wasting your development time tinkering and tweaking the SSR implementation and hoping to get it right, and even if you get it right, it will still be flawed because of the flawed nature of screen-space effects.
What a time to be alive...

Nobody considered it flawed until Jensen launched RTX and said it was. A good thing to keep in mind. Maybe devs did, but end users really never cared. A demand was created where there wasn't one. It's good business before anything else.
 
Joined
Jul 9, 2015
Messages
3,413 (1.06/day)
System Name M3401 notebook
Processor 5600H
Motherboard NA
Memory 16GB
Video Card(s) 3050
Storage 500GB SSD
Display(s) 14" OLED screen of the laptop
Software Windows 10
Benchmark Scores 3050 scores a good 15-20% lower than average, despite ASUS's claims that it has uber cooling.
and even if you get it right, it will still be flawed because of the flawed nature of screen-space effects.
I think we are way past the point of taking "realistic RT" at its face value.

Also, please dig up a part of this demo that would benefit from RTRT:
 

bug

Joined
May 22, 2015
Messages
13,214 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
@Vayra86 We have rasterization because of the need for real-time rendering. Movies and whatever else needs accurate rendering are already using RT; they just have to pre-render everything.
Unlike EVs (which I also feel were pushed too soon, too hard), there is demand for RTRT, even if it's not all about games. As for RTRT adoption: two years ago nobody was talking about it; today RTRT is in both Nvidia and (soon-to-be-released) AMD silicon, in Windows, and in both major consoles. That's a pretty good start if you ask me.
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
DLSS still requires support per title. Whether that involves extensive training is not very relevant for the end-user. You need an update.



You're spot on this time...

VR and RT are rather comparable. It's not even a weak comparison like @lexluthermiester thinks. They are both add-ons to the base gaming experience. They both require additional hardware. They both require a specific performance floor to be enjoyable. They both require additional developer effort while adding no gameplay (length, content) - in fact it actually encourages shorter playtime, as more time is lost on other things.

Effectively, while it was touted as a massive time saver, RTRT does impact dev time to market, and VR does the exact same thing. It adds base cost to any game that is to be developed for a broad market and multiple input devices. It's a step harder than the ones we had until now, too. Somehow Jensen is selling the idea that this will at some point pay off for devs, but there is no proof of that yet. Games with RT don't sell more because of it; the vast majority can't use the features. I believe BF V was one of the least played Battlefields in recent times, for example. RT didn't help it. Metro Exodus... same thing. Cyberpunk - it will again be more of the same. None of these games sold on their RT features, and none of them gained playtime from them either. I've not seen a single person say he'd replay Control just to gaze at the reflections again.

The chicken/egg situation is similar between these technologies: they both apply to gaming, and they both, still, are only relevant to a niche of the PC gaming market.

Anyone thinking this will gain traction now or in the next two years is still deluding himself. It wasn't that way with Turing, and it hasn't changed since. Not one bit. HL: Alyx didn't become the catalyst for mass VR adoption either, did it?

Bottom line... the RT hype train should not be boarded just yet, if you ask me. Especially not when the hardware apparently forces so many sacrifices from us: financially, in heat and size, and in raw raster performance growth. I'll take a software solution any day of the week, and we know they exist. CryEngine has a neat implementation; give me that instead. It's cheap, simple, and can 100% utilize existing hardware while the visual difference is minimal at best.
A decent line of argumentation, but founded on a false equivalency: while both RTRT and VR need dedicated hardware, one needs extra dedicated hardware that has no other (relevant/large-scale) use, while the other simply requires a current upper-midrange, or likely future midrange and upwards, GPU. A GPU is a necessity for gaming; a VR headset is an extra no matter what (and you still need that GPU). While it is of course possible to both use and keep a non-RT GPU for many years (my GPU just had its 5th birthday!), in time nearly all mainstream and upwards GPUs will have RTRT support, as will both high-performance consoles. In addition to this, RTRT does not necessitate any changes to the space in which games are played, nor does it introduce radical changes to the interface of the PC/console. VR is a niche and will likely stay one because it adds cost, not because it has a cost. If you have a gaming PC, you can play games. If you want to play VR games, you need that gaming PC and a VR headset, and a room conducive to VR play, etc. All you need for RTRT gaming is that gaming PC, as long as the GPU supports RTRT. Which most will in a few years.

Of course there is a very solid argument to be made that the value of RTRT in RTX 20xx GPUs has so far been minimal, but expecting that to stay true in the future comes off as naive. Sure, UE5 does great things with software-based quasi-RT rather than hardware RTRT, but even UE4 has optional support for RTRT, so there's no doubt UE5 will do the same. There's also no knowing whether the techniques Epic has shown off for UE5 translate to RTRT's strengths, such as realistic reflections. Not to mention RTRT support in the upcoming consoles guaranteeing its proliferation in next-gen console titles.

Will most games in two years be RTRT? Of course not. Developers can't master new techniques that quickly at scale. But slowly and surely more and more games will add various RTRT lighting and reflection techniques to increase visual fidelity, which is a good thing for all of us. Is this a revolution? Not in and of itself, as rasterization will likely be the basis for the vast majority of games for at least the next decade. But rasterization and RTRT can be combined with excellent results. RTRT obviously isn't necessary or even suited for all types of games or visual styles, and comes with an inherent performance penalty (at least until GPU makers find a way of overcoming this, which might not ever happen), but it has a place just like all the other tools in the game developer's bag of tricks.
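To sketch what "combined with excellent results" looks like structurally - every call below is a placeholder for an engine-specific pass, not any real API - the hybrid approach rasterizes primary visibility and spends rays only on the effects rasterization fakes poorly:

```python
# Structural sketch of a hybrid frame. All functions are placeholders.

def render_hybrid_frame(scene, camera, rt):
    # 1. Rasterize as usual: geometry, materials, direct lighting.
    gbuffer = rasterize_gbuffer(scene, camera)
    color = shade_direct(gbuffer, scene.lights)

    # 2. Spend rays selectively, per the game's art style and budget.
    if rt.reflections:
        # Rays fix what SSR cannot: off-screen and occluded reflectors.
        color += trace_reflections(gbuffer, scene, max_bounces=1)
    if rt.global_illumination:
        color += trace_diffuse_gi(gbuffer, scene, rays_per_pixel=1)
    if rt.shadows:
        color = apply_rt_shadows(color, gbuffer, scene.lights)

    # 3. Denoise the sparse ray results, then post-process as normal.
    return postprocess(denoise(color, gbuffer))
```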
 
Joined
Mar 21, 2016
Messages
2,197 (0.74/day)
Those performance metrics are coming. I believe the NDA lifts tomorrow or Tuesday, so there isn't long to wait.
Yeah, it'll be intriguing to see just how much progress Nvidia has made with the new GPU architecture on standard baseline performance metrics, as opposed to proprietary enhancements that need to be baked in to function. I find it vastly more important to improve performance that's readily accessible all the time and doesn't require new, specialized development to be implemented and taken advantage of. Adding functionality is great, but I still want previous functionality improved, so any game I wish to replay or still play runs faster and smoother at higher or better image quality. To me that's a bigger and more important metric. RTX is great and DLSS is handy and becoming more malleable and usable, but the truth of the matter is they are in the same boat AMD's Mantle was. I think we won't truly see a real RTRT push until the development end of it becomes painless and seamless, truth be told, and that's sort of true of DLSS as well; on the AI upsampling side it needs to be more like an mClassic, but perhaps with different toggle options for how it functions at the task.

It's a first taste of the tech. Since when does a first iteration get everything right? Was the first iPhone an expensive paperweight? Was the 8086? The 3dfx Voodoo?
I think that was just a gross exaggeration of the situation. I get what you're saying though. I mean, the 3dfx chip for Nintendo is what later led to the N64, and that chip was partially contrived by a hardware hacker that Nintendo hired, who had modified a Game Boy to run some primitive 3D on it, combined with part of its team that had been working a little bit with early 3D as well. Things need to start somewhere, right!? I think what's needed with RTRT is a variable-rate-shading means and approach to rendering the real-time lighting and shading itself. Something akin to light flicker: instead of trying to render it all at 60FPS, only render the lighting and shading changes themselves at 24/25FPS and object animation at 48/50FPS, while the geometry could aim for 72/75FPS. You wouldn't have the bad input lag, and animation would be suitable enough, while the lighting would be slower yet presentable enough - it's still standard film quality, after all. The most important parts of the graphics pipeline would be rendered more quickly that way. To me that is much better than trying to simply brute-force it. I wonder if the Quake II RTX edition's game engine could actually simulate just that type of configuration; it's open source if I'm not mistaken - at least the original Quake II is, so I imagine the RTX version is as well. I'd love to see what it looks like and hear what it feels like to the player testing it.
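As a toy illustration of that decoupled-rate idea (the rates are the ones from the post; the scheduler is mine, purely to show the bookkeeping):

```python
# Tick at the highest rate; each subsystem re-renders only when its own
# interval has elapsed. Lighting 24 Hz, animation 48 Hz, geometry 72 Hz.

def simulate(duration_s=1.0, tick_hz=72):
    rates = {"geometry": 72, "animation": 48, "lighting": 24}
    acc = {name: 0.0 for name in rates}
    counts = {name: 0 for name in rates}
    dt = 1.0 / tick_hz

    t = 0.0
    while t < duration_s:
        for name, hz in rates.items():
            acc[name] += dt
            if acc[name] >= 1.0 / hz:     # this subsystem's interval is up
                acc[name] -= 1.0 / hz
                counts[name] += 1         # it re-renders on this tick
        t += dt
    return counts

print(simulate())  # roughly {'geometry': 72, 'animation': 48, 'lighting': 24}
```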

I am no tech person, but looking at how DLSS works, it appears that it requires quite a fair bit of investment from the game developers to make it work properly, and affects time to market. And because it is proprietary to Nvidia, there is little motivation for a game developer to spend much time on it, unless Nvidia is willing to put in significant effort to help them optimize it. I don't think this is sustainable. I do agree that VRS seems to make more sense, since future products from AMD and Intel should support VRS, and it is universal.
VRS is far more suitable, I agree with that. They do different things, however, but that's partly or primarily because VRS is still in its primitive stage. I think as VRS evolves it'll incorporate much of what DLSS can do as well, and become a malleable tool for adjusting performance on the fly in different ways, letting the end user scale things in the manner they see fit. I see VRS as something like a neural network combined with post-process injection and/or in-process injection at different stages of the pipeline's rendering, in order to achieve the best performance-to-image-quality compromises, along with artistic emphasis to match the end user's and/or developer's desires.
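A toy sketch of the core VRS idea, for the curious: shade coarsely where nothing is going on, finely where it matters. The contrast heuristic below is just one plausible choice, not how any particular driver decides:

```python
import numpy as np

def choose_shading_rates(luma, tile=16, low=0.01, high=0.05):
    """Pick a shading rate per tile from a [0,1] luminance image."""
    h, w = luma.shape
    rates = np.empty((h // tile, w // tile), dtype=object)
    for ty in range(h // tile):
        for tx in range(w // tile):
            block = luma[ty*tile:(ty+1)*tile, tx*tile:(tx+1)*tile]
            if block.std() < low:
                rates[ty, tx] = "4x4"   # flat area: shade 1 pixel in 16
            elif block.std() < high:
                rates[ty, tx] = "2x2"   # mild detail: shade 1 in 4
            else:
                rates[ty, tx] = "1x1"   # high detail: full rate
    return rates
```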

If you want to play VR games, you need that gaming PC and a VR headset, and a room conducive to VR play, etc. All you need for RTRT gaming is that gaming PC, as long as the GPU supports RTRT. Which most will in a few years.
RTRT will probably be fairly common in a decade, and also quite a bit improved, but that's certainly not a few years. I think it'll still be relatively rough around the edges for the next half of the decade, but hit a bit of a turning point in probably 5-6 years and start to age like a fine wine, getting better and better along the way. We'll see sweeping changes to RTRT in the next 5 years and more refined changes in the following 5, and while it'll be exciting to see the advancement of it early on, the culmination of it in about a decade will be quite amazing. I do hope, however, that it doesn't heavily stagnate rasterization in the process. There is still a lot of need for improvement in that area, both for higher-refresh-rate and higher-resolution gaming, not to mention better geometric scene detail as a whole. Improving that has a big impact on overall image quality: things become more lifelike, and even those faked shading and lighting details become better in turn and less jarring. I mean, feel free to compare a low-poly game to a high-poly game and see how significantly that impacts lighting and shading, faked or not. Everything becomes more immersive and realistic the higher the poly count, and that includes RTRT for that matter.

I think we are way past the point of taking "realistic RT" at its face value.

Also, please dig up a part of this demo that would benefit from RTRT:
The thing that honestly impresses me the most with that demo isn't so much the lighting and shading, but the overall geometry and texture quality and bump map detail. It looks so much more detailed and lifelike; in fact the bump map and texture quality remind me a fair bit of Ghost Recon Wildlands, but taken up a notch to another level, and that's exciting to see. I find VRAM and TMUs extremely important for GPUs; nothing kills graphics quality as a whole more quickly than reducing texture quality. I feel like bump mapping was probably the single biggest advancement in 3D graphics outside of the basic geometry itself reaching a point of being reasonably presentable. Take a game like Stunt Race FX or Star Fox on the SNES and imagine how it would look with high-quality textures and bump mapping at the same geometric detail: the game would still look impressive and be greatly transformed and improved from its original inception. Even with a poly count as low as that, the difference would be night and day.
 
Joined
Sep 17, 2014
Messages
20,906 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
@Vayra86 We have rasterization because of the need for real-time rendering. Movies and whatever else needs accurate rendering are already using RT; they just have to pre-render everything.
Unlike EVs (which I also feel were pushed too soon, too hard), there is demand for RTRT, even if it's not all about games. As for RTRT adoption: two years ago nobody was talking about it; today RTRT is in both Nvidia and (soon-to-be-released) AMD silicon, in Windows, and in both major consoles. That's a pretty good start if you ask me.

A decent line of argumentation, but founded on a false equivalency: while both RTRT and VR need dedicated hardware, one needs extra dedicated hardware that has no other (relevant/large-scale) use, while the other simply requires a current upper-midrange, or likely future midrange and upwards, GPU. A GPU is a necessity for gaming; a VR headset is an extra no matter what (and you still need that GPU). While it is of course possible to both use and keep a non-RT GPU for many years (my GPU just had its 5th birthday!), in time nearly all mainstream and upwards GPUs will have RTRT support, as will both high-performance consoles. In addition to this, RTRT does not necessitate any changes to the space in which games are played, nor does it introduce radical changes to the interface of the PC/console. VR is a niche and will likely stay one because it adds cost, not because it has a cost. If you have a gaming PC, you can play games. If you want to play VR games, you need that gaming PC and a VR headset, and a room conducive to VR play, etc. All you need for RTRT gaming is that gaming PC, as long as the GPU supports RTRT. Which most will in a few years.

Of course there is a very solid argument to be made that the value of RTRT in RTX 20xx GPUs has so far been minimal, but expecting that to stay true in the future comes off as naive. Sure, UE5 does great things with software-based quasi-RT rather than hardware RTRT, but even UE4 has optional support for RTRT, so there's no doubt UE5 will do the same. There's also no knowing whether the techniques Epic has shown off for UE5 translate to RTRT's strengths, such as realistic reflections. Not to mention RTRT support in the upcoming consoles guaranteeing its proliferation in next-gen console titles.

Will most games in two years be RTRT? Of course not. Developers can't master new techniques that quickly at scale. But slowly and surely more and more games will add various RTRT lighting and reflection techniques to increase visual fidelity, which is a good thing for all of us. Is this a revolution? Not in and of itself, as rasterization will likely be the basis for the vast majority of games for at least the next decade. But rasterization and RTRT can be combined with excellent results. RTRT obviously isn't necessary or even suited for all types of games or visual styles, and comes with an inherent performance penalty (at least until GPU makers find a way of overcoming this, which might not ever happen), but it has a place just like all the other tools in the game developer's bag of tricks.

I think we're seeing the same distinctions and limitations, really. I'm just applying the current state of the technology to what is in the upcoming GPU gens in terms of hardware, availability, cost and sacrifices made in the product itself. We're not ready yet to jump into this and it shows - in TDP, in cost, in die sizes, and in overall API development and adoption. It all felt and still feels like a rushed process that had to be connected to whatever was going to come out in the near future.

This was the case when Jensen sold his 10 gigarays for the PC environment, and it seems to be the case for the next console gen in that environment. One might say that actual adoption is only just going to start now, and that Turing was a proof of concept at best. That also applies to the games we've seen with RT. Very controlled environments: linearity or a relatively small map, or a big one (Exodus) where all you get is a GI pass... Stuff you can optimize and tweak almost per scene and sequence. So far... I'm just not convinced. Maybe it is good for a first gen... but the problem is, we've been paying the premium ever since with little to show for it. Come Ampere, you have no option to avoid it anymore, even if there is no content at all that you want to play.

@Valantar great point about the difference with VR hardware adoption, because yes, this is absolutely true. But don't underestimate people's capability to ride it out when they see no reason to purchase something new. GPUs can last a looong time. Again, a car analogy fits here: look at what ancient crap you still see on the road, and for many of the same reasons - mostly emotional, and financial, and some functional: we like that engine sound, the cancerous smell, etc. :)

You only need to look at the sentiment on this forum. I don't think people are really 'anti-RT'. They just don't want to pay more for it, and if they do, it had better be worth it.
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
RTRT will probably be fairly common in a decade, and also quite a bit improved, but that's certainly not a few years. I think it'll still be relatively rough around the edges for the next half of the decade, but hit a bit of a turning point in probably 5-6 years and start to age like a fine wine, getting better and better along the way. We'll see sweeping changes to RTRT in the next 5 years and more refined changes in the following 5, and while it'll be exciting to see the advancement of it early on, the culmination of it in about a decade will be quite amazing. I do hope, however, that it doesn't heavily stagnate rasterization in the process. There is still a lot of need for improvement in that area, both for higher-refresh-rate and higher-resolution gaming, not to mention better geometric scene detail as a whole. Improving that has a big impact on overall image quality: things become more lifelike, and even those faked shading and lighting details become better in turn and less jarring. I mean, feel free to compare a low-poly game to a high-poly game and see how significantly that impacts lighting and shading, faked or not. Everything becomes more immersive and realistic the higher the poly count, and that includes RTRT for that matter.
I think you are vastly underestimating the effect that adding a featureset to a console generation has on developers making use of said featureset. Sure, it takes a couple of years as they all have to learn to use the new tools, and things made a couple of years later than that will again be vastly improved, but that doesn't change the fact that time and time again, consoles adopting new graphics features has directly led to the proliferation of those features among both high-profile and smaller games in the following years. It definitely does not take a decade. The first high-profile first-party games typically arrive around launch, with third-party studios trickling out titles at an ever-increasing pace over the next couple of years. In two years I sincerely doubt there will be a single AAA game launched without some form of RTRT implemented. It might be limited, but it will be there.

I think we're seeing the same distinctions and limitations, really. I'm just applying the current state of the technology to what is in the upcoming GPU gens in terms of hardware, availability, cost and sacrifices made in the product itself. We're not ready yet to jump into this and it shows - in TDP, in cost, in die sizes, and in overall API development and adoption. It all felt and still feels like a rushed process that had to be connected to whatever was going to come out in the near future.

This was the case when Jensen sold his 10 gigarays for the PC environment, and it seems to be the case for the next console gen in that environment. One might say that actual adoption is only just going to start now, and that Turing was a proof of concept at best. That also applies to the games we've seen with RT. Very controlled environments: linearity or a relatively small map, or a big one (Exodus) where all you get is a GI pass... Stuff you can optimize and tweak almost per scene and sequence. So far... I'm just not convinced. Maybe it is good for a first gen... but the problem is, we've been paying the premium ever since with little to show for it. Come Ampere, you have no option to avoid it anymore, even if there is no content at all that you want to play.

@Valantar great point about the difference with VR hardware adoption, because yes, this is absolutely true. But don't underestimate people's capability to ride it out when they see no reason to purchase something new. GPUs can last a looong time. Again, a car analogy fits here: look at what ancient crap you still see on the road, and for many of the same reasons - mostly emotional, and financial, and some functional: we like that engine sound, the cancerous smell, etc. :)

You only need to look at the sentiment on this forum. I don't think people are really 'anti-RT'. They just don't want to pay more for it, and if they do, it had better be worth it.
I entirely agree on not wanting to pay more for it, though I don't see RT as related to the recent price hikes in GPUs either - they were already rising steeply before RTRT was a thing, and this is just a continuation of a trend. Sure, Nvidia is making ridiculously oversized dice, but that's mainly a process node thing - their huge current chips are likely cheaper than their new 8nm chips at a similar CUDA core count. I think it's more about lack of full-range competition (with said competition desperately needing whatever margins they could get until recently), coupled with a console generation with very modest specs even at launch. The PS4 and X1 launched with lower-midrange PC GPU equivalents, while the XSX will likely be near 2080 levels, with the PS5 somewhere around a 2060S or 2070. AMD will also (hopefully!) be back in the high-end GPU game soon, so I am really hoping this will start to bring prices back down to sensible levels - though the leaked PCB shots for the 3090 certainly don't point towards a cheap GPU. Still, around $500 for a ~2070-2080-level console is bound to make PC GPU makers drop their prices at least somewhat.

I do agree that RTX seemed rushed, but then again that's the chicken and egg problem of a high-profile GPU feature - you won't get developer support until hardware exists, and people won't buy hardware until it can be used, so ... early adopters are always getting the short end of the stick. I don't see any way to change that, sadly, but as I said above I expect RTRT proliferation to truly take off within two years of the launch of the upcoming consoles. Given that the XSX uses DXR, code for that can pretty much be directly used in the PC port, dramatically simplifying cross-platform development.

In the beginning I was a serious RTX skeptic, and I still think the value proposition of those features has been downright terrible for early RTX 20xx buyers, but I've held off on buying a new GPU for a while now mainly because I want RTRT in an AMD GPU - as you say, GPUs can be kept for a long time, and with my current one serving me well for five years, I don't want to be left in the lurch in a couple of years by buying something lacking a major feature. A 5700 or 5700 XT would have been a massive upgrade over my Fury X, but if I'd bought one of those last year or earlier this year, I know I would have regretted doing so within a couple of years. I'd rather once again splurge on a high-end card and have it last me 3-5 years, by which time I think I will have gotten more than my money's worth in RT-enabled games.
 
Joined
Feb 1, 2013
Messages
1,248 (0.30/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
"It is safe to upgrade now [my Pascal friends]."

:D
 
Joined
May 15, 2020
Messages
697 (0.49/day)
Location
France
System Name Home
Processor Ryzen 3600X
Motherboard MSI Tomahawk 450 MAX
Cooling Noctua NH-U14S
Memory 16GB Crucial Ballistix 3600 MHz DDR4 CAS 16
Video Card(s) MSI RX 5700XT EVOKE OC
Storage Samsung 970 PRO 512 GB
Display(s) ASUS VA326HR + MSI Optix G24C4
Case MSI - MAG Forge 100M
Power Supply Aerocool Lux RGB M 650W
I'll wait for benchmarks and AMD, though.

Those TFLOPS numbers are multiplied by 2 because of the 2 operations per shader, but I remember that when the GeForce2 GTS launched there was a similar 2 ops per pixel claim, and real performance was way lower than theoretical performance.

But that was one good presentation; Jensen and Nvidia's marketing are top-notch. And the prices are decent this time.
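For reference, the arithmetic behind the headline number, using the announced RTX 3090 specs (a fused multiply-add counts as 2 FLOPs, which is where the doubling comes from):

```python
shaders = 10496          # RTX 3090 CUDA cores (Ampere counts both FP32 paths)
boost_clock_ghz = 1.695  # advertised boost clock
flops_per_clock = 2      # one FMA per shader per clock = 2 FLOPs

tflops = shaders * flops_per_clock * boost_clock_ghz / 1000
print(f"{tflops:.1f} TFLOPS")  # ~35.6 TFLOPS on paper; games scale far less
```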
 
Joined
Sep 17, 2014
Messages
20,906 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
Joined
Mar 21, 2016
Messages
2,197 (0.74/day)
I think you are vastly underestimating the effect that adding a featureset to a console generation has on developers making use of said featureset. Sure, it takes a couple of years as they all have to learn to use the new tools, and things made a couple of years later than that will again be vastly improved, but that doesn't change the fact that time and time again, consoles adopting new graphics features has directly led to the proliferation of those features among both high-profile and smaller games in the following years. It definitely does not take a decade.
I know that RTRT will come and trickle in, but I still don't think it'll be convincing enough for the better part of a decade, because the hardware has to mature further. I mean, just based on what I've seen in Quake RTX videos, and in Blender with Cycles rendering and path tracing, I can tell you right now it has a long way to go before it's presentable to the end user. Sure, I don't have the most cutting-edge GPU and CPU at my disposal, and that doesn't help, but just the same it's going to take time before it really is adopted at a reasonable price for the end user. Just look at 4K: we still aren't really there quite yet in reality, and things are only getting more difficult. Think about it: how long do you think it'll be before the not-yet-released Cyberpunk 2077 can be rendered at 4K with the highest graphics quality settings!? Just look at Crysis as an example case: how far off do you think we are from 60FPS at 4K for that particular game at maximum image quality, if you were to estimate just based on screenshots of the game?

To go further, base it upon the earlier trailers of the game, before the geometric detail was somewhat castrated in areas. I want the full-polygon 4K max detail experience with RTRT; just give me a target year to wait for. I'm thinking a decade is in the ballpark, short of some huge advancements. That said, some of those things like VRS could skew and compromise things a touch and alter the situation, and that isn't what I'm asking; I don't want image quality compromises. Though I do definitely believe those kinds of advancements will make RTRT more widely and easily available at a pretty progressive pace, especially in these earlier generations where the tech is more in its infancy, where relatively small improvements and advancements can lead to rather huge and staggering perceived improvements. DLSS, for example, is definitely becoming more and more intricate and usable, and I do strongly believe that combined with VRS it'll become harder and harder to discern some of the differences while improving performance, by interjecting post-process upscaling in a VRS manner.

I entirely agree on not wanting to pay more for it, though I don't see RT as related to the recent price hikes in GPUs either - they were already rising steeply before RTRT was a thing, and this is just a continuation of a trend.
That's the crux of the problem: a continuation of an already disheartening situation compounds the issue further. That's why in some aspects AMD's RDNA is a slight reprieve, but definitely not perfect. We all know that Nvidia in general has more resources to deliver a more reliable, seamless, and overall less problematic and frustrating experience than AMD is financially capable of at present. AMD has often got solid enough hardware, but not enough dedicated resources to provide as pain-free an experience, so to speak; they lack that extra layer of polish. That is in big part due to the lack of resources and budget, combined with the debt that they took on and that got mismanaged by its previous CEO, but Lisa Su has really been correcting a lot of those things over time. I believe we'll see stronger advancement from AMD going forward under her leadership in both the CPU and GPU divisions, though I feel the GPU won't fully get the push it needs until AMD's lead on the CPU diminishes. I see it as more of a fallback pathway if they can't maintain CPU dominance, but if they can, they'll just as well continue on that pathway while making incremental progress with GPUs. I feel that applies in both directions, because that's how they are diversified at present.
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I know that RTRT will come and trickle in, but I still don't think it'll be convincing enough for the better part of a decade, because the hardware has to mature further. I mean, just based on what I've seen in Quake RTX videos, and in Blender with Cycles rendering and path tracing, I can tell you right now it has a long way to go before it's presentable to the end user. Sure, I don't have the most cutting-edge GPU and CPU at my disposal, and that doesn't help, but just the same it's going to take time before it really is adopted at a reasonable price for the end user. Just look at 4K: we still aren't really there quite yet in reality, and things are only getting more difficult. Think about it: how long do you think it'll be before the not-yet-released Cyberpunk 2077 can be rendered at 4K with the highest graphics quality settings!? Just look at Crysis as an example case: how far off do you think we are from 60FPS at 4K for that particular game at maximum image quality, if you were to estimate just based on screenshots of the game?

To go further, base it upon the earlier trailers of the game, before the geometric detail was somewhat castrated in areas. I want the full-polygon 4K max detail experience with RTRT; just give me a target year to wait for. I'm thinking a decade is in the ballpark, short of some huge advancements. That said, some of those things like VRS could skew and compromise things a touch and alter the situation, and that isn't what I'm asking; I don't want image quality compromises. Though I do definitely believe those kinds of advancements will make RTRT more widely and easily available at a pretty progressive pace, especially in these earlier generations where the tech is more in its infancy, where relatively small improvements and advancements can lead to rather huge and staggering perceived improvements. DLSS, for example, is definitely becoming more and more intricate and usable, and I do strongly believe that combined with VRS it'll become harder and harder to discern some of the differences while improving performance, by interjecting post-process upscaling in a VRS manner.

That's the crux of the problem: a continuation of an already disheartening situation compounds the issue further. That's why in some aspects AMD's RDNA is a slight reprieve, but definitely not perfect. We all know that Nvidia in general has more resources to deliver a more reliable, seamless, and overall less problematic and frustrating experience than AMD is financially capable of at present. AMD has often got solid enough hardware, but not enough dedicated resources to provide as pain-free an experience, so to speak; they lack that extra layer of polish. That is in big part due to the lack of resources and budget, combined with the debt that they took on and that got mismanaged by its previous CEO, but Lisa Su has really been correcting a lot of those things over time. I believe we'll see stronger advancement from AMD going forward under her leadership in both the CPU and GPU divisions, though I feel the GPU won't fully get the push it needs until AMD's lead on the CPU diminishes. I see it as more of a fallback pathway if they can't maintain CPU dominance, but if they can, they'll just as well continue on that pathway while making incremental progress with GPUs. I feel that applies in both directions, because that's how they are diversified at present.
Well, Nvidia just kicked the "paying extra for RTX" thing onto the trash heap. The new GPUs look like great value, disregarding the ridiculous 3090. The $499 3070 supposedly beating the $1199 (yeah, sure, $999 on paper) 2080 Ti just serves to demonstrate how ridiculously priced that GPU was.

As for the rest, you're mixing up fully path traced games (Quake RTX) with other titles with various RT implementations in lighting, reflections, etc. Keep your arguments straight if you want them taken seriously - you can't complain about the visual fidelity of showcase fully path-traced games and use them as an example of mixed-RT-and-rasterization games not looking good. That makes no sense. I mean, sure, there will always be a performance penalty to RTRT, at least until hardware makers scale up RTRT hardware to "match". But does that include sacrificing visual fidelity? That depends on the implementation. As for you insisting on no-holds-barred, full quality renders all the time ... well, that's what the adage "work smart, not hard" is for. If there is no perceptible difference in fidelity, what is the harm in using smarter rendering techniques to increase performance? Is there some sort of rule saying that straightforward, naïve rendering techniques are better somehow? It sounds like you need to start trusting your senses rather than creating placebo effects for yourself: if you can't tell the difference in a blind comparison test, the most performant solution is obviously better. Do you refuse to use tessellation too? Do you demand that developers render all lighting and shadows on geometry in real time rather than pre-baking it? You have clearly drawn an arbitrary line which you have defined as "real rendering", and are taking a stand for it for some reason that is frankly beyond me. Relax and enjoy your games, please.

As for a "target year" for full quality 4k RTRT: likely never. Why? Because games are a moving target. New hardware means game developers will push their games further. There is no such thing as a status quo in performance requirements.
 
Joined
Mar 21, 2016
Messages
2,197 (0.74/day)
I just think the situation I gave might not be plausible even a full decade from now. That to me proves how vital rasterization is in relation to RTRT. I do feel RTRT has its place, but prematurely pushing its limits might not be ideal for the end user. It's entirely possible we could see a Stunt Race FX type of early-adoption fidelity on the RTRT side of things, but without the input lag downsides, which I think is a pretty ideal way to tackle it: infusing some VRS with RTRT.
 
Joined
Nov 4, 2019
Messages
234 (0.14/day)
Imagine being stupid enough to think not having RT is a plus point and having it means the game developer has to cripple the visuals to make RT look good.
TPU's comment section feels like a joke sometimes.
We all know RT is only used for part of the rendering, but you're making a joke out of yourself by acting like RT is a bad thing and developers should stick to rasterization forever because it somehow gives better results in your ignorant mind.

I said nothing you think I said. Way to go. And Control did have crippled non-RT visuals. Compare with the SSR in Gears 5, for example. Control had nothing with RT off. The point is: the fact that RT is in the game doesn't mean the visuals would look meaningfully different with RT off. But your performance would double. And that hasn't changed with the RTX 3090 either. RT has been used for a long time for many rasterization features (check out that PS5 Unreal demo - not RT, but it uses rays).

The only real RT I've seen is Quake and the marble demo. Look at all the videos released yesterday:

If I didn't tell the average person, would anyone even know RT was on? Does Watch Dogs look better? No. Does the Cyberpunk video impress more than the earlier Cyberpunk videos? No. Even when you are watching RT videos, the quality is coming from the rasterization, while RT is doing almost nothing but tanking your framerate. Some people are gullible.

Did Horizon 2 look better or worse than everything we saw yesterday? Exactly.

People forget Battlefield 5 had the most advanced implementation of RT. And everyone, including me, turned it off. Speedy optimization and art > reality and accuracy in gaming.

I'll wait for benchmarks and AMD, though.

Those TFLOPS numbers are multiplied by 2 because of the 2 operations per shader, but I remember that when the GeForce2 GTS launched there was a similar 2 ops per pixel claim, and real performance was way lower than theoretical performance.

But that was one good presentation; Jensen and Nvidia's marketing are top-notch. And the prices are decent this time.

It was kind of funny that a lot of the partners had marketing material ready to go, and some had even published it, with half the CUDA core numbers - so yeah, wait for reviews. The positive thing for me with the RTX 3000 release is that they doubled down on rasterization performance instead of putting everything into RT. It's a bit of a return to form.
 
Joined
May 2, 2017
Messages
7,762 (3.05/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
I said nothing you think I said. Way to go. You are really out of it. The point is: the fact that RT is in the game doesn't mean the visuals would look meaningfully different with RT off. But your performance would double. And that hasn't changed with the RTX 3090 either. RT has been used for a long time for many rasterization features (check out that PS5 Unreal demo - not RT, but it uses rays).

The only real RT I've seen is Quake and the marble demo. Look at all the videos released yesterday:

If I didn't tell the average person, would anyone even know RT was on? Does Watch Dogs look better? No. Does the Cyberpunk video impress more than the earlier Cyberpunk videos? No. Even when you are watching RT videos, the quality is coming from the rasterization, while RT is doing almost nothing but tanking your framerate. Some people are gullible.

Did Horizon 2 look better or worse than everything we saw yesterday? Exactly.
That entirely depends on what you're looking for. Rasterization can't do lighting or reflections like RTRT can. GI can be faked to a certain extent, but it's (at least up until now) extremely expensive. Screen-space reflections are great, until you're looking at something reflective that ought to be reflecting off-screen objects or scenery, yet suddenly isn't. Did you notice how there is nothing reflective in the UE5 demo? That isn't an accident, and I would imagine one of the first add-ons (or just most-used optional features) for UE5 will be RTRT reflections, as that would serve to complete the otherwise excellent-looking package. You don't necessarily notice these things in play, but you obviously might. Control is an excellent example of the setting feeling more realistic when RTRT is added. That obviously doesn't mean that there aren't/won't be bad RTRT implementations that don't add anything of real value - there definitely will be! - but the technology nonetheless has the potential to surpass anything possible with rasterization in certain aspects. Full-scene RTRT is unnecessary and likely of little real benefit, but selective use of the tech can provide good IQ improvements, even relative to the performance cost, when used in the right way in the right games. On the other hand, some games don't benefit from this whatsoever, and should really skip it. As is the case with all kinds of graphical features.
 