
Intel and Wargaming Join Forces to Deliver Ray Tracing to World of Tanks

64K

Joined
Mar 13, 2014
Messages
6,104 (1.66/day)
Processor i7 7700k
Motherboard MSI Z270 SLI Plus
Cooling CM Hyper 212 EVO
Memory 2 x 8 GB Corsair Vengeance
Video Card(s) MSI RTX 2070 Super
Storage Samsung 850 EVO 250 GB and WD Black 4TB
Display(s) Dell 27 inch 1440p 144 Hz
Case Corsair Obsidian 750D Airflow Edition
Audio Device(s) Onboard
Power Supply EVGA SuperNova 850 W Gold
Mouse Logitech G502
Keyboard Logitech G105
Software Windows 10
Well, RT for now (IMHO, mind you) is unimpressive and minor, and a huge chunk of an RTX card is dedicated to running it while it could be used for something else. That's one hell of a drawback.

Comparing it to cell phones or SSDs is a bit optimistic. At best it's comparable to the cartridge fountain pen replacing the quill in terms of advancement and usefulness, and unfortunately the price to pay is more akin to the former than the latter.

My point is that it's expensive because the hardware is new just like with past new hardware. You've been around long enough to see that many times over.

We don't go straight to this overnight:

 
Joined
May 9, 2012
Messages
8,398 (1.93/day)
Location
Ovronnaz, Wallis, Switzerland
System Name main/SFFHTPCARGH!(tm)/Xiaomi Mi TV Stick/Samsung Galaxy S23/Ally
Processor Ryzen 7 5800X3D/i7-3770/S905X/Snapdragon 8 Gen 2/Ryzen Z1 Extreme
Motherboard MSI MAG B550 Tomahawk/HP SFF Q77 Express/uh?/uh?/Asus
Cooling Enermax ETS-T50 Axe aRGB /basic HP HSF /errr.../oh! liqui..wait, no:sizable vapor chamber/a nice one
Memory 64gb Corsair Vengeance Pro 3600mhz DDR4/8gb DDR3 1600/2gb LPDDR3/8gb LPDDR5x 4200/16gb LPDDR5
Video Card(s) Hellhound Spectral White RX 7900 XTX 24gb/GT 730/Mali 450MP5/Adreno 740/RDNA3 768 core
Storage 250gb870EVO/500gb860EVO/2tbSandisk/NVMe2tb+1tb/4tbextreme V2/1TB Arion/500gb/8gb/256gb/2tb SN770M
Display(s) X58222 32" 2880x1620/32"FHDTV/273E3LHSB 27" 1920x1080/6.67"/AMOLED 2X panel FHD+120hz/FHD 120hz
Case Cougar Panzer Max/Elite 8300 SFF/None/back/back-front Gorilla Glass Victus 2+ UAG Monarch Carbon
Audio Device(s) Logi Z333/SB Audigy RX/HDMI/HDMI/Dolby Atmos/KZ x HBB PR2/Edifier STAX Spirit S3 & SamsungxAKG beans
Power Supply Chieftec Proton BDF-1000C /HP 240w/12v 1.5A/4Smart Voltplug PD 30W/Asus USB-C 65W
Mouse Speedlink Sovos Vertical-Asus ROG Spatha-Logi Ergo M575/Xiaomi XMRM-006/touch/touch
Keyboard Endorfy Thock 75% <3/none/touch/virtual
VR HMD Medion Erazer
Software Win10 64/Win8.1 64/Android TV 8.1/Android 13/Win11 64
Benchmark Scores bench...mark? i do leave mark on bench sometime, to remember which one is the most comfortable. :o
My point is that it's expensive because the hardware is new just like with past new hardware. You've been around long enough to see that many times over.

We don't go straight to this overnight:

I fully know that, and in the movie industry it has its uses. As you said, for games it's not there yet :laugh: (and what was the hardware cost behind that demo?)

To me it's like PhysX, which was supposed to be a huge hit. It might become one or it might not; time will tell.

Although one thing I find interesting in this news is that the RT is CPU-driven.
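Since the RT here is CPU-driven, it may help to see what that actually means: the intersection math runs on ordinary general-purpose cores. Below is a toy sketch of the single most basic operation in any ray tracer, a ray/sphere intersection test; it is a generic illustration, not Wargaming's or Intel's actual code.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance to the nearest ray/sphere intersection, or None on a miss.
    Solves |o + t*d - c|^2 = r^2 for t, with d assumed normalized."""
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * x for d, x in zip(direction, oc))
    c = sum(x * x for x in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None  # hits behind the origin don't count

# A ray fired down the z-axis hits a unit sphere centered 5 units away at t = 4.
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

A real tracer fires millions of these tests per frame against millions of primitives, which is exactly why the hardware question matters.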
 
Joined
Feb 11, 2009
Messages
5,393 (0.97/day)
System Name Cyberline
Processor Intel Core i7 2600k -> 12600k
Motherboard Asus P8P67 LE Rev 3.0 -> Gigabyte Z690 Auros Elite DDR4
Cooling Tuniq Tower 120 -> Custom Watercoolingloop
Memory Corsair (4x2) 8gb 1600mhz -> Crucial (8x2) 16gb 3600mhz
Video Card(s) AMD RX480 -> ... nope still the same :'(
Storage Samsung 750 Evo 250gb SSD + WD 1tb x 2 + WD 2tb -> 2tb MVMe SSD
Display(s) Philips 32inch LPF5605H (television) -> Dell S3220DGF
Case antec 600 -> Thermaltake Tenor HTCP case
Audio Device(s) Focusrite 2i4 (USB)
Power Supply Seasonic 620watt 80+ Platinum
Mouse Elecom EX-G
Keyboard Rapoo V700
Software Windows 10 Pro 64bit
It's embarrassing seeing people who normally embrace tech advancement fighting against RTRT so hard.
Of course hardware that improves handling RTRT is too expensive. It's supposed to be. Remember when cell phones were new or SSDs?
RTRT kills the performance of my GPU. It's supposed to. It's for the future. It's not for past GPUs.
Nvidia is taking control of RTRT with RTX GPUs. They may try but they will fail. AMD already has a hardware solution for handling RTRT better with the upcoming PS5. Intel has a solution as well.
Developers are slow to embrace RTRT and some implementations are sloppy. Time takes time.
To me it looks like a good thing for gamers eventually but there are of course some rough spots right now.

People don't really fight RT; people fight the high prices because of it.
If Nvidia had released a line of GTX 2000 cards without the RT hardware at the normal, non-insane prices of old, it would be fine.
But they don't, so now you're stuck with inflated prices for hardware you don't even want but have to pay for. That is really the problem here.

Yes, we remember when cell phones and SSDs were new, and did we buy them? Nope. We said they were too expensive and, just like here, argued that we didn't need them, which at the time was correct, since everything in our world was built around not having those gadgets yet. Same now with RTX, where we can say it's just not worth it.
Being an early adopter is pretty much always a poor choice.
By the time RT is worth it and games are built from the ground up around it, prices will have come down a lot and quality will have gone up, but we all know this.

If RTRT is killing performance, then why is it implemented now, when the hardware itself isn't even ready for it, and yet you're paying through the nose for it?

Ultimately we all agree, but you should see why people don't care about RT, to the point of hating the attention it gets, and would rather have had cheaper non-RT GPUs.
That, and the going-it-alone approach Nvidia favors, is not the way forward; they should work with other companies to establish a standard to follow and improve on.
 
Joined
Sep 17, 2014
Messages
20,891 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
My point is that it's expensive because the hardware is new just like with past new hardware. You've been around long enough to see that many times over.

We don't go straight to this overnight:


Yes, we do; we've done this a million times already. Now we try to brute-force it in real time and sell it like it's new.

This is not comparable to other, past advances in graphics. It's a departure from what was an iterative, constant process of increasing efficiency and raw performance. RT adds a layer of gross inefficiency on top, conveniently timed with mainstream resolutions becoming very easy to render on rather cheap GPUs.

You said it right, though! It's supposed to be expensive, and RT is the means to that goal, indeed.

@ZoneDymo said it right. We don't fight RT; we're struggling to see the added value versus the added cost.

Time will tell if the tech ever gets wings. So far it hasn't, except in marketing and announcements. There isn't a single truly jaw-dropping live product out there today, and that's not for lack of attention: RT news is near clickbait-grade material.
 

64K

People don't really fight RT; people fight the high prices because of it.
If Nvidia had released a line of GTX 2000 cards without the RT hardware at the normal, non-insane prices of old, it would be fine.
But they don't, so now you're stuck with inflated prices for hardware you don't even want but have to pay for. That is really the problem here.

Yes, we remember when cell phones and SSDs were new, and did we buy them? Nope. We said they were too expensive and, just like here, argued that we didn't need them, which at the time was correct, since everything in our world was built around not having those gadgets yet. Same now with RTX, where we can say it's just not worth it.
Being an early adopter is pretty much always a poor choice.
By the time RT is worth it and games are built from the ground up around it, prices will have come down a lot and quality will have gone up, but we all know this.

If RTRT is killing performance, then why is it implemented now, when the hardware itself isn't even ready for it, and yet you're paying through the nose for it?

Ultimately we all agree, but you should see why people don't care about RT, to the point of hating the attention it gets, and would rather have had cheaper non-RT GPUs.
That, and the going-it-alone approach Nvidia favors, is not the way forward; they should work with other companies to establish a standard to follow and improve on.

We have to start somewhere. It's like this: if there were no hardware that supports RTRT, why would developers learn how to use it and implement it? The hardware had to come first, and it was going to be expensive whether it was released now or 20 years from now. New hardware is always expensive, partly due to the costly R&D that went into developing it, and the early adopters are paying for that R&D just like always.

Prices will continue to come down, just like SSD prices came down. What is necessary is competition, and Nvidia will now be getting that from AMD and Intel as well. When I saw the article about Sony saying there would be a hardware solution for RTRT in the PS5, that was the end of my doubt about whether RTRT was here to stay. With Nvidia, AMD, Intel and the console makers all involved in RTRT, there's nothing to stop it now.

What concerns me is that people are judging RTRT as if what we're seeing right now is all we will get. This is only the tip of the iceberg.
 
We have to start somewhere. It's like this: if there were no hardware that supports RTRT, why would developers learn how to use it and implement it? The hardware had to come first, and it was going to be expensive whether it was released now or 20 years from now. New hardware is always expensive, partly due to the costly R&D that went into developing it, and the early adopters are paying for that R&D just like always.

Prices will continue to come down, just like SSD prices came down. What is necessary is competition, and Nvidia will now be getting that from AMD and Intel as well. When I saw the article about Sony saying there would be a hardware solution for RTRT in the PS5, that was the end of my doubt about whether RTRT was here to stay. With Nvidia, AMD, Intel and the console makers all involved in RTRT, there's nothing to stop it now.

What concerns me is that people are judging RTRT as if what we're seeing right now is all we will get. This is only the tip of the iceberg.

You can look at VR for a very recent confirmation that a console rollout does not carry a technology to popularity on the market. PSVR ran quite well, but is VR here to stay now? Nope...

Cautious optimism, that's about the best way to approach this.
 

64K

You can look at VR for a very recent confirmation that a console rollout does not carry a technology to popularity on the market. PSVR ran quite well, but is VR here to stay now? Nope...

That's a poor comparison. If the console came with a mandatory VR headset and the customer had to pay for it whether they wanted it or not, then it would be more like what Sony is doing. Sony will be putting silicon in the PS5 that supports RTRT, and the customer will pay for it whether they want RTRT or not. To me that means Sony believes RTRT is here to stay and that there are benefits to the tech.
 
Joined
Mar 24, 2012
Messages
528 (0.12/day)
People dont really fight RT, people fight the high prices because of it.
If Nvidia released a line of GTX2000 cards that did not have the RT stuff in it for the normal non insane prices of old, then it would be fine.
But they dont, so now you are stuck with these inflated prices for hardware you dont even want but have to pay for, that really is the problem here.

Nah, even if Nvidia released a 2080 Ti without RT for $700, people would still find something to complain about.

If RTRT is killing performance....then why is it implemented now? when the hardware itself is not even ready for it? and yet you are paying through the nose for it...

Why now? Because if we keep waiting for the hardware to catch up, we'll never be able to implement it. Even if GPU makers pushed RT five years from now, we'd still say the same thing, because by then the complaint would be "it's too early to implement RT because we can't get 60 FPS at 8K (or even 16K) even with the fastest GPU."

Ultimately we all agree though, but yeah, you should see why people dont care about RT to the point of hating on the attention it gets and rather had cheaper non RT gpu's
That and this going at it themselves that Nvidia likes to do, is not the way to go, they should work together with other companies and establish a standard to follow and improve on.

That standard already exists. It's called Microsoft DXR.
 
Joined
Oct 31, 2013
Messages
186 (0.05/day)
One of those games uses RT for full global illumination; the other uses RT for limited shadow effects. The fact that the performance hit is similar should give you a very good idea of how much better dedicated hardware is. The Crytek demo ended up showing the same thing when they noted how much better RTX cards would perform if the dedicated hardware were used.

This SIGGRAPH presentation showed some of the numbers on how much dedicated hardware boosts RT performance; it's substantial.

View attachment 134364

The dedicated hardware itself takes up a relatively tiny amount of die space: the Tensor cores and RT cores together account for only ~9% of the die on current RTX cards, and of that 9%, the RT cores are only about a third. So, assuming no other bottlenecks, you're getting 2x-3x the performance for a ~3% die-space cost.

A lot of people think RT is too expensive even WITH the performance boost from dedicated hardware. Seems quite clear that going forward dedicated RT hardware is indeed a "must have" for real time RT, if it's ever going to be taken seriously at least.
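For intuition on where that 2x-3x comes from: almost all ray-tracing time is spent walking a bounding volume hierarchy (BVH), which replaces a test against every triangle with a tree descent, and that traversal is what the RT cores accelerate. A back-of-envelope sketch (the numbers are illustrative scaling only, not figures from the presentation):

```python
import math

# Brute force: one ray/triangle intersection test per triangle in the scene.
triangles = 1_000_000
brute_force_tests = triangles

# BVH: roughly log2(N) bounding-box tests down the tree, plus a handful of
# triangle tests at the leaf. Real traversals visit more nodes than this,
# but the logarithmic scaling is the point.
bvh_tests = math.ceil(math.log2(triangles)) + 4

print(brute_force_tests, bvh_tests)  # → 1000000 24
```

It is this per-ray tree walk, repeated millions of times per frame, that fixed-function units chew through far faster than shader code can.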

I think there is some kind of mismatch between the power needed for RTX and what was expected. When you run games with RTX enabled, the power draw goes down: the RT cores are in use, but the rest of the chip can't be fully utilized. The RT cores should sit within the shader pipeline. With the power draw going down, the 3% of die space for them might not be enough to keep the rest of the shader pipeline fully occupied.

I'm thinking a little further into the future here. We need a unified shader model where the shaders can also run the ray-tracing calculations efficiently (Vulkan 2 or DirectX 13, maybe?), so every game developer can manage the ray tracing however they like. This brings me to Intel's Larrabee, which was about as general-purpose as a GPU gets. To simplify, it was just a bunch of x86 cores packed together, and there were some interesting ray-tracing projects on it too.

Quake Wars: Ray Traced

Current ray tracing could still be used for adventure games; we only need 30 FPS for those, and a fully ray-traced adventure game built on this technology would look great :)

The current ray-tracing attempts are difficult: companies try to pack it into rasterized engines (WoT and the RTX promo games). There are 3D model representations better suited to it, each with its own advantages and disadvantages. Building a ray-traced game from the ground up, with the model and texture structures ray tracing prefers, might take some out-of-the-box thinking. And if you want to support rasterization at that stage, you probably have to build two different games.

I hope Nvidia drops its proprietary RTX tech and goes with DXR and Vulkan ray tracing, the same way AMD dropped Mantle for DX12 and Vulkan. That way we might all benefit from better, fully ray-traced games :)
My background thinking here is the hardware PhysX games, which could only add physics simulation on top but couldn't build a game around it. A game developer would never cripple their sales by tying the game to one company's hardware.
 
Joined
Mar 26, 2009
Messages
175 (0.03/day)
The power of RT can't even deliver 60 fps in 2019 on a $500+ card; nice moral victory you have there.
All the percentages are nice in theory, but WoT is a PvP game that can get rather competitive, so no one will sacrifice even 10% fps, let alone 55%.
As for limited effects, all implementations of ray tracing are very minimal given the horrible performance of even hardware-based RT,
to the point where it isn't hard to see noise and artifacts in the lighting. Practically, the RT cores are just wasted die space for this generation of GPUs.
So here is a reality pill for you.
Don't like that Metro Exodus uses RT GI, the heaviest form of RT there is? The kind that was thought impossible to run on a single GPU?

Take this smack to the face, then: scene-wide RT shadows in Shadow of the Tomb Raider, running at 60 fps, 1440p, max settings on a single RTX 2070. This is the power of hardware RT! Go back to your dark cave of software RT that can't even draw some limited tank shadows without demolishing the framerate.

 
Joined
Oct 1, 2006
Messages
4,884 (0.76/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
Don't like that Metro Exodus uses RT GI, the heaviest form of RT there is? The kind that was thought impossible to run on a single GPU?

Take this smack to the face, then: scene-wide RT shadows in Shadow of the Tomb Raider, running at 60 fps, 1440p, max settings on a single RTX 2070. This is the power of hardware RT! Go back to your dark cave of software RT that can't even draw some limited tank shadows without demolishing the framerate.

View attachment 134414
Not so heavy when current RTX runs such a low ray count that I can see artifacts in the lighting quite often without trying to look for them.
This is worse in BF5, which only uses RT for reflections. Even with such limited RT, the current GPUs just aren't good enough.
There is an alternative to software or hardware RT: no RT at all in 2019.
I like how the only thing you can convince yourself with is how poorly games run with RT on. Oh, this card runs LESS BAD! :laugh:
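The visible noise has a simple statistical root: a ray-traced pixel is a Monte Carlo average of random samples, and the error only shrinks with the square root of the sample count, so halving the noise costs four times the rays. A small illustration with a hypothetical pixel (not any game's actual sampling):

```python
import random

def noisy_pixel_error(samples, trials=2000, seed=1):
    """Mean absolute error when estimating a pixel whose true value is 0.5
    by averaging `samples` uniform random light samples in [0, 1)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        estimate = sum(rng.random() for _ in range(samples)) / samples
        total += abs(estimate - 0.5)
    return total / trials

# 16x the rays cuts the error only about 4x (sqrt(16)), not 16x.
print(noisy_pixel_error(1), noisy_pixel_error(16))
```

Tracing more rays per frame, plus denoising, is the only way around that square-root wall, which is why low-ray-count RT shows artifacts.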
 
Joined
Jun 18, 2018
Messages
150 (0.07/day)
Processor i5 3570K @ 5GHz (1.32V) - de-lid
Motherboard ASRock Z77 Extrem 4
Cooling Prolimatech Genesis (3x Thermalright TY-141 PWM) - direct die
Memory 2x 4GB Corsair VENGEANCE DDR3 1600 MHz CL 9
Video Card(s) MSI GTX 980 Gaming 4G (Alpenföhn Peter + 2x Noctua NF-A12) @1547 Mhz Core / 2000 MHz Mem
Storage 500GB Crucial MX500 / 4 TB Seagate - BarraCuda
Case IT-9001 Smasher -modified with a 140mm top exhaust
Audio Device(s) AKG K240 Studio
Power Supply be quiet! E9 Straight Power 600W
It's embarrassing seeing people who normally embrace tech advancement fighting against RTRT so hard.

Of course hardware that improves handling RTRT is too expensive. It's supposed to be. Remember when cell phones were new or SSDs?

RTRT kills the performance of my GPU. It's supposed to. It's for the future. It's not for past GPUs.

Nvidia is taking control of RTRT with RTX GPUs. They may try but they will fail. AMD already has a hardware solution for handling RTRT better with the upcoming PS5. Intel has a solution as well.

Developers are slow to embrace RTRT and some implementations are sloppy. Time takes time.

To me it looks like a good thing for gamers eventually but there are of course some rough spots right now.

I can only speak for myself and, to clarify, I do believe that RTRT belongs in future 3D render engines.
The major caveat, in my opinion, is that Nvidia half-assed the implementation big time.

We know that rendering with RTRT can deliver better illumination, shadow casting and reflections than any comparable rasterized renderer could; in some cases it even enhances other rendering techniques, like subsurface scattering or physically based textures, quite a bit.

But neither the current software nor the current RTX hardware can deliver that.

Battlefield V and Metro Exodus had to dial every RTX effect back to achieve a fluid 60+ fps at 1080p on a 2080 Ti / RTX Titan.
Shadow of the Tomb Raider only implemented RTX shadows.

For RTRT to clearly up the ante over current rasterized rendering, you need the full suite:
global illumination with a sufficient number of sample rays and bounces to get proper lighting, shadows and reflections at 60+ fps.
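For intuition on why the number of bounces matters: each bounce contributes an indirect-lighting term attenuated by the surface albedo, so truncating the bounce count loses a predictable fraction of the light. A minimal sketch with made-up numbers (a uniform scene with albedo 0.5):

```python
def gathered_light(emitted, albedo, bounces):
    """Light gathered along a path allowed `bounces` bounces in a uniform
    scene: the direct term plus albedo-attenuated indirect terms."""
    return sum(emitted * albedo ** k for k in range(bounces + 1))

# The converged answer is emitted / (1 - albedo) = 2.0 for albedo 0.5.
# A single bounce captures only 75% of it, which is why truncated GI
# visibly darkens interiors.
print(gathered_light(1.0, 0.5, 1))   # → 1.5
print(gathered_light(1.0, 0.5, 20))  # → ~2.0
```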

RTX, unfortunately, is not optimized enough to deliver that on the current cards, especially since only the upper range (2080, 2080 Super, 2080 Ti, RTX Titan) can push the current RTX implementations at proper framerates and resolutions.
Add to that, it is a feature proprietary to Turing, and the 20xx series received an extra price bump on top of the last generations' steadily rising prices.

You'd be really hard-pressed to applaud Nvidia for their innovation here.
They delivered RTRT, yes, but at a mediocre level against traditional rasterized rendering and with a steep price premium.

One can only hope that the somewhat bad impression Nvidia left on RTRT by trying to corner the market with RTX gets dispelled by the coming solutions from Intel and AMD.
 
I can only speak for myself and, to clarify, I do believe that RTRT belongs in future 3D render engines.
The major caveat, in my opinion, is that Nvidia half-assed the implementation big time.

We know that rendering with RTRT can deliver better illumination, shadow casting and reflections than any comparable rasterized renderer could; in some cases it even enhances other rendering techniques, like subsurface scattering or physically based textures, quite a bit.

But neither the current software nor the current RTX hardware can deliver that.

Battlefield V and Metro Exodus had to dial every RTX effect back to achieve a fluid 60+ fps at 1080p on a 2080 Ti / RTX Titan.
Shadow of the Tomb Raider only implemented RTX shadows.

For RTRT to clearly up the ante over current rasterized rendering, you need the full suite:
global illumination with a sufficient number of sample rays and bounces to get proper lighting, shadows and reflections at 60+ fps.

RTX, unfortunately, is not optimized enough to deliver that on the current cards, especially since only the upper range (2080, 2080 Super, 2080 Ti, RTX Titan) can push the current RTX implementations at proper framerates and resolutions.
Add to that, it is a feature proprietary to Turing, and the 20xx series received an extra price bump on top of the last generations' steadily rising prices.

You'd be really hard-pressed to applaud Nvidia for their innovation here.
They delivered RTRT, yes, but at a mediocre level against traditional rasterized rendering and with a steep price premium.

One can only hope that the somewhat bad impression Nvidia left on RTRT by trying to corner the market with RTX gets dispelled by the coming solutions from Intel and AMD.
Pretty much this. We know real-time ray tracing might be great in the future, but somehow many people believe "the future" means 2018 and not 2023.
Even Nvidia believes that RTRT will not be ready until 2023.
 
Battlefield V and Metro Exodus had to dial every RTX effect back to achieve a fluid 60+ fps at 1080p on a 2080 Ti / RTX Titan.
Metro and Battlefield V run at 1440p, 60 fps on a 2080 Ti.


Not so heavy when current RTX runs such a low ray count that I can see artifacts in the lighting quite often without trying to look for them.
It's still miles ahead of any software RT solution, and it delivers massive IQ increases regardless of the low ray count or whatever else you convince yourself of.

This is worse in BF5, which only uses RT for reflections. Even with such limited RT, the current GPUs just aren't good enough.
Battlefield V and Metro Exodus had to dial every RTX effect back to achieve a fluid 60+ fps at 1080p on a 2080 Ti / RTX Titan.
Shadow of the Tomb Raider only implemented RTX shadows.
Battlefield V was just the first game to use RTX; developers learned a lot after it. Control used reflections + shadows + GI, and it ran and looked wonderful. We now have full path tracing of all shadows, reflections and lighting in games such as Quake II and Minecraft.
 
Battlefield V was just the first game to use RTX; developers learned a lot after it. Control used reflections + shadows + GI, and it ran and looked wonderful. We now have full path tracing of all shadows, reflections and lighting in games such as Quake II and Minecraft.
Minecraft is exactly the "filthy non-RTX ray tracing / path tracing" that you hated so much.
As for Quake II RTX, a game from 1997 that requires a 2070 Super just to run at 1080p 60 FPS: Nvidia made this one themselves, so there's no "the devs don't know how to make games" argument.
 
Minecraft is accelerated by RTX, genius; the non-RTX version is both lower quality and lower performance.
Oh really? There are many versions of Minecraft, and they are not really compatible with each other.
Also, when you act like you know everything and start calling people names, you'd better make sure what you say is true.
Minecraft RTX was announced in August 2019 and has no solid release date.
 
The current ray-tracing attempts are difficult: companies try to pack it into rasterized engines (WoT and the RTX promo games). There are 3D model representations better suited to it, each with its own advantages and disadvantages. Building a ray-traced game from the ground up, with the model and texture structures ray tracing prefers, might take some out-of-the-box thinking. And if you want to support rasterization at that stage, you probably have to build two different games.

Our hardware is still very far from that point; maybe in 10 to 15 years. Right now what they're trying to push is hybrid rendering, which combines RT and rasterization. In Nvidia's vision, triple-A developers will completely ditch baked effects in favor of RT solutions in the 2023 time frame, and that's just triple-A; it will take a few more years after that for indie developers to adopt the approach.

I hope Nvidia drops its proprietary RTX tech and goes with DXR and Vulkan ray tracing, the same way AMD dropped Mantle for DX12 and Vulkan. That way we might all benefit from better, fully ray-traced games :)
My background thinking here is the hardware PhysX games, which could only add physics simulation on top but couldn't build a game around it. A game developer would never cripple their sales by tying the game to one company's hardware.

IHV implementations will always be proprietary, even when they're built on an existing open standard. In Nvidia's case, they don't need to drop anything from RTX: their solution already works with the existing open standards (Microsoft's DXR and Vulkan). This time around, Nvidia was smart not to push it via a proprietary API (like GPU PhysX needing CUDA) and made their RTX implementation compliant with existing open standards from the get-go.

Also, AMD did not drop Mantle just to make way for DX12 and Vulkan. AMD's initial plan was to keep Mantle at the forefront of modern 3D APIs, continuing to exist alongside DX and Vulkan. The problem is that AMD wanted to handle Mantle exactly the way Nvidia handles CUDA: AMD would have full control of Mantle's development, while others could take Mantle and work around it to make it run on their hardware, and Mantle would always be built for the strengths of AMD's GCN hardware first. Game developers wanted Mantle handled like MS DX or Khronos Group's OpenGL, where every IHV has a hand in shaping the API spec, because they knew that if AMD didn't fully open Mantle's development, the other IHVs would never fully support the API.
 
Don't like that Metro Exodus uses RT GI, which is the heaviest form of RT there is? The kind that was thought to be impossible to run on a single GPU?!

Take this smack to the face then: scene-wide RT shadows in Shadow of the Tomb Raider, running at 60 fps, 1440p, max settings on a single RTX 2070. This is the power of hardware RT! Go back to your dark cave of software RT that can't even draw some limited tank shadows without demolishing the fps.

View attachment 134414

Heaviest there is? Global illumination has been done for decades on all sorts of GPUs. The fact that RT needs such an amount of horsepower for it, while still handling only a *single point light*, is utterly ridiculous. It does show us nicely how incredibly far we still are from achieving the sort of realism required to truly make full RT stick in games. Because as long as it doesn't, it will be fighting a much cheaper-to-render rasterized opposition in screenshots, trailers, and all non-RT games.
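To put some rough numbers on that horsepower claim (purely illustrative assumptions, not measured figures): even a minimal hybrid setup of one GI bounce ray plus one shadow ray per pixel adds up fast at 1440p/60.

```python
# Back-of-envelope ray budget for 1440p @ 60 fps.
# rays_per_pixel = 2 is an assumption: 1 GI bounce ray + 1 shadow ray.
width, height = 2560, 1440
pixels = width * height                      # 3,686,400 pixels per frame
rays_per_pixel = 2
fps = 60

rays_per_second = pixels * rays_per_pixel * fps
print(f"{rays_per_second / 1e9:.2f} Gigarays/s")   # 0.44 Gigarays/s
```

And that ~0.44 Gigarays/s is already a sizable slice of Turing's advertised peak, while incoherent GI rays trace far slower than the coherent rays those marketing numbers are based on.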

RT is a big toolbox, and the problem is that if you're only using part of it, it's no longer truly RT; it's just a fancy post effect. All the examples you gave are one-trick ponies... on 'optimal' hardware for them.

I'm not a huge fan of him, but Raja put it very well in some interview: RT shouldn't cost the end user extra money. Today it does; you're paying less for a non-RT-enabled GPU in both camps. This tech will ONLY survive in gaming if it gets shoved in slowly and gently, as harmlessly as possible. So far, Nvidia's Turing approach is more akin to someone stomping you in the face because 'it just works'.
 
Joined
Mar 26, 2009
Messages
175 (0.03/day)
Heaviest there is? Global Illumination has been done for decades on all sorts of GPUs.
That's static GI: low precision and full of defects. RT GI is completely dynamic and void of most defects.

This tech will ONLY survive in gaming if it gets shoved in slowly and gently, as harmless as possible
It's already thriving, next consoles have RT now, all future games will have it.
far away we still are to achieving any sort of realism required to truly make full RT stick in games.
We don't need full RT now; what we need is a hybrid solution, which is exactly what RTX is and what the next consoles will provide.
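A minimal sketch of what 'hybrid' means in practice (toy code; the positions and the single sphere blocker are made-up assumptions): take surface points you would normally get from the rasterizer's G-buffer, then trace just one shadow ray per point instead of ray tracing the whole image.

```python
import math

def traced_shadow(point, light, sphere_center, sphere_radius):
    """Trace one ray from a rasterized surface point toward the light;
    return True if the sphere blocks it (point is in shadow)."""
    d = [l - p for l, p in zip(light, point)]
    dist = math.sqrt(sum(c * c for c in d))
    d = [c / dist for c in d]                      # normalized ray direction
    oc = [p - c for p, c in zip(point, sphere_center)]
    b = sum(dc * occ for dc, occ in zip(d, oc))    # ray/sphere quadratic terms
    c = sum(occ * occ for occ in oc) - sphere_radius ** 2
    disc = b * b - c
    if disc < 0:
        return False                               # ray misses the sphere
    t = -b - math.sqrt(disc)
    return 0.0 < t < dist                          # hit between point and light

# Pretend the rasterizer already gave us these ground points.
light = (0.0, 5.0, 0.0)
blocker = ((0.0, 2.0, 0.0), 1.0)                   # sphere hovering above the ground
for x in (-3.0, 0.0, 3.0):
    lit = not traced_shadow((x, 0.0, 0.0), light, *blocker)
    print(x, "lit" if lit else "shadow")           # only x = 0.0 ends up in shadow
```

Rasterization still does the heavy lifting of deciding what is visible; the rays only answer one narrow question per pixel, which is why the hybrid approach is tractable at all.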
 
Joined
Jun 2, 2017
Messages
7,901 (3.15/day)
System Name Best AMD Computer
Processor AMD 7900X3D
Motherboard Asus X670E E Strix
Cooling In Win SR36
Memory GSKILL DDR5 32GB 5200 30
Video Card(s) Sapphire Pulse 7900XT (Watercooled)
Storage Corsair MP 700, Seagate 530 2Tb, Adata SX8200 2TBx2, Kingston 2 TBx2, Micron 8 TB, WD AN 1500
Display(s) GIGABYTE FV43U
Case Corsair 7000D Airflow
Audio Device(s) Corsair Void Pro, Logitch Z523 5.1
Power Supply Deepcool 1000M
Mouse Logitech g7 gaming mouse
Keyboard Logitech G510
Software Windows 11 Pro 64 Steam. GOG, Uplay, Origin
Benchmark Scores Firestrike: 46183 Time Spy: 25121
Our hardware is still very far from that point; maybe in 10 to 15 years. Right now what they're trying to push is hybrid rendering, which combines RT and rasterization. In Nvidia's vision, triple-A devs will completely ditch baked effects in favor of RT solutions in the 2023 time frame, and that's just triple-A; it will take a few more years after that for indie developers to adopt the approach.



An IHV implementation will always be proprietary, even if it is built on an existing open standard. In Nvidia's case they don't need to drop anything from RTX: their RTX solution already works with the existing open standards (Microsoft's DXR and Vulkan ray tracing). This time around Nvidia was smart enough not to push it via a proprietary API (the way GPU PhysX needed CUDA) and made their RTX implementation compliant with the existing open standards from the get-go.

Also, AMD did not drop Mantle to make way for DX12 and Vulkan. AMD's initial plan was to make Mantle a forefront modern 3D API that would continue to exist alongside DX and Vulkan. The problem is that AMD wanted to handle Mantle exactly the way Nvidia handles CUDA: AMD would keep full control of Mantle's development, while others could take Mantle and work around it to make it run on their hardware, and Mantle would always be built for the strengths of AMD's GCN hardware first. Game developers probably wanted Mantle handled the way Microsoft's DX and Khronos Group's OpenGL are, where every IHV has a hand in shaping the API spec, because they knew that if AMD did not fully open Mantle's development to other IHVs, those IHVs would never fully support the API.

Um, as far as I know Mantle essentially became DX12, as all of its benefits were integrated into that API.
 
Joined
Dec 28, 2006
Messages
4,378 (0.69/day)
Location
Hurst, Texas
System Name The86
Processor Ryzen 5 3600
Motherboard ASROCKS B450 Steel Legend
Cooling AMD Stealth
Memory 2x8gb DDR4 3200 Corsair
Video Card(s) EVGA RTX 3060 Ti
Storage WD Black 512gb, WD Blue 1TB
Display(s) AOC 24in
Case Raidmax Alpha Prime
Power Supply 700W Thermaltake Smart
Mouse Logitech Mx510
Keyboard Razer BlackWidow 2012
Software Windows 10 Professional
RT is like HDR was in 2004. Sure, the GeForce 6 could do it, but it brought the mighty 6800 Ultra to its knees. It wasn't really until the 8800 that HDR got to a playable state at any decent resolution. Same with RT: give it time.
 
Joined
Sep 17, 2014
Messages
20,891 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
That's static GI: low precision and full of defects. RT GI is completely dynamic and void of most defects.


It's already thriving, next consoles have RT now, all future games will have it.

We don't need full RT now; what we need is a hybrid solution, which is exactly what RTX is and what the next consoles will provide.

Next gen consoles that don't exist have it 'now'. Future games that don't exist have it 'now'.

That's a rather strange definition of 'now', and an even stranger one of 'thriving'. If 'thriving' equals the number of marketing/press announcements you get fed... then I guess VR rules the planet by now and RT is a close second. Maybe after 'AI', though.

RT GI void of defects? That's cool stuff, with a limited number of rays. It has a resolution much like any rasterized affair, handily helped out by denoising algorithms to make it look better. Don't let the marketing get to your head; when in doubt, fire up some RTX at low/medium for proof. Dynamic lighting is also not very new to us. The only real win is that it's less work to develop, because it is based on set rules; you're not left creating the rules yourself. That's about all the 'win' you will really have for the near future. The quality improvement only comes many years later, and only on high-end GPUs; definitely not on next-gen consoles.
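On that 'limited number of rays' point: the noise in a Monte Carlo estimate only shrinks with the square root of the sample count, so brute-forcing clean images with more rays gets expensive fast, hence the denoisers. A quick illustration (sigma = 1.0 is an arbitrary assumed per-sample deviation):

```python
import math

# Standard error of an N-sample Monte Carlo estimate: sigma / sqrt(N).
# sigma = 1.0 is an arbitrary assumed per-sample standard deviation.
sigma = 1.0

def std_error(n_rays):
    return sigma / math.sqrt(n_rays)

for n in (1, 4, 16, 64):
    print(n, std_error(n))    # 1.0, 0.5, 0.25, 0.125: 4x the rays halves the noise
```

Each halving of the visible noise costs 4x the rays, which is exactly why real-time RT leans on denoising filters instead of a bigger ray budget.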

Still, it will be interesting to see what they're capable of squeezing out of that new console with an AMD GPU. So far we know just about nothing about that, but we do know Nvidia needs a massive die to get somewhat playable 'effects' going. So... I guess the 1080p 60 FPS PlayStation fun will be over soon; 25-30 seems to be the golden target again.
 