
NVIDIA GeForce RTX 2060 Shows Up in Final Fantasy XV Benchmarks

Joined
Sep 17, 2014
Messages
20,776 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
If you can't admit Nvidia had a great idea then you must be in denial. The problem was the execution.

Ideas don't mean a thing if you cannot execute them properly. At that point an idea just turns into a scammy, crappy product and promise.
 
Joined
Dec 31, 2009
Messages
19,366 (3.72/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
It's consistent if you do the presets like 3DMark, and there really is no difference, except the FF15 benchmark looks better.

Gamers Nexus did custom presets and ruined the rep of the benchmark. At least in my thread it's an even playing field, where your score is only allowed under standard 1080p:
https://www.techpowerup.com/forums/threads/post-your-final-fantasy-xv-benchmark-results.242200/
Consistency/reliability was not the issue. That's something you brought up. :)

The issue happens with presets AFAIK (needs HairWorks). Regardless, I just showed what the hullabaloo was about. ;)

You aren't running this as an analog for the in-game FPS, so there isn't a worry. It's still a crappy bench for those reasons, however. Again, just sharing what the issues are since it was brought up. :)
 
Joined
Sep 17, 2014
Messages
20,776 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
This isn't a bad move by Nvidia though; you need to start from somewhere or it never gets adopted in the first place. The pricing is insane, but the tech is what we needed to begin with.

Explain how you see it scaling up over the next few generations. That is one hell of a long way to trickle down before we see 2080 Ti RT performance in lower and midrange cards. And even then the performance is just enough for some reflections at 1080p.

I'm not seeing it when 7 nm might be the last big step forward.
 
Joined
Jul 5, 2013
Messages
25,559 (6.52/day)
A benchmark that doesn't do proper object culling and renders offscreen objects at all times is bad. Period. Poorly developed, poorly implemented, poorly optimized.
Your opinion, not shared by everyone. Consistency is a very critical metric where benchmarking tools are concerned.
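For context on the "object culling" point being argued here: view-frustum culling is the standard step a renderer uses to skip objects that are entirely off-screen. Below is a minimal, illustrative C++ sketch of the idea, assuming hypothetical types and names; it is not code from the FFXV benchmark.

```cpp
#include <array>
#include <cstddef>
#include <vector>

// Illustrative types, not from any real engine.
struct Plane  { float nx, ny, nz, d; };    // plane equation: n·p + d = 0, normal pointing inward
struct Sphere { float x, y, z, radius; };  // bounding sphere of a scene object

// An object is skipped (culled) if its bounding sphere lies fully behind any
// of the six frustum planes; otherwise it is submitted for rendering.
bool insideFrustum(const Sphere& s, const std::array<Plane, 6>& frustum) {
    for (const Plane& p : frustum) {
        float dist = p.nx * s.x + p.ny * s.y + p.nz * s.z + p.d;
        if (dist < -s.radius)
            return false;  // entirely outside this plane -> not visible
    }
    return true;           // potentially visible
}

// A benchmark that never runs a test like this effectively draws every object
// every frame, inflating GPU load in a way real gameplay would not.
std::vector<std::size_t> visibleObjects(const std::vector<Sphere>& bounds,
                                        const std::array<Plane, 6>& frustum) {
    std::vector<std::size_t> visible;
    for (std::size_t i = 0; i < bounds.size(); ++i)
        if (insideFrustum(bounds[i], frustum))
            visible.push_back(i);
    return visible;
}
```

Whether skipping this step makes the tool invalid as a consistent benchmark, or merely unrepresentative of gameplay, is exactly the disagreement in this thread.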
 

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
Joined
May 7, 2012
Messages
2,562 (0.59/day)
Location
Rhode Island
System Name Whaaaat Kiiiiiiid!
Processor Intel Core i9-12900K @ Default
Motherboard Gigabyte Z690 AORUS Elite AX
Cooling Corsair H150i AIO Cooler
Memory Corsair Dominator Platinum 32GB DDR4-3200
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA @ Default
Storage Samsung 970 PRO 512GB + Crucial MX500 2TB x3 + Crucial MX500 4TB + Samsung 980 PRO 1TB
Display(s) 27" LG 27MU67-B 4K, + 27" Acer Predator XB271HU 1440P
Case Thermaltake Core X9 Snow
Audio Device(s) Logitech G935 Headset
Power Supply SeaSonic Platinum 1050W Snow Silent
Mouse Logitech G903 Lightspeed
Keyboard Logitech G915
Software Windows 11 Pro
Benchmark Scores FFXV: 19329
Explain how you see it scaling up over the next few generations. That is one hell of a long way to trickle down before we see 2080 Ti RT performance in lower and midrange cards. And even then the performance is just enough for some reflections at 1080p.

I'm not seeing it when 7 nm might be the last big step forward.
We haven't even seen this tech implemented properly yet; this tech is still new.

Wait until more games implement RTX, which they will, and then we can figure out where this leads. Until then, this is exactly what we needed to move on.
RTX is far from an invalid idea or tech.

A benchmark that doesn't do proper object culling and renders offscreen objects at all times is bad. Period. Poorly developed, poorly implemented, poorly optimized.
the benchmark works as intended.
 
Joined
Jun 10, 2014
Messages
2,889 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
Explain how you see it scaling up over the next few generations. That is one hell of a long way to trickle down before we see 2080 Ti RT performance in lower and midrange cards. And even then the performance is just enough for some reflections at 1080p.

I'm not seeing it when 7 nm might be the last big step forward.
If I may throw in my two cents:
From looking at the disastrous implementation in Battlefield V, it becomes obvious to me that simulating mostly specular reflections on selected objects is pretty useless and creates more visual problems than it solves. Clearly, a game engine built from the ground up around this technology would do it better, but I still think they chose the wrong approach. I would start by doing the diffuse global illumination primarily, this will get the overall lighting and soft shadows right with great visual improvements. Then, I would do specular reflections fairly softly, because most surfaces are generally not highly reflective. This way, raytracing can become useful without requiring unobtainable computational performance. And with these considerations in mind, I do believe some games could make good use of the performance level the RTX 2080 Ti offers.

You're right that 7 nm might be the last "good shrink". While there will certainly be new techniques pushing transistor counts in the future, progress beyond this will be much slower than in the last few decades. The days of "easy" expansions of GPUs are pretty much past. This means it's up to GPU makers to make the hardware more efficient, and game developers to utilize this hardware more efficiently.
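As a rough illustration of the shading split described above, here is a minimal C++ sketch assuming a hypothetical hybrid renderer and material model (none of these names or weights come from an actual engine): diffuse GI is applied everywhere, while specular reflections are strongly attenuated except on genuinely glossy surfaces.

```cpp
#include <algorithm>

// Illustrative color/material model, not taken from any real engine.
struct Color { float r, g, b; };

Color operator*(const Color& c, float s) { return {c.r * s, c.g * s, c.b * s}; }
Color operator+(const Color& a, const Color& b) { return {a.r + b.r, a.g + b.g, a.b + b.b}; }

// roughness in [0,1]: 0 = mirror-like, 1 = fully matte.
Color shadeHybrid(const Color& rasterDirect,  // direct lighting from the raster pass
                  const Color& rtDiffuseGI,   // ray-traced diffuse bounce lighting
                  const Color& rtSpecular,    // ray-traced specular reflection
                  float roughness)
{
    // Diffuse GI is applied everywhere: it is what fixes the overall light
    // levels and soft shadows, the visually dominant improvement.
    Color result = rasterDirect + rtDiffuseGI;

    // Specular reflections fade out quickly with roughness, so most surfaces
    // get only a faint reflection instead of a "clear-coat" look.
    float specWeight = std::max(0.0f, 1.0f - 4.0f * roughness * roughness);
    return result + rtSpecular * specWeight;
}
```

The point of the weighting is simply that most materials in a scene are rough, so the expensive specular rays matter far less than getting the diffuse term right.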
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Your opinion, not shared by everyone. Consistency is a very critical metric where benchmarking tools are concerned.
the benchmark works as intended.
Sure, it's a consistent benchmark of how one game might act if it were abysmally configured and enforced vendor-specific features at some levels of visual fidelity. It's not even representative of FFXV gameplay...

We haven't even seen this tech implemented properly yet; this tech is still new.

Wait until more games implement RTX, which they will, and then we can figure out where this leads. Until then, this is exactly what we needed to move on.
RTX is far from an invalid idea or tech.
Rather than repeat all of this, I'll just quote myself from another thread:
One thing we need to remember here is that hybrid RT is in its very infancy, and there is no doubt it will improve dramatically over time. There are undoubtedly undiscovered or untried tricks and workarounds to make this perform better, like rendering the RT scene at a lower resolution and upscaling with DLSS, or hitherto-unknown ways of compensating for reduced ray counts. However, it seems unlikely that we'll see >100% performance improvements with current hardware, at least in the next couple of years, which is more or less what is needed to bring this to acceptable levels of price/performance. The previous generation brought us to solid 4K at 60 Hz or better, and now suddenly we're back down to half that, and worse performance than even two generations back at lower resolutions. Add in the dramatic price increases, and we're looking at something that simply isn't acceptable. Making RTRT feasible at all is great, but what Turing really shows us is that it isn't really, and given the necessary die area and bleak outlook for future node shrinks on silicon, it might not be for a long, long time.
If at all.
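For a sense of scale on the "render the RT scene at a lower resolution" idea in the quote above, here is a back-of-the-envelope C++ sketch. The rays-per-pixel budget is a purely illustrative assumption; the only point is that ray cost scales with pixel count, so a half-resolution pass cuts primary-ray work to roughly a quarter.

```cpp
#include <cstdio>

// Primary-ray count for an RT effect pass; numbers are illustrative only.
long long raysPerFrame(int width, int height, int raysPerPixel) {
    return static_cast<long long>(width) * height * raysPerPixel;
}

int main() {
    const int rpp = 2;                                // assumed rays per pixel
    long long full = raysPerFrame(3840, 2160, rpp);   // native 4K pass
    long long half = raysPerFrame(1920, 1080, rpp);   // half-res pass, then upscaled
    std::printf("4K RT pass:       %lld rays/frame\n", full);
    std::printf("Half-res RT pass: %lld rays/frame (%.0f%% of 4K)\n",
                half, 100.0 * half / full);
    return 0;
}
```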
 

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
Joined
May 7, 2012
Messages
2,562 (0.59/day)
Location
Rhode Island
System Name Whaaaat Kiiiiiiid!
Processor Intel Core i9-12900K @ Default
Motherboard Gigabyte Z690 AORUS Elite AX
Cooling Corsair H150i AIO Cooler
Memory Corsair Dominator Platinum 32GB DDR4-3200
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA @ Default
Storage Samsung 970 PRO 512GB + Crucial MX500 2TB x3 + Crucial MX500 4TB + Samsung 980 PRO 1TB
Display(s) 27" LG 27MU67-B 4K, + 27" Acer Predator XB271HU 1440P
Case Thermaltake Core X9 Snow
Audio Device(s) Logitech G935 Headset
Power Supply SeaSonic Platinum 1050W Snow Silent
Mouse Logitech G903 Lightspeed
Keyboard Logitech G915
Software Windows 11 Pro
Benchmark Scores FFXV: 19329
If I may throw in my two cents:
From looking at the disastrous implementation in Battlefield V, it becomes obvious to me that simulating mostly specular reflections on selected objects is pretty useless and creates more visual problems than it solves. Clearly, a game engine built from the ground up around this technology would do it better, but I still think they chose the wrong approach. I would start by doing the diffuse global illumination primarily, this will get the overall lighting and soft shadows right with great visual improvements. Then, I would do specular reflections fairly softly, because most surfaces are generally not highly reflective. This way, raytracing can become useful without requiring unobtainable computational performance. And with these considerations in mind, I do believe some games could make good use of the performance level the RTX 2080 Ti offers.

You're right that 7 nm might be the last "good shrink". While there will certainly be new techniques pushing transistor counts in the future, progress beyond this will be much slower than in the last few decades. The days of "easy" expansions of GPUs are pretty much past. This means it's up to GPU makers to make the hardware more efficient, and game developers to utilize this hardware more efficiently.
I agree with everything except the "built around RTX from the ground up" part; there simply was not enough time to do that.

However, this reminds me of PhysX, where you needed a separate card to get good performance before the 400 series.

Sure, it's a consistent benchmark of how one game might act if it were abysmally configured and enforced vendor-specific features at some levels of visual fidelity. It's not even representative of FFXV gameplay...


Rather than repeat all of this, I'll just quote myself from another thread:

If at all.
We are moving in the right direction regardless of how long it takes.

Edit: and like I said, the only issue is the pricing Nvidia chose to introduce this at, with its MSRP.
 
Joined
Jun 10, 2014
Messages
2,889 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I agree with everything except the "built around RTX from the ground up" part; there simply was not enough time to do that.
I do know raytracing and how much time it takes to do it properly, and I didn't expect any game to use it properly in this timeframe.
I'm just saying that raytracing is not flawed because of a bad demonstration, and we should not dismiss Nvidia's groundwork here. This is only the beginning.

To make an analogy: Direct3D 10 was not bad despite Crysis being a flawed game…
 

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
Joined
May 7, 2012
Messages
2,562 (0.59/day)
Location
Rhode Island
System Name Whaaaat Kiiiiiiid!
Processor Intel Core i9-12900K @ Default
Motherboard Gigabyte Z690 AORUS Elite AX
Cooling Corsair H150i AIO Cooler
Memory Corsair Dominator Platinum 32GB DDR4-3200
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA @ Default
Storage Samsung 970 PRO 512GB + Crucial MX500 2TB x3 + Crucial MX500 4TB + Samsung 980 PRO 1TB
Display(s) 27" LG 27MU67-B 4K, + 27" Acer Predator XB271HU 1440P
Case Thermaltake Core X9 Snow
Audio Device(s) Logitech G935 Headset
Power Supply SeaSonic Platinum 1050W Snow Silent
Mouse Logitech G903 Lightspeed
Keyboard Logitech G915
Software Windows 11 Pro
Benchmark Scores FFXV: 19329
People say the reflective surfaces done by in-game ray tracing are too reflective; well, here is real life:
[Attached photos of various reflective real-world surfaces]
 
Joined
Sep 17, 2014
Messages
20,776 (5.97/day)
Location
The Washing Machine
Processor i7 8700k 4.6Ghz @ 1.24V
Motherboard AsRock Fatal1ty K6 Z370
Cooling beQuiet! Dark Rock Pro 3
Memory 16GB Corsair Vengeance LPX 3200/C16
Video Card(s) ASRock RX7900XT Phantom Gaming
Storage Samsung 850 EVO 1TB + Samsung 830 256GB + Crucial BX100 250GB + Toshiba 1TB HDD
Display(s) Gigabyte G34QWC (3440x1440)
Case Fractal Design Define R5
Audio Device(s) Harman Kardon AVR137 + 2.1
Power Supply EVGA Supernova G2 750W
Mouse XTRFY M42
Keyboard Lenovo Thinkpad Trackpoint II
Software W10 x64
I agree with everything except the "built around RTX from the ground up" part; there simply was not enough time to do that.

However, this reminds me of PhysX, where you needed a separate card to get good performance before the 400 series.


We are moving in the right direction regardless of how long it takes.

Edit: and like I said, the only issue is the pricing Nvidia chose to introduce this at, with its MSRP.

Do you really think developers will build engines from the ground up, just to reach a level of graphical fidelity that is comparable to what they can already do with simple, tried and 'cheap' techniques? Do you realize that most of the popular engines today follow a completely different model? There was good reason to implement this through DXR and the DX12 API: low risk. Engines have become SaaS applications, so this whole RT business will be given a priority determined by its customers - the developers that have to implement a costly technique. There are only very few studios that are even in a position to really push an engine. Frostbite is one of those rare exceptions, and the 4A engine probably is another. These are in-house engines, which means the amount of content they are used for is never going to be widespread.

I'll refer you to every other tech that required specific, dedicated dev time. It takes a very, very solid business case to even remotely start that kind of work. Where is that business case? Just an analysis: there isn't one, as long as RTRT isn't available in all but the slowest of GPUs. So that 2080ti performance, right now, for the adoption of RTX, means absolutely nothing at all. There IS no adoption rate as long as this is the price point it requires. And I think everyone can agree that the lower tier GPUs offer unacceptable performance.

I agree there are quite possibly many tweaks and improvements to be made, no doubt about it. But in the same way, we had high hopes for DX12 features like mGPU. How's that going? It's another one of those time- and configuration-heavy implementations with questionable benefits that only touch a minority of the gaming market. Again: there was no business case to be made for it, and look where mGPU is at today.

There are so many parallels with failed technologies it's unreal. The inability to recognize that, to me, is a dangerous form of blindness. This is NOT the same as hardware T&L or anything like that. The timeframe isn't the same, the industry support isn't the same, and the demand for such a tech is radically different. We will always chase photo-realism in digital content, and yes, RT can be a means to this end. But somehow, ever since Nvidia said this was a holy grail, we seem to have forgotten there were very, very good reasons to take a different approach than brute forcing everything, because that is what RTRT really means, no matter how you optimize it. RTRT will ALWAYS be in competition with cheaper tech that gets very similar results and is supported everywhere.

In the end, the only viable developments in any kind of product depend on the price they can be sold at. Look at Elon Musk, and his plans to colonize Mars, and how the very core of everything he plans is based on the cost of that return trip, and bringing it down to a reasonable level. If there is no profitability, it will die - if you cannot bring a development to the masses, it will die.

If there is one thing this industry (gaming) has shown us over the last ten years, it is a steady trend towards the 'one-size-fits-all' method of development. Consoles moved to x86. Most of the games are ported back and forth. Exclusives are rapidly drying up compared to the previous gen. Everything is aimed at maximizing profit while minimizing development time before a product is rushed out the door, only to incrementally fix it afterwards with a much smaller team. That is not the landscape where RTRT will be able to flourish. It's one of those items on a long list of 'nice-to-haves' in software development. And it sure as hell isn't at the top of that list.
 
Joined
Jun 10, 2014
Messages
2,889 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
People say the reflective surfaces done by in-game ray tracing are too reflective; well, here is real life:
I think you missed the point. There are far more weakly reflective than highly reflective surfaces in games and the real world, and more importantly: getting the overall diffuse lighting right is more important for "realism" than getting the sharpest details on specular reflections.

BTW, my car stops being reflective after one trip. :( </off_topic>
 
Joined
Mar 10, 2015
Messages
3,984 (1.21/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
Can you imagine Bethesda trying to put any flavor of RTRT on Creation, lol... I'll return when I stop laughing.
 
Joined
May 2, 2017
Messages
7,762 (3.08/day)
Location
Back in Norway
System Name Hotbox
Processor AMD Ryzen 7 5800X, 110/95/110, PBO +150Mhz, CO -7,-7,-20(x6),
Motherboard ASRock Phantom Gaming B550 ITX/ax
Cooling LOBO + Laing DDC 1T Plus PWM + Corsair XR5 280mm + 2x Arctic P14
Memory 32GB G.Skill FlareX 3200c14 @3800c15
Video Card(s) PowerColor Radeon 6900XT Liquid Devil Ultimate, UC@2250MHz max @~200W
Storage 2TB Adata SX8200 Pro
Display(s) Dell U2711 main, AOC 24P2C secondary
Case SSUPD Meshlicious
Audio Device(s) Optoma Nuforce μDAC 3
Power Supply Corsair SF750 Platinum
Mouse Logitech G603
Keyboard Keychron K3/Cooler Master MasterKeys Pro M w/DSA profile caps
Software Windows 10 Pro
Can you imagine Bethesda trying to put any flavor of RTRT on Creation, lol... I'll return when I stop laughing.
I can imagine it. Specifically, I can imagine it flaking out completely, like all reflective surfaces becoming... transparent? Or maybe only working on the entrails of exploded mutants? Or something. Or global illumination that suddenly stops working, giving us a bright skybox and sun but throwing the entire game into darkness outside of non-RT light sources. Never underestimate Bethesda's ability to make the game flake out in new and exciting ways.
 
Joined
Mar 10, 2015
Messages
3,984 (1.21/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
I'm still laughing but... RTRT is doomed unless Nvidia spends the money to do proper implementations for devs, as studios and/or publishers are at the point where they do only the bare minimum. Like Vayra said, if it isn't going to increase profit or speed up development, they aren't going to do it.

Really, there are only a few studios/publishers that I think are capable enough (fiscal freedom, creative freedom, talent?) to implement this with enough efficiency improvements to make RTRT even remotely viable for this gen: Epic, CDPR, Ubisoft, and I was going to say DICE but...we saw how that worked out.

On top of that, there will be a nearly subatomic-level audience that will even be able to use RTRT, as only the 2080+ tiers will have the grunt to do it. Until there is 2080 Ti-level RTRT processing on the 2060 at sub-$300 USD, there will be little work put in by anyone outside of Nvidia. That may even be next gen, but given what we have already seen, I suspect Star Citizen has an equal chance of launching first.

We haven't even seen this tech implemented properly yet.

I mean, I don't believe that Nvidia threw DICE some manuals about DXR three days before launch. I would be willing to bet that Nvidia was working hand in hand with DICE for many, many months on this implementation, and everyone can agree it was less than impressive.

I would start by doing the diffuse global illumination primarily, this will get the overall lighting and soft shadows right with great visual improvements. Then, I would do specular reflections fairly softly, because most surfaces are generally not highly reflective.

I would have hoped that the devs and Nvidia would have figured out the optimum approach for implementing RTRT to show its value and provide a path forward instead of this... mess. The work now has to go toward fixing what they have already done, which will still make the game look like it has been sprayed with clear coat. They certainly can't go backwards, remove that, and implement the low-hanging fruit per se. Maybe it's just me, but I haven't really seen anything that strikes me as "this looks awesome and I can't wait to have it". I am sure it will come, but I'll probably be playing Star Citizen.

I equate it to trying to hit a grand slam with the bases loaded and striking out instead of just shooting for a base hit to drive some runs in.
 
Joined
Jun 10, 2014
Messages
2,889 (0.81/day)
Processor AMD Ryzen 9 5900X ||| Intel Core i7-3930K
Motherboard ASUS ProArt B550-CREATOR ||| Asus P9X79 WS
Cooling Noctua NH-U14S ||| Be Quiet Pure Rock
Memory Crucial 2 x 16 GB 3200 MHz ||| Corsair 8 x 8 GB 1333 MHz
Video Card(s) MSI GTX 1060 3GB ||| MSI GTX 680 4GB
Storage Samsung 970 PRO 512 GB + 1 TB ||| Intel 545s 512 GB + 256 GB
Display(s) Asus ROG Swift PG278QR 27" ||| Eizo EV2416W 24"
Case Fractal Design Define 7 XL x 2
Audio Device(s) Cambridge Audio DacMagic Plus
Power Supply Seasonic Focus PX-850 x 2
Mouse Razer Abyssus
Keyboard CM Storm QuickFire XT
Software Ubuntu
I would have hoped that the devs and Nvidia would have figured out the optimum approach for implementing RTRT to show its value and provide a path forward instead of this... mess. <snip>
Well, we're still waiting for people to figure out the optimum approach for Direct3D 12; all the games so far are using wrappers.

RTX has been on the market for a massive 2 months, I think we can give them some time…
 
Joined
Mar 10, 2015
Messages
3,984 (1.21/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
RTX has been on the market for a massive 2 months, I think we can give them some time…

I don't disagree, but I am pretty sure DICE did not receive their RTX 2080 Tis the same day we did.
 

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
Joined
May 7, 2012
Messages
2,562 (0.59/day)
Location
Rhode Island
System Name Whaaaat Kiiiiiiid!
Processor Intel Core i9-12900K @ Default
Motherboard Gigabyte Z690 AORUS Elite AX
Cooling Corsair H150i AIO Cooler
Memory Corsair Dominator Platinum 32GB DDR4-3200
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA @ Default
Storage Samsung 970 PRO 512GB + Crucial MX500 2TB x3 + Crucial MX500 4TB + Samsung 980 PRO 1TB
Display(s) 27" LG 27MU67-B 4K, + 27" Acer Predator XB271HU 1440P
Case Thermaltake Core X9 Snow
Audio Device(s) Logitech G935 Headset
Power Supply SeaSonic Platinum 1050W Snow Silent
Mouse Logitech G903 Lightspeed
Keyboard Logitech G915
Software Windows 11 Pro
Benchmark Scores FFXV: 19329
I don't disagree, but I am pretty sure DICE did not receive their RTX 2080 Tis the same day we did.
They received them in July; still not enough time.
 
Joined
Mar 10, 2015
Messages
3,984 (1.21/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum

T4C Fantasy

CPU & GPU DB Maintainer
Staff member
Joined
May 7, 2012
Messages
2,562 (0.59/day)
Location
Rhode Island
System Name Whaaaat Kiiiiiiid!
Processor Intel Core i9-12900K @ Default
Motherboard Gigabyte Z690 AORUS Elite AX
Cooling Corsair H150i AIO Cooler
Memory Corsair Dominator Platinum 32GB DDR4-3200
Video Card(s) EVGA GeForce RTX 3080 FTW3 ULTRA @ Default
Storage Samsung 970 PRO 512GB + Crucial MX500 2TB x3 + Crucial MX500 4TB + Samsung 980 PRO 1TB
Display(s) 27" LG 27MU67-B 4K, + 27" Acer Predator XB271HU 1440P
Case Thermaltake Core X9 Snow
Audio Device(s) Logitech G935 Headset
Power Supply SeaSonic Platinum 1050W Snow Silent
Mouse Logitech G903 Lightspeed
Keyboard Logitech G915
Software Windows 11 Pro
Benchmark Scores FFXV: 19329
Didn't the BFV RTX trailer come out in July?
No? But I guess they could have used Quadros as dev kits, though even those were assembled in July.

The big announcements were in August.
 
Joined
Mar 10, 2015
Messages
3,984 (1.21/day)
System Name Wut?
Processor 3900X
Motherboard ASRock Taichi X570
Cooling Water
Memory 32GB GSkill CL16 3600mhz
Video Card(s) Vega 56
Storage 2 x AData XPG 8200 Pro 1TB
Display(s) 3440 x 1440
Case Thermaltake Tower 900
Power Supply Seasonic Prime Ultra Platinum
That may as well have been the case. I always assumed devs would receive much more than a two-month lead time on such revolutionary tech.
 
Joined
Apr 12, 2017
Messages
147 (0.06/day)
System Name Dell Dimension P120
Processor Intel Pentium 120 MHz 60Mhz FSB
Motherboard Dell Pentium
Memory 24 MB EDO
Video Card(s) Matrox Millennium 2MB
Storage 1 GB EIDE HDD
Display(s) Dell 15 inch crt
Case Dell Dimension
Audio Device(s) Sound Blaster
Mouse Microsoft mouse, no scroll wheel
Keyboard Dell 1995
Software Windows 95 + Office 95
I guess people have forgotten that the GTX 1060 6GB is over 10% faster than the GTX 970,
and the 970 itself is quite a bit faster than the GTX 780.
With that in mind, we should expect the RTX 2060 to at least outperform the GTX 1070.
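The same reasoning as a toy calculation, using made-up relative performance indices: GTX 970 = 100 is an arbitrary baseline, the ~10% figure is the one from the post above, and the GTX 1070 index is only a ballpark assumption for illustration.

```cpp
#include <cstdio>

int main() {
    // Relative performance indices; GTX 970 = 100 is an arbitrary baseline.
    const double gtx970  = 100.0;
    const double gtx1060 = gtx970 * 1.10;  // "over 10% faster" per the post above
    const double gtx1070 = 135.0;          // ballpark assumption, illustrative only
    std::printf("GTX 1060 index: %.0f (vs GTX 970 at %.0f)\n", gtx1060, gtx970);
    // The argument: each generation's xx60 card has landed at or above the
    // previous generation's xx70 card, so the RTX 2060 should at least match
    // the GTX 1070's index.
    std::printf("Expected RTX 2060 index: >= %.0f (GTX 1070 level)\n", gtx1070);
    return 0;
}
```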
 