
Further Optimizations to NVIDIA RTX, DLSS For Battlefield V

Raevenlord

News Editor
DICE and NVIDIA have been hard at work on their partnership to bring RTX and DLSS to Battlefield V. It seems the tech is a constant work in progress, as this isn't the first time the companies have introduced optimizations to the game's handling of DLSS and RTX since its release. According to the patch notes for the latest update, Trial by Fire Update #2, there have been further optimizations to RTX on Ultra - with increased ray trace counts to improve the quality of reflections, which will definitely hit performance further.
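
To put "increased ray trace counts" in perspective (DICE's actual renderer isn't public, so this is only an illustrative sketch with made-up numbers): a reflection is typically estimated by averaging several rays per pixel, and more samples mean less noise in the reflection but proportionally more work per frame.

import random

def reflection_estimate(rays_per_pixel):
    # Toy Monte Carlo estimate of one pixel's reflection: average several noisy ray results.
    true_value = 0.5                              # made-up "correct" reflectance, purely illustrative
    samples = [true_value + random.uniform(-0.3, 0.3) for _ in range(rays_per_pixel)]
    return sum(samples) / rays_per_pixel

for rays in (1, 4, 16):                           # think "Low" vs "Ultra" ray counts
    estimate = reflection_estimate(rays)
    print(f"{rays:>2} rays/pixel -> estimate {estimate:.3f}, error {abs(estimate - 0.5):.3f}")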

Additionally, DLSS now supports rendering in borderless mode, and DLSS sharpness has also been improved. This likely means that NVIDIA's servers are still hard at work processing "ground truth" images for the available in-game scenarios, further refining image quality. This is one of those rare technologies that improves with time, bringing the "fine wine" argument to (likely) its clearest showcase yet.
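
The broad idea behind "ground truth" training, as NVIDIA has described it, is to render reference frames at very high quality, then fit an upscaler so its output matches those references as closely as possible; the improved model is then delivered to players through updates. The real pipeline is a deep neural network and isn't public, so the following is only a toy least-squares version of the same idea, using random data and invented sizes:

import numpy as np

rng = np.random.default_rng(0)

# "Ground truth": a small batch of full-resolution frames (random data, purely illustrative)
hi = rng.random((8, 32, 32))

# The game renders at half resolution; simulate that with 2x2 box downsampling
lo = hi.reshape(8, 16, 2, 16, 2).mean(axis=(2, 4))

# Naive upscale back to full resolution (nearest neighbour)
up = np.repeat(np.repeat(lo, 2, axis=1), 2, axis=2)

# "Training": fit a single global 3x3 correction filter by least squares
# so that the filtered upscale matches the ground truth as closely as possible.
padded = np.pad(up, ((0, 0), (1, 1), (1, 1)), mode="edge")
neighbourhoods = np.stack(
    [padded[:, i:i + 32, j:j + 32].ravel() for i in range(3) for j in range(3)],
    axis=1)                                        # shape: (pixels, 9)
weights, *_ = np.linalg.lstsq(neighbourhoods, hi.ravel(), rcond=None)

restored = (neighbourhoods @ weights).reshape(hi.shape)
print("nearest-neighbour MSE:", np.mean((up - hi) ** 2))
print("trained-filter MSE:   ", np.mean((restored - hi) ** 2))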



 
"It seems the tech is a constant work in progress "

That is precisely why DLSS is dead in the water. An AA that needs specific dev time and crunch time to 'git gud'... oh yeah, and don't use resolution X on card Y, because that's not supported because Nvidia said so.

It'd be hilarious if it wasn't so real.
 
Nvidia working hard to get more rtx in ONE game
 
more optimization means more blurrrr Kappa
DLSS is DOA
 
Even though I have access to RTX/DLSS with a 2060 I mainly bought the card for the performance uplift over the RX 480.
 
Even though I have access to RTX/DLSS with a 2060 I mainly bought the card for the performance uplift over the RX 480.

Same here, mainly for the performance uplift over my old Fury X as well as a better VR experience.

Found use for the CUDA cores and Tensor cores, but haven't found any use for the RT cores yet. Maybe I will try the Quake II RTX rebake when it is available.
 
I wonder how well Intel's discrete GPU will be able to handle RT.
 
"It seems the tech is a constant work in progress "

That is precisely why DLSS is dead in the water. An AA that needs specific dev time and crunch time to 'git gud'... oh yeah, and don't use resolution X on card Y, because that's not supported because Nvidia said so.

It'd be hilarious if it wasn't so real.
Search algorithms and other stuff we use online on a daily basis also need "dev time and crunch time to 'git gud'". Do you still think that's the criterion that makes a tech dead in the water?
 
I wonder how well Intel's discrete GPU will be able to handle RT.
Regular Ray Tracing isn't too demanding, but Real Time Ray Tracing is the one making headlines with a significant performance hit.
 
Search algorithm and other stuff we do online on a daily basis also need "dev time and crunch time to 'git gud'". Do you still think that's the criterion that makes a tech dead in the water?

When it's just Nvidia carrying that, and with the wealth of games coming out, yes. There is no way this can really last or apply to a large catalog. I'm under the impression DLSS is largely meant to offset the RT performance loss, so it'll launch alongside that support, which is very limited.
 
Tried the update and found it was still not worth the performance hit at 2k resolution.
2080 w/ Ryzen 7 2700x
 
When it's just Nvidia carrying that, and with the wealth of games coming out, yes. There is no way this can really last or apply to a large catalog. I'm under the impression DLSS is largely meant to offset the RT performance loss, so it'll launch alongside that support, which is very limited.
What if (some of) the data could be shared between titles?
(And I'm not saying DLSS is a runaway success. Just that I don't count it as dead just yet.)

Regular Ray Tracing isn't too demanding, but Real Time Ray Tracing is the one making headlines with a significant performance hit.
What do you mean? When everything but RT can be done in real time with relative ease, how is RT not too demanding? ;)
 
When it's just Nvidia carrying that, and with the wealth of games coming out, yes. There is no way this can really last or apply to a large catalog. I'm under the impression DLSS is largely meant to offset the RT performance loss, so it'll launch alongside that support, which is very limited.

I think the pros of DLSS outweigh the pros of RTRT frankly. I don't really know the limitations of DLSS, but if it is limited to 1440p upscaled to 4K then I guess the next point is moot. Think of budget cards or IGPs that struggle with 1080p or 1440p. I do think ML has the potential to turn an upscaled image into a pretty decent-looking one; in all honesty, it already does.

I shit on DLSS in the beginning, but the more I looked at the images, the more I came to the realization that I have to stare at a still image to really pick it apart. If I had these images moving by sixty times a second, all while trying not to die and yelling expletives at the sap trying to kill me... I doubt I would notice.

Now, the exact opposite is true for RTRT. I have to really stare at a still image before I can objectively say 'Yeah, that looks better' or 'Nope, looks about the same'. With that flying by at sixty times a second, it just isn't noticeable and doesn't provide any value for the hit. Except for the hot trash that is BFV RTX - that straight up looks like garbage and I would turn it off.
 
I think the pros of DLSS outweigh the pros of RTRT frankly.

There is no technical limitation that says DLSS can only be applied to a specific resolution. There is a technical limitation that says DLSS must be trained for each resolution, and I'm thinking Nvidia has started with the resolutions that make the most sense (to them, at least).
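
To make that concrete (the names and file paths below are invented, this is just an illustration of "must be trained per resolution"): roughly speaking, DLSS can only be offered where a trained network exists for that exact title and output resolution, and any other combination simply isn't exposed - which is the "resolution X on card Y isn't supported" complaint from earlier in the thread.

# Hypothetical per-title, per-resolution model table (names and files invented for illustration)
TRAINED_MODELS = {
    ("Battlefield V", (3840, 2160)): "bfv_4k_weights.bin",
    ("Battlefield V", (2560, 1440)): "bfv_1440p_weights.bin",
}

def dlss_available(title, output_res):
    # DLSS is only offered if a network was trained for this exact combination
    return (title, output_res) in TRAINED_MODELS

print(dlss_available("Battlefield V", (2560, 1440)))  # True
print(dlss_available("Battlefield V", (1920, 1080)))  # False: no trained model for 1080p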

And you're on to something about RTRT. Lighting can be pretty subjective to the human eye. I've posted before: as long as developers can't go full RTRT (and they won't, for years to come), games will offer hybrid solutions, which prevents devs from shedding the rendering steps that wouldn't be needed if they were doing RT only. That's not what will make RT shine in the next few years. But the first developer that lets you catch a glimpse of an enemy sneaking up on you in a reflection just might.
 
And you're on to something about RTRT. Lighting can be pretty subjective to the human eye. I've posted before: as long as developers can't go full RTRT (and they won't, for years to come), games will offer hybrid solutions, which prevents devs from shedding the rendering steps that wouldn't be needed if they were doing RT only. That's not what will make RT shine in the next few years. But the first developer that lets you catch a glimpse of an enemy sneaking up on you in a reflection just might.

I think we are all dependent on consoles, which means at the least we are all dependent on an AMD implementation - with the exception of something from, say, CDPR or another dev that really focuses on PC and just goes all out for it.

There is no technical limitation that says DLSS can only be applied to a specific resolution.

Technical as far as NV giving it a go, that is. I don't see them dropping all this loot on budget and mobile GPUs... for a while, anyway.
 
how is RT not too demanding?
Because it's not real-time RT. I was poking fun at @TheOne for saying just RT, meaning what can be done on a farm of servers rendering the next Pixar movie, or in POV-Ray. ;)
 
I think we are all dependent on consoles, which means at the least we are all dependent on an AMD implementation - with the exception of something from, say, CDPR or another dev that really focuses on PC and just goes all out for it.
You can't go full RTRT even on PC. The installed user base would be tiny. And if Nvidia's next iteration doesn't seriously beef up RTRT and lower prices, that user base will remain tiny for a while longer.
The good news is that every game engine worth something is adding RTRT support (whether it's done right or is just a haphazard attempt is another story). And I'm hoping we'll at least get some 100% RTRT demos to show us what's really possible - even though we already know that.
 
What if (some of) the data could be shared between titles?
(And I'm not saying DLSS is a runaway success. Just that I don't count it as dead just yet.)

Seeing is believing ;)
 
Interesting as you could have bought a Vega 56 for $80-90 less with 3 games included.
Sure, I could have bought a Vega 56, but what if I decided I would rather have Nvidia this time?
 
You can't go full RTRT even on PC. The installed user base would be tiny. And if Nvidia's next iteration doesn't seriously beef up RTRT and lower prices, that user base will remain tiny for a while longer.

Which is exactly why I think DLSS has more potential. The fact they are wasting it on 4k is silly.
 
Which is exactly why I think DLSS has more potential. The fact they are wasting it on 4k is silly.


DLSS requires RT.


Perhaps DLSS requires it due to an algorithm that ties angle dependence to the pixel Z depth calculated by RTRT? In the end there is a huge performance penalty that cannot be masked at high-frame-rate, lower resolutions.
 
DLSS requires RT.


Perhaps DLSS requires it due to an algorithm that ties angle dependence to the pixel Z depth calculated by RTRT? In the end there is a huge performance penalty that cannot be masked at high-frame-rate, lower resolutions.



I don't think so. Doesn't FFXV only have DLSS and not RTX?
 
I don't think so. Doesn't FFXV only have DLSS and not RTX?

What about Anthem? It just got DLSS support and IIRC has no RT of any kind in it either.
 
Regular Ray Tracing isn't too demanding, but Real Time Ray Tracing is the one making headlines with a significant performance hit.
WHAT?
It's the same problem. What "Real Time" means is that it needs to output a significant number of frames per second.

How do you measure "being demanding"? Are these operations "demanding" in your world?
[attached image]
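
For a rough sense of scale (the throughput number below is an assumption picked for easy math, not a benchmark of any card): at 60 FPS the whole frame, ray tracing included, has to fit in roughly 16.7 ms, while an offline renderer can happily spend minutes on a single frame.

# Back-of-the-envelope real-time ray tracing budget (assumed numbers, not benchmarks)
TARGET_FPS = 60
frame_budget_ms = 1000 / TARGET_FPS              # ~16.7 ms for everything in the frame

assumed_rays_per_second = 10e9                   # assumed peak throughput; ignores shading and denoising
rays_per_frame = assumed_rays_per_second / TARGET_FPS

pixels_1440p = 2560 * 1440
print(f"frame budget: {frame_budget_ms:.1f} ms")
print(f"rays per frame: {rays_per_frame:,.0f}")
print(f"rays per pixel at 1440p: {rays_per_frame / pixels_1440p:.1f}")
# An offline render of the same scene could instead spend minutes per frame and thousands of rays per pixel.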
 