
NVIDIA DLSS Test in Battlefield V

W1zzard

Administrator
Today's Battlefield V update has brought numerous improvements to the game, including long-awaited support for NVIDIA's DLSS technology. We took a closer look at image quality with DLSS on and off and measured its performance cost.

 
I'd like to see DLSS performance vs just turning the resolution down. I'm sure it looks ok in motion, but man that is some Vaseline right there. Seems like adaptive resolution would do the same thing, just minus the "upscaling".
 
Well, I can certainly see a difference in some shots. In others, I swear it's like it's taking an overly sharpened image and bringing it back to 'normal', and it looks just fine. As many have said with RT in general, chances are you won't notice this much in fast-paced action, which this game has most of the time. The performance gains, IMO, outweigh any IQ differences, particularly in a fast-paced FPS like this.
 
Great for streamers, because blurry images compress fast and efficiently.
 
IMO this looks much more tolerable than the FF15 implementation, as long as there is no foliage involved.
It would be quite useful if NVIDIA actually had low-end cards that could run DLSS.
 
Can you guys please tell me if DLSS is supposed to look like this? I'm so disappointed by the image quality that I'm even suspecting my RTX 2080 is faulty...
Make sure to open them in a new tab and zoom to 100%.

I've compared DLSS OFF vs DLSS ON vs DLSS OFF at 75% resolution scale.

[Attached: DLSS ON vs OFF vs 75%.png, DLSS ON vs OFF vs 75%_2.png]
 
DLSS is a dead thing for sure...

It felt like needing a visit to the optometrist for a stronger pair of glasses.

To those moaning about 4K and FPS: this defeats the whole point of having a sharper screen in the first place. Such tech is irrational. Just upscale from your desired resolution.
 
Can you guys please tell me if DLSS is supposed to look like this? I'm so disappointed by the image quality that I'm even suspecting my RTX 2080 is faulty...
Make sure to open them in a new tab and zoom to 100%.

I've compared DLSS OFF vs DLSS ON vs DLSS OFF at 75% resolution scale.

[Attachments 116389 and 116390: DLSS comparison screenshots]
Nah, that is just the nature of DLSS.
DLSS fundamentally renders at a lower resolution and then upscales.
It tries to compensate for the lost image quality with machine learning, but it doesn't always work well.

Edit: After seeing what DLSS did in Metro, it certainly didn't work well there.
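For what it's worth, the pixel-count arithmetic behind "render lower, then upscale" is easy to sketch. This is a toy illustration only; NVIDIA doesn't publish the exact internal resolution DLSS uses in every mode, so the ~1440p figure below is an assumption, not a documented value:

```python
# Toy arithmetic: how much shading work a lower internal resolution saves.
def shading_cost_ratio(native, scale):
    """Fraction of native shading work when rendering at `scale` per axis."""
    w, h = native
    return (int(w * scale) * int(h * scale)) / (w * h)

native_4k = (3840, 2160)

# A 75% resolution scale shades ~56% of the native pixels:
print(round(shading_cost_ratio(native_4k, 0.75), 2))  # 0.56

# An assumed ~1440p internal resolution for DLSS at 4K shades ~44%:
print(round((2560 * 1440) / (3840 * 2160), 2))  # 0.44
```

Either way the GPU shades roughly half the pixels, which is where the FPS uplift comes from; the upscale/AA pass then claws some of that time back.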
 
idk, even with the performance boost you're clearly losing image quality... unless you are playing competitive games or e-sports (like PUBG or CS:GO), where higher framerates can help, this tech is just not good.
 
I am deeply shocked at the lack of criticism from this website. DLSS is nothing more than a fancy name for rendering the game at a lower resolution, THAT'S WHERE THE PERFORMANCE UPLIFT IS COMING FROM.
I play BFV at 1080p 144 Hz and am already very unhappy about TAA, which makes the game very blurry, so adding another "feature" that brings more blur degrades the image even further. There is also a quite noticeable lack of detail in the DLSS-rendered pictures. NO, thank you, I want my game to look great, not like a blurry mess propped up by greedy trickery ;-)
 
IMO this looks much more tolerable than the FF15 implementation, as long as there is no foliage involved.
It would be quite useful if NVIDIA actually had low-end cards that could run DLSS.

It's apparently running on the RT cores, and it doesn't seem NVIDIA wants to put RT cores in low-end cards for now.
That's the market that might actually benefit from it, so DLSS is a fail so far.
 
Nah, that is just the nature of DLSS.
DLSS fundamentally renders at a lower resolution and then upscales.
It tries to compensate for the lost image quality with machine learning, but it doesn't always work well.

Edit: After seeing what DLSS did in Metro, it certainly didn't work well there.

And it does some anti-aliasing, which actually shows in @MikjoA's game capture images... But that horrible, horrible blur, yuck.
 
Can you guys please tell me if DLSS is supposed to look like this ? I mean I'm so disappointed by the image quality that I'm even suspecting that my RTX 2080 is faulty...
I want to know: when using DLSS, is it combined with TAA, or does DLSS replace TAA? And when DLSS is disabled, is TAA disabled as well?
 
Seems like DLSS is not a good fit for real-time rendering in games.
 
With DLSS you can turn your gaming PC into a console finally! Good job, Nvidia! :-)
 
I think this is a situation where NVIDIA's marketing department has failed. DLSS is basically a content-aware anti-aliasing and upscaling technology. It may work better than TAA or MSAA when compared at 1440p rendering. By using supersampled images to train the AA, they can get a much better approximation of the "properly" upscaled and anti-aliased image. But calling this "4K" is wrong. You can't create information from nothing; all you can do is make a better guess at the information. Calling DLSS "4K" when it is applied to a 1440p image just leads to people being disappointed.

The problem is, NVIDIA has to convince us that we can have our cake and eat it too. They want you to turn on DLSS and RTX and say "wow, I am getting ray tracing at 4K and it runs fine!" Their marketing game is completely focused on proving that the 2080 Ti can do 4K and RTX at the same time. Well, it can't, and it leaves a sour taste in enthusiasts' mouths. The pictures that @MikjoA posted prove it. Resolution scale may lead to more aliasing, but it looks cleaner too. Maybe all that extra rendering time they're spending on DLSS actually makes more sense being put towards a higher rendering resolution.
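The "can't create information from nothing" point is easy to demonstrate with a toy resampler. This is naive nearest-neighbor resampling, not DLSS's learned network, but the underlying limit is the same:

```python
# Toy demo: detail above the Nyquist limit of the smaller buffer is gone forever.
def down_up(img, factor=2):
    """Naive downsample by `factor`, then nearest-neighbor upscale back."""
    small = [row[::factor] for row in img[::factor]]
    rows, cols = len(img), len(img[0])
    return [[small[r // factor][c // factor] for c in range(cols)]
            for r in range(rows)]

# A 1-pixel checkerboard is the worst case: every pixel is high-frequency detail.
checker = [[(r + c) % 2 for c in range(8)] for r in range(8)]
restored = down_up(checker)

wrong = sum(checker[r][c] != restored[r][c] for r in range(8) for c in range(8))
print(wrong / 64)  # 0.5 -- half the pixels come back wrong; unrecoverable
```

A trained upscaler makes far better guesses than this on natural images, but it is still guessing, which is exactly why calling the result "4K" oversells it.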
 
Bleh, more ugliness from DLSS. It just looks blurred out. Best way I can describe it is like when you're in art class using oil pastels and you rub the colors together to get them to blend.
 
I wonder... can anyone test the image quality and performance when using DSR + DLSS at the same time, so they cancel each other out resolution-wise? Would the image quality benefit compared to a no-DSR + no-DLSS setting?
 
I wonder... can anyone test the image quality and performance when using DSR + DLSS at the same time, so they cancel each other out resolution-wise? Would the image quality benefit compared to a no-DSR + no-DLSS setting?

My guess is you would still have blur because it is trying to do upscaling and anti-aliasing. The AA part seems aggressive even compared to 1440p + TAA.
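A back-of-envelope answer to the DSR + DLSS question, assuming DSR 4x (2x per axis) on a 1080p display and a hypothetical 2/3-per-axis DLSS internal resolution (NVIDIA doesn't document the exact factor, so treat these numbers as illustrative):

```python
# Hypothetical numbers: DSR 4x target from 1080p, DLSS shading at ~2/3 per axis.
display = (1920, 1080)
target = (display[0] * 2, display[1] * 2)   # DSR 4x render target: (3840, 2160)

# Assumed 2/3-per-axis internal scale (integer math to stay exact):
internal = (target[0] * 2 // 3, target[1] * 2 // 3)  # (2560, 1440)

print(target, internal)  # (3840, 2160) (2560, 1440)

# Net shading load relative to plain native 1080p rendering:
print((internal[0] * internal[1]) / (display[0] * display[1]))  # ~1.78x
```

So under these assumptions the pair doesn't quite cancel out: you would still shade roughly 78% more pixels than native 1080p, plus the cost of the DLSS pass itself, in exchange for whatever AA quality the combination yields.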
 
Don't like your FPS on a 4K monitor? Buy better hardware or drop to the 1440p ultra preset. It will look better and provide more FPS than this. So what's the point?
 
I would like to agree with the overall sentiment and add another down vote to this technology. This is a fail in my opinion and I'm surprised at the positive reviewer sentiment.
Well, it IS a "marketing technology". No matter how prettily you put it, you can't upscale things into existence from nothing, which is why details that weren't there won't just appear.
It would be much better if it had no performance cost, for example if it were carried out by a separate chip; in that case it would be a very useful upscaling technology to replace native monitor upscaling. But right now it's a choice between sacrificing performance for DLSS or sacrificing performance for resolution, and resolution will always win because it natively adds details instead of pretending to add them.
 
Looks like a loss in image quality. Everything loses sharpness and looks fuzzy to me. I don't have an RTX card, but I don't think I'd use that feature if I did.
 