
Battlefield V Tides of War GeForce RTX DirectX Raytracing

W1zzard

DICE released the Battlefield V Tides of War update today, which includes improvements to NVIDIA's RTX raytracing technology. We tested the new version on the GeForce RTX 2070, RTX 2080, and RTX 2080 Ti and found impressive performance gains.

 
Much to the chagrin of the NVIDIA haters, DXR has turned out not to be the dud/failure it was portrayed as. Real-time ray tracing has been a pipe dream up to this very moment, yet for some reason people fail to appreciate the monumental work NVIDIA has poured into the technology to make it possible. I'm not buying any of the available RTX cards, but that's because I don't live in an economically developed country and these cards are just too expensive for me. I will surely consider the RTX 3060 once it becomes available; meanwhile I'm quite content with my GTX 1060. While Europeans/Americans (I guess most of TPU's visitors) enjoy an income of over $2500 a month, I earn literal pennies in comparison, so I'm not sure where this "NVIDIA is ripping everyone off" comes from.

@W1zzard

You say that image quality is roughly the same between the various presets, but you've surely missed the difference in how guns are rendered.
 
Nice progress. Now I just need a $300 DXR-capable card that can do 1080p60 on High and I'm sold. Oh, and don't forget a game that utilizes it as well.

Now I'm waiting for a Vulkan version of raytracing; maybe id Software and AMD can collaborate on that.
 
1. Performance will get better. But no, we'll probably never see another 50% uplift again.
2. DXR may not be that visible in an FPS (then again, neither is AA), but you can see how, on lower settings, some objects tend to be brighter than they should be. Of course, you're not playing split-screen with DXR set to Ultra and Off to compare while playing. Still, it's little things like this that make the whole experience more life-like. And for those not looking for life-like in a game, don't worry: once designers master this, they can use it for more artistic rendering (anyone remember XIII?).
 
Seriously, wth is going on with all the big sites? Everyone just rushes to publish quick benchmarks, but no one bothers to check whether the IQ is identical or whether quality was sacrificed for performance.
These things, especially with such big boosts, always need a before/after comparison.
 
You say that image quality is roughly the same between the various presets, but you've surely missed the difference in how guns are rendered.
That was on the medium setting. We didn't notice much (any?) on high and ultra.
DXR may not be that visible in an FPS (then again, neither is AA),
That depends. I absolutely notice AA in all the FPS games I play (2560x1440). In fast action I don't really notice it, but in anything slower than a twitch gunfight I can clearly see aliasing.



W1z - great job on this. :)

I do have a quick question... you saw a performance INCREASE when going from High to Ultra in the graphs?

Seriously, wth is going on with all the big sites? Everyone just rushes to publish quick benchmarks, but no one bothers to check whether the IQ is identical or whether quality was sacrificed for performance.
These things, especially with such big boosts, always need a before/after comparison.
That takes time... but we've already looked at it in the other thread and compared stills. See my first comment in this post.
 
Seriously, wth is going on with all the big sites? Everyone just rushes to publish quick benchmarks, but no one bothers to check whether the IQ is identical or whether quality was sacrificed for performance.
These things, especially with such big boosts, always need a before/after comparison.
MSAA sacrifices quality for performance compared to SSAA; are you still upset about that?

PS You have screenshots in this very article. But it's ok, I know that's not what you were asking for ;)

That depends. I absolutely notice AA in all the FPS games I play (2560x1440). In fast action I don't really notice it, but in anything slower than a twitch gunfight I can clearly see aliasing.

Fair enough. I don't expect this to be any different.
 
You say that image quality is roughly the same between the various presets, but you've surely missed the difference in how guns are rendered.
Added an explanation for that. Depending on the RTX quality settings, different materials become "RTX reflective" or not.
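For illustration, here's a minimal sketch of how a per-preset material cutoff like that could work; the smoothness values and thresholds are hypothetical, not the game's actual logic:

```cpp
// Hypothetical sketch (not DICE's actual code): how a renderer might decide
// which materials receive ray-traced reflections at each DXR quality preset.
#include <iostream>

struct Material {
    const char* name;
    float smoothness; // 0 = fully rough, 1 = mirror-like
};

// Assumed cutoffs: higher presets trace reflections on rougher surfaces too.
float reflectionCutoff(int preset) { // 0 = Low .. 3 = Ultra
    static const float cutoffs[] = { 0.9f, 0.7f, 0.5f, 0.3f };
    return cutoffs[preset];
}

bool isRtxReflective(const Material& m, int preset) {
    return m.smoothness >= reflectionCutoff(preset);
}

int main() {
    Material gunMetal{ "gun metal", 0.6f }; // made-up value
    for (int preset = 0; preset <= 3; ++preset)
        std::cout << gunMetal.name << " ray-traced at preset " << preset << ": "
                  << (isRtxReflective(gunMetal, preset) ? "yes" : "no") << "\n";
}
```

With these made-up thresholds, the gun crosses the cutoff only at High and Ultra, which would line up with the gun-rendering difference showing on Medium.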

you saw a performance INCREASE when going from High to Ultra in the graphs?
The benchmark isn't 100% repeatable, so a few percent difference between runs isn't uncommon.
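To illustrate why a small delta can be run-to-run noise, here's a minimal sketch (with made-up FPS numbers) that averages a few runs and reports the spread:

```cpp
// Minimal sketch: average several benchmark runs and report the relative
// spread, to judge whether a small FPS delta is within noise.
// All FPS numbers below are made up for illustration.
#include <cmath>
#include <cstdio>
#include <utility>
#include <vector>

int main() {
    std::vector<double> runsHigh  = { 71.2, 69.8, 70.5 }; // hypothetical runs
    std::vector<double> runsUltra = { 70.9, 71.6, 70.1 };

    auto stats = [](const std::vector<double>& v) {
        double mean = 0.0, var = 0.0;
        for (double x : v) mean += x;
        mean /= v.size();
        for (double x : v) var += (x - mean) * (x - mean);
        double sd = std::sqrt(var / v.size());
        return std::pair<double, double>{ mean, 100.0 * sd / mean };
    };

    auto [meanHigh, cvHigh]   = stats(runsHigh);
    auto [meanUltra, cvUltra] = stats(runsUltra);
    std::printf("High:  %.1f fps (+/- %.1f%%)\n", meanHigh, cvHigh);
    std::printf("Ultra: %.1f fps (+/- %.1f%%)\n", meanUltra, cvUltra);
    // If the High-vs-Ultra gap is smaller than the spread, "Ultra faster
    // than High" is plausibly just variance between runs.
}
```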
 
MSAA sacrifices quality for performance compared to SSAA; are you still upset about that?
No, of course not. However, I might be if a game had just an anti-aliasing on/off switch and a patch suddenly switched it from SSAA to MSAA while the developer went guns blazing announcing how they "optimized the game and gained 100% performance".

PS You have screenshots in this very article. But it's ok, I know that's not what you were asking for ;)
Yeah, I know, but they're all from the updated build, and like you said, that's not what I was looking for ;)
 
@W1zzard could you add RTX-off, SSR-on screenshots too?
The problem is that setting RTX off requires a game restart, so finding the exact same scene again is kinda difficult
 
The problem is that setting RTX off requires a game restart, so finding the exact same scene again is kinda difficult

Ah, ok. I was going to ask about frame times, but you did mention them on the conclusion page. Many thanks for the test :toast:
 
The problem is that setting RTX off requires a game restart, so finding the exact same scene again is kinda difficult
Since Turing is so big on AI, try showing it a screenshot taken with DXR on after restart, maybe it can figure out how to get you there on its own :P
 
An overclocked RTX 2070 can do 1440p RTX Ultra at 60 fps or near it. Not great for a competitive shooter, but pretty good for the technology it uses.
 
Much to the chagrin of the NVIDIA haters, DXR has turned out not to be the dud/failure it was portrayed as. Real-time ray tracing has been a pipe dream up to this very moment, yet for some reason people fail to appreciate the monumental work NVIDIA has poured into the technology to make it possible. I'm not buying any of the available RTX cards, but that's because I don't live in an economically developed country and these cards are just too expensive for me. I will surely consider the RTX 3060 once it becomes available; meanwhile I'm quite content with my GTX 1060. While Europeans/Americans (I guess most of TPU's visitors) enjoy an income of over $2500 a month, I earn literal pennies in comparison, so I'm not sure where this "NVIDIA is ripping everyone off" comes from.

@W1zzard

You say that image quality is roughly the same between the various presets, but you've surely missed the difference in how guns are rendered.

lol, your post...
haters... :nutkick:
For me, it's still a failure, tbh.
 
An overclocked RTX 2070 can do 1440p RTX Ultra at 60 fps or near it. Not great for a competitive shooter, but pretty good for the technology it uses.

I guess overclocking only improves rasterization performance, which is bottlenecked by the RT cores when ray tracing is enabled.
You might see some improvements here and there, but it's not going to be as effective as usual.
I don't know for sure; I may be wrong.
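If that assumption held, the arithmetic would look something like this Amdahl-style sketch; the frame-time split and the clock bump are made-up numbers:

```cpp
// Amdahl-style sketch of the assumption above: only the raster portion of
// the frame scales with a core overclock while the RT portion stays fixed,
// so the net gain shrinks as RT work dominates. Illustrative numbers only.
#include <cstdio>

int main() {
    const double rasterMs  = 10.0;  // hypothetical raster time per frame
    const double rtMs      = 6.0;   // hypothetical ray-tracing time per frame
    const double overclock = 1.10;  // +10% core clock

    double before = rasterMs + rtMs;
    double after  = rasterMs / overclock + rtMs; // RT cost assumed unchanged
    std::printf("before: %.1f ms (%.1f fps)\n", before, 1000.0 / before);
    std::printf("after:  %.1f ms (%.1f fps)\n", after, 1000.0 / after);
    std::printf("net speedup: %.1f%% from a %.0f%% clock bump\n",
                100.0 * (before / after - 1.0), 100.0 * (overclock - 1.0));
}
```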
 
I guess overclocking only improves rasterization performance, which is bottlenecked by the RT cores when ray tracing is enabled.
You might see some improvements here and there, but it's not going to be as effective as usual.
I don't know for sure; I may be wrong.
Good point, but we'll have to see.
 
I still don't like how RTX looks, but you can't knock the performance increase no matter which 'side' you're on. Kudos to all involved.
 
Something doesn't seem right: the "new" Ultra is faster than the "old" Low across all resolutions. There's just no way that can be done without messing with image quality.

Dunno if this is possible, but the only way I can see this happening is if NVIDIA managed to do with RTX what they did with DLSS: render it at a lower resolution and upscale it in real time.
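Back-of-envelope for that guess: tracing reflection rays at a fraction of screen resolution would shrink the ray budget roughly with the area ratio. Purely illustrative arithmetic, not a claim about what DICE actually did:

```cpp
// Sketch: how the number of reflection rays would scale if they were traced
// at a reduced internal resolution and upscaled. Scales are hypothetical.
#include <cstdio>

int main() {
    const long long width = 2560, height = 1440;
    const double scales[] = { 1.0, 0.75, 0.5 }; // assumed internal ray resolutions

    for (double s : scales) {
        long long rays = static_cast<long long>(width * s) *
                         static_cast<long long>(height * s);
        std::printf("scale %.2f -> %lld rays per frame (%.0f%% of full res)\n",
                    s, rays, 100.0 * rays / (width * height));
    }
}
```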
 
Something doesn't seem right: the "new" Ultra is faster than the "old" Low across all resolutions. There's just no way that can be done without messing with image quality.
Did you look at the video yet?
 