
Battlefield V Benchmark Performance Analysis

Did I miss where the testing method (scene? built-in bench?) is listed? I see Ultra DX12 on p1 and settings on p2... but how was this tested?
 
This made me laugh.

GTX 1080 struggles against Vega 64 in well-made games that aren't sponsored by Nvidia, like Wolfenstein, and in many cutting-edge API games that properly utilize Vulkan and DX12.

GTX 1080 does not support FreeSync, so fans are screwed out of an extra $200–300 for the same thing.

GTX 1080 is a far weaker compute card than any of the Vega cards. Actually, Vega 64 trades blows with the GTX 1080 Ti and RTX 2080 in most compute scenarios, and will beat any Pascal card other than the P100 in FP16 compute by a landslide.

Just a few things Vega has over the 1080.

B-Real :D
Compute TFLOPS doesn't account for rasterization and ROPS read/write/graphics fixed-function bottleneck issues. Both Vega 64 and GTX 1080 have a similar quad-rasterizer and 64-ROPS configuration, while the GTX 1080 Ti has the superiority of six rasterizers and 88 ROPS.

Pure compute workloads use the TMU read/write path, and both the GTX 1080 Ti and Vega 64 have similar TMU unit counts.

CUDA mode also disables the graphics path's superior NVIDIA delta color compression.

For raster graphics, TFLOPS is almost meaningless without rasterization and ROPS read/write/graphics fixed functions being factored in. I expect more from AMD. I do NOT support Raja Koduri's TFLOPS-biased arguments, i.e. AMD was already increasing its GPUs' ROPS power (32 ROPS to 64 ROPS) before Koduri joined AMD in 2013.

AMD should be focused on designing GPUs, not large DSPs with smaller GPU hardware.
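For reference, the paper TFLOPS figures being argued about fall straight out of shader count × clock × 2 FLOPs (one FMA) per cycle. A quick sketch, using published reference boost clocks (real cards boost differently, so treat these as ballpark; the FP16 multipliers are Vega's 2× packed math vs. consumer Pascal's cut-down 1/64 rate):

```python
# Theoretical peak throughput = shaders * clock * 2 FLOPs (one FMA) per cycle.
# Reference boost clocks in MHz; treat results as ballpark figures.
def tflops(shaders, clock_mhz, fp_rate=1.0):
    return shaders * clock_mhz * 2 * fp_rate / 1e6

cards = {
    "GTX 1080    (GP104)": (2560, 1733),   # ~8.87 FP32 TFLOPS
    "GTX 1080 Ti (GP102)": (3584, 1582),   # ~11.34 FP32 TFLOPS
    "Vega 64":             (4096, 1546),   # ~12.66 FP32 TFLOPS
}
for name, (sh, clk) in cards.items():
    print(f"{name}: {tflops(sh, clk):5.2f} FP32 TFLOPS")

# FP16: Vega runs packed math at 2x FP32; consumer Pascal is cut to 1/64.
print(f"Vega 64  FP16: {tflops(4096, 1546, 2):6.2f} TFLOPS")
print(f"GTX 1080 FP16: {tflops(2560, 1733, 1/64):6.2f} TFLOPS")
```

Which is exactly the point above: Vega 64 wins comfortably on paper throughput, so the GTX 1080's gaming lead has to come from the rasterizer/ROPS fixed-function path, not ALU count.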
 
I'm curious about the VRAM utilisation. Benchmarks recently performed by Hardware Unboxed indicated that performance at 1080p with Ultra (DX11) settings was identical between the 4GB and 8GB variants of the RX 580. Additionally, even though TechPowerUp used DX12 (which may potentially increase VRAM utilisation), the performance delta between the 4GB RX 570 and 8GB RX 580 is pretty insignificant (to the point where it could simply be attributed to the difference in core configurations).

So, I'm not saying the test results are incorrect, but surely the performance would be affected more adversely if the VRAM buffer is actually being exceeded, right?

I recently sold my GTX 1070 with the intent to purchase an RX 570/580 for better Linux compatibility. However, I haven't made up my mind. In Australia, the 4GB RX 570 is $199, the 4GB RX 580 is $239, and the 8GB RX 580 is $329. Since my displays are 1080p, the 4GB RX 580 seemed like the best performance/value ratio, but if Battlefield V is actually using 5GB of VRAM at 1080p, well, I should probably cough up the extra $90.
 
So, I'm not saying the test results are incorrect, but surely the performance would be affected more adversely if the VRAM buffer is actually being exceeded, right?

VRAM usage is a lot more complicated than just "the game needs this much and if you have less you're going to have a bad time". It used to be this way, when VRAM was just used as a framebuffer. But those days are long gone.

Now game developers use VRAM for more; they store all types of things in it. Some of what they store isn't actually needed to render the current scene, but they put it there anyway just in case the next scene might need it. That is why you'll see some games that, given the space, will use 11GB of VRAM but still run just fine on a 4GB card. All that extra crap stored in VRAM isn't actually needed, so when a lower-VRAM card is used, all that crap is dumped with no real performance impact.

It seems that the actual VRAM necessary for this game is somewhere between 3GB and 4GB at Ultra settings.
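A toy way to picture that "VRAM as opportunistic cache" behavior (the GB sizes here are made-up illustrative numbers, not measurements of BFV):

```python
# Toy model of VRAM as a cache: the engine prefetches assets it may never
# need, so the "VRAM used" counter overstates what's actually required.
def resident_set(vram_gb, working_set_gb, prefetch_gb):
    """Return (working, prefetch_kept): the true working set always fits
    first; speculative prefetch only occupies whatever space is left."""
    assert working_set_gb <= vram_gb, "true working set must fit in VRAM"
    spare = vram_gb - working_set_gb
    return working_set_gb, min(prefetch_gb, spare)

# An 11GB card happily caches everything...
print(resident_set(11, 3.5, 7.5))   # (3.5, 7.5) -> reports ~11GB "used"
# ...while a 4GB card simply drops the speculative data, no perf cliff.
print(resident_set(4, 3.5, 7.5))    # (3.5, 0.5) -> still renders fine
```

The performance cliff only appears once the *working set* itself no longer fits, which is why the 4GB RX 570 results look unremarkable.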
 
Hardware Unboxed made a huge GPU comparison, in DX11.


Probably the biggest difference between TPU's results and theirs is that the GTX 1080 now leads the V56 not by 2% but by 13%.
 
I asked for DX11, so if you tested in DX11, that would please me. PCGH, ComputerBase, and Guru3D all tested in DX11 because it's better. TPU is the only site that always tests in DX12, even if it runs worse. That was the case with BF1 and Deus Ex, though I can kind of understand those, since at least it provided some performance improvement for Radeons. BF5 runs better in DX11 on all cards.
Guru3D used DX12 (they recommend using DX11 but tested in DX12 since you'll need it for DXR): https://www.guru3d.com/articles_pages/battlefield_v_pc_performance_benchmarks,6.html
DX12 has problems in another recent game that shipped with it and defaults to it: Shadow of the Tomb Raider. DX11 ran much better, and both looked the same.
It depends on the system. With my Vega 56 I got far better performance with DX12 in SotTR. If you look at Steam's forums, results are definitely mixed: some people get better performance with DX11, some with DX12.
 
Hi

Do you have any CPU comparisons? I've already bought a motherboard and I'm wondering which will be better for BFV, the 8700 or the 9600K?
I'm leaning towards the 9600K, but there are opinions that threads are important for BF5 multiplayer.
 
BF:V was developed to use up to 12 threads, according to DICE.

My thinking is they targeted the six-core, 12-thread CPUs, mostly on AMD's value.

But Intel decided to once again crap on their customers with HT shenanigans, making the i5/i7/i9 branding meaningless once again and confusing as hell from a previous-generation standpoint.
 
BF:V was developed to use up to 12 threads, according to DICE.

My thinking is they targeted the six-core, 12-thread CPUs, mostly on AMD's value.

But Intel decided to once again crap on their customers with HT shenanigans, making the i5/i7/i9 branding meaningless once again and confusing as hell from a previous-generation standpoint.


BFV is using every core/thread on my 2700X: even load across the entire CPU, with no idle threads or cores.
 
It's a shame you don't include more strategic older cards like the 980 Ti & 390X. It's always interesting to see how older-gen cards that had good performance in their day line up against other old models. After all, they don't all get magically teleported to the GPU Retirement Home; they're still out there running in someone's PC.
 
It's a shame you don't include more strategic older cards like the 980 Ti & 390X. It's always interesting to see how older-gen cards that had good performance in their day line up against other old models. After all, they don't all get magically teleported to the GPU Retirement Home; they're still out there running in someone's PC.
I'm pretty sure this is a time issue. Old cards have been benched on different systems and would need benchmarking again.
I would like to see that info, too. It's very useful for the more rational people who hang onto their video cards for 3–4 years before upgrading to see what they'd get. But at the same time, I understand why that's not happening.
 
It's a shame you don't include more strategic older cards like the 980 Ti & 390X. It's always interesting to see how older-gen cards that had good performance in their day line up against other old models. After all, they don't all get magically teleported to the GPU Retirement Home; they're still out there running in someone's PC.

True, but it does take more time adding more cards to the lineup for testing. Sometimes they're pressed for time, or don't see the value in doing it because, for example, a 980 Ti with a nice OC on it will perform like a 1070 (or even slightly better). Meaning, find a newer card that's equivalent to the 980 Ti to get a rough idea of how it would handle the game. While the 980 Ti still delivers great performance at 1080p and can do well at 1440p, the card is pushing close to 4 years old.
 
This might turn into another Crysis 3: a game that no one plays in multiplayer, yet gets benchmarked for years to come. I wonder how long BF5 will last as a benchmark game... This RTX nonsense is so insignificant compared to the content and mechanics of what truly makes a great game.
 
This might turn into another Crysis 3: a game that no one plays in multiplayer, yet gets benchmarked for years to come. I wonder how long BF5 will last as a benchmark game... This RTX nonsense is so insignificant compared to the content of what truly makes a great game.
It depends on what other titles bring to the table. You read so much about BFV because so far it's the only title that makes use of DXR (and does a pretty good job at it). If other titles manage to look better, they'll have no trouble joining BFV in benchmarking.
 
This might turn into another Crysis 3: a game that no one plays in multiplayer, yet gets benchmarked for years to come. I wonder how long BF5 will last as a benchmark game.

Probably till the next BF release :roll:. These sorts of tent-pole franchises usually stick around.
Or even something like Metro 2033, which wasn't actually all that big a hit, at least on PC, but was such a system melter that it was useful for a while.

BTW, Crysis 3 MP was a blast for the short 3–4 months it was alive.
 