
Battlefield 2042 Open Beta Benchmark Test & Performance

W1zzard

Battlefield 2042 is one of the most anticipated titles this year. The open beta started today, and the visuals look stunning. We took the game for a spin on a selection of the latest graphics cards to get a feel for what to expect in terms of performance and hardware requirements.

 
Can't say I'm too excited for this one, but I suppose fans of futuristic shooters will be. Really wanted to like BF5, but found the MP too unforgiving for a more casual player...
 
CPU benchmark? I played Warzone, and with a Ryzen 1600X, all cores were maxed out!
 
I'm finding it pretty meh so far, both visually and gameplay-wise.
Maybe my expectations were just too high.
 
I'm surprised that people call this an open beta. You wanna play the "open beta"? Pre-order or buy the EA pass, haha.

Limited / closed beta :}.

Watched a stream for a few minutes. At least decent fps. Assault rifle action looked good. Map is big :). Every stream looked too "blue-ish", at least to me. I guess every map will be that way. Looking forward to release. Maybe gonna buy it later.
 

Yeah, I buy almost every major MP game, so it's a non-issue for me, but they should have just made it open the whole time.
 
Not as stellar as I expected, but this map is old to me now.
 
Hehe, pre-order is a bad idea. In the past we have had too many eff... releases.
Pre-order gets early access, starting 6 October (U.S. time, I think). Us money-not-burning-our-pockets folks get in on the 8th. And in this case it's not a bad idea: you pre-order, play the game for four days instead of two, and form your opinion. You can cancel the pre-order anytime before 19 November.
 
Kinda disappointed by this beta. No unlocks, lag, AI bots, bugs... The game looks promising, but there's no wow factor like the BF4 beta had.
 
Thanks for the test! I guess I'll wait until 2022 for this one; I'm sure it can be a good Battlefield.
 
Yeah, their bots are shite. Also not AI, despite how they have them labeled... pre-knowledge is not intelligence.
 
That's a surprisingly dense performance spread - are you seeing significant CPU limitations at >100 FPS? Also, not that it matters IRL (it's not like it would have delivered playable frame rates at 2160p anyhow - going by the other results it would have been ~35 FPS at best), but any idea why the 6600 XT drops off so sharply there? Very aggressive asset streaming being limited by the narrow VRAM bus? Something else?
 
Look at the first page: AMD uses more VRAM at 4K than NVIDIA, over 8 GB. While that was with the 6900 XT, I'm assuming that's what is killing the 6600 XT at 4K, combined with the slow memory bus.
[Attached screenshot: VRAM usage chart from the review]
 
That's too simplistic IMO. VRAM usage, especially in games with complex, aggressive asset-streaming systems like these games have, is never reducible to "I took a measurement, here's the number". This is a methodological weakness of these measurements, though an understandable one, as an actual investigation into real-world VRAM usage (rather than just opportunistic allocation) is very, very time consuming (if possible at all).

The core of the issue here is that not (even close to) all allocated VRAM is actually put to use in any given play situation, partly because the asset-streaming systems need to cover all eventualities of player movement - do you run straight down a hallway, exit the building, or climb to its roof? Assets for all these possibilities need to be available, despite only one of them being the outcome, and thus data for only one of them is actually put to use. As time passes, unused assets are ejected from VRAM to make room for new possible progressions of play. The breaking point between being able to cover enough eventualities and VRAM size is where you start to see storage/IO-based stutter, as the game starts needing to stream in assets as they are needed rather than beforehand.

But, especially as we only have average FPS numbers here and no 0.1%/0.01% lows, we don't know what actual performance looks like. Is the average an even chug, or is it incredibly spiky and wide-ranging? We don't know. Hence why I asked the person actually doing the benchmarks for some insight. It might be down to the game actually needing more than 8 GB of VRAM, but it might also just be down to the asset-streaming system not keeping up (and thus being bottlenecked by the bus), it might be a driver issue, or it could be a bunch of other factors.
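To make the allocation-vs-use distinction concrete, here's a minimal sketch (assuming Windows 10 and a DXGI 1.4 capable adapter; not taken from any monitoring tool in particular) of the per-process query that VRAM readouts are typically built on. It reports how many bytes a process has resident in local VRAM and what budget the OS grants it, but not which of those resident assets the GPU actually sampled in a given frame - which is exactly the gap described above.

```cpp
// Sketch: read per-process VRAM allocation vs. the OS-provided budget via DXGI 1.4.
// Assumption: first enumerated adapter is the GPU of interest.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main()
{
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory))))
        return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter)))   // first (usually primary) GPU
        return 1;

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3)))
        return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    // CurrentUsage = bytes this process has resident in local (on-board) VRAM.
    // Budget       = how much the OS currently lets this process use before demotion.
    // Neither says anything about which assets were actually sampled this frame.
    printf("VRAM usage:  %.1f MB\n", info.CurrentUsage / (1024.0 * 1024.0));
    printf("VRAM budget: %.1f MB\n", info.Budget / (1024.0 * 1024.0));
    return 0;
}
```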
 
Now you are making me want to break out the application tracing utils.
 
Anybody have any idea what's up with "borderless fullscreen"? What problems with good old fullscreen does it solve?
(I know, it's not specific to this title, sorry for the OT.)
 
Just following up with an interesting comparison on this topic: the Far Cry 6 benchmark from yesterday showed noticeably higher VRAM utilization in the measurements, yet there's no 2160p drop-off to be seen for the 6600 XT. It does lose to the 3060 at 2160p, so there's an indication that its narrow bus is holding it back somewhat, but the difference is negligible.
Borderless fullscreen makes alt-tabbing out of the game easier, and has some benefits for driver stability in scenarios like that. It also makes it easier if you're using multiple monitors and want to easily swap over to an application on the second monitor from time to time.
 
So it was easier to implement a new mode from scratch than to fix Alt+Tab?

FWIW, when I was gaming I never had problems alt-tabbing, on a multi-monitor setup, though I didn't alt-tab that much. This is actually why I asked: when I tried, there didn't seem to be any difference between the two modes. Ty.
 
It's quite dependent on the game, the GPU driver, and other variables (likely WDDM and similar system-level mechanisms as well). AFAIK borderless fullscreen just treats the game as if it were any other desktop window, just one that happens to fill the entire screen without borders, while regular fullscreen overrides the desktop completely, sidelines the regular desktop rendering pathway, and gives the game near-complete control over the monitor (resolution, refresh rate, and other mode changes), which can then lead to trouble (rapid, unplanned mode changes when alt-tabbing or similar) and thus cause driver or application instability. My understanding of this is pretty limited though, and I'm sure there's some part of what I just said that isn't quite right.
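For anyone curious what that difference boils down to in code, here's a rough sketch using plain Win32 and DXGI calls (hwnd and swapChain are assumed to already exist; this isn't taken from any particular engine): borderless fullscreen is just an ordinary window restyled and resized to cover its monitor, while exclusive fullscreen hands the output over to the swap chain.

```cpp
// Sketch only: the two "fullscreen" approaches, stripped to their essentials.
#include <windows.h>
#include <dxgi.h>

// "Borderless fullscreen": the game stays an ordinary desktop window, it just drops
// its borders and covers the monitor. The compositor keeps running, so alt-tab is a
// normal window switch and no display mode change is involved.
void EnterBorderless(HWND hwnd)
{
    // Strip the caption/border styles.
    SetWindowLongPtr(hwnd, GWL_STYLE, WS_POPUP | WS_VISIBLE);

    // Size the window to the monitor it currently occupies.
    MONITORINFO mi = { sizeof(mi) };
    GetMonitorInfo(MonitorFromWindow(hwnd, MONITOR_DEFAULTTONEAREST), &mi);
    SetWindowPos(hwnd, HWND_TOP,
                 mi.rcMonitor.left, mi.rcMonitor.top,
                 mi.rcMonitor.right - mi.rcMonitor.left,
                 mi.rcMonitor.bottom - mi.rcMonitor.top,
                 SWP_FRAMECHANGED | SWP_SHOWWINDOW);
}

// "Exclusive fullscreen": the swap chain takes over the output, which may change
// the display mode. Alt-tab then forces a mode switch back to the desktop, which is
// where the instability described above tends to come from.
void EnterExclusive(IDXGISwapChain* swapChain)
{
    swapChain->SetFullscreenState(TRUE, nullptr);
}
```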
 
Yeah, this is rather technical (e.g. do you still control v-sync/refresh when dealing with a mere maximized window?). I'll need to dig deeper; I haven't been able to find all of this summed up anywhere so far.
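On the v-sync part of that question: with a modern DXGI flip-model swap chain the sync interval is still chosen per present call, so a borderless window can v-sync (or tear, if the swap chain was created with the tearing flag) much like exclusive fullscreen. A minimal sketch, with swapChain, factory5, and tearingSupported as illustrative names rather than anything from this game:

```cpp
// Sketch: v-sync control still happens at Present() time, regardless of window mode.
// Assumption: swapChain is a flip-model IDXGISwapChain1 created with
// DXGI_SWAP_CHAIN_FLAG_ALLOW_TEARING if uncapped (tearing) presentation is wanted.
#include <dxgi1_2.h>
#include <dxgi1_5.h>

void PresentFrame(IDXGISwapChain1* swapChain, bool vsync, bool tearingSupported)
{
    if (vsync)
    {
        // Wait for the next vertical blank, just as in exclusive fullscreen.
        swapChain->Present(1, 0);
    }
    else if (tearingSupported)
    {
        // Uncapped presentation with tearing, even from a borderless window.
        swapChain->Present(0, DXGI_PRESENT_ALLOW_TEARING);
    }
    else
    {
        swapChain->Present(0, 0);
    }
}

// Whether tearing is supported is queried once from the DXGI factory.
bool QueryTearingSupport(IDXGIFactory5* factory5)
{
    BOOL allow = FALSE;
    factory5->CheckFeatureSupport(DXGI_FEATURE_PRESENT_ALLOW_TEARING,
                                  &allow, sizeof(allow));
    return allow == TRUE;
}
```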
 
Look at the first page: AMD uses more VRAM at 4K than NVIDIA, over 8 GB. While that was with the 6900 XT, I'm assuming that's what is killing the 6600 XT at 4K, combined with the slow memory bus.
That's not definitive proof. It may be a bandwidth issue akin to FC6. We would need to verify with the 5700 XT.
 
Had no idea the open beta had already started - my download is complete, so I'll be jumping into a game shortly.
 
Pretty funny: 3060 Ti owners can turn on DLSS and play comfortably at 4K, while 6600 XT owners will be stuck at 12 FPS on average regardless. Yeah, yeah, I know one is currently $200 more than the other instead of just $20 and the market is nuts, but still!
 