
DOOM: The Dark Ages Performance Benchmark

Good to see no UE5, will buy the game to reward the devs for that decision, and since the original DOOM is one of the very few shooter games that I actually enjoyed as well.

There are some crazy differences in performance between TPU's testing and HUB's testing.
The 5000 series looks very underwhelming in HUB's testing but looks fine in TPU's.
8GB cards typically look fine in TPU's testing until around 4K, but break down much sooner in HUB's.
HUB's testing shows the XTX and 9070 XT evenly matched, whereas TPU's shows the 9070 XT clearly ahead.

How very odd.
Watched the HUB video now. As for the differences in results, I think they tested a different part of the game to TPU, so that's easily explained. For the 8 GB stuff, HUB was open that they went out looking for problems rather than just being happy with the short bench run: he deliberately played the game for a longer period to let the VRAM buffer fill up, and then the problems started; the bench run used for the graphs wasn't long enough to trigger the issue. He also noticed frame gen glitches on 8 GB cards, but he was able to fix it all by dropping the texture pool down to the minimum of 1.5 GB. He also reported that the texture quality is poor, which is likely a result of optimising the game for 8 GB GPUs.
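To put that texture-pool fix in perspective, here's a back-of-the-envelope VRAM budget in Python. Every figure below is an illustrative assumption, not a measurement from the game:

```python
# Back-of-the-envelope VRAM budget for an 8 GB card at 1440p.
# Every number below is an illustrative assumption, NOT measured data.
TOTAL_VRAM_GB = 8.0
OS_OVERHEAD_GB = 0.8     # desktop/compositor and background apps (assumed)
RENDER_TARGETS_GB = 1.5  # G-buffer, shadow maps, frame buffers (assumed)
GEOMETRY_MISC_GB = 2.0   # meshes, RT acceleration structures, etc. (assumed)

def headroom(texture_pool_gb: float) -> float:
    """Free VRAM left after the texture pool is carved out."""
    used = OS_OVERHEAD_GB + RENDER_TARGETS_GB + GEOMETRY_MISC_GB + texture_pool_gb
    return TOTAL_VRAM_GB - used

for pool in (1.5, 3.0, 4.5):
    print(f"texture pool {pool:.1f} GB -> headroom {headroom(pool):+.1f} GB")
# A negative headroom means spillover into system RAM, which is exactly
# the kind of stutter that only shows up once the pool has had time to fill.
```

The point is just that the texture pool is the one big, directly user-adjustable slice of the budget, which is why shrinking it to 1.5 GB can rescue an 8 GB card in longer sessions.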

I am going to guess DF will do a video on this at some point as well.
 
That slim 2-3fps gap between the average and the 1% low of low-end GPUs strikes me as equally unusual. I've never seen a game so stable...

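For anyone who skips straight to the graphs: the 1% low is usually derived from captured frame times. Here's a minimal sketch of one common convention (averaging the slowest 1% of frames; actual tools may use percentile frame times instead):

```python
def fps_stats(frame_times_ms):
    """Average FPS and "1% low" FPS from per-frame render times in ms.

    Convention assumed here: 1% low = average FPS over the slowest 1%
    of frames. Some tools report the 99th-percentile frame time
    converted to FPS instead, which gives slightly different numbers.
    """
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    slowest = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_1pct_fps = 1000.0 * len(slowest) / sum(slowest)
    return avg_fps, low_1pct_fps

# A very consistent run: 990 frames at 16.7 ms, 10 slow frames at 17.5 ms
avg, low = fps_stats([16.7] * 990 + [17.5] * 10)
print(f"avg {avg:.1f} fps, 1% low {low:.1f} fps")  # gap of only ~2-3 fps
```

A 2-3 fps gap between average and 1% low really does require frame times this tightly clustered, which is what makes those charts stand out.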
 
The 5090 is lacking its usual boost over the 4090; I wonder if that's down to the engine.
 
Watched the HUB video now... he deliberately played the game for a longer period to let the VRAM buffer fill up, and then the problems started; the bench run for the graphs wasn't enough to trigger the issue.
Correct. That is what HUB did. Presumably some people don't watch the videos, and just skip to the benchmark graphs, then are surprised that it didn't conform to their expectations, despite it being pretty clear from the audio why that may be.
 
It means the Indiana Jones game is just badly optimized and crashes because it expects more available VRAM than there is. More VRAM only increases performance in situations where the card doesn't have to pull from system RAM. This whole discussion about 8 GB not being enough in 2025 is nonsense; the game runs perfectly fine at 1080p/1440p at maxed-out settings.
It's safe to upgrade from 8GB.
 

It’s amazing how the 9070 XT performs better in Australia than in the US, no wonder it’s so expensive there.
In Australia, the 9070 XT performs at the level of a 5080 or faster,
while in the US, it barely reaches 5070 Ti levels. :rockout:

The Russian and German 9070 XTs come from the same lineage. :pimp:

View attachment 398846

View attachment 398845

Note: The above may contain sarcasm and might not make complete sense.
don't trust usf review tbh
 
Performance isn't as bad as I expected it to be. Not too bad to run natively, but 60 FPS for a game as fast-paced as DOOM: The Dark Ages is gonna be a hard pill to swallow. Still, not as bad as I thought it was gonna be.
 
If I bought a 5080, I would be pissed. No 60fps at 4K ultra nightmare= booooo
 
What do you mean by that? o_O HWU is one of the best, if not the best.
How can they be the best if they're using outdated drivers? It just means they value clicks above valid testing. I understand it: it sucks when you spend a ton of time testing and then, half a day before the embargo, Nvidia releases a new driver that improves 5000 series performance and makes all your work pointless. If you delay your video you miss a ton of traffic, so they chose to release it anyway. Their results are no longer valid, though, so I don't see how they can ever be the best when they knowingly release outdated information.

It's all just a matter of confirmation bias over which sites/channels people like. Whenever I see someone praising HWU, I can already tell which brand of GPU they have. For some reason HWU almost always has AMD scoring a little better than other sites, so naturally people with AMD cards like HWU.
 
How can they be the best if they're using outdated drivers? ... Their results are no longer valid though so I don't see how they can ever be the best when they knowingly release outdated information.
ComputerBase used the latest drivers, and their results matched HUB's (9070 XT > 5080). It's always better to cross-reference multiple sources than to place blind trust in just one.

 
The game is not done yet! Nvidia always releases drivers at a game's launch, so all these benchmarks are based on beta drivers. It's insane to say this is the real performance, considering the final driver release uses Nvidia's DLSS "Transformer Model" technology with enhanced Ray Reconstruction, enhanced Super Resolution, enhanced Frame Generation, enhanced DLAA, and path tracing... once the Transformer model is activated, AMD doesn't stand a chance. (The Transformer model works on all RTX cards!)

AMD gets destroyed by the Transformer model every time, and AMD sucks at path tracing. Looks to me like some people are cherry-picking and fudging these benchmarks to make the game look more tempting to people with underwhelming hardware. If you don't have a minimum of 16 GB of VRAM for this game (as per the review), all you're doing is cutting it down and watering it down until it's a playable but undesirable-looking game. My point of view, my opinion of course; just saying it as I see it.

Cheers

Cheers
 
Heard it was around 5060Ti performance.
The B750? Maybe. The B770 should at least be in 5070 range. But who knows; we'll see when the cards hit the market and are reviewed.

Intel stated it's probably not happening. At least that's the rumor.
They're working on something, they're just being tight lipped about it.
 
The B750? Maybe. The B770 should at least be in 5070 range. But who knows; we'll see when the cards hit the market and are reviewed.

Yeah, I'd hope a B770 at least beats the RTX 5060 Ti 16GB. If it doesn't, then it inherently needs to be quite a bit cheaper to offset that. Then again, if it pretty much ties it and was priced at the cost of an RTX 5060 Ti 8GB model, Intel would still have something pretty convincing. I almost prefer they don't release a B770 at this point, because I'm on the fence about holding out for Celestial and don't need that temptation.
 
@W1zzard

If you have
View attachment 399046

How do you do native full screen testing at 4K?
 
ComputerBase used the latest drivers, and their results matched HUB's (9070 XT > 5080).

The weird thing is that the 5080 is below the 4080 Super. Something is wrong with the 5000 series in this combination of game/drivers.

@W1zzard
If you have View attachment 399046, how do you do native full screen testing at 4K?

The review says that THE SCREENSHOTS were taken at 1440p, but I guess the reviewer also has a 4K monitor.
 
Looking at these numbers, it's like Crysis all over again; we'll have to wait a couple more years before something comes out that can run it well.
 
Looking at these numbers, it's like Crysis all over again; we'll have to wait a couple more years before something comes out that can run it well.

Or perhaps a little make-up and a swipe of red lipstick (optimisations) could trim that long wait down to just a couple/few months... not that I'm holding my breath.

TBH, it's Crysis all over again in a bunch of newer titles, something I wouldn't mind if we were getting one step closer to photo-realism, but that's hardly the case. Some of these newer games are just GPU-brutal by design, at a time when GPU price brutality is in full swing.
 
Crysis again? Yup, I can confirm new games are hitting my 5090 about as hard as Crysis would hit a GT 710... And these results hint at how much harder the PT enhancement is going to rip our systems up.
 
If I bought a 5080, I would be pissed. No 60fps at 4K ultra nightmare= booooo

I think the important information is also that you can't just lower the settings from "Ultra Nightmare" to something more reasonable and receive a massive boost, as you can with almost all games out there; even going from the highest setting to "Low" only gets you about 20% higher FPS. Why so many setting options, then?

Image Quality Comparison
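A quick illustration of why that ~20% is such a small win. The starting FPS figure here is made up purely for the arithmetic:

```python
# Hypothetical figures, purely to illustrate the scaling claim above.
ultra_fps = 48.0            # assumed 4K "Ultra Nightmare" result
low_fps = ultra_fps * 1.20  # "Low only gets you about 20% higher FPS"
print(f"Ultra Nightmare: {ultra_fps:.0f} fps -> Low: {low_fps:.1f} fps")
# Even on Low the card can stay under 60 fps, likely because fixed costs
# (such as the game's always-on ray tracing) are barely touched by presets.
```

In a typical game, dropping from max to Low might gain 50-100%, so a card just under 60 fps would comfortably clear it; at ~20% scaling, the preset menu can't rescue you.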
 