
DOOM: The Dark Ages Performance Benchmark

Other than Indiana Jones (which is hardly a paragon of optimization), I can't think of any other game that uses the Vulkan API and includes path tracing. So with this being the 2nd such Vulkan game (and the first on the new id engine), I'm not surprised they're working closely with Nvidia to incorporate path tracing. They're probably working with Khronos as well to improve pure RT and PT performance in Vulkan.
You're probably right...

But I am surprised, it's id Software. Part of their business is developing their own engine. Having to lean on a third party for a cutting-edge 3D graphics feature... it's a bit of a "the mighty have fallen" moment for id Software. I'd love to be wrong though; we'll have to wait and see.
 
? But it's still id Tech 8, not some other graphics engine made outside. They all "have to lean on a 3rd party" if they want to advance the API; game developers do that all the time.
 
no game awards ? :)
good to see 9070xt > nvidia 5080 / 4080 / super
what timeline are we in?
(in upscaling) Though the 9070 XT still fares well outside of that, about as expected. It just isn't beating the 5080 / 4080S natively, which is perfectly reasonable.

You're probably right...

But I am surprised, it's id Software. Part of their business is developing their own engine. Having to lean on a third party for a cutting-edge 3D graphics feature... it's a bit of a "the mighty have fallen" moment for id Software. I'd love to be wrong though; we'll have to wait and see.
It's still their own engine? This isn't the first time id has worked with a separate company, as far as I'm aware, to implement a new cutting-edge 3D feature. This is just part of the norm, especially nowadays with much harder-to-implement path tracing, and RT before that. If anything, it's probably a good thing they're working as closely as they are to ensure the PT is 'good' (hopefully that means it's fairly optimized). Valve does it pretty often too, and they're also known for making their own engines in house (though based closely on Quake at their cores). Definitely not a 'the mighty have fallen' moment lol.
 
I think the important information is also that you can't just lower the settings from "Ultra Nightmare" to something more reasonable and receive a massive boost, as you can with almost all games out there. Even going from the highest setting to "Low" only gets you about 20% higher FPS, so why so many setting options?
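To put that in perspective, here's the arithmetic with a made-up baseline (only the ~20% figure comes from the benchmark; the 60 FPS starting point and the 2x "typical game" uplift are hypothetical numbers for illustration):

```python
# Hypothetical baseline: 60 FPS at Ultra Nightmare.
ultra_fps = 60.0

# DOOM: TDA (per the review): dropping to Low gains only ~20%.
doom_low_fps = ultra_fps * 1.20      # 72 FPS

# A typical game, where Low might roughly double the frame rate.
typical_low_fps = ultra_fps * 2.00   # 120 FPS

print(f"DOOM TDA Low: {doom_low_fps:.0f} FPS (+20%)")
print(f"Typical Low:  {typical_low_fps:.0f} FPS (+100%)")
```

So where other games let you buy back a whole performance tier by dropping presets, here the full Ultra-Nightmare-to-Low swing is worth barely a dozen frames.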

Image Quality Comparison
Just to say, they did something?
 
Pre-loading available on Steam right now...
 
(in upscaling) Though the 9070 XT still fares well outside of that, about as expected. It just isn't beating the 5080 / 4080S natively, which is perfectly reasonable.
Why is it perfectly reasonable for the 9070 XT to lose to the 5080/4080S? Don't you understand how the game is played now?
Take Doom: the developer makes the game as usual to work on every piece of hardware, then a GPU maker comes along and says, "please put our tech in this game so the game works better on our hardware." The developer says, "but what do I gain?" The GPU maker says, "money, we will bundle/promote your game with our GPUs," and that makes it legal.
The developer accepts and everyone is happy. That's how you get $3,000 GPUs and a monopoly.
 
HUB is a massively AMD-biased site, so that's probably why.
Do we watch the same HUB?

"It's worth spending the NVIDIA premium for RT and DLSS."
"AMD needs to be 20% cheaper or more to be worth buying."

Unless this is sarcasm.
The game is not done yet! Nvidia always releases drivers at a game's launch, so all these benchmarks are based on beta drivers. It's insane to treat this as the final performance, considering the final driver release uses Nvidia's DLSS "Transformer Model" technology with Enhanced Ray Reconstruction, Enhanced Super Resolution, Enhanced Frame Generation, Enhanced DLAA and Path Tracing... once the Transformer model is activated, AMD doesn't stand a chance. (The Transformer model works on all RTX cards!)
According to HUB it was NVIDIA who gave them early access and asked them to do this testing, presumably because of their promotional offer with Doom.

It seems NVIDIA didn't expect AMD to be anywhere close in these benchmarks.
 
On the Image Quality Comparison page, the low vs ultra nightmare numbers aren't very far apart which surprised me.
 
Why is it perfectly reasonable for the 9070 XT to lose to the 5080/4080S? Don't you understand how the game is played now?
Take Doom: the developer makes the game as usual to work on every piece of hardware, then a GPU maker comes along and says, "please put our tech in this game so the game works better on our hardware." The developer says, "but what do I gain?" The GPU maker says, "money, we will bundle/promote your game with our GPUs," and that makes it legal.
The developer accepts and everyone is happy. That's how you get $3,000 GPUs and a monopoly.
.. What?

None of what you said is relevant to what I said. Its performance is within expectation of what the product is capable of and what other benchmarks have shown natively. It winning in upscaling is an outlier though; I am not sure why, but on the surface that is impressive, and I never denied that. That's what makes it reasonable. Don't like it? Well, unfortunately, that's how the apple fell. AMD was not aiming for a high-tier product anyway.

id is likely not putting in path tracing just to make AMD look bad. People who are gonna use path tracing are not buying AMD products anyway. AMD can deliver good RT performance now, so RT still exists and their native performance is within expectation. Where is the problem? I'm sure AMD will catch up even more, now that AMD has genuinely compelling products for traditionally Nvidia consumers (at least as far as a lot of my less-enthusiast folks who also buy computers have said).

As someone who's been vocal about their problems with current GPUs, AMD, Nvidia, or otherwise: if you wanna talk potential monopoly, overpriced GPUs, GPU companies screwing over AIBs, AIBs screwing over consumers, etc., there are a ton of other threads where that discussion is more relevant.
 
It’s funny that we treat cards like the Nvidia RTX 5080 (a 1150+ EUR card) and the AMD RX 9070 XT (a 750+ EUR card) as “not high tier”. But on the other hand, we dismiss the stratospheric price of the RTX 5090 as not relevant, because that’s not a gaming card, it’s a “Titan” prosumer-level card aimed at content creation and home LLM / AI acceleration. But it still counts when we applaud the performance of Nvidia…
 
None of what you said is relevant to what I said. Its performance is within expectation of what the product is capable of and what other benchmarks have shown natively. It winning in upscaling is an outlier though; I am not sure why, but on the surface that is impressive, and I never denied that. That's what makes it reasonable. Don't like it? Well, unfortunately, that's how the apple fell. AMD was not aiming for a high-tier product anyway.
If it's winning in upscaling, it's because it uses FSR 3.1, which doesn't use machine learning/tensor cores and is lighter than FSR 4. Nvidia uses ML-based upscaling, which looks much better but needs more processing power.
id is likely not putting in path tracing just to make AMD look bad. People who are gonna use path tracing are not buying AMD products anyway. AMD can deliver good RT performance now, so RT still exists and their native performance is within expectation. Where is the problem? I'm sure AMD will catch up even more, now that AMD has genuinely compelling products for traditionally Nvidia consumers (at least as far as a lot of my less-enthusiast folks who also buy computers have said).
Actually, you are wrong. They are putting path tracing in later, and AMD will likely look bad. You know why? Because Nvidia does denoising for ray tracing in hardware and AMD does it in software. AMD has the hardware now in RDNA4, but I doubt it will be supported.
This is the magic sauce Nvidia has right now in ray tracing. It's not architecture; it's the other GPU maker not having hardware ray-tracing denoising supported. The performance hit can be enormous and render AMD unusable, and that is the whole point.
So, if AMD gives a fu...k, and it doesn't, but let's say they do: in a few years the 9070 XT can equal the RTX 5080 in ray tracing with no problems, but only if they help/enable developers to use hardware denoising on RDNA4, which they don't, because they have other priorities.
So, mystery solved. Whenever you see path tracing, remember this: AMD brute-forces it / doesn't support it, while Nvidia efficiently hardware-accelerates it. It's like playing movies on the CPU instead of the GPU video decoder: on the CPU you consume 100 W and it runs like shit, and on the GPU it consumes 10 W and runs perfectly.
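Putting that decode analogy into numbers (the 100 W / 10 W figures are purely illustrative, and 60 FPS is an assumed playback rate):

```python
# Energy per frame (joules) = power draw (watts) / frame rate (frames/sec).
fps = 60.0

cpu_watts, gpu_watts = 100.0, 10.0   # illustrative figures from the analogy

cpu_j_per_frame = cpu_watts / fps    # ~1.67 J per decoded frame
gpu_j_per_frame = gpu_watts / fps    # ~0.17 J per decoded frame

print(f"CPU decode: {cpu_j_per_frame:.2f} J/frame")
print(f"GPU decode: {gpu_j_per_frame:.2f} J/frame "
      f"({cpu_watts / gpu_watts:.0f}x less energy)")
```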
 
Actually, you are wrong. They are putting path tracing in later, and AMD will likely look bad. You know why? Because Nvidia does denoising for ray tracing in hardware and AMD does it in software. AMD has the hardware now in RDNA4, but I doubt it will be supported.
This is the magic sauce Nvidia has right now in ray tracing. It's not architecture; it's the other GPU maker not having hardware ray-tracing denoising supported. The performance hit can be enormous and render AMD unusable, and that is the whole point.
So, if AMD gives a fu...k, and it doesn't, but let's say they do: in a few years the 9070 XT can equal the RTX 5080 in ray tracing with no problems, but only if they help/enable developers to use hardware denoising on RDNA4, which they don't, because they have other priorities.
So, mystery solved. Whenever you see path tracing, remember this: AMD brute-forces it / doesn't support it, while Nvidia efficiently hardware-accelerates it. It's like playing movies on the CPU instead of the GPU video decoder: on the CPU you consume 100 W and it runs like shit, and on the GPU it consumes 10 W and runs perfectly.

It's Nvidia code... you think Nvidia is going to code a universal denoising method that looks great for everyone? That's never going to happen, and them paying to get their method implemented everywhere isn't great for PC gaming.
 
The graphics look similar to Doom Eternal but the performance is much lower. I wonder if their implementation of ray tracing is causing this? (Still a lot better than most studios.)

nGreedia drivers are simply a mess; there are even instances where the 4070 Ti Super is on par with or slightly faster than the 5070 Ti, despite the latter having superior hardware.

They should be ashamed.

Someone said earlier this year that nvidia drivers are trouble free :D

The 5070 Ti is basically a 4070 Ti Super with GDDR7, so similar performance is expected.
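For what it's worth, the raw memory-bandwidth gap is easy to sketch: peak bandwidth is bus width (in bytes) times per-pin data rate. The 256-bit / 21 Gbps (GDDR6X) and 256-bit / 28 Gbps (GDDR7) figures below are the published specs for these two cards; treat this as a back-of-the-envelope:

```python
def mem_bandwidth_gbs(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Peak memory bandwidth in GB/s: (bus width / 8 bytes) * per-pin rate."""
    return bus_width_bits / 8 * data_rate_gbps

# Both cards use a 256-bit bus; the 5070 Ti's GDDR7 just runs faster per pin.
bw_4070tis = mem_bandwidth_gbs(256, 21.0)  # GDDR6X @ 21 Gbps -> 672 GB/s
bw_5070ti  = mem_bandwidth_gbs(256, 28.0)  # GDDR7  @ 28 Gbps -> 896 GB/s

print(f"4070 Ti Super: {bw_4070tis:.0f} GB/s")
print(f"5070 Ti:       {bw_5070ti:.0f} GB/s (+{bw_5070ti / bw_4070tis - 1:.0%})")
```

A third more bandwidth on essentially the same core config helps in bandwidth-bound scenarios, but it won't move averages much otherwise.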
 
? But it’s still id Tech 8. Not some other graphics game engine made outside. They all “have to lean on 3rd party”, if they want to advance the API, game developers do that all the time.

You don't see Epic doing it for their renderer tech, do you? Or Crytek, or Unity, or Godot? The only time you're going to see Nvidia's path tracer is when Nvidia pays to have it implemented. So in this case, someone at id, Bethesda, or Microsoft sold out Doom for some easy Nvidia money. Something that id Software has never typically done with their render tech.
 
If it's winning in upscaling, it's because it uses FSR 3.1, which doesn't use machine learning/tensor cores and is lighter than FSR 4. Nvidia uses ML-based upscaling, which looks much better but needs more processing power.
Ehh, it's here or there. The extra processing power isn't so much more that it's a big deal (could that be different because max settings in DOOM: TDA use a fair bit of RT, though?), but I still lean towards efficiency more than a 'pretty picture', so I have always leaned towards FSR for my needs despite DLSS generally looking better. Not necessary, but I do it anyway.
Actually, you are wrong. They are putting path tracing in later, and AMD will likely look bad. You know why? Because Nvidia does denoising for ray tracing in hardware and AMD does it in software. AMD has the hardware now in RDNA4, but I doubt it will be supported.
This is the magic sauce Nvidia has right now in ray tracing. It's not architecture; it's the other GPU maker not having hardware ray-tracing denoising supported. The performance hit can be enormous and render AMD unusable, and that is the whole point.
So, if AMD gives a fu...k, and it doesn't, but let's say they do: in a few years the 9070 XT can equal the RTX 5080 in ray tracing with no problems, but only if they help/enable developers to use hardware denoising on RDNA4, which they don't, because they have other priorities.
So, mystery solved. Whenever you see path tracing, remember this: AMD brute-forces it / doesn't support it, while Nvidia efficiently hardware-accelerates it. It's like playing movies on the CPU instead of the GPU video decoder: on the CPU you consume 100 W and it runs like shit, and on the GPU it consumes 10 W and runs perfectly.
Then that's AMD's problem. To be fair, the RT hit on the 9000 cards is not so bad that it renders AMD 'unusable'. In fact, the 9070 is an extremely comparable product to the 5070, and they are almost deadlocked in many regards. And the 9070 XT is only slightly behind the 5070 Ti on average, which can be solved with slight tweaking/tuning; keep in mind pricing too.

Trust me, there's definitely a bigger elephant in the room that's leading to the 'monopoly' people were fearing (which is hopefully alleviated now that the 9000 series has dropped). Nothing you've said shows AMD being made to look bad intentionally. Again, my point was that id is very likely not playing sides. Keep in mind that Nvidia also has the dominant market share, so developers will prioritize their stuff over AMD / Intel, unfortunately.

Will AMD look bad? Maybe, but I doubt it, because most consumers are already aware path tracing is still kind of out there anyway. Will it be done INTENTIONALLY? No.
 
The 9070 XT nearly matches the 5080? Ouch.:nutkick:

Not that I'm rooting for any brand, but considering that the 5080 costs nearly double the 9070 XT, it's a bit awkward. Where are all the "AMD should be cheaper" people now?
 
Then that's AMD's problem. To be fair, the RT hit on the 9000 cards is not so bad that it renders AMD 'unusable'. In fact, the 9070 is an extremely comparable product to the 5070, and they are almost deadlocked in many regards. And the 9070 XT is only slightly behind the 5070 Ti on average, which can be solved with slight tweaking/tuning; keep in mind pricing too.
AMD is not just behind, it's years behind Nvidia.
Remember, when Nvidia was talking about machine learning, AMD was a baby in diapers, learning how to walk and talk; we are talking about 6-7 years ago.
What has happened with RDNA4 is that AMD finally managed to copy Nvidia; that's why OptiScaler works so well translating DLSS inputs to FSR 4.
Denoising for ray tracing is similar to machine-learning upscaling, and it has to be trained just like DLSS and FSR 4; Nvidia explained this 5 years ago.
So once you have machine-learning upscaling figured out, and FSR 4 is very good, then denoising is very simple.
I understand why Nvidia might be pissed about this, since AMD is stealing years of work, but we consumers don't care; we buy what works best, even if AMD profits from Nvidia's work.
 
The 9070 XT nearly matches the 5080? Ouch.:nutkick:

Not that I'm rooting for any brand, but considering that the 5080 costs nearly double the 9070 XT, it's a bit awkward. Where are all the "AMD should be cheaper" people now?
There are other games, with way bigger player bases, where the 9070 XT is around RTX 3080 level. And nobody will ever test whether driver updates fix it. At least with Doom, Nvidia said they will fix it.
 
There are other games, with way bigger player bases, where the 9070 XT is around RTX 3080 level. And nobody will ever test whether driver updates fix it. At least with Doom, Nvidia said they will fix it.
Haven't seen it personally... the lowest I've seen a 9070 XT dip, game by game, was around a 4070 Ti.
 
Crazy how they can go from Doom Eternal, which has amazing optimization, straight back to the terrible, sloppy optimization so many games come out with now, thanks to forced RT and no visual difference between high and low settings.
 
Why is it perfectly reasonable for the 9070 XT to lose to the 5080/4080S? Don't you understand how the game is played now?
Take Doom: the developer makes the game as usual to work on every piece of hardware, then a GPU maker comes along and says, "please put our tech in this game so the game works better on our hardware." The developer says, "but what do I gain?" The GPU maker says, "money, we will bundle/promote your game with our GPUs," and that makes it legal.
The developer accepts and everyone is happy. That's how you get $3,000 GPUs and a monopoly.
Uhm, no. When you have 80-90% of the market, you don't need to pay any dev to do anything. They are doing it on their own because 90% of their potential customers have access to that feature. That's how we get AMD-sponsored games that run like garbage on Nvidia, and how we get games that had RT and DLSS support but removed both after they got a sponsorship from AMD.
 
There are other games, with way bigger player bases, where the 9070 XT is around RTX 3080 level. And nobody will ever test whether driver updates fix it. At least with Doom, Nvidia said they will fix it.
Fix what? 100 FPS is not enough at 1440p? Or "fix" performance so the 9070 XT is lower in the charts?
Uhm, no. When you have 80-90% of the market, you don't need to pay any dev to do anything. They are doing it on their own because 90% of their potential customers have access to that feature. That's how we get AMD-sponsored games that run like garbage on Nvidia, and how we get games that had RT and DLSS support but removed both after they got a sponsorship from AMD.
Yeah sure, and 80-90% of the market is comprised of what? RTX 4060, 3060 and 2060 laptops, and the desktop versions of those? I'm sure they will rush to buy Doom and turn on path tracing immediately.
You could very well make the argument that AMD is in both consoles and developers should optimize for that architecture, since that is 100% AMD market share, and what? Cyberpunk and Black Myth: Wukong still exist.
 
There are other games, with way bigger player bases, where the 9070 XT is around RTX 3080 level. And nobody will ever test whether driver updates fix it. At least with Doom, Nvidia said they will fix it.
In other games, the £600 card works like an actual £600 card (and not a £1200 one). How dreadful! :rolleyes:

Nvidia seems to have their hands full with fixing driver issues since the 50-series came out. I wonder what fix they can do here.

Yeah sure, and 80-90% of the market is comprised of what? RTX 4060, 3060 and 2060 laptops, and the desktop versions of those? I'm sure they will rush to buy Doom and turn on path tracing immediately.
You could very well make the argument that AMD is in both consoles and developers should optimize for that architecture, since that is 100% AMD market share, and what? Cyberpunk and Black Myth: Wukong still exist.
This. Nvidia having 90% of the desktop GPU market share doesn't mean that 90% of players are using one. Most of them are on consoles (with AMD hardware in them).
 

Oof. This same GPU was pulling above 100 FPS at 1440p in Eternal; now it can't break 60 at 1080p without upscaling in The Dark Ages.

We love forced RT! :rolleyes:
 