
DOOM Eternal Benchmark Test & Performance Analysis

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,036 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
DOOM Eternal is the long-awaited sequel to the epic DOOM series. There's even more carnage, and gameplay is super fast-paced. Built upon the id Tech 7 engine, visuals are excellent, and graphics performance is outstanding. We tested the game on all modern graphics cards at Full HD, 1440p and 4K Ultra HD.

 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Another game that fills up 8GB of VRAM but doesn't actually use it.
 
Joined
Nov 24, 2017
Messages
853 (0.36/day)
Location
Asia
Processor Intel Core i5 4590
Motherboard Gigabyte Z97x Gaming 3
Cooling Intel Stock Cooler
Memory 8GiB(2x4GiB) DDR3-1600 [800MHz]
Video Card(s) XFX RX 560D 4GiB
Storage Transcend SSD370S 128GB; Toshiba DT01ACA100 1TB HDD
Display(s) Samsung S20D300 20" 768p TN
Case Cooler Master MasterBox E501L
Audio Device(s) Realtek ALC1150
Power Supply Corsair VS450
Mouse A4Tech N-70FX
Software Windows 10 Pro
Benchmark Scores BaseMark GPU : 250 Point in HD 4600
Why are the two most popular cards (GTX 1650/1650 Super) missing?
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,036 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Why are the two most popular cards (GTX 1650/1650 Super) missing?
Just not part of my benchmarking routine; I didn't think they were that popular. Let me see if I can get some runs in for those.
 
Joined
Mar 18, 2008
Messages
5,717 (0.97/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
As I see it, there is a 19-21% gap between the RTX 2080 and the GTX 1080 Ti. How? It looks like NVIDIA is crippling the GTX 1080 Ti through the driver.

People always forget there are architectural improvements from Pascal to Turing.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,036 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
As I see it, there is a 19-21% gap between the RTX 2080 and the GTX 1080 Ti. How? It looks like NVIDIA is crippling the GTX 1080 Ti through the driver.
Turing shaders can do FP + INT at the same time.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,036 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
What are FP + INT?
Floating Point + Integer calculations

so you can do 1.0 + 1.0 = 2.0 at the same time as 1 + 1 = 2, effectively running two operations simultaneously on each GPU core. If game (or driver) code is properly crafted to optimize for that capability, you can gain a lot of performance. That's why most recent games run much better on Turing.
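To put that in code terms (just a hedged illustration, not anything from the game or the review, and all names are made up): in a kernel like the sketch below, the integer index math and the floating-point fused multiply-add are independent work, so Turing's separate INT32 and FP32 units can issue them in the same cycle, while Pascal pushes them through a shared datapath one after another. Nothing in the source has to change; the win comes from the hardware scheduler.

```cuda
// Hedged sketch (hypothetical kernel, made-up names): the INT32 work
// (index/offset math) and the FP32 work (fused multiply-add) below are
// independent, so a Turing SM can co-issue them, while Pascal has to
// serialize them on the same datapath.
#include <cuda_runtime.h>

__global__ void scale_and_gather(const float* __restrict__ in,
                                 float* __restrict__ out,
                                 const int* __restrict__ indices,
                                 int n, float scale)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;  // INT32: thread index math
    if (i < n) {
        int src = indices[i] % n;                   // INT32: address computation
        out[i] = fmaf(in[src], scale, 1.0f);        // FP32: fused multiply-add
    }
}

int main() { return 0; }  // host-side launch omitted; the kernel is illustrative only
```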
 

hamstertje

New Member
Joined
Mar 20, 2020
Messages
1 (0.00/day)
Location
Netherlands
The charts show a 5600 XT with 8 GB?
Wondering how the 5500 XT with 8 GB compares to the 4 GB version and to the 580 and 590 in performance.
 
Joined
Dec 31, 2009
Messages
19,366 (3.71/day)
Benchmark Scores Faster than yours... I'd bet on it. :)
@W1zzard - Is there an integrated benchmark here? If not, how did you test? Apologies if I missed it while glancing over the article.
 
Joined
Feb 27, 2018
Messages
12 (0.01/day)
Location
Budapest
Floating Point + Integer calculations

so you can do 1.0 + 1.0 = 2.0 at the same time as 1 + 1 = 2, effectively running two operations simultaneously on each GPU core. If game (or driver) code is properly crafted to optimize for that capability, you can gain a lot of performance. That's why most recent games run much better on Turing.

Now I see. Thanks. That explains a lot.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
As I see it, there is a 19-21% gap between the RTX 2080 and the GTX 1080 Ti. How? It looks like NVIDIA is crippling the GTX 1080 Ti through the driver.
In addition to FP+INT, there is also Variable Rate Shading, which idTech definitely supports and very likely uses.

Edit:
There might be other features they are using; Rapid Packed Math (2×FP16 in place of FP32) comes to mind.
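For what it's worth, Rapid Packed Math is AMD's name for double-rate FP16: two half-precision values are packed into one 32-bit register and processed by a single instruction. Purely as a hedged illustration (this uses CUDA's real __half2 intrinsics, but the kernel itself is hypothetical and not anything id Software has confirmed using), it looks roughly like this:

```cuda
// Hedged illustration of packed FP16 (2x FP16 per 32-bit register):
// one __hmul2 instruction multiplies both half-precision lanes at once,
// which is where the potential 2x throughput over FP32 comes from.
// Needs a GPU with native FP16 arithmetic (compute capability 5.3+).
#include <cuda_fp16.h>
#include <cuda_runtime.h>

__global__ void scale_fp16_pairs(const __half2* __restrict__ in,
                                 __half2* __restrict__ out,
                                 int n_pairs, float scale)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n_pairs) {
        __half2 s = __float2half2_rn(scale);  // broadcast scale into both lanes
        out[i] = __hmul2(in[i], s);           // one instruction, two FP16 multiplies
    }
}

int main() { return 0; }  // launch code omitted; the kernel is illustrative only
```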
 
Joined
Oct 26, 2018
Messages
58 (0.03/day)
"The good thing is that our results show no major loss of performance (due to VRAM) for GTX 1060 3 GB and RTX 570 4 GB. What's surprising though is that RX 5500 XT 4 GB is doing much worse than expected. My best guess is that AMD's VRAM management for Navi isn't as refined yet as that for Polaris. At least the game doesn't crash when VRAM is exceeded, and continues to run fine."

The RX 5500 XT 4 GB, despite supporting PCIe 4.0 (and 3.0), physically has only x8 lanes. On the test setup it's running PCIe 3.0 x8, whereas the 1060 and 570 are x16 cards, so they run PCIe 3.0 x16.
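For rough context, the theoretical per-direction bandwidth (ignoring protocol overhead beyond the 128b/130b line encoding) works out to:

$$
\frac{8\ \mathrm{GT/s} \times \tfrac{128}{130}}{8\ \mathrm{bit/B}} \approx 0.985\ \mathrm{GB/s\ per\ lane}
\;\Rightarrow\;
\mathrm{3.0\ x8} \approx 7.9\ \mathrm{GB/s},\quad
\mathrm{3.0\ x16} \approx \mathrm{4.0\ x8} \approx 15.8\ \mathrm{GB/s}
$$

So when the 4 GB card spills over into system memory, it has roughly half the bus bandwidth of the x16 cards to lean on.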
 
Joined
Feb 1, 2013
Messages
1,248 (0.30/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5.2GHz @1.312v
Motherboard MXI APEX
Cooling Raystorm Pro + 1260mm Super Nova
Memory 2x16GB TridentZ 4000-14-14-28-2T @1.6v
Video Card(s) RTX 4090 LiquidX Barrow 3015MHz @1.1v
Storage 660P 1TB, 860 QVO 2TB
Display(s) LG C1 + Predator XB1 QHD
Case Open Benchtable V2
Audio Device(s) SB X-Fi
Power Supply MSI A1000G
Mouse G502
Keyboard G815
Software Gentoo/Windows 10
Benchmark Scores Always only ever very fast
Now I see. Thanks. That explains a lot.
The technical terminology for it is concurrent execution of floating-point and integer operations. It was actually only made possible by a hardware change in Turing: the INT32 units were moved into separate blocks. https://hexus.net/tech/reviews/grap...g-architecture-examined-and-explained/?page=2

As support for the new hardware features matures, performance will pull further away from the last generation -- which leaves a bit of a bitter aftertaste, too.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,036 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
The charts show a 5600 XT with 8 GB?
Whoops, fixed

Wondering how the 5500 XT with 8 GB compares to the 4 GB version and to the 580 and 590 in performance.
Should be roughly between RX 580 and RX 590 I'd say

Variable Rate Shading, which idTech definitely supports and very likely uses.
I doubt they would secretly enable that, as it would reduce image quality (if only by a small bit).

Is there an integrated benchmark here? If not, how did you test?
No integrated benchmark; just play the game, find a good scene, and keep replaying it.

The RX 5500 XT 4 GB, despite supporting PCIe 4.0 (and 3.0), physically has only x8 lanes. On the test setup it's running PCIe 3.0 x8, whereas the 1060 and 570 are x16 cards, so they run PCIe 3.0 x16.
Very good point, let me mention that in the review.
 
Joined
Jun 27, 2019
Messages
1,851 (1.05/day)
Location
Hungary
System Name I don't name my systems.
Processor i3-12100F 'power limit removed/-130mV undervolt'
Motherboard Asus Prime B660-PLUS D4
Cooling ID-Cooling SE 224 XT ARGB V3 'CPU', 4x Be Quiet! Light Wings + 2x Arctic P12 black case fans.
Memory 4x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Asus TuF V2 RTX 3060 Ti @1920 MHz Core/950mV Undervolt
Storage 4 TB WD Red, 1 TB Silicon Power A55 Sata, 1 TB Kingston A2000 NVMe, 256 GB Adata Spectrix s40g NVMe
Display(s) 29" 2560x1080 75 Hz / LG 29WK600-W
Case Be Quiet! Pure Base 500 FX Black
Audio Device(s) Onboard + Hama uRage SoundZ 900+USB DAC
Power Supply Seasonic CORE GM 500W 80+ Gold
Mouse Canyon Puncher GM-20
Keyboard SPC Gear GK630K Tournament 'Kailh Brown'
Software Windows 10 Pro
I have a question about the VRAM limit when the game was tested at Ultra Nightmare.

Was texture quality lowered on 3-4 GB cards and everything else left at max?

I'm playing the game on an RX 570 4GB at 2560x1080, and with that I'm unable to use High textures because the in-game counter goes over by 11 (yes, 11) MB of VRAM, and it tells me to lower settings, else it won't let me apply them.
So now I'm playing with Medium textures. I could lower Shadows to Low and use High textures, but I kinda prefer a more balanced setting (luckily I can't really see a difference between Medium and High, but still).
 
Joined
Feb 21, 2006
Messages
1,978 (0.30/day)
Location
Toronto, Ontario
System Name The Expanse
Processor AMD Ryzen 7 5800X3D
Motherboard Asus Prime X570-Pro BIOS 5003 AM4 AGESA V2 PI 1.2.0.B
Cooling Corsair H150i Pro
Memory 32GB GSkill Trident RGB DDR4-3200 14-14-14-34-1T (B-Die)
Video Card(s) AMD Radeon RX 7900 XTX 24GB (24.3.1)
Storage WD SN850X 2TB / Corsair MP600 1TB / Samsung 860Evo 1TB x2 Raid 0 / Asus NAS AS1004T V2 14TB
Display(s) LG 34GP83A-B 34 Inch 21: 9 UltraGear Curved QHD (3440 x 1440) 1ms Nano IPS 160Hz
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500 + HS80 Wireless
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB SE
Keyboard Corsair K100
Software Windows 10 Pro x64 22H2
Benchmark Scores 3800X https://valid.x86.fr/1zr4a5 5800X https://valid.x86.fr/2dey9c
"The good thing is that our results show no major loss of performance (due to VRAM) for GTX 1060 3 GB and RTX 570 4 GB. What's surprising though is that RX 5500 XT 4 GB is doing much worse than expected. My best guess is that AMD's VRAM management for Navi isn't as refined yet as that for Polaris. At least the game doesn't crash when VRAM is exceeded, and continues to run fine."

The RX 5500 XT 4 GB, despite supporting PCIe 4.0 (and 3.0), is only physically x8 lanes. On the test setup, it's running PCIe 3.0 x8, where as the 1060 and 570 are x16 lane cards, so they can run PCIe 3.0 x16.

Would be nice to see the numbers for the 5500 XT in a Ryzen system, since PCIe 4.0 x8 offers the same bandwidth as PCIe 3.0 x16.

But the test rig is Intel, so hopefully another site runs this on a PCIe 4.0 board.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,844 (0.33/day)
Location
Pittsburgh, PA
System Name Titan
Processor AMD Ryzen™ 7 7950X3D
Motherboard ASUS ROG Strix X670E-I Gaming WiFi
Cooling ID-COOLING SE-207-XT Slim Snow
Memory TEAMGROUP T-Force Delta RGB 2x16GB DDR5-6000 CL30
Video Card(s) ASRock Radeon RX 7900 XTX 24 GB GDDR6 (MBA)
Storage 2TB Samsung 990 Pro NVMe
Display(s) AOpen Fire Legend 24" (25XV2Q), Dough Spectrum One 27" (Glossy), LG C4 42" (OLED42C4PUA)
Case ASUS Prime AP201 33L White
Audio Device(s) Kanto Audio YU2 and SUB8 Desktop Speakers and Subwoofer, Cloud Alpha Wireless
Power Supply Corsair SF1000L
Mouse Logitech Pro Superlight (White), G303 Shroud Edition
Keyboard Wooting 60HE / NuPhy Air75 v2
VR HMD Occulus Quest 2 128GB
Software Windows 11 Pro 64-bit 23H2 Build 22631.3447
@W1zzard Looks like your benchmarking is in line with what I'm getting on my 2080 Super (442.74) and RX 5700 XT (Pro 20.Q1.1). :rockout:

I'm sure if I were on the latest Adrenalin, it would see more FPS from the driver optimizations.
 
Joined
Jan 8, 2017
Messages
8,929 (3.36/day)
System Name Good enough
Processor AMD Ryzen R9 7900 - Alphacool Eisblock XPX Aurora Edge
Motherboard ASRock B650 Pro RS
Cooling 2x 360mm NexXxoS ST30 X-Flow, 1x 360mm NexXxoS ST30, 1x 240mm NexXxoS ST30
Memory 32GB - FURY Beast RGB 5600 Mhz
Video Card(s) Sapphire RX 7900 XT - Alphacool Eisblock Aurora
Storage 1x Kingston KC3000 1TB 1x Kingston A2000 1TB, 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) LG UltraGear 32GN650-B + 4K Samsung TV
Case Phanteks NV7
Power Supply GPS-750C
Floating Point + Integer calculations

so you can do 1.0 + 1.0 = 2.0 at the same time as 1 + 1 = 2, effectively running two operations simultaneously on each GPU core. If game (or driver) code is properly crafted to optimize for that capability, you can gain a lot of performance. That's why most recent games run much better on Turing.

Just for clarification: it's not that Turing does an extra integer operation for every floating-point operation, it's just that the two can occur concurrently within the same clock cycle. Before, the scheduling logic was simpler and allowed either floating-point or integer computations within one clock cycle, not both.

To be fair, I reckon the real-world performance gain from this is modest, because usually after a clock cycle of floating-point work you probably had to compute a set of addresses in the next clock cycle anyway, which is why they never bothered with this until now.
 
Joined
Nov 24, 2017
Messages
853 (0.36/day)
Location
Asia
Processor Intel Core i5 4590
Motherboard Gigabyte Z97x Gaming 3
Cooling Intel Stock Cooler
Memory 8GiB(2x4GiB) DDR3-1600 [800MHz]
Video Card(s) XFX RX 560D 4GiB
Storage Transcend SSD370S 128GB; Toshiba DT01ACA100 1TB HDD
Display(s) Samsung S20D300 20" 768p TN
Case Cooler Master MasterBox E501L
Audio Device(s) Realtek ALC1150
Power Supply Corsair VS450
Mouse A4Tech N-70FX
Software Windows 10 Pro
Benchmark Scores BaseMark GPU : 250 Point in HD 4600
Just not part of my benchmarking routine; I didn't think they were that popular. Let me see if I can get some runs in for those.
According to the Steam survey, the GTX 1650 alone is more popular than the RX 570, even the RX 580. So it deserves a place in the benchmark chart.

As I see it, there is a 19-21% gap between the RTX 2080 and the GTX 1080 Ti. How? It looks like NVIDIA is crippling the GTX 1080 Ti through the driver.
Because Turing is the first NVIDIA architecture to fully support low-level APIs like D3D12/Vulkan, and as a result it doesn't have the performance penalty Maxwell/Pascal do.
 
Joined
Apr 8, 2010
Messages
992 (0.19/day)
Processor Intel Core i5 8400
Motherboard Gigabyte Z370N-Wifi
Cooling Silverstone AR05
Memory Micron Crucial 16GB DDR4-2400
Video Card(s) Gigabyte GTX1080 G1 Gaming 8G
Storage Micron Crucial MX300 275GB
Display(s) Dell U2415
Case Silverstone RVZ02B
Power Supply Silverstone SSR-SX550
Keyboard Ducky One Red Switch
Software Windows 10 Pro 1909
Seems... optimized? For 1920x1200 at 60 Hz, it looks like my GTX 1080 will be good enough for a long time yet.
 
Joined
Mar 24, 2012
Messages
528 (0.12/day)
As I see it, there is a 19-21% gap between the RTX 2080 and the GTX 1080 Ti. How? It looks like NVIDIA is crippling the GTX 1080 Ti through the driver.
Turing does better in the id Tech engine than Pascal; this is not that surprising. You can see a similar performance difference in DOOM (2016) as well.
 

ARF

Joined
Jan 28, 2020
Messages
3,934 (2.55/day)
Location
Ex-usa
The RX 5700 XT is 10% faster than the Radeon VII at 1920x1080, while the Radeon VII is 5% faster than the RX 5700 XT at 3840x2160.

:eek:

(Attached: relative performance charts at 1920x1080 and 3840x2160.)
 