
DOOM Eternal Benchmark Test & Performance Analysis

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
20,220 (3.48/day)
Processor Core i7-4790K
Memory 16 GB
Video Card(s) GTX 1080
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 7
DOOM Eternal is the long-awaited sequel to the epic DOOM series. There's even more carnage, and gameplay is super fast-paced. Built upon the id Tech 7 engine, visuals are excellent, and graphics performance is outstanding. We tested the game on all modern graphics cards at Full HD, 1440p and 4K Ultra HD.

 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
26,960 (5.13/day)
Location
Indiana, USA
Processor Intel Core i7 9900K@5.0GHz
Motherboard AsRock Z370 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB Corsair DDR4-3000
Video Card(s) ASUS Strix GTX 1080Ti
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Another game that fills up 8GB of VRAM but doesn't actually use it.
 
Joined
Nov 24, 2017
Messages
660 (0.76/day)
Location
Asia
Processor Intel Core i5 4590
Motherboard Gigabyte Z97x Gaming 3
Cooling Intel Stock Cooler
Memory 8GB(2x4GB) DDR3-800MHz [1600MT/s]
Video Card(s) XFX RX 560 4GB
Storage Transcend SSD370S 128GB; Toshiba DT01ACA100 1TB
Display(s) Samsung S20D300 20" 768p TN
Case Delux DLC-MV888
Audio Device(s) Realtek ALC1150
Power Supply Corsair VS450
Mouse A4Tech N-70FX
Software Windows 10 Pro
Benchmark Scores BaseMark GPU : 250 Point
Why are the 2 most popular cards (GTX 1650/Super) missing?
 

W1zzard

Administrator
Staff member
Why are the 2 most popular cards (GTX 1650/Super) missing?
Just not part of my benchmarking routine, didn't think they were that popular. Let me see if I can get some runs in for those.
 
Joined
Mar 18, 2008
Messages
4,912 (1.12/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA 2080Ti
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) Acer K272HUL, HTC Vive
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
Software Windows 10 Professional/Linux Mint
As I see it, there's a 19-21% gap between the RTX 2080 and GTX 1080 Ti. How? It looks like Nvidia is breaking the GTX 1080 Ti through its drivers.
People always forget there are architectural improvements from Pascal to Turing
 

ribizly

New Member
Joined
Feb 27, 2018
Messages
9 (0.01/day)
Location
Budapest
People always forget there are architectural improvements from Pascal to Turing
That is okay. But then it should have been there at release as well. It is simply a driver tweak against the "older" cards.

Turing shaders can do FP + INT at the same time
What is FP + INT?
 

W1zzard

Administrator
Staff member
What are the FP + INT?
Floating Point + Integer calculations

so you can do 1.0 + 1.0 = 2.0 at the same time as 1 + 1 = 2, effectively running two operations at the same time, for each GPU core. If game (or driver) code is properly crafted to optimize for that capability you can gain a lot of performance. That's why most recent games run much better on Turing.
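A toy cycle-count model makes the mix-dependence concrete (this is just an illustration, not how anything in the review was measured; the 100:36 FP-to-INT ratio is roughly what NVIDIA quotes for typical game shaders, not a number from this game):

```python
# Toy model of Turing's concurrent FP + INT issue (an illustration,
# not a real GPU simulator): each cycle, one FP op and one INT op can
# retire together, versus one op of either kind on the older scheme.

def cycles_single_issue(fp_ops, int_ops):
    # Pascal-style: every op takes its own issue slot.
    return fp_ops + int_ops

def cycles_dual_issue(fp_ops, int_ops):
    # Turing-style: FP and INT pipes drain in parallel,
    # so total cycles are bounded by the longer stream.
    return max(fp_ops, int_ops)

# Roughly 36 INT ops per 100 FP ops is NVIDIA's ballpark for game shaders.
fp, integer = 100, 36
speedup = cycles_single_issue(fp, integer) / cycles_dual_issue(fp, integer)
print(f"{speedup:.2f}x")  # 1.36x for this mix
```

A 50/50 mix would give the ideal 2x, while a pure-FP shader gains nothing, which is why the benefit varies so much between games.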
 

hamstertje

New Member
Joined
Mar 20, 2020
Messages
1 (0.05/day)
Location
Netherlands
The charts show a 5600 XT with 8 GB?
Wondering how the 5500 XT with 8 GB compares to the 4 GB version and to the 580 and 590 in performance
 

ribizly

New Member
Joined
Feb 27, 2018
Messages
9 (0.01/day)
Location
Budapest
Floating Point + Integer calculations

so you can do 1.0 + 1.0 = 2.0 at the same time as 1 + 1 = 2, effectively running two operations at the same time, for each GPU core. If game (or driver) code is properly crafted to optimize for that capability you can gain a lot of performance. That's why most recent games run much better on Turing.
Now I see. Thanks. That explains a lot.
 
Joined
Feb 3, 2017
Messages
2,201 (1.90/day)
Processor i5-8400
Motherboard ASUS ROG STRIX Z370-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-3200 CL16
Video Card(s) Gainward GeForce RTX 2080 Phoenix
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Logitech G700
Keyboard Corsair K60
As I see it, there's a 19-21% gap between the RTX 2080 and GTX 1080 Ti. How? It looks like Nvidia is breaking the GTX 1080 Ti through its drivers.
In addition to FP+INT, there is also Variable Rate Shading, which idTech definitely supports and very likely uses.

Edit:
There might be other features they are using; Rapid Packed Math (2*FP16 in place of FP32) comes to mind.
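As a side note on what "2*FP16 in place of FP32" means at the data level, here's a small sketch (purely illustrative; it only shows the bit packing, not any GPU arithmetic):

```python
import struct

# Sketch of the data layout behind "2*FP16 in place of FP32":
# two IEEE-754 half-precision values occupy the same 32 bits as one
# float, so a packed-math ALU can operate on both per instruction.

def pack_half2(a, b):
    # Two half floats -> one 32-bit word.
    return struct.unpack("<I", struct.pack("<2e", a, b))[0]

def unpack_half2(word):
    # One 32-bit word -> two half floats.
    return struct.unpack("<2e", struct.pack("<I", word))

# Both values are exactly representable in fp16, so the round trip is lossless.
word = pack_half2(1.5, -2.0)
print(unpack_half2(word))  # (1.5, -2.0)
```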
 

MKRonin

New Member
Joined
Oct 26, 2018
Messages
15 (0.03/day)
"The good thing is that our results show no major loss of performance (due to VRAM) for GTX 1060 3 GB and RX 570 4 GB. What's surprising though is that RX 5500 XT 4 GB is doing much worse than expected. My best guess is that AMD's VRAM management for Navi isn't as refined yet as that for Polaris. At least the game doesn't crash when VRAM is exceeded, and continues to run fine."

The RX 5500 XT 4 GB, despite supporting PCIe 4.0 (and 3.0), is only physically x8 lanes. On the test setup, it's running PCIe 3.0 x8, whereas the 1060 and 570 are x16 lane cards, so they can run PCIe 3.0 x16.
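For rough numbers, the link-bandwidth arithmetic behind that point can be sketched like this (a back-of-envelope model; it ignores protocol overhead beyond line encoding, so real throughput is a bit lower):

```python
# Back-of-envelope PCIe link bandwidth per generation and lane count.

def link_gbps(gen, lanes):
    rate = {3: 8.0, 4: 16.0}[gen]          # raw GT/s per lane
    efficiency = 128 / 130                  # 128b/130b line encoding (gen 3+)
    return rate * efficiency * lanes / 8    # bits -> bytes, in GB/s

print(f"3.0 x8 : {link_gbps(3, 8):.2f} GB/s")   # ~7.88
print(f"3.0 x16: {link_gbps(3, 16):.2f} GB/s")  # ~15.75
print(f"4.0 x8 : {link_gbps(4, 8):.2f} GB/s")   # same as 3.0 x16
```

So on the Intel test rig the 5500 XT gets roughly half the host bandwidth of the x16 cards, which matters most exactly when VRAM spills over.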
 
Joined
Feb 1, 2013
Messages
487 (0.19/day)
System Name Gentoo64 /w Cold Coffee
Processor 9900K 5GHz 1.224v
Motherboard EVGA Z370 Micro
Cooling Custom 480mm H2O, Raystorm Pro, Nemesis GTX, EK-XRES
Memory 2x8GB Trident Z 4000-C16-1T 1.425v
Video Card(s) MSI Seahawk EK X 1080Ti 2100.5/12600
Storage Samsung 970 EVO 500GB, 860 QVO 2TB
Display(s) XB271HU 165Hz
Case FT03-T
Audio Device(s) SBz
Power Supply SS-850KM3
Mouse G502
Keyboard G710+
Software Gentoo 64-bit, Windows 7/10 64-bit
Benchmark Scores http://www.userbenchmark.com/UserRun/7242501
Now I see. Thanks. That explains a lot.
The technical term for it is concurrent execution of floating point and integer operations. It was only made possible by a hardware change in Turing, which moved the INT32 units into their own separate blocks. https://hexus.net/tech/reviews/graphics/122045-nvidia-turing-architecture-examined-and-explained/?page=2

As support matures for new hardware features, performance will pull away from the last generation -- which leaves a bit of a bitter aftertaste too.
 

W1zzard

Administrator
Staff member
The charts show a 5600 XT with 8 GB?
Whoops, fixed

Wondering how the 5500 XT with 8 GB compares to the 4 GB version and to the 580 and 590 in performance
Should be roughly between RX 580 and RX 590 I'd say

Variable Rate Shading, which idTech definitely supports and very likely uses.
I doubt they would secretly enable that, as it would reduce image quality (if only by a small bit)

Is there an integrated benchmark here? If not, how did you test?
No integrated benchmark, just play the game, find a good scene and keep playing that.
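As a side note, manual runs like that are usually summarized from logged frame times; here's a minimal sketch with invented numbers (capture tools such as PresentMon are my assumption, not necessarily what the review used):

```python
# Summarize a captured frame-time log (milliseconds per frame)
# into average FPS and 1% lows, the two numbers benchmark charts use.

def summarize(frametimes_ms):
    n = len(frametimes_ms)
    avg_fps = 1000 * n / sum(frametimes_ms)
    # 1% lows: average FPS over the slowest 1% of frames.
    slowest = sorted(frametimes_ms, reverse=True)[: max(1, n // 100)]
    low_1pct_fps = 1000 * len(slowest) / sum(slowest)
    return avg_fps, low_1pct_fps

# Made-up trace: mostly ~8 ms frames with one 25 ms stutter.
avg, low = summarize([8.3, 8.1, 9.0, 8.4, 25.0, 8.2])
print(f"avg {avg:.0f} fps, 1% low {low:.0f} fps")  # avg 90 fps, 1% low 40 fps
```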

The RX 5500 XT 4 GB, despite supporting PCIe 4.0 (and 3.0), is only physically x8 lanes. On the test setup, it's running PCIe 3.0 x8, whereas the 1060 and 570 are x16 lane cards, so they can run PCIe 3.0 x16.
Very good point, let me mention that in the review
 
Joined
Jun 27, 2019
Messages
233 (0.81/day)
Location
Mid EU
Processor Ryzen 5 1600x @Stock
Motherboard ASUS ROG STRIX B350-F
Cooling Be quiet! Pure Rock Slim 'CPU', 3x Raijintek Auras 12 +3x Cooler Master MF120L non led
Memory 2x8GB G.SKILL Ripjaws V DDR4 3200MHz
Video Card(s) Gigabyte RX 570 Gaming 4G 'undervolted'
Storage 1 TB WD Blue, 3 TB Toshiba P300, 120 GB WD Green 2.5 SSD
Display(s) LG 29WK600-W
Case In Win 101c Black
Audio Device(s) Onboard + Kingston HyperX Cloud Stinger
Power Supply Cooler Master 650W MWE Gold
Mouse Motospeed V20
Keyboard Genius Scorpion K10
Software Windows 10 Pro
I have a question about the VRAM limit when the game was tested on Ultra Nightmare.

Was the texture quality lowered on 3-4 GB cards, with everything else left on max?

I'm playing the game on an RX 570 4GB at 2560x1080, and with that I'm unable to use High textures because the in-game counter goes over by 11 (yes, 11...) MB of VRAM, and it tells me to lower stuff or else it won't let me apply the settings.
So now I'm playing with Medium textures. I could lower Shadows to Low and use High textures, but I kinda prefer a more balanced setting. (Luckily I can't really see a difference between Medium and High, but still.)
 
Joined
Feb 21, 2006
Messages
578 (0.11/day)
Location
Toronto, Ontario
System Name AMD Ryzen
Processor 3800X
Motherboard Asus Prime X570-Pro
Cooling Corsair H150i Pro
Memory 16GB Gskill Trident RGB DDR4-3200 14-14-14-34-1T
Video Card(s) GIGABYTE Radeon RX 580 GAMING 8GB
Storage Corsair MP600 1TB PCIe 4 / Samsung 860Evo 1TB x2 Raid 0
Display(s) HP ZR24w
Case Fractal Design Meshify S2
Audio Device(s) Creative X-Fi + Logitech Z-5500
Power Supply Corsair AX850 Titanium
Mouse Corsair Dark Core RGB
Keyboard Logitech G810
Software Windows 10 Pro x64 1903
"The good thing is that our results show no major loss of performance (due to VRAM) for GTX 1060 3 GB and RX 570 4 GB. What's surprising though is that RX 5500 XT 4 GB is doing much worse than expected. My best guess is that AMD's VRAM management for Navi isn't as refined yet as that for Polaris. At least the game doesn't crash when VRAM is exceeded, and continues to run fine."

The RX 5500 XT 4 GB, despite supporting PCIe 4.0 (and 3.0), is only physically x8 lanes. On the test setup, it's running PCIe 3.0 x8, whereas the 1060 and 570 are x16 lane cards, so they can run PCIe 3.0 x16.
Would be nice to see the numbers for the 5500 XT in a Ryzen system, as PCIe 4.0 x8 = PCIe 3.0 x16.

But the test rig is Intel, so hopefully another site runs this on a PCIe 4.0 board.
 

Cheeseball

Not a Potato
Supporter
Joined
Jan 2, 2009
Messages
1,144 (0.28/day)
Location
Pittsburgh, PA
System Name Dell G5 15 5587 (2018) / AMD Custom Rig
Processor Intel® Core™ i7-8750H Processor (Coffee Lake) / AMD Ryzen 7 3800X
Motherboard Intel HM370 / ASUS TUF GAMING X570-PLUS (WI-FI)
Cooling Stock Dell twin-fan and copper heatpipes cooling system / AMD Wraith Prism
Memory SK hynix 16GB (8GBx2) - 2Rx8 DDR4 SODIMM 2666MHz / G.SKILL TridentZ 16GB (8GBx2) F4-3200C16D-16GTZR
Video Card(s) GeForce GTX 1060 Max-Q / NVIDIA GeForce RTX 2080 SUPER FE + Sapphire RX 5700 XT (Reference)
Storage ADATA XPG SX8200 Pro 512GB NVMe M.2 / ADATA SU800 1TB SATA 2.5" / Lots of M.2 NVMe SSDs
Display(s) Stock Dell 60 Hz IPS LCD panel / LG 27GL650F-B UltraGear 27" 1080p 144 Hz 1ms
Case Stock Dell G5 15 5587 (2018) body / NZXT H510i Matte White
Audio Device(s) Kingston HyperX Cloud II
Power Supply Dell Power Adapter 180W NDFTY / Corsair RMx Series RM750x
Mouse Dell touchpad / Razer Viper Ultimate
Keyboard Stock Dell G5 15 5587 laptop keyboard with backlit RED LEDs
Software Windows 10 Home 64-bit 1909 / Windows 10 Pro 64-bit 1909
@W1zzard Looks like your benchmarking is in line with what I'm getting on my 2080 Super (442.74) and RX 5700 XT (Pro 20.Q1.1). :rockout:

I'm sure if I were on the latest Adrenalin I'd see more FPS from the driver optimizations.
 
Joined
Jan 8, 2017
Messages
4,842 (4.09/day)
System Name Good enough
Processor AMD Ryzen R7 1700X - 4.0 Ghz / 1.350V
Motherboard ASRock B450M Pro4
Cooling Scythe Katana 4 - 3x 120mm case fans
Memory 16GB - Corsair Vengeance LPX
Video Card(s) OEM Dell GTX 1080
Storage 1x Samsung 850 EVO 250GB , 1x Samsung 860 EVO 500GB
Display(s) 4K Samsung TV
Case Zalman R1
Power Supply 500W
Floating Point + Integer calculations

so you can do 1.0 + 1.0 = 2.0 at the same time as 1 + 1 = 2, effectively running two operations at the same time, for each GPU core. If game (or driver) code is properly crafted to optimize for that capability you can gain a lot of performance. That's why most recent games run much better on Turing.
Just for clarification, it's not that for each floating point operation Turing can also do an integer operation; it's that the two can execute concurrently within the same clock cycle. Before, the scheduling logic was simpler and allowed either floating point or integer computations within one clock cycle.

To be fair, I reckon the real-world gain in performance from this is modest, because usually after one clock cycle of doing something floating-point related you probably had to compute a set of addresses in the next clock cycle anyway, which is why they never bothered with this until now.
 
Joined
Nov 24, 2017
Messages
660 (0.76/day)
Location
Asia
Just not part of my benchmarking routine, didn't think they were that popular. Let me see if I can get some runs in for those.
According to the Steam survey, the GTX 1650 alone is more popular than the RX 570, even the RX 580. So it deserves a place in the benchmark charts.

As I see it, there's a 19-21% gap between the RTX 2080 and GTX 1080 Ti. How? It looks like Nvidia is breaking the GTX 1080 Ti through its drivers.
Because Turing is the first Nvidia architecture to fully support low-level APIs like D3D12/Vulkan, and as a result it doesn't have the performance penalty that Maxwell/Pascal do.
 
Joined
Apr 8, 2010
Messages
450 (0.12/day)
Processor Intel Core i5 8400
Motherboard Gigabyte Z370N-Wifi
Cooling Silverstone AR05
Memory Micron Crucial 16GB DDR4-2400
Video Card(s) Gigabyte GTX1080 G1 Gaming 8G
Storage Micron Crucial MX300 275GB
Display(s) Dell U2415
Case Silverstone RVZ02B
Power Supply Silverstone SSR-SX550
Keyboard Ducky One Red Switch
Software Windows 10 Pro 1909
Seems... optimized? For 1920x1200 60Hz, it looks like my GTX 1080 will be good enough for a long time yet.
 
Joined
Mar 24, 2012
Messages
358 (0.12/day)
As I see it, there's a 19-21% gap between the RTX 2080 and GTX 1080 Ti. How? It looks like Nvidia is breaking the GTX 1080 Ti through its drivers.
Turing is better at the id Tech engine than Pascal; this is not that surprising. You can see a similar performance difference with DOOM 2016 as well.
 

ARF

Joined
Jan 28, 2020
Messages
447 (6.30/day)
System Name ARF System 1 | Portable 1
Processor AMD Athlon 64 4400+ X2 | AMD Ryzen 5 2500U
Motherboard ASRock 939A790GMH 790GX |
Cooling Arctic Freezer 13 | Dual-fan, dual heat-pipe Acer inbuilt
Memory 4 x 1GB DDR-400 | 2 x 8GB DDR4-2400
Video Card(s) Radeon ASUS EAH4670/DI/512MD3 | Radeon RX 560X 4G & Vega 8
Storage ADATA XPG SX900 128GB | Western Digital Blue 500GB
Display(s) LG 24UD58-B
Case Cooler Master HAF 912 Plus | 15-inch notebook chassis
Mouse Genius NetScroll 100X | Genius NetScroll 100X
Keyboard | Logitech Wave
Software Windows 7U SP1| Windows 10 Pro 1909
The RX 5700 XT is 10% faster than the Radeon VII at 1920x1080,
while the Radeon VII is 5% faster than the RX 5700 XT at 3840x2160.

:eek:

[attached charts: relative performance at 1920x1080 and 3840x2160]
 