
GPU Test System Update May 2019

Joined
Oct 12, 2009
Messages
319 (0.06/day)
Location
Zagreb, Croatia
Processor Core i5 4460
Motherboard Gigabyte Z97-D3H
Cooling Zalman CNPS10X Optima
Memory 1x 8GB DDR3 @ 900Mhz
Video Card(s) Gigabyte GTX1660 6GB OC
Storage Patriot Blast 240GB SSD; Caviar Black 500 GB; Caviar Green 1 TB
Display(s) Dell U2311H (23'' IPS)
Power Supply FSP Hyper 700
Mouse Sharkoon Fireglider
Software Win 10 Pro 64-bit
Personal opinion: I don't much care about my FPS in Civilization games in general. As long as I can play it, FPS matters very little.

So I'd vote to remove Civ 6.
 
Joined
May 15, 2014
Messages
235 (0.06/day)
What about framerate consistency and latency, the most important aspects of a gaming card making a game "feel smooth"?
I think it's long overdue.

Yep. GN frametime plots are killer. Bust out fcat @W1zz.
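For anyone wondering what those consistency numbers actually mean, here's a minimal sketch of how average FPS, 99th-percentile frame time and "1% lows" can be derived from a per-frame timing log. The file name and format are assumptions for illustration (one frame time in milliseconds per line), not the output of FCAT, PresentMon or any specific capture tool, and reviewers do vary slightly in how they define the 1% low figure.

```python
# Minimal sketch: frame-time consistency metrics from a per-frame timing log.
# Assumes "frametimes.txt" holds one frame time in milliseconds per line
# (hypothetical file, not the output format of any specific capture tool).
import statistics

def load_frametimes(path):
    with open(path) as f:
        return [float(line) for line in f if line.strip()]

def frametime_metrics(ft_ms):
    ft_sorted = sorted(ft_ms)                        # ascending: fastest frames first
    avg_fps = 1000.0 / statistics.mean(ft_ms)        # average FPS over the whole run
    # 99th-percentile frame time: 99% of frames rendered at least this quickly.
    p99_ms = ft_sorted[int(0.99 * len(ft_sorted)) - 1]
    # "1% low" FPS here = average FPS of the slowest 1% of frames
    # (definitions vary slightly between reviewers).
    slowest = ft_sorted[int(0.99 * len(ft_sorted)):]
    low_1pct_fps = 1000.0 / statistics.mean(slowest)
    return avg_fps, p99_ms, low_1pct_fps

if __name__ == "__main__":
    avg, p99, lows = frametime_metrics(load_frametimes("frametimes.txt"))
    print(f"avg {avg:.1f} FPS | 99th percentile frame time {p99:.2f} ms | 1% lows {lows:.1f} FPS")
```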

Witcher 3 is on CDPR's own REDEngine 3. Cyberpunk 2077 will be REDEngine 4 that will likely take Witcher 3's place in benchmarking.
Deus Ex Mankind Divided is on Dawn Engine which is modified Glacier Engine II - the same one that powers Hitman games, currently Hitman 2.

While we are at it, why not:
Ace Combat 7 - Unreal Engine 4
Anno 1800 - Anno Engine (?)
Assassin's Creed Odyssey - AnvilNext 2.0
Battlefield V - Frostbite 4
Civilization VI - Firaxis Engine (?)
Darksiders 3 - Unreal Engine 4
Devil May Cry 5 - RE Engine
Divinity Original Sin 2 - Divinity Engine 2
F1 2018 - EGO Engine 3.0
Far Cry 5 - Dunia 2
Hitman 2 - Glacier Engine II
Metro Exodus - 4A Engine
Monster Hunter World - MT Framework 2.x
Rage 2 - Apex Engine
Rainbow Six Siege - AnvilNext 2.0
Sekiro - PhyreEngine (?)
Shadow of the Tomb Raider - Foundation Engine
Shadow of War - LithTech Firebird
Strange Brigade - Asura Engine
Witcher 3 - REDEngine 3
Wolfenstein 2 - idTech 6

Good adds. The more games the better; otherwise you just pick your games & get the result you want...

Why don't you add World War Z? Swarm Engine supports DX11 and Vulkan.

Because:


Especially in DX12 / Vulkan titles, due to Turing having proper support for async compute, etc.

So TU no longer requires context switching for concurrent graphics & compute loads?

Moreover, Turing has a geometry performance advantage over Pascal as well.

Didn't know TU had >6 polymorph engines & substantially higher clocks than GP. We haven't seen mesh shader performance beyond demos but it does seem a tad better than Vega NGG/prim shader implementation at least. ;)
 
Last edited:
Joined
Sep 15, 2007
Messages
3,944 (0.65/day)
Location
Police/Nanny State of America
Processor OCed 5800X3D
Motherboard Asucks C6H
Cooling Air
Memory 32GB
Video Card(s) OCed 6800XT
Storage NVMees
Display(s) 32" Dull curved 1440
Case Freebie glass idk
Audio Device(s) Sennheiser
Power Supply Don't even remember
Just in time to redo with Zen 2
 
Joined
Oct 1, 2006
Messages
4,884 (0.76/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
So TU no longer requires context switching for concurrent graphics & compute loads?
Correct, Turing can finally run float and int ops concurrently.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Why don't you add World War Z? Swarm Engine supports DX11 and Vulkan.
This is the trick with selecting games.

Is there a good reason to add World War Z to the list of games being tested? This is a single game that very heavily leans towards AMD GPUs, which makes it an outlier. The Swarm engine seems to be a new version of the Saber3D engine that is not used in any other significant games. The only one that comes to mind is Quake Champions, where parts of it are used in conjunction with idTech 6.

TPU's list of tested games has always had outliers, but there have been valid reasons for including them.
- Despite leaning towards Nvidia cards, Anno is a fairly unique game and engine and a long-running series.
- Civilization, again, is a unique game and engine and a long-running series with a large, dedicated fanbase.
- GTA V is a significant game in its own right.
- Hitman: Absolution, Deus Ex: Mankind Divided, then Hitman and Hitman 2 are basically on the same engine, which also went for DX12 early on.
- Sniper Elite 4 had (and still has) one of the best DX12 implementations, one that benefits all GPUs across the board over DX11. Strange Brigade is on an improved version of the same engine.
- Unreal Engine is a touchy subject here as it tends to favor Nvidia cards by default, more so when the developer has not worked on optimizing the game - this can be clearly seen in the now-dropped Dragon Quest XI and in Darksiders, which is still on the list. Personally, I would have let Hellblade stay in its place, as it is (slightly) better optimized as well as a more remarkable game. However, Unreal Engine is important to have in the lineup, preferably with a couple of games, because it is a widespread and popular engine used by a lot of games.

On the other hand, as someone pointed out earlier, there are some popular engines missing from the list, mostly because their games have not been significant enough recently. CryEngine, for example: both Prey and Kingdom Come: Deliverance are on CryEngine 4, but neither has been significant enough to keep in the list. Unity has games, but no big AAA releases or anything that would resemble a graphical powerhouse - I have my favourite Unity games, as do many people, but there is not one I can think of that is a significant enough game and would provide useful performance results at the same time.
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
However, Unreal Engine is important to have in the lineup, preferably with a couple of games, because it is a widespread and popular engine used by a lot of games.
That, and it's the same reason we don't have any CryEngine titles in our bench anymore - seems nobody uses it for their games
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Looking at the recent CryEngine games, the only one that would make sense for benchmarking is Hunt: Showdown. This should be on CryEngine V but is still technically in early access.

Edit:
By making sense I mean the game is popular, although a bit niche, it looks good, and it needs a reasonably beefy PC to keep FPS high. No idea whether Hunt prefers one side or the other in terms of GPUs, but if things are done right, CryEngine should be pretty neutral.
 
Last edited:
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
@W1zzard, why did you drop Wildlands?
Was just getting old, same engine as AC:O
Before someone points out that Rainbow Six Siege is also on the same engine and is older - all of this is true - but R6 Siege keeps being updated, is popular, and is one of the very few competitive FPS games that can make use of as much hardware as you throw at it. This is also one of the few games where I would seriously argue that FPS differences in the 100-200 range, and perhaps above, do matter.
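To put some rough numbers behind the 100-200+ FPS argument: what actually changes is per-frame latency, and a quick back-of-the-envelope conversion shows the absolute gains shrink as FPS climbs (the FPS steps below are just illustrative points, not measured data):

```python
# Sketch: frame-time deltas behind the "does 100-200+ FPS matter" argument.
# Going from 100 to 200 FPS halves per-frame latency (10 ms -> 5 ms), but each
# further step buys progressively less absolute time.
fps_steps = [60, 100, 144, 200, 240]
frame_times = {fps: 1000.0 / fps for fps in fps_steps}

for lo, hi in zip(fps_steps, fps_steps[1:]):
    saved = frame_times[lo] - frame_times[hi]
    print(f"{lo:>3} -> {hi:>3} FPS: {frame_times[lo]:5.2f} ms -> {frame_times[hi]:5.2f} ms "
          f"(saves {saved:.2f} ms per frame)")
```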
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
but R6 Siege keeps being updated, is popular, and is one of the very few competitive FPS games that can make use of as much hardware as you throw at it. This is also one of the few games where I would seriously argue that FPS differences in the 100-200 range, and perhaps above, do matter.
Exactly why it's included. I was also thinking about adding DOTA 2 for this rebench, but decided against it - it would make the bench selection too eSports-heavy, I think
 

bug

Joined
May 22, 2015
Messages
13,229 (4.06/day)
Processor Intel i5-12600k
Motherboard Asus H670 TUF
Cooling Arctic Freezer 34
Memory 2x16GB DDR4 3600 G.Skill Ripjaws V
Video Card(s) EVGA GTX 1060 SC
Storage 500GB Samsung 970 EVO, 500GB Samsung 850 EVO, 1TB Crucial MX300 and 2TB Crucial MX500
Display(s) Dell U3219Q + HP ZR24w
Case Raijintek Thetis
Audio Device(s) Audioquest Dragonfly Red :D
Power Supply Seasonic 620W M12
Mouse Logitech G502 Proteus Core
Keyboard G.Skill KM780R
Software Arch Linux + Win10
I actually have a Vega 64 (1650 MHz @ 1.05 V) and it's a pretty fine graphics card. I don't quite understand why most manufacturers overvolt it like hell, since it can run at very low voltage with very nice efficiency.

The only logical explanation would be that not all chips are stable at the lower voltage.
Manufacturers don't overvolt it like hell; they simply follow AMD's specs. At least for reference designs.
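For some context on why undervolting pays off so well: dynamic power scales roughly with clock times voltage squared, so shaving voltage at a given clock cuts power much faster than it cuts performance. A rough first-order sketch, where the 1.20 V reference and the operating points are assumptions for illustration rather than measured Vega 64 figures:

```python
# Rough first-order model: dynamic GPU power scales roughly with f * V^2.
# The operating points and the 1.20 V reference below are illustrative
# assumptions, not measured Vega 64 figures.
def relative_power(freq_mhz, volts, ref_freq_mhz=1546.0, ref_volts=1.20):
    """Power relative to the reference point, assuming P ~ f * V^2."""
    return (freq_mhz / ref_freq_mhz) * (volts / ref_volts) ** 2

stock = relative_power(1546, 1.20)        # reference point -> 1.0
undervolted = relative_power(1650, 1.05)  # the 1650 MHz @ 1.05 V point from above

print(f"undervolted point: ~{undervolted / stock:.0%} of stock dynamic power")
# -> roughly 82% of stock power despite a ~7% higher clock, which is why a chip
#    that happens to be stable at low voltage ends up both faster and more efficient.
```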
 
Joined
Mar 10, 2014
Messages
1,793 (0.48/day)
Looking at the recent CryEngine games, the only one that would make sense for benchmarking is Hunt: Showdown. This should be on CryEngine V but is still technically in early access.

Edit:
By making sense I mean the game is popular, although a bit niche, it looks good, and it needs a reasonably beefy PC to keep FPS high. No idea whether Hunt prefers one side or the other in terms of GPUs, but if things are done right, CryEngine should be pretty neutral.

Plus, it's the game made by Crytek themselves.
 
Joined
Oct 1, 2006
Messages
4,884 (0.76/day)
Location
Hong Kong
Processor Core i7-12700k
Motherboard Z690 Aero G D4
Cooling Custom loop water, 3x 420 Rad
Video Card(s) RX 7900 XTX Phantom Gaming
Storage Plextor M10P 2TB
Display(s) InnoCN 27M2V
Case Thermaltake Level 20 XT
Audio Device(s) Soundblaster AE-5 Plus
Power Supply FSP Aurum PT 1200W
Software Windows 11 Pro 64-bit
So, you think the ability to pack both int & fp is "Async-Compute"?
This explains the details much better than I do.
https://www.techpowerup.com/reviews/NVIDIA/GeForce_RTX_2080_Ti_Founders_Edition/2.html
What has changed, however, is that the Streaming Multiprocessor (SM), the indivisible sub-unit of the GPU, now packs CUDA cores, RT cores, and Tensor cores, orchestrated by a new Warp Scheduler that supports concurrent INT and FP32 ops, which should improve the GPU's asynchronous compute performance.


https://techgage.com/article/tracing-the-groundwork-of-nvidias-turing-architecture/
Asynchronous Compute
In what will be a big boon to a number of games, and less so to others, is a restructuring in how Turing handles asynchronous compute. NVIDIA has historically had issues with mixed workloads, long before Pascal, and it was a sore point even when it was introduced with Kepler and somewhat addressed with Maxwell.


When having to switch between graphics processing and compute, such as with physics calculations or various shaders, the GPU would have to switch modes, and this incurred quite a substantial performance loss due to the latency in the switch. Pascal made this switch almost seamless, but it was still there.

Turing on the other-hand, now finally allows concurrent execution of both integer and floating point math. Better still, they can now share the same cache. On paper, this results in a 50% boost in performance per CUDA core in mixed workloads. How this translates into real-world performance will be very different, but we expect to see quite a few games gain tangible improvements from this alone.

Games like Ashes of the Singularity will respond well to the changes, as well as games with mixed workloads, notably those that make use of DX12, but this is something we won’t see the full effect of until later. Generally though, a lot of games will see some kind of uptick in performance just from this change alone.
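A toy model of where that per-CUDA-core gain in mixed workloads comes from: if a shader stream interleaves integer address/logic math with FP32 work, a design that has to serialise the two spends extra issue slots that a concurrent INT+FP32 design can hide. The instruction mix below is an assumed illustrative ratio, not a figure taken from the Turing whitepaper:

```python
# Toy model: cycles needed to issue a mixed INT/FP32 instruction stream when the
# two must serialise on one datapath vs. when INT can issue alongside FP32.
# The 36:100 mix is an assumed illustrative ratio, not a measured figure.
fp_ops = 100
int_ops = 36

serialized_cycles = fp_ops + int_ops      # one op per issue slot, shared datapath
concurrent_cycles = max(fp_ops, int_ops)  # INT ops hidden behind FP32 issue

speedup = serialized_cycles / concurrent_cycles
print(f"ideal per-core speedup from concurrent INT+FP32 issue: {speedup:.2f}x")
# -> 1.36x for this mix; real-world gains depend on the game's actual instruction
#    mix and on how often the INT path would otherwise have stalled FP32 work.
```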
 
Last edited:
Joined
Mar 10, 2014
Messages
1,793 (0.48/day)
@W1zzard, are you still using the DX12 renderer on Strange Brigade? After all the updates, I'm kind of curious how graphics cards stack up with Vulkan these days on that title. For example, Babeltech's Nvidia driver update review shows Vulkan now running circles around DirectX (though I'm sure they have a typo with DX11, which the game does not support).
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
are you still using the DX12 renderer on Strange Brigade?
Yes. Given how similar both APIs are, I wouldn't expect any meaningful difference in anything
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
Yes. Given how similar both APIs are, I wouldn't expect any meaningful difference in anything
Strange Brigade performance in terms of the Vulkan/DX12 difference has swayed a little to one side or the other over time (well, over patches and driver versions), but in the big picture, the current state of the game should be DX12 performing slightly better than Vulkan across the board.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
If Deus Ex Mankind Divided was dropped because it was "too old" even though it was DX12, I'd say the same for Civ VI.
Only because they basically have a newer game with the same engine.

Witcher 3 is an epic game but it's too old now, released in 2015.
It still taxes a number of GPUs, especially at 2560x1440. Plus, as W1z said earlier, it's a very popular review page for people.
 
Last edited:

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
If Deus Ex Mankind Divided was dropped because it was "too old" even though it was DX12, I'd say the same for Civ VI.
Civ 6 got a big update just a few months ago
 
Joined
Jul 24, 2009
Messages
1,002 (0.19/day)
Counterpoint - I had a Vega 64. It was having serious trouble reaching the spec 1546 MHz boost clock. When overclocked and undervolted, I still could not get 1600 MHz out of it no matter what I did. It was not the power limit, and higher voltages simply did not help. And that was at 300+ W power consumption. A blower, but for testing I ran it at 100%, which did take care of temperatures. It was nice and efficient at low voltages, but it was never fast at the same time. I suppose there would be a way to get it a little higher today with what we have learned, but that took months or close to a year to mature.

This seems to be why Vegas are overvolted - the variability is huge. For every nice one that can do 1700 MHz under water and extreme tuning, there is a dud like mine apparently was.

Yeah, I understand that. Every piece is different. Mine can run a lot higher than that, it would just need better cooling. Also, it's guaranteed to run at 1650 MHz, which is probably the reason why I had no issues. It actually ran slightly faster out of the box, but I didn't like the temps, so I went onwards with tweaking.

It has a lot of limits - temps, power consumption, how the VRM is handled - and it's not just about raising voltages, but about raising or lowering the right ones. Kinda tricky thing, I must say. Probably not even worth the time I spent on it, given I don't actually use it. :D
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
@W1zzard, for Power Consumption (Average and Gaming), Metro Last Light isn't part of your current gaming list. Wouldn't using a game on the list be more representative? Moving to Metro Exodus, you could also use it to gauge DX11, DX12 & RTX power differences if warranted in the future.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
@W1zzard, for Power Consumption (Average and Gaming), Metro Last Light isn't part of your current gaming list. Wouldn't using a game on the list be more representative? Moving to Metro Exodus, you could also use it to gauge DX11, DX12 & RTX power differences if warranted in the future.
I second that. Metro Exodus is indeed a big power draw game.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
@W1zzard, for Power Consumption (Average and Gaming), Metro Last Light isn't part of your current gaming list. Wouldn't using a game on the list be more representative? Moving to Metro Exodus, you could also use it to gauge DX11, DX12 & RTX power differences if warranted in the future.
The great thing about Metro Last Light is that it has varying power consumption, not just full max all the time, so it also acts as a test to see how well GPUs operate at slightly below full load. That's why we have avg/max.
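For what it's worth, the avg/max split is easy to reproduce from any sampled power log; a minimal sketch, assuming a plain text file with one wattage reading per line taken at a fixed interval during the run (file name and sampling setup are hypothetical):

```python
# Minimal sketch: average vs. maximum board power from a sampled power log.
# Assumes "power_samples.txt" holds one wattage reading per line, captured at a
# fixed interval during the benchmark run (file name and setup are hypothetical).
def power_summary(path):
    with open(path) as f:
        samples = [float(line) for line in f if line.strip()]
    avg_w = sum(samples) / len(samples)   # pulled down by scenes below full load
    max_w = max(samples)                  # the worst-case draw the PSU must handle
    return avg_w, max_w

avg_w, max_w = power_summary("power_samples.txt")
print(f"average {avg_w:.0f} W, maximum {max_w:.0f} W")
```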
 
Joined
Aug 13, 2010
Messages
5,386 (1.08/day)
I'm a firm believer in quality over quantity.

I would, 200% and without a doubt, prefer cutting the number of tests/games by half just to have a Zen 2 system on board.
 