
Sniper Elite 4: Performance Analysis

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,028 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
We test Rebellion's new shooter on 10 modern graphics cards, with the latest game-optimized drivers from AMD and NVIDIA. A surprise is that AMD gains up to 27% performance from switching to DX12 with Async Compute, while the performance uplift for NVIDIA users is only up to 5%.

 
Joined
Oct 16, 2012
Messages
734 (0.17/day)
Location
Malaysia
System Name Cypher-C4
Processor Ryzen 9 5900X
Motherboard MSI B450 Tomahawk Max
Cooling Deepcool AK620
Memory 32GB Crucial Ballistix Elite 3200MHz
Video Card(s) Palit RTX 3060 Ti Dual
Storage 250GB Crucial MX500 + 2TB Crucial MX500 + 2TB WD Black + 4TB Toshiba + 1TB Samsung F3
Display(s) Acer XV272UP
Case Corsair Obsidian 750D
Audio Device(s) Behringher UMC202HD Audio Interface + Mackie HM-4 + Sennheiser HD 280 Pro + Shure SM58
Power Supply Corsair HX750i
Mouse Steelseries Rival 310
Keyboard Keychron K8 + Kailh BOX Crystal Jade switches + Ducky Good in Blue keycaps
Software Windows 10 Pro 64-bit
Probably not a very popular opinion, but if you tweak the settings a little, I think the GTX 1060 and RX 480 would be more than adequate for 1440p at 60 fps (avg). Some people seem to think those two GPUs are strictly for resolutions up to 1080p, but I very much disagree. It really boils down to what the user wants in the end.

Note: I'm not saying you're wrong in your conclusion, @W1zzard. You clearly state that the GPUs I mentioned are for THE highest detail settings at 1080p. :)
 
Joined
Dec 22, 2011
Messages
3,890 (0.86/day)
Processor AMD Ryzen 7 3700X
Motherboard MSI MAG B550 TOMAHAWK
Cooling AMD Wraith Prism
Memory Team Group Dark Pro 8Pack Edition 3600Mhz CL16
Video Card(s) NVIDIA GeForce RTX 3080 FE
Storage Kingston A2000 1TB + Seagate HDD workhorse
Display(s) Samsung 50" QN94A Neo QLED
Case Antec 1200
Power Supply Seasonic Focus GX-850
Mouse Razer Deathadder Chroma
Keyboard Logitech UltraX
Software Windows 11
With the gains AMD shows on DX12, I assume the DX11 performance graphs were too horrible to show?
 
Joined
Apr 30, 2012
Messages
3,881 (0.89/day)
Either there is a typo in the RX 480 FPS number, or it should switch places with the GTX 1060.

 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,028 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Either there is a typo in the RX 480 FPS number, or it should switch places with the GTX 1060.


Oh, I manually sort them and missed that one. Will fix tomorrow morning
 

VSG

Editor, Reviews & News
Staff member
Joined
Jul 1, 2014
Messages
3,462 (0.97/day)
Do you happen to know what the standard deviation was for the FPS results? Perhaps not for all the cards and all resolutions, but some idea would tell a nice story.
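The spread being asked about here is just the sample standard deviation across repeated benchmark passes. A minimal sketch of how that story would be told, with made-up FPS numbers (illustrative values only, not TPU's actual data):

```python
from statistics import mean, stdev

# Hypothetical FPS results from five repeated benchmark passes (illustrative only)
runs = [92.4, 93.1, 91.8, 92.9, 92.6]

avg = mean(runs)
sd = stdev(runs)       # sample standard deviation (n-1 denominator)
cv = sd / avg * 100    # coefficient of variation, as % of the average

print(f"avg={avg:.1f} fps, stdev={sd:.2f} fps, spread={cv:.2f}% of avg")
```

A coefficient of variation under 1% would suggest very repeatable runs; a few percent would meaningfully blur the gaps between adjacent cards in the charts.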
 
Joined
Jun 11, 2015
Messages
34 (0.01/day)
System Name main rig
Processor 1600X @4.05ghz
Motherboard ASUS B350 Prime PLUS
Cooling Corsair H60
Memory Corsair 16 GB (dual ch.)
Video Card(s) Pitcairn XT
Storage 500 GB SSD, 12 TB RAID0
Display(s) UltraGear 27GL850-B w/ Freesync
Case Antec P193
Audio Device(s) ASUS Xonar
Power Supply Corsair 650W
Mouse MX518
Keyboard daskeyboard s /w cherry red switches
Software Fedora Rawhide
Hi!

Nice review, although... what were the clock speed readouts of the cards during the run?
You list the CPU clock, but the GPU clock isn't mentioned anywhere, which is quite important to know in a graphics benchmark. Even if the cards are stock, it would be important to know WHICH stock clock it is.

Thanks!
 
Joined
Mar 18, 2008
Messages
5,717 (0.97/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
W1zzard said:
The only exception is the R9 Fury X which does extremely well, beating the GTX 1070 at 4K, sitting right between GTX 1070 and 1080. What makes this victory even more memorable is that Fury X "only" has 4 GB of VRAM. At least 4 GB of VRAM seems to be the ideal memory configuration for Sniper Elite 4, to run at highest details. Our memory usage testing reveals that the game doesn't use excess memory and today's gaming cards won't run into trouble, memory capacity-wise.

I am soooooooo happy that I bought my lovely Fury X. Hopefully, with more Async Compute-enabled titles, all 4096 SPs will be utilized better in the future.
 
Joined
Mar 18, 2008
Messages
5,717 (0.97/day)
System Name Virtual Reality / Bioinformatics
Processor Undead CPU
Motherboard Undead TUF X99
Cooling Noctua NH-D15
Memory GSkill 128GB DDR4-3000
Video Card(s) EVGA RTX 3090 FTW3 Ultra
Storage Samsung 960 Pro 1TB + 860 EVO 2TB + WD Black 5TB
Display(s) 32'' 4K Dell
Case Fractal Design R5
Audio Device(s) BOSE 2.0
Power Supply Seasonic 850watt
Mouse Logitech Master MX
Keyboard Corsair K70 Cherry MX Blue
VR HMD HTC Vive + Oculus Quest 2
Software Windows 10 P
Also an interesting note regarding HBM VRAM: at recent AMD Vega events, they seem to have stopped calling the HBM "VRAM" and instead call it HBC (High Bandwidth Cache). So I can only assume that somewhere down the pipeline, AMD will have a card with ~8 GB of HBM as cache and >16 GB of GDDR5/GDDR5X/GDDR6 as VRAM. That would be very interesting.
 
Joined
Mar 24, 2012
Messages
528 (0.12/day)
Also an interesting note regarding HBM VRAM: at recent AMD Vega events, they seem to have stopped calling the HBM "VRAM" and instead call it HBC (High Bandwidth Cache). So I can only assume that somewhere down the pipeline, AMD will have a card with ~8 GB of HBM as cache and >16 GB of GDDR5/GDDR5X/GDDR6 as VRAM. That would be very interesting.

Will they mix HBM and GDDR memory? Isn't that going to defeat the purpose of going HBM in the first place? AFAIK, what AMD intends to do is something different: high-capacity HBM is very expensive, so they want games to reduce VRAM usage through clever use of the HBC. But we don't know how transparent this is to the game and the operating system. Do game developers need to code for it specifically? Does it need to be included in the 3D API spec? Just look at async itself: AMD has had the hardware since the very first GCN, but ended up leaving it idle because DX11 couldn't access the feature.
 

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,028 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
Hi!

Nice review, although... what were the clock speed readouts of the cards during the run?
You list the CPU clock, but the GPU clock isn't mentioned anywhere, which is quite important to know in a graphics benchmark. Even if the cards are stock, it would be important to know WHICH stock clock it is.

Thanks!
we've always used reference design cards at reference clocks for all GPU related articles
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
with pascal and gpu boost 3.0, reference clocks don't necessarily mean all that much :)
 
Joined
Jul 4, 2015
Messages
197 (0.06/day)
Processor Intel I7 6700
Motherboard Msi Z170i Pro Gaming AC
Cooling Be Quiet Shadow Rock LP
Memory Corsair LPX 16GB
Video Card(s) Gigabyte 980ti Extreme Gaming W3
Storage Samsung Evo 850 500GB + 250GB
Display(s) 3x Dell Ultrasharp U2515H
Case Ncase M1
Power Supply Sharkoon Silentstorm SFX
Mouse Logitech MX Master, Steelseries Rival 300
Keyboard Corsair K65RGB
Software Win 10
with pascal and gpu boost 3.0, reference clocks don't necessarily mean all that much :)

I would guess that most people buy cards and run them stock. A fair number of people also buy the reference design.
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
what i meant was that gpu boost 3.0 lets the clock increase a lot, provided the power or temperature limit is not reached.

for example, the gtx 1080 reference clock is 1607 mhz and the boost clock is 1733 mhz.
- founders edition gtx 1080 cards (depending on temperature inside the case) generally average a little above the boost clock.
- aib gtx 1080s with open-air coolers (again, depending on case airflow, even more so than fe cards) will usually average considerably higher.

specific example: my gainward phoenix with its annoyingly huge 2.5-slot cooler averages around 1850 mhz inside a somewhat warm case and stays at 1885 mhz outside a case. slapping a full-cover waterblock on the same card (with max 40-45 c temperatures at full load) results in the clock being pinned at 1885 mhz under load. 1885 mhz seems to be the highest boost bin in this case.

all of this is at reference clocks, with cooling being the only difference.
tpu's review of the fe gtx 1080 had an average clock of 1785 mhz. 1885 is 5.6% over that and 8.8% over the 1733 mhz boost clock. these are not negligible differences.

this behaviour is currently unique to pascal and gpu boost 3.0. yes, previous-gen cards and amd cards have varying degrees of clock boost, but not to this degree.
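the percentages quoted above are plain ratios of the observed average clocks; a quick sanity check using the numbers from the post:

```python
def uplift(new_mhz: float, base_mhz: float) -> float:
    """Percentage increase of new_mhz over base_mhz."""
    return (new_mhz / base_mhz - 1) * 100

# Clocks quoted in the post: rated boost, TPU FE-review average, water-cooled bin
boost, fe_avg, wc = 1733, 1785, 1885

print(f"{uplift(wc, fe_avg):.1f}% over the FE average")   # ~5.6%
print(f"{uplift(wc, boost):.1f}% over the rated boost")   # ~8.8%
```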
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
what i meant was that gpu boost 3.0 lets the clock increase a lot, provided the power or temperature limit is not reached.

for example, the gtx 1080 reference clock is 1607 mhz and the boost clock is 1733 mhz.
- founders edition gtx 1080 cards (depending on temperature inside the case) generally average a little above the boost clock.
- aib gtx 1080s with open-air coolers (again, depending on case airflow, even more so than fe cards) will usually average considerably higher.

specific example: my gainward phoenix with its annoyingly huge 2.5-slot cooler averages around 1850 mhz inside a somewhat warm case and stays at 1885 mhz outside a case. slapping a full-cover waterblock on the same card (with max 40-45 c temperatures at full load) results in the clock being pinned at 1885 mhz under load. 1885 mhz seems to be the highest boost bin in this case.

all of this is at reference clocks, with cooling being the only difference.
tpu's review of the fe gtx 1080 had an average clock of 1785 mhz. 1885 is 5.6% over that and 8.8% over the 1733 mhz boost clock. these are not negligible differences.

this behaviour is currently unique to pascal and gpu boost 3.0. yes, previous-gen cards and amd cards have varying degrees of clock boost, but not to this degree.

You got the reference clocks part right and focused on that, while mentioning different coolers. Apparently you missed the part where W1zzard also said reference design, which in the case of Pascal would be FE models.
 
Joined
Dec 22, 2011
Messages
286 (0.06/day)
Processor Ryzen 7 5800X3D
Motherboard Asus Prime X570 Pro
Cooling Deepcool LS-720
Memory 32 GB (4x 8GB) DDR4-3600 CL16
Video Card(s) Gigabyte Radeon RX 6800 XT Gaming OC
Storage Samsung PM9A1 (980 Pro OEM) + 960 Evo NVMe SSD + 830 SATA SSD + Toshiba & WD HDD's
Display(s) Samsung C32HG70
Case Lian Li O11D Evo
Audio Device(s) Sound Blaster Zx
Power Supply Seasonic 750W Focus+ Platinum
Mouse Logitech G703 Lightspeed
Keyboard SteelSeries Apex Pro
Software Windows 11 Pro
Also an interesting note regarding HBM VRAM: at recent AMD Vega events, they seem to have stopped calling the HBM "VRAM" and instead call it HBC (High Bandwidth Cache). So I can only assume that somewhere down the pipeline, AMD will have a card with ~8 GB of HBM as cache and >16 GB of GDDR5/GDDR5X/GDDR6 as VRAM. That would be very interesting.
Nope.
AMD's plan, if it works, is to keep only the data you actually need on the video card, rather than all the data the game wants to allocate (there can be gigabytes of difference between allocated memory and actually-used memory). HBM2's role in this scenario is to hold the data that's actively being accessed; the rest should reside in your system RAM, NVRAM, SSD, HDD, heck, even network storage is supported. Future Radeon Pro SSG models will probably benefit from this even more, as they house SSDs of their own. Vega supports a 512 TB virtual address space to accommodate all this.
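Conceptually, "keep only the hot data in HBM" is an LRU-managed cache in front of a larger, slower backing store. A toy sketch of the idea (the class name, page counts, and data are invented for illustration; this is not AMD's actual implementation):

```python
from collections import OrderedDict

class HighBandwidthCache:
    """Toy LRU cache: a small fast pool backed by a large slow store."""
    def __init__(self, capacity_pages: int):
        self.capacity = capacity_pages
        self.pages = OrderedDict()  # page_id -> data, ordered by recency

    def access(self, page_id, backing_store):
        if page_id in self.pages:                 # hit: already in fast memory
            self.pages.move_to_end(page_id)
        else:                                     # miss: fetch from slow store
            if len(self.pages) >= self.capacity:
                self.pages.popitem(last=False)    # evict least-recently-used page
            self.pages[page_id] = backing_store[page_id]
        return self.pages[page_id]

# 4-page "HBM" in front of a 16-page "system RAM / SSD" store
store = {i: f"data{i}" for i in range(16)}
hbc = HighBandwidthCache(capacity_pages=4)
for pid in [0, 1, 2, 3, 0, 4]:   # page 4 evicts page 1, the coldest entry
    hbc.access(pid, store)
print(list(hbc.pages))           # → [2, 3, 0, 4]
```

The open questions in the thread map onto who runs this eviction loop: the driver transparently, the OS, or the game via an API.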
 
Joined
Feb 3, 2017
Messages
3,481 (1.32/day)
Processor R5 5600X
Motherboard ASUS ROG STRIX B550-I GAMING
Cooling Alpenföhn Black Ridge
Memory 2*16GB DDR4-2666 VLP @3800
Video Card(s) EVGA Geforce RTX 3080 XC3
Storage 1TB Samsung 970 Pro, 2TB Intel 660p
Display(s) ASUS PG279Q, Eizo EV2736W
Case Dan Cases A4-SFX
Power Supply Corsair SF600
Mouse Corsair Ironclaw Wireless RGB
Keyboard Corsair K60
VR HMD HTC Vive
You got the reference clocks part right and focused on that, while mentioning different coolers. Apparently you missed the part where W1zzard also said reference design, which in the case of Pascal would be FE models.
my bad, you're probably right.

Maybe something to do with DX12 using more cores in the CPU.... I dunno, I'm not an expert.
dx12 is much more than just async compute.
that german article actually shows this very clearly: turning async compute on gives only a small performance boost, even on the usual suspects for a good boost (fury x, 390x), while dx12 in general gives a much larger boost on all cards.
 

rtwjunkie

PC Gaming Enthusiast
Supporter
Joined
Jul 25, 2008
Messages
13,909 (2.42/day)
Location
Louisiana -Laissez les bons temps rouler!
System Name Bayou Phantom
Processor Core i7-8700k 4.4Ghz @ 1.18v
Motherboard ASRock Z390 Phantom Gaming 6
Cooling All air: 2x140mm Fractal exhaust; 3x 140mm Cougar Intake; Enermax T40F Black CPU cooler
Memory 2x 16GB Mushkin Redline DDR-4 3200
Video Card(s) EVGA RTX 2080 Ti Xc
Storage 1x 500 MX500 SSD; 2x 6TB WD Black; 1x 4TB WD Black; 1x400GB VelRptr; 1x 4TB WD Blue storage (eSATA)
Display(s) HP 27q 27" IPS @ 2560 x 1440
Case Fractal Design Define R4 Black w/Titanium front -windowed
Audio Device(s) Soundblaster Z
Power Supply Seasonic X-850
Mouse Coolermaster Sentinel III (large palm grip!)
Keyboard Logitech G610 Orion mechanical (Cherry Brown switches)
Software Windows 10 Pro 64-bit (Start10 & Fences 3.0 installed)
Doooh, this game looks like one of the Call of Duty titles from back in 2006, yet it needs a better-than-average rig to run at max details. I'll pass.
Oh no, it's extremely good looking! I actually was not prepared for the increase in visual detail and fidelity beyond where it was in SE3.

Also, the maps are huge, the AI is a lot better, and I can go almost anywhere I want. Excellent for finding those perfect sniper nests! :D
 
Joined
Apr 30, 2011
Messages
2,651 (0.56/day)
Location
Greece
Processor AMD Ryzen 5 5600@80W
Motherboard MSI B550 Tomahawk
Cooling ZALMAN CNPS9X OPTIMA
Memory 2*8GB PATRIOT PVS416G400C9K@3733MT_C16
Video Card(s) Sapphire Radeon RX 6750 XT Pulse 12GB
Storage Sandisk SSD 128GB, Kingston A2000 NVMe 1TB, Samsung F1 1TB, WD Black 10TB
Display(s) AOC 27G2U/BK IPS 144Hz
Case SHARKOON M25-W 7.1 BLACK
Audio Device(s) Realtek 7.1 onboard
Power Supply Seasonic Core GC 500W
Mouse Sharkoon SHARK Force Black
Keyboard Trust GXT280
Software Win 7 Ultimate 64bit/Win 10 pro 64bit/Manjaro Linux
Great implementation of DX12 that helps the Fury X show off its power. Being close to the 1080 at 4K with half its VRAM and almost half its clock speed is great imho, and shows that GCN is as future-proof as it gets. Vega will probably be too good to be true on DX12 and Vulkan.
 
Joined
Dec 28, 2012
Messages
3,475 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabarent rocket 4.0 2TB, MX 500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
Great implementation of DX12 that helps the Fury X show off its power. Being close to the 1080 at 4K with half its VRAM and almost half its clock speed is great imho, and shows that GCN is as future-proof as it gets. Vega will probably be too good to be true on DX12 and Vulkan.
While I agree, let's wait for AMD to actually deliver Vega first before getting all excited about it.
 
Joined
Oct 2, 2005
Messages
3,059 (0.45/day)
Location
Baltimore MD
Processor Ryzen 5900X
Motherboard ASUS Prime X470 Pro
Cooling Arctic liquid freezer II 240
Memory 2 x 16 Gb Gskill Trident Z 3600 Mhz
Video Card(s) MSI Ventus 3060 Ti OC
Storage Samsung 960 EVO 500 Gb / 860 EVO 1 Tb
Display(s) Dell S2719DGF
Case Lian Li Lancool II Mesh
Audio Device(s) Soundblaster Z
Power Supply Corsair RM850x
Mouse Logitech G703
Keyboard Logitech G513
Software Win 11
Where are the DX11 benchmark numbers, though? Without them the chart is meaningless.
If AMD cards are pulling 40 fps in DX11 and 60 fps in DX12, but NVIDIA was getting 55 fps in DX11 and 60 fps in DX12, that makes the 27% moot.
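The hypothetical figures above make the point directly: a big DX12 gain can simply mean a weak DX11 baseline. A quick sketch using those same made-up numbers:

```python
def gain_pct(dx11_fps: float, dx12_fps: float) -> float:
    """Relative DX11 -> DX12 uplift in percent."""
    return (dx12_fps / dx11_fps - 1) * 100

# Hypothetical numbers from the post, not measured results
amd_gain = gain_pct(40, 60)   # +50%: big uplift from a weak DX11 baseline
nv_gain = gain_pct(55, 60)    # ~+9%: small uplift, but the same 60 fps endpoint

print(f"AMD +{amd_gain:.0f}%, NVIDIA +{nv_gain:.0f}%, both end at 60 fps")
```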
 