
Sniper Elite 4: Performance Analysis

Where are the DX11 benchmark numbers though? Without them the chart is meaningless.
If AMD cards are pulling 40 fps in DX11 and 60 fps in DX12, but Nvidia was getting 55 fps in DX11 and 60 fps in DX12, that makes the 27% moot.

They are not really needed since all cards run better on DX12. You can however calculate everything with the help of the DX11 vs DX12 graph. Just divide the DX12 results by approx. 1.18 for AMD (1.22 for Fury) and 1.03 for nVidia.
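A minimal sketch of that back-calculation in Python, assuming the ~1.18 / 1.22 / 1.03 divisors quoted above; the fps figures in the example are made-up placeholders, not review numbers:

```python
# Estimate DX11 fps from published DX12 results using the approximate
# uplift factors mentioned above (DX12 is faster, so we divide).
DX12_UPLIFT = {
    "amd": 1.18,      # typical GCN cards
    "fury": 1.22,     # Fury / Fury X
    "nvidia": 1.03,   # GeForce cards
}

def estimate_dx11_fps(dx12_fps: float, vendor: str) -> float:
    """Divide the DX12 result by the vendor's approximate DX12 uplift."""
    return dx12_fps / DX12_UPLIFT[vendor]

if __name__ == "__main__":
    # Hypothetical DX12 results, purely to show the arithmetic.
    for card, fps, vendor in [("RX 480", 75.0, "amd"),
                              ("Fury X", 95.0, "fury"),
                              ("GTX 1060", 70.0, "nvidia")]:
        print(f"{card}: ~{estimate_dx11_fps(fps, vendor):.1f} fps estimated in DX11")
```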
 
91.8 FPS on a GeForce GTX 1070 8GB. I wonder how the game will run on my computer with less RAM (8 GB of system RAM, not VRAM).

 
One to go on my to-buy list.. I enjoy a bit of leisurely sniping activity..

trog
 
While I agree, let's wait for AMD to actually deliver Vega first before getting all excited about it.
It's simple math imho. At least 50% higher clocks for Vega with the same shader count, next-gen HBM and better GCN, or an even better evolved arch, means at least 50% higher FPS. Which for games like this means close to Titan (Pascal) level performance (at 4K at least).
 
I wonder how the SLI support is?
 
So when AMD compared the Fury X with the 980 Ti, they weren't that wrong, and they took a lot of BS for it.
 
It's simple math imho. At least 50% higher clocks for Vega with the same shader count, next-gen HBM and better GCN, or an even better evolved arch, means at least 50% higher FPS. Which for games like this means close to Titan (Pascal) level performance (at 4K at least).

Actually the hottest Vega feature is the tile-based rendering introduced by PowerVR back in 2001 ( http://www.anandtech.com/show/735/3 ), which was then forgotten for over 10 years until the last two generations of nVidia.

Now that AMD is catching up, the show must go on. They had to use brute force to counter the tile-tech revival, but if they bring another monster like the Fury X, with 4000 shading units, HBM2 and the Vega architecture, we will definitely have a new king of the hill. If they're more conservative and go for volume at generous margins, maybe not, but I expect Vega to be an amazing card.
 
So when AMD compared the Fury X with the 980 Ti, they weren't that wrong, and they took a lot of BS for it.

Reference clocks though; the Fury X never really shone, especially against custom 980 Tis, a.k.a. all of them.
 
With the gains AMD shows on DX12, I assume the DX11 performance graphs were too horrible to show?

Why the hell would you run the game in DX11 when you can run it in DX12 lol.
 
Why the hell would you run the game in DX11 when you can run it in DX12 lol.

Because this seems to be literally the first game ever that actually runs better using DX12 lol.

Not everyone uses Windows 10 either, but NV performs well with both, don't hate.
 
Because this seems to be literally the first game ever that actually runs better using DX12 lol.

Not everyone uses Windows 10 either, but NV performs well with both, don't hate.

The only hate I'm seeing is from yourself, or at least an attempt to paint everything AMD-related as negative. It's been a while now that AMD has been improving in DX12 over DX11; if you haven't seen this, you've been asleep.

This is just another positive step, and you see it as bad for whatever reason; to get more than zero you first have to pass zero ;).

Surely you're not expecting them to suddenly have this boost in every game; time to be serious.
 
The only hate I'm seeing is from yourself, or at least an attempt to paint everything AMD-related as negative. It's been a while now that AMD has been improving in DX12 over DX11; if you haven't seen this, you've been asleep.

This is just another positive step, and you see it as bad for whatever reason; to get more than zero you first have to pass zero ;).

Surely you're not expecting them to suddenly have this boost in every game; time to be serious.

No, I expect decent support across the board. If nVidia hadn't bothered with DX11 they would have seen a big boost too.

Time to be serious.
 
See for yourself:

Interesting. What I see here is that every time I pause the video, the RX 480 performs better than the GTX 1060 in DX12. Also, on Guru3D the DX12 charts show a 7% advantage for the RX 480 over the GTX 1060 (75 vs 70 fps). So what's the truth then?
 
Interesting. What I see here is that every time I pause the video, the RX 480 performs better than the GTX 1060 in DX12. Also, on Guru3D the DX12 charts show a 7% advantage for the RX 480 over the GTX 1060 (75 vs 70 fps). So what's the truth then?

Obviously, the test systems vary (custom cards, CPU used, etc.), even the game builds can change (press release, review version, final, etc.), and different locations in the game can produce highly varying results, so it's not easy to come up with one final golden number. On the other hand, BoostClock provides really detailed frame time graphs as well as the final result sheets, so they are 100% transparent: http://boostclock.com/show/000047/fta-gtx1060-vs-rx480-SniperElite4.html
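For anyone who wants to play with raw frame-time logs like the ones linked above, here is a rough Python sketch of reducing such a log to summary numbers; the sample data is invented and the 99th-percentile convention is just a common choice, not BoostClock's actual methodology:

```python
# Reduce a per-frame time log (milliseconds) to average fps and a
# 99th-percentile frame time, a common way to quantify stutter.
import statistics

def summarize(frame_times_ms):
    """Return average fps and the 99th-percentile frame time."""
    avg_frame_ms = statistics.mean(frame_times_ms)
    ordered = sorted(frame_times_ms)
    p99_frame_ms = ordered[int(0.99 * (len(ordered) - 1))]
    return {
        "avg_fps": 1000.0 / avg_frame_ms,
        "p99_frame_ms": p99_frame_ms,  # spikes here show up as visible stutter
    }

if __name__ == "__main__":
    # Hypothetical capture: mostly ~14 ms frames with a couple of slow ones.
    sample = [14.1, 13.9, 14.3, 22.7, 14.0, 14.2, 31.5, 14.1, 13.8, 14.0]
    print(summarize(sample))
```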
 
Sniper Elite 4 seems to be the first game where DX12 is undeniably used to give an advantage over DX11; I think it's the first of its kind.
This is definitely a good thing.
 
User reviews still report extremely stupid AI that keeps walking around even though shots are being fired, etc. etc... (just like the godlike ARMA 3 SP, it should be said)
 
W1zzard, any reason you've excluded the GTX 1060 3GB? As a reseller, I have to tell you that it's one of the best-selling cards at the moment (even though I advise against it because of the limited memory).
I'm pretty sure its performance would tank with only 3 GB of VRAM.
 
User reviews still report extremely stupid AI that keeps walking around even though shots are being fired, etc. etc... (just like the godlike ARMA 3 SP, it should be said)

I've got much improved AI. Hell these guys will go way out of their way to flank me with part of their force while a few keep me pinned down.

And no, I don't see what you mentioned. Even guys I tagged in the distance I see taking cover when shots are heard involving enemies closer to me. Many times they'll even make the trek over to provide support or help search.
 
W1zzard, any reason you've excluded the GTX 1060 3GB? As a reseller, I have to tell you that it's one of the best-selling cards at the moment (even though I advise against it because of the limited memory).
I'm pretty sure its performance would tank with only 3 GB of VRAM.
The Zotac card I had had to go back. Bought a new card just now.
 