
Deus Ex: Mankind Divided Performance Analysis

Now I'm curious; my end goal is merely to see if it starts.
Yeah, it's an ugly mess right now with no explanations. Nixxes has been on the Hub trying to collect logs, etc. The "solutions" have ranged from turning off the DLC to creating a new user account. I have tried all the "fixes" except the user account thing; I refuse to do that for one game, and it's just as hit and miss as all the other "fixes". Ironically, there is a Fury X user who was playing just fine, but my Fury goes nowhere....
 
OK..... despite preloading some 20 GB, it is downloading 22.8 GB today. WTF???
 
46 fps at 1440p ultra settings with a GTX 1080, WTF?
 
So this is mostly DX11 atm, so why is the 480 faster than the 1060?
 
Wow, performance sucks right across the board. Luckily I have 3000+ games to play before I pick up the GOTY edition for 5 bucks.
 
These Fury X numbers... are staggering.

AMD is doing really well here on a very GPU intensive title and Pascal... is falling apart in comparison. Guess it's not all about clocks then?
 
These Fury X numbers... are staggering.

AMD is doing really well here on a very GPU intensive title and Pascal... is falling apart in comparison. Guess it's not all about clocks then?

Shhh...My Fury won't even launch....
 
We're going to have Doom Vulkan numbers soon. Also, we'll be switching to DX12 on Hitman.
 
These Fury X numbers... are staggering.

AMD is doing really well here on a very GPU intensive title and Pascal... is falling apart in comparison. Guess it's not all about clocks then?
Not a surprise, since it uses more or less the same engine as Hitman (2016); both are derivatives of Glacier 2. And since we know for a fact how well AMD performs in Hitman, this one was a given.
 
And they haven't even put DX12 in yet, so that should be quite interesting. I know Hitman runs great for me in DX12.
 
wow D: this game is kicking my GTX 1080's ass at ultra 1080p
 
Nvidia cards age horribly, and the 480 was 32% stronger than the 1060 in Vulkan - these are pretty much the exact results I expected.

What is crazy to me is that the 1070 is only like 20% stronger than the 480 in DX11!!! In DX12 they may be equal (hence why AMD doesn't think it needs ultra cards yet).
That is the case until you start pairing that 480 with the kind of weaker CPU it is most likely to be paired with; then the 1060 takes over in performance.
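A quick way to sanity-check those percentage claims against the review charts (the fps values below are placeholders, not the review's numbers):

```python
# Relative performance of card A vs card B, in percent.
# The fps inputs are made-up placeholders -- plug in the chart values.
def pct_faster(a_fps, b_fps):
    return (a_fps / b_fps - 1.0) * 100.0

print(pct_faster(66.0, 50.0))  # e.g. 480 vs 1060 (Vulkan) -> 32.0
print(pct_faster(60.0, 50.0))  # e.g. 1070 vs 480 (DX11)   -> 20.0
```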

So this is mostly DX11 atm, so why is the 480 faster than the 1060?

An AMD-sponsored game, like AOTS. The game is optimized for AMD cards.

We're going to have Doom Vulkan numbers soon. Also, we'll be switching to DX12 on Hitman.
I think that mid-to-low-range cards like the 480/1060, at least, need to be tested with the CPUs they would most likely be used with as well, to show how they stack up, instead of a top-end CPU they are not likely to be paired with.

The reason I ask is that another site had a graph showing a 480/1060 paired with a weaker CPU instead of a top-end Intel chip, and the fps seemed to suffer a bit. It would be valid to show what a person could expect with a less powerful CPU. http://i.imgur.com/JF7ngP5.png
 
:shadedshu: Alright, this shit is embarrassing. I just ran the in-game benchmark on the system in my specs, with high settings modified by checking the box for tessellation, unchecking motion blur, and adding 4xMSAA.

27.1 fps average, 17.8 minimum, and 30.2 maximum.

EDIT: Reducing shadows to medium while turning on contact shadow hardening, turning off MSAA, and using temporal anti-aliasing produces much better results with no noticeable drop in image quality.

59.7 fps average, 50.9 minimum, and 61.3 maximum. All this is at a mere 1080p, too.
 
So far it looks like RX 400 will only consist of Polaris (10, 11) and Vega (10, 11 too). Since we know that the RX 490 is a confirmed name, it will likely be Vega (hence with HBM), since AMD didn't talk about a third family of chips.

There are actually a few inconsistencies that have made me wonder what AMD will do in the coming months. Here's what we know:

1) AMD's roadmap shows ALL Polaris in 2016

2) AMD's roadmap shows ALL Vega in 2017

3) Rumors suggest that AMD is indeed launching a more powerful card before 2017, and common sense backs this up. After all, Vega will be AT LEAST a 4096-SP HBM2 card, which would be roughly 2x stronger than the 480. They will need 2-3 cards in between the 480 and that monster.

4) 256-bit GDDR5X or 384-bit GDDR5 would provide enough bandwidth for a much more powerful core than the 480 (rough numbers below).

Thus I think we have another GDDR card on the way, and it would make a ton of sense. Who cares about efficiency when DX12 games will make AMD nearly as efficient as Nvidia? My money is on a 12 GB, 384-bit GDDR5 card with 3500 SPs. It would probably match the 1080 in the latest games while having the same amount of memory as the Titan, and it would be cheap to make.
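To put rough numbers on points 3 and 4, here's a back-of-the-envelope sketch; the data rates are common GDDR5/GDDR5X speeds I'm assuming for illustration, not confirmed specs:

```python
# Theoretical memory bandwidth in GB/s: (bus width in bytes) x (data rate in Gbps).
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 8))   # RX 480: 256-bit GDDR5 @ 8 Gbps        -> 256.0 GB/s
print(bandwidth_gb_s(256, 10))  # hypothetical 256-bit GDDR5X @ 10 Gbps -> 320.0 GB/s
print(bandwidth_gb_s(384, 8))   # hypothetical 384-bit GDDR5 @ 8 Gbps   -> 384.0 GB/s

# Point 3's "2x the 480" in raw shader terms, at equal clocks:
print(4096 / 2304)  # ~1.78x the 480's 2304 SPs; the rest would have to come from clocks
```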

:shadedshu: Alright, this shit is embarrassing. I just ran the in-game benchmark on the system in my specs, with high settings modified by checking the box for tessellation, unchecking motion blur, and adding 4xMSAA.

27.1 fps average, 17.8 minimum, and 30.2 maximum.

EDIT: Reducing shadows to medium while turning on contact shadow hardening, turning off MSAA, and using temporal anti-aliasing produces much better results with no noticeable drop in image quality.

59.7 fps average, 50.9 minimum, and 61.3 maximum. All this is at a mere 1080p, too.

I can't tell if you are complaining or not. My good sir, you possess a 128-bit card with a paltry 2.4 TF. That's barely better than the current $110 RX 460's stats. Why would you expect any better?
 
I can't tell if you are complaining or not. My good sir, you possess a 128-bit card with a paltry 2.4 TF. That's barely better than the current $110 RX 460's stats. Why would you expect any better?

Um, no. My system specs show a 980 Ti. I got it sorted to something more reflective of W1zzard's results.
 
Um, no. My system specs show a 980 Ti. I got it sorted to something more reflective of W1zzard's results.

My bad, I read 960 in your signature. Sorry your 980 Ti is performing like my overclocked 470. But again, I make the same point: the 980 Ti has 6.1 TFLOPS. Why would it be much stronger than a stock 480? The specs certainly don't suggest it should be...
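For what it's worth, here's the peak-FP32 arithmetic behind that point. I'm using reference boost clocks, so these are ballpark figures; real cards (especially overclocked ones) land higher:

```python
# Peak FP32 throughput: 2 FLOPs (one FMA) per shader per clock.
def tflops_fp32(shaders, clock_mhz):
    return 2 * shaders * clock_mhz * 1e6 / 1e12

print(tflops_fp32(2816, 1075))  # GTX 980 Ti -> ~6.05 TFLOPS
print(tflops_fp32(2304, 1266))  # RX 480     -> ~5.83 TFLOPS
print(tflops_fp32(1024, 1178))  # GTX 960 (the card I thought you had) -> ~2.41 TFLOPS
```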
 
RIP my 980 Ti.

Luckily it should perform better than a stock 980 Ti, though!
 
There are actually a few inconsistencies that have made me wonder what AMD will do in the coming months. Here's what we know:

1) AMD's roadmap shows ALL Polaris in 2016

2) AMD's roadmap shows ALL Vega in 2017

3) Rumors suggest that AMD is indeed launching a more powerful card before 2017, and common sense backs this up. After all, Vega will be AT LEAST a 4096-SP HBM2 card, which would be roughly 2x stronger than the 480. They will need 2-3 cards in between the 480 and that monster.

4) 256-bit GDDR5X or 384-bit GDDR5 would provide enough bandwidth for a much more powerful core than the 480.

Thus I think we have another GDDR card on the way, and it would make a ton of sense. Who cares about efficiency when DX12 games will make AMD nearly as efficient as Nvidia? My money is on a 12 GB, 384-bit GDDR5 card with 3500 SPs. It would probably match the 1080 in the latest games while having the same amount of memory as the Titan, and it would be cheap to make.


There is no HBM2 available to anyone on the market until Q2/Q3 '17. That's one of the reasons you don't see the fabled "P100 SXM2 card" used anywhere.
 
Great results, but seeing that the Fury X's scaling doesn't change much in % relative to the GeForce 1080 when going from 1080p to 1440p, I highly doubt DX12 will make any difference, because the GPUs are not bottlenecked by DX11. Either that, or all cards will profit from DX12 equally, so the differences will stay the same too. It's basically a graphically demanding game; typically these sorts of games don't need DX12 to max the GPU out, somewhat like Crysis (3).
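To illustrate that scaling argument with a quick check (the fps figures here are invented just to show the method; substitute the review's actual results):

```python
# If the Fury X / GTX 1080 fps ratio barely moves from 1080p to 1440p,
# both cards are already GPU-bound, and a lower-overhead API (DX12)
# shouldn't change the standings much.
def gap(fury_x_fps, gtx_1080_fps):
    return fury_x_fps / gtx_1080_fps

print(gap(55.0, 60.0))  # hypothetical 1080p -> ~0.92
print(gap(38.0, 41.5))  # hypothetical 1440p -> ~0.92
```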
 
Can we have a test with all of the 'GCN optimized' effects turned off? It would be an interesting exercise to check how it runs with AO set to just ON or OFF, Contact Hardening Shadows OFF, and Volumetric Lighting ON or OFF.
DE:MD seems to be using most of AMD's GPUOpen libraries, like TressFX, AOFX and ShadowFX; it would be nice if they at least labeled them properly in the settings menu.
PCGH tried to keep it more agnostic: they have CHS (ShadowFX) turned off and AO toned down, and their results are more in line with what's expected.
 
There is no HBM2 available to anyone on the market until Q2/Q3 '17. That's one of the reasons you don't see the fabled "P100 SXM2 card" used anywhere.

Link?

Everything I have read says December to January for first availability.
 
Can we have a test with all of the 'GCN optimized' effects turned off? It would be an interesting exercise to check how it runs with AO set to just ON or OFF, Contact Hardening Shadows OFF, and Volumetric Lighting ON or OFF.
DE:MD seems to be using most of AMD's GPUOpen libraries, like TressFX, AOFX and ShadowFX; it would be nice if they at least labeled them properly in the settings menu.
PCGH tried to keep it more agnostic: they have CHS (ShadowFX) turned off and AO toned down, and their results are more in line with what's expected.

Game looks like complete garbage with shadow hardening off, so do what YOU want, buddy lol
 
Game looks like complete garbage with shadow hardening off, so do what YOU want, buddy lol
Well, Hairworks looks better than the default hair in The Witcher 3, but most reviews turn that effect off for fairness' sake. It wouldn't be that hard to turn off vendor-specific effects in AMD Gaming Evolved games for a fair comparison, or at least have some 'agnostic' settings tests included besides the default Ultra/High ones.
 
Well, Hairworks looks better than the default hair in The Witcher 3, but most reviews turn that effect off for fairness' sake. It wouldn't be that hard to turn off vendor-specific effects in AMD Gaming Evolved games for a fair comparison, or at least have some 'agnostic' settings tests included besides the default Ultra/High ones.

Hairworks looks worse half the time, and only marginally better sometimes (for a massive performance hit even on Nvidia GPUs). TressFX in Tomb Raider, on the other hand, is still the best hair I have ever seen in a game.

Again, though, the point is you are basically asking them to run the game on Low. Some games only offer HBAO (no HDAO), and if that is the only option, they should (and usually do) bench it when testing Ultra settings.
 