
AMD GPUs Show Strong DirectX 12 Performance on "Ashes of the Singularity"

btarunr

Editor & Senior Moderator
Stardock's "Ashes of the Singularity" may not be particularly pathbreaking as an RTS in the Starcraft era, but it has the distinction of being the first game to market with a DirectX 12 renderer, in addition to its default DirectX 11 one. This gives gamers their first peek at API-to-API comparisons, to test the tall claims of bare-metal optimization in DirectX 12, and as it turns out, AMD GPUs do seem to benefit in a big way.

In a GeForce GTX 980 vs. Radeon R9 390X comparison by PC Perspective, the game seems to perform rather poorly on its default DirectX 11 renderer for the R9 390X, which when switched to DirectX 12, not only takes a big leap (in excess of 30%) in frame-rates, but also outperforms the GTX 980. A skeptical way of looking at these results would be that the R9 390X isn't optimized for the D3D 11 renderer to begin with, and merely returns to its expected performance vs. the GTX 980, with the D3D 12 renderer.
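As a rough sense of what that "in excess of 30%" leap means in numbers, here is a minimal sketch of the arithmetic. The FPS values are hypothetical placeholders chosen for illustration, not PC Perspective's measured figures:

```cpp
#include <cstdio>

// Percentage change going from a DX11 average frame rate to a DX12 one.
static double uplift_percent(double dx11_fps, double dx12_fps)
{
    return (dx12_fps - dx11_fps) / dx11_fps * 100.0;
}

int main()
{
    const double r9_390x_dx11 = 40.0;  // hypothetical D3D11 average FPS
    const double r9_390x_dx12 = 54.0;  // hypothetical D3D12 average FPS

    std::printf("R9 390X DX11 -> DX12 uplift: %.1f%%\n",
                uplift_percent(r9_390x_dx11, r9_390x_dx12));  // prints 35.0%
}
```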



Comparing the two GPUs at CPU-intensive resolutions (900p and 1080p), across various CPUs (including the i7-5960X, i7-6700K, i3-4330 dual-core, FX-8350, and FX-6300), reveals that the R9 390X suffers only a narrow performance drop with fewer CPU cores, and sees slight performance gains as the core count increases. Find the full insightful review in the source link below.

View at TechPowerUp Main Site
 
It just brings the AMD up to where it should be though, yeah? It's not like Fury X is going to curb stomp Titan X on DX12, amiright?
 
So AMD is again ahead with hardware but lacking in software... it's not the first time they've done this, and it seems nothing was learned from the past...
 
So AMD cards shine in DX12, and Nvidia cards stay about the same.
 
The way I see it, why optimize for DX11 anymore?

Win 10 has already been downloaded more than 50 million times, DX12 included, which makes DX11 a thing of the past.
Move on and forget it; just keep DX11 where it already is, that's good enough, and start developing for DX12 right away.
 
So AMD cards shine in DX12, and Nvidia cards stay about the same.

It loses performance on fast CPUs beyond 1080p "Low" at Average settings
ashes-gtx980.png


On the Heavy benchmark, "Low" seems to gain performance whereas on "High" you lose it.
ashesheavy-gtx980.png


1080p "Low" looks to be consistent outcome where DX12 is faster then DX11 on Nvidia using a powerful CPU.

The more interesting or FUN part of it is probably the spat between OXIDE and Nvidia right now...
 
Colour me skeptical, but is this really a case of DX12 offering much better FPS than DX11 when paired with a decent CPU, or is it more a case of a game in alpha offering much better FPS now that its launch API has been implemented rather than its placeholder API?
 
The way I see it, why optimize for DX11 anymore?
Not everyone will upgrade to Win10. There are plenty of people - for whatever reasons (they seem to include technophobia, making a stand against subscription models, abhorrence of "apps", and world Government conspiracy theories, among others) - who will only give up their Win7 serials when they're prised from their cold, dead hands.
I'm also pretty sure that HD 6000 card series owners aren't keen to have their gaming marginalized any further than it already is.
So AMD cards shine in DX12, and Nvidia cards stay about the same.
In this instance yes...although this instance seems to be an alpha build from a game engine, and a developer and game engine very closely allied with AMD. No doubt given the range of DX12 /DX11.3 feature and hardware coding options, you'll see game engines and individual games vary wildly depending upon what is implemented, and what is not.
One thing seems certain from the PC Per graphs Xzibit posted. AMD really need to pour some of those optimizations into CPUs.
 
Most people I know that aren't upgrading to Windows 10 are staying with what they are on because of Windows Media Center. If you don't have an Xbox One and you rely on Windows Media Center, you're basically stuck.
 
Most people I know that aren't upgrading to Windows 10 are staying with what they are on because of Windows Media Center. If you don't have an Xbox One and you rely on Windows Media Center, you're basically stuck.
I know a lot of people still riding around on donkeys instead of using some modern kind of transportation,
but for those people who are "stuck" there is also XBMC, far superior to anything else out there.
 
This is one game, a game whose devs have been petted by AMD since it was announced.
By this time, after all we've been through in this dirty market, I would totally believe that these great numbers on DX12 are either because DX11 got purposely butchered, or because the game is genuinely so well-optimized for AMD that no one actually bothered to suit it to NVIDIA cards.

They want to please their red friends. AMD gets to sell cards, and the devs get to sell their zero-PR game by getting posts like these.
I'm gonna wait for UE4 to make the transition to DX12 (not this half-ported test thing).
 
This is one game, a game whose devs have been petted by AMD since it was announced.
By this time, after all we've been through in this dirty market, I would totally believe that these great numbers on DX12 are either because DX11 got purposely butchered, or because the game is genuinely so well-optimized for AMD that no one actually bothered to suit it to NVIDIA cards.

They want to please their red friends. AMD gets to sell cards, and the devs get to sell their zero-PR game by getting posts like these.
I'm gonna wait for UE4 to make the transition to DX12 (not this half-ported test thing).

Did you read the article, watch the video, or read Ryan's comments on it?

PCPerspective Ryan Shrout said:
NVIDIA has had source code access for a year. There are no excuses then.


 
Nvidia didn't miss the chance for its usual marketing stunt. They published the usual "Game Ready" driver to show that, for every new game, they are ready. Then they followed with advice to not consider the benchmark results important, maybe even to ignore them completely. If this were AMD - if AMD had come out with a very specific driver for one specific game and then failed in those benchmarks - this thread would be flooded with mocking and derogatory comments about AMD, even hate.

Anyway, "Ashes of the Singularity" shows two things about drivers. First, Nvidia's DX11 are miles ahead of AMD's. Second, AMD's DX12 drivers start from a more concrete base. If AMD's DX11 drivers where started with the wrong foot and latter there was no programmers or money to fix it, at least with DX12 AMD starts from a better position. Let's hope they don't mess up down the road. If they think that with DX12 they have the upper hand, Nvidia does have the resources and the talent, to show them that they are wrong.

Two last things. I was saying in the past that Mantle was made to fix Bulldozer's pathetic performance in games compared to Intel CPUs, and not so much to improve AMD's GPU performance. Seeing the results in PCPerspective's CPU tests and how badly the FX processors score, I can say that I was completely wrong. FX is something that cannot be saved, at least based on this specific test. The last thing is the results of the Radeon R7 370. As we can see here, the 370 (265, 7850) does not gain much. There could be two reasons for this: GCN 1.0 and 2GB of RAM. I would like to remind everyone that Mantle did NOT perform well with cards that had less than 3GB of RAM. It seems that DX12 has the same problem. We might have to consider 3GB of RAM as the minimum for DX12 performance in the future.
 
Figures why AMD dropped Mantle for DX12. They knew they'd kill nVidia at this
 
This is one game, a game whose devs have been petted by AMD since it was announced.
By this time, after all we've been through in this dirty market, I would totally believe that these great numbers on DX12 are either because DX11 got purposely butchered, or because the game is genuinely so well-optimized for AMD that no one actually bothered to suit it to NVIDIA cards.

They want to please their red friends. AMD gets to sell cards, and the devs get to sell their zero-PR game by getting posts like these.
I'm gonna wait for UE4 to make the transition to DX12 (not this half-ported test thing).
Hahaha, dirty market? It wasn't a dirty market when nVidia made the big DX10 lie with Crysis and fooled millions of people with a DX9.0c game? nVidia has been cheating for ages.
If AMD is doing something wrong now, you have to blame nVidia for it, because nVidia started it or invented it.
 
I know a lot of people still riding around on donkeys instead of using some modern kind of transportation,
but for those people who are "stuck" there is also XBMC, far superior to anything else out there.
It's inferior to Windows Media Center. Kodi's DVR and EPG (in USA) features leave a lot to be desired; moreover, Kodi's DLNA support is horrendous. Xbox One functions as a DVR, obtains EPG, and functions as a DLNA server. A $400 Xbox One, $200 for an external HDD, and $50 for a Hauppauge USB TV tuner for Windows 10 isn't exactly appealing next to a $100 Tuner and Windows Media Center.

Figures why AMD dropped Mantle for DX12. They knew they'd kill nVidia at this
Mantle code is in Direct3D 12 and Vulkan. There's no sense in AMD continuing to develop a proprietary API when it is becoming the standard.
 
So AMD is again ahead with hardware but lacking in software... it's not the first time they've done this, and it seems nothing was learned from the past...

Hmm, I don't know exactly how to read these results.
Perhaps NVIDIA lacks software optimisation in DX12 altogether?!
Obviously, these results are not very promising if they do not show performance gains everywhere and for everyone.
 
If AMD is doing something wrong now, you have to blame nVidia for it, because nVidia started it or invented it.
No, I really, really, really wouldn't go there.
Some of us who have been 3D graphics enthusiasts for a while can remember when ATI outed the "all new" Rage Pro Turbo...with supposedly "40% more performance" than the outgoing Rage Pro, in February 1998. The 40% improvement was only driver optimization for Winbench 98. As this review shows, real-world gains were nil. The Pro Turbo eventually added some performance for actual games, but by then it had been overtaken by a whole swathe of newer cards, including those of ATI itself.

If you're going to make a statement of fact, it should actually be factual.
Most of the graphics vendors indulged in benchmark shenanigans at one time or another - Nvidia included - but ATI's deliberate optimization for a single benchmark actually set the tone.
 
Hmm, I don't know exactly how to read these results.
Perhaps NVIDIA lacks software optimisation in DX12 altogether?!
Obviously, these results are not very promising if they do not show performance gains everywhere and for everyone.
AMD has major CPU bottlenecks on low settings on DX11. Those CPU bottlenecks are gone in DX12 so AMD cards perform like NVIDIA cards in DX12 when they're pretty far behind in DX11.

I think it shows that AMD put all of their effort into Mantle/DX12/Vulkan over DX11 which really shouldn't surprise anyone.
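Here's a toy, CPU-only sketch of that point. This is not real Direct3D code, and the draw count, per-draw cost, and thread split are all invented for illustration: per-draw CPU work that has to run on a single submission thread in a DX11-style immediate context can, under DX12, be recorded across several threads into separate command lists, so the same workload stops being limited by one core.

```cpp
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Stand-in for the per-draw CPU cost of validation/recording in the driver.
static void record_draw()
{
    volatile double x = 0.0;
    for (int i = 0; i < 10000; ++i) x = x + i * 0.5;
}

int main()
{
    const int draws = 4000;  // invented draw-call count

    // "DX11-style": every draw funnels through one submission thread.
    auto t0 = std::chrono::steady_clock::now();
    for (int i = 0; i < draws; ++i) record_draw();
    auto t1 = std::chrono::steady_clock::now();

    // "DX12-style": draws split across worker threads, each recording into
    // its own command list; the final queue submission is comparatively cheap.
    const unsigned workers = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> pool;
    auto t2 = std::chrono::steady_clock::now();
    for (unsigned w = 0; w < workers; ++w)
        pool.emplace_back([&] {
            for (int i = 0; i < draws / static_cast<int>(workers); ++i)
                record_draw();
        });
    for (auto& th : pool) th.join();
    auto t3 = std::chrono::steady_clock::now();

    using ms = std::chrono::duration<double, std::milli>;
    std::printf("single-threaded submission: %.2f ms\n", ms(t1 - t0).count());
    std::printf("%u-thread recording:        %.2f ms\n", workers, ms(t3 - t2).count());
}
```

On a multi-core CPU the second path finishes several times faster, which is the effect being described here: the per-draw overhead doesn't disappear, it just gets spread across cores instead of piling up on one.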
 
All I read out of this is that AMD's drivers are crap, and with DX12 this doesn't matter as much as it did with DX11.

I bet the CPU usage under DX12 with AMD is a lot higher than it is with nVidia. Not that this is necessarily a bad thing. But it also isn't more efficient per se. The 5960X just has enough crunching power to not be the bottleneck here.

I hope AMD doesn't take their DX12 superiority for granted and stop developing and optimizing their DX12 drivers.
 
What does this article tell us? That Ashes of Singularity runs better on AMD than Nvidia. There are games that run better on AMD or Nvidia. It is common. One company or the other works closer with the developer and gains an edge. It doesn't mean much that Nvidia chose not to for whatever reason if that is what happened. imo we can't look at the performance increase in one DX12 game and extrapolate that to most DX12 games.

If it works out that AMD has the edge in most DX12 games then it's time to go red but how long will it be before it's clear which company has the best offerings? It may be years possibly before there are a good selection of DX12 games.
 
AMD has major CPU bottlenecks on low settings on DX11. Those CPU bottlenecks are gone in DX12 so AMD cards perform like NVIDIA cards in DX12 when they're pretty far behind in DX11.

I think due to the multithreaded nature of DX12, those bottlenecks (or inefficiencies) are better hidden, but they are still there. :)
 
There's all sorts of wonkiness going on with all sides of this chart. The only take away from this test is that we need more tests.
 
What does this article tell us? That Ashes of Singularity runs better on AMD than Nvidia. There are games that run better on AMD or Nvidia. It is common. One company or the other works closer with the developer and gains an edge. It doesn't mean much that Nvidia chose not to for whatever reason if that is what happened. imo we can't look at the performance increase in one DX12 game and extrapolate that to most DX12 games.

If it works out that AMD has the edge in most DX12 games then it's time to go red but how long will it be before it's clear which company has the best offerings? It may be years possibly before there are a good selection of DX12 games.

This. Overall, this is not news; this is just a stir in the AMD-Nvidia fanrage bowl, with a game that nobody really plays or has even heard of.
 
It's better to say AMD is catching up in DX12.
 