
Lack of Async Compute on Maxwell Makes AMD GCN Better Prepared for DirectX 12

btarunr

Editor & Senior Moderator
It turns out that NVIDIA's "Maxwell" architecture has an Achilles' heel after all, one that tilts the scales in favor of AMD's competing Graphics CoreNext (GCN) architecture by leaving it better prepared for DirectX 12. "Maxwell" lacks support for async compute, one of the three highlight features of Direct3D 12, even though the GeForce driver "exposes" the feature's presence to apps. This came to light when game developer Oxide Games alleged that it was pressured by NVIDIA's marketing department to remove certain features from its "Ashes of the Singularity" DirectX 12 benchmark.

Async compute is a standardized API-level feature added to Direct3D by Microsoft, which allows an app to better exploit the number-crunching resources of a GPU by breaking down its graphics rendering tasks. Since the NVIDIA driver tells apps that "Maxwell" GPUs support it, Oxide Games simply built its benchmark with async compute support, but when the benchmark attempted to use it on Maxwell, the result was an "unmitigated disaster." During the course of its developer correspondence with NVIDIA to try and fix the issue, Oxide learned that "Maxwell" doesn't really support async compute at the bare-metal level, and that the NVIDIA driver bluffs its support to apps. NVIDIA instead started pressuring Oxide to remove the parts of its code that use async compute altogether, the developer alleges.



"Personally, I think one could just as easily make the claim that we were biased toward NVIDIA as the only "vendor" specific-code is for NVIDIA where we had to shutdown async compute. By vendor specific, I mean a case where we look at the Vendor ID and make changes to our rendering path. Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware. As far as I know, Maxwell doesn't really have Async Compute so I don't know why their driver was trying to expose that. The only other thing that is different between them is that NVIDIA does fall into Tier 2 class binding hardware instead of Tier 3 like AMD which requires a little bit more CPU overhead in D3D12, but I don't think it ended up being very significant. This isn't a vendor specific path, as it's responding to capabilities the driver reports," writes Oxide, in a statement disputing NVIDIA's "misinformation" about the "Ashes of Singularity" benchmark in its press communications (presumably to VGA reviewers).

Given its growing market-share, NVIDIA could use similar tactics to keep game developers away from industry-standard API features that it doesn't support and that rival AMD does. NVIDIA drivers tell Windows that its GPUs support DirectX 12 feature-level 12_1; we wonder how much of that support is faked at the driver level, like async compute. The company is already drawing flak for borderline anti-competitive practices with GameWorks, which effectively creates a walled garden of visual effects that only users of NVIDIA hardware get to experience for the same $59 everyone spends on a particular game.

 
Who the heck writes these and how do they pass to be posted on the website ?
Pressurized, pressurizing, does the writer even know what that means ?
I mean really.....
 
Who the heck writes these and how do they pass to be posted on the website ?
Pressurized, pressurizing, does the writer even know what that means ?
I mean really.....

Huh, I thought it was a pun. "Oxide" 'n all..

Need a new humour-meter...
 
That's interesting, and now I wonder how well the old Kepler cards do in that DX12 benchmark. Presumably those come with their compute capabilities intact.
 
I think NVIDIA just couldn't be bothered with the driver implementation till now because, frankly, async compute units weren't really needed till now (or shall I say, till DX12 games are here). Maybe the drivers "bluff" the support just to prevent crashes if someone happens to try and use it now, and they'll implement it properly at a later time. Until NVIDIA confirms that the GTX 900 series has no async units, I call it BS.
 
Sneaky Sneaky

So, what you're saying is it's easy to shine as long as the path has been laid for you; the moment you take a detour, flaws get exposed. Have to commend AMD on this one: while they are hurting, at least they are truly baking in support for the features they claim to. Not just a quick once-over to get to market (obviously they are not rushed on that front...)

Bluffing driver.... Wonder what other features we "think" we are using.
 
The Oxide guys truly ignited a fire on NVIDIA's turf...

Bravo guys... bravo... I need the damn prices to come down!
 
Reminds me of good ol' Gabe Newell laying the smackdown on NVIDIA many, many moons ago (like, well over a decade ago).

It should be "pressured."

Which the original copy (text) had. It was changed on purpose, as humor, it seems.
 
[Image: rage comic]
 
Here's the original article at WCCFTECH: http://wccftech.com/oxide-games-dev-replies-ashes-singularity-controversy/

There are two points to look at here: first, Oxide says that Nvidia has had access to the source code of the game for over a year, and that they've been getting updates to the source code on the same day as AMD and Intel.

"P.S. There is no war of words between us and Nvidia. Nvidia made some incorrect statements, and at this point they will not dispute our position if you ask their PR. That is, they are not disputing anything in our blog. I believe the initial confusion was because Nvidia PR was putting pressure on us to disable certain settings in the benchmark, when we refused, I think they took it a little too personally."

Second, they claim they're using the so called async-compute units found on AMD's GPUs. These are parts on AMD's recent GPUs that can be used to asynchronously schedule work to underutilized GCN clusters.

"AFAIK, Maxwell doesn’t support Async Compute, at least not natively. We disabled it at the request of Nvidia, as it was much slower to try to use it then to not. Weather or not Async Compute is better or not is subjective, but it definitely does buy some performance on AMD’s hardware. Whether it is the right architectural decision for Maxwell, or is even relevant to it’s scheduler is hard to say."
 
Well, I believe NVIDIA is already on to this and knows full well there are no games yet; they'll just add it later when there are some, and say tough luck to those who bought the current hardware.
 
Looks like AMD got the opening salvo in DX12. I'm not sure what Nvidia can do at this point or how much they will care even if Maxwell sales do fall off some. The entry level Maxwells have been out for a year and a half and the mid range Maxwells for about a year. They've probably sold a ton of them already and the profit was made to cover R&D by now I would imagine.

We are late in the game on Maxwell. Pascal should be here in maybe 6 months and I think I read that they planned to add compute back in on the Pascals.

Hope it helps AMD to sell more cards.
 
If people had half a clue, they would hold on to current NVIDIA cards until the next generation lands in 2016 that will likely perform better. There are going to be a couple of DX12 titles out by that time at least.
 
Who the heck writes these and how do they pass to be posted on the website ?
Pressurized, pressurizing, does the writer even know what that means ?
I mean really.....
Well, the way it is written makes more sense...

NVIDIA instead started pressuring Oxide to remove parts of its code that use async compute altogether, it alleges

If you wanted to use "pressured" it would read like this....

Oxide alleges NVIDIA pressured them to remove the parts of its code that use async compute altogether.
 
This came to light when game developer Oxide Games claimed that it was pressured by NVIDIA's marketing department to remove certain features in its "Ashes of the Singularity" DirectX 12 benchmark.
it learned that "Maxwell" doesn't really support async compute at the bare-metal level, and that NVIDIA driver bluffs its support to apps.
NVIDIA instead started pressuring Oxide to remove parts of its code that use async compute altogether, it alleges.
Given its growing market-share, NVIDIA could use similar tactics to keep game developers away from industry-standard API features that it doesn't support, and which rival AMD does.
NVIDIA drivers tell Windows that its GPUs support DirectX 12 feature-level 12_1. We wonder how much of that support is faked at the driver-level, like async compute.
The company is already drawing flack for using borderline anti-competitive practices with GameWorks, which effectively creates a walled garden of visual effects that only users of NVIDIA hardware can experience for the same $59 everyone spends on a particular game.
Why am I not surprised by any of this...
 
Who the heck writes these and how do they pass to be posted on the website ?
Pressurized, pressurizing, does the writer even know what that means ?
I mean really.....

Welcome to TPU. Quite an auspicious start:
-Complain about article content, Check.
-Insult moderator, Check.
 
Sneaky Sneaky


Bluffing driver.... Wonder what other features we "think" we are using.

This. We all know Nvidia has cheated at benchmarks in the past.

so its like this:

Oxide: Look! we got benchmarks!
AMD: oh my we almost beat The Green Meanies
Nvidia: @oxide you cheated on the benchmarks
Oxide: did not. nyah.
Nvidia: disable competitive features so our non-async bluff works right!
Oxide: not gonna happen
Nvidia: F*** you AMD! you're not better than us! we'll fix you with our l33t bluffing skillz
AMD: *poops pants* and hides in a corner eating popcorn.
 
I don't understand why a game company would write a benchmark that favours or doesn't favour a specific card... that seems like the opposite of a benchmark's purpose.

Be that as it may, I'd also like to know which version of Maxwell allegedly doesn't support async shaders - v1, v2, or both?

If this is true, then it's a massive c**k-up on nVIDIA's part, but one that probably won't affect them until the end of this year, when big-name DX12 games arrive for the holiday season. Even so, said games will still probably have DX11 rendering paths, and once Pascal arrives in March/April 2016 this will all be forgotten... assuming Pascal isn't delayed.
 
Why am I not surprised by any of this...

Because it's been done time and time again. Whichever company doesn't have support for the new shiny goes out of its way to downplay its importance and states that the competition is foolish for being early adopters by supporting it.

Later, when they do support the new shiny, they go out of their way to claim it's now more important and you should want it.

From a business standpoint, not supporting the latest and greatest DirectX features makes sense. How much did NV spend on SM3.0 support in their 6000 series? By the time the games that supported it were out, most were too resource-heavy for anything short of the 6800 Ultra.

What about the DirectX 10 fiasco? How much was spent on getting compliance, only to have next to no games support it and instead stick with DirectX 9 until 11 came out?

Though, like the previous poster, I am curious whether Kepler also suffers from this or if it's something that was nerfed for Maxwell.
 
If people had half a clue, they would hold on to current NVIDIA cards until the next generation lands in 2016 that will likely perform better. There are going to be a couple of DX12 titles out by that time at least.
Still rocking two 670's here ;)

Not that I have a clue........I'm just broke.
 
Maybe that explains Ars Technica's results?

[Image: Ars Technica DX12 benchmark chart]
 
It's again a SINGLE game. I'll accept the info once I see more (ones that aren't exclusive to either camp, like this one is to AMD)...
 
No support? No problem: just pay them to gimp it on AMD cards. Welcome to the wonderful world of the NVIDIA console... sorry, PC gaming, the way it's meant to be paid.
 
*Sigh*

I am so tired of things like this...
 