Monday, August 31st 2015

Lack of Async Compute on Maxwell Makes AMD GCN Better Prepared for DirectX 12

It turns out that NVIDIA's "Maxwell" architecture has an Achilles' heel after all, one that tilts the scales in favor of AMD's competing Graphics CoreNext architecture as the one better prepared for DirectX 12. "Maxwell" lacks support for async compute, one of the highlight features of Direct3D 12, even as the GeForce driver "exposes" the feature to apps. This came to light when game developer Oxide Games alleged that it was pressured by NVIDIA's marketing department to remove certain features from its "Ashes of the Singularity" DirectX 12 benchmark.

Async compute is a standardized API-level feature added to Direct3D 12 by Microsoft, which lets an app better exploit the number-crunching resources of a GPU by running compute workloads concurrently with graphics rendering, rather than serializing the two. Since the NVIDIA driver tells apps that "Maxwell" GPUs support it, Oxide Games simply built its benchmark with async compute support, but when the benchmark attempted to use it on "Maxwell," the result was an "unmitigated disaster." During the course of its developer correspondence with NVIDIA to try and fix the issue, Oxide learned that "Maxwell" doesn't really support async compute at the bare-metal level, and that the NVIDIA driver bluffs its support to apps. NVIDIA instead began pressuring Oxide to remove the parts of its code that use async compute altogether, the developer alleges.
"Personally, I think one could just as easily make the claim that we were biased toward NVIDIA as the only "vendor" specific-code is for NVIDIA where we had to shutdown async compute. By vendor specific, I mean a case where we look at the Vendor ID and make changes to our rendering path. Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware. As far as I know, Maxwell doesn't really have Async Compute so I don't know why their driver was trying to expose that. The only other thing that is different between them is that NVIDIA does fall into Tier 2 class binding hardware instead of Tier 3 like AMD which requires a little bit more CPU overhead in D3D12, but I don't think it ended up being very significant. This isn't a vendor specific path, as it's responding to capabilities the driver reports," writes Oxide, in a statement disputing NVIDIA's "misinformation" about the "Ashes of the Singularity" benchmark in its press communications (presumably to VGA reviewers).

Given its growing market-share, NVIDIA could use similar tactics to steer game developers away from industry-standard API features that it doesn't support, but that rival AMD does. NVIDIA drivers tell Windows that its GPUs support DirectX 12 feature-level 12_1; we wonder how much of that support is faked at the driver level, as async compute apparently is. The company is already drawing flak for borderline anti-competitive practices with GameWorks, which effectively creates a walled garden of visual effects that only users of NVIDIA hardware can experience for the same $59 everyone spends on a particular game.
Sources: DSOGaming, WCCFTech

196 Comments on Lack of Async Compute on Maxwell Makes AMD GCN Better Prepared for DirectX 12

#26
truth teller
its not like it will be the first time nbadia will resort to a slow software implementation for hardware lacking

*cough*fx5200*cough*

this will keep on happening and their customers wont complain because "muh better in gaymes" payed shilling after these events like always
Posted on Reply
#27
nem
ahahah nv fanboys today. :B
Posted on Reply
#28
Xaled
rtwjunkie
Welcome to TPU. Quite an auspicious start:
-Complain about article content, Check.
-Insult moderator, Check.
He is one of those nvidia guys, who created multiple accounts to turn the "what do you think of nvidia's 3.5 gb thing.." poll to nvidia's favor ..
Posted on Reply
#29
lilhasselhoffer
Slow clap....

Benchmark says a new feature isn't embraced by an Nvidia GPU. Benchmark says that AMD is ahead, on a standard that they helped write when Vulkan functionally became DX12.

Sorry, but this is kinda derp. There's no games that actually show this in action. The people who wrote the code for the benchmark are cagey as to whether real world performance will bear out the superiority as a real asset. Sorry red team, this isn't a win for you.

On the other hand, this is a loss for Nvidia. They're desperately trying to play this off as a communication issue internally, but the benchmark writers are claiming they pushed for the feature to be disabled. Honestly, they could have played this like AMD played the tesselation results snafu but they went full Mcintosh. Gotta say, objectively that's Nvidia walking into a room and slamming their heads on the table. It's a loss, but only a minor one.

Again, let's be objective. Our current node has supported three generations of hardware. AMD and Nvidia are both running on empty when it comes to real performance improvement. AMD admitted it with a functional rebrand of the 2xx series, and Nvidia did it by crippling the features in Maxwell not directly related to today's games. Neither option is good, and it's an admission that they're just treading water until 2016.
Posted on Reply
#31
the54thvoid
Casecutter
Nvidia must provide a clear and truthful statement as to the goings on with this, or… IDK
^^This.

Two (and a half) situations here - simple as that:

1) What Oxide say is true and Nvidia have poor hardware level implementation and the driver level ASync is not well equipped. This gives AMD a massive boost on titles using fully fledged DX12 that utilised this part of the API (and if it's open, it should use it unless it's not required).
1 and a half) Nvidia don't require Async as the cards are well equipped to deal with all other aspects of DX12, therefore in other scenarios the lack of Async wont hamper them (but may still give AMD more leg room if ASync is the be all and end all).
2) Async isn't actually the best thing ever and isn't used or required - gives no edge to AMD in other titles. Nvidia sponsored titles will certainly do this.

Frankly if this is a real case of Bad Nvidia shenanigans, it wont hurt them until the slew of AAA DX12 titles arrive. If they have new cards out by then, I guarantee they'll have addressed this and taken no prisoners. Problem is, with Win10 still in adoption mode, DX12 wont matter for the bulk of the market for quite a while until at least a healthy percentage of titles is DX12 coded. This is still too early to be truly meaningful. But Kudos to AMD for pushing Mantle and helping get hardware to do the work.
Posted on Reply
#32
Sony Xperia S
ShurikN
Why am I not surprised by any of this...
Given its growing market-share, NVIDIA could use similar tactics to keep game developers away from industry-standard API features that it doesn't support, and which rival AMD does. NVIDIA drivers tell Windows that its GPUs support DirectX 12 feature-level 12_1. We wonder how much of that support is faked at the driver-level, like async compute. The company is already drawing flack for using borderline anti-competitive practices with GameWorks, which effectively creates a walled garden of visual effects that only users of NVIDIA hardware can experience for the same $59 everyone spends on a particular game.
Why am I not surprised at all by this too ?

Nvidia is so dirty like pigs in the mud.

I keep telling you that this company is not good but how many listen to me ?
The world will become a better place when we get rid of nvidia.

Monopoly of AMD (with good hearts) will be better than monopoly of nvidia (who only look how to screw technological progress).
Posted on Reply
#33
Agility
All i see is fanboyism being anal about the post. You shallow creatures need to look at the bigger picture.
Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware.
It was mentioned that Nvidia tried to use async compute in which their chip totally does not support it A.K.A Tier 3. Why would Nvidia then tell the world that their GPU does support it? Have you guys totally forgotten about the whole big PR bull-shit of the GTX 970 memory scandal?

Maybe you guys need some reminders
NVIDIA drivers tell Windows that its GPUs support DirectX 12 feature-level 12_1. We wonder how much of that support is faked at the driver-level, like async compute.
This came to light when game developer Oxide Games claimed that it was pressured by NVIDIA's marketing department to remove certain features in its "Ashes of the Singularity" DirectX 12 benchmark.
Apparently, it shows that Nvidia just tried to bluff its way through the software via driver support and it backfired. I wouldn't be surprised if Nvidia starts blaming their so-called marketing department for the fault mentioned.

All in all, this entire post clearly mentions and tells us the way Nvidia handles their marketing and business. Simply put, they lack proper business etiquette by doing under-table nonsense and being opaque to consumers.
Posted on Reply
#34
KarymidoN
the54thvoid
^^This.

Two (and a half) situations here - simple as that:

1) What Oxide say is true and Nvidia have poor hardware level implementation and the driver level ASync is not well equipped. This gives AMD a massive boost on titles using fully fledged DX12 that utilised this part of the API (and if it's open, it should use it unless it's not required).
1 and a half) Nvidia don't require Async as the cards are well equipped to deal with all other aspects of DX12, therefore in other scenarios the lack of Async wont hamper them (but may still give AMD more leg room if ASync is the be all and end all).
2) Async isn't actually the best thing ever and isn't used or required - gives no edge to AMD in other titles. Nvidia sponsored titles will certainly do this.

Frankly if this is a real case of Bad Nvidia shenanigans, it wont hurt them until the slew of AAA DX12 titles arrive. If they have new cards out by then, I guarantee they'll have addressed this and taken no prisoners. Problem is, with Win10 still in adoption mode, DX12 wont matter for the bulk of the market for quite a while until at least a healthy percentage of titles is DX12 coded. This is still too early to be truly meaningful. But Kudos to AMD for pushing Mantle and helping get hardware to do the work.
Today in DX12 = AMD > Nvidia.
When DX12 got Popular and have a lot of games = Nvidia (new Generation) > AMD.

Nvidia Marketing department is so stupid, they said Maxwell will be compatible with all DX12 feature levels... haha, jokes on you Nvidia fanboy.
Posted on Reply
#35
Sony Xperia S
KarymidoN
Nvidia Marketing department is so stupid, they said Maxwell will be compatible with all DX12 feature levels... haha, jokes on you Nvidia fanboy.
The marketing department is the last responsible. Their job is just to make what is crap looking a little bit better.

The real monsters are those like JHH and the upper management who NEVER EVER in their lives gave the initiative for improving DX or providing more efficient layers like Mantle etc...
Posted on Reply
#37
vega22
brace yourselves, here comes the butthurt green team xD

doh!

too late :rofl:
Posted on Reply
#38
john_
DirectX 10.1 was a bug, Async Compute is also a bug. At least Oxide is not Ubisoft.
Posted on Reply
#39
rtwjunkie
PC Gaming Enthusiast
To me it matters not. Both sides take turns telling lies. I'm still on Kepler, which does have compute, and still trying to delay upgrading, whether to AMD or Nvidia...haven't decided.
Posted on Reply
#40
raptori
That is some serious disappointment. I still remember what Nvidia said: "DX12 will be supported by all cards down to Fermi", and that's Fermi and Kepler and Maxwell. Is it all bluff now? I can see myself turning DX12 options "off" in the future. What a mess :mad:
Posted on Reply
#41
geon2k2
Suddenly ... the Fury price looks justified.
Posted on Reply
#42
lilhasselhoffer
Sony Xperia S is here, brace yourselves and prepare cyanide tabs.
Sony Xperia S
Why am I not surprised at all by this too ?

Nvidia is so dirty like pigs in the mud.

I keep telling you that this company is not good but how many listen to me ?
The world will become a better place when we get rid of nvidia.

Monopoly of AMD (with good hearts) will be better than monopoly of nvidia (who only look how to screw technological progress).
AMD isn't some sort of angel. Perhaps you've got blinders on.

AMD sold Bulldozer as a Sandy Bridge killer. They cherry picked performance figures, and ran testing that specifically favored their processors. That's lying.
Nvidia sold a 4 GB video card, where the last 0.5 GB being called basically blows its kneecaps out. That's lying.
Intel regularly engages in shady business practices, and has utilized its near monopoly to stagnate progress ever since Sandy Bridge.


If Nvidia folded tomorrow AMD would gouge on GPU pricing, while justifying it as a means to make Zen a competitor to whatever Intel is offering. They could theoretically do this, assuming their management wasn't incompetent. If AMD folded tomorrow the price of all computer hardware would basically have another zero. No competition means we'd all be screwed. Both Nvidia and Intel have a documented track record of these practices. Intel won't fold. They could release a Dorito, and still have people buy it. They proved it with the thermal paste crap on Ivy Bridge and Haswell.
Sony Xperia S
The marketing department is the last responsible. Their job is just to make what is crap looking a little bit better.

The real monsters are those like JHH and the upper management who NEVER EVER in their lives gave the initiative for improving DX or providing more efficient layers like Mantle etc...
Jesus, I'm going to make the same argument here. Prove it.
1) Are there any games using the feature? Nope.
2) Do the people writing the code for the benchmark think it'll be a game changer? Nope.
3) Who was responsible for pressuring the coders to omit code that they knew made their cards suck? That's marketing.

As per your usual display, golf clap. You've somehow managed to pin a conspiracy onto the evil overlords at Nvidia. The absolute angels that are currently stripping the wall paper out of their AMD offices are poor, misunderstood, giving souls. They couldn't possibly be riding the company into the ground with painful decision after painful decision. They aren't responsible for stripping value out of the company, to meet a company valuation report that would earn them a bonus check worth more than some of their employees make in a year. They aren't cutting good, hard working individuals to meet financial objectives.

Christ! I'm between atheism and agnosticism, but that's all I can come up with. I've seen people on bad trips that made more coherent sense than you.
Posted on Reply
#43
TheMailMan78
Big Member
lilhasselhoffer
Sony Xperia S is here, brace yourselves and prepare cyanide tabs.

AMD isn't some sort of angel. Perhaps you've got blinders on.

AMD sold Bulldozer as a Sandy Bridge killer. They cherry picked performance figures, and ran testing that specifically favored their processors. That's lying.
Nvidia sold a 4 GB video card, where the last 0.5 GB being called basically blows its kneecaps out. That's lying.
Intel regularly engages in shady business practices, and has utilized its near monopoly to stagnate progress ever since Sandy Bridge.


If Nvidia folded tomorrow AMD would gouge on GPU pricing, while justifying it as a means to make Zen a competitor to whatever Intel is offering. They could theoretically do this, assuming their management wasn't incompetent. If AMD folded tomorrow the price of all computer hardware would basically have another zero. No competition means we'd all be screwed. Both Nvidia and Intel have a documented track record of these practices. Intel won't fold. They could release a Dorito, and still have people buy it. They proved it with the thermal paste crap on Ivy Bridge and Haswell.

Jesus, I'm going to make the same argument here. Prove it.
1) Are there any games using the feature? Nope.
2) Do the people writing the code for the benchmark think it'll be a game changer? Nope.
3) Who was responsible for pressuring the coders to omit code that they knew made their cards suck? That's marketing.

As per your usual display, golf clap. You've somehow managed to pin a conspiracy onto the evil overlords at Nvidia. The absolute angels that are currently stripping the wall paper out of their AMD offices are poor, misunderstood, giving souls. They couldn't possibly be riding the company into the ground with painful decision after painful decision. They aren't responsible for stripping value out of the company, to meet a company valuation report that would earn them a bonus check worth more than some of their employees make in a year. They aren't cutting good, hard working individuals to meet financial objectives.

Christ! I'm between atheism and agnosticism, but that's all I can come up with. I've seen people on bad trips that made more coherent sense than you.
WALL OF TEXT ACTIVATE!

I love reading your posts lilhasselhoffer but, man you gotta learn how to articulate your points into torpedo's and not a long spray of .22 caliber rat shot.

The only person who types more in a response than you is Ford.

Learn to hit hard and fast.
Posted on Reply
#44
KarymidoN
The thing i love about Nvidia is the way they Lie to us, and some people still loving their Lies...


GTX 970, Now Async, who's next?
Posted on Reply
#45
rtwjunkie
PC Gaming Enthusiast
KarymidoN
The thing i love about Nvidia is the way they Lie to us, and some people still loving their Lies...


GTX 970, Now Async, who's next?
See posts 40 and 43. Lies are something both sides do very well. To think otherwise would show one to be extremely gullible.
Posted on Reply
#46
MxPhenom 216
ASIC Engineer
Sony Xperia S

Why am I not surprised at all by this too ?

Nvidia is so dirty like pigs in the mud.

I keep telling you that this company is not good but how many listen to me ?
The world will become a better place when we get rid of nvidia.

Monopoly of AMD (with good hearts) will be better than monopoly of nvidia (who only look how to screw technological progress).
Any monopoly is bad. Doesn't matter who has the control over it, its bad for everyone.
Posted on Reply
#47
Ikaruga
Why are people surprised by this? DirectX 12 (and Vulkan too) both have AMD roots in them (Microsoft has two consoles in the market with AMD GPUs). This is like saying that "Lack of <random feature like proper tesselation> on GCN makes Maxwell better in Gameworks titles". Ofc Nvidia won't be faster in everything if the api is made by people from the other side. It is a bit slower in async compute atm. The question is if they can correct it within their drivers (using a little more CPU time, which we have plenty of unused on the PC, so it will make zero difference).

I would still go for NV cards with this generation (more ROPS, better tesselation, faster AA, etc), but perhaps I will change my mind with the next one.

Just my two cents:toast:
Posted on Reply
#48
GhostRyder
Whether or not this becomes a big deal has yet to be seen at least from a more than one game standpoint. I believe more information is needed on the subject from both sides before we can make our final decisions, but so far this is just another bit of "Misinformation" that has been put out from their side which seems to be happening a lot recently (Just like how there was a "Misunderstanding" about the GTX 970).

Either way, its definitely nothing we have not seen before as far as any of the sides have made up their own shares of lies over the last few years.
Posted on Reply
#49
Ikaruga
GhostRyder
Whether or not this becomes a big deal has yet to be seen at least from a more than one game standpoint. I believe more information is needed on the subject from both sides before we can make our final decisions, but so far this is just another bit of "Misinformation" that has been put out from their side which seems to be happening a lot recently (Just like how there was a "Misunderstanding" about the GTX 970).

Either way, its definitely nothing we have not seen before as far as any of the sides have made up their own shares of lies over the last few years.
There is no misinformation at all, most of the dx12 features will be supported by software on most of the cards, there are no GPU on the market with 100% top tier dx12 support (and I'm not sure if the next generation will be one, but maybe). This is nothing but a very well directed market campaign to level the fields, but I expected more insight into this from some of the TPU vets tbh (I don't mind it btw, AMD needs all the help he can get anyways).
Posted on Reply
#50
lilhasselhoffer
TheMailMan78
WALL OF TEXT ACTIVATE!

I love reading your posts lilhasselhoffer but, man you gotta learn how to articulate your points into torpedo's and not a long spray of .22 caliber rat shot.

The only person who types more in a response than you is Ford.

Learn to hit hard and fast.
Fair. My only response is that too often not explaining yourself makes you look like an ass.

More than once I've been guilty of that...sigh....
Posted on Reply