
Lack of Async Compute on Maxwell Makes AMD GCN Better Prepared for DirectX 12

So where does this leave Nvidia? I thought it had been pretty well established that Nvidia "confirmed" some time ago that Maxwell 2 (GM200, GM204, GM206) can utilize the feature. Is it now the case that they don't have (enough) async compute in hardware, tried to emulate it through software, and are seeing bad teething pains? Is it now surfacing that Nvidia perhaps covered up the truth to sell GPUs?

It's been known that AMD has baked async compute into their hardware since GCN, and even into the PlayStation and Xbox consoles. And it was known well before Maxwell that async compute was going to be something DX12 would be able to leverage.

So Nvidia has been selling a shitload of cards, and now it turns out all those cards might, at best, provide emulated support in due time. Did they intend to design hardware that offers so little native support (or none) that one might call it negligent? And what are most Maxwell (and even Kepler) owners supposed to do: wait for Pascal and happily throw more money at them… while they watch the resale value of their cards plummet? Sure, right now owners can convince themselves they can make do with some half-baked support until more DX12 games start to show up.

Nvidia must provide a clear and truthful statement as to the goings on with this, or… IDK
 
It's not like it would be the first time Nvidia resorts to a slow software implementation to cover for lacking hardware

*cough*fx5200*cough*

This will keep on happening, and their customers won't complain because "muh better in gaymes" - paid shilling after these events, like always
 
Slow clap....



A benchmark says a new feature isn't embraced by Nvidia GPUs. The benchmark says that AMD is ahead on a standard they helped write, given that Mantle functionally became the basis of DX12.



Sorry, but this is kinda derp. There are no games that actually show this in action. The people who wrote the code for the benchmark are cagey as to whether real-world performance will bear out the superiority as a real asset. Sorry, red team, this isn't a win for you.

On the other hand, this is a loss for Nvidia. They're desperately trying to play this off as an internal communication issue, but the benchmark writers are claiming Nvidia pushed for the feature to be disabled. Honestly, they could have played this the way AMD played the tessellation results snafu, but they went full McIntosh. Gotta say, objectively that's Nvidia walking into a room and slamming their heads on the table. It's a loss, but only a minor one.






Again, let's be objective. Our current node has supported three generations of hardware. AMD and Nvidia are both running on empty when it comes to real performance improvement. AMD admitted it with a functional rebrand of the 2xx series, and Nvidia did it by crippling the features in Maxwell not directly related to today's games. Neither option is good, and it's an admission that they're just treading water until 2016.
 
Good job guys!
 
Nvidia must provide a clear and truthful statement as to the goings on with this, or… IDK

^^This.

Two (and a half) situations here - simple as that:

1) What Oxide says is true: Nvidia has a poor hardware-level implementation and the driver-level async path is not well equipped. This gives AMD a massive boost in titles using fully fledged DX12 that utilise this part of the API (and if it's exposed, it should be used unless it's not required).
1 and a half) Nvidia doesn't require async because its cards are well equipped to deal with all the other aspects of DX12, so in other scenarios the lack of async won't hamper them (but it may still give AMD more legroom if async turns out to be the be-all and end-all).
2) Async isn't actually the best thing ever and isn't used or required - giving no edge to AMD in other titles. Nvidia-sponsored titles will certainly go this way.

Frankly, if this is a real case of bad Nvidia shenanigans, it won't hurt them until the slew of AAA DX12 titles arrives. If they have new cards out by then, I guarantee they'll have addressed this and taken no prisoners. The problem is, with Win10 still in adoption mode, DX12 won't matter for the bulk of the market for quite a while, at least until a healthy percentage of titles is DX12-coded. This is still too early to be truly meaningful. But kudos to AMD for pushing Mantle and helping get the hardware to do the work.
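
For anyone wondering what "using async compute" actually means at the API level, here's a minimal, purely illustrative D3D12 sketch (error handling is omitted, and the device pointer and the CreateQueues helper name are assumptions for this example): the application creates a second command queue of type COMPUTE next to its graphics queue. The API will always accept this; whether the two queues actually execute concurrently is entirely up to the hardware and driver, which is exactly the point of contention here.

Code:
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
using Microsoft::WRL::ComPtr;

// Illustrative only: create a graphics queue plus a separate compute queue.
// "Async compute" means submitting work to the second queue in the hope that
// the GPU overlaps it with graphics work; the API itself gives no guarantee.
void CreateQueues(ID3D12Device* device,
                  ComPtr<ID3D12CommandQueue>& graphicsQueue,
                  ComPtr<ID3D12CommandQueue>& computeQueue)
{
    D3D12_COMMAND_QUEUE_DESC gfxDesc = {};
    gfxDesc.Type = D3D12_COMMAND_LIST_TYPE_DIRECT;      // graphics + compute + copy
    device->CreateCommandQueue(&gfxDesc, IID_PPV_ARGS(&graphicsQueue));

    D3D12_COMMAND_QUEUE_DESC cmpDesc = {};
    cmpDesc.Type = D3D12_COMMAND_LIST_TYPE_COMPUTE;     // compute + copy only
    device->CreateCommandQueue(&cmpDesc, IID_PPV_ARGS(&computeQueue));

    // On GPUs with independent compute front-ends (GCN's ACEs) the driver can
    // run both queues concurrently; hardware without them is free to
    // serialize the compute work behind the graphics work instead.
}

In other words, a game or benchmark can't force concurrency; it can only offer the work on a second queue and see how the driver schedules it.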
 
Why am I not surprised by any of this...
Given its growing market-share, NVIDIA could use similar tactics to keep game developers away from industry-standard API features that it doesn't support, and which rival AMD does. NVIDIA drivers tell Windows that its GPUs support DirectX 12 feature-level 12_1. We wonder how much of that support is faked at the driver-level, like async compute. The company is already drawing flak for using borderline anti-competitive practices with GameWorks, which effectively creates a walled garden of visual effects that only users of NVIDIA hardware can experience for the same $59 everyone spends on a particular game.

Why am I not surprised at all by this either?

Nvidia is as dirty as pigs in the mud.

I keep telling you that this company is not good, but how many listen to me?
The world will become a better place when we get rid of Nvidia.

A monopoly of AMD (with good hearts) would be better than a monopoly of Nvidia (which only looks for ways to screw technological progress).
 
All I see is fanboyism being anal about the post. You shallow creatures need to look at the bigger picture.

Curiously, their driver reported this feature was functional but attempting to use it was an unmitigated disaster in terms of performance and conformance so we shut it down on their hardware.

It was mentioned that Nvidia tried to use async compute even though their chip totally does not support it (a.k.a. Tier 3). Why would Nvidia then tell the world that their GPUs do support it? Have you guys totally forgotten the whole big PR bullshit of the GTX 970 memory scandal?

Maybe you guys need some reminders:

NVIDIA drivers tell Windows that its GPUs support DirectX 12 feature-level 12_1. We wonder how much of that support is faked at the driver-level, like async compute.

This came to light when game developer Oxide Games claimed that it was pressured by NVIDIA's marketing department to remove certain features in its "Ashes of the Singularity" DirectX 12 benchmark.

Apparently, Nvidia just tried to bluff its way through via driver-level software support, and it backfired. I wouldn't be surprised if Nvidia starts blaming its so-called marketing department for the fault.

All in all, this whole affair tells us a lot about the way Nvidia handles its marketing and business. Simply put, they lack proper business etiquette, doing under-the-table nonsense and being opaque to consumers.
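
To make "the driver reported this feature was functional" concrete: applications and benchmarks only see whatever the driver claims through the API. Below is an illustrative sketch of such a query (QueryMaxFeatureLevel is a made-up helper name, and the device is assumed to already exist); the reported level is just a claim about the API surface and says nothing about whether the features behind it run on dedicated hardware or are handled in software.

Code:
#include <windows.h>
#include <d3d12.h>

// Illustrative only: ask the driver which feature level it claims to support.
// 'device' is assumed to be an already-created ID3D12Device.
D3D_FEATURE_LEVEL QueryMaxFeatureLevel(ID3D12Device* device)
{
    static const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1
    };

    D3D12_FEATURE_DATA_FEATURE_LEVELS info = {};
    info.NumFeatureLevels        = sizeof(requested) / sizeof(requested[0]);
    info.pFeatureLevelsRequested = requested;

    // The driver fills in MaxSupportedFeatureLevel. It's a claim about the API
    // surface, not about whether the features run on dedicated hardware or are
    // scheduled/emulated in software by the driver.
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &info, sizeof(info))))
        return info.MaxSupportedFeatureLevel;

    return D3D_FEATURE_LEVEL_11_0;  // conservative fallback
}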
 
Two (and a half) situations here - simple as that: [...]

Today in DX12: AMD > Nvidia.
When DX12 gets popular and has a lot of games: Nvidia (new generation) > AMD.

Nvidia's marketing department is so stupid; they said Maxwell would be compatible with all DX12 feature levels... haha, joke's on you, Nvidia fanboys.
 
Nvidia's marketing department is so stupid; they said Maxwell would be compatible with all DX12 feature levels... haha, joke's on you, Nvidia fanboys.

The marketing department is the last to blame. Their job is just to make crap look a little bit better.

The real monsters are those like JHH and the upper management, who have NEVER EVER in their lives taken the initiative to improve DX or provide more efficient layers like Mantle, etc...
 
brace yourselves, here comes the butthurt green team xD

doh!

too late :rofl:
 
DirectX 10.1 was a bug, Async Compute is also a bug. At least Oxide is not Ubisoft.
 
To me it matters not. Both sides take turns telling lies. I'm still on Kepler, which does have compute, and still trying to delay upgrading, whether to AMD or Nvidia...haven't decided.
 
That is some serious disappointment. I still remember Nvidia saying "DX12 will be supported by all cards down to Fermi", and that's Fermi, Kepler and Maxwell. Is it all a bluff now? I can see myself turning DX12 options "off" in the future. What a mess :mad:
 
Sony Xperia S is here, brace yourselves and prepare cyanide tabs.



Why am I not surprised at all by this either?

Nvidia is as dirty as pigs in the mud.

I keep telling you that this company is not good, but how many listen to me?
The world will become a better place when we get rid of Nvidia.

A monopoly of AMD (with good hearts) would be better than a monopoly of Nvidia (which only looks for ways to screw technological progress).



AMD isn't some sort of angel. Perhaps you've got blinders on.

AMD sold Bulldozer as a Sandy Bridge killer. They cherry-picked performance figures and ran testing that specifically favored their processors. That's lying.
Nvidia sold a 4 GB video card where calling on the last 0.5 GB basically blows its kneecaps out. That's lying.
Intel regularly engages in shady business practices, and has utilized its near monopoly to stagnate progress ever since Sandy Bridge.


If Nvidia folded tomorrow, AMD would gouge on GPU pricing while justifying it as a means to make Zen a competitor to whatever Intel is offering. They could theoretically do this, assuming their management wasn't incompetent. If AMD folded tomorrow, the price of all computer hardware would basically gain another zero. No competition means we'd all be screwed. Both Nvidia and Intel have a documented track record of these practices. Intel won't fold. They could release a Dorito and still have people buy it. They proved it with the thermal paste crap on Ivy Bridge and Haswell.

The marketing department is the last to blame. Their job is just to make crap look a little bit better.

The real monsters are those like JHH and the upper management, who have NEVER EVER in their lives taken the initiative to improve DX or provide more efficient layers like Mantle, etc...

Jesus, I'm going to make the same argument here. Prove it.
1) Are there any games using the feature? Nope.
2) Do the people writing the code for the benchmark think it'll be a game changer? Nope.
3) Who was responsible for pressuring the coders to omit code that they knew made their cards suck? That's marketing.

As per your usual display, golf clap. You've somehow managed to pin a conspiracy on the evil overlords at Nvidia. The absolute angels currently stripping the wallpaper out of their AMD offices are poor, misunderstood, giving souls. They couldn't possibly be riding the company into the ground with painful decision after painful decision. They aren't responsible for stripping value out of the company to meet a company valuation report that would earn them a bonus check worth more than some of their employees make in a year. They aren't cutting good, hard-working individuals to meet financial objectives.



Christ! I'm between atheism and agnosticism, but that's all I can come up with. I've seen people on bad trips who made more coherent sense than you.
 
WALL OF TEXT ACTIVATE!

I love reading your posts lilhasselhoffer, but man, you've gotta learn how to articulate your points into torpedoes and not a long spray of .22 caliber rat shot.

The only person who types more in a response than you is Ford.

Learn to hit hard and fast.
 



A monopoly of AMD (with good hearts) would be better than a monopoly of Nvidia (which only looks for ways to screw technological progress).

Any monopoly is bad. It doesn't matter who has control over it; it's bad for everyone.
 
Why are people surprised by this? DirectX 12 (and Vulkan too) both have AMD roots in them (Microsoft has two consoles on the market with AMD GPUs). This is like saying "lack of <some random feature like proper tessellation> on GCN makes Maxwell better in GameWorks titles". Of course Nvidia won't be faster at everything if the API is shaped by people from the other side. It is a bit slower in async compute at the moment. The question is whether they can correct it within their drivers (using a little more CPU time, of which we have plenty sitting unused on the PC, so it would make zero difference).

I would still go for NV cards with this generation (more ROPs, better tessellation, faster AA, etc.), but perhaps I will change my mind with the next one.

Just my two cents:toast:
 
Whether or not this becomes a big deal has yet to be seen, at least beyond a single game. I believe more information is needed from both sides before we can make our final decisions, but so far this is just another bit of "misinformation" put out from their side, which seems to be happening a lot lately (just like the "misunderstanding" about the GTX 970).

Either way, it's definitely nothing we haven't seen before; all sides have told their own share of lies over the last few years.
 
Whether or not this becomes a big deal has yet to be seen, at least beyond a single game. I believe more information is needed from both sides before we can make our final decisions, but so far this is just another bit of "misinformation" put out from their side, which seems to be happening a lot lately (just like the "misunderstanding" about the GTX 970).

Either way, it's definitely nothing we haven't seen before; all sides have told their own share of lies over the last few years.
There is no misinformation at all. Most of the DX12 features will be supported in software on most of the cards; there is no GPU on the market with 100% top-tier DX12 support (and I'm not sure the next generation will have one either, but maybe). This is nothing but a very well directed marketing campaign to level the field, but I expected more insight into this from some of the TPU vets, tbh (I don't mind it, btw; AMD needs all the help it can get anyway).
 