Tuesday, January 23rd 2018

AMD Cancels Implicit Primitive Shader Driver Support

Primitive shaders are lightweight shaders that break down the separation between vertex and geometry shaders, promising performance gains in games that support them. Initially announced at the Radeon RX Vega launch, the feature was delayed again and again. At one of its 2018 International CES press briefings, AMD reportedly announced that it has cancelled the implicit driver path for primitive shaders. Game developers will still be able to implement primitive shaders on AMD hardware through a (yet to be released) explicit API path. The implicit driver path was the more interesting technology, though, since it could have provided meaningful performance gains to existing games and cut down developer effort on games in development. AMD did not state the reasons behind the move.

To explain the delay, some speculated that the primitive shader feature was irreparably broken in hardware. That no longer seems to be the case, now that we are hearing about upcoming API support for it, so the announcement can also be read as good news for Vega owners.
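To make the distinction in the article concrete, here is a toy model of the two paths. This is purely illustrative: AMD never published the explicit primitive shader API, so every class and parameter name below is invented for the sketch, and the "driver" is just a stand-in object, not real driver code.

```python
# Toy model of an implicit driver path vs. an explicit API path for
# primitive shaders. All names are hypothetical; AMD's real API was
# never released.

class Driver:
    """Hypothetical driver that may merge the vertex and geometry
    stages into a single 'primitive' stage."""

    def __init__(self, implicit_primitive_shaders=False):
        self.implicit = implicit_primitive_shaders

    def compile_pipeline(self, stages, use_primitive_shader=False):
        # Explicit path: the game opts in per pipeline.
        if use_primitive_shader:
            return ["primitive"] + [s for s in stages
                                    if s not in ("vertex", "geometry")]
        # Implicit path (the cancelled one): the driver would have
        # rewritten the pipeline transparently, no game changes needed.
        if self.implicit and "vertex" in stages:
            return ["primitive"] + [s for s in stages
                                    if s not in ("vertex", "geometry")]
        return list(stages)

# An existing game that knows nothing about primitive shaders:
legacy_stages = ["vertex", "geometry", "fragment"]

driver = Driver(implicit_primitive_shaders=False)  # what actually shipped
print(driver.compile_pipeline(legacy_stages))
# ['vertex', 'geometry', 'fragment']  -> legacy games see no benefit

# Only a game patched to opt in through the explicit path benefits:
print(driver.compile_pipeline(legacy_stages, use_primitive_shader=True))
# ['primitive', 'fragment']
```

The sketch shows why the cancellation matters: with the implicit flag gone, the merged stage is only reachable through the explicit opt-in, i.e. per-game developer work.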
Source: Y33H@ (Golem's Marc Sauter) on 3DCenter.org Forums

39 Comments on AMD Cancels Implicit Primitive Shader Driver Support

#26
Vya Domus
CandymanGRThat's a fact of life.
Fact of life? What is this, a Morgan Freeman documentary? :kookoo:

This can't possibly get any more cringey or unnecessarily pompous.
Posted on Reply
#27
CandymanGR
Vya DomusFact of life ? What is this , a Morgan Freeman documentary ? :kookoo:

This can't possibly get any more cringey or unnecessarily pompous.
The truth usually is simple yet people cannot see it.
Posted on Reply
#28
Flanker
If it was going to be a feature exclusive to AMD cards, it's probably a good call to save resources by cancelling it.
Posted on Reply
#29
MuhammedAbdo
iOThey didnt really cancel anything. PS was always planned to be implemented per game and the implicit path was only an option. Still bad.
What is the source of this slide please?
Posted on Reply
#30
GoldenX
Maybe GCN is a little old by now?
Posted on Reply
#31
Captain_Tom
R0H1TDoesn't the upcoming FC5 advertise primitive shaders, rapid packed math among others?
Yep. I would say this is overall bad news though because it was somewhat expected that AMD would bring a 10-30% performance boost at some point through a big feature driver.

But as this article points out - the good news is it is still being implemented on plenty of games anyways. And when it is implemented, Vega 64 is indeed as strong as the Titan Xp...
Posted on Reply
#32
Xajel
krukDid anyone here read the actual article?
The original promise was that it would be enabled at the driver level; developers wouldn't need to change a thing to enable it, so everybody would directly benefit from it...

But it seems AMD saw that it required a huge amount of work on their side, so they forgot about their promise and dropped it in the hands of developers.
Posted on Reply
#33
XiGMAKiD
R0H1TYou sure about that?
If they pull a Doom-like performance with FC5 then it's gonna be great
Posted on Reply
#34
TheinsanegamerN
MuhammedAbdoThis was pretty much a given; we all knew Vega's imaginary features were nothing more than empty promises. Those who expected Vega to gain 50% more performance through imaginary drivers have nothing but wishful thinking to blame for their false hope.
And yet, so many in the AMD community screeched about how VEGA and FineWine would totally curb-stomp Nvidia, once all the features were enabled.

Case in point:
RejZoRSo, they were selling us hardware on premise of awesome new features that are now getting sacked. Come on AMD, do you even try anymore?
Only the most dedicated AMD fans are remotely surprised by this. Everybody was pointing out AMD's hot air machine back when VEGA came out.
XajelThe original promise was that it would be enabled at the driver level; developers wouldn't need to change a thing to enable it, so everybody would directly benefit from it...

But it seems AMD saw that it required a huge amount of work on their side, so they forgot about their promise and dropped it in the hands of developers.
And given how amazing DX12 mGPU support has been, I'm sure we will see this feature in all of 1 game that bothers to support it.
Posted on Reply
#35
RejZoR
krukDid anyone here read the actual article?
Which essentially means we'll never see it in action. Devs need easy-to-use tooling, not to do the work AMD should have done by providing proper support for it.
Posted on Reply
#37
kruk
RejZoRWhich essentially means we'll never see it in action. Devs need easy-to-use tooling, not to do the work AMD should have done by providing proper support for it.
How exactly do you know that? There is no API path available yet; again, it says so in the article. The implicit path is probably a PITA to get right, and they skipped it to save on resources. The explicit path might be a piece of cake (or a PITA again), but we won't know until the specs are out... And the devs will decide if they want to support this path on RX Vega, future AMD and Intel APUs, and Macs.
Posted on Reply
#38
TheinsanegamerN
Captain_TomCorrect:

digiworthy.com/wp-content/uploads/2017/11/Wolfenstein-2-benchmarks_1440pR.png
C'mon Captain Tom, we already know you're an AMD fanboi; the least you could do is post something creative.

Instead, you resort to benchmarks from a single game, running on an engine that is known to favor AMD hardware. Cherry picking benchmarks is a long overplayed, predictable tactic.

On average, Vega 64 is about equal to a GTX 1080, a chip that was both cheaper and more power efficient. Vega 56 was at least performance-competitive with the 1070. Both came out far too late for AMD, and nobody believed that these chips would ever challenge more powerful chips, no matter how many driver shenanigans AMD pulls.
Posted on Reply