
AMD Halts Optimizations for Mantle API

The features of Mantle were planned for Direct3D 12 long before Mantle existed, and Mantle was in fact based on that draft. Mantle was a way for AMD to experiment with some of the new features. This is nothing new at all; both AMD and Nvidia provide experimental features to OpenGL and Direct3D all the time, some of which are included in subsequent specifications.
You can believe what you want to believe; all of us who lived through that history know the true story... ;B
 
You can believe what you want to believe; all of us who lived through that history know the true story... ;B
It's a simple fact that Direct3D features are drafted many years ahead (3-5 years); otherwise there would be no way to add the right set of features to the GPUs in time. Direct3D 12, like all previous versions, was drafted years ahead; anyone who believes otherwise is ignorant.

You might remember tessellation as the big feature of Direct3D 11, but did you know it was actually planned for Direct3D 10? Both AMD and Nvidia made their hardware implementations (example), AMD provided Direct3D extensions, and both provided OpenGL extensions. For some reason this was left out of the final spec, and a more mature version arrived instead in Direct3D 11.
 
Is there any particular reason why you believe AMD will have an advantage over Nvidia?

I know this one. It's mirakul, read his/her post history. :p
 
Vulkan forked from the Mantle spec,
so it's not that big of a leap ;)

This is how Vulkan was created:
Khronos: We need a new API, OpenGL Next is going nowhere.
AMD: Take the Mantle spec, please don't mess it up too much.
Khronos: Thanks, we'll fork it!

This is a true story; you can read about it and watch the Vulkan presentations. Even DICE (EA) didn't hide that Mantle became Vulkan.

If you compare Vulkan and Mantle, most of the spec is the same. There might be differences here and there, much like between a project and its fork.

You can compare Mantle and Vulkan to OpenSSL and LibreSSL (a fork that removed some code and added some code) (ignore the security mess that is OpenSSL)
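The fork claim is easy to illustrate at the surface level: most Mantle entry points have a near-identical Vulkan counterpart that differs mainly by the vendor prefix. A quick sketch (the pairs below are recalled from the published Mantle programming guide and the Vulkan spec, so treat them as illustrative rather than exhaustive):

```python
# A few corresponding entry points in Mantle (gr*) and Vulkan (vk*).
# Recalled from the two published specs; illustrative, not exhaustive.
mantle_to_vulkan = {
    "grCreateDevice":   "vkCreateDevice",
    "grQueueSubmit":    "vkQueueSubmit",
    "grCmdDraw":        "vkCmdDraw",
    "grCmdDrawIndexed": "vkCmdDrawIndexed",
}

def strip_prefix(name, prefix):
    """Drop the vendor prefix so the underlying names can be compared."""
    return name[len(prefix):]

# Every pair here maps one-to-one once the gr/vk prefix is removed.
for gr_name, vk_name in mantle_to_vulkan.items():
    print(strip_prefix(gr_name, "gr"), "==", strip_prefix(vk_name, "vk"))
```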
 
It's a simple fact that Direct3D features are drafted many years ahead (3-5 years); otherwise there would be no way to add the right set of features to the GPUs in time. Direct3D 12, like all previous versions, was drafted years ahead; anyone who believes otherwise is ignorant.

You might remember tessellation as the big feature of Direct3D 11, but did you know it was actually planned for Direct3D 10? Both AMD and Nvidia made their hardware implementations (example), AMD provided Direct3D extensions, and both provided OpenGL extensions. For some reason this was left out of the final spec, and a more mature version arrived instead in Direct3D 11.
I don't get why people want credit one way or another. Nvidia, Intel, AMD and Microsoft all likely contributed for years to the design and codebase of DX12. Mantle was likely part of AMD's proposal to Microsoft, or it could even have been part of Intel's or Nvidia's proposal, all of which they reviewed together. It doesn't matter, because at the end of the day DX12 is here and Mantle is not. No reason to beat a dead horse.
 
Has the poor performance in Project CARS been fixed yet?
[Project CARS benchmark chart at 2560x1440]

Nvidia Titan X reducing image quality - cheating...

The root of the problem is the image quality setting in Nvidia control panel. Check the difference (and proof) here:

http://hardforum.com/showpost.php?p=1041709168&postcount=84

There seems to be around a 10% performance drop after setting the quality to highest.



[Image: Mantle vs DX12 programming guide comparison]


Oh, I think there might be a fair bit, actually.
[comparison image]

Vulkan is essentially Mantle with modifications.

AMD donated their efforts to the Khronos group and some of the relationship is obvious from the name.
 
Paging w1zzard to check the Nvidia CPL setting :)
I myself use AF everywhere with a GTX 670, usually set to a sufficient value of 8x; the picture would be blatantly blurred from a side view for all to see. But I can't compare a Titan X vs a Fury X in a benchmark myself, I'm too poor to get ahold of 'em :laugh: :cry:
 
If they treat Vulkan the same way they've treated OpenGL over the years, I can't see it being a success, considering the hit-and-miss affairs we've had to put up with, with drivers either working well or not with every new release.
 
Nvidia Titan X reducing image quality - cheating...

The root of the problem is the image quality setting in Nvidia control panel. Check the difference (and proof) here:

I seem to remember Mantle had reduced draw distance in BF4, so they cheated as well.

Vulkan is essentially Mantle with modifications.

AMD donated their efforts to the Khronos group and some of the relationship is obvious from the name.
They probably hope that it will help their performance on the Linux side, since AMD GPUs are pretty bad performance-wise on Linux.
 
AMD needs to just make a decent computer chip and stop muckin' around.
2009-2015 is what?
I don't know... maybe it's been too long. They should have released a decent chip two years ago to be halfway right.
FAIL, FAIL, make me mad, FAIL.
 
DX9 is still alive because of Win XP users and old consoles; DX 9.0c is the best you can have on Win XP. Game developers could not close their eyes to potential customers using XP until recently.
Now that everyone on Win 7/8/8.1 can upgrade to Win 10 for free during the first year, DX12 will be adopted much, much faster, and it also has huge advantages.
Except DX11 won't be dying off for a looooong time. It won't disappear just because DX12 comes out. Has DX9 disappeared? No, in fact, there are recent games that have been released as DX9 games (and look quite good, btw). Expect the same kind of situation for years with 11.
 
W10 had better be better than 7, and definitely 8; otherwise MS is tying DX to the OS too much, like they did with IE.
 
AMD needs to just make a decent computer chip and stop muckin' around.
2009-2015 is what?
I don't know... maybe it's been too long. They should have released a decent chip two years ago to be halfway right.
FAIL, FAIL, make me mad, FAIL.
http://hardforum.com/showpost.php?p=1041709226&postcount=100

On AMD, default = higher quality, but it hurts performance.

On NV, default = worse quality, but it runs faster (10% according to BF4 tests).

Do a detailed analysis in recent games: image quality comparisons on default driver settings (since that's what you test on), same settings in-game, same location. Then do it again with quality forced in CCC/NVCP.

You will get a ton of site hits from everyone interested in this issue. If it turns out to be true, then basically NV has "optimized" their way to extra performance for years at the expense of IQ.
 
Nvidia Titan X reducing image quality - cheating...
The root of the problem is the image quality setting in Nvidia control panel. Check the difference (and proof) here:
http://hardforum.com/showpost.php?p=1041709168&postcount=84
There seems to be around a 10% performance drop after setting the quality to highest.
Well, firstly, what does this have to do with AMD dropping Mantle support? Secondly, if you're going to use a case in point, maybe you should check that you aren't spreading FUD. Why not actually spend a couple of minutes to see if it can be shown to be just that?
The originator of that comparison video clarified the issue almost a week ago:
Guys, I was the author of the original video, and after speaking with several people, there is something up with my setup, or at least there was. Everything is correct and present now and all settings are identical. I was very new to using a capture card as well, hence a few silly errors.
I have since redone the videos, and the IQ is identical as far as I can tell.
Maybe you should have bothered checking past the sensationalist fanboy salivating and looked at the original posting on the OcUK forums - the whole issue was cleared up this time last week.

BTW (and on topic); What happened to the "over 100 game development teams who signed up for Mantle" ? Sorry guys, start over with DX12, it's all good!
 
If you read https://www.youtube.com/watch?v=rswpOo7kcSk:

"Thought it only fair to redo this Battlefield 4 bench with the exact same settings (Max Quality in the NCP). With the IQ exactly the same and settings exactly the same."

The issue is with NVIDIA's default settings, i.e. the "out-of-the-box" settings. Websites like HardOCP use the driver's default settings for their PC benchmarks.

PS: I do have an MSI TwinFrozr GeForce 980 Ti OC. By default, the texture level is set to "quality".
 
The issue is with NVIDIA's default settings, i.e. the "out-of-the-box" settings. Websites like HardOCP use the driver's default settings for their PC benchmarks.
I would expect they don't use driver defaults; I would expect they set the drivers to let the application make the choice, as that would compare cards apples to apples rather than comparing which one has more relaxed settings in the driver panel.

I do wear glasses, but watching that video I didn't really see anything too noticeable in graphics quality. Really, IMO, if the graphical difference is so minor that it doesn't immediately pop out, then it doesn't matter. Kinda like TressFX in Tomb Raider: it didn't really matter, since you weren't playing the whole game watching her hair move around.
 
There is indeed a lot of copy-pasta there. Question is: which came first? Did AMD copy Microsoft's documentation or did Microsoft copy AMD's documentation? I guess at the end of the day it doesn't really matter. Mantle is done, Direct3D and Vulkan go on.

I really think if AMD had usurped DX12 and then ran off and handed it to Khronos, MSFT's lawyers might have mentioned it to a judge somewhere by now. On the other hand, AMD did publicly say they gave all of their Mantle work to MSFT to use in DX12. There is zero evidence that AMD took DX12, or DX12 code, and made Mantle. Not from AMD or MSFT.

If you look at AMD's entire product stack for PC and realize the serial nature of DX11 was really holding it back, it's not too hard to understand the motivation behind Mantle and connect the dots to DX12.
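The "serial nature of DX11" point is the heart of it: D3D11 funnels draw submission through a single immediate context, while Mantle, Vulkan, and D3D12 let each thread record its own command buffer and then submit them all with one cheap queue operation. A minimal sketch of that threading model, with plain Python lists standing in for real command buffers (the names here are illustrative, not actual API calls):

```python
import threading

def record_in_parallel(num_threads=4, draws_per_thread=1000):
    """Each worker records draws into its own command buffer with no shared
    lock -- the Mantle/Vulkan/D3D12 model. Under D3D11, all of these draws
    would have to funnel through one immediate context instead."""
    buffers = [[] for _ in range(num_threads)]

    def record(t):
        for i in range(draws_per_thread):
            buffers[t].append(("draw", t * draws_per_thread + i))

    workers = [threading.Thread(target=record, args=(t,))
               for t in range(num_threads)]
    for w in workers:
        w.start()
    for w in workers:
        w.join()

    # "Submit": a single queue operation hands every recorded buffer over.
    return buffers

buffers = record_in_parallel()
print(len(buffers), sum(len(b) for b in buffers))  # 4 buffers, 4000 draws total
```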
 
So the absence of lawsuits suggests Microsoft copied AMD's documentation. Makes sense.
 
So the absence of lawsuits suggests Microsoft copied AMD's documentation. Makes sense.
Or maybe AMD shared it with Microsoft on purpose. I mean, AMD and MS are not enemies; on the contrary, it is in AMD's best interest to have good game compatibility on the Windows platform. Mantle was a tech demo until D3D12's release; now AMD will just optimize the drivers for D3D12. I don't believe Vulkan will be a competitor to D3D12; rather, it will be used mostly for professional apps.
 
All games porting to Linux will eventually use Vulkan. Vulkan is replacing OpenGL.
 