
AMD's Revolutionary Mantle Graphics API Adopted by Various Developers

One day, when Microsoft goes bankrupt and we can no longer play DX-based games, will that be grounds to ban DX?

I wonder why there are so many computer-illiterate people (to put it mildly; I didn't want to call anyone a retard) on a website that is supposed to be a hub of IT.

Direct3D is an open API - you can reimplement it if you want, and that has already been done for Wine/Linux (alas, only for Direct3D 9c and partly 10; the Direct3D 11 effort is largely dormant since it's very difficult and the project lacks resources).

And I don't care about PhysX - it's an add-on; every game that has it can run without PhysX hardware, or PhysX can use your CPU for calculations.

Lame, so lame. I'm really disappointed with the level of technical discussion here. Pure fanboyism, no valid information, no arguments - that's what I see here.
 
PC games ported to consoles are a rarity. The vast majority of games are written using the low-level APIs on both consoles, and then the code stack is ported to the PC, which involves rewriting all the code while keeping the code structure the same (most of the time).

UE3 and UE4 are both mediocre game engines not designed for AA or AAA games.

Yeah, this quote really emphasizes what we're dealing with here - UE3/UE4 power over a hundred games, and some of them are, let's say, quad-A games.
 
Maybe you didn't know this, but even if you disable PhysX in some games, the game still uses PhysX for physics simulation (rigid bodies, collision detection, etc.), and it runs on the CPU. Try Alice: Madness Returns to see what I mean. Also, here is the list of games that use CPU-only PhysX, so you don't even need an NVIDIA GPU and can run them smoothly on an ATI GPU.

A friend of mine who had an HD 6870 bought, used for a month, and then sold a second-hand GTX 260 just to play Alice. JUST TO PLAY ALICE. Even my 550 Ti back then ran the game easily with PhysX enabled. On the other hand, a six-core Phenom II overclocked to 4 GHz paired with his HD 6870 would drop to single digits whenever there were physics effects on screen. I also use Alice all the time as an example of a great implementation of PhysX in a game; I ignore the Batman games that everyone brings up when talking about PhysX. If you play Alice with PhysX enabled, you play a totally different game - much, much better looking. It is like comparing low quality settings with ultra. I played Alice with my 560 Ti and enjoyed it. If I play it again, there is not a chance in a million that I'll play it without an NVIDIA card.
So maybe you don't know what you are talking about.

In case you didn't know, PhysX SDK 3.2 and later already fixed the unoptimized x87 code paths for modern CPUs, and according to this test (with PhysX SDK 3.3) it even beats the AMD-sponsored Bullet physics engine by a huge margin. Sadly, there are no games using PhysX SDK 3.2+ as of now.

Sadly........

Plus, PhysX is free to use, easy to implement in your games, supported by many 3D game engines, and comes with a lot of tutorials, demos, support and resources. Heck, universities around the world even teach it to their students. You should know it's very hard to write your own complex physics simulation (you also need to make sure your algorithms are fast enough and not resource-hungry).
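
For what it's worth, a CPU-only PhysX scene really is just a few lines of setup. This is a minimal sketch against the PhysX 3.x C++ API (exact initialization varies a bit between SDK versions; the two worker threads, the gravity value and the sphere-on-a-plane scene are just illustrative):

// Minimal CPU-only PhysX 3.x scene: no GPU/CUDA context is ever created,
// so the same code runs on systems with AMD, Intel or NVIDIA graphics.
#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // Scene descriptor: gravity plus a CPU dispatcher with 2 worker threads.
    PxDefaultCpuDispatcher* dispatcher = PxDefaultCpuDispatcherCreate(2);
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = dispatcher;
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // One dynamic sphere dropped onto a static ground plane.
    PxMaterial* material = physics->createMaterial(0.5f, 0.5f, 0.1f);
    PxRigidStatic* ground = PxCreatePlane(*physics, PxPlane(0, 1, 0, 0), *material);
    scene->addActor(*ground);
    PxRigidDynamic* ball = PxCreateDynamic(
        *physics, PxTransform(PxVec3(0, 10, 0)), PxSphereGeometry(0.5f), *material, 1.0f);
    scene->addActor(*ball);

    // Step the simulation at 60 Hz for one simulated second -- all on the CPU.
    for (int i = 0; i < 60; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);   // block until the step is done
    }

    scene->release();
    dispatcher->release();
    physics->release();
    foundation->release();
    return 0;
}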

It is so free that the drivers are locked. Jen-Hsun Huang himself comes to your house and decides which card you will have as primary if you want to enable PhysX in a game. If you don't accept his... advice, you end up with single-digit fps. Nice, isn't it?

About Mantle running on NVIDIA and Intel GPUs... I don't think that will be possible, dude - it's low level, after all. But I might be wrong.

PhysX can't run on AMD or Intel GPUs, dude.

Also, didn't John Carmack already make the point that NVIDIA's OpenGL can be optimized down to a similarly low level? Even AMD said they will release special OpenGL extensions that are on par with Mantle.
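
For reference, the "low-level OpenGL" that Carmack and AMD are pointing at mostly means things like persistent-mapped buffers and indirect multi-draw: the application fills draw parameters itself and submits a whole batch with one call. A rough sketch against core GL 4.3/4.4, assuming a context, a loader such as GLEW, and the shaders/vertex state are already set up elsewhere (the 36-vertex meshes are hypothetical):

// Low-overhead GL submission sketch: one persistently mapped indirect buffer the
// CPU writes into directly, plus a single glMultiDrawArraysIndirect call per frame.
// Real code would fence or multi-buffer the region so the GPU is not reading it
// while the CPU writes; that is omitted here for brevity.
#include <GL/glew.h>

struct DrawArraysIndirectCommand {            // layout fixed by the GL spec
    GLuint count, instanceCount, first, baseInstance;
};

static const int kMaxDraws = 1024;
static GLuint indirectBuf = 0;
static DrawArraysIndirectCommand* cmds = nullptr;   // CPU-visible, stays mapped

void initIndirectBuffer()
{
    glGenBuffers(1, &indirectBuf);
    glBindBuffer(GL_DRAW_INDIRECT_BUFFER, indirectBuf);
    const GLbitfield flags = GL_MAP_WRITE_BIT | GL_MAP_PERSISTENT_BIT | GL_MAP_COHERENT_BIT;
    // Immutable storage (GL 4.4) that can stay mapped for the buffer's lifetime.
    glBufferStorage(GL_DRAW_INDIRECT_BUFFER,
                    kMaxDraws * sizeof(DrawArraysIndirectCommand), nullptr, flags);
    cmds = static_cast<DrawArraysIndirectCommand*>(
        glMapBufferRange(GL_DRAW_INDIRECT_BUFFER, 0,
                         kMaxDraws * sizeof(DrawArraysIndirectCommand), flags));
}

void drawFrame(int drawCount)
{
    // Fill the draw parameters straight from the CPU - no per-draw driver calls.
    for (int i = 0; i < drawCount; ++i)
        cmds[i] = { 36u, 1u, GLuint(i * 36), GLuint(i) };   // hypothetical meshes

    glBindBuffer(GL_DRAW_INDIRECT_BUFFER, indirectBuf);
    // One API call submits all drawCount draws (GL 4.3).
    glMultiDrawArraysIndirect(GL_TRIANGLES, nullptr, drawCount, 0);
}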

Yep, AMD gets the contract for the Xbox One, announces Mantle, and suddenly everyone remembers OpenGL. Pure coincidence, dude, pure coincidence.


I totally agree with you that AMD might increase their prices if they capture more of the market than NVIDIA, and also if more games start using the Mantle API. It would be the same as with the AMD Athlon FX, back when AMD still dominated Intel.

I was expecting you to agree ONLY with the second paragraph.

BTW dude, in case you didn't know, without Mantle you still get optimized 3D graphics in your games. :p

Have a nice day.
 
Everyone posting: Mantle is open. The more we post, the more open it gets.
Everyone posting: Mantle is open. The more we post, the more open it gets.
Everyone posting: Mantle is open. The more we post, the more open it gets.
.....}
until (end of time)

:roll::laugh:
 
birdie said:
Direct3D is an open API - you can reimplement it if you want, and that has already been done for Wine/Linux (alas, only for Direct3D 9c and partly 10; the Direct3D 11 effort is largely dormant since it's very difficult and the project lacks resources). [...]

Isn't a lack of resources to develop emulators effectively the same as a closed API? After all, Mantle can eventually become an open API when it is advantageous to do so; all AMD needs to do is release the documentation, and anyone with time and resources can emulate it.

As a side note, my dad did his programming on punch cards. Given that punch-card machines are no longer widely available, the work he did is more or less dead - the cost of reviving it would be prohibitive. I suspect the same could be said for DirectX games if Microsoft ever stops ruling the roost.
 
NVAPI anyone?
 
A friend of mine who had an HD 6870 bought, used for a month, and then sold a second-hand GTX 260 just to play Alice. [...] It is so free that the drivers are locked. [...] PhysX can't run on AMD or Intel GPUs, dude. [...] Have a nice day.

First of all, I didn't say that PhysX runs on AMD and Intel GPUs; I said it runs on the CPU. AMD also makes CPUs, right?

Second, I think you need to read the link I gave you.

Third, you need to see the list of games that don't require any NVIDIA GPU, since you said earlier that it only supports PCs with an NVIDIA GPU. It's up to the developer whether to run it on the GPU or not. And if you say NVIDIA pays developers, well, so does AMD with DICE.

Fourth, I can play Alice: Madness Returns with PhysX turned off flawlessly on my AMD A6-3650, using only the APU. Even with PhysX turned off, you can still see the hair, cloth, etc. moving. I guess it's the same for this guy.

Fifth, what do you mean the drivers are locked? Do you think it's like NVIDIA's binary driver for Linux?

Sixth, if you mean PhysX is not open source - yeah, neither is Havok. Also, PhysX is not a driver.

Seventh, you don't need Jen-Hsun to come to your house for PhysX; you can get it directly here. If you are a game programmer, you should know by now how easy it is to program physics effects using PhysX compared to Bullet (see the sketch after this post); if you are not, please use this 3D engine to learn.

Lastly, I don't really care, dude, as long as the software works on all PC hardware.
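
On the seventh point about PhysX vs. Bullet: for a basic rigid-body scene the two are honestly comparable in effort. Here is a minimal Bullet sketch (the usual sphere-dropped-on-a-plane demo; masses, sizes and the 60 Hz step are arbitrary):

// Minimal Bullet rigid-body world: a sphere dropped onto a static ground plane,
// stepped on the CPU. Roughly the Bullet equivalent of the basic PhysX setup.
#include <btBulletDynamicsCommon.h>
#include <cstdio>

int main()
{
    // Standard Bullet boilerplate: collision config, dispatcher, broadphase, solver.
    btDefaultCollisionConfiguration config;
    btCollisionDispatcher dispatcher(&config);
    btDbvtBroadphase broadphase;
    btSequentialImpulseConstraintSolver solver;
    btDiscreteDynamicsWorld world(&dispatcher, &broadphase, &solver, &config);
    world.setGravity(btVector3(0, -9.81f, 0));

    // Static ground plane at y = 0 (mass 0 => static body).
    btStaticPlaneShape groundShape(btVector3(0, 1, 0), 0);
    btDefaultMotionState groundMotion;
    btRigidBody ground(btRigidBody::btRigidBodyConstructionInfo(
        0.0f, &groundMotion, &groundShape));
    world.addRigidBody(&ground);

    // Dynamic sphere starting 10 units up.
    btSphereShape ballShape(0.5f);
    btVector3 inertia(0, 0, 0);
    ballShape.calculateLocalInertia(1.0f, inertia);
    btDefaultMotionState ballMotion(
        btTransform(btQuaternion::getIdentity(), btVector3(0, 10, 0)));
    btRigidBody ball(btRigidBody::btRigidBodyConstructionInfo(
        1.0f, &ballMotion, &ballShape, inertia));
    world.addRigidBody(&ball);

    // Simulate one second at 60 Hz and print where the ball ended up.
    for (int i = 0; i < 60; ++i)
        world.stepSimulation(1.0f / 60.0f);
    btVector3 p = ball.getCenterOfMassPosition();
    std::printf("ball at (%.2f, %.2f, %.2f)\n", p.x(), p.y(), p.z());

    world.removeRigidBody(&ball);
    world.removeRigidBody(&ground);
    return 0;
}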
 
birdie said:
I wonder why there are so many computer-illiterate people (to put it mildly; I didn't want to call anyone a retard) on a website that is supposed to be a hub of IT. [...] Lame, so lame. I'm really disappointed with the level of technical discussion here. Pure fanboyism, no valid information, no arguments - that's what I see here.

Calling others "retards" doesn't make you smart.
Not to mention that the last paragraph could easily describe your own posts.
 
First of all, PhysX is not an API. It is only middleware that adds a layer of code to pull strings within the main engine.

birdie said:
Direct3D is an open API - you can reimplement it if you want [...]

D3D is open in the sense that everyone can use it, but it's closed in the sense that no one can modify, alter, or change it to suit their needs. For example, if you want to push commands outside the managed memory resources, you have to wait for the pool after GDI has fully flushed its buffer. That is very inefficient, yet you can't use direct memory access to accelerate the pipeline, because Microsoft forbids it.

My point is there's nothing wrong with the AMD SDK or the NVIDIA SDK; both do a good job and get the most out of their products (graphics cards). The real culprit is D3D. Period.
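
To make the "no direct memory access" complaint concrete: in D3D11 the application never sees a raw GPU pointer - dynamic resources are updated through Map/Unmap and the driver decides where that memory lives and when it moves, which is exactly the overhead Mantle promises to strip away by exposing memory and command buffers to the application. A minimal sketch of the usual D3D11 pattern, assuming a device and immediate context already exist:

// D3D11 dynamic constant-buffer update: memory placement and synchronization are
// hidden behind the driver's Map/Unmap - the application never touches GPU memory.
// Assumes `device` (ID3D11Device*) and `ctx` (ID3D11DeviceContext*) already exist.
#include <windows.h>
#include <d3d11.h>
#include <cstring>

struct PerFrameConstants { float viewProj[16]; float time; float pad[3]; };  // 80 bytes

ID3D11Buffer* createDynamicConstantBuffer(ID3D11Device* device)
{
    D3D11_BUFFER_DESC desc = {};
    desc.ByteWidth      = sizeof(PerFrameConstants);      // must be a multiple of 16
    desc.Usage          = D3D11_USAGE_DYNAMIC;            // driver-managed placement
    desc.BindFlags      = D3D11_BIND_CONSTANT_BUFFER;
    desc.CPUAccessFlags = D3D11_CPU_ACCESS_WRITE;
    ID3D11Buffer* buffer = nullptr;
    device->CreateBuffer(&desc, nullptr, &buffer);
    return buffer;
}

void updateConstants(ID3D11DeviceContext* ctx, ID3D11Buffer* buffer,
                     const PerFrameConstants& data)
{
    D3D11_MAPPED_SUBRESOURCE mapped = {};
    // WRITE_DISCARD: the driver hands back *some* writable memory and silently
    // renames the old allocation; where it actually lives is not the app's business.
    if (SUCCEEDED(ctx->Map(buffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped))) {
        std::memcpy(mapped.pData, &data, sizeof(data));
        ctx->Unmap(buffer, 0);
    }
    ctx->VSSetConstantBuffers(0, 1, &buffer);
}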
 
First of all, I didn't say that PhysX runs on AMD and Intel GPUs; I said it runs on the CPU. AMD also makes CPUs, right? [...] Lastly, I don't really care, dude, as long as the software works on all PC hardware.


You ignore many facts, and your English is much worse than mine. You make no sense, and you don't understand what I am writing to you.
 
Lame, so lame. I'm really disappointed with the level of technical discussion here. Pure fanboyism, no valid information, no arguments - that's what I see here.
That pot... it's black! LOOK OUT, it's calling the kettle black too! Oh no3s!

Glass houses, don't throw stones if you live in them...

birdie said:
if, for instance, AMD goes bankrupt, all those Mantle-"powered" games won't run anywhere else ever again.
With respect, it is hard to take someone seriously who doesn't appear to have a grasp of the most basic concepts of the technology in the first place... ;)

birdie said:
or PhysX can use your CPU for calculations
While true... there is a MONUMENTAL performance hit in most titles where you do that, so it really isn't a viable option in most cases.
 
You ignore many facts, and your English is much worse than mine. You make no sense, and you don't understand what I am writing to you.

Sorry, English is not my first language, and I rarely use it for speaking or writing.

So, what are the facts that I ignored, my friend? Can you at least list them for me so I can learn something new? At least I included links that support my view. I don't know if you ever opened any of the links I gave you, though.

Also, I forgot to mention this:

I didn't say that games with GPU PhysX acceleration turned on will run faster on the CPU. I said that even if you turn PhysX off, you still get all the physics simulation - which might even be better than other physics middleware out there - without sacrificing performance (Alice: Madness Returns with PhysX turned off still has hair and cloth simulation). Also, you should try the games that use only CPU acceleration, such as Castlevania: Lords of Shadow, and maybe games built on UE3.

Also, you said that turning PhysX off is like comparing low quality settings with ultra. It's true that you won't get the fancy effects, but do you honestly think a cheap-ass company like NVIDIA would give away its advantage to other manufacturers for free if it didn't serve its interests? They are a business after all; they need to sell products. Also, when you buy a low- to mid-range graphics card, do you expect it to run every game at ultra settings? If you buy a standard Toyota Camry, do you expect it to have the mileage of the hybrid Camry? If you buy an apple, do you expect it to taste like an orange?

Oh, and in case you forgot, NVIDIA already said they would license CUDA and PhysX to AMD, and AMD started bashing NVIDIA instead.
 
Sorry, English is not my first language, and I rarely use it for speaking or writing. [...] Oh, and in case you forgot, NVIDIA already said they would license CUDA and PhysX to AMD, and AMD started bashing NVIDIA instead.

There's one major point the whole PhysX brigade seems to be missing: it can all be done easily and quickly on any DirectCompute or OpenCL GPU, without developers paying NVIDIA per-feature licences and that fee being reflected in what I pay.
AMD = open = Qualcomm = Samsung = Sony = MS, all driving the same features and functions in the same coherent ways.
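
On the DirectCompute/OpenCL point: even something as simple as integrating a particle system can be written once in OpenCL and run on AMD, NVIDIA and Intel GPUs alike. A minimal sketch (explicit Euler, not a real solver; device selection and error handling are kept to the bare minimum):

// Vendor-neutral GPU "physics": explicit-Euler particle integration in OpenCL.
// Runs on any OpenCL device (AMD, NVIDIA, Intel); no vendor SDK required.
#include <CL/cl.h>
#include <vector>
#include <cstdio>

static const char* kSrc = R"(
__kernel void integrate(__global float4* pos, __global float4* vel, float dt) {
    int i = get_global_id(0);
    vel[i].y -= 9.81f * dt;      // gravity
    pos[i]   += vel[i] * dt;     // Euler step
}
)";

int main()
{
    const size_t n = 4096;
    std::vector<cl_float4> pos(n), vel(n);           // zero-initialized
    for (size_t i = 0; i < n; ++i) pos[i].s[1] = 100.0f;   // start 100 units up

    cl_platform_id platform; clGetPlatformIDs(1, &platform, nullptr);
    cl_device_id device;     clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, nullptr);
    cl_int err;
    cl_context ctx     = clCreateContext(nullptr, 1, &device, nullptr, nullptr, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, device, 0, &err);
    cl_program prog    = clCreateProgramWithSource(ctx, 1, &kSrc, nullptr, &err);
    clBuildProgram(prog, 1, &device, nullptr, nullptr, nullptr);
    cl_kernel kernel   = clCreateKernel(prog, "integrate", &err);

    cl_mem posBuf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                   n * sizeof(cl_float4), pos.data(), &err);
    cl_mem velBuf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                   n * sizeof(cl_float4), vel.data(), &err);
    float dt = 1.0f / 60.0f;
    clSetKernelArg(kernel, 0, sizeof(cl_mem), &posBuf);
    clSetKernelArg(kernel, 1, sizeof(cl_mem), &velBuf);
    clSetKernelArg(kernel, 2, sizeof(float), &dt);

    // One simulated second at 60 Hz, entirely on whatever GPU is installed.
    for (int step = 0; step < 60; ++step)
        clEnqueueNDRangeKernel(q, kernel, 1, nullptr, &n, nullptr, 0, nullptr, nullptr);
    clEnqueueReadBuffer(q, posBuf, CL_TRUE, 0, n * sizeof(cl_float4), pos.data(),
                        0, nullptr, nullptr);
    std::printf("particle 0 height after 1s: %.2f\n", pos[0].s[1]);

    clReleaseMemObject(posBuf); clReleaseMemObject(velBuf);
    clReleaseKernel(kernel); clReleaseProgram(prog);
    clReleaseCommandQueue(q); clReleaseContext(ctx);
    return 0;
}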
 
This I did not know. I think most folks assumed, given what AMD said about it initially, that it would require the GCN architecture. If what you say is true, then that's fab. :rockout:

All we need is for NVIDIA and AMD to actually fucking co-operate on something for once and we'll all be so much better off for it. I'll still buy NVIDIA because I'm an idiot who only buys expensive hardware, and someone else will buy AMD because they need a new hairdryer.

j/k people......

But really, it's like Apple v Samsung - just co-operate you money whoring bastards.

Then we will get patent lawsuit bullshit monthly. :mad:
 
I saw someone say a few posts back that Direct3D is closed - it's actually not completely closed. You just can't hack it all to hell. You can make changes and customize the way certain things function - that's why certain DX11 games look and run like crippity crap and others look and run great. It's also the annoying reason every game you install on Steam has to re-install DirectX...
 
Like I said, on the whole you're trolling every AMD thread... yawn.

What I find funny is that you're the only one who responds to it.

That said, I might jump back to AMD eventually for the performance. It all depends on what the aftermarket 290 (non-X) cards can do.

I go where the performance is, not where the fruity APIs are - even if they take off, they won't be worth a damn for another 2-3 years, at which point these GPUs will be obsolete.

Still, I do hope my lack of faith in AMD and their Mantle API is proved wrong and they hit a homer with it; I just seriously doubt it.

What's really funny is my AMD vs. Intel/NVIDIA purchase history.

7800GTX 512 / 8800 GTS 640mb / GTX 780 = 3 GPUs

4870x2 / 5450 / 5850x2 / 6970x2 / 6950x2 / 7970 = 9 GPUs

Athlon x2 4400+ / Phenom II 940 / Phenom II 965 = 3 CPUs
2500k / 3770k / = 2 CPUs

Test system hardware is not counted.

So I do troll the threads but I still buy AMD more often than not.
 
Call me a fanatic, but I want Mantle to die ASAP.

A proprietary 3D rendering API just shouldn't exist - if, for instance, AMD goes bankrupt, all those Mantle-"powered" games won't run anywhere else ever again.

Even though Glide didn't live long, we are still left with a whole lot of games you won't be able to run because Glide is no longer supported.

Mantle is not good news for anyone - not even for AMD, since they are dumping a sh*tload of money into a technology that will have a lifespan of several years at most, and that won't be supported by up to 75% of x86 GPUs (yes, I count Intel too, since a lot of people buy laptops or desktops without discrete graphics) - and ARM solutions will likely never support it at all.

You do realize AMD opened up Mantle to NVIDIA and likely Intel, right? Mantle is NOT only for AMD cards.
 
You do realize AMD opened up Mantle to NVIDIA and likely Intel, right? Mantle is NOT only for AMD cards.

Sure, but they are developing Mantle with their own cards in mind.
 
Sure, but they are developing Mantle with their own cards in mind.

So? Why would they develop anything for other companies? That's up to the other companies.
 
Sure, but they are developing Mantle with their own cards in mind.

NVIDIA and Intel know this. That is why Mantle will have to be very good for them to start adopting it. And since Mantle is open, both NVIDIA and Intel can take the code and develop it specifically for their own chips. I am just glad that we might be getting away from DirectX. Man, Microsoft has been sucking wind these past two years.
 
So? Why would they develop anything for other companies? That's up to the other companies.


I'm not complaining, I'm just making a statement. :oops:
 
NVIDIA and Intel know this. That is why Mantle will have to be very good for them to start adopting it. And since Mantle is open, both NVIDIA and Intel can take the code and develop it specifically for their own chips. I am just glad that we might be getting away from DirectX. Man, Microsoft has been sucking wind these past two years.


I think what you mean is that the lack of adoption of DX versions above 9 has been sucking wind. The improvements are there, but the installed hardware base has made it hard to push for a really good use of DX10 or higher so far.
 
I think what you mean is that the lack of adoption of DX versions above 9 has been sucking wind. The improvements are there, but the installed hardware base has made it hard to push for a really good use of DX10 or higher so far.

Nah, I mean Microsoft has been sucking wind. Windows Phone sucked, Windows tablet sales suck, Surface sales suck, Windows 8 adoption sucks, and the Xbox One by all measures is going to be trounced by the PS4. Now you have companies actively looking for ways to get away from Windows as a PC gaming platform. Bye bye, Microsoft!
 