Thursday, December 11th 2008

PhysX will Die, Says AMD

In an interview with Bit-Tech.net, Godfrey Cheng, Director of Technical Marketing in AMD's Graphics Products Group, said that standards such as PhysX would die due to their proprietary and closed nature. Says Mr. Cheng:
"There is no plan for closed and proprietary standards like PhysX," said Cheng. "As we have emphasised with our support for OpenCL and DX11, closed and proprietary standards will die."
Bit-Tech.net interviewed Cheng to get the company's take on EA and 2K's decision earlier this week to adopt NVIDIA PhysX across all of their worldwide studios. Interestingly, when asked how major publishers such as EA adopting PhysX across all of their studios would affect the propagation of the API, Cheng responded that monetary incentives to publishing houses alone won't do much to propagate the API, that the product (PhysX) must be competitive, and that AMD viewed Havok and its physics simulation technologies as the leaders. "Games developers share this view. We will also invest in technologies and partnerships beyond Havok that enhances gameplay," he added. PhysX is a proprietary physics simulation API created by Ageia Technologies, which was acquired and developed further by NVIDIA. You can read the full Bit-Tech.net interview with Godfrey Cheng here.

67 Comments on PhysX will Die, Says AMD

#2
AsRock
TPU addict
PCpraiser100: Take that, Ghost Recon!
I take it you mean GRAW and not GR, right?

Time will tell, I guess.
#3
Kreij
Senior Monkey Moderator
A bold statement from a company that can afford to be bold at the moment.
We shall see.
#4
ShadowFold
You tell 'em, AMD. I think they know something about PII we don't :p They are starting to get cocky. That's either a desperate act or an act of "I know we're gonna pwn you".
#5
KBD
ShadowFold: You tell 'em, AMD. I think they know something about PII we don't :p They are starting to get cocky. That's either a desperate act or an act of "I know we're gonna pwn you".
I think it's more about their GPU division, not CPU. One reason they are so bold is that, in their view, the Radeon 5000 series will whoop NVIDIA's ass again; they may have a surprise on that front. Maybe buying ATI wasn't such a bad move after all; that graphics division is probably helping keep the company afloat.
#6
Wile E
Power User
Funny, they say proprietary standards will die, yet what is DX?

I think they may be counting their chickens here.
#7
kysg
KBD: I think it's more about their GPU division, not CPU. One reason they are so bold is that, in their view, the Radeon 5000 series will whoop NVIDIA's ass again; they may have a surprise on that front. Maybe buying ATI wasn't such a bad move after all; that graphics division is probably helping keep the company afloat.
It's not surprising, though; the graphics division has been hitting its stride. Hopefully the 5 series won't be just a die-shrunk 4 series and there will continue to be improvement for the red camp.
Wile E: Funny, they say proprietary standards will die, yet what is DX?

I think they may be counting their chickens here.
Wasn't DX the only standard at the time besides OpenGL? Which really wasn't a standard.

Whoops, double post, my bad.
#8
tkpenalty
At least someone is being frank. PhysX has just crippled performance when enabled and not really done anything except give people a false impression of a bigger E-Penis. Reviewers haven't really warmed up to it, and neither have I.

The main issue stems from a lack of developers even bothering to conform to such proprietary standards; they want to do it their own way. CPU-based physics in game engines such as CryEngine 2, or even the latest Source engine, is generally sufficient.

Sure, AMD is being bold and attempting to scare away NVIDIA shareholders, but it's true. Havok basically beats PhysX in terms of how widely it's implemented.
"It should be noted that title support for GPU accelerated physics simulation is NOT the end game. The end game is having GPU physics as an integral part of game play and not just eye candy. If it is optional eye candy, GPU physics will not gain traction. The titles we have seen today with shattering glass and cloth waving in the wind is not integral to game play and the impact on the game's experience is minimal. We are looking for ways to integrate GPU physics better into game play. Or even things like AI instead of focusing on eye candy / effects physics."

Cheng's final words make a lot of sense and I find myself agreeing with him. We said something similar when Nvidia announced that the PC version of Mirror's Edge was delayed because of the PhysX implementation which, following a brief hands-on preview last week, does nothing but add some additional eye candy. None of it influences the actual gameplay experience.
Cannot agree more.
#9
kysg
tkpenalty: At least someone is being frank. PhysX has just crippled performance when enabled and not really done anything except give people a false impression of a bigger E-Penis. Reviewers haven't really warmed up to it, and neither have I.

The main issue stems from a lack of developers even bothering to conform to such proprietary standards; they want to do it their own way. CPU-based physics in game engines such as CryEngine 2, or even the latest Source engine, is generally sufficient.
Well, this is obvious, though; developers plan to do things their own way. When you've been doing stuff one way for a while, it's really going to tick off a few people when something new gets introduced that doesn't really do squat and makes your day very long, when you could have already been done doing it the old way.
#10
Wile E
Power User
tkpenalty: At least someone is being frank. PhysX has just crippled performance when enabled and not really done anything except give people a false impression of a bigger E-Penis. Reviewers haven't really warmed up to it, and neither have I.

The main issue stems from a lack of developers even bothering to conform to such proprietary standards; they want to do it their own way. CPU-based physics in game engines such as CryEngine 2, or even the latest Source engine, is generally sufficient.
In GRAW2 I found it made the game much more enjoyable, and worth the small performance hit. Sufficient doesn't cut it for me. The GPU can handle PhysX a hell of a lot better than even the fastest quad can. I want GPU-accelerated physics to become the norm. The CPU just doesn't cut it anymore.
#11
newtekie1
Semi-Retired Folder
Proprietary is used in the loosest way possible here, considering nVidia has expressed that they are more than willing to help get it running on ATi's hardware. ATi is forcing it to be proprietary by refusing to work with nVidia to get it working.

The performance hit is going to happen regardless of what API is used to create the physics. If both are creating the same level of physics, the performance hit will be the same, as the graphics cards are asked to render more on the screen due to the physics.

Edit: Of course, I hope AMD realizes that they have kind of screwed themselves by saying that. History shows that when a company says its competition's product will fail, the product usually becomes wildly popular.
#12
[I.R.A]_FBi
newtekie1: Proprietary is used in the loosest way possible here, considering nVidia has expressed that they are more than willing to help get it running on ATi's hardware. ATi is forcing it to be proprietary by refusing to work with nVidia to get it working.

The performance hit is going to happen regardless of what API is used to create the physics. If both are creating the same level of physics, the performance hit will be the same, as the graphics cards are asked to render more on the screen due to the physics.
You know what they meant... a standard in which none of them has an overly powerful controlling interest.
#13
WarEagleAU
Bird of Prey
Proprietary in that it's not easily programmable over a wide range. Kind of like Dell hardware used to be: you couldn't swap out with anything, it had to be Dell-specific (as an example here). I think it is bold and cocky, and I like it. Will it succeed? We shall see. I don't think ATI is hurting themselves here either.
#14
Haytch
I really enjoyed playing co-op GRAW, and ever since I've been awaiting the release of more co-op campaign gameplay for those Friday or Saturday night LAN sessions with the guys. I have to admit GRAW 2's PhysX wasn't the best I've seen, but taking into consideration the game's age and the official release date of the AGEIA PhysX P1 cards, I'm more inclined to think... whatever... What matters was the enjoyable hours of fun played.

Since the GRAW 2 production days, PhysX has come a long way. This can be witnessed via the many examples out there on the internet, whether it be a fluid demo, a particle demo, a ripping flag, or my balls bouncing off each other. Either way, the realism it provides is a vital step. EA and 2K seem to think so.

Enabling PhysX reduces performance on lower-end systems and/or systems missing the required hardware. Of course we can get the CPU to run the PhysX stuff, but what's going to run everything else...?

Cheng and all of AMD are scared that PhysX will evolve to the only next step it has: becoming part of the AI and the gameplay.
PhysX can't get worse, and we all know that this technology will eventually evolve. Simulating cloth and ripping flags is second grade and will never add up to be the best.

I wonder how the GTX 295 will cope with all this.
#15
Mussels
Freshwater Moderator
A lot of people don't seem to be getting it.

PhysX is an NVIDIA-only way of doing this.
DirectX 11 is doing an open (any video card) version of this.

ATI/AMD are saying that NVIDIA's closed one will die, and the open version will live on.
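To make the distinction concrete, here is a minimal, hypothetical CUDA sketch (made-up names, not actual PhysX or DirectX 11 code) of the kind of bulk particle update that either API would offload to the video card:

    // integrate.cu -- step a big batch of "debris" particles on the GPU.
    // Build with: nvcc integrate.cu -o integrate
    #include <cuda_runtime.h>
    #include <cstdio>

    __global__ void integrate(float3 *pos, float3 *vel, int n, float dt)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i >= n) return;
        vel[i].y -= 9.81f * dt;        // apply gravity
        pos[i].x += vel[i].x * dt;     // explicit Euler position update
        pos[i].y += vel[i].y * dt;
        pos[i].z += vel[i].z * dt;
    }

    int main()
    {
        const int n = 1 << 20;         // a million particles, trivial work for a GPU
        float3 *pos, *vel;
        cudaMalloc(&pos, n * sizeof(float3));
        cudaMalloc(&vel, n * sizeof(float3));
        cudaMemset(pos, 0, n * sizeof(float3));
        cudaMemset(vel, 0, n * sizeof(float3));
        integrate<<<(n + 255) / 256, 256>>>(pos, vel, n, 1.0f / 60.0f);
        cudaDeviceSynchronize();
        printf("stepped %d particles\n", n);
        cudaFree(pos);
        cudaFree(vel);
        return 0;
    }

The point of contention is only who gets to run a kernel like this: CUDA restricts it to NVIDIA cards, while a DirectX 11 compute shader doing the same update would run on any DX11-capable card.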
#16
Swansen
Haytch: EA and 2K seem to think so.
I wonder how the GTX 295 will cope with all this.
I tend to not follow anything EA does, as they generally destroy anything they touch. The performance hit is a big deal for most people, as many don't buy bleeding-edge graphics cards because they are too expensive. Eye candy is cool, but I hardly think many will miss a flag blowing in the wind. I think this all goes to further a problem I've seen as of late: game developers focusing on the wrong things. Gameplay should always be first; everything else comes after.
Mussels: DirectX 11 is doing an open (any video card) version of this.
Lol, no, I get what they were saying; it's just a REALLY poor way of wording it, as DX is closed-source software. I think most people are missing the fact that DX11 will have physics support built in, which is very cool... as long as it's done right.
#17
eidairaman1
The Exiled Airman
newtekie1: Proprietary is used in the loosest way possible here, considering nVidia has expressed that they are more than willing to help get it running on ATi's hardware. ATi is forcing it to be proprietary by refusing to work with nVidia to get it working.

The performance hit is going to happen regardless of what API is used to create the physics. If both are creating the same level of physics, the performance hit will be the same, as the graphics cards are asked to render more on the screen due to the physics.

Edit: Of course, I hope AMD realizes that they have kind of screwed themselves by saying that. History shows that when a company says its competition's product will fail, the product usually becomes wildly popular.
Sort of like NVIDIA saying the 4830 is defective when it's not. And before that, the head of NV said they underestimated the RV770.
#18
tkpenalty
Swansen: I tend to not follow anything EA does, as they generally destroy anything they touch. The performance hit is a big deal for most people, as many don't buy bleeding-edge graphics cards because they are too expensive. Eye candy is cool, but I hardly think many will miss a flag blowing in the wind. I think this all goes to further a problem I've seen as of late: game developers focusing on the wrong things. Gameplay should always be first; everything else comes after.

Lol, no, I get what they were saying; it's just a REALLY poor way of wording it, as DX is closed-source software. I think most people are missing the fact that DX11 will have physics support built in, which is very cool... as long as it's done right.
Very well spoken there...
#19
phanbuey
Is this anything like the time AMD said something about their 'true' quad-core being faster than two Core 2s glued together? :nutkick: I think he's right, but ONLY if the DX11 way ACTUALLY works like it's supposed to... which is a big if.
#20
Sapientwolf
Wile E: Funny, they say proprietary standards will die, yet what is DX?

I think they may be counting their chickens here.
Well, look at DX pre-9: no one wanted to use it, because OpenGL was a lot easier to use to achieve the same results. It wasn't until 9 that it was viewed as a worthy API. There is a difference between when something proprietary is received well and when there is a generally easier-to-use alternative. In this case, DX9 onward offered an ease of use and a feature set that developers liked. PhysX is just kind of there, offering what can be done with alternatives; alternatives that work with more systems and are free.
#21
VulkanBros
I hope that game engines (like the Source engine from Valve) will gain the upper hand in this battle; this way, no one has to think about buying a specific piece of hardware to get the physics pling-bing.
#22
DarkMatter
Once again they come with the open standard excuse and LIES? Congratulations, AMD, you finally made me an Nvidia fanboy; they are HONEST about their intentions, at least. There's nothing I hate more than LIES, and that is a big lie and twisted misinformation. Everything AMD/ATI is saying now amounts to:
"We won't support a standard where Nvidia is faster until we have something to compete with."

And open your eyes, guys: Nvidia IS, and will probably remain, faster at physics calculations, because they made changes to the architecture, like a more CPU-like caching system, ultra-light branch prediction, etc. Not exactly branch prediction, but something that palliates the effects of lacking one. THAT's why Nvidia cards are faster in F@H, for example, where more than number crunching is required. At simple number crunching ATI is faster: video conversion.

That advantage Nvidia has would apply to PhysX, OpenCL, DX11 physics, or ANY hardware physics API you'd want to throw in. Their claim is just an excuse until they can prepare something. For instance, they say they support Havok, the Intel-OWNED Havok. Open standards? Yeah, sure.


One more thing: PhysX is a physics API and middleware, with some bits of an engine here and there, just like Havok, that can RUN on various platforms unchanged: Ageia PPUs, x86 CPUs, the Cell microprocessor, CUDA, and potentially any PowerPC. It does not run directly on Nvidia GPUs; it is layered on CUDA, Nvidia's GPU compute platform. Once OpenCL is out, PhysX could be implemented through OpenCL just as well as through CUDA. As long as ATI has good OpenCL support there shouldn't be any problems; until then they could make PhysX run through CAL/Stream for FREE, BUT they don't want to, because it would be slower. IT'S as simple as that.
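To illustrate that layering (a hypothetical sketch only; the real PhysX SDK looks nothing like this), middleware of this sort typically hides the backend behind one interface the game calls every frame, so swapping CUDA for OpenCL or CAL/Stream would not touch game code:

    // backend.cpp -- made-up illustration of physics middleware layering.
    #include <cstdio>

    struct PhysicsBackend {                // the one interface the game sees
        virtual void step(float dt) = 0;   // advance the simulation one frame
        virtual ~PhysicsBackend() {}
    };

    struct CpuBackend : PhysicsBackend {   // x86 fallback path
        void step(float dt) { printf("CPU step, dt=%.4f\n", dt); }
    };

    struct CudaBackend : PhysicsBackend {  // would launch GPU kernels instead
        void step(float dt) { printf("CUDA step, dt=%.4f\n", dt); }
    };

    int main()
    {
        bool haveCudaGpu = false;          // pretend hardware detection result
        PhysicsBackend *phys;
        if (haveCudaGpu)
            phys = new CudaBackend;
        else
            phys = new CpuBackend;         // game code is identical either way
        for (int frame = 0; frame < 3; ++frame)
            phys->step(1.0f / 60.0f);
        delete phys;
        return 0;
    }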

Another lie there, which you have to love, is the claim that PhysX is just being used for eye candy. IT IS being used only for that, AMD, yeah, but tell us WHY. Because developers have been told hardware physics will not be supported on ATI GPUs until DX11, that's why. Because they are working hard along with Intel to make that statement true. Nvidia has many demos where PhysX is being used for a lot more, so it can be done.

AMD is just playing a double game. It's a shame, ATI/AMD, a shame; I remember the days you were honest. I know bad times and experiences change personalities, but this is inexcusable, as is the fact that the whole advertising campaign has been based on bashing every initiative made by Nvidia instead of making yours better.

This sentence sums it all up (speaking of Havok on their GPUs):
Our guidance was end of this year or early next year but, first and foremost, it will be driven by the milestones that we hit. To put some context behind GPU based physics acceleration, it is really just at the beginning. Like 3D back in the early 1990s. Our competition has made some aggressive claims about support for GPU physics acceleration by the end of this year. I.e. Support in many titles....but we can count the titles on one hand or just one or two fingers.
Like back in the '90s, because they are facing competition in something where they can't compete, they are downplaying it. They know it's a great thing, they know it's the future, but they don't want that future to kick-start yet. YOU SIMPLY CAN'T DOWNPLAY SOMETHING AND SAY IT WILL DIE WHILE AT THE SAME TIME YOU'RE WORKING HARD ON YOUR OWN VERSION BEHIND THE CURTAINS!! AND USING INTEL'S HAVOK!!!
We know how the history of accelerated graphics evolved: despite what they said back then, the GPU has become the most important thing, and so will hardware physics. Just as back then, they are HOLDING BACK the revolution until they can be part of it. Clever from a marketing standpoint, but DISHONEST. You won't have my support, ATI; you already pulled down another thing that I liked a lot: stereo 3D. You can cheat me once, but not more.
#23
leonard_222003
You are right, DarkMatter. It's true that when DX11 is available PhysX will be obsolete or will slowly die, but until then, AMD/ATI had better pray that Nvidia doesn't get more developers to use PhysX. It's a cool thing, and the performance impact isn't that big for the eye candy it delivers; it's worth the performance loss.
I for one think this could kill AMD for good: if 2-3 big games launch with some PhysX feature and the difference between them is big, it could kill AMD's graphics department forever.
Nvidia could continue to battle AMD in 3DMarks and game performance, but that seems set to go on for a long time, and one advantage like this could end the competition a little faster.
I feel sorry for them; Intel is kicking their asses, now Nvidia too. Second place forever for AMD.
#24
brian.ca
PhysX is not going to kill anyone... it's DX10.1 all over again. No one is going to make/sell a game relying on this thing if it's going to screw over a good portion of the market, or leave them out of any significant part of the game. Unless AMD/ATI support PhysX it will never be anything more than optional eye candy, and that limited role will limit it as a factor in whether people buy into it (would you pay $500 for a new card instead of $300 so you can get more broken glass?).

Some of you are talking about studios and games adopting PhysX, but what have you seen so far? Mirror's Edge seems to do a bit of showcase work for PhysX, but how many would really buy that game? It being an EA game through and through, I personally have a hard time believing the game would offer anything more than what I could get watching the trailer or some demos. Otherwise, I haven't seen PhysX do anything that Havok wasn't already doing. There's no extra edge here. There are probably some incentives from Nvidia, but then that comes back to what this guy was saying in the first place:

"We cannot provide comments on our competitor's business model except that it is awfully hard to sustain support by monetary incentives. The product itself must be competitive. We view Havok technologies and products to be the leaders in physics simulation and this is why we are working with them. Games developers share this view."
If they can market these technologies to all those big goons (Adobe, Microsoft, SunMicro, Apple...) and create user-friendly apps, boom! ATI would be certified dead in no time
This lends itself to why this stuff won't kill ATI... At the end of that list is Apple; didn't they put together the OpenCL standard? Which do you think they'll be pushing, CUDA or their own standard? Microsoft will be pushing its own thing with DX11 down the road. Adobe recently took advantage of Nvidia's CUDA and ATI's Stream, if I'm not mistaken... but do you think they'll want to keep on making two versions of their products when other people are pushing for a unified standard?
newtekie1: Of course, I hope AMD realizes that they have kind of screwed themselves by saying that. History shows that when a company says its competition's product will fail, the product usually becomes wildly popular.
I guess this is all moot anyway, then... if AMD responding to an Nvidia announcement for a reporter will guarantee success for PhysX, then surely the grandstanding Nvidia took part in vs. Intel will have Larrabee burying all competition.


At the end of the day, people may not like what this guy is saying, why, or how, but it's true. AMD is not going to support Nvidia's proprietary APIs (and why the hell would they?), and without that support other companies will have less incentive to get on board unless Nvidia provides it. That requires either a superior product or, more likely, a cash incentive. Now realistically... when the immediate alternatives to Nvidia's systems seem to be OpenCL (Apple, but open), DirectX 11 (Microsoft), and Havok (Intel), do you think these other standards won't have the resources behind them to provide both of those things, more so than Nvidia? If you were in AMD's shoes, who would you side with? They could support all of them, but seriously... why? It'd just confuse other efforts and probably waste their own resources, and for what? To better prop up competition that they can beat? So they can claim some moral high ground when they recall how NV made DX10.1 completely moot?
#25
Tatty_Two
Gone Fishing
LMAO... of course AMD would say this; there might actually be gamers out there buying green just for PhysX. They might just as well say "buy the HD4870 because it's 4 times faster than a GTX 280". Their argument, whether you believe it or not, bears little credibility when it comes from THE major competitor.

As a matter of interest, NVIDIA's market share has dropped by quite a bit this year, but it still has a much bigger market share than AMD/ATI. This is quite an interesting little piece: NVIDIA actually admitting that they were caught "on the hop" by AMD. Similar things have popped up around the web on the same subject, but this seems to bring all the angles together quite nicely...

www.crn.com/it-channel/212001134