
PhysX will Die, Says AMD

Discussion in 'News' started by btarunr, Dec 12, 2008.

  1. btarunr

    btarunr Editor & Senior Moderator Staff Member

    Joined:
    Oct 9, 2007
    Messages:
    30,203 (10.69/day)
    Thanks Received:
    14,600
    Location:
    Hyderabad, India
    In an interview with Godfrey Cheng, Director of Technical Marketing in AMD's Graphics Products Group, Bit-Tech.net has quoted him saying that standards such as PhysX would die due to their proprietary and closed nature. Says Mr. Cheng:

    Bit-Tech.net interviewed the AMD representative to get the company's take on EA and 2K's decision, earlier this week, to adopt NVIDIA PhysX across all of their worldwide studios. Interestingly, when asked how major publishers such as EA adopting PhysX across all of their studios would impact the propagation of the API, Cheng responded saying that monetary incentives provided to publishing houses alone won't help a great deal in propagating the API, that the product (PhysX) must be competitive, and that AMD viewed Havok and its physics simulation technologies as leaders. "Games developers share this view. We will also invest in technologies and partnerships beyond Havok that enhances gameplay." he added. PhysX is a proprietary physics simulation API created by Ageia Technologies, which was acquired and developed by NVIDIA. You can read the full Bit-Tech.net interview with Godfrey Cheng here.
     
  2. PCpraiser100 New Member

    Joined:
    Jul 17, 2008
    Messages:
    1,062 (0.42/day)
    Thanks Received:
    68
    Take that Ghost Recon!
     
  3. AsRock

    AsRock TPU addict

    Joined:
    Jun 23, 2007
    Messages:
    12,037 (4.10/day)
    Thanks Received:
    2,227
    Location:
    US
    I take it you mean GRAW and not GR, right?


    Time will tell i guess.
     
  4. Kreij

    Kreij Senior Monkey Moderator Staff Member

    Joined:
    Feb 6, 2007
    Messages:
    13,881 (4.52/day)
    Thanks Received:
    5,624
    Location:
    Cheeseland (Wisconsin, USA)
    A bold statement from a company who can be bold at the moment.
    We shall see.
     
  5. ShadowFold

    ShadowFold New Member

    Joined:
    Dec 23, 2007
    Messages:
    16,921 (6.15/day)
    Thanks Received:
    1,644
    Location:
    Omaha, NE
    You tell 'em AMD. I think they know something about PII we don't :p They are starting to get cocky. That's either a desperate act or an act of "I know were gonna pwn you".
     
  6. KBD New Member

    Joined:
    Feb 23, 2007
    Messages:
    2,477 (0.81/day)
    Thanks Received:
    279
    Location:
    The Rotten Big Apple
    i think it's more about their GPU division, not CPU. One reason they are so bold is because, in their view, the Radeon 5000 series will whoop nvidia's ass again; they may have a surprise on that front. Maybe them buying ATI wasn't such a bad move after all, that graphics division is prolly helping keep the company afloat.
     
    WarEagleAU says thanks.
  7. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (7.60/day)
    Thanks Received:
    3,778
    Funny, they say proprietary standards will die, yet, what is DX?

    I think they may be counting their chickens here.
     
  8. kysg New Member

    Joined:
    Aug 20, 2008
    Messages:
    1,255 (0.50/day)
    Thanks Received:
    99
    Location:
    Pacoima, CA
    It's not surprising though, the graphics division has been hitting its stride. Hopefully the 5 series won't be just a die-shrunk 4 series and there will continue to be improvement for the red camp.

    wasn't DX the only standard at that time besides OpenGL??? which really wasn't a standard.

    whoops dbl post my bad.
     
  9. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.17/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    At least someone is being frank. PhysX has just crippled performance when enabled and not really done anything except give people a false impression of a bigger E-Penis. Reviewers haven't really warmed up to it either, and neither have I.

    The main issue stems from a lack of developers even bothering to conform to such proprietary standards; they want to do it their own way. CPU-based physics engines in game engines such as CryEngine 2, or even the latest Source engine, are generally sufficient.

    Sure, AMD is being bold and attempting to scare away nvidia shareholders, but it's true. Havok basically rips PhysX in terms of how much it's implemented.

    Cannot agree more.
     
    Last edited: Dec 12, 2008
  10. kysg New Member

    Joined:
    Aug 20, 2008
    Messages:
    1,255 (0.50/day)
    Thanks Received:
    99
    Location:
    Pacoima, CA
    Well, this is obvious though; they plan to do things their own way. When you've been doing stuff one way for a while, it's really gonna tick off a few people when something new gets introduced that really doesn't do squat, which makes your day very long when you could have already been done doing it the old way.
     
  11. Wile E

    Wile E Power User

    Joined:
    Oct 1, 2006
    Messages:
    24,324 (7.60/day)
    Thanks Received:
    3,778
    In GRAW2 I found it to make the game much more enjoyable, and worth the small performance hit. Sufficient doesn't cut it for me. The GPU can handle Physx a hell of a lot better than even the fastest Quad can. I want GPU accelerated physics to become the norm. The CPU just doesn't cut it anymore.
     
    phanbuey says thanks.
  12. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    21,313 (6.07/day)
    Thanks Received:
    7,244
    Proprietary is used in the loosest way possible here, considering nVidia has expressed that they are more than willing to help get it running on ATi's hardware. ATi is forcing it to be proprietary by refusing to work with nVidia to get it working.

    The performance hit is going to happen regardless of what API is used to create the physics. If they both are creating the same level of physics, the performance hit will be the same, as the graphics cards are asked to render more on the screen due to the physics.

    Edit: Of course I hope AMD realizes that they have kind of screwed themselves by saying that. History shows that when a company says their competitor's product will fail, the product usually becomes wildly popular.
     
    Last edited: Dec 12, 2008
    Crunching for Team TPU More than 25k PPD
  13. [I.R.A]_FBi

    [I.R.A]_FBi

    Joined:
    May 19, 2007
    Messages:
    7,664 (2.58/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe
    You know what they meant ... a standard in which none of them has an overly powerful controlling interest.
     
  14. WarEagleAU

    WarEagleAU Bird of Prey

    Joined:
    Jul 9, 2006
    Messages:
    10,812 (3.29/day)
    Thanks Received:
    547
    Location:
    Gurley, AL
    Proprietary in that it's not easily programmable over a wide range. Kind of like Dell hardware used to be: you couldn't swap out with anything, it had to be Dell specific (as an example here). I think it is bold and cocky and I like it. Will it succeed? We shall see. I don't think ATI is hurting themselves here either.
     
  15. Haytch

    Haytch

    Joined:
    Apr 7, 2008
    Messages:
    613 (0.23/day)
    Thanks Received:
    53
    Location:
    Australia
    I really enjoyed playing co-op GRAW, and ever since I've been awaiting the release of more co-op campaign gameplay for those Friday or Saturday night LAN sessions with the guys. I have to admit, GRAW 2's PhysX wasn't the best I've seen, but taking into consideration the game's age and the official release date of the AGEIA PhysX P1 cards, I'm more inclined to think . . . whatever . . . What matters was the enjoyable hours of fun played.

    Since the GRAW 2 production days, PhysX has come a long way. This can be witnessed via the many examples out there on the internet, whether it be a fluid demo, a particle demo, a ripping flag or my balls bouncing off each other. Either way, the realism it provides is a vital step. EA and 2K seem to think so.

    PhysX enabled reduces performance on lower-end systems, and/or systems missing the required hardware. Of course we can get the CPU to run the PhysX stuff, but what's going to run everything else . . . .

    Cheng and all of AMD are scared that PhysX will evolve to the only next step it has: to become a part of the AI, and the gameplay.
    PhysX can't get worse, and we all know that this technology will eventually evolve. Simulating and ripping is second grade and will never sum up to be the best.

    I wonder how the 295GTX will cope with all this.
     
  16. Mussels

    Mussels Moderprator Staff Member

    Joined:
    Oct 6, 2004
    Messages:
    43,157 (11.00/day)
    Thanks Received:
    10,432
    a lot of people don't seem to be getting it

    PhysX is an nvidia-only way of doing this
    DirectX 11 is doing an open (any video card) version of this.

    ATI/AMD are saying that nvidia's closed one will die, and the open version will live on.
     
    phanbuey says thanks.
  17. Swansen New Member

    Joined:
    Nov 18, 2007
    Messages:
    182 (0.07/day)
    Thanks Received:
    9
    I tend to not follow anything EA does, as they generally destroy anything they touch. The performance hit is a big deal for most people, as many don't buy bleeding-edge graphics cards because they are too expensive. Eye candy is cool, but I hardly think many will miss a flag blowing in the wind. I think this all goes to further a problem I've seen as of late, that's game developers focusing on the wrong things; gameplay should always be first, everything else comes after.

    Lol, no I get what they were saying, it's just a REALLY poor way of wording it, as DX is closed-source software. I think most people are missing the fact that DX11 will have physics support built in, which is very cool.... as long as it's done right.
     
    AsRock says thanks.
  18. eidairaman1

    eidairaman1

    Joined:
    Jul 2, 2007
    Messages:
    14,184 (4.85/day)
    Thanks Received:
    2,060
    Sort of like Nvidia saying the 4830 is defective when it's not. Then before that, the head of NV said they underestimated the RV770.
     
  19. tkpenalty New Member

    Joined:
    Sep 26, 2006
    Messages:
    6,958 (2.17/day)
    Thanks Received:
    345
    Location:
    Australia, Sydney
    Very well spoken there...
     
  20. phanbuey

    phanbuey

    Joined:
    Nov 13, 2007
    Messages:
    5,225 (1.87/day)
    Thanks Received:
    991
    Location:
    Miami
    Is this anything like the time AMD said something about their 'true' quad core being faster than two Core 2s glued together? :nutkick: I think he's right - but ONLY if the DX11 way ACTUALLY works like it's supposed to... which is a big if
     
    Last edited: Dec 12, 2008
  21. Sapientwolf New Member

    Joined:
    Aug 23, 2006
    Messages:
    57 (0.02/day)
    Thanks Received:
    1
    Well, look at DX pre-9: no one wanted to use it because OpenGL was a lot easier to use to achieve the same results. It wasn't until 9 that it was viewed as a worthy API. There is a difference between when something proprietary is received well and when there is a generally easy-to-use alternative. In this case DX9 onward was offering an ease of use and feature set that developers liked. PhysX is just kind of there, offering what can be done with alternatives. Alternatives that work with more systems and are free.
     
  22. VulkanBros

    VulkanBros

    Joined:
    Jan 31, 2005
    Messages:
    1,449 (0.38/day)
    Thanks Received:
    373
    Location:
    The Pico Mundo Grill
    I hope that game engines (like the Source engine from Valve) will gain the upper hand in this battle - this way no one has to think about buying a specific piece of hardware to get the physics pling-bing
     
    Crunching for Team TPU
  23. DarkMatter New Member

    Joined:
    Oct 5, 2007
    Messages:
    1,714 (0.61/day)
    Thanks Received:
    184
    Once again they come with the open standard excuse and LIES? Congratulations AMD, you finally made me an Nvidia fanboy; they are HONEST with their intentions at least. There's nothing I hate more than LIES, and that is a big lie and distorted misinformation. Everything AMD/Ati is saying now is:
    "We won't support a standard where Nvidia is faster until we have something to compete."

    And open your eyes guys, Nvidia IS and will probably remain faster at physics calculations because they made changes to the architecture, like a more CPU-like caching system, ultra-light branch prediction, etc. Not exactly branch prediction, but something that palliates the effects of lacking one. THAT's why Nvidia cards are faster in F@H for example, where more than number crunching is required. At simple number crunching Ati is faster: video conversion.

    That advantage Nvidia has would apply to PhysX, OpenCL, DX11 physics or ANY hardware physics API you'd want to throw in. Their claim is just an excuse until they can prepare something. For instance they say they support Havok; Intel OWNS Havok. Open standards? Yeah sure.


    One more thing: PhysX is a physics API and middleware, with some bits of an engine here and there just like Havok, that can RUN on various platforms unchanged: Ageia PPUs, x86 CPUs, the Cell microprocessor, CUDA, and potentially any PowerPC. It does not run directly on Nvidia GPUs; CUDA is the layer that runs it on Nvidia cards. Once OpenCL is out, it will be possible to do PhysX through OpenCL just as well as through CUDA. As long as Ati has good OpenCL support there shouldn't be any problems; until then they could make PhysX run through CAL/Stream for FREE, BUT they don't want to, because it would be slower. It's as simple as that.
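    To illustrate why that kind of retargeting is even possible: the bulk of a physics step is an independent per-body update, which maps onto a PPU, a multi-core CPU, or a GPU compute API with one thread per body. A minimal sketch in plain C (hypothetical illustration, not actual PhysX code; the names `vec3` and `integrate` are made up here):

    ```c
    #include <assert.h>
    #include <math.h>

    typedef struct { float x, y, z; } vec3;

    /* One explicit-Euler integration step for n independent particles.
       Each iteration touches only particle i's own state, so on a GPU
       (CUDA, CAL/Stream, OpenCL) the loop body becomes a kernel and
       i becomes the thread index. */
    void integrate(vec3 *pos, vec3 *vel, const vec3 *acc, int n, float dt) {
        for (int i = 0; i < n; i++) {
            vel[i].x += acc[i].x * dt;
            vel[i].y += acc[i].y * dt;
            vel[i].z += acc[i].z * dt;
            pos[i].x += vel[i].x * dt;
            pos[i].y += vel[i].y * dt;
            pos[i].z += vel[i].z * dt;
        }
    }

    int main(void) {
        /* one particle dropped from rest under gravity, dt = 0.1 s */
        vec3 p = {0.0f, 10.0f, 0.0f}, v = {0.0f, 0.0f, 0.0f};
        const vec3 g = {0.0f, -9.8f, 0.0f};
        integrate(&p, &v, &g, 1, 0.1f);
        /* v.y = -9.8 * 0.1 = -0.98; p.y = 10 + (-0.98 * 0.1) = 9.902 */
        assert(fabsf(v.y + 0.98f) < 1e-4f);
        assert(fabsf(p.y - 9.902f) < 1e-4f);
        return 0;
    }
    ```

    The interesting (and hard) part of real middleware is the collision/constraint solving around this loop, but the per-body independence above is exactly what lets the same API sit on top of whichever backend the hardware provides.
    
    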

    Another lie there, which you have to love, is the claim that PhysX is just being used for eye candy. IT IS being used only for that, AMD, yeah, but tell us WHY. Because developers have been told hardware physics will not be supported on Ati GPUs until DX11, that's why. Because they are working hard along with Intel to make that statement true. Nvidia has many demos where PhysX is being used for a lot more, so it can be done.

    AMD is just double acting. It's a shame, Ati/AMD, a shame; I remember the days you were honest. I know bad times and experiences make personalities change, but this is inexcusable, as is the fact that the whole advertising campaign has been based on bashing every initiative made by Nvidia instead of making yours better.

    This sentence sums it all up (speaking of Havok on their GPU):

    Like back in the 90's, because they are facing competition in something they can't compete in, they are downplaying it. They know it's a great thing, they know it's the future, but they don't want that future to kick-start yet. YOU SIMPLY CAN'T DOWNPLAY SOMETHING AND SAY IT WILL DIE WHILE AT THE SAME TIME YOU'RE WORKING HARD ON YOUR OWN BEHIND THE CURTAINS!! AND USING INTEL'S HAVOK!!!
    We know how accelerated graphics history evolved; despite what they said back then, the GPU has become the most important thing, and so will hardware physics. Just as back then, they are HOLDING BACK the revolution until they can be part of it. Clever, from a marketing standpoint, but DISHONEST. You won't have my support, Ati; you already pulled down another thing that I liked a lot: Stereo 3D. You can cheat me once, but not more.
     
    Last edited: Dec 12, 2008
  24. leonard_222003 New Member

    Joined:
    Jan 29, 2006
    Messages:
    241 (0.07/day)
    Thanks Received:
    25
    You are right, Darkmatter. It's true that when DX11 is available PhysX will be obsolete or will slowly die, but until then, pray, AMD/ATI, that Nvidia doesn't get more developers to use PhysX. It's a cool thing and the performance impact isn't that big for the eye candy it does; it's worth the performance loss.
    I for one think this could kill AMD for good; if 2-3 big games launch with some PhysX thing and the difference between them is big, it could kill AMD's graphics department forever.
    Nvidia could continue to battle AMD in 3DMarks and game performance, but that seems set to go on for a long time, and one advantage like this could end the competition a little faster.
    I feel sorry for them; intel is kicking their asses, now Nvidia too. Second place forever for AMD.
     
  25. brian.ca New Member

    Joined:
    Nov 1, 2007
    Messages:
    71 (0.03/day)
    Thanks Received:
    14
    PhysX is not going to kill anyone... it's DX10.1 all over again. No one is going to make/sell a game relying on this thing if it's gonna screw over a good portion of the market, or leave them left out of any significant part of the game. Unless AMD/ATI support PhysX it will never be anything more than optional eye candy, and that limited role will limit it as a factor for people who might buy into it (would you pay $500 for a new card instead of $300 so you can get more broken glass?).

    Some of you are talking about studios and games adopting PhysX, but what have you seen so far? Mirror's Edge seems to do a bit of showcase work for PhysX, but how many would really buy that game? Being an EA game through and through, I personally have a hard time believing the game would offer anything more than what I could get watching the trailer or some demos. Otherwise I haven't seen PhysX do anything that Havok wasn't already doing. There's no extra edge here. There are probably some incentives from Nvidia - but then that comes back to what this guy was saying in the first place;

    "We cannot provide comments on our competitor's business model except that it is awfully hard to sustain support by monetary incentives. The product itself must be competitive. We view Havok technologies and products to be the leaders in physics simulation and this is why we are working with them. Games developers share this view."

    This lends itself to why this stuff won't kill ATI... at the end of the list is Apple; didn't they put together the OpenCL standard? Which do you think they'll be pushing, CUDA or their own standard? Microsoft will be pushing its own thing with DX11 down the road. Adobe recently took advantage of Nvidia's CUDA and ATI's Stream, if I'm not mistaken... but do you think they'll want to keep on making two versions of their products when other people are pushing for a unified standard?

    I guess this is all moot anyway... if AMD responding to an Nvidia announcement for a reporter will guarantee success for PhysX, then surely the grandstanding Nvidia took part in vs. Intel will have Larrabee burying all competition.


    At the end of the day people may not like what this guy is saying, why, or how, but it's true. AMD is not going to support Nvidia's proprietary APIs (and why the hell would they?), and without that support other companies will have less incentive to get on board unless Nvidia provides it. That requires either a superior product or probably cash incentives. Now realistically... when the immediate alternatives to Nvidia's systems seem to be OpenCL (Apple - but open), DirectX 11 (Microsoft), and Havok (Intel), do you think these other standards won't have the resources behind them to provide both of those things more so than Nvidia? If you were in AMD's shoes, who would you side with? They could do it all, but seriously... why? It'd just confuse other efforts and probably waste their own resources, and for what? To better prop up the competition that they can beat? So they can claim some moral high ground when they recall how Nv made DX10.1 completely moot?
     
    Last edited: Dec 12, 2008
    a_ump, VulkanBros and HTC say thanks.
