
PhysX only using one cpu core?

Discussion in 'Graphics Cards' started by MrMilli, Sep 27, 2009.

  1. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.52/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    Of course it will be a mix, but not a mix on the same die, at least not for the performance and high-end markets. That would be suicide: a GPU die is already big, and lately CPU dies are too, so combining them would be impossible. Imagine a 1000 mm^2 chip with a 400 W TDP. No way.

    But right now, if you have $500 to spend on a CPU and a GPU, a gamer will spend $200 on the CPU and $300 on the GPU, but a scientist, video editor or an economist (lol) would usually spend $450 on the CPU and $50 on the GPU. In the future it will most probably be $150 and $350, respectively, if not even more in favor of the GPU.
  2. RejZoR

    RejZoR

    Joined:
    Oct 2, 2004
    Messages:
    4,425 (1.24/day)
    Thanks Received:
    872
    Location:
    Europe/Slovenia
    @Benetanegia
    You still don't understand. In the past we at least had 6 nodes and 6 polygons. Today we don't even have that. The object is not even there in non-PhysX mode. That's just plain dumb.
    We could talk about a normal version with 50 nodes and just as many polygons, and about hi-def objects with thousands of nodes and twice as many polygons. But we get neither a normal version nor a hi-def one. That's WTF for me.

    Particles? I've seen them on a GeForce 2, in the Lightning demo, or in Quake 3 Arena with the CorkScrew mod.
    The railgun slug emitted up to 999 physically affected sparks upon hitting a wall. And that was back in the year 2000. The glass breaking in Red Faction (2001) looked better than the PhysX glass in Mirror's Edge, except the one in Red Faction could run on an Athlon XP 2400+ easily, while the PhysX glass can't run smoothly on a 4 GHz quad-core i7. If that doesn't raise any of your eyebrows, I'm not sure what will.

    Cloth simulation was already done with the Unreal Engine (just look at the flags in UT99; they look freaking awesome even today, and it's a game from 1999). Not to such an extent, but it was done, on 10 times weaker hardware. Today? The flags are just gone. Missing. Not there. Unless you have an "uber" PhysX gfx card. Pheh. Clothing was also simulated in the Hitman series and Splinter Cell. Again in a slightly simplified way, but it was there, running smoothly on an Athlon XP 2400+.
    Don't tell me they couldn't do all that with tenfold of everything on today's hardware.
    Instead, they'd rather remove the object entirely than make a normal-definition version.
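    A flag like the ones described can be sketched as a small mass-spring grid with damped Verlet integration. This is an illustrative sketch only: the grid size, stiffness, wind and gravity values are invented, not UT99's actual cloth code.

```python
# Minimal mass-spring "flag": a grid of nodes joined by springs, integrated
# with damped Verlet steps -- the style of cloth cheap enough for 1999 CPUs.
W, H = 6, 4                     # 24 nodes, same order as an old in-game flag
DT, GRAVITY, WIND = 0.016, -9.8, 2.0
STIFFNESS, REST, DAMP = 0.5, 0.2, 0.99

# pos[y][x] = [px, py]; prev holds last positions for Verlet integration
pos  = [[[x * REST, -y * REST] for x in range(W)] for y in range(H)]
prev = [[p[:] for p in row] for row in pos]

def springs():
    """Structural springs: each node linked to its right and lower neighbour."""
    for y in range(H):
        for x in range(W):
            if x + 1 < W:
                yield (y, x), (y, x + 1)
            if y + 1 < H:
                yield (y, x), (y + 1, x)

def step():
    for y in range(H):
        for x in range(1, W):    # column x == 0 stays pinned to the flagpole
            cx, cy = pos[y][x]
            px, py = prev[y][x]
            prev[y][x] = [cx, cy]
            pos[y][x] = [cx + (cx - px) * DAMP + WIND * DT * DT,
                         cy + (cy - py) * DAMP + GRAVITY * DT * DT]
    for (ay, ax), (by, bx) in springs():   # one constraint-relaxation pass
        a, b = pos[ay][ax], pos[by][bx]
        dx, dy = b[0] - a[0], b[1] - a[1]
        d = (dx * dx + dy * dy) ** 0.5 or 1e-9
        corr = STIFFNESS * (d - REST) / d
        if ax != 0:
            a[0] += corr * dx * 0.5
            a[1] += corr * dy * 0.5
        if bx != 0:
            b[0] -= corr * dx * 0.5
            b[1] -= corr * dy * 0.5

for _ in range(100):             # simulate ~1.6 seconds of flapping
    step()
```

    Per frame this is a few hundred floating-point operations for the whole flag, which is the point: a blocky low-node version costs almost nothing on any CPU.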

    I don't care how much horsepower you need today or how many clusters gfx cards have. That's completely irrelevant. The most important thing is the relation between elapsed time and hardware performance increases over that time, compared to what we've seen in the past.
    If we saw basic cloth simulation, advanced particles, destruction, ragdolls etc. in the year 2000 on laughably weak hardware (by today's standards), one would expect something at least on that level, or improved tenfold, today on powerful quad cores, loads of memory and 10 times faster graphics cards. But have we seen any of it? OK, partially, on PhysX-enabled graphics cards. But what about CPU physics? Flags are just gone, broken glass fades out even before it hits the ground, environments are static, etc. Thank god at least ragdolls remained.
    I wouldn't mind blocky flags like the ones in games from 2000. At least the flag was there and I could interact with it. But I don't even have the flag there, because I don't have a GeForce card. It's just gone. Entirely. LMAO.

    You're not looking at the bigger picture. Sure, the guy at NGOHQ made the PhysX hack. But imagine all the crap users would throw at ATI if this hack failed to work properly in certain games, or if games were crashing. No one would blame NGOHQ; they would rush to blame ATI instead. Been there, seen that numerous times with Microsoft: it was an NVIDIA driver that crashed, and users were spitting on Windows Vista and how crappy it was made. But it wasn't even Vista's fault. It was NVIDIA's driver (or any other, for that matter) that crashed.
    So I perfectly understand why ATI refused to cooperate. They'd go the PhysX way if NVIDIA sent them the entire documentation, SDKs, everything, with the same full capability NVIDIA has. But supporting a hack made by a 3rd party? That's just not logical. Sorry. It's the same reason laptop companies don't support any graphics drivers other than the ones on their own webpage: troubleshooting, tech support, and the bad word that could spread about brand "X" because someone with hacked drivers fried his graphics card or something.

    I don't mind PhysX; it's a great thing actually. I just hate the way NVIDIA is pushing it around and placing stupid restrictions on it. And all these stupid restrictions are just damaging the evolution of games and physics. Imagine what would happen if PhysX were an open standard that anyone could implement and use on ANY graphics card. I can bet 1000 EUR that we'd see at least 5 high-profile games with awesome physics effects that everyone could enjoy, not just GeForce users. It's really that simple.
  3. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.52/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    I agree with many points you made there; regarding not having the simple version and all, it sucks. But please, pay attention to what I quoted, your own words, and after thinking about it, tell me: is the lack of those things Nvidia's fault, or the developers' fault? Is it the API's fault, or how the API has been used by the developer? Why are you blaming Nvidia?

    EDIT: In Mirror's Edge, flags and most cloth objects are replaced by simpler animated ones, for example. Different developer, different decision.

    The same reasoning is valid for another thing that has been questioned here: that Nvidia removed the option of running Ati+Nvidia for physics. Was it a malicious move, or was it a business decision driven by the fact that they couldn't properly test whether it would work well with all the cards? Newer Ati cards? What would happen after an Ati driver update? Would it still work? Who would get the blame if it didn't? And why would they have to do all the research, when it was Ati who would get the benefit? Yeah, Nvidia would benefit a bit too, but in their opinion the money they would have to spend was probably more than what they would get back. Business decision, end of story.

    Sorry, but I have to ask why so many of you Ati owners can think so thoroughly about some things, as you did above, but at other times can't come up with what I have just said. If that is not bias, what is it then?
    Last edited: Sep 29, 2009
  4. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    13,315 (6.35/day)
    Thanks Received:
    3,342
    Location:
    IA, USA
    Encoding/decoding: it is more economical to do on 10 CPUs than on 1 GPU. Not to mention how much more memory processors have available to them compared to GPUs, and their more direct link to the hard drive(s).

    There are no SSE instructions designed to help with the physics that F@H uses. As such, they have to brute-force it, and the bigger your hammer, the more damage it does. An instruction set designed for physics calculations would pretty much put CPUs back on top.



    If you dedicated SPs, obviously they are being used: for physics, not for graphics.


    Can't say I noticed. And by the way, I do remember the smoke in Arkham Asylum pissing me off. I don't remember if it was because of a framerate drop (8800 GT as well, albeit sickly for the last few months) or because I couldn't see shit. Either way, it annoyed me, and the game would be better off without it.


    Have you examined NVIDIA's source code to confirm that? Of course not--it's closed source. What we do know is that PhysX is extremely biased towards NVIDIA/Ageia hardware and as such, it is bad for the market.



    If there were an open standard for physics, money wouldn't be involved. It would work as well as manufacturers and developers make it work. Truth be told, I doubt there is enough demand for a scientific physics API, because games don't need 100% accurate physics--they need 70% accurate physics, which means at least a 10,000% reduction in workload. Accurate physics is about the only thing in game development few people care about. Hell, the last game I saw that used physics on bullet trajectories to a positive end was back in the 1990s with Delta Force: Land Warrior and Task Force Dagger. That was nice. Did it require a beefy CPU and GPU? Nope and nope. Basic physics is more than enough. I'd rather they focus their attention on gameplay mechanics, like adding more variety to side missions.
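    The kind of basic bullet-trajectory physics mentioned costs almost nothing on a CPU; here is a hedged sketch of it as simple Euler integration. The muzzle velocity and drag coefficient are invented for illustration, not Delta Force's actual values.

```python
import math

def bullet_drop(v0=800.0, drag=0.0005, dt=0.01, dist=500.0):
    """Euler-integrate a bullet fired horizontally, with gravity and a simple
    quadratic air-drag term, until it has covered `dist` metres downrange.
    Returns the bullet drop in metres -- a few dozen steps of trivial math."""
    vx, vy = v0, 0.0
    x = y = 0.0
    while x < dist:
        speed = math.hypot(vx, vy)
        vx -= drag * speed * vx * dt   # drag opposes velocity
        vy -= drag * speed * vy * dt
        vy -= 9.81 * dt                # gravity
        x += vx * dt
        y += vy * dt
    return -y

drop = bullet_drop()   # roughly a couple of metres of drop at 500 m
```

    Under a hundred loop iterations per shot, which is why even 2000-era CPUs handled it fine.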

    If Dell makes a good product, why not? Competition is good--open standards breed fair competition.
    Crunching for Team TPU
  5. HalfAHertz

    HalfAHertz

    Joined:
    May 4, 2009
    Messages:
    1,883 (0.99/day)
    Thanks Received:
    378
    Location:
    Singapore
    Read my post. PhysX is proprietary software coded for NVIDIA HARDWARE. Ati cannot change anything about it, so the only way to come close to the same level of performance is to mimic NVIDIA's hardware... Now I do hope you see the problem here.
  6. KainXS

    KainXS

    Joined:
    Sep 25, 2007
    Messages:
    5,600 (2.26/day)
    Thanks Received:
    501
    Well, when you ask Nvidia's support, you get a gray answer: it could have been for business purposes, or it could have been because of stability concerns.

    But one thing is for sure: despite that, using Nvidia cards for PhysX worked perfectly fine back when I had my old HD4870X2 with my 8600GTS, like 5 months ago. Then I sold my 4870X2 and got my old 8800GS out, and now on the newer drivers, when I use the GS for PhysX, it doesn't work with my friend's HD4890, but on the older drivers it works perfectly.

    It makes you wonder. Maybe they added a feature to the driver and it caused a bug, I don't know; it worked nicely before, so I don't know what happened. But it doesn't really affect me anymore, since I don't have an ATI card right now, so I don't really care either.


    But I look at it like this: Nvidia is a company, and a company's goal is to make money. If people buy their cards and can't use them (even though they could before) unless they remove their competitor's card, then that's a nice business stance right there (if not the BEST).
    Last edited: Sep 29, 2009
  7. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    13,315 (6.35/day)
    Thanks Received:
    3,342
    Location:
    IA, USA
    @HalfAHertz & KainXS: Exactly why I think a lawsuit is brewing. NVIDIA is taking part in anticompetitive behavior towards AMD.
    Crunching for Team TPU
  8. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.52/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    WTH, if there's one thing that is done much, much faster with the help of the GPU, it's encoding/decoding. Have you been living in a cave?

    SIMD (Single Instruction, Multiple Data) means that if you only need 4 SPs, you still have to "use" the whole group of them, that is 16 or 24, while truly only 4 are doing useful work...
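    That occupancy argument can be illustrated with a toy model of a lockstep SIMD group; the 16-lane width and job sizes below are illustrative, not any specific GPU's.

```python
def simd_utilization(group_width, jobs):
    """Each job needs `n` lanes, but a lockstep SIMD group must round every
    job up to whole groups. Returns useful lanes / issued lanes."""
    issued = sum(-(-n // group_width) * group_width for n in jobs)  # ceil-div
    useful = sum(jobs)
    return useful / issued

# A kernel that keeps only 4 lanes busy per 16-wide group...
u = simd_utilization(16, [4, 4, 4, 4])        # 0.25 -- 75% of lanes idle
# ...versus batching the work so whole groups are filled:
full = simd_utilization(16, [16, 16, 16, 16]) # 1.0
```

    The toy model matches the point above: small, unbatched jobs leave most of a wide SIMD unit idle even though all of it is "in use".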


    What can I say except: meh. The "I didn't notice" excuse is overused, man. Look at the links I provided above if you want to know what GPU physics can do.

    If you pay, you have access to the code and you can change it too. That's no different from Havok. The difference is that PhysX costs $50,000 + $1,000 per developer, while Havok cost $200,000 last time I checked.

    So everything is based on "I (FordGT90) want this, I want that, and I don't care about physics, so to hell with it". XD

    STALKER has good physics-based bullet trajectories, and so does ARMA, BTW.

    And you would pay twice?

    Yeah, the only problem is there is none, but I see too much PhysX bashing and no Havok bashing. Open minds are as required as open standards.

    Can I edit?

    But, they see no problem using Havok. How so?
  9. FordGT90Concept

    FordGT90Concept "I go fast!1!11!1!"

    Joined:
    Oct 13, 2008
    Messages:
    13,315 (6.35/day)
    Thanks Received:
    3,342
    Location:
    IA, USA
    I'm not getting in another pissing contest with you.
    MrMilli says thanks.
    Crunching for Team TPU
  10. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.52/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    Sorry, but it's the truth. Every time we have a discussion about physics in games, you come up with the same thing: I'd rather see this, or I'd rather see that, and I don't think it adds anything. You are not even close to accepting that a lot of people might want other things than the ones you want.

    Do I want better game mechanics? Of course.

    Will the lack of better physics ensure better or different game mechanics? No and no. On the contrary, the inclusion of better physics does nothing but increase the options for new game mechanics.

    PhysX is in no conflict with anything else in games. Moreover, GPU PhysX is just an added feature that doesn't interfere with the game. Is BM:AA without GPU physics any worse than other games? No, so why all the bashing? It should end right there. Seriously.
  11. RejZoR

    RejZoR

    Joined:
    Oct 2, 2004
    Messages:
    4,425 (1.24/day)
    Thanks Received:
    872
    Location:
    Europe/Slovenia
    @Benetanegia
    You seem to have an answer for everything... but not much of it makes sense.
    Why are you bringing Havok into all this? It works on ANY CPU. So what does ATI have to do with it?
    I can run Havok games on a VIA, Intel or AMD CPU. It doesn't matter. So do I care if it's proprietary technology? No, not really. Besides, where have you seen anything saying Havok is proprietary, coded specifically for Intel? Because that's just not true. Intel owns the Havok brand and the team behind it, but they don't make it proprietary because of that.
  12. tigger

    tigger I'm the only one

    Joined:
    Mar 20, 2006
    Messages:
    10,174 (3.35/day)
    Thanks Received:
    1,397
    I think physx sucks and always did.
  13. rpsgc

    rpsgc

    Joined:
    Feb 9, 2007
    Messages:
    695 (0.26/day)
    Thanks Received:
    133
    Location:
    Portugal
    Just ignore the fanboy.
  14. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.52/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    Why do I bring Havok into this? Because it's been said more than once that PhysX should die and Havok should be used instead. I'm bringing it up just for comparison. PhysX runs on every CPU too; it's not a proprietary API that runs only on Nvidia hardware, not at all. If you want the expanded capabilities, then yes, but if Nvidia didn't push for GPU PhysX in those games, you would get the same as if you disabled GPU PhysX. If you can't run GPU PhysX, you are not getting an inferior product.

    You ask why I say that Havok is coded specifically to run better on Intel. How do you know it isn't? How do you know that the reason Intel CPUs have almost always been better for games, even when AMD CPUs were much faster in general computing, was not exactly that? If there is one company that has been caught in disloyal and illegal behavior, it is Intel. I bring in Havok because there's as much proof of that happening there as there is of it happening with PhysX, that is, NONE.
  15. RejZoR

    RejZoR

    Joined:
    Oct 2, 2004
    Messages:
    4,425 (1.24/day)
    Thanks Received:
    872
    Location:
    Europe/Slovenia
    Lol, you're just not getting it. The usual crap PhysX runs on every CPU. But HW PhysX doesn't. Can't you tell those two apart!?
    And how do I know it's not coded for Intel? Um, maybe because I was running it smoothly on an AMD CPU? Isn't that proof enough by itself?
  16. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.52/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    By crap PhysX, you mean the one that is as good as Havok, that crap? The one that runs on AMD CPUs just as well as on Intel or VIA ones? What you don't get is that AMD doesn't want PhysX accelerated on their graphics cards, and that's the end of the story. When running on the CPU, it runs as well on AMD as it does on Intel. The good PhysX can't run on CPUs, period; it's time you got that already. :)

    If you think it can run, it's time you showed me equivalent physics running on CPUs. I'll save you time: there's none. It wasn't until Havok started GPU Havok that they started doing the same things. Do you get it? Nvidia wanted Ati/AMD to use PhysX; it was AMD who didn't want it. Nvidia is NOT making PhysX run better on their hardware than on the competition's, simply because it doesn't run on the competition's hardware, at that competitor's own request.
  17. HalfAHertz

    HalfAHertz

    Joined:
    May 4, 2009
    Messages:
    1,883 (0.99/day)
    Thanks Received:
    378
    Location:
    Singapore
    IMO this is a moot point in the discussion. You are not being objective. The PhysX code can run on an x86 CPU, but it is not optimised for it. It was only optimised to run on the Ageia PPU, just as it is currently only optimised to run on Nvidia graphics. It was never meant to run well on a CPU, because neither Ageia nor Nvidia produced CPUs. This will not change until the PhysX API becomes open source and somebody interested in developing it further picks it up.
    Currently no one can change the code except the proprietor. Nvidia is not going to waste their time and money optimising it for x86 with no foreseeable financial profit, because in the end they sell GPUs, not CPUs.

    I hope that by now you see my point; I don't really want to waste my time explaining this any further.
  18. [I.R.A]_FBi

    [I.R.A]_FBi New Member

    Joined:
    May 19, 2007
    Messages:
    7,664 (2.94/day)
    Thanks Received:
    540
    Location:
    c:\programs\kitteh.exe
    I'm for ignoring the fanboi... anyone else?
  19. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.52/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    False. Absolutely false. PhysX is running in over 100 games and it's doing very well; I mean that it is well optimized. GPU GPU GPU GPU GPU GPU GPU GPU PhysX, you get it? GPU PhysX, noooooooo, it's not optimized to run on the CPU, big surprise! It's GPU PhysX! Hey! It's GPU PhysX here, I'm not optimized to run well on CPUs, that's why my friend CPU PhysX comes along with me!

    So as long as CPU PhysX runs as well as other physics APIs, and it does, then everything is fine. Did I explain myself now??

    List of games that use hardware accelerated physics: http://www.nzone.com/object/nzone_physxgames_home.html
    List that use PhysX: http://www.nzone.com/object/nzone_physxgames_all.html

    I have to say one more thing, maybe you guys understand this way:

    Graphics-wise: if a game has been developed to run on an 8800 or faster, and that's the minimum requirement, do you try to run it on an 8400? I think not, right? This is the same. They could make it run on the CPU, sure, if they dumbed down the physics a lot, but then it would be just as crap as the normal CPU version is.

    To quote yourself:

    I hope that by now you see my point; I don't really want to waste my time explaining this any further.
    Last edited: Sep 29, 2009
  20. HalfAHertz

    HalfAHertz

    Joined:
    May 4, 2009
    Messages:
    1,883 (0.99/day)
    Thanks Received:
    378
    Location:
    Singapore
    So, in a way, you're contradicting yourself, correct? You yourself provided two separate links: one of games running the dumbed-down and simplified CPU PhysX, and a separate one of games using the more advanced GPU PhysX.
  21. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.52/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    No, I'm not contradicting myself. How so?

    Both are the same PhysX; both run the same libraries. The difference lies in the number of calculations being made. When creating the GPU version, the developer creates it using the power available in the GPU, which is an order of magnitude bigger than that of the CPU. That's why it's called the GPU version: it requires too much power, and only GPUs are capable of that (well, or the PPU, which is basically the same). On their own, developers would use only one version, and that version must run on every PC out there and on the consoles, so it's pretty dumbed down. This is no different with Havok (I name it because, along with PhysX, they have 50% of the games...) or any other API in use. To back this up, I took some screenshots in previous posts. So Nvidia convinced them to make one more version, and they make it strong enough that it's worth the effort, and for kicks of course. The developers wouldn't include flags, papers and such if they were not creating the GPU version. When I talk about the versions, take it as if I were saying low textures/high textures: the difference lies not in the form, but in the number of calculations being made.
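    The "same library, different workload" idea can be sketched like this: one identical update routine, with the CPU-level and GPU-level paths differing only in how many objects they push through it. The particle counts and debris parameters are invented for illustration.

```python
import random

def step_particles(particles, dt=0.016):
    """One integration step -- identical code whatever the particle count."""
    for p in particles:
        p["vy"] -= 9.8 * dt            # gravity
        p["x"] += p["vx"] * dt
        p["y"] += p["vy"] * dt
    return particles

def make_debris(n):
    """n bits of debris thrown from the same point with random velocities."""
    rng = random.Random(42)
    return [{"x": 0.0, "y": 1.0,
             "vx": rng.uniform(-1, 1), "vy": rng.uniform(0, 3)}
            for _ in range(n)]

# "CPU-level" vs "GPU-level" effects: same solver, just more work per frame.
cpu_level = step_particles(make_debris(200))
gpu_level = step_particles(make_debris(20000))
```

    The solver code never changes between the two levels; only the amount of work per frame does, which is the low-textures/high-textures analogy above.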
    Last edited: Sep 29, 2009
  22. RejZoR

    RejZoR

    Joined:
    Oct 2, 2004
    Messages:
    4,425 (1.24/day)
    Thanks Received:
    872
    Location:
    Europe/Slovenia
    Never mind. Arguing with someone who clearly doesn't understand game physics is not fun at all.
  23. Benetanegia

    Benetanegia New Member

    Joined:
    Sep 11, 2009
    Messages:
    2,683 (1.52/day)
    Thanks Received:
    694
    Location:
    Reaching your left retina.
    Much better than most of you apparently.
  24. Drizzt5

    Drizzt5 New Member

    Joined:
    Jul 17, 2008
    Messages:
    612 (0.28/day)
    Thanks Received:
    38
    I don't think Microsoft cares, though...
    For gaming, they are all about the Xbox 360.

    I wish they would though :)
  25. SNiiPE_DoGG New Member

    Joined:
    Apr 2, 2009
    Messages:
    582 (0.30/day)
    Thanks Received:
    135
    Point A) OMG, trash on the ground, that is the best part about PhysX!

    Point B) All of the games that use PhysX suck the big one; do I really need to post links? (And no, UT3 is not a PhysX game: it has one level that uses PhysX, and you need to download it separately.)

    Point C) You're completely missing the point that all the magic wonderful amazing GPU PHYSX CRAP you are promoting can easily be run on a CPU, now, today, using a proper CPU physics engine. MOST IMPORTANTLY: without a hit to FPS in the game!

    Now refute the points WITHOUT changing the subject, or GTFO.
