
PhysX only using one CPU core?

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
The future is not going to be CPUs or GPUs, it's going to be a mix

i.e. . . . I hate to say it, but a design similar to Larrabee's will be the CPU of the future, even though I pray Larrabee fails.

Of course it will be a mix, but not a mix on the same die, at least not for the performance and high-end markets. That would be suicide: a GPU die is already big, and so are today's CPU dies, so a mix would be impossible. Imagine a 1000 mm^2 chip with a 400 W TDP. No way.

But right now, if you have $500 to spend on a CPU and a GPU, a gamer will spend $200 on the CPU and $300 on the GPU, while a scientist, video editor or an economist (lol) would usually spend $450 on the CPU and $50 on the GPU. In the future it will most probably be $150 and $350, respectively, if not even more in favor of the GPU.
 
Joined
Oct 2, 2004
Messages
13,791 (1.94/day)
@Benetanegia
You still don't understand. In the past we at least had 6 nodes and 6 polygons. Today we don't even have that; the object is not even there in non-PhysX mode. That's just plain dumb.
We could talk about a normal version with 50 nodes and just as many polygons, and hi-def objects with thousands of nodes and twice as many polygons. But we don't get a normal version and a hi-def one. That's the WTF part for me.

Particles? I've seen them on a GeForce 2, in the Lightning demo, or in Quake 3 Arena with the CorkScrew mod.
The railgun slug emitted up to 999 physically affected sparks upon hitting a wall, and that was back in 2000. The breaking glass in Red Faction (2001) looked better than the PhysX glass in Mirror's Edge, except that the Red Faction version ran easily on an Athlon XP 2400+ while the PhysX glass can't run smoothly on a 4 GHz quad-core i7. If that doesn't raise any of your eyebrows, I'm not sure what will.

Cloth simulation was already done in the Unreal Engine (just look at the flags in UT99; they look freaking awesome even today, and it's a game from 1999). Not to such an extent, but it was done, on hardware 10 times weaker. Today? The flags are just gone. Missing. Not there. Unless you have an "uber" PhysX gfx card. Pheh. Clothing was also simulated in the Hitman series and in Splinter Cell. Again in a slightly simplified way, but it was there, running smoothly on an Athlon XP 2400+.
Don't tell me they couldn't do all of that, tenfold, on today's hardware.
Instead, they'd rather remove the object entirely than include a normal-definition version.

I don't care how much horsepower you need today or how many clusters gfx cards have. That's completely irrelevant. What matters is the relation between elapsed time and hardware performance gains, compared to what we've seen in the past.
If we saw basic cloth simulation, advanced particles, destruction, ragdolls etc. in 2000 on laughably weak hardware (by today's standards), one would expect at least that level, or a tenfold improvement, today on powerful quad cores, loads of memory and graphics cards 10 times faster. But have we seen any of it? OK, partially, on PhysX-enabled graphics cards. But what about CPU physics? Flags are just gone, broken glass fades out before it even hits the ground, environments are static, etc. Thank god at least ragdolls remained.
I wouldn't mind blocky flags like the ones in games from 2000. At least they were there and I could interact with them. But I don't even get the flag, because I don't have a GeForce card. It's just gone. Entirely. LMAO.

And it's open, and it would have been open to Ati users if, when Nvidia offered PhysX to Ati for free with no conditions, they had said yes; or when the guy from ngohq.com made a modded driver that made it possible, if they had supported him, like Nvidia did, instead of scaring the hell out of him with demands. But of course, back then Ati had nothing to compete in that arena and Nvidia cards would have destroyed theirs, so they said no no nooo! And now poor Ati users can't do anything but cry and say it's not that great.

You're not looking at the bigger picture. Sure, the guy at NGOHQ made a PhysX hack. But imagine all the crap users would be throwing at ATI if this hack failed to work properly in certain games or if games were crashing. No one would blame NGOHQ; they would rush to blame ATI instead. Been there, seen that numerous times with Microsoft: it was an NVIDIA driver that crashed, and users were spitting on Windows Vista and how crappy it was made. But it wasn't even Vista's fault. It was NVIDIA's driver (or any other, for that matter) that crashed.
So I perfectly understand why ATI refused to cooperate. They'd go the PhysX way if NVIDIA sent them the entire documentation, SDKs, everything, with the same full capability NVIDIA has. But supporting a hack made by a 3rd party? That's just not logical, sorry. Why don't laptop companies support any graphics drivers other than the ones on their webpage? For the exact same reason: troubleshooting, tech support, and the bad word that could spread about brand "X" because someone with hacked drivers fried his graphics card or something.

I don't mind PhysX; it's a great thing actually. I just hate the way NVIDIA is pushing it around and placing stupid restrictions on it. All these stupid restrictions do is damage the evolution of games and physics. Imagine what would happen if PhysX were an open standard that anyone could implement and use on ANY graphics card. I can bet 1000 EUR that we'd see at least 5 high-profile games with awesome physics effects that everyone could enjoy, not just GeForce users. It's really that simple.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
But imagine all the crap users would be throwing at ATI if this hack failed to work properly in certain games or if games were crashing.

I agree with many of the points you made there. Regarding not having the simple version and all, it sucks, but please pay attention to what I quoted, your own words, and after thinking about it, tell me: is the lack of those things Nvidia's fault, or the developers' fault? Is it the API's fault, or how the API has been used by the developer? Why are you blaming Nvidia?

EDIT: In Mirror's Edge, for example, flags and most cloth objects are replaced by simpler animated ones. Different developer, different decision.

The same reasoning applies to another thing that has been questioned here: that Nvidia removed the option of running Ati+Nvidia for physics. Was it a malicious move, or a business decision driven by the fact that they couldn't properly test whether it would work well with all the cards? Newer Ati cards? What would happen after an Ati driver update? Would it still work? Who would get the blame if it didn't? And why should they do all the research when it was Ati who would get the benefit? Yeah, Nvidia would benefit a bit too, but in their estimation the money they'd have to spend was probably more than what they'd get back. Business decision, end of story.

Sorry, but I have to ask why so many of you Ati owners can think so thoroughly about some things, as you did above, yet other times can't come up with what I've just said. If that's not bias, what is it?
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
Tell me a task that requires a heavy amount of complex calculations and can't be split into simple ones, as is the case with F@H.
Encoding/decoding. It is more economical to do on 10 CPUs than on 1 GPU. Not to mention how much more memory processors have available to them compared to GPUs, and the more direct link to the hard drive(s).

There are no SSE instructions designed to help with the kind of physics F@H uses. As such, they have to brute-force it, and the bigger your hammer, the more damage it does. An instruction set designed for physics calculations would pretty much put CPUs back on top.
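To illustrate what that brute forcing looks like (a hypothetical sketch of my own, not F@H or PhysX code): with no physics-specific instructions, the CPU leans on generic 4-wide SSE float math, integrating particle positions four at a time.

```cpp
#include <xmmintrin.h>  // SSE intrinsics

// Hypothetical sketch: step particle positions with general-purpose SSE.
// Assumes count is a multiple of 4.
void integrate(float* pos, const float* vel, float dt, int count) {
    __m128 vdt = _mm_set1_ps(dt);               // broadcast dt to all 4 lanes
    for (int i = 0; i < count; i += 4) {
        __m128 p = _mm_loadu_ps(pos + i);       // load 4 positions
        __m128 v = _mm_loadu_ps(vel + i);       // load 4 velocities
        p = _mm_add_ps(p, _mm_mul_ps(v, vdt));  // p += v * dt, 4 lanes at once
        _mm_storeu_ps(pos + i, p);              // store back
    }
}
```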



If you dedicate SPs, obviously they are being used: for physics, not for graphics.


** Simple compared to what GPU PhysX can do, but it's still way more complex than any other smoke seen in a game to date.
Can't say I noticed. And by the way, I do remember the smoke in Arkham Asylum pissing me off. I don't remember if it was because of a framerate drop (8800 GT as well, albeit sickly for the last few months) or because I couldn't see shit. Either way, it annoyed me, and the game would be better off without it.


How many times do I have to explain this? PhysX is a multiplatform API that can run on the CPU or the GPU (or the Ageia PPU, Cell, Xenos). It will take as much as it can from everywhere. If there is no CUDA-compatible card or Ageia PPU, it runs everything on the CPU***. There's no difference between the (GPU) expanded mode and the standard mode, except that it adds a lot of detail*. Why else do you think you can run the enhanced mode without an Nvidia card?
Have you examined NVIDIA's source code to confirm that? Of course not--it's closed source. What we do know is that PhysX is extremely biased towards NVIDIA/Ageia hardware and as such, it is bad for the market.



The existence of 3rd-party physics developers is a good thing actually. Not only do they save developers a lot of money and time, they also have higher expertise. Would you want Dell to start making CPUs, GPUs, RAM, etc., instead of buying them from other companies that are 100% dedicated to their respective products? Outsourcing is essential nowadays.
If there was an open standard for physics, money wouldn't be involved. It would work as well as manufacturers and developers make it work. Truth be told, I doubt there is enough demand for a scientific physics API because games don't need 100% accurate physics--they need 70% accurate physics, which means at least a 10,000% reduction in workload. Accurate physics is about the only thing in game development few people care about. Hell, the last game I saw that used physics on bullet trajectories to a positive end was back in the 1990s with Delta Force: Land Warrior and Task Force Dagger. That was nice. Did it require a beefy CPU and GPU? Nope and nope. Basic physics is more than enough. I'd rather they focus their attention on gameplay mechanics, like adding more variety to side missions.

If Dell makes a good product, why not? Competition is good--open standards breed fair competition.
 
Joined
May 4, 2009
Messages
1,970 (0.36/day)
Location
Bulgaria
System Name penguin
Processor R7 5700G
Motherboard Asrock B450M Pro4
Cooling Some CM tower cooler that will fit my case
Memory 4 x 8GB Kingston HyperX Fury 2666MHz
Video Card(s) IGP
Storage ADATA SU800 512GB
Display(s) 27' LG
Case Zalman
Audio Device(s) stock
Power Supply Seasonic SS-620GM
Software win10
Read my post. PhysX is proprietary software coded for NVIDIA HARDWARE. Ati cannot change anything about it, so the only way to come close to the same level of performance is to mimic NVIDIA's hardware... Now I do hope you see the problem here.
 
Joined
Sep 25, 2007
Messages
5,965 (0.99/day)
Location
New York
Processor AMD Ryzen 9 5950x, Ryzen 9 5980HX
Motherboard MSI X570 Tomahawk
Cooling Be Quiet Dark Rock Pro 4(With Noctua Fans)
Memory 32Gb Crucial 3600 Ballistix
Video Card(s) Gigabyte RTX 3080, Asus 6800M
Storage Adata SX8200 1TB NVME/WD Black 1TB NVME
Display(s) Dell 27 Inch 165Hz
Case Phanteks P500A
Audio Device(s) IFI Zen Dac/JDS Labs Atom+/SMSL Amp+Rivers Audio
Power Supply Corsair RM850x
Mouse Logitech G502 SE Hero
Keyboard Corsair K70 RGB Mk.2
VR HMD Samsung Odyssey Plus
Software Windows 10
Well, when you ask nvidia's support you get a gray answer: it could have been for business purposes, or it could have been because of stability concerns.

But one thing is for sure: using nvidia cards for PhysX worked perfectly fine back when I had my old HD4870X2 with my 8600GTS, like 5 months ago. Then I sold my 4870X2 and got out my old 8800GS, and now, on the newer drivers, when I use my GS for PhysX it doesn't work with my friend's HD4890, though on the older drivers it works perfectly.

It makes you wonder. Maybe they added a feature to the driver and it caused a bug, I don't know; it worked nicely before, so I don't know what happened. It doesn't really affect me anymore, though, since I don't have an ATI card right now, so I don't really care either.


But I look at it like this: Nvidia is a company, and a company's goal is to make money. If people buy their cards and can't use them (even though they could before) unless they remove their competitor's card, that's a nice business stance right there (if not the BEST).
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
@HalfAHertz & KainXS: Exactly why I think a lawsuit is brewing. NVIDIA is taking part in anticompetitive behavior towards AMD.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Encoding/decoding. It is more economical to do on 10 CPUs than on 1 GPU. Not to mention how much more memory processors have available to them compared to GPUs, and the more direct link to the hard drive(s).

WTH, if there's one thing that's done much, much faster with the help of the GPU, it's encoding/decoding. Have you been living in a cave?

If you dedicate SPs, obviously they are being used: for physics, not for graphics.

SIMD (Single Instruction, Multiple Data) means that if you only need 4 SPs, you still have to "use" the whole group, that is 16 or 24 of them, but in truth you are only using 4...
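A rough CPU-side analogy of that lane waste (a hypothetical sketch, not how a GPU actually schedules work): a 4-wide SSE register executes all 4 lanes even when only one result is wanted, just like a 16- or 24-wide SP group running 4 useful threads.

```cpp
#include <xmmintrin.h>

// Hypothetical illustration of SIMD lane waste: we want one sum,
// but the 4-wide hardware computes all 4 lanes regardless.
float add_one(float a, float b) {
    __m128 va = _mm_set_ps(0.0f, 0.0f, 0.0f, a);  // 3 of the 4 lanes are padding
    __m128 vb = _mm_set_ps(0.0f, 0.0f, 0.0f, b);
    __m128 vr = _mm_add_ps(va, vb);               // all 4 lanes execute anyway
    return _mm_cvtss_f32(vr);                     // keep lane 0, discard the rest
}
```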


Can't say I noticed. And by the way, I do remember the smoke in Arkham Asylum pissing me off. I don't remember if it was because of a framerate drop (8800 GT as well, albeit sickly for the last few months) or because I couldn't see shit. Either way, it annoyed me, and the game would be better off without it.

What can I say except meh. The "I didn't notice" excuse is overused, man. Look at the links I provided above if you want to know what GPU physics can do.

Have you examined NVIDIA's source code to confirm that? Of course not--it's closed source. What we do know is that PhysX is extremely biased towards NVIDIA/Ageia hardware and as such, it is bad for the market.

If you pay, you have access to the code and you can change it too. That's no different from Havok. The difference is that PhysX costs $50,000 + $1,000 per developer, while Havok cost $200,000 last time I checked.

If there was an open standard for physics, money wouldn't be involved. It would work as well as manufacturers and developers make it work. Truth be told, I doubt there is enough demand for a scientific physics API because games don't need 100% accurate physics--they need 70% accurate physics, which means at least a 10,000% reduction in workload. Accurate physics is about the only thing in game development few people care about. Hell, the last game I saw that used physics on bullet trajectories to a positive end was back in the 1990s with Delta Force: Land Warrior and Task Force Dagger. That was nice. Did it require a beefy CPU and GPU? Nope and nope. Basic physics is more than enough. I'd rather they focus their attention on gameplay mechanics, like adding more variety to side missions.

So everything is based on "I (FordGT90) want this, I want that, and I don't care about physics, so to hell with it." XD

STALKER has good physics-based bullet trajectories, and so does ARMA, BTW.

If Dell makes a good product, why not?

And you would pay twice?

Competition is good--open standards breed fair competition.

Yeah, the only problem is that there is none; I see too much PhysX bashing and no Havok bashing. Open minds are as necessary as open standards.

Read my post. PhysX is proprietary software coded for NVIDIA HARDWARE. Ati cannot change anything about it, so the only way to come close to the same level of performance is to mimic NVIDIA's hardware... Now I do hope you see the problem here.

Can I edit?

Read my post. Havok is proprietary software coded for Intel hardware. Ati cannot change anything about it, so the only way to come close to the same level of performance is to mimic Intel's hardware... Now I do hope you see the problem here.

But they see no problem using Havok. How so?
 

FordGT90Concept

"I go fast!1!11!1!"
Joined
Oct 13, 2008
Messages
26,259 (4.65/day)
Location
IA, USA
System Name BY-2021
Processor AMD Ryzen 7 5800X (65w eco profile)
Motherboard MSI B550 Gaming Plus
Cooling Scythe Mugen (rev 5)
Memory 2 x Kingston HyperX DDR4-3200 32 GiB
Video Card(s) AMD Radeon RX 7900 XT
Storage Samsung 980 Pro, Seagate Exos X20 TB 7200 RPM
Display(s) Nixeus NX-EDG274K (3840x2160@144 DP) + Samsung SyncMaster 906BW (1440x900@60 HDMI-DVI)
Case Coolermaster HAF 932 w/ USB 3.0 5.25" bay + USB 3.2 (A+C) 3.5" bay
Audio Device(s) Realtek ALC1150, Micca OriGen+
Power Supply Enermax Platimax 850w
Mouse Nixeus REVEL-X
Keyboard Tesoro Excalibur
Software Windows 10 Home 64-bit
Benchmark Scores Faster than the tortoise; slower than the hare.
I'm not getting into another pissing contest with you.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Sorry, but it's the truth. Every time we have a discussion about physics in games, you come up with the same: "I'd rather see this" or "I'd rather see that" and "I don't think it adds anything." You are not even close to accepting that a lot of people might want things other than the ones you want.

Do I want better game mechanics? Of course.

Will the lack of better physics ensure better or different game mechanics? No and no. On the contrary, the inclusion of better physics does nothing but increase the options for new game mechanics.

PhysX is in no conflict with anything else in games. Moreover, GPU PhysX is just an added feature that doesn't interfere with the game. Is BM:AA without GPU physics any worse than other games? No, so why all the bashing? It should end there. Seriously.
 
Joined
Oct 2, 2004
Messages
13,791 (1.94/day)
@Benetanegia
You seem to have an answer for everything... but not much of it makes sense.
Why are you bringing Havok into all this? It works on ANY CPU. So what does ATI have to do with it?
I can run Havok games on a VIA, Intel or AMD CPU; it doesn't matter. So do I care if it's proprietary technology? No, not really. Besides, where have you seen any evidence that Havok is proprietary, coded specifically for Intel? Because that's just not true. Intel owns the Havok brand and the team behind it, but that doesn't make it proprietary.
 
Joined
Feb 9, 2007
Messages
696 (0.11/day)
Location
Portugal
System Name Quiet Gaming PC v5
Processor AMD Ryzen 5 5600
Motherboard Gigabyte B550 Aorus Elite V2 (rev. 1.2)
Cooling Thermalright TRUE Rev. C
Memory 2x16GB TeamGroup Vulcan Z Red DDR4-3600 CL18
Video Card(s) XFX Speedster SWFT 309 RX 6700 XT
Storage WD Blue SN550 1TB NVMe // TeamGroup MP34 4TB NVMe // Crucial MX100 512GB
Display(s) MSI G27CQ4 E2
Case Fractal Design Define R4
Audio Device(s) Realtek ALC1200
Power Supply Corsair RM850x
Mouse Logitech G502 Hero
Keyboard MKPlus Slayer M3 RGB Mechanical Keyboard
Software Windows 11 Pro
Just ignore the fanboy.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
@Benetanegia
You seem to have an answer for everything... but not much of it makes sense.
Why are you bringing Havok into all this? It works on ANY CPU. So what does ATI have to do with it?
I can run Havok games on a VIA, Intel or AMD CPU; it doesn't matter. So do I care if it's proprietary technology? No, not really. Besides, where have you seen any evidence that Havok is proprietary, coded specifically for Intel? Because that's just not true. Intel owns the Havok brand and the team behind it, but that doesn't make it proprietary.

Why do I bring Havok into this? Because it's been said more than once that PhysX should die and Havok should be used instead. I'm bringing it up just for comparison. PhysX runs on every CPU too; it's not a proprietary API that runs only on Nvidia hardware, not at all. If you want the expanded capabilities, then yes, but if Nvidia hadn't pushed for GPU PhysX in those games, you would get the same thing you get when you disable GPU PhysX. If you can't run GPU PhysX, you are not getting an inferior product.

You ask why I say that Havok is coded specifically to run better on Intel. How do you know it isn't? How do you know that's not the reason Intel CPUs have almost always been better for games, even when AMD CPUs were much faster at general computing? And if there is one company that has been caught in disloyal and illegal behavior, it's Intel. I bring up Havok because there's exactly as much proof of that happening as there is of it happening with PhysX, which is NONE.
 
Joined
Oct 2, 2004
Messages
13,791 (1.94/day)
Lol, you're just not getting it. The usual crap PhysX runs on every CPU, but HW PhysX doesn't. Can't you keep those two apart!?
And how do I know it's not coded for Intel? Um, maybe because I was running it smoothly on an AMD CPU? Isn't that proof enough by itself?
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
Lol, you're just not getting it. The usual crap PhysX runs on every CPU, but HW PhysX doesn't. Can't you keep those two apart!?
And how do I know it's not coded for Intel? Um, maybe because I was running it smoothly on an AMD CPU? Isn't that proof enough by itself?

By crap PhysX, you mean the kind that's as good as Havok, that crap? The kind that runs on AMD CPUs just as well as on Intel or VIA ones? What you don't get is that AMD doesn't want PhysX accelerated on their graphics cards, and that's the end of the story. When running on the CPU, it runs as well on AMD as it does on Intel. The good PhysX can't run on CPUs, period; it's time you got that already. :)

If you think it can, it's time you showed me equivalent physics running on CPUs. I'll save you the time: there's none. It wasn't until Havok started GPU Havok that they began doing the same things. Do you get it? Nvidia wanted Ati/AMD to use PhysX; it was AMD who didn't want it. Nvidia is NOT making PhysX run better on their hardware than on the competition's, simply because it doesn't run on the competition's hardware, at the competitor's own request.
 
Joined
May 4, 2009
Messages
1,970 (0.36/day)
Location
Bulgaria
System Name penguin
Processor R7 5700G
Motherboard Asrock B450M Pro4
Cooling Some CM tower cooler that will fit my case
Memory 4 x 8GB Kingston HyperX Fury 2666MHz
Video Card(s) IGP
Storage ADATA SU800 512GB
Display(s) 27' LG
Case Zalman
Audio Device(s) stock
Power Supply Seasonic SS-620GM
Software win10
Why do I bring Havok into this? Because it's been said more than once that PhysX should die and Havok should be used instead. I'm bringing it up just for comparison. PhysX runs on every CPU too; it's not a proprietary API that runs only on Nvidia hardware, not at all. If you want the expanded capabilities, then yes, but if Nvidia hadn't pushed for GPU PhysX in those games, you would get the same thing you get when you disable GPU PhysX. If you can't run GPU PhysX, you are not getting an inferior product.

You ask why I say that Havok is coded specifically to run better on Intel. How do you know it isn't? How do you know that's not the reason Intel CPUs have almost always been better for games, even when AMD CPUs were much faster at general computing? And if there is one company that has been caught in disloyal and illegal behavior, it's Intel. I bring up Havok because there's exactly as much proof of that happening as there is of it happening with PhysX, which is NONE.

IMO this is a moot point in the discussion. You are not being objective. The PhysX code can run on an x86 CPU but is not optimised for it. It was only optimised to run on the Ageia PPU, just as it is currently only optimised to run on Nvidia graphics. It was never meant to run well on a CPU, because neither Ageia nor Nvidia produced CPUs. This will not change until the PhysX API becomes open source and somebody interested in developing it further picks it up.
Currently no one can change the code except the proprietor. Nvidia is not going to waste their time and money optimising it for x86 with no foreseeable financial profit, because in the end they sell GPUs, not CPUs.

I hope that by now you see my point; I don't really want to waste my time explaining this any further.
 
Joined
May 19, 2007
Messages
7,662 (1.24/day)
Location
c:\programs\kitteh.exe
Processor C2Q6600 @ 1.6 GHz
Motherboard Anus PQ5
Cooling ACFPro
Memory GEiL2 x 1 GB PC2 6400
Video Card(s) MSi 4830 (RIP)
Storage Seagate Barracuda 7200.10 320 GB Perpendicular Recording
Display(s) Dell 17'
Case El Cheepo
Audio Device(s) 7.1 Onboard
Power Supply Corsair TX750
Software MCE2K5
I'm for ignoring the fanboi... anyone else?
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
IMO this is a moot point in the discussion. You are not being objective. The PhysX code can run on an x86 CPU but is not optimised for it. It was only optimised to run on the Ageia PPU, just as it is currently only optimised to run on Nvidia graphics. It was never meant to run well on a CPU, because neither Ageia nor Nvidia produced CPUs. This will not change until the PhysX API becomes open source and somebody interested in developing it further picks it up.
Currently no one can change the code except the proprietor. Nvidia is not going to waste their time and money optimising it for x86 with no foreseeable financial profit, because in the end they sell GPUs, not CPUs.

I hope that by now you see my point; I don't really want to waste my time explaining this any further.

False. Absolutely false. PhysX is running in over 100 games and it's doing very well; I mean, it is well optimized. GPU GPU GPU GPU GPU GPU GPU GPU PhysX, you get it? GPU PhysX, noooooooo, it's not optimized to run on the CPU, big surprise! It's GPU PhysX! Ey! "It's GPU PhysX here, I'm not optimized to run well on CPUs; that's why my friend CPU PhysX comes along with me!"

So as long as CPU PhysX runs as well as other physics APIs, and it does, then everything is fine. Did I explain myself now??

List of games that use hardware-accelerated physics: http://www.nzone.com/object/nzone_physxgames_home.html
List of games that use PhysX: http://www.nzone.com/object/nzone_physxgames_all.html

I have to say one more thing; maybe you guys will understand it this way:

Graphics-wise: if a game has been developed to run on an 8800 or faster, and that's the minimum requirement, do you try to run it on an 8400? I think not, right? This is the same thing. They could make it run on the CPU, sure, if they dumbed the physics down a lot, but then it would be just as crap as the normal CPU version is.

To quote yourself:

I hope that by now you see my point; I don't really want to waste my time explaining this any further.
 
Joined
May 4, 2009
Messages
1,970 (0.36/day)
Location
Bulgaria
System Name penguin
Processor R7 5700G
Motherboard Asrock B450M Pro4
Cooling Some CM tower cooler that will fit my case
Memory 4 x 8GB Kingston HyperX Fury 2666MHz
Video Card(s) IGP
Storage ADATA SU800 512GB
Display(s) 27' LG
Case Zalman
Audio Device(s) stock
Power Supply Seasonic SS-620GM
Software win10
False. Absolutely false. PhysX is running in over 100 games and it's doing very well; I mean, it is well optimized. GPU GPU GPU GPU GPU GPU GPU GPU PhysX, you get it? GPU PhysX, noooooooo, it's not optimized to run on the CPU, big surprise! It's GPU PhysX! Ey! "It's GPU PhysX here, I'm not optimized to run well on CPUs; that's why my friend CPU PhysX comes along with me!"

So as long as CPU PhysX runs as well as other physics APIs, and it does, then everything is fine. Did I explain myself now??

List of games that use hardware-accelerated physics: http://www.nzone.com/object/nzone_physxgames_home.html
List of games that use PhysX: http://www.nzone.com/object/nzone_physxgames_all.html

I have to say one more thing; maybe you guys will understand it this way:

Graphics-wise: if a game has been developed to run on an 8800 or faster, and that's the minimum requirement, do you try to run it on an 8400? I think not, right? This is the same thing. They could make it run on the CPU, sure, if they dumbed the physics down a lot, but then it would be just as crap as the normal CPU version is.

To quote yourself:

I hope that by now you see my point; I don't really want to waste my time explaining this any further.

So, in a way, you're contradicting yourself, correct? You yourself provided two separate links: one of games running the dumbed-down and simplified CPU PhysX, and a separate one of games using the more advanced GPU PhysX.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
So, in a way, you're contradicting yourself, correct? You yourself provided two separate links: one of games running the dumbed-down and simplified CPU PhysX, and a separate one of games using the more advanced GPU PhysX.

No, I'm not contradicting myself. How so?

Both are the same PhysX; both run the same libraries. The difference lies in the number of calculations being made. When creating the GPU version, the developer builds it around the power available in the GPU, which is an order of magnitude greater than the CPU's. That's why it's called the GPU version: it requires too much power, and only GPUs are capable of that (well, or the PPU, which is basically the same). Left on their own, developers would use only one version, and that version must run on every PC out there and on the consoles, so it's pretty dumbed down. This is no different with Havok (I name it because, along with PhysX, they have 50% of the games...) or any other API in use. To back this up, I posted some screenshots in previous posts. So Nvidia convinced them to make one more version, and they made it strong enough that it's worth the effort, and for kicks of course. The developers wouldn't include flags, papers and such if they were not creating the GPU version. When I talk about the versions, take it as if I were saying low textures/high textures: the difference is not in the form, but in the number of calculations being made.
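To make the low-textures/high-textures comparison concrete, here is a hypothetical sketch (my own, not PhysX SDK code): the simulation path is identical, and only the workload tier changes with the hardware that is present.

```cpp
#include <cstdio>

// Hypothetical sketch: one simulation, two workload tiers,
// like low vs. high texture settings.
struct PhysicsDetail {
    int debrisParticles;  // pieces spawned when something shatters
    int clothNodes;       // resolution of a flag/cloth mesh
};

PhysicsDetail pickDetail(bool gpuAccelerated) {
    if (gpuAccelerated)
        return {5000, 2048};  // GPU tier: an order of magnitude more work
    return {200, 64};         // CPU tier: the dumbed-down baseline every PC runs
}

int main() {
    PhysicsDetail d = pickDetail(false);  // no CUDA-capable card: CPU tier
    std::printf("debris=%d, cloth nodes=%d\n", d.debrisParticles, d.clothNodes);
    return 0;
}
```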
 
Joined
Jul 17, 2008
Messages
643 (0.11/day)
System Name PIA
Processor Ryzen 5 3600
Motherboard ASRock B550M Steel Legend
Cooling Corsair H50
Memory 2x8GB Gskill DDR4 3600
Video Card(s) MSI GTX 1660 Super
Storage Samsung EVO 970 1TB
Display(s) 2xAOC Curved 24" 144hz
Case Cooler Master MasterBox Q300L
Power Supply CORSAIR RMX (White) RM750x
Mouse Logitech G305
Keyboard Logitech G915 TKL
Software Windows 10 Pro
You know who I would like to step in here? Microsoft. For the most part, Windows is the road these cards drive on, like we drive our cars down the road. We as motorists are restricted to a set of standards (speed limits, safety equipment, etc.) that we must conform to. I'd like to see Microsoft step up and say, "OK, this is the way it is going to be done," set up standards for Windows, and work in collaboration with hardware manufacturers. Have unified physics and the like, and let the video card companies duel it out through performance.

I don't think Microsoft cares, though...
For gaming, they are all about the Xbox 360.

I wish they would though :)
 
Joined
Apr 2, 2009
Messages
582 (0.11/day)
System Name Flow
Processor AMD Phenom II 955 BE
Motherboard MSI 790fx GD70
Cooling Water
Memory 8gb Crucial Ballistix Tracer Blue ddr3 1600 c8
Video Card(s) 2 x XFX 6850 - Yet to go under water.
Storage Corsair s128
Display(s) HP 24" 1920x1200
Case Custom Lian-Li V2110b
Audio Device(s) Auzentech X-Fi Forte 7.1
Power Supply Corsair 850HX
Point A) OMG, trash on the ground, that is the best part about PhysX!

Point B) All of the games that use PhysX suck the big one. Do I really need to post links? (And no, UT3 is not a PhysX game; it has one level that uses PhysX, and you need to DL it separately.)

Point C) You're completely missing the point that all of the magic wonderful amazing GPU PHYSX CRAP you are promoting can easily be run on a CPU, now, today, using a proper CPU physics engine. MOST IMPORTANTLY: without a hit to FPS in the game!

Now refute the points WITHOUT changing the subject, or GTFO.
 