
Why should PhysX matter?!

  • Thread starter: Deleted member 185158

Would PhysX improve your experience?

  • Would like to see higher particle counts!!! (24 votes, 48.0%)
  • Somewhat interested, but only have AMD (8 votes, 16.0%)
  • Don't want to run more than one card (NV) (2 votes, 4.0%)
  • Have heard about it, not informed. (0 votes, 0.0%)
  • Don't care, not interested. (16 votes, 32.0%)

  • Total voters: 50
And it can seamlessly be combined with RTRT...

...because you can do both at the same time at little additional resource cost.

Haha, you think? So we're going to calculate those rays on the physics effects too... I strongly doubt that's ever going to be smooth - well, maybe at 30 fps.
 
I have one on my shelf. You'll be limited by drivers, and only a few games support the PhysX PPU. You're better off using a dedicated GPU like I did.

Ok, understood.
But I'm going to run Ageia PhysX, not NV PhysX.
Have lots of NV GPUs. I've done quite a fair share of PhysX gaming through the years, for sure ;)
 
The most visible aspect of PhysX is particle effects, and the best-known thing about PhysX itself is the controversy over the PPU, GPU acceleration and the poor CPU-based library. Today, a lot of that is in the past: it has been rewritten to run well on CPUs thanks to good (enough) threading and proper compilation (the case since version 3.0, back in 2011), GPU acceleration is rarely (if ever) used, and since December 2018 PhysX is open source.

PhysX is a physics library. It is a middleware real-time physics engine used in a variety of games and engines (both Unreal and Unity include PhysX support) as well as some notable software (Autodesk applications). There are not many big direct competitors: Havok comes to mind, there is a lot of smaller physics middleware available, and it seems common enough to roll your own if needed.
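For a sense of what "middleware" means in practice, here's a minimal sketch of driving the PhysX SDK directly (PhysX 4.x-style API; names and signatures vary between SDK versions, so treat it as illustrative, not definitive):

```cpp
// Minimal PhysX 4.x setup: foundation, physics, scene, one falling box.
#include <PxPhysicsAPI.h>

using namespace physx;

static PxDefaultAllocator     gAllocator;
static PxDefaultErrorCallback gErrorCallback;

int main()
{
    // Foundation and physics objects anchor the whole SDK.
    PxFoundation* foundation =
        PxCreateFoundation(PX_PHYSICS_VERSION, gAllocator, gErrorCallback);
    PxPhysics* physics =
        PxCreatePhysics(PX_PHYSICS_VERSION, *foundation, PxTolerancesScale());

    // Scene with gravity and a CPU dispatcher (2 worker threads).
    PxSceneDesc sceneDesc(physics->getTolerancesScale());
    sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
    sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2);
    sceneDesc.filterShader  = PxDefaultSimulationFilterShader;
    PxScene* scene = physics->createScene(sceneDesc);

    // Drop a dynamic box onto the world.
    PxMaterial* material = physics->createMaterial(0.5f, 0.5f, 0.1f);
    PxRigidDynamic* box = PxCreateDynamic(
        *physics, PxTransform(PxVec3(0, 10, 0)),
        PxBoxGeometry(1, 1, 1), *material, 1.0f);
    scene->addActor(*box);

    // Step the simulation at 60 Hz for two seconds of sim time.
    for (int i = 0; i < 120; ++i) {
        scene->simulate(1.0f / 60.0f);
        scene->fetchResults(true);
    }

    scene->release();
    physics->release();
    foundation->release();
    return 0;
}
```

A game engine wraps all of this behind its own component system; that wrapping is the "middleware" part.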
 
PhysX matters because it's a very capable physics platform, and thanks to advancements in compute power it has become practically "free" (especially on Turing GPUs). NVIDIA drivers can also auto-select between GPU (CUDA) and CPU for PhysX worker threads to minimize the FPS impact.
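The driver-level GPU/CPU selection isn't something a game controls directly, but for a flavor of how GPU PhysX is wired up at the SDK level, here's a hedged sketch against the PhysX 4.x API (it reuses the `foundation` and `physics` objects from the snippet above; exact flags and field names vary between SDK versions):

```cpp
// Sketch: opting a PhysX 4.x scene into GPU (CUDA) rigid-body simulation,
// with the CPU dispatcher as the fallback path.
PxCudaContextManagerDesc cudaDesc;
PxCudaContextManager* cudaCtx =
    PxCreateCudaContextManager(*foundation, cudaDesc);

PxSceneDesc sceneDesc(physics->getTolerancesScale());
sceneDesc.gravity       = PxVec3(0.0f, -9.81f, 0.0f);
sceneDesc.cpuDispatcher = PxDefaultCpuDispatcherCreate(2); // CPU workers
sceneDesc.filterShader  = PxDefaultSimulationFilterShader;

if (cudaCtx && cudaCtx->contextIsValid()) {
    // Route rigid-body solving and the broad phase to the GPU.
    sceneDesc.cudaContextManager = cudaCtx;
    sceneDesc.flags |= PxSceneFlag::eENABLE_GPU_DYNAMICS;
    sceneDesc.broadPhaseType = PxBroadPhaseType::eGPU;
}
PxScene* scene = physics->createScene(sceneDesc);
```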


"Practically free" is a misnomer. It's not free: either it takes hardware resources that could be used to do pixel work, or the extra hardware added to lessen the impact could have been engineered to push more pixels instead. I get it's the same choice we all make when purchasing something, but at the end of the day it's a failed start due to being proprietary tech. Consoles use CPU PhysX; none of the competition uses it or is "allowed" to use it, despite it being shown to work on other hardware. It's merely a cherry on the ice cream for those who want it; the rest of the users would rather that cherry be more performance, or something that works across enough platforms to raise the adoption rate.
 
Agreed. Seriously under-utilized. If NVIDIA had been smart, they would have licensed the PhysX IP to all GPU makers and made it an industry-standard feature set.

It was on the table... no one is going to invest millions in R&D and then give the tech away. AMD could have any nVidia tech they want; they just have to ante up and pay the piper. AMD's mindset, however, seems to be: "Well, if we license the tech, nVidia will have a cost advantage, and low cost has always been our thing." Instead they develop an alternate technology, call it something similar and take the "fake news" approach, telling everyone it's "the same thing at lower cost"... and only the second part is true. Case in point: Freesync.


With Freesync: no MBR tech, scaler lag, overdrive issues as refresh rate changes, questionable HDR certification on some models, no FALD, lower refresh rates on the same panels.
Of course, using more high-end tech has downsides, as G-Sync has fewer cable connection options, costs more and, by not using a scaler, can't do popular TV options like PiP.

And yes, I always kept the old card when upgrading... using the old card for PhysX was more than fine. GPU utilization was very low... it was way more card than needed.
 
It was on the table... no one is going to invest millions in R&D and then give the tech away... [quoted post trimmed]


That's a lot of words about Freesync, which is just what AMD called the already-in-place industry standard for variable refresh rate.

Nvidia was the one who decided to create a marketing name for an already-adopted standard to milk their sheep.


"Adaptive-Sync is a proven and widely adopted technology. The technology has been a standard component of VESA’s embedded DisplayPort (eDP™) specification since its initial rollout in 2009."
 
I turned it up once in BL2 and was basically just treated to a "confetti" of more particles. Yeah, I wasn't missing anything...
 
I'm interested in the "confetti" being applied to VR, more so than flat-screen gaming.
Don't think we'll see any (hw) NV PhysX in VR gaming, but it would be neat to see. (unless there is?)
 
So wccftech reports Nvidia is releasing PhysX 5, including FEM simulation.
Next year, apparently.

Seems FEM physics will indeed be a thing on the new consoles and in next year's PC games. Nice.

I think it's support for a standardised FEM, not a further proprietary tech.
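For context (my gloss, not from the article): FEM soft-body solvers chop an object into small elements and integrate the standard semi-discrete equation of motion for the nodal displacements $u$:

$$M\,\ddot{u}(t) + C\,\dot{u}(t) + K\,u(t) = f_{\mathrm{ext}}(t)$$

where $M$, $C$ and $K$ are the assembled mass, damping and stiffness matrices and $f_{\mathrm{ext}}$ collects external forces (gravity, contacts). Assembling and solving that system every frame is why FEM soft bodies cost so much more than rigid bodies.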
 
BeamNG.drive makes full use of soft-body PhysX and is a demanding game because of this; not sure if it was mentioned yet. PhysX is far from dead. Simulated gravity has been around since there needed to be a way to keep the player on the map; any game you can jump in has some sort of gravity in use.
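Since the post above leans on the "simulated gravity" point, here's a toy sketch of how most games implement it: a fixed-timestep, semi-implicit Euler integrator pulling a jumping player back to the ground (all names here are invented for the example):

```cpp
#include <cstdio>

int main()
{
    const float g  = -9.81f;       // gravity, m/s^2
    const float dt = 1.0f / 60.0f; // fixed 60 Hz timestep

    float y = 0.0f;                // player height above the ground plane
    float v = 5.0f;                // initial jump velocity, m/s

    // Semi-implicit Euler: update velocity first, then position.
    do {
        v += g * dt;
        y += v * dt;
    } while (y > 0.0f);           // stop when the player lands

    printf("landed with impact velocity %.2f m/s\n", v);
    return 0;
}
```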
 
BeamNG.drive makes full use of soft-body PhysX and is a demanding game because of this... [quoted post trimmed]
PhysX 5.0 just dropped.
 
Gonna have to stick an older GPU back on my gaming rig and test!
Wonder how a GeForce GT 740 would do as a PhysX card?
(attached image of a similar graphics card)

Kind of like this card in appearance, but with a 64-bit bus and GDDR5.
 
Wonder how a GeForce GT 740 would do as a PhysX card?
They're easily overpowered by a modern host; it's not worth fitting one alongside a 1080 Ti or above, but below that, possibly, though I expect the modern PhysX 5 will change things such that this wouldn't work well.
In the past, PhysX 3 (I think it was) completely eradicated hybrid PhysX scenarios, which technically is fine, but at the same time it made some Nvidia cards useless for an add-in PhysX boost.

I would absolutely try it regardless though :).

I think I have a sneaky 910-940 type mini Quadro here somewhere ;).
 
Wonder how a GeForce GT 740 would do as a PhysX card?
I don't know. Kinda curious. Do you have one, and if so, are you willing to briefly test and tell us how it went? I think we have a few GTX 750s in my shop, but I'd love to see if a 740 can do it!
 
I don't know. Kinda curious. Do you have one, and if so, are you willing to briefly test and tell us how it went?
I'll give it a try, and find out. Just had to take off for work.
 
I'll give it a try, and find out. Just had to take off for work.
No worries, this'll be interesting. I've got a GTX 670 and a Quadro FX 1800 I'm going to give a try for giggles!

EDIT: I spoke too soon... it's only been announced; the drivers only have 4.1. Still, an announcement means active development.
 
That's what I was thinking, but the more power the better, and I do just have it laying around. It's a 4GB model too.

That will do fine indeed. The clock speed especially.
I'll have to run a GTX 580 as a secondary.
Still waiting on the PPU to arrive.
Specs for it are as follows:
  • PPU: Ageia PhysX 100, 500 MHz.
  • RAM: 128 MB GDDR3, 738 MHz.
  • Memory bus: 128-bit.
  • Interface: PCI-E x1.
  • Power: 4-pin Molex.
Pretty much anything NV-based, starting at the 8000 series cards and up (like the 8600 GT), would handle PhysX.
 
Also have a Geforce GTX 660 I could try.
 
That will do fine indeed. The clock speed especially... [quoted post trimmed]
While that's all true, like with a game, as things progress so too does the need for more computational power; thus a more powerful GPU is required to run PhysX 4.1 than the original incarnation of PhysX. Not gonna do it tonight, but sometime soon.
 