
Recommended PhysX card for 5xxx series? [Is vRAM relevant?]

GTX 1650 if you're worried about the driver cutoff; Quadro P620 or GTX 1050 if you aren't. These are the cheapest options that are still performant enough not to cause a bottleneck. I've used them in tandem with both AMD and Nvidia cards and they work fine.
 
Sure, eventually Nvidia will drop non-RTX support.
But right now even the 700 series is still supported, so they'd drop the 700 series first, then the 900 series, then the 10 series, and the 16 series would be last.
So we have at least three, maybe five, years?
By that time an RTX 3050 will be dirt cheap; if I find one for under $100 I'll get one. But right now there's no need to buy a 1050/1050 Ti/1650 like I assumed — I was afraid only the 1050 Ti and higher were fast enough.

Only the 750 Ti (Maxwell); the rest of the 700 series was dropped quite a few years ago already. They're probably going to remove the 750 Ti, the 900 and 10 series, and the Titan V together. There are rumors the GTX 16 series will go alongside all of these despite being Turing architecture, so NV would only support the RTX 20 series and up. But I think the 16 series sticks around at least as long as the 20 series.
 
So from the software side of things, would it be possible for Nvidia/the community to provide some sort of emulation/translation layer for 32-bit PhysX calls to run on 64-bit? Similar to Proton/Wine for Windows applications?
 
Yes, it has always been enough. The problem is when Nvidia drops 10-series support, which could be any day now.

Unbelievable, even a 750 Ti is good enough, and it almost doubled the 4090's FPS. How is this possible? From 190 to 309 fps in Batman: Arkham Asylum.
CUDA-wise a 750 Ti should be like 1-2% of a 4090, so how the heck did performance almost double in some scenarios?


[attachment: 750Ti.jpg]


GTX 1650 if you're worried about the driver cutoff; Quadro P620 or GTX 1050 if you aren't. These are the cheapest options that are still performant enough not to cause a bottleneck. I've used them in tandem with both AMD and Nvidia cards and they work fine.

Apparently even a 750 Ti and a 1030 are good enough; I posted a graph above, it's insane.
 
Unbelievable, even a 750 Ti is good enough, and it almost doubled the 4090's FPS. How is this possible? From 190 to 309 fps in Batman: Arkham Asylum.
CUDA-wise a 750 Ti should be like 1-2% of a 4090, so how the heck did performance almost double in some scenarios?





apartnly even 750Ti and 1030 are good enough, i posted a graph above, its insane

It has always been this way. Always. PhysX seems to cause stalls when running in the same GPU that is rendering a game. No matter how powerful the card is.
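The numbers posted above (190 fps with the 4090 doing everything, 309 fps with a 750 Ti handling PhysX in Arkham Asylum) are consistent with that stall theory. A toy serial-pipeline model, purely illustrative and assuming rendering and PhysX run back-to-back on one GPU, puts the implied per-frame PhysX stall at about 2 ms:

```python
# Toy serial-vs-offload frame-time model using the numbers posted above
# (Batman: Arkham Asylum on a 4090: 190 fps alone, 309 fps with a
# 750 Ti handling PhysX). Illustrative only, not a real profile.

def frame_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

combined = frame_ms(190)      # rendering + PhysX serialized on one GPU
render_only = frame_ms(309)   # PhysX offloaded to the second card
physx_stall = combined - render_only

print(f"combined frame time:   {combined:.2f} ms")
print(f"render-only frame time: {render_only:.2f} ms")
print(f"implied PhysX stall:    {physx_stall:.2f} ms per frame")
```

About 2 ms per frame lost to the stall, which is why even a weak card that removes the stall beats a strong card that eats it.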
 
It has always been this way. Always. PhysX seems to cause stalls when running in the same GPU that is rendering a game. No matter how powerful the card is.
That was not my experience with an Asus STRIX GTX 1080. The Batman games and Black Flag ran great paired with a 5800X3D. I gave that card away in a system for the neighbor's kid at Christmas, as his dad got laid off. I just got an open-box 3050 6GB from Best Buy for $140; it should be here tomorrow or Monday. I'll see how it does with a 5600X3D.
 
That was not my experience with an Asus STRIX GTX 1080. The Batman games and Black Flag ran great paired with a 5800X3D. I gave that card away in a system for the neighbor's kid at Christmas, as his dad got laid off. I just got an open-box 3050 6GB from Best Buy for $140; it should be here tomorrow or Monday. I'll see how it does with a 5600X3D.

They'll run... but if you offload it to another graphics card, it's going to run much, much better.
 
Unbelievable, even a 750 Ti is good enough, and it almost doubled the 4090's FPS. How is this possible? From 190 to 309 fps in Batman: Arkham Asylum.
CUDA-wise a 750 Ti should be like 1-2% of a 4090, so how the heck did performance almost double in some scenarios?





Apparently even a 750 Ti and a 1030 are good enough; I posted a graph above, it's insane.

It would be interesting to compare with a more powerful PhysX card, or even with the 3050 as the rendering card and the 4090 for PhysX. That would make a better PhysX scaling test, though primary rendering performance would drop. I don't know how much the game's FPS changes with the PhysX effects turned on or off, though. If PhysX is a relatively big part of the render pipeline's frame time as a whole, shifting PhysX strictly to the 4090 and offloading the primary rendering to a weaker GPU might offer more benefit, which is where these kinds of insights become interesting.
 
I like this kind of testing; it's good insight overall. I'd like to see the 4090 as the dedicated PhysX card, though, just to see how it scales for that purpose. Like I mentioned, depending on which is the bigger performance bottleneck — the actual PhysX work or the rest of the rendering pipeline — you could see big swings in performance as a result of how you pair and use them together.
 
No, run dual video cards for gaming just for the PhysX thing. To be fair, I'm inclined to try it out. Sorry, I should have been more clear.

If you have a 5xxx-series card, even a 5090, and you want to play an old PhysX game, then you have no other choice, unless you disable PhysX and play like an "AMD pleb" [lol].
But as you can see, even cards that have 32-bit PhysX, like the 4090, get a speed boost, so throwing in a cheap 1030 if you have a free PCIe slot might not be a bad idea.
 
You're basically improving threading and load balancing by doing so. I'm not entirely sure what impacts PhysX performance the most in general; I'm sure that varies from one PhysX effect to another and depends a bit on the GPU hardware itself, because it's not exactly linear scaling for software or hardware usage. Depending on the CPU/APU, you could also be better off using integrated graphics for PhysX rather than the CPU.
 
You're basically improving threading and load balancing by doing so. I'm not entirely sure what impacts PhysX performance the most in general; I'm sure that varies from one PhysX effect to another and depends a bit on the GPU hardware itself, because it's not exactly linear scaling for software or hardware usage. Depending on the CPU/APU, you could also be better off using integrated graphics for PhysX rather than the CPU.

There are exceptionally few integrated GPUs that can run it, all very old. nForce has been gone for a very long time.

A dedicated card is the only option in most scenarios.
 
There are exceptionally few integrated GPUs that can run it, all very old. nForce has been gone for a very long time.

A dedicated card is the only option in most scenarios.

I miss nForce mobos, they were great. Nvidia did both AMD and Intel; I wonder why they quit. They could keep doing it — sure, they can't make chipsets of their own [maybe on AMD they can?].

BTW, I think Nvidia missed a business opportunity with PhysX; as we can see, even a 4090 benefits greatly from something as simple as a 1030.
I have two options in mind. I think option 2 is more logical, but option 1 isn't bad either.

1] A chiplet GPU with a PhysX accelerator on board: basically something like a 1030 stripped of everything but the part that does PhysX, connected to the GPU die by some bus.
These cards could sell at a premium, say $50 more, and knowing us gamers, many would buy them just to not miss out, since it would also bring PhysX back.
I miss all the cool effects. Sure, RTX is cool, but I want cloth simulation, fog, cool explosions and all that. Especially like in Metro 2, where you shoot a gatling gun at stone and it chips away; even Fallout 4 had that, but it's broken.

2] This could be a small side project for Nvidia: a booster card, say PCIe x2, with a PhysX accelerator plus a good, modern audio chip. [I love me some sound cards, been buying them since the old Pentium days; the last one I had was a Creative Sound Blaster X3, but when I moved to wireless audio I quit using them.]
I would buy such a card, why not? Everybody needs good audio, and motherboard audio is OK, but Creative does better even today, with 600 Ω headphone amplifiers, per-output amplifiers, and Super X-Fi directional audio, which does something like Dolby Atmos for Headphones and DTS:X but IMHO better, since you can take photos of your ears with a special app and upload them to optimize the sound.
 
For example you mean? I have better cards than that for such a purpose.

It won't make much difference. I've seen benchmarks somewhere using a 4090; it was like 4-5 fps better than a 3050, and a 3050 is almost on par with a 1030.
 
I miss nForce mobos, they were great. Nvidia did both AMD and Intel; I wonder why they quit. They could keep doing it — sure, they can't make chipsets of their own [maybe on AMD they can?].

BTW, I think Nvidia missed a business opportunity with PhysX; as we can see, even a 4090 benefits greatly from something as simple as a 1030.
I have two options in mind. I think option 2 is more logical, but option 1 isn't bad either.

1] A chiplet GPU with a PhysX accelerator on board: basically something like a 1030 stripped of everything but the part that does PhysX, connected to the GPU die by some bus.
These cards could sell at a premium, say $50 more, and knowing us gamers, many would buy them just to not miss out, since it would also bring PhysX back.
I miss all the cool effects. Sure, RTX is cool, but I want cloth simulation, fog, cool explosions and all that. Especially like in Metro 2, where you shoot a gatling gun at stone and it chips away; even Fallout 4 had that, but it's broken.

2] This could be a small side project for Nvidia: a booster card, say PCIe x2, with a PhysX accelerator plus a good, modern audio chip. [I love me some sound cards, been buying them since the old Pentium days; the last one I had was a Creative Sound Blaster X3, but when I moved to wireless audio I quit using them.]
I would buy such a card, why not? Everybody needs good audio, and motherboard audio is OK, but Creative does better even today, with 600 Ω headphone amplifiers, per-output amplifiers, and Super X-Fi directional audio, which does something like Dolby Atmos for Headphones and DTS:X but IMHO better, since you can take photos of your ears with a special app and upload them to optimize the sound.

1. EVGA thought of that once. They made a GTX 275 that came with a GTS 250 that was specifically intended to be used that way, check this out:


2. Probably not ever happening, although a return to the very origin (the Ageia PPU) would be hilarious. There's no market share and very, very little demand. Besides, they probably have some GTX 1630s to ship, and those who are really interested could just get one of those down the road, I guess.
 
It won't make much difference. I've seen benchmarks somewhere using a 4090; it was like 4-5 fps better than a 3050, and a 3050 is almost on par with a 1030.
I was thinking more about one of my Quadro cards. I don't have a 1030 or anything close. I do have a few Quadro cards kicking around that I think are PhysX compatible. I also still have a 960 or 1060 and a few others to use.
 
So from the software side of things, would it be possible for Nvidia/the community to provide some sort of emulation/translation layer for 32-bit PhysX calls to run on 64-bit? Similar to Proton/Wine for Windows applications?
There's no technical reason they couldn't, but since GPU PhysX isn't open source, it would have to be Nvidia that does it.
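For what it's worth, the core idea of such a bridge is ordinary call marshalling, the same trick WoW64 and Wine use: the 32-bit side serializes each call into a fixed-layout message, and a 64-bit host process decodes it and invokes the real library. A toy single-process sketch of that marshalling step (the opcode, message layout, and the `physx_add_force` stand-in are all invented for illustration; a real bridge would ship the message across a pipe or shared memory to a separate 64-bit process):

```python
# Toy sketch of 32->64-bit call marshalling. The message uses fixed-width
# fields so both sides agree on layout regardless of pointer size; actors
# are referenced by integer handle instead of raw pointers.
import struct

OP_ADD_FORCE = 1
# message layout: opcode (u32), actor handle (u32), force vector (3 x f32)
MSG_FMT = "<IIfff"

def physx_add_force(actor: int, x: float, y: float, z: float) -> float:
    """Stand-in for the real 64-bit library call on the host side."""
    return x + y + z  # pretend result

def encode(actor: int, x: float, y: float, z: float) -> bytes:
    """'32-bit side': serialize the call into a wire message."""
    return struct.pack(MSG_FMT, OP_ADD_FORCE, actor, x, y, z)

def dispatch(msg: bytes) -> float:
    """'64-bit side': decode the message and call the real library."""
    op, actor, x, y, z = struct.unpack(MSG_FMT, msg)
    if op == OP_ADD_FORCE:
        return physx_add_force(actor, x, y, z)
    raise ValueError(f"unknown opcode {op}")

wire = encode(42, 1.0, 2.0, 3.0)
print(dispatch(wire))
```

The hard parts in practice are the sheer surface area of the PhysX API, callbacks going the other direction, and latency, which is why only Nvidia, with the source, could do this cleanly.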
 
I miss nForce mobos, they were great. Nvidia did both AMD and Intel; I wonder why they quit. They could keep doing it — sure, they can't make chipsets of their own [maybe on AMD they can?].
?

Their chipsets ran hot and couldn't get FSB as high as comparable P35/P45/X38/X48 boards. Their only selling point was SLI capability.
 
?

Their chipsets ran hot and couldn't get FSB as high as comparable P35/P45/X38/X48 boards. Their only selling point was SLI capability.
As we already discussed in the nostalgic HW thread a couple of times, the general consensus seems to be that those boards (chipsets) are much more prone to failure than others, have a lot of OC stability issues, and stability problems overall. Their main value as collectibles could be just their rarity and nothing more (SLI too, as you said).
 
?

Their chipsets ran hot and couldn't get FSB as high as comparable P35/P45/X38/X48 boards. Their only selling point was SLI capability.
They are way older than that, bro. My first Nvidia chipset was with a T-bred 2600+, that is S462, or Socket A. They were awesome, they overclocked, and they did Dolby Digital Live.
 