
Recommended PhysX card for 5xxx series? [Is vRAM relevant?]

But CUDA is not absent, only the 32-bit support is. What if DXVK translates everything to 64-bit?
It's worth checking. I would, but I've been on a pre-order list for over a month and it may take another 2-3 months lol, I pre-ordered in 4 shops just in case.

DXVK does not make a 32-bit application 64-bit. It simply intercepts DirectX runtime calls and does just-in-time (JIT) translation of them into the open-source Vulkan API. It was designed essentially as a way to run DirectX games on Linux, because Linux does not support DirectX and does not have the WDDM display driver model available. While DXVK also runs on Windows, its usefulness there tends to be quite limited, often reduced to working around game engine bugs on certain hardware configurations.
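To give a rough idea of what "intercept and translate" means in practice, here's a toy C++ sketch (nothing from the real DXVK codebase, every name below is made up): a shim exposes the same draw call the game already uses, records the work, and replays it through a different backend when the frame is presented.

```cpp
// Toy sketch only: the general shape of an API translation layer, not DXVK code.
// The "game" calls a familiar-looking Draw(); the shim records the request and
// later re-issues it through a different backend (Vulkan, in DXVK's case).
#include <cstdio>
#include <vector>

struct DrawCall { unsigned vertexCount; unsigned firstVertex; };  // made-up stand-in

// Stand-in for the target API that actually executes the work.
void backend_submit(const std::vector<DrawCall>& batch) {
    for (const auto& d : batch)
        std::printf("backend: draw %u vertices starting at %u\n",
                    d.vertexCount, d.firstVertex);
}

struct TranslationShim {
    std::vector<DrawCall> pending;
    void Draw(unsigned vertexCount, unsigned firstVertex) {  // intercepted call
        pending.push_back({vertexCount, firstVertex});       // record instead of executing
    }
    void Present() {                                         // end of frame
        backend_submit(pending);                             // replay via the other API
        pending.clear();
    }
};

int main() {
    TranslationShim shim;
    shim.Draw(3, 0);
    shim.Draw(6, 3);
    shim.Present();  // all the "game's" work comes out of the other backend
}
```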

You have to understand that 32-bit and 64-bit applications are binary incompatible. The reason you can run 32-bit apps unmodified on 64-bit Windows is the "Windows on Windows 64" (WoW64) compatibility layer, and it is only as seamless as it is because the instructions required to execute these programs are still present in the CPU, and there is a way to manage their memory without affecting the rest of the system. That is down to a decision the industry took at the very beginning of the 64-bit era: extending the existing x86 architecture instead of adopting a brand new 64-bit ISA (this is the story of the AMD Athlon 64 and the Intel Itanium; I could tell you all about it sometime, but it's outside the scope of this thread. To put it simply, AMD's proposal won over Intel's).
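As a small aside, you can ask Windows directly whether a process is running through that layer. A minimal sketch using the long-standing IsWow64Process call (Windows-only, error handling trimmed):

```cpp
// Minimal sketch: query whether the current process is a 32-bit process running
// under the WoW64 compatibility layer on 64-bit Windows.
#include <windows.h>
#include <cstdio>

int main() {
    BOOL isWow64 = FALSE;
    if (IsWow64Process(GetCurrentProcess(), &isWow64)) {
        std::printf(isWow64
            ? "32-bit process under WoW64 on 64-bit Windows\n"
            : "native process (32-bit OS, or a 64-bit build)\n");
    }
    return 0;
}
```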

On a pure 64-bit system, a 32-bit executable will not run. Likewise, a 64-bit process cannot load 32-bit libraries (or vice versa), so you cannot run 32-bit applications on a 64-bit runtime. This is why .NET and Visual C++ ship both "x86" and "x64" DLLs; they aren't interchangeable with one another. It's as simple as this: no 32-bit CUDA, no 32-bit PhysX.
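You can see that incompatibility directly: if a 64-bit process tries to load a 32-bit DLL (or the other way around), the Windows loader refuses with ERROR_BAD_EXE_FORMAT. A minimal sketch; "some32bit.dll" is just a hypothetical placeholder name:

```cpp
// Minimal sketch: a DLL of the wrong bitness cannot be loaded into this process.
// "some32bit.dll" is a placeholder, not a real file shipped by anything.
#include <windows.h>
#include <cstdio>

int main() {
    HMODULE h = LoadLibraryA("some32bit.dll");   // wrong-bitness module
    if (!h) {
        DWORD err = GetLastError();
        if (err == ERROR_BAD_EXE_FORMAT)         // 193: not a valid application for this process
            std::printf("bitness mismatch: the loader refuses the module\n");
        else
            std::printf("LoadLibrary failed with error %lu\n", err);
        return 1;
    }
    FreeLibrary(h);
    return 0;
}
```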
 
I'm on a laptop with a 4070, latest driver installed (572.60), and all PhysX options are still present in the NVIDIA Control Panel. I also just checked Borderlands 2; PhysX works on the GPU there as it always has.

The same driver works for both 5xxx and 1xxx GPUs, so there should be no problem delegating, for example, a 1050/Ti to PhysX only alongside a 5090 running as the main GPU.

If anyone still has any doubts about that, I bet it would make a great test and article for the TechPowerUp team.
Sounds like I was wrong then, thanks for the account.
 
So is Borderlands 2 playable on a 5 series card with PhysX on Low (which I seem to remember is actually Off)?

Pretty sure I always set it to Low (Off?) because it would often bug out loot (falling through the world)... and all it added was some tent physics and a bit of goo.

Genuinely curious for when I upgrade (maybe) to a 6 series, as BL2 is one of my all-time favs and I don't really want to build a retro box for one game.
 
So is Borderlands 2 playable on a 5 series card with PhysX on Low (which I seem to remember is actually Off)?

Pretty sure I always set it to Low (Off?) because it would often bug out loot (falling through the world)... and all it added was some tent physics and a bit of goo.

Genuinely curious for when I upgrade (maybe) to a 6 series, as BL2 is one of my all-time favs and I don't really want to build a retro box for one game.
Pretty sure, yeah. I've always disabled PhysX on BL2, but luckily BL2 doesn't really require PhysX and it's honestly just very extra. You aren't missing much without it besides some gooey water particles, tent physics, debris and barrel gib physics... yada yada. Very extra stuff.
 
I didn't think drivers were necessary for the second GPU as long as it was recognized by the system and only selected for PhysX.
It's not like it will be directly connected to the monitor for output.

Drivers are absolutely needed, but it seems to be a non-issue here; I was wrong.

All those lost 32-bit PhysX games, NVIDIA will use RTX Remix to remake (remaster) them, and then it will be fixed.
I don't think that's going to make a 32-bit game 64-bit...

What if DXVK translates everything to 64-bit?
It doesn't, and doing so would be a) slow and b) hard, if not impossible.
 
So is Borderlands 2 playable on a 5 series card with PhysX on Low (which I seem to remember is actually Off)?

Pretty sure I always set it to Low (Off?) because it would often bug out loot (falling through the world)... and all it added was some tent physics and a bit of goo.

Genuinely curious for when I upgrade (maybe) to a 6 series, as BL2 is one of my all-time favs and I don't really want to build a retro box for one game.

Yes, all games are playable, with PhysX disabled. Which... at least for Borderlands 2, you would have it off anyway. It always caused lots of performance problems, even if you had a good card. That game can get very hectic.
 
Sounds like I was wrong then, thanks for the account.
Digital Foundry actually tested such a scenario: a 5080 for rendering + a 3060 dedicated to PhysX. Batman: Arkham Asylum was used as an example.

P.S.
To be honest, I would expect higher FPS on that kind of hardware, even if over 100 FPS is obviously more than plenty.

So is Borderlands 2 playable on a 5 series card with PhysX on Low (which I seem to remember is actually Off)?

Pretty sure I always set it to Low (Off?) because it would often bug out loot (falling through the world)... and all it added was some tent physics and a bit of goo.

Genuinely curious for when I upgrade (maybe) to a 6 series, as BL2 is one of my all-time favs and I don't really want to build a retro box for one game.
Yes, all games are playable even if you disable PhysX; that's how ATI/AMD users have always been playing these games.

And yes, in Borderlands 2, switching PhysX to "Low" actually means PhysX is disabled. Dedicated players were disabling PhysX anyway, since it caused loot to drop into unreachable places, and Borderlands players in general complain about "visual clutter". That's not just when PhysX is enabled: even in Borderlands 3, where there was no PhysX at all, a lot of effects (mainly from various weapons, skills, etc.) overlap and obscure what is going on in the game.
 
I mean, yes, I specifically mentioned the 3050 as the better option. 6 GB is small today, but back in 2013 it took the almighty Titan to get you that much, and PhysX generally predates even that.
Hmm... I could've sworn it said 3060. Hence my (completely unnecessary) question...
 
Yes, all games are playable, with PhysX disabled. Which... at least for Borderlands 2, you would have it off anyway. It always caused lots of performance problems, even if you had a good card. That game can get very hectic.
I really enjoyed Hawken. It was actually a pretty popular online multiplayer game and used PhysX destruction. It ran pretty darn good too.
 
Lol hey 50 owners.. remember this?

[screenshot attachment]
 
Now, ban yourself for trolling. :)
I tried already sorry :D

Anyway.. no point in living in the past, just move on with the cards you were dealt.

Be brave. I believe in you.

Edit:

First person to want to trade their 5070Ti for my 4070Ti has a deal haha :)
 
I tried already sorry :D

Anyway.. no point in living in the past, just move on with the cards you were dealt.

Be brave. I believe in you.

Edit:

First person to want to trade their 5070Ti for my 4070Ti has a deal haha :)

Getting a 1050 Ti or 3050 as an extra GPU for PhysX [when you need it, you can plug it in for a game] is not a bad idea.

I always had a 1030 as a spare for testing and new builds, then I got a 4070 Super, the smallest I could find. It's dual slot, but it's overkill for PhysX. I kept it as a spare, and since I sold my 4090 I'm using it as my main. Very hard to downgrade, but it's OK, I can survive until my pre-order comes in.
Honestly, with the way games are, if I had known the 5090 wouldn't be as fast a jump as before [I actually made a list going years back, and the next-gen xx70 or xx70 Ti card was ALWAYS either on par with or faster than the last-gen top card; it was the LAW until this gen], I could have kept using my 4090, since I don't care about frame gen. I do use DLSS and love it.
But frame gen is not important since I'm not an FPS snob. I game at 4K/120 on my monitor [actually a 55-inch OLED TV; I've been doing this since 2016, cheaper than a monitor and bigger for sure].
Yet if it's not a shooter, I can just as easily do half VSync for 60 FPS or switch to 60 Hz to save power, mainly because I always use a custom loop and get the heat almost in my face from the radiator.
But if I can do 40 FPS at 120 Hz, I'm more than happy.
I have a PS5, and games that let me do 120 Hz with unlocked FPS, holding an almost stable 40 with jumps higher thanks to VRR, feel great.
30 FPS feels horrible; for ME, the 10 FPS difference from 30 to 40 feels like a bigger jump than 60 to 120.
 
This whole concept is a throwback that was *definitely* not on my 2025 bingo card...

IIRC, the PC enthusiast and gaming community was doing this before the turn of 2010, when CUDA PhysX first was 'a thing'.

Is it unfair to feel like this is a bit regressive?
Though, I have a bit of schadenfreude for those now wishing for more PCIe expansion :laugh:
 
To be honest, I would expect higher FPS on that kind of hardware, even if over 100 FPS is obviously more than plenty.
I am assuming performance is still bad with a dedicated card because PhysX stalls the entire rendering pipeline, plus the overhead of copying the results back across PCIe. It was an extremely stupid piece of software built with no foresight, from a time when compute shaders/async compute weren't a thing; other games, like Crysis, achieved amazing physics and particle effects running on the CPU.
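Just to illustrate the structural problem (purely conceptual C++, made-up timings, no real GPU calls): when the frame cannot be drawn until the physics results have been copied back, the two costs add up instead of overlapping, no matter how fast the second card is.

```cpp
// Illustrative sketch: if each frame must wait for the physics results before it
// can render, the frame time is simulate + copy-back + render, serialized.
// All sleep durations below are placeholder numbers, not measurements.
#include <chrono>
#include <future>
#include <thread>
#include <cstdio>

using namespace std::chrono;

// Pretend "physics on the second GPU": simulate, then copy results back over PCIe.
int simulate_and_copy_back() {
    std::this_thread::sleep_for(milliseconds(4));  // simulation time (placeholder)
    std::this_thread::sleep_for(milliseconds(2));  // PCIe copy-back (placeholder)
    return 0;
}

void render(int /*physicsResults*/) {
    std::this_thread::sleep_for(milliseconds(6));  // rendering time (placeholder)
}

int main() {
    for (int frame = 0; frame < 3; ++frame) {
        auto t0 = steady_clock::now();
        // Physics is launched asynchronously, but the frame still cannot be drawn
        // until .get() returns, so the work is serialized rather than overlapped.
        auto physics = std::async(std::launch::async, simulate_and_copy_back);
        render(physics.get());
        auto ms = duration_cast<milliseconds>(steady_clock::now() - t0).count();
        std::printf("frame %d: %lld ms (render waited on physics)\n",
                    frame, (long long)ms);
    }
}
```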
 
I was kinda concerned that the "Low" setting (there is no "Off") wouldn't completely turn off BL2 PhysX when it detected an NVIDIA card, but that doesn't seem to be the case. Thanks, folks!
 
I am assuming performance is still bad with a dedicated card because PhysX stalls the entire rendering pipeline, plus the overhead of copying the results back across PCIe. It was an extremely stupid piece of software built with no foresight, from a time when compute shaders/async compute weren't a thing; other games, like Crysis, achieved amazing physics and particle effects running on the CPU.

Not too far off the mark. PhysX's demise was ultimately a good thing for everyone involved. Even with a dedicated card there's some pretty intense performance degradation.
 
Not too far off the mark. PhysX's demise was ultimately a good thing for everyone involved. Even with a dedicated card there's some pretty intense performance degradation.

How is it good? Ancient PhysX games look better than modern ones in the special-effects department. Fabric simulation looked insane, physical fog flowed according to the wind and your movement, and so much more.
It's all gone; modern games look like pictures, but the interactivity of physical objects, particles, fog etc. is minimal.
RTX also has a performance hit, but with time you get faster hardware and the code gets better.
Losing PhysX, NVIDIA GameWorks, HairWorks and so on is a sad day. I want games to not just look cool, but feel interactive and realistic, with real [even exaggerated] physical properties.
 
I remember long ago using a 2nd card for PhysX. The FPS was horrible because producing each frame requires the PhysX calculations to be finished first. Essentially you need an equal card, or the FPS is only as good as the weakest link, which is a total waste.

If you want 5080 perf, you will need a 4090 alongside it.
 
How is it good? Ancient PhysX games look better than modern ones in the special-effects department. Fabric simulation looked insane, physical fog flowed according to the wind and your movement, and so much more.
It's all gone; modern games look like pictures, but the interactivity of physical objects, particles, fog etc. is minimal.
RTX also has a performance hit, but with time you get faster hardware and the code gets better.
Losing PhysX, NVIDIA GameWorks, HairWorks and so on is a sad day. I want games to not just look cool, but feel interactive and realistic, with real [even exaggerated] physical properties.

Even on RTX-class hardware it's often a ~70% FPS loss. That's too much for these effects.
 
Even on RTX-class hardware it's often a ~70% FPS loss. That's too much for these effects.

Then explain to me why everyone is pushing "path tracing", which causes the same 70% FPS loss or more. Why is that loss worth it?
 
Then explain to me why everyone is pushing "path tracing", which causes the same 70% FPS loss or more. Why is that loss worth it?

Because it's a new method that almost completely reinvents the wheel, raising the bar towards photorealism far higher than rasterization could ever take graphics. Its biggest drawback is the vast amount of computing power required to achieve the feat. PhysX, on the other hand, is just an obsolete, proprietary physics engine that has long outlived its usefulness and suffered from optimization problems through most of its service life.
 
I really enjoyed Hawken. It was actually a pretty popular online multiplayer game and used PhysX destruction. It ran pretty darn good too.
Man, you brought up a core memory for me with that game. I miss it.
 
Where was all the anger when they removed Mantle from the driver?
Mantle was never going to be the final product; as such, it was really only AMD's experiment, which in the end turned into Vulkan.
What nGreedia is doing is completely removing GPU PhysX support on the newest GPUs, with no actual plans to replace it with a 64-bit version that allows it to run on the GPU.
They're two completely different things.
 