
Understanding GTX 480 and PhysX

Joined
Aug 20, 2010
Messages
209 (0.04/day)
Location
Mostar, Bosnia & Herzegovina
System Name Micro Mule
Processor Intel i7 950 Stock + Noctua NH-C14
Motherboard Asus Rampage III Gene MicroATX
Cooling Noctua 120mm/80m Fans
Memory Crucial Ballistix 6GB DDR3 1600MHz
Video Card(s) Asus nVidia GTX 580
Storage Samsung 850 Pro SSD, WD Caviar Black 2TB HDD
Display(s) LG 42LD650 42" LCD HDTV
Case Silverstone Fortress FT03
Audio Device(s) Creative SB X-Fi Titanium HD + Sennheiser PC360 Headset
Power Supply Corsair AX850 - 850W Modular Gold
Software Windows 7 Ultimate 64 bit
Apologies to all nVidia fanboys out there for this title, but I'm really puzzled here! It was eight years ago that I built my last gaming rig, and now (tempted by the excellent realism of today's PC games) I've decided to build a new high-end gaming rig.

I've done my homework and a lot of research to select the components that best fit my needs. Now I'd like to decide on a single GPU that can run any game at 1920x1080 with the highest settings while delivering 60+ frame rates. I'm considering going with nVidia for one reason: PhysX. I'm really impressed with the comparison videos I've seen, and I think it adds a great deal of realism to games.

My only objection is that turning PhysX on in games that support it greatly impacts frame rates, reducing them to an unacceptable level, which is simply stupid. Do nVidia expect us to buy another nVidia card just to run PhysX?! Why don't they simply add a separate chip to the board to carry out PhysX processing without impacting overall GPU performance?! We already know PCI-e x16 bandwidth isn't the bottleneck there. I've already decided to rule out the GTX 480, and will do the same for the upcoming GTX 490/495 if they lack such a capability, and opt for an AMD/ATI 5870 or 6870 instead. I'm building a MicroATX system and have plans for a sound card, so I only have one PCI-e x16 slot reserved for the best GPU. What do you think?
 
Most games do not use hardware PhysX, but rather software PhysX...

one of our moderators, Mussels, has a thread where he talks about it... In short, don't buy Nvidia for PhysX


and you could go with a 5970 if dual-GPU on one card is OK with you
 
Personally I'm an nvidia guy myself; I used ATI for a long time and am much happier on the green side. But I would wait until the newer, lower-power-consuming GTX 4xx cards come out. I would get two 460s before I got one 480.
 
Most games do not use hardware PhysX, but rather software PhysX...

one of our moderators, Mussels, has a thread where he talks about it... In short, don't buy Nvidia for PhysX


and you could go with a 5970 if dual-GPU on one card is OK with you

Thanks n-ster, I'll look up that thread about software PhysX and see.
 
Basically, PhysX isn't worth buying a card for. In the few games that do use hardware PhysX (something like 16 or 17), it definitely doesn't make the game super amazing over the same game without PhysX.

Also, as of yet, I haven't come across a game where PhysX shows a noticeable framerate drop when enabled on my GTX 470 at 1920x1080. This includes Metro 2033. The Mafia demo does show a decrease when PhysX is set to high, but not when it's set to medium, and I honestly can't tell the difference between the two settings... Plus, I seem to remember someone testing Mafia II with a GTX 460 and a 9800GT as a dedicated PhysX card, and adding the 9800GT only yielded about 5-10 FPS more over the GTX 460 by itself.
 
Most games do not use hardware PhysX, but rather software PhysX...

one of our moderators, Mussels, has a thread where he talks about it... In short, don't buy Nvidia for PhysX


and you could go with a 5970 if dual-GPU on one card is OK with you

What he said. PhysX is nice, but it's only used in a small handful of games, Batman: AA being the only one I own. Yes, it hurt when I switched to the Red team and Batman doesn't have the cool smoke effects, papers flying in the wind, etc., but it still looks amazing. Besides, the PhysX, while present, is only heavily used in about 4 sections of the game.

Don't buy Nvidia just for PhysX, because Havok (software based) is used much more. While PhysX adds to the realism on occasion, it's not needed to do so. Get the best card that fits within your budget, regardless of it being Red or Green.
 
I think what you are saying makes a lot of sense... they even tried that with the CO-OP edition cards, but it failed miserably.

I think the reason they don't do it is that it would add to the board power: an 8800GT is a reasonable amount of power for PhysX, and an additional G92 on the board would add to the TDP.

Also, additional stuff like that can mess up drivers too.
 
Basically, PhysX isn't worth buying a card for. In the few games that do use hardware PhysX(something like 16 or 17), it definitely doesn't make the game super amazing over the same game without PhysX.

Also, as of yet, I haven't come across a game where PhysX shows a noticeable framerate drop when enabled on my GTX 470 at 1920x1080. This includes Metro 2033. The Mafia demo does show a decrease when PhysX is set to high, but not when it's set to medium, and I honestly can't tell the difference between the two settings... Plus, I seem to remember someone testing Mafia II with a GTX 460 and a 9800GT as a dedicated PhysX card, and adding the 9800GT only yielded about 5-10 FPS more over the GTX 460 by itself.

Please refer to this article http://www.legitreviews.com/article/1386/

As you can see, when running the Mafia II demo on a GTX 460, the frame rate (AA off, 1920x1080) dropped from 58 fps to 37 fps when PhysX was turned on, which is quite significant. If nVidia are trying to convince us that PhysX is an advantage of their GPUs over AMD/ATI's, then it should not come at the expense of overall GPU performance. In other words, a GTX 480 with PhysX would be inferior to an ATI 5870 in running all games supporting PhysX, which is illogical given that the GTX 480 sells for around $100 more than the ATI 5870. Not to mention the ridiculous power requirement and heat output of the GTX 480!
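
For what it's worth, the size of that drop is easy to quantify (a quick sanity check in Python; the 58/37 fps figures are the ones from the linked review, AA off at 1920x1080):

```python
# Frame-rate impact of enabling PhysX in the Mafia II demo on a GTX 460
# (figures from the Legit Reviews test linked above: AA off, 1920x1080).
fps_physx_off = 58.0
fps_physx_on = 37.0

drop_pct = (fps_physx_off - fps_physx_on) / fps_physx_off * 100
print(f"Enabling PhysX costs about {drop_pct:.0f}% of the frame rate")
```

That is roughly a third of the performance gone, which is why people call the feature expensive.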

I think what you are saying makes a lot of sense... they even tried that with the CO-OP edition cards, but it failed miserably.

I think the reason they don't do it is that it would add to the board power: an 8800GT is a reasonable amount of power for PhysX, and an additional G92 on the board would add to the TDP.

Also, additional stuff like that can mess up drivers too.

In my opinion, nVidia badly needs to develop a GPU that outperforms ATI's 5870 while supporting hardware physics within a reasonable TDP, and soon; otherwise their sales will really suffer in the upcoming months, especially when the new ATI 6xxx series GPUs are released, supposedly next December.
 
In my opinion, nVidia badly needs to develop a GPU that outperforms ATI's 5870 while supporting hardware physics within a reasonable TDP, and soon; otherwise their sales will really suffer in the upcoming months, especially when the new ATI 6xxx series GPUs are released, supposedly next December.

Yeah, but you're not comparing apples to apples. A GTX 480 without PhysX beats a 5870; the GTX 480 gives you the option of PhysX where the 5870 does not. So in PhysX games, the 5870 "wins" by not having the option at all... which is not really winning, it's just not supporting the feature.

I personally think PhysX is a badly implemented feature. I like nvidia for their drivers first and foremost... PhysX was always a bit gimmicky to me. Nvidia will keep it around and come out with GPUs that support it, but all cards, barring the CO-OP ones, will take a hit when using PhysX for the foreseeable future.

Ultimately, all PhysX is is eye candy that requires crunching power. GPUs are general purpose, so they can either crunch or render; if you have a GPU with blocks reserved JUST for crunching 100% of the time, then you have wasted silicon.
 
IMO, PhysX would be a lot more successful if nVidia opened it up to licensing so all card manufacturers could use it. As it stands, only a few devs use it, and they only use it for little gimmicks, not for substantial gameplay.

As others have said, buy the best card for you regardless of PhysX.
 
Do nVidia expect us to buy another nVidia card just to run Physx?!
Looks like you hit the nail on the head. nvidia sells GPUs. Of course they want you to buy another one to run their exclusive software!
 
ATI and Nvidia need to stop the war for a minute and agree to at least one deal: re-enable PhysX when an ATI GPU is the primary card. ATI users get PhysX, which should improve ATI sales; Nvidia gets to sell lower-end GPUs to ATI users who need a PhysX card, so they make money off a market they normally can't tap.

I could say exactly whose fault it was that this ability ended up blocked, but all I'll say on the subject is this: f&^% both of them for dicking around about it.
 
Please refer to this article http://www.legitreviews.com/article/1386/

As you can see, when running the Mafia II demo on a GTX 460, the frame rate (AA off, 1920x1080) dropped from 58 fps to 37 fps when PhysX was turned on, which is quite significant. If nVidia are trying to convince us that PhysX is an advantage of their GPUs over AMD/ATI's, then it should not come at the expense of overall GPU performance. In other words, a GTX 480 with PhysX would be inferior to an ATI 5870 in running all games supporting PhysX, which is illogical given that the GTX 480 sells for around $100 more than the ATI 5870. Not to mention the ridiculous power requirement and heat output of the GTX 480!

Yes, with a GTX 460. What are the differences between a GTX 460 and the GTX 470 that I have and based my statement on? Well, I'll tell you: 112 shaders, 8 ROPs, and 64 bits of memory bus... that difference IS the extra GPU you want put on the card...
 
IMO, PhysX would be a lot more successful if nVidia opened it up to licensing so all card manufacturers could use it. As it stands, only a few devs use it, and they only use it for little gimmicks because they're bribed into it by Nvidia, not for substantial gameplay.

Sorry, I had to add that. I agree with you 100%.

I don't mean to fuel the Nvidia<->ATI feud. May the best chips win.
 
Yeah, but you're not comparing apples to apples. A GTX 480 without PhysX beats a 5870; the GTX 480 gives you the option of PhysX where the 5870 does not. So in PhysX games, the 5870 "wins" by not having the option at all... which is not really winning, it's just not supporting the feature.

I personally think PhysX is a badly implemented feature. I like nvidia for their drivers first and foremost... PhysX was always a bit gimmicky to me.

Nvidia will keep it around and come out with GPUs that support it, but all cards, barring the CO-OP ones, will take a hit when using PhysX for the foreseeable future.

Ultimately, all PhysX is is eye candy that requires crunching power. GPUs are general purpose, so they can either crunch or render; if you have a GPU with blocks reserved JUST for crunching 100% of the time, then you have wasted silicon.

You are right; PhysX requires crunching power. So nVidia has two options: either add this "waste of silicon," as you put it, as a separate chip to process PhysX on their high-end GPUs, so that rendering performance (i.e. frames per second) doesn't suffer in games supporting PhysX; or hand the task over to the best number crunchers ever (i.e. the CPUs) to do the job efficiently, and stop deliberately handicapping that path in software as they currently do (is that even legal?). That way they could profit from licensing PhysX to a wider base of game developers and focus their efforts on developing powerful yet efficient GPUs, an objective they seem to have lost track of.

Unfortunately, they prefer to use PhysX in an attempt to monopolize the GPU market; and as far as I can tell from their financial performance this year, and from the market share figures, they are dead wrong!


Yes, with a GTX 460. What are the differences between a GTX 460 and the GTX 470 that I have and based my statement on? Well, I'll tell you: 112 shaders, 8 ROPs, and 64 bits of memory bus... that difference IS the extra GPU you want put on the card...

No arguments there, since at the moment I don't have any numbers (percentage frame-rate drop with PhysX turned on) for the GTX 470 or the GTX 480, but I'll search for those.

I've based my search on the GTX 460 since, IMO, it's the best Fermi to date and since its GF104 chip will supposedly be the base for the upcoming GTX 490/495, which I'm considering getting.


ATI and Nvidia need to stop the war for a minute and agree to at least one deal: re-enable PhysX when an ATI GPU is the primary card. ATI users get PhysX, which should improve ATI sales; Nvidia gets to sell lower-end GPUs to ATI users who need a PhysX card, so they make money off a market they normally can't tap.

I could say exactly whose fault it was that this ability ended up blocked, but all I'll say on the subject is this: f&^% both of them for dicking around about it.

I hope so, but this will never happen; they are both just too greedy to come to any agreement.


Looks like you hit the nail on the head. nvidia sells GPUs. Of course they want you to buy another one to run their exclusive software!

Fine, but if they wanted us to buy another card for PhysX anyway, then why did they integrate PhysX into their GPUs in the first place?! They should have left PhysX to be handled by a separate card, as was originally the case!
 
The GPU does a much better job with PhysX than the CPU can, iirc
 
I wouldn't waste my time with PhysX, man. If I were you, and you couldn't wait for the HD 6xxx series cards from ATI, I would grab a GTX 460 and overclock it like hell; they are cheap and perform really well for the price, and you get the option of using PhysX as well :toast: so you can see what it's like for yourself ;) Then maybe down the track you can sell the card off and buy an HD 6970 when they come out (or if they come out :confused::laugh:), or maybe you can just buy an HD 5970 and crush every game out there :laugh: That's if money isn't an issue :toast:
 
When you build this new gaming rig what games will you be installing?

Or what games would you like to start off playing?
 
The GPU does a much better job with PhysX than the CPU can, iirc

Sorry n-ster, but I think you are wrong. I believe nVidia deliberately cripples PhysX on the CPU, as you can read in "Nvidia purposefully hobbles PhysX on the CPU" and "CPU PhysX looks deliberately crippled", among many other articles.

Secondly, in my line of work we use computer-based virtual physical simulation (finite element analysis (FEA) and rigid-body dynamics), and I know from first-hand experience that calculations for physical simulation are CPU intensive; you can easily verify that with a web search.
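
To illustrate that last point, here is a minimal particle integrator (semi-implicit Euler) in Python. It is purely illustrative and has nothing to do with PhysX's actual code, but it shows that a physics step is ordinary multiply-add floating-point work, exactly the kind of arithmetic CPUs are built for:

```python
# Minimal rigid-particle physics step (semi-implicit Euler). Illustrative
# only, not PhysX code: each step is plain multiply/add arithmetic over
# positions and velocities, work that CPU floating-point units handle well.

def step(positions, velocities, dt=1.0 / 60.0, gravity=-9.81):
    """Advance all particles one frame under gravity (y-axis only)."""
    new_p, new_v = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy += gravity * dt                 # integrate acceleration -> velocity
        x, y = x + vx * dt, y + vy * dt    # integrate velocity -> position
        new_p.append((x, y))
        new_v.append((vx, vy))
    return new_p, new_v

# One 60 Hz frame for two hypothetical debris particles:
p, v = step([(0.0, 10.0), (1.0, 5.0)], [(2.0, 0.0), (-1.0, 0.0)])
print(p)
```

A real engine runs the same kind of loop over thousands of bodies plus collision tests, which is exactly why SSE vectorization and multithreading matter so much on the CPU side.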


When you build this new gaming rig what games will you be installing?

Or what games would you like to start off playing?

Batman (Arkham Asylum & Arkham City), Battlefield: Bad Company 2, Call of Duty: Modern Warfare 2, Mafia II, Metro 2033, Crysis 2, Need for Speed: Hot Pursuit, Aliens vs. Predator, Brink, HAWX 2, Nail'd, Blur, Red Alert 3, Rage, DiRT 3, Pro Evolution Soccer 2011... forgive me for the long list, but I've been away for too long! I know some games on this list haven't been released yet, but I intend to play them once they are :)
 
Hmm... I was just going by what I recalled from some benchmarks... even if NV cripples it on the CPU, can you really uncripple it?
 
I did run a 9600GT for PhysX for a while, but came to realize that I didn't play ANY games that required it, so why waste valuable watts on a card that does nothing? I have thought about adding PhysX back in case I get into Batman or something, but it won't be anything over $40, like a 9600GSO or something with 96 shaders.
 
Batman (Arkham Asylum & Arkham City), Battlefield: Bad Company 2, Call of Duty: Modern Warfare 2, Mafia II, Metro 2033, Crysis 2, Need for Speed: Hot Pursuit, Aliens vs. Predator, Brink, HAWX 2, Nail'd, Blur, Red Alert 3, Rage, DiRT 3, Pro Evolution Soccer 2011... forgive me for the long list, but I've been away for too long! I know some games on this list haven't been released yet, but I intend to play them once they are :)
From that list, the only games that benefit from hardware PhysX are Batman, Metro 2033, Mafia and Brink.
Of those, Batman is really the only one where the hardware PhysX effects are noticeable. :ohwell:
(I have no experience with Mafia II and Brink)
 
Hmm... I was just going by what I recalled from some benchmarks... even if NV cripples it on the CPU, can you really uncripple it?

Please read the articles I've referenced above; nVidia are crippling CPU PhysX in code, and yes, it can be "uncrippled", but only by them.
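
The core claim in those articles is that the CPU path runs slow scalar x87 code on a single thread instead of vectorized SSE across cores. As a rough analogy (illustrative Python/NumPy only, not PhysX code), the same computation done element-at-a-time versus in one vectorized pass gives identical results; only the throughput differs:

```python
# Illustrative analogy (not PhysX code): the same force computation done
# element-by-element (like single-threaded scalar x87) and vectorized in
# one pass (like SSE). Both give identical physics; only throughput differs.
import numpy as np

masses = np.array([1.0, 2.0, 4.0])      # kg, hypothetical bodies
accels = np.array([9.81, 9.81, 9.81])   # m/s^2, gravity on each

# "Scalar" path: one multiply at a time.
forces_scalar = [m * a for m, a in zip(masses, accels)]

# "Vectorized" path: the whole array in one operation.
forces_vector = masses * accels

assert np.allclose(forces_scalar, forces_vector)
print(forces_vector)
```

The point is that "crippled" here means choosing the slow path, not producing different physics; the fast path computes the exact same numbers.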


From that list, the only games that benefit from hardware PhysX are Batman, Metro 2033, Mafia and Brink.
Of those, Batman is really the only one where the hardware PhysX effects are noticeable. :ohwell:
(I have no experience with Mafia II and Brink)

Mafia II benefits a lot from PhysX in terms of destructible objects. Besides, I'd like my upcoming GPU to be future-proof for the next 2-3 years.
 
Apologies to all nVidia fanboys out there for this title, but I'm really puzzled here! It was eight years ago that I built my last gaming rig, and now (tempted by the excellent realism of today's PC games) I've decided to build a new high-end gaming rig.

I've done my homework and a lot of research to select the components that best fit my needs. Now I'd like to decide on a single GPU that can run any game at 1920x1080 with the highest settings while delivering 60+ frame rates. I'm considering going with nVidia for one reason: PhysX. I'm really impressed with the comparison videos I've seen, and I think it adds a great deal of realism to games.

My only objection is that turning PhysX on in games that support it greatly impacts frame rates, reducing them to an unacceptable level, which is simply stupid. Do nVidia expect us to buy another nVidia card just to run PhysX?! Why don't they simply add a separate chip to the board to carry out PhysX processing without impacting overall GPU performance?! We already know PCI-e x16 bandwidth isn't the bottleneck there. I've already decided to rule out the GTX 480, and will do the same for the upcoming GTX 490/495 if they lack such a capability, and opt for an AMD/ATI 5870 or 6870 instead. I'm building a MicroATX system and have plans for a sound card, so I only have one PCI-e x16 slot reserved for the best GPU. What do you think?

Just ask yourself this question: how many games use hardware PhysX? You get like two PhysX games per year; it's crap. I'm amazed at people who get a separate card for PhysX; it's a waste. :shadedshu
 
I'd like my upcoming GPU to be future-proof for the next 2-3 years.
In that case, get the 5970, which is the most powerful card out there right now.
Or wait until October and see what the new Southern Islands (HD 6K series) cards offer.

PhysX has been out for quite a few years now, but its adoption is not really on the rise.
Get a PhysX card if you so desire, but so far most people have been disappointed by what they see.
 