
NVIDIA Removes Restriction on ATI GPUs with NVIDIA GPUs Processing PhysX

Bah. When I saw this story I figured I would grab an Nvidia card from Newegg to give PhysX a try. Oh well, thanks for the update, you just saved me a few bucks. I guess if you use an AMD card as your primary card, Nvidia doesn't want your business.

You can still do it :toast: Very quick and easy.

See link.

Works great for me: GT 240 as a dedicated PhysX card and a 5870 as the main card.
 
I honestly don't think it was a mistake to leave out that stuff in the latest update. It appears that they did this as a one-time thing, and I wouldn't be surprised to see it again on another non-WHQL release driver.


Think about it: if it really was a mistake, the file should have been replaced by now, and downloading it now should give you the locks back in place. Anyone want to test that theory?
 
I use a GTS 250 in the x4 slot of my Rampage II Gene with two 5850s in CF. It works great. If you're going to buy an Nvidia card for dedicated PhysX, I suggest you buy a 9800GT or higher.

If anyone wants more info about PhysX Hybrid setups I suggest you read this site: http://physxinfo.com/news/2789/hybrid-physx-mod-1-03-available/

I post regular comments on there under the name xDee xDee.

This is my rig: http://www.techpowerup.com/gallery/2634.html

My GT 240 is working great in Batman with PhysX on high and 8x AA. Plus, no external power connector to deal with. NGOHQ.com has all the things you need to get it running. :toast:
 
For all of the bitching about how PhysX sucks and is "useless", these threads sure do generate A LOT of interest :rolleyes:

I for one never thought PhysX (or Havok for that matter) sucked. It's just underused, reduced to useless eye candy rather than the fully destructible environments it's capable of. I believe a handful of games actually deliver this - Bad Company 2, is it? - and it seriously improves the game.

When I saw Havok first used in Half-Life 2, it was fantastic. :rockout:
 
Which is kinda weird if you ask me. They make motherboards for AMD, so why not make PhysX cards for AMD as well?

ATi has always been a direct competitor to nVidia.

However, nVidia started making AMD chipsets before AMD bought ATi, long before. It has only been recently that AMD became a direct competitor by buying ATi, and it doesn't make sense for nVidia to just shut down their entire chipset division because of it.

I for one never thought PhysX (or Havok for that matter) sucked. It's just underused, reduced to useless eye candy rather than the fully destructible environments it's capable of. I believe a handful of games actually deliver this - Bad Company 2, is it? - and it seriously improves the game.

When I saw Havok first used in Half-Life 2, it was fantastic. :rockout:

The hardware-accelerated parts of PhysX definitely are underused, and reduced to useless eye candy.

However, the software parts of PhysX, which run on the CPU like Havok, tend to be what makes the game playable and gives it anything movable that interacts with the player.

I would really like to see PhysX used to its full potential in games, with fully destructible environments, but sadly no developer will ever do that unless every gamer can use it. This means we will never see it unless PhysX runs on ATi hardware, or at least runs on a cheap nVidia card with an ATi card as the main GPU.
 
I get what you're saying, but it's still kind of the same thing. They make chipsets for AMD to make money from their chipsets. Why not release the lock to make more money off ATI/AMD people who want PhysX? PhysX isn't a big deciding point when buying the main rendering video card. So for those people who do decide on ATI, Nvidia can still make a buck off them by selling them another video card for PhysX. It's a win-win situation for them and they're not taking advantage of it.
 
I get what you're saying, but it's still kind of the same thing. They make chipsets for AMD to make money from their chipsets. Why not release the lock to make more money off ATI/AMD people who want PhysX? PhysX isn't a big deciding point when buying the main rendering video card. So for those people who do decide on ATI, Nvidia can still make a buck off them by selling them another video card for PhysX. It's a win-win situation for them and they're not taking advantage of it.

That would completely go against their marketing philosophy. PhysX and CUDA are the reasons they want you to buy their cards exclusively. They feel that blocking these features for users with a different graphics card forces them to buy their cards.
 
I guess I am the only person that thinks Nvidia PhysX is stupid, right?
 
I guess I am the only person that thinks Nvidia PhysX is stupid, right?

It's not stupid by any means, I think it's a cool technology. The only problem is nVidia is keeping it locked away from people who don't use their GPUs as a primary card.

Based on your system specs, you may not know the difference between hardware PhysX and the like. The only title I can really comment on is Batman: AA, which makes excellent use of the technology, and it looks great too.

Check out this video for a comparison between PhysX and non-PhysX.

http://www.youtube.com/watch?v=6GyKCM-Bpuw
 
PhysX itself isn't stupid. What's stupid is the way Nvidia is handling its usage.

I really just have a very hard time understanding how they can justify disabling a feature that works 100% with non-Nvidia GPUs in the same system. Obviously, if people want PhysX they have to use an Nvidia GPU, so either way they would get sales. It's not marketing; it's pigheadedness. To think someone paid their money for an Nvidia card and can't use it how they want to, just because it isn't their primary adapter, and Nvidia intentionally disables a working feature of the card - it's ridiculous.

Heck, I bet it would help them clear the shelves of their older cards, because people using ATI video cards would like to purchase a 9800GT or even newer cards for PhysX, which would be a sale that otherwise wouldn't have happened at all.
 
I get what you're saying, but it's still kind of the same thing. (...) It's a win-win situation for them and they're not taking advantage of it.

Personally, I agree with you. However, I'm just stating why the PhysX and chipset comparison is flawed.

Stopping their chipset business would mean shutting down an entire division of their company; it would be stupid. And as I stated, AMD wasn't a competitor until very recently, when they acquired ATi.

PhysX itself isn't stupid. What's stupid is the way Nvidia is handling its usage.

I really just have a very hard time understanding how they can justify disabling a feature that works 100% with non-Nvidia GPUs in the same system. Obviously, if people want PhysX they have to use an Nvidia GPU, so either way they would get sales. It's not marketing; it's pigheadedness. To think someone paid their money for an Nvidia card and can't use it how they want to, just because it isn't their primary adapter, and Nvidia intentionally disables a working feature of the card - it's ridiculous.

Heck, I bet it would help them clear the shelves of their older cards, because people using ATI video cards would like to purchase a 9800GT or even newer cards for PhysX, which would be a sale that otherwise wouldn't have happened at all.

Yep, it is completely idiotic. I think a lot of ATi users would pick up a cheap nVidia card to use PhysX (and CUDA in games like Just Cause 2). Of course, the problem is that nVidia doesn't make as much money on the cheaper cards compared to the higher end, but something is better than making nothing...
 
Yep, it is completely idiotic. I think a lot of ATi users would pick up a cheap nVidia card to use PhysX (and CUDA in games like Just Cause 2). Of course, the problem is that nVidia doesn't make as much money on the cheaper cards compared to the higher end, but something is better than making nothing...

I think they would get pure profit from opening it up, simply because ATI users are ATI users. Someone buying an nVidia card for PhysX support only wasn't necessarily going to buy a high-end nVidia in the first place.

I think I said it before, they could totally run on the whole "Why pay more for PhysX?" campaign.
 
nV just needs to start selling G92 cards without display connections and with a blank backplate, to be used specifically as PhysX cards. I fail to understand why they have not done this yet...
 
I think they would get pure profit from opening it up, simply because ATI users are ATI users. Someone buying an nVidia card for PhysX support only wasn't necessarily going to buy a high-end nVidia in the first place.

I think I said it before, they could totally run on the whole "Why pay more for PhysX?" campaign.

This is what I keep thinking. I would never buy an Nvidia GPU as a primary card based on the minimal benefits of PhysX or CUDA, and I find it hard to believe many people actually would.

Could you imagine ATI disabling Eyefinity functionality on all systems which employ secondary NV PhysX GPUs? I couldn't either.

Allowing everyone, regardless of the primary GPU, to join the PhysX party not only results in improved sales (which everyone knows they need) but also encourages more developers to bring more A-list PhysX titles to market. It's a win situation for everyone including NV, despite how their flawed logic views the subject.

Like I said before... not officially supporting this means NV will never receive my money for a new GPU dedicated to PhysX, but I'll still consider buying a used card.
 
LoL. Why are people getting a stiffy for PhysX? All it is good for is trying; then you will soon realize what a waste of time, energy and heat a PhysX card is and subsequently remove it from your system. :toast:

Happened here!!!

Unless 50% or more of the top PC games out there start relying on PhysX... I doubt this is going to occur anyway :rolleyes:

Bad move, Nvidia.
 
(...) I think a lot of ATi users would pick up a cheap nVidia card to use PhysX (and CUDA in games like Just Cause 2). Of course, the problem is that nVidia doesn't make as much money on the cheaper cards compared to the higher end, but something is better than making nothing...

You cannot use an Nvidia card as a secondary dedicated CUDA card in JC2. The main renderer has to support CUDA in order to enable the special features.
 
I think they would get pure profit from opening it up, simply because ATI users are ATI users. Someone buying an nVidia card for PhysX support only wasn't necessarily going to buy a high-end nVidia in the first place.

I think I said it before, they could totally run on the whole "Why pay more for PhysX?" campaign.

Not pure profit exactly. Giving the consumer the option to buy your competitor's high-end, high-profit product while still getting the benefits of your product with a lower-end, low-profit card isn't always best.

Put it like this: you've got two high-end sports cars from two manufacturers. Both are very similar all around in performance and price, say they are both about $200K, and the profit is $150K per car. Car A has cup holders, and you like cup holders but don't need them. Car B has sun visors, and you like sun visors but don't need them. Then, the manufacturer of Car B releases a very cheap $15K car that also has sun visors, and they fit perfectly in Car A, but the profit on this new cheap car is only $2K. So now, what are you, the consumer, going to do? Buy Car A, give $150K in profit to that company, then buy the cheap car and only give $2K to this one. Would you see why the company making the cheap car would then change their sun visors so they won't work in Car A? Yes, they are still making the $2K profit, but if they didn't give the consumer that easy option to go with the competitor and not lose any functionality, they are losing a potential $148K. Yes, the consumer still might have gone with the competitor's car anyway, but they might not have.
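Just to make the arithmetic of that analogy explicit, here's a minimal sketch in Python; the figures are purely the hypothetical ones from the analogy above, not real GPU margins.

Code:
# Rough sketch of the profit arithmetic in the car analogy above.
# All figures are the hypothetical ones from the analogy, not real numbers.

profit_car_b = 150_000   # profit if the buyer picks Car B outright
profit_kit = 2_000       # profit if the buyer picks Car A plus the cheap add-on car

# If the add-on works with the competitor's car, a buyer who defects to Car A
# only brings in the add-on profit; locking it out protects the bigger sale
# (assuming the buyer would then stay with Car B).
potential_loss_per_defecting_buyer = profit_car_b - profit_kit
print(potential_loss_per_defecting_buyer)  # 148000, the "$148K" above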

You cannot use an Nvidia card as a secondary dedicated CUDA card in JC2. The main renderer has to support CUDA in order to enable the special features.

That's good, because I never said you could.

They can't use hardware-accelerated PhysX either, at least not officially, but my point was that they would buy a cheaper nVidia card if they could.
 
Not pure profit exactly. Giving the consumer the option to buy your competitor's high-end, high-profit product while still getting the benefits of your product with a lower-end, low-profit card isn't always best.

Put it like this: you've got two high-end sports cars from two manufacturers. (...)

Doesn't work. Profit from high-end GPUs is minimal; they make more money from bulk sales of low-end cards.

Let's put it another way: if ATI has a 30% market share and PhysX were worth it and allowed in ATI systems, that could potentially be a lot of PCs running a secondary Nvidia card.

Putting a more accurate example in:

Nvidia sells a car which runs hot and chews fuel. It has a sunroof and cup holders.

ATI releases a car which is a tiny bit slower, but cheaper, far better on fuel, and has no sunroof or cup holders.

You can buy an optional kit Nvidia sells (say, a 9600GT) to add that sunroof and cup holder to your car... it fits, but Nvidia specifically forbids you to do so, even though they make money from it, because they'd rather you buy a new car than an optional product.
 
Doesn't work. Profit from high-end GPUs is minimal; they make more money from bulk sales of low-end cards.

Let's put it another way: if ATI has a 30% market share and PhysX were worth it and allowed in ATI systems, that could potentially be a lot of PCs running a secondary Nvidia card. (...)

Overall profit on high end cards is minimal, and in nVidia's case the profit per card is minimal also thanks to their gigantic die, but generally the profit per card is higher on high end cards. And if you wanted to be technical, desktop cards, low through high end, only make up 1/3 of nVidia's graphics card profits, but account for 2/3 of their sales volume. Sales of their ultra high-end Quadro cards make up 2/3 of their graphics card profits and 1/3 of the sales volume. So...no, profit from high end cards isn't lower...

However, the volume is relatively the same when talking about buying cards just for PhysX.
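To put a rough number on that per-card claim, here's a quick sketch using the 1/3 and 2/3 splits quoted above; they're illustrative fractions from this post, not audited figures.

Code:
# Per-card profit implied by the profit/volume splits mentioned above.
# P = total graphics profit, V = total unit volume (both normalised to 1).
from fractions import Fraction

P = Fraction(1)
V = Fraction(1)

desktop_per_card = (P * Fraction(1, 3)) / (V * Fraction(2, 3))  # 1/3 of profit over 2/3 of volume
quadro_per_card = (P * Fraction(2, 3)) / (V * Fraction(1, 3))   # 2/3 of profit over 1/3 of volume

print(quadro_per_card / desktop_per_card)  # 4 -> roughly 4x the profit per Quadro card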
 
Overall profit on high end cards is minimal, and in nVidia's case the profit per card is minimal also thanks to their gigantic die, but generally the profit per card is higher on high end cards.

However, the volume is relatively the same when talking about buying cards just for PhysX.

i just see it as a dumb move, cause i'd rather see my cards used as a feature booster than not at all.
 
i just see it as a dumb move, cause i'd rather see my cards used as a feature booster than not at all.

I agree entirely, as I've already pointed out. I'm just explaining what I believe their reasoning is, I'm not saying I agree with it.
 
I will find a way regardless to use my just-purchased GT 240 as a physics PPU

I have the room in my case and the slot on my motherboard, and a 4850 X2 just looking for some help in Batman: Arkham Asylum. I don't care if Nvidia wants to try to stop me; I will make it happen. For those of us who already have ATI graphics cards, their business strategy is to block the only friggin reason for us to purchase any of their gear, because we don't need it for anything else. Nvidia is so dumb that they don't realize they are simply alienating a large crowd of gamers who already have ATI cards and frankly don't need to switch brands for normal graphics. WE WANT TO BUY YOUR PRODUCT. WHY ARE YOU TRYING TO STOP US? DID YOU EVER HEAR OF 3DFX? Yes, you probably did. Hell, you bought their SLI technology, and I hope you either wake up or suffer the same fate as 3dfx. No marketing strategy has ever included preventing sales. I guess Nvidia is looking to be inventive. But a wheel is round for a reason, and no possible sale should ever be turned away. :mad:
 
I don't know why so many people don't understand why Nvidia disables PhysX with a non-Nvidia card. It's just not profitable to ensure QA. Just because the hack (and in this case the unlocked beta drivers) works for the majority, that doesn't mean it works for everybody without a single problem (for instance, it won't work in Vista). Things that come from companies like Nvidia, ATI, Intel, etc. have to work 100%, or at least 99.9999999%, of the time. Plain and simple.

Someone somewhere will always be able to hack something or mod something that will work 99% of the time without spending excessive time and money on development, but they are free of responsibility if that 1% for whom it doesn't work as it should breaks their PC trying to make it work. Companies have to ensure, by law, that it works in 100% of cases, and when it fails they have legal responsibility. It's that 1% that costs these companies (and this goes for any tech company, game developer, car vendor, whatever) a lot of money in QA, but they have to do it, because even something that seems as small as 1% is a very big number of people in real life, outside of enthusiast forums.

A hack is used by very few people, which can literally translate to 99 people saying how well it works and only one person saying it broke his Windows installation. That person will be ignored and people think it works flawlessly, which in most cases is probably true, but not always. There's still the fact that it could NOT work in certain cases, because it has not been tested. If something untested was officially released and it didn't work for just 1% of people, that would still make for more than 1 million failing cases, and that would make a lot of noise... class actions would be put in place, etc, etc.

I repeat, companies have to ENSURE it works flawlessly, and that costs a lot of money, not to mention requiring access to tech and IP that the company might not have, like, for example, Southern Islands/Northern Islands in Nvidia's case. How are they supposed to ensure 100% interoperability when those cards are released? Average Joe will not understand if, for whatever reason, PhysX doesn't work with his shiny new card. Why is he supposed to wait 2 months in order to have something he already had working before?

In a sense, that's what is good about PC gaming and modding. Someone can make something and you can try it under your own responsibility. When I say "you", I mean an enthusiast, because Average Joe will not download it, and that's the difference. Average Joe won't download such a hack, but Average Joe will download an official release, Average Joe will try that official release, and if it doesn't work Average Joe will blame the company and will go as far as taking legal action, because Average Joe knows much more about class actions than he knows about tech. And that's all, really. No company is willing to spend so much money making something work when it won't even work on most systems out there (Vista). Try explaining to Average Joe why something that is official works on XP or 7 but doesn't work on Vista... try...
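For a sense of the scale behind the "just 1%" point, here's a tiny sketch; the installed-base number is my own illustrative assumption (the "more than 1 million failing cases" above implies a base north of 100 million).

Code:
# Scale of a 1% failure rate on a large installed base.
# The installed base below is an assumption for illustration only.

installed_base = 150_000_000  # hypothetical number of users with the feature
failure_rate = 0.01           # "just 1%"

print(int(installed_base * failure_rate))  # 1500000 users with a broken feature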
 
C'mon now

I don't know why so many people don't understand why Nvidia disables PhysX with a non-Nvidia card. It's just not profitable to ensure QA. Just because the hack (and in this case the unlocked beta drivers) works for the majority, that doesn't mean it works for everybody without a single problem (for instance, it won't work in Vista). Things that come from companies like Nvidia, ATI, Intel, etc. have to work 100%, or at least 99.9999999%, of the time. Plain and simple. (...)

Why is it any easier to make CUDA work with Nvidia cards than with ATI? Really? When PhysX/CUDA started, it was a separate entity from the GPU altogether, and there is no reason why a GPU can't be designed to run simply as a dedicated PPU. How hard can it be? Even if there are bugs, enthusiasts like us will find a way to make it work ourselves, more likely than some tech support yahoo at Nvidia would, unless they really botch up the whole process. I don't like it when a company is bullshitting me.

In this case Nvidia is not telling the whole story, and they probably think that CUDA is enough of a selling point to convince idiots to buy their higher-end gear when they already have strong enough CUDA-less gear. Flushing money down the toilet is not my style. It just ain't enough of a reason to change our whole graphics setup. Certain gamers like me only want to try out the physics tech at a smaller premium than altering our already super expensive, super powerful gaming rigs.

Nvidia is simply nuts and should have embraced the idea of a separate PPU instead of trying to integrate it solely into their own GPU configurations. I can't possibly think of any reason why somebody would purchase a higher-end Nvidia card for CUDA when the ATI card they already have is fast enough. So why not sell CUDA for what it is: a separate entity from the GPU altogether? Why not have the option for both? I'm not buying the quality control aspect.
 