Friday, May 28th 2010

NVIDIA Removes Restriction on ATI GPUs with NVIDIA GPUs Processing PhysX

NVIDIA has reportedly removed the driver-level code that restricts users from having an NVIDIA GeForce GPU process PhysX while an ATI Radeon GPU in the lead processes graphics. Version 257.15 Beta of the GeForce drivers brought about this change. Commercial interests may have played a part in NVIDIA's previous decision to prevent the use of GeForce GPUs for PhysX alongside ATI Radeon GPUs: users could buy an inexpensive GeForce GPU to pair with a high-end DirectX 11 compliant Radeon GPU, thereby reducing NVIDIA's margins, though officially NVIDIA maintained that the restriction was in place to ensure quality assurance. The present move also seems to have commercial interests in mind, as it lets NVIDIA clear inventories of GeForce GPUs, at least to users of ATI Radeon GPUs. NVIDIA recently replenished its high-end offering with the DirectX 11 compliant GeForce 400 series GPUs.

Update (28/05): A fresh report by AnandTech says that the ability to use a GeForce for PhysX in systems with graphics led by Radeon GPUs on the 257.15 beta driver is just a bug, not a feature. This means the ability is a one-off for this particular driver version, and future drivers may not include it.

Source: NGOHQ.com

276 Comments on NVIDIA Removes Restriction on ATI GPUs with NVIDIA GPUs Processing PhysX

#1
TheMailMan78
Big Member
Benetanegia said:
I don't buy any PR crap, I've been saying this for a long, long time, even before they said anything. Because I know as a matter of fact that things work that way. Not in the GPU or driver business, but I've been there, so I know what it's about. It doesn't matter if the QA is that important in the end or if it works at all, they have to do it, because in many countries it's obligatory. If they spent time and money and it doesn't work, no worries, but oh friend, if it doesn't work and no QA was done... be prepared.

And they just don't want to spend the money on QA on something that is not really in their hands. A lot of that QA has to be made on AMD's end and they will just not do it. Even when only Nvidia cards are used, every PhysX driver update needs the latest GPU driver as well, or everything gets fucked up soon; that's something that I have suffered from. So a mix between Ati and Nvidia is always going to be worse.

Now, the idea of allowing it in the beta... that could work, but there's still the fact that it would not work on Vista systems, and that's a nightmare to explain to the average joe, and it wouldn't be very different from the hack anyway. The hack probably has more support than the beta regarding Ati+Nvidia setups.
Yeah, and one small disclaimer on the box would cover them in most countries if they are that chicken shit. Like I said, if one hacker can make it work great, then why can't a billion-dollar company? It's PR BS, plain and simple. If you still don't think so, then you should sue every game developer in the world for not making games 100% compatible with EVERY combination of hardware.

Mussels said:
The QA was already done... AGEIA PPU's worked on ATI, nvidia, SIS, matrox, etc.
Yup. Now tell me AGEIA was a better funded company than Nvidia. :rolleyes:
#2
Wile E
Power User
Benetanegia said:
That was a looooong time ago. Yes, GPU drivers from 2005 worked on my Radeon 9600, and games from that era too, but try running them today... They may work in many cases, but you are surely going to find a lot of problems. As a company, Nvidia just wants to steer clear of any problems of that nature. Plain and simple.
No, they just wanted to force people to use only nVidia hardware, plain and simple. All they had to do was allow Physx to run in coprocessor mode, like the original PPU, if they were worried about conflicting video drivers.

Hell, they don't even have to go that far. They can simply say mixed gpu solutions are not officially supported, and they wash their hands of the imagined support costs.

And none of that explains why they let it go for so long before deciding to cut it out.

It has nothing to do with support at all. It's nVidia not happy with the situation, and taking their ball and going home.

Mussels said:
so come up with a toggle in the driver options to switch a video card from GPU to PPU/CUDA card, so that video drivers turn off and only CUDA (and apps that use it) remain.

Set in a safeguard so that it can't be used if a monitor is connected to the card, and away you go, back to the Ageia days.


The only reason nvidia are doing this is because they've done so much dodgy shit disabling features in the name of physX (such as with batman AA) that people might find out *gasp* that in fact, they just disable it on ATI even if physX is working.
Stop using Batman as an example. It's a poor one, and doesn't support your arguments at all. We've been over this a million times. That is one place nV was not wrong. The AA in Batman does not work in ATI properly, even when you force it.
#3
erocker
Wile E said:


Stop using Batman as an example. It's a poor one, and doesn't support your arguments at all. We've been over this a million times. That is one place nV was not wrong. The AA in Batman does not work in ATI properly, even when you force it.
It does work with a Physics card rather well. Both AA and the PhysX. So is AA being run exclusively through my GT 240?
#4
TheMailMan78
Big Member
Wile E said:
No, they just wanted to force people to use only nVidia hardware, plain and simple. All they had to do was allow Physx to run in coprocessor mode, like the original PPU, if they were worried about conflicting video drivers.

Hell, they don't even have to go that far. They can simply say mixed gpu solutions are not officially supported, and they wash their hands of the imagined support costs.

And none of that explains why they let it go for so long before deciding to cut it out.

It has nothing to do with support at all. It's nVidia not happy with the situation, and taking their ball and going home.



Stop using Batman as an example. It's a poor one, and doesn't support your arguments at all. We've been over this a million times. That is one place nV was not wrong. The AA in Batman does not work in ATI properly, even when you force it.
The AA works fine when you force it. I even made a thread on it.
#5
Wile E
Power User
erocker said:
It does work with a Physics card rather well. Both AA and the PhysX. So is AA being run exclusively through my GT 240?
TheMailMan78 said:
The AA works fine when you force it. I even made a thread on it.
Is it enabled thru the in game settings?
#6
Benetanegia
Mussels said:
so come up with a toggle in the driver options to switch a video card from GPU to PPU/CUDA card, so that video drivers turn off and only CUDA (and apps that use it) remain.

Set in a safeguard so that it can't be used if a monitor is connected to the card, and away you go, back to the Ageia days.
Like I said GPU drivers and PhysX drivers are closely tied, they can't do that.

TheMailMan78 said:
Yeah and one small disclaimer on the box would cover them in most countries if they are that chicken shit. Like I said if one hacker can make it work great then why can't a billion dollar company? Its PR BS plain and simple. If you still don't think so then you should sue every game developer in the world for not making 100% compatible games with EVERY combination of hardware.
Like I said, because that hacker has never QA'd it. Thousands of people on the internet have done it for free. If the hacker had to pay everyone that helped him make the mod stable, he would need to be a multibillion-dollar company, and have a real interest in spending that much on it too. Besides, they do a lot more reverse engineering at NGOHQ than is "socially acceptable" in the business world, if you know what I mean. They have much more (real, applicable) access to Ati hardware than any company will ever do.
TheMailMan78 said:
Yup. Now tell me AGEIA was a better funded company than Nvidia. :rolleyes:
yup, and Ageia went bankrupt...

Nvidia struggles to stay in the green. Hell, even Ati struggles, and the reason that Stream never took off is that they simply didn't want to put money into it. Same for OpenCL right now, or GPU-accelerated Havok, or countless other examples. Just because Nvidia has money doesn't mean they have to let it go down the drain.
#7
Wile E
Power User
Benetanegia said:
Like I said because that hacker has never QA it. Thousands of people in the internet have done it for free. If the hacker had to pay to everyone that helped him make the mod stable, he would need to be a multibillion company and have a real interest in spending that much on it too. Besides, they do a lot more reverse engineering in NGOH than it is "socially acceptable" in the bussiness world, if you know what I mean. They have much more (real, applicable) access to Ati hardware than any company will ever do.



yup, and Ageia went bankrupt...

Nvidia struggles to stay in the green. Hell, even Ati struggles, and the reason that Stream never took off is that they simply didn't want to put money into it. Same for OpenCL right now, or GPU-accelerated Havok, or countless other examples. Just because Nvidia has money doesn't mean they have to let it go down the drain.
It was working fine before nV blocked it. We didn't even need the hacker.
#8
Mussels
Moderprator
Wile E said:

Stop using Batman as an example. It's a poor one, and doesn't support your arguments at all. We've been over this a million times. That is one place nV was not wrong. The AA in Batman does not work in ATI properly, even when you force it.
not just AA, the items/debris that suddenly appear as well with physX on. it's got nothing to do with physX, as iirc it was on the console versions.


I use it as an example because i keep hearing crap about it from nvidia users...and AA works just fine in it on ATI, if you force it via CCC.
#9
Benetanegia
Wile E said:
It was working fine before nV blocked it. We didn't even need the hacker.
It was working back then, and it might work now, just like I can take my 9600 Pro out of the closet, take its driver CD, and play many games. Why the hell do they release GPU drivers every month? The previous ones work just as well...

They don't want to have to worry EVER. Period. That's why you just cut it off. The other option is to enable it and see the web flooded with complaints. And don't say that there would be no complaints, because there have been many complaints about far more irrelevant things than poor fps in a certain game, which is the first symptom that would be noticed.

Mussels said:
not just AA, the items/debris that suddenly appear as well with physX on. it's got nothing to do with physX, as iirc it was on the console versions.


I use it as an example because i keep hearing crap about it from nvidia users...and AA works just fine in it on ATI, if you force it via CCC.
When forced from CCC it's not Nvidia's AA, it's the normal supersampling AA that is always possible.
#10
Mussels
Moderprator
Benetanegia said:
It was working back then, and it might work now, just like I can take my 9600 Pro out of the closet, take its driver CD, and play many games. Why the hell do they release GPU drivers every month? The previous ones work just as well...

They don't want to have to worry EVER. Period. That's why you just cut it off. The other option is to enable it and see the web flooded with complaints. And don't say that there would be no complaints, because there have been many complaints about far more irrelevant things than poor fps in a certain game, which is the first symptom that would be noticed.



When forced from CCC it's not Nvidia's AA, it's the normal supersampling AA that is always possible.
no one cares that nvidia's special optimised AA mode is disabled, just that they blocked AA on ATI cards in the first place (why not allow ATI to have supersampling AA in the in-game options?)
#11
Wile E
Power User
Mussels said:
no one cares that nvidia's special optimised AA mode is disabled, just that they blocked AA on ATI cards in the first place (why not allow ATI to have supersampling AA in the in-game options?)
Doesn't work properly on the hardware. That was proven a long time ago with back-to-back screenshots with it enabled on ATI, when people first started complaining about it.

Or if you mean the regular SSAA, it's because it's not in the engine at all. It would just be the equivalent of forcing it thru the CCC anyway.
#12
Mussels
Moderprator
Wile E said:
Doesn't work properly on the hardware. That was proven a long time ago with back-to-back screenshots with it enabled on ATI, when people first started complaining about it.
screenshots aren't the best way to confirm AA is working, if it's done in post-processing it doesn't show up in screenies - i've heard this many times before as to why screenshots look worse than in-game.
#13
Benetanegia
Mussels said:
no one cares that nvidia's special optimised AA mode is disabled, just that they blocked AA on ATI cards in the first place (why not allow ATI to have supersampling AA in the in-game options?)
MY GOD!!! This has been discussed thousands of times. Unreal Engine 3 has no AA, and no other UE3 game besides Batman has an in-game AA option. If you want to enable it, it has to be done from CCC... why does Batman have to be any different?? Nvidia didn't block anything, they added their own AA. Period.
#14
Wile E
Power User
Mussels said:
screenshots aren't the best way to confirm AA is working, if it's done in post-processing it doesn't show up in screenies - i've heard this many times before as to why screenshots look worse than in-game.
Even the people running it said there was no difference.

They didn't disable anything for ATI. They added a feature for themselves. 2 entirely different things. If they disabled shit for ATI, we would already be hearing about anti-trust/anti-competitive lawsuits or investigations. Nothing supports your theory, mussels.
#15
TheMailMan78
Big Member
Benetanegia said:
yup, and Ageia went bankrupt...

Nvidia strugles to stay in green. Hell even Ati struggles and the reason that Stream never kicked off is because they simply didn't want to put money on it. Same for OpenCL right now, or GPU accelerated Havok or countless of other examples. Just because Nvidia has money doesn't mean they have to let it go down the drain.
Ageia went bankrupt because they had no financial backing to push PhysX into the mainstream. It's not because they couldn't support the QA monetarily. Nvidia bought them out because they could. Now you are either saying PhysX is a waste of time, and that is why Nvidia doesn't do the "QA".

Come on man, admit it! Nvidia blocked a feature so that you would buy their hardware exclusively. All that hacker did was re-enable it. This has nothing to do with QA and everything to do with investment. If Nvidia was smart and REALLY wanted PhysX to go mainstream, they would sell dedicated PPUs like AGEIA. However, they won't. Why? Because they think they bought the golden goose with PhysX. Problem is, none of the developers seem to agree unless you toss a bucket of money at them with the TWIMTBP program.
#16
Mussels
Moderprator
i wasn't the one who brought up AA! i said Batman AA as in Batman: Arkham Asylum...

i was talking about the other stuff, the random debris and effects that got disabled without physX, when they don't need it for those effects.
#17
Wile E
Power User
Mussels said:
i wasn't the one who brought up AA! i said Batman AA as in Batman: Arkham Asylum...

i was talking about the other stuff, the random debris and effects that got disabled without physX, when they don't need it for those effects.
So it's just an anti-Physx post then? Coming from someone who considers you a friend, that seems a bit trollish to me, Mussels.

Physx is capable of much more. It doesn't have the market share for devs to use it for anything more tho. Non-nV users still need to want to play the games, and using Physx too heavily counts them out of the super advanced features. That IS nV's fault, however, and this blocking Physx on systems with ATI is one of the prime reasons.
#18
Benetanegia
Mussels said:
i wasn't the one who brought up AA! i said Batman AA as in Batman: Arkham Asylum...

i was talking about the other stuff, the random debris and effects that got disabled without physX, when they don't need it for those effects.
PhysX is hardware accelerated on at least the PS3, maybe that's why they have certain features. The Xbox is able to handle 3 threads, so they might have one only for physx too. As much as you might disagree, NO, that cannot be done on the PC, because most people don't have quads. And games using PhysX, when running on the CPU, do use 2 threads, although how many to use is something the developer decides, always based on the lowest common denominator. For comparison, Havok games use only one CPU core; I have tested that myself, with NFS:Shift, Batman and Mass Effect for PhysX, and Fallout 3, L4D and HL2:Ep2 for Havok.
#19
lyndonguitar
I play games
I have a question: what company is supporting Crysis 2 right now, Nvidia or ATI? I saw some demos of Crysis 2 run on Eyefinity hosted by ATi, and I'm wondering because if it's Nvidia again, PhysX will be there and I don't have a PhysX card (yet). If it's Nvidia, might as well buy a cheap Nvidia card.
#20
Benetanegia
TheMailMan78 said:
If Nvidia was smart and REALLY wanted Phyisx to go mainstream they would sell dedicated PPU's like AGIEA.
They don't do that because that business model was proven to be a failure. Ageia already did it, and everybody at the time agreed that a PPU could be a good idea as long as it was integrated, either on the MB or in the GPU, and Nvidia did exactly that: integrated it into the GPU.
Problem is none of the developers seem to agree unless you toss a bucket of money at them with TWIMTBP program.
That is not a problem with PhysX. That is a problem that affects PhysX, affects the inclusion of dedicated servers, affects the optimization of the PC port, affects the UI... Without the pushing from PC-centric companies, most games would lack dedicated servers, would look like a 2000 game while running so badly that it would seem they were running off a Pentium 2, and you would have to use the arrow keys to aim and press the triangle to shoot and the circle to jump (not that it isn't happening already... i.e. Dead Space? Street Fighter 4?).
#21
newtekie1
Semi-Retired Folder
Mussels said:
not just AA, the items/debris that suddenly appear as well with physX on. it's got nothing to do with physX, as iirc it was on the console versions.


I use it as an example because i keep hearing crap about it from nvidia users...and AA works just fine in it on ATI, if you force it via CCC.
No, it wasn't on the console versions, at least not the PS3 version. The console versions looked the same as the PC version with PhysX turned off.

The AA forced through CCC is FSAA, not MSAA, which is why the AA enabled through CCC comes at a huge performance hit, and the AA enabled through the game menu doesn't.

Also, the AA that nVidia added to UE3 for Batman doesn't work on ATi hardware as it wasn't coded for ATi hardware. This is identical to your argument about CUDA not working on ATi hardware. It wasn't coded for ATi hardware, so it doesn't work on ATi hardware.
#22
GenL
Guys, just to let you know... about a month ago I made a fix for Batman which makes the game use its native MSAA code on any hardware.
You can find more info about it here: http://www.ngohq.com/graphic-cards/17716-batman-arkham-asylum-msaa-fix.html

Such thought as...
doesn't work on ATi hardware as it wasn't coded for ATi hardware
are invalid.
This case has nothing to do with nvidia hardware/technology, just two VendorID checks. One in the launcher, another one in the game.

What about CUDA... those are just badly programmed applications. They are supposed to choose a CUDA GPU by themselves, but some apps just ignore anything but the primary GPU.
#23
the54thvoid
The QA issue is nonsense. Nvidia does not guarantee the 'safety' of its own drivers, as clearly stated at the very end of the product info for all its WHQL driver releases. The following is a direct quote:

"The software is provided 'as is', without warranty of any kind, express or implied, including but not limited to the warranties of merchantability, fitness for a particular purpose and noninfringement. In no event shall the contributors or copyright holders be liable for any claim, damages or other liability, whether in an action of contract, tort ot otherwise, arising from, out of or in connection with the software or the use or other dealings with the software"

I'm sure ATI has the same disclaimer, effectively saying if drivers are dodgy, tough - you installed them. It's this sort of small print at the end of the release notes that makes a mockery of the notion that, just because they're official, you can use them with absolute certainty that you have recourse to legal action if things go wrong.

So this kind of nullifies any arguments about QA for mixing gfx cards and physx where system damage is the end result.
#24
Benetanegia
the54thvoid said:
The QA issue is nonsense. Nvidia does not guarantee the 'safety' of its own drivers, as clearly stated at the very end of the product info for all its WHQL driver releases. The following is a direct quote:

"The software is provided 'as is', without warranty of any kind, express or implied, including but not limited to the warranties of merchantability, fitness for a particular purpose and noninfringement. In no event shall the contributors or copyright holders be liable for any claim, damages or other liability, whether in an action of contract, tort or otherwise, arising from, out of or in connection with the software or the use or other dealings with the software"

I'm sure ATI has the same disclaimer, effectively saying if drivers are dodgy, tough - you installed them. It's this sort of small print at the end of the release notes that makes a mockery of the notion that, just because they're official, you can use them with absolute certainty that you have recourse to legal action if things go wrong.

So this kind of nullifies any arguments about QA for mixing gfx cards and physx where system damage is the end result.
That disclaimer is as useless as the EULA in games. Maybe it has legal weight in the US, but outside of the US it's useless. They can say as much as they want, but, at least in the EU, they have legal responsibility no matter what they say. Laws are always above any contract.

Those disclaimers and the EULA are put there in order to make people think they can't do anything if something goes wrong and from what I see, it works.
#25
wahdangun
Benetanegia said:
That disclaimer is as useless as the EULA in games. Maybe it has legal weight in the US, but outside of the US it's useless. They can say as much as they want, but, at least in the EU, they have legal responsibility no matter what they say. Laws are always above any contract.

Those disclaimers and the EULA are put there in order to make people think they can't do anything if something goes wrong and from what I see, it works.
hmmm, so if nvidia's QA is so great, why did they release a WHQL driver that shut off the fan and made the card overheat?

it seems to me they don't take QA seriously