Friday, May 28th 2010

NVIDIA Removes Restriction on ATI GPUs with NVIDIA GPUs Processing PhysX

NVIDIA has reportedly removed the driver-level code which prevented users from having an NVIDIA GeForce GPU process PhysX while an ATI Radeon GPU handles graphics in the lead. Version 257.15 Beta of the GeForce drivers brought about this change. Commercial interests may have played a part in NVIDIA's previous decision to block the use of GeForce GPUs for PhysX alongside ATI Radeon GPUs: users could buy an inexpensive GeForce GPU to pair with a high-end DirectX 11 compliant Radeon GPU, thereby reducing NVIDIA's margins. Officially, however, NVIDIA maintained that the restriction was in place for quality assurance. The present move also seems to have commercial interests in mind, as NVIDIA could clear inventories of GeForce GPUs by selling at least to users of ATI Radeon GPUs. NVIDIA recently replenished its high-end offering with the DirectX 11 compliant GeForce 400 series GPUs.

Update (28/05): A fresh report by AnandTech says that the ability to use a GeForce for PhysX in systems with graphics led by Radeon GPUs with the 257.15 beta driver is a bug, not a feature. This means the ability is a one-off for this particular version of the driver, and future drivers may not feature it.

Source: NGOHQ.com

276 Comments on NVIDIA Removes Restriction on ATI GPUs with NVIDIA GPUs Processing PhysX

#1
shevanel
this is good and this is also hilarious.
#2
newtekie1
Semi-Retired Folder
Mussels said:
CUDA can't, and never will, run on ATI. It's a hardware part of the GPUs.

PhysX could be made to run on ATI Stream, but there is just no way in hell CUDA can run on ATI, nor could Stream run on NV.
There isn't really a reason that CUDA can't run on ATi hardware. CUDA is not a hardware part of the GPUs; CUDA is all software, built into the driver, that uses the standard unified shaders to do work. In theory, any GPU with unified shaders should be able to use CUDA.

CUDA came out well after several of the supported GPUs, and support was added via drivers and nothing more; there is nothing in the physical GPU that enables CUDA.

Now, at this point, it would probably be a better idea to port PhysX to Stream, as ATi definitely isn't going to add CUDA support to its drivers.
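The point being argued here, that a compute API is a driver-side software contract sitting on top of generic unified shaders, can be sketched with a toy dispatcher in Python. Nothing below is real CUDA or Stream code; the backend names and the launch interface are invented purely for illustration. The same kernel runs unchanged on any backend that implements the contract:

```python
# Toy illustration: a compute "API" is a software contract.
# Any backend that implements launch() can run the same kernel,
# regardless of what hardware sits underneath.
# (Backend names are invented; this is not real CUDA or Stream.)

def saxpy(i, a, x, y, out):
    """Kernel: one logical thread computes one output element."""
    out[i] = a * x[i] + y[i]

class GreenBackend:               # stands in for one vendor's driver
    def launch(self, kernel, n, *args):
        for i in range(n):        # a real driver spreads i across shader units
            kernel(i, *args)

class RedBackend:                 # a different vendor, same software contract
    def launch(self, kernel, n, *args):
        for i in reversed(range(n)):   # different scheduling, same result
            kernel(i, *args)

def run(backend, a, x, y):
    out = [0.0] * len(x)
    backend.launch(saxpy, len(x), a, x, y, out)
    return out

x, y = [1.0, 2.0, 3.0], [10.0, 20.0, 30.0]
print(run(GreenBackend(), 2.0, x, y))   # [12.0, 24.0, 36.0]
print(run(RedBackend(), 2.0, x, y))     # [12.0, 24.0, 36.0]
```

A real driver would map the index `i` onto thousands of shader units in parallel rather than a Python loop, but the shape of the contract between application and driver is the same.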

Wile E said:
I'd rather just file the back of the PCIe slot to be open.
The problem with doing it that way is that there might be components on the board behind the PCI-E x1 slots that would still interfere with the PCI-E connector on the card. In the case of his P5Q, there are components that would prevent the card from going into either PCI-E x1 slot if you just filed the back of the PCI-E slot.

Wile E said:
Or they port Physx over to OpenCL or DirectCompute.
That is probably an even better idea than porting it to Stream! You hear that, nVidia? Get on it.

Wile E said:
I've been thinking that, too. Then, to eliminate all compatibility issues with gfx drivers, make it listed as a co-processor in the OS.
I've said they should do this since Vista first showed the problem with only allowing one type of graphics driver active at a time.

Though I would prefer that they still put display outputs on the card and keep it a graphics card, and instead just put out special drivers that make it a co-processor in the OS. If you want to use it as a graphics card, you use the standard drivers; if you want to use it as a PPU only, you use the special drivers.
#3
Mussels
Moderprator
newtekie1 said:
There isn't really a reason that CUDA can't run on ATi hardware. CUDA is not a hardware part of the GPUs; CUDA is all software, built into the driver, that uses the standard unified shaders to do work. In theory, any GPU with unified shaders should be able to use CUDA.

CUDA came out well after several of the supported GPUs, and support was added via drivers and nothing more; there is nothing in the physical GPU that enables CUDA.
CUDA is a language that translates program calls to work on NVIDIA GPUs. ATI GPUs are not NVIDIA GPUs; CUDA would never work there.

Let's use a simpler example:

HD DVD and Blu-ray.

They both store on the same sized medium (the GPUs), and they both do the same thing (hold HD content). No matter what the hell you do, they're not compatible. You can convert the movie (the program) to work on the alternative format by re-burning it on the other disc type (recoding the program to work on OpenCL instead of CUDA, for example)... but no matter what you do, the language is keyed to that hardware.

People just don't seem to get that while you can code a CUDA app to work on another DirectCompute-style system, YOU CAN'T RUN CUDA ITSELF ON ANYTHING BUT NVIDIA HARDWARE.
#4
wahdangun
Mussels said:
CUDA is a language that translates program calls to work on NVIDIA GPUs. ATI GPUs are not NVIDIA GPUs; CUDA would never work there.

Let's use a simpler example:

HD DVD and Blu-ray.

They both store on the same sized medium (the GPUs), and they both do the same thing (hold HD content). No matter what the hell you do, they're not compatible. You can convert the movie (the program) to work on the alternative format by re-burning it on the other disc type (recoding the program to work on OpenCL instead of CUDA, for example)... but no matter what you do, the language is keyed to that hardware.

People just don't seem to get that while you can code a CUDA app to work on another DirectCompute-style system, YOU CAN'T RUN CUDA ITSELF ON ANYTHING BUT NVIDIA HARDWARE.
But I kind of agree with newtekie; it's just like an ordinary CPU. Just think about it: Windows can run on either AMD or Intel, or take an extreme example like Linux, which can run on a PS3 (with Other OS support) or on an Intel CPU even though the architectures are really different (PPC vs. x86).
#5
newtekie1
Semi-Retired Folder
Mussels said:
CUDA is a language that translates program calls to work on NVIDIA GPUs. ATI GPUs are not NVIDIA GPUs; CUDA would never work there.

Let's use a simpler example:

HD DVD and Blu-ray.

They both store on the same sized medium (the GPUs), and they both do the same thing (hold HD content). No matter what the hell you do, they're not compatible. You can convert the movie (the program) to work on the alternative format by re-burning it on the other disc type (recoding the program to work on OpenCL instead of CUDA, for example)... but no matter what you do, the language is keyed to that hardware.

People just don't seem to get that while you can code a CUDA app to work on another DirectCompute-style system, YOU CAN'T RUN CUDA ITSELF ON ANYTHING BUT NVIDIA HARDWARE.
You can use all the examples you want; I get what you are trying to say. You are just wrong in saying that CUDA is part of the GPU. That was my point.

It is not part of the GPU; it is entirely software-based. There is nothing special required in the GPU for CUDA to run on it. Your original statement was that it is a hardware part of the GPUs, and it is not.

As it stands right now, CUDA cannot run on ATi hardware. However, there is no reason it couldn't; it just needs to be programmed to do so. Granted, it is never going to happen, because ATi and nVidia could never work together to do it, but there is no reason other than that, and the huge amount of time and development it would take.

I'll even use an example of my own, or rather a more accurate version of your example from earlier:

It is like running OSX on non-Apple hardware. OSX is entirely software; there isn't really anything in the hardware that is OSX. At this point, Apple's hardware doesn't have a special part built in that allows OSX to run; there isn't anything special about Apple's hardware.

But when you try to run OSX on non-Apple hardware, 9 times out of 10 it bails out a few seconds into the boot sequence. However, you do a little work, a little massaging, and pretty soon you have OSX running on non-Apple hardware.
#6
WSP
If so, then is CUDA an API written exclusively for NVIDIA hardware?
#7
human_error
wahdangun said:
But I kind of agree with newtekie; it's just like an ordinary CPU. Just think about it: Windows can run on either AMD or Intel, or take an extreme example like Linux, which can run on a PS3 (with Other OS support) or on an Intel CPU even though the architectures are really different (PPC vs. x86).
GPUs don't use the same architectures/instruction sets in the way CPUs do. ATi and nVidia GPUs have different instruction sets and very different architectures which, although similar from a high-level point of view, are extremely different at the level where CUDA/Stream operate.

Your example of running Linux on the Cell BE processor in a PS3 and running it on an x86 processor is a bad one: if you were to compare the kernel of the operating system on a PS3 with that of Linux on x86, you would see it is completely different, to accommodate the different architecture. Your reply would no doubt be "but at the level that matters, Linux runs on both systems". My response to that is yes, but in the GPU world the kernel is equivalent to CUDA and Stream, and the Linux OS which sits on top of the kernel would be the application implementing CUDA and Stream calls.

If you wanted to code something which runs on both ATi and nVidia architectures, you would use OpenCL, as ATi and nVidia have built their own driver-level translators to convert OpenCL calls to match their architectures. This is not a small task, and it removes the need to translate between CUDA and Stream. Saying nVidia could port CUDA to run on ATi hardware would be like saying Intel could port their GPU drivers to run ATi hardware: the purpose of the code at that level is to provide an interface to the hardware. (CUDA is a hardware API, as it gives access to hardware calls specific to certain hardware architectures, as opposed to a software API such as DirectX, which is intercepted by drivers that then make the hardware calls appropriate to the installed hardware.)
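The portable-source model described here, one kernel description shipped by the application and translated by each vendor's driver into its own internal form, can be sketched as a toy pipeline in Python. The mini kernel format and both "compilers" are invented purely for illustration; no real OpenCL is involved:

```python
# Toy sketch of the OpenCL idea: the application ships ONE portable
# kernel description; each vendor's driver translates it into its own
# internal form. The ("mul"/"add") mini format is invented for this sketch.

KERNEL_SRC = [("mul", 2.0), ("add", 10.0)]   # portable source: f(x) = x*2 + 10

def compile_for_vendor_a(src):
    """Vendor A 'driver': interprets the ops one by one."""
    def kernel(x):
        for op, k in src:
            x = x * k if op == "mul" else x + k
        return x
    return kernel

def compile_for_vendor_b(src):
    """Vendor B 'driver': folds the ops into a single a*x + b form."""
    a, b = 1.0, 0.0
    for op, k in src:
        if op == "mul":
            a, b = a * k, b * k
        else:
            b = b + k
    return lambda x: a * x + b

ka = compile_for_vendor_a(KERNEL_SRC)   # same portable source,
kb = compile_for_vendor_b(KERNEL_SRC)   # two different internal "ISAs"
print([ka(v) for v in (1.0, 2.0)])      # [12.0, 14.0]
print([kb(v) for v in (1.0, 2.0)])      # [12.0, 14.0]
```

The design point is that the translation work lives inside each vendor's driver, so the application never needs to know which hardware it landed on.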
#8
Mussels
Moderprator
The HARDWARE can do it, but you're forgetting that we're talking about the language used BETWEEN the software and the hardware.

CUDA apps converted to run on Stream = entirely possible.

Running CUDA itself is what's impossible, and that's what I'm getting sick of seeing people say.
#9
newtekie1
Semi-Retired Folder
Mussels said:
The HARDWARE can do it, but you're forgetting that we're talking about the language used BETWEEN the software and the hardware.

CUDA apps converted to run on Stream = entirely possible.

Running CUDA itself is what's impossible, and that's what I'm getting sick of seeing people say.
Running CUDA itself is not impossible, just not likely. CUDA, being entirely software, can be rewritten and modified to run on pretty much any hardware; it is just a matter of what nVidia is willing to do.
#10
W1zzard
newtekie1 said:
Running CUDA itself is not impossible, just not likely. CUDA, being entirely software, can be rewritten and modified to run on pretty much any hardware; it is just a matter of what nVidia is willing to do.
NVIDIA's definition of CUDA is that it is the hardware architecture enabling compute on their GPUs, but you are correct in the sense that it exposes a software interface for which an emulator can be written.
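The emulator idea mentioned here can be sketched as a minimal CPU "emulator" of a CUDA-style grid/block launch. All names below are hypothetical; this is not the real CUDA API, just the shape of its thread-indexing interface:

```python
# Minimal CPU "emulator" of a CUDA-style launch interface: iterate every
# (block, thread) pair serially and give the kernel the same index
# structure a GPU thread would see. API names are made up for the sketch.

from dataclasses import dataclass

@dataclass
class Idx:
    block: int        # analogous to blockIdx.x
    thread: int       # analogous to threadIdx.x
    block_dim: int    # analogous to blockDim.x

    @property
    def global_id(self):
        return self.block * self.block_dim + self.thread

def launch(kernel, grid_dim, block_dim, *args):
    """Emulate a <<<grid_dim, block_dim>>> launch with CPU loops."""
    for b in range(grid_dim):
        for t in range(block_dim):
            kernel(Idx(b, t, block_dim), *args)

def square(idx, src, dst):
    i = idx.global_id
    if i < len(src):      # bounds guard, as a real kernel would have
        dst[i] = src[i] * src[i]

src = [1.0, 2.0, 3.0, 4.0, 5.0]
dst = [0.0] * len(src)
launch(square, 2, 3, src, dst)   # 2 blocks x 3 threads = 6 logical threads
print(dst)                       # [1.0, 4.0, 9.0, 16.0, 25.0]
```

An emulator like this reproduces the interface, not the speed: as the next comment in the thread points out, running it without the parallel hardware underneath would be painfully slow.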
#11
KainXS
It would probably be so slow that you might be better off trying to emulate a PS3 on an Xbox 360, or vice versa.

I think it might work, but it would take so much time to recode that you would be better off making a new standard completely.
#13
OnBoard
dlpatague said:
UPDATE: Official BLOG from Nvidia: http://blogs.nvidia.com/ntersect/2010/05/update-on-release-256-physx-support-1.html
Now, if it were true that the main reason for not allowing an ATI/NVIDIA combo is the QA work, then leave it enabled just in the beta drivers. Everyone would be happy enough; who needs support?

But that just isn't the truth. The only thing needed in the WHQL drivers would be an option in the control panel for hybrid support, off by default. Turn it on and you get a message: "NVIDIA doesn't support AMD/NVIDIA hybrid GPU configurations; you use this option at your own risk."

Everyone wins and is happy, but no. They purposely make it not work, just because they can, and even say it themselves that you need to hack the drivers to use your NVIDIA card.
#14
u2konline
DannibusX said:

Based on your system specs, you may not know the difference between hardware PhysX and its ilk.
That has nothing to do with anything.

DannibusX said:
Check out this video for a comparison between PhysX and non-PhysX.
After watching the video, it's nothing to be all hyped about; I never really noticed it in games. The only PhysX I noticed was in TimeShift, which looks great. But overall, PhysX, NVIDIA... I won't cry if the stuff is disabled in games. Nothing important to me.
#15
newtekie1
Semi-Retired Folder
OnBoard said:
Now, if it were true that the main reason for not allowing an ATI/NVIDIA combo is the QA work, then leave it enabled just in the beta drivers. Everyone would be happy enough; who needs support?

But that just isn't the truth. The only thing needed in the WHQL drivers would be an option in the control panel for hybrid support, off by default. Turn it on and you get a message: "NVIDIA doesn't support AMD/NVIDIA hybrid GPU configurations; you use this option at your own risk."

Everyone wins and is happy, but no. They purposely make it not work, just because they can, and even say it themselves that you need to hack the drivers to use your NVIDIA card.
I like this idea: disable it in the WHQL drivers and leave it enabled in the beta drivers. Sounds like a perfect solution; since they don't support beta drivers anyway, they don't have to worry about supporting something that might not work.
#16
bebbee
I hate these overpriced new GPUs.

A GTS 250 is still a fast, bang-for-the-buck card.

I am against all these new cards.
#18
Robert-The-Rambler
As long as we have a chance to at least try GPU PhysX

TRIPTEX_MTL said:
I really respect nvidia for this decision. Official support isn't a realistic expectation for ATI users, but the enthusiast community isn't afraid of beta software.
I'm happy with a use at your own risk policy. Hell, that rule applies every time I eat out. :toast:
#19
Loosenut
Robert-The-Rambler said:
I'm happy with a use at your own risk policy. Hell, that rule applies every time I eat out. :toast:
+1 Robert, you ain't ramblin' now... :toast:
#20
TheMailMan78
Big Member
Benetanegia said:
I don't know why so many people don't understand why Nvidia disables PhysX with a non-Nvidia card. It's just not profitable to ensure QA. Just because the hack (and in this case the unlocked beta drivers) works for the majority, that doesn't mean it works for everybody without a single problem (for instance, it won't work in Vista). Things that come from companies like Nvidia, Ati, Intel, etc. have to work 100%, or at least 99.9999999%, of the time. Plain and simple.

Someone somewhere will always be able to hack or mod something that works 99% of the time without spending excessive time and money on development, but they are free of responsibility if, for that 1% for which it doesn't work as it should, it breaks someone's PC. Companies have to ensure by law that it works in 100% of cases, and when it fails they have legal responsibility. It's that 1% that costs these companies (and this goes for any tech company, game developer, car vendor, whatever) a lot of money in QA, but they have to do it, because even something that seems as small as 1% is a very big number of people in real life, outside of enthusiast forums. A hack is used by very few people, which can literally translate to 99 people saying how well it works and only one person saying it broke his Windows installation. That person will be ignored, and people will think it works flawlessly, which in most cases is probably true, but not always. There's still the fact that it might NOT work in certain cases, because it has not been tested. If something untested were officially released and it didn't work for just 1% of people, that would still make more than a million failing cases, and that would make a lot of noise: class actions would be put in place, etc. I repeat, companies have to ENSURE it works flawlessly, and that costs a lot of money, not to mention access to tech and IP that the company might not have, like, for Nvidia, Southern Islands/Northern Islands. How are they supposed to ensure 100% interoperability when those cards are released? Average Joe will not understand if, for whatever reason, PhysX doesn't work on his shiny new card. Why is he supposed to wait 2 months to have something he already had working before?

In a sense, that's what is good about PC gaming and modding. Someone can make something and you can try it at your own risk. When I say "you", I mean an enthusiast, because Average Joe will not download it, and that's the difference. Average Joe won't download such a hack, but Average Joe will download an official release, Average Joe will try such an official release, and if it doesn't work, Average Joe will blame the company and go as far as taking legal action, because Average Joe knows much more about class actions than he knows about tech. And that's all, really. No company is willing to spend so much money making something work when it won't even work on most systems out there (Vista). Try explaining to Average Joe why something official works on XP or 7 but doesn't work on Vista... try...
You are too smart to buy into that PR crap from NVIDIA.

First of all, name one thing in your experience that you have NEVER had a problem with in the computer world. Even my case panel sometimes doesn't close as it should, but never once did I think "I'm going to file a class action lawsuit against Coolermaster because one screw doesn't ALWAYS line up!" I mean, really, both Nvidia and ATI have driver problems with SOMETHING in EVERY release. If they didn't, why would they keep releasing updates? Because something new or old isn't 100% compatible!

A lone joker in the hacking world created a decent mod that works 99% of the time. What do you think Nvidia's R&D crew could do? Please, this QA crap is just PR so they don't seem like greedy bastards. Not that it's a bad thing wanting to make money off what you paid for, but don't insult people's intelligence.
#21
xBruce88x
What about with my 9600GT in the lead and the HD3200 chipset? Well, I'll try the drivers out and let you guys know!
#22
Benetanegia
TheMailMan78 said:
You are too smart to buy into that PR crap from NVIDIA.

First of all, name one thing in your experience that you have NEVER had a problem with in the computer world. Even my case panel sometimes doesn't close as it should, but never once did I think "I'm going to file a class action lawsuit against Coolermaster because one screw doesn't ALWAYS line up!" I mean, really, both Nvidia and ATI have driver problems with SOMETHING in EVERY release. If they didn't, why would they keep releasing updates? Because something new or old isn't 100% compatible!

A lone joker in the hacking world created a decent mod that works 99% of the time. What do you think Nvidia's R&D crew could do? Please, this QA crap is just PR so they don't seem like greedy bastards. Not that it's a bad thing wanting to make money off what you paid for, but don't insult people's intelligence.
I don't buy any PR crap; I've been saying this for a long, long time, even before they said anything, because I know for a fact that things work that way. Not in the GPU or driver business, but I've been there, so I know what it's about. It doesn't matter if the QA is that important in the end, or whether it works at all; they have to do it, because in many countries it's obligatory. If they spent time and money and it doesn't work, no worries. But oh friend, if it doesn't work and no QA was done... be prepared.

And they just don't want to spend the money on QA for something that is not really in their hands. A lot of that QA has to be done on AMD's end, and they will just not do it. Even when only Nvidia cards are used, every PhysX driver update needs the latest GPU driver as well, or everything gets fucked up soon; that's something I have suffered from myself. So a mix of Ati and Nvidia is always going to be worse.

Now, the idea of allowing it in the beta... that could work, but there's still the fact that it would not work on Vista systems, and that's a nightmare to explain to Average Joe; it wouldn't be very different from the hack anyway. The hack probably has more support than the beta regarding Ati+Nvidia setups.
#23
Mussels
Moderprator
Benetanegia said:
I don't buy any PR crap; I've been saying this for a long, long time, even before they said anything, because I know for a fact that things work that way. Not in the GPU or driver business, but I've been there, so I know what it's about. It doesn't matter if the QA is that important in the end, or whether it works at all; they have to do it, because in many countries it's obligatory. If they spent time and money and it doesn't work, no worries. But oh friend, if it doesn't work and no QA was done... be prepared.

And they just don't want to spend the money on QA for something that is not really in their hands. A lot of that QA has to be done on AMD's end, and they will just not do it. Even when only Nvidia cards are used, every PhysX driver update needs the latest GPU driver as well, or everything gets fucked up soon; that's something I have suffered from myself. So a mix of Ati and Nvidia is always going to be worse.

Now, the idea of allowing it in the beta... that could work, but there's still the fact that it would not work on Vista systems, and that's a nightmare to explain to Average Joe; it wouldn't be very different from the hack anyway. The hack probably has more support than the beta regarding Ati+Nvidia setups.
The QA was already done... AGEIA PPUs worked with ATI, nVidia, SiS, Matrox, etc.
#24
Benetanegia
Mussels said:
The QA was already done... AGEIA PPUs worked with ATI, nVidia, SiS, Matrox, etc.
That was a looooong time ago. Yes, GPU drivers from 2005 worked on my Radeon 9600, and games from that era too, but try running them today... They may work in many cases, but you are surely going to find a lot of problems. As a company, Nvidia just wants to steer clear of any problems of that nature. Plain and simple.

The reason Nvidia is disabling PhysX is the same reason Ati discontinued the X1000 series of cards even while still selling them: there's a point at which a company doesn't want to spend more money on something that only a few people are going to use.
#25
Mussels
Moderprator
Benetanegia said:
That was a looooong time ago. Yes, GPU drivers from 2005 worked on my Radeon 9600, and games from that era too, but try running them today... They may work in many cases, but you are surely going to find a lot of problems. As a company, Nvidia just wants to steer clear of any problems of that nature. Plain and simple.
So come up with a toggle in the driver options to switch a video card from a GPU to a PPU/CUDA card, so that the video drivers turn off and only CUDA (and apps that use it) remain.

Set in a safeguard so that it can't be used if a monitor is connected to the card, and away you go, back to the Ageia days.


The only reason nvidia are doing this is because they've done so much dodgy shit disabling features in the name of PhysX (such as with Batman: AA) that people might find out *gasp* that, in fact, they just disable it on ATI even if PhysX is working.