Friday, May 28th 2010

NVIDIA Removes Restriction on ATI GPUs with NVIDIA GPUs Processing PhysX

NVIDIA has reportedly removed the driver-level code that restricted users from having an NVIDIA GeForce GPU process PhysX while an ATI Radeon GPU takes the lead, processing graphics. Version 257.15 Beta of the GeForce drivers brought about this change. Commercial interests may have played a part in NVIDIA's previous decision to prevent the use of GeForce GPUs to process PhysX alongside ATI Radeon GPUs: users could buy an inexpensive GeForce GPU to go with a high-end DirectX 11 compliant Radeon GPU, thereby reducing NVIDIA's margins, though officially NVIDIA maintained that the restriction was in place for quality-assurance reasons. The present move also seems to have commercial interests in mind, as NVIDIA could clear inventories of older GeForce GPUs by selling them, at the very least, to users of ATI Radeon GPUs. NVIDIA replenished its high-end offering recently with the DirectX 11 compliant GeForce 400 series GPUs.

Update (28/05): A fresh report by AnandTech says that the ability to use a GeForce for PhysX in systems with graphics led by Radeon GPUs with the 257.15 beta driver is a bug, not a feature. This means the ability is a one-off for this particular driver version, and future drivers may not retain it.

Source: NGOHQ.com

276 Comments on NVIDIA Removes Restriction on ATI GPUs with NVIDIA GPUs Processing PhysX

#1
Robert-The-Rambler
That's not what I'm asking for

Mussels said:
CUDA can't, and never will, run on ATI. It's a hardware part of the GPU.

PhysX could be made to run on ATI Stream, but there is just no way in hell CUDA can run on ATI, nor could Stream run on NV.
Give me a dedicated PPU like when it all started, so I can still use my ATI cards as I always have and let the PPU do its own job. Make it work with your own video cards, or let specific video cards in your lineup, such as the GT 240, function as a PPU only, no matter what GPU you are using. At least try!!!
Posted on Reply
#2
Mussels
Moderprator
oh and just because i can...


if you two want to argue this stuff, at least get your facts right.

Please learn what these all are, and do:

CUDA
PhysX
Physics
Stream

once you've done that, come back, and make sure you don't screw things up. Nvidia doesn't do physics - they do PhysX. ATi doesn't do CUDA, and never will, and so on.
Posted on Reply
#3
DannibusX
Just to set the record straight, CUDA is part of nVidia's architecture, correct? When they acquired PhysX, they had it ported to CUDA so ATI couldn't use it, but it also made Ageia's PPUs obsolete.

CUDA is the reason you need an nVidia card, because that's what language PhysX speaks.

Is Stream ATI's version of, or answer to, CUDA?

Physics is math.

Just making sure it's all straight in my head.
Posted on Reply
#4
Mussels
Moderprator
DannibusX said:
Just to set the record straight, CUDA is part of nVidia's architecture, correct? When they acquired PhysX, they had it ported to CUDA so ATI couldn't use it, but it also made Ageia's PPUs obsolete.

CUDA is the reason you need an nVidia card, because that's what language PhysX speaks.

Is Stream ATI's version of, or answer to, CUDA?

Physics is math.

Just making sure it's all straight in my head.
STREAM and CUDA are both languages used to allow the GPU on a video card to perform non-3D tasks. ATI uses Stream, Nvidia uses CUDA.

is it possible for a CUDA app to be ported to STREAM and vice versa? yes, but no one's done it yet (probably due to legal reasons)
Posted on Reply
#5
DannibusX
Yar, I'm reading up on CUDA on Wikipedia right now, interesting stuff.

Ok, so no one will port CUDA to Stream, simply because nVidia would hammer them with lawsuits to protect their IP. Using an nVidia card to use CUDA isn't illegal because that's the reason you bought it, and you own the product.

Maybe nVidia is trying to protect themselves from liability by not supporting the GTX-for-PhysX front, in case someone seriously messes up their computer. They're not making it really hard for it to be hacked, though.
Posted on Reply
#6
Mussels
Moderprator
DannibusX said:
Yar, I'm reading up on CUDA on Wikipedia right now, interesting stuff.

Ok, so no one will port CUDA to Stream, simply because nVidia would hammer them with lawsuits to protect their IP. Using an nVidia card to use CUDA isn't illegal because that's the reason you bought it, and you own the product.

Maybe nVidia is trying to protect themselves from liability by not supporting the GTX for PhysX front, in case someone seriously messes up their computer. They're not making it really hard for it to be hacked.
you *can't* port CUDA to Stream.

that's like saying I'm gonna port OSX to Windows. Or run the GUI for an iPhone on my Samsung mobile phone. you can convert OSX apps to run in Windows (with a lot of work), but you can't make OSX run in Windows.


They're two different languages, maybe I should stick with that for examples.

You can translate a Japanese movie into English, and back and forth - but you can't make a movie in English and then expect Japanese speakers to understand it WITHOUT that translation.

You could write a translation layer that translates CUDA into STREAM, allowing CUDA apps to run on Stream, but it wouldn't be a perfect lineup, just like spoken languages don't translate perfectly. manual debugging still needs to be done, and performance would be far worse than doing it natively
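as a toy sketch of that translation-layer idea - the token mappings below are entirely made up for illustration, since real CUDA and Brook+/Stream code differ in far more than keywords:

```python
# Toy sketch of a source-to-source "translation layer": mechanically
# rewriting code from one GPGPU dialect to another. The target tokens
# here are invented, not real Stream/Brook+ syntax; real dialects differ
# structurally, which is why manual debugging would still be needed.

# Hypothetical one-to-one token substitutions (illustrative only).
TOKEN_MAP = {
    "__global__": "kernel",            # kernel entry-point qualifier
    "threadIdx.x": "instance()",       # per-work-item index
    "__syncthreads()": "syncGroup()",  # barrier within a work group
}

def translate(src: str) -> str:
    """Apply the token map; anything unmapped passes through untouched."""
    for cuda_tok, stream_tok in TOKEN_MAP.items():
        src = src.replace(cuda_tok, stream_tok)
    return src

cuda_kernel = "__global__ void add(float *c) { c[threadIdx.x] += 1.0f; }"
print(translate(cuda_kernel))
```

anything the map doesn't cover falls through unchanged - which is exactly the "not a perfect lineup" problem: the mechanical pass gets you part way, and a human fixes the rest.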


(ignore the whole VMware thing that someone's going to throw up in response to this - emulation isn't the point at hand)
Posted on Reply
#7
DannibusX
Mussels said:
you *can't* port CUDA to Stream.

that's like saying I'm gonna port OSX to Windows. Or run the GUI for an iPhone on my Samsung mobile phone. you can convert OSX apps to run in Windows (with a lot of work), but you can't make OSX run in Windows.



(ignore the whole VMware thing that someone's going to throw up in response to this - emulation isn't the point at hand)
I totally misread that post. For some reason my eyes skipped right over the "app" after Cuda.

Oh, I totally get it. Like I said, I misread, lol.
Posted on Reply
#8
Mussels
Moderprator
i think the language part at the bottom is easier to comprehend, i added that in after you quoted me.
Posted on Reply
#9
Robert-The-Rambler
Whatever....

Mussels said:
oh and just because i can...


if you two want to argue this stuff, at least get your facts right.

Please learn what these all are, and do:

CUDA
PhysX
Physics
Stream

once you've done that, come back, and make sure you don't screw things up. Nvidia doesn't do physics - they do PhysX. ATi doesn't do CUDA, and never will, and so on.
You did understand what I was saying, I hope. I know CUDA is what Nvidia uses to accelerate physics calculations, which have been called PhysX since Ageia was in the business, and ATI uses Stream to do stuff like assist in DVD interpolation in PowerDVD 9 Ultra. I've never seen Stream used in games thus far.

Physics is PhysX, right.

I think I'm done. :)
Posted on Reply
#10
Mussels
Moderprator
Robert-The-Rambler said:
You did understand what I was saying, I hope. I know CUDA is what Nvidia uses to accelerate physics calculations, which have been called PhysX since Ageia was in the business, and ATI uses Stream to do stuff like assist in DVD interpolation in PowerDVD 9 Ultra. I've never seen Stream used in games thus far.

Physics is PhysX, right.

I think I'm done. :)
no, you're still either screwing things up in your head, or just when you type them.

PhysX is one physics engine, made by Ageia and now owned by Nvidia. the words physics and PhysX are NOT interchangeable.

ATI's Stream and Nvidia's CUDA are the same kind of thing for their different hardware - a way to use the power of a GPU to perform non-3D tasks. anything doable on CUDA is doable on Stream, so long as it's coded natively for that platform.
Posted on Reply
#11
lyndonguitar
I play games
is a 9500GT enough for a PhysX card when used in a PCI-E x1 slot? I'm using it with a 5850
Posted on Reply
#12
DannibusX
PhysX is really interesting, I've never seen anything like the difference in Batman AA. Before it seemed like it was just a gimmick nVidia was trying to sling, but seeing it put into action so well, I like it a lot. I still like my ATI card, but I'll continue to use the hack to keep my PhysX card running. Of course, I'll have to say goodbye to it when I buy a second 5870 though. I don't have enough PCI-E slots.

Robert, PhysX is nVidia's proprietary physics engine. Physics is not PhysX.
Posted on Reply
#13
Mussels
Moderprator
DannibusX said:
PhysX is really interesting, I've never seen anything like the difference in Batman AA. Before it seemed like it was just a gimmick nVidia was trying to sling, but seeing it put into action so well, I like it a lot. I still like my ATI card, but I'll continue to use the hack to keep my PhysX card running. Of course, I'll have to say goodbye to it when I buy a second 5870 though. I don't have enough PCI-E slots.

Robert, PhysX is nVidia's proprietary physics engine. Physics is not PhysX.
Batman AA is a poor example. many of those 'hardware PhysX' features are available in the console versions... it's not so much that PhysX is used to make them work, it's more that they turned off non-PhysX things if a non-Nvidia card is detected.
Posted on Reply
#14
DannibusX
Lyndon, from what I've heard the 9500 is not a good card for PhysX. I use an 8800GT, as it has 112 shaders and 512MB of RAM. A lot of people recommend the 8800 as a minimum, but prefer a 9600 or a 9800.

Here's a thread with pretty in depth discussion on it:
http://forums.techpowerup.com/showthread.php?t=119217

Mussels said:
Batman AA is a poor example. many of those 'hardware PhysX' features are available in the console versions... it's not so much that PhysX is used to make them work, it's more that they turned off non-PhysX things if a non-Nvidia card is detected.
Meh, I've never played it on the console. I'm starting to become a PC gaming guy these days. All I ever built my machines for in the past was WoW, and I overkilled it. My Xbox gathers dust unless an exclusive title comes out I really want to play.
Posted on Reply
#15
Robert-The-Rambler
I still don't get what I'm missing

DannibusX said:
PhysX is really interesting, I've never seen anything like the difference in Batman AA. Before it seemed like it was just a gimmick nVidia was trying to sling, but seeing it put into action so well, I like it a lot. I still like my ATI card, but I'll continue to use the hack to keep my PhysX card running. Of course, I'll have to say goodbye to it when I buy a second 5870 though. I don't have enough PCI-E slots.

Robert, PhysX is nVidia's proprietary physics engine. Physics is not PhysX.
I gotta get to sleep. Maybe I'm just punchy. I bought into this stupid PhysX thing right from the start with a BFG PPU. I've sort of missed it since the PPU went poo-poo. Something is lost in translation here. PhysX is a PHYSICS engine, a way to translate CUDA-accelerated, CPU-accelerated, or in our dreams Stream-accelerated explosions and whatnot to enhance the gameplay experience. BTW, Shadowgrounds Survivor is an awesome CPU-PhysX-enhanced game that used to use the PPU and is real cheap on Steam. Knocking trees down is a lot of fun while shooting aliens.

But Mussels, I led you astray in my earlier post. I was merely trying to question why CUDA would be harder to do on an Nvidia GPU used as a PPU with an ATI card also present as the main GPU. Why isn't this being done? That is all I was trying to shout.

These posts have way too many acronyms. Good night all.
Posted on Reply
#16
Mussels
Moderprator
yes, PhysX is a physics engine. but there is more than one physics engine out there. you've been swapping the words back and forth in your messages, confusing the issue.

even again there you say something that makes no sense.
PhysX is a PHYSICS engine, a way to translate CUDA-accelerated, CPU-accelerated, or in our dreams Stream-accelerated explosions and whatnot to enhance the gameplay experience.
PhysX is a software physics engine designed to run on the CPU primarily, or be accelerated by a PPU or Nvidia GPU. It has nothing to do with translating anything.
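as a minimal illustration of what an engine like that actually computes each step - just gravity plus integration here, with every name and number invented for the example, and none of it real PhysX API:

```python
# Minimal sketch of one step of a rigid-body physics engine's core job:
# integrate velocity and position under gravity. Real engines (PhysX,
# Havok) layer collision detection, constraints, and solvers on top of
# this; on a GPU or PPU the same loop runs over thousands of bodies in
# parallel, which is why the hardware acceleration helps.
GRAVITY = -9.81  # m/s^2, downward

def step(pos_y, vel_y, dt):
    """Semi-implicit Euler: update velocity first, then position."""
    vel_y += GRAVITY * dt
    pos_y += vel_y * dt
    return pos_y, vel_y

# Drop a debris particle from 10 m and simulate one second in 0.01 s ticks.
pos, vel = 10.0, 0.0
for _ in range(100):
    pos, vel = step(pos, vel, 0.01)
print(pos, vel)
```

the engine runs on the CPU by default; offloading it just means running that per-body loop on a PPU or GPU instead - no translation involved.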
Posted on Reply
#17
Robert-The-Rambler
Substitute Bring For Translate

Mussels said:
yes, PhysX is a physics engine. but there is more than one physics engine out there. you've been swapping the words back and forth in your messages, confusing the issue.

even again there you say something that makes no sense.



PhysX is a software physics engine designed to run on the CPU primarily, or be accelerated by a PPU or Nvidia GPU. It has nothing to do with translating anything.
I think you were being a bit too literal. Anyway hope we all learned something......
Posted on Reply
#18
wahdangun
Mussels said:
yes, PhysX is a physics engine. but there is more than one physics engine out there. you've been swapping the words back and forth in your messages, confusing the issue.

even again there you say something that makes no sense.



PhysX is a software physics engine designed to run on the CPU primarily, or be accelerated by a PPU or Nvidia GPU. It has nothing to do with translating anything.
i think what Robert meant was: why make it harder to use PhysX when an ATI card is present? Robert even said he has a BFG PPU (Ageia), but he can't use it anymore because of this restriction.


and btw i think this is an illegal move, we bought the card because of this feature. i hope the EU can step in just like they did with Linux on the PS3
Posted on Reply
#19
Mussels
Moderprator
oh i agree that limiting them (especially the PPUs) is illegal.

but there's nothing we can do about that here.
Posted on Reply
#20
Wile E
Power User
newtekie1 said:
It isn't surprising, and btarunner pretty much hit the nail right on the head: this is done so they can clear out old, weaker GPUs as PhysX cards. A quick trip over to eVGADIA... I mean eVGA... and they have huge adverts on the main page that say "GT 240 Makes a great Dedicated PhysX card!" and "Maximize your gaming experience with a PhysX card!" They aren't even calling them graphics cards at this point, they are simply referring to them as PhysX cards... :laugh:



You've got a PCI-E x1 slot... buy a Dremel and a super cheap $50 9800GT and make yourself a PCI-E x1 PhysX card. I recently chopped a PCI-E x16 card down to fit in a board with no x16 slot, and I was actually surprised at how easy it really was. I was afraid at first and thought it would be hard, but really it wasn't.
I'd rather just file the back of the PCIe slot to be open.

newtekie1 said:
ATi has always been a direct competitor to nVidia.

However, nVidia started making AMD chipsets long before AMD bought ATi. It has only been recently that AMD became a direct competitor by buying ATi, and it doesn't make sense for nVidia to just shut down their entire chipset division because of it.



The hardware-accelerated parts of PhysX definitely are underused, and reduced to useless eye candy.

However, the software parts of PhysX, which run on the CPU like Havok, tend to be what makes the game playable and gives it anything moveable that interacts with the player.

I would really like to see PhysX used to its full potential in games, with fully destructible environments, but sadly no developer will ever do that unless every gamer can use it. This means we will never see it unless PhysX runs on ATi hardware, or at least runs on a cheap nVidia card with an ATi card as the main GPU.
Or they port PhysX over to OpenCL or DirectCompute.

cadaveca said:
nV just needs to start selling G92 without display connections, and a blank backplate, to be specifically used as a Phys-X card. I fail to understand why they have not done this yet...
I've been thinking that, too. Then, to eliminate all compatibility issues with gfx drivers, have it listed as a co-processor in the OS.
Posted on Reply
#21
Wile E
Power User
Oh, and can somebody PM me a link to the unblocked driver for Win 7 x64?
Posted on Reply
#23
zAAm
RejZoR said:
I know, but SLi and Crossfire can be done. So it's not entirely locked. Unless it's limited to one model only even if you have more of them. Unlike PhysX where you'll have a different kind of adapter.
No, it's locked. Like I said, only one display driver. SLI uses two or more NVIDIA cards and Crossfire uses two or more ATI cards - which means all of them use the same driver. Which is why it'll work on Vista. Once you start mixing brands with Vista = no go. ;)
Posted on Reply
#24
KainXS
I can only hope nvidia gets some brains and starts trying to sell their GT21x cards with no backplate or connections instead of the G92s, because they don't use much power at all and have better compute capabilities. They have the chance to make a lot of money and try to redeem PhysX, but for some reason I think they will lock everything up by the next release. :laugh:
Posted on Reply
#25
Helper
wahdangun said:
Robert even said he has a BFG PPU (Ageia), but he can't use it anymore because of this restriction.
Actually, you can still use an Ageia PPU along with any other GPU. But you have to get the old PhysX pack from 2008, the one from before they put up the restriction and stopped supporting it. It'll have its own blue control panel and settings to test the PPU etc., better than Nvidia's integration. :) Kinda cute, but doesn't really justify wasting a PCI slot IMO, LOL.

Anyway, it was obvious they didn't "remove" the restriction. They're Nvidia and that won't happen. The block was at the level of Nvidia's control panel, in the SLI & PhysX settings: when used with a non-Nvidia GPU, the PhysX option disappeared. Now that they changed the UI on that page to a new one in 257.15, the block went away in the first-rev beta drivers. They forgot to put it back, LOL. Now I wonder if I can do SLI on Server 2003 without changing it to XP x64... maybe tri-way SLI, or how about Quad SLI on an XP-based OS? Is there ANY reason why I can't, other than their policy? Man, Nvidia is stupid...
Posted on Reply