Galaxy Designs GeForce GTX 480 with Dedicated PhysX GPU

As Steevo said, imagine this thing shutting down the 480 core for surfing/idle. Hmm..
 
Play all 16 hardware-accelerated games, pair it with the killer 5770, and have an entirely overblown computer that will get spanked by a machine costing half as much.
 
A bit more interesting than the regular GTX 480. Nice one, Galaxy. :)
 
They really can't find a way to sell those G2XX leftovers, eh? This is beyond silly. A GTX 480 doesn't need the add-on GPU for PhysX. PhysX is a sham anyway (there are, what, 15 games that actually run it on the GPU?), and this card is pointless.

Now, if this second chip were a Hydra chip like wahdangun said, it would've been quite an intriguing card.
 
There's no doubt that PhysX will never amount to much in the scheme of things, but adding some chips they would otherwise not sell, recouping some money on them, and getting sales by putting 'OMG PhysX!' on the side is good business practice. People will buy anything if it has enough exclamation marks on it.
 
"Good business practice" doesn't change the fact the card itself is silly and the fact that people will buy anything with enough exclamation marks on it doesn't mean it is a good product.
 
The GTX 480 is enough to run PhysX on its own. All the extra chip adds is heat and power draw.
 
Doesn't this go over the PCI-E power standard? The 480 is already close to 300 W on its own.
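For a rough sanity check, here is the back-of-the-envelope math as a short sketch, assuming the published TDPs (250 W for the GTX 480, about 69 W for a GT 240) and the standard PCI-E power limits (75 W from the slot, 75 W per 6-pin, 150 W per 8-pin):

```python
# Back-of-the-envelope PCI-E power budget for a GTX 480 + GT 240 combo.
# Assumes the reference GTX 480 connector layout: slot + one 6-pin + one 8-pin.

SLOT_W = 75        # PCI-E x16 slot limit
SIX_PIN_W = 75     # 6-pin PEG connector limit
EIGHT_PIN_W = 150  # 8-pin PEG connector limit

budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W  # 300 W total
draw = 250 + 69                            # GTX 480 TDP + GT 240 TDP

print(f"budget: {budget} W, naive draw: {draw} W, headroom: {budget - draw} W")
# budget: 300 W, naive draw: 319 W, headroom: -19 W
```

So unless Galaxy adds a second 8-pin or clocks something down, the naive sum does exceed the 300 W that the slot plus one 6-pin and one 8-pin officially provide.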
 
They should shut off the 480 core and memory when in 2D clocks and save a shitload of power. Then it would be worth it.

That is a very good remark.

Imagine this card in a gaming/HTPC setup...
 
That would be totally sweet, but sadly I don't think it is possible with the way the display logic is integrated into the GPU. It might have worked on the older GT200, with its separate display logic chip, though.
 
What if they used an Nvidia Optimus type of solution? The PC still sees the two separate GPUs, right? And each of them has its own power circuitry...
 
Actually, that might work: a Hybrid SLI type of setup. I'm not certain it would, but I'm sure some smart engineer could figure something out.
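Purely as a thought experiment, the switching policy could look like the sketch below: a hypothetical driver-side state machine (none of these names are a real Nvidia API) that leaves the small GPU driving the display and only powers up the GF100 when a 3D load appears:

```python
# Hypothetical Optimus/Hybrid-SLI-style switching policy. None of this is a
# real driver API; it only illustrates power-gating the big GPU at idle while
# the small GPU stays wired to the display outputs.

from dataclasses import dataclass

@dataclass
class Gpu:
    name: str
    powered: bool = True

BIG = Gpu("GF100 (GTX 480 core)")
SMALL = Gpu("GT215 (GT 240 core)")  # permanently drives the display

def on_load_change(render_load: float) -> Gpu:
    """Power-gate the GF100 below a 3D-load threshold; wake it above it."""
    THRESHOLD = 0.10  # illustrative cutoff between desktop and 3D work
    BIG.powered = render_load >= THRESHOLD
    return BIG if BIG.powered else SMALL

print(on_load_change(0.02).name)  # desktop idle -> GT215 (GT 240 core)
print(on_load_change(0.95).name)  # game running -> GF100 (GTX 480 core)
```

On laptops, Optimus handles the "display logic is in the big GPU" problem by having the big GPU copy finished frames into the display GPU's framebuffer over PCI-E, so the objection above wouldn't necessarily kill the idea.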
 
Wasn't there a benchmark with a GTX 470 and a dedicated PhysX GPU that showed no difference? What would the difference be here with a GTX 480? Pointless, IMO.
 
Well, the GT240 GPU is just stuck on there now; it will sell, and it's alright.

The GTX 480 is already silly buff at PhysX, so the extra GPU won't even change the performance numbers. Hopefully something will happen.

I'm pissed, frankly. I wanted a 512-core 480 that was a beast, just so I could look forward to buying a beast Nvidia card in the future. That's not going to happen, at least not right now. :mad:
 
I think what he is asking is whether the GT240 is fast enough to calculate PhysX without holding back the GTX480, the same way a weak CPU would hold back a strong GPU.

The issue is that people think PhysX takes a lot of computing power when it really doesn't, so a very basic card can handle it easily without limiting the more powerful graphics card.
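To make the "weak CPU holds back a strong GPU" analogy concrete, here is a toy frame-time model; the millisecond figures are made-up illustrations, not measurements:

```python
# Toy pipeline model: graphics and PhysX run in parallel each frame, so the
# slower stage sets the frame time. All numbers are illustrative.

def fps(graphics_ms: float, physics_ms: float) -> float:
    frame_ms = max(graphics_ms, physics_ms)  # slowest stage paces the frame
    return 1000.0 / frame_ms

GTX480_GRAPHICS_MS = 16.0  # hypothetical render time per frame

# PhysX on a dedicated GT 240: well under the render time -> no bottleneck.
print(fps(GTX480_GRAPHICS_MS, physics_ms=8.0))   # 62.5 fps

# PhysX on a single CPU core: far over the render time -> it sets the pace.
print(fps(GTX480_GRAPHICS_MS, physics_ms=45.0))  # ~22.2 fps
```

As long as the small card finishes its physics work inside the big card's render time, it cannot hold the big card back, which is the point being made above.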

And a CPU can do this too, although Nvidia doesn't like that. :P

I used the hack once, back when you couldn't use an Nvidia card for PhysX, but it was a total waste of power consumption, I think; I played maybe one game per year that would look better with PhysX...
Make it OpenCL-based and then both the CPU and the GPU can do it, as desired...
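For what it's worth, OpenCL already makes the "run it wherever you like" part easy at the device-selection level. A minimal sketch using the pyopencl bindings (assuming they are installed; there is no physics kernel here, just device selection):

```python
# Sketch: the same OpenCL kernel source can target CPU or GPU devices.
# Uses the pyopencl bindings; selection logic only, no physics kernel.

import pyopencl as cl

def pick_device(prefer_gpu: bool = True) -> cl.Device:
    """Return a GPU if preferred and present, otherwise fall back to a CPU."""
    wanted = cl.device_type.GPU if prefer_gpu else cl.device_type.CPU
    fallback = cl.device_type.CPU if prefer_gpu else cl.device_type.GPU
    for dev_type in (wanted, fallback):
        for platform in cl.get_platforms():
            try:
                devices = platform.get_devices(device_type=dev_type)
            except cl.RuntimeError:
                continue  # some platforms raise when no device of a type exists
            if devices:
                return devices[0]
    raise RuntimeError("no OpenCL devices found")

device = pick_device(prefer_gpu=True)
ctx = cl.Context(devices=[device])  # the same kernels would run either way
print(f"physics would run on: {device.name}")
```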
 
There is the option to have the CPU do PhysX in the PhysX control panel. With my X3220 (Q9650) @ 3.6GHz, the framerates are terrible when doing it on the CPU. Now, that is only using one core of the quad-core processor; I feel it probably would be able to handle it if the game developers went back and reworked it to use the multi-threaded capabilities nVidia added to PhysX recently. However, while someone with a quad like me might be able to handle PhysX on the CPU, I still don't think a lot of the dual-core users would.
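To illustrate what using those multi-threaded capabilities could buy, here is a toy decomposition of a physics step across worker threads. This is plain Python with made-up workloads, not the PhysX SDK, and Python's GIL makes it a structural sketch rather than a speed-up demo:

```python
# Toy illustration of spreading a physics step across several threads.
# Not the PhysX SDK; body counts and per-body work are made up. In a real
# engine these would be native threads, so the chunks would truly run in
# parallel on a quad-core.

from concurrent.futures import ThreadPoolExecutor
import math

BODIES = list(range(4096))  # pretend rigid bodies in a scene

def integrate(body: int) -> float:
    """Stand-in for one body's per-step physics work."""
    return math.sin(body) * 0.016  # dummy computation

def step_single_threaded() -> list[float]:
    return [integrate(b) for b in BODIES]

def step_multi_threaded(workers: int = 4) -> list[float]:
    # One chunk per core, the way a reworked game could feed a
    # multi-threaded PhysX step instead of one busy core.
    chunk = len(BODIES) // workers
    with ThreadPoolExecutor(max_workers=workers) as pool:
        parts = pool.map(
            lambda i: [integrate(b) for b in BODIES[i * chunk:(i + 1) * chunk]],
            range(workers),
        )
    return [x for part in parts for x in part]

if __name__ == "__main__":
    assert step_single_threaded() == step_multi_threaded()
    print("same results; a quad-core just spreads the work out")
```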

I really have to agree with you on PhysX not being worth the power consumption of a second card. My HD4890 + 9600GT was only used for Batman, and after that I removed the 9600GT and it sits on my desk... If it wasn't for the fact that I keep the 9600GT as a back-up card (which is why I bought it; I would not have bought it for PhysX), I would sell it.
 
Looks kinda like a Voodoo2. The only difference is that this can heat your home.
 