Friday, March 20th 2009

AMD to Demonstrate GPU Havok Physics Acceleration at GDC

GPU-accelerated physics is turning out to be the one item on the specifications sheet AMD is yearning for. One of NVIDIA's most profitable acquisitions in recent times has been that of Ageia Technologies and its PhysX middleware API. NVIDIA went on to port the API to its proprietary CUDA GPGPU architecture, and now uses it as a significant PR tool, apart from being a feature that is genuinely grabbing game developers' attention. In response to this move, AMD's initial reaction was to build a strategic technology alliance with the main competitor of PhysX: Havok, despite Havok's acquisition by Intel.

At the upcoming Game Developers Conference (GDC), AMD may materialize its plans to bring out a GPU-accelerated version of Havok, which has until now been CPU-based. The API has featured in several popular game titles such as Half-Life 2, Max Payne 2, and other Valve Source-based titles. ATI's Terry Makedon has revealed on his Twitter feed that AMD will put forth its "ATI GPU Physics strategy." He added that the company will present a tech demonstration of Havok technology working in conjunction with ATI hardware. The physics API is expected to utilize OpenCL and AMD Stream.
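For context on what "GPU-accelerated physics" means in practice: engines like this map each particle or rigid body to one GPU work-item and integrate all of them in parallel every frame. As a rough, hypothetical illustration (not AMD's or Havok's actual code; the function and constant names are invented), here is the kind of per-particle update such an OpenCL kernel would perform, sketched in plain Python over arrays:

```python
# Hypothetical sketch of the per-frame update a GPU physics kernel performs:
# semi-implicit Euler integration applied independently to every particle.
# On the GPU, each loop iteration below would be one OpenCL work-item.

GRAVITY = -9.81  # m/s^2, applied on the y axis
GROUND_Y = 0.0   # simple ground-plane collision

def step_particles(positions, velocities, dt):
    """Advance a list of (x, y) particles by one timestep of dt seconds."""
    new_pos, new_vel = [], []
    for (x, y), (vx, vy) in zip(positions, velocities):
        vy += GRAVITY * dt                 # accumulate gravity
        x, y = x + vx * dt, y + vy * dt    # integrate position
        if y < GROUND_Y:                   # trivial collision response
            y, vy = GROUND_Y, -0.5 * vy    # bounce with 50% restitution
        new_pos.append((x, y))
        new_vel.append((vx, vy))
    return new_pos, new_vel

positions = [(0.0, 10.0)]
velocities = [(1.0, 0.0)]
positions, velocities = step_particles(positions, velocities, 0.1)
print(positions)  # the particle has moved right and fallen slightly
```

Because every particle's update is independent, the loop parallelizes trivially, which is exactly what makes this workload a good fit for the hundreds of stream processors on a GPU.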

Source: bit-tech.net

226 Comments on AMD to Demonstrate GPU Havok Physics Acceleration at GDC

#1
alexp999
Staff
I didn't think PhysX did character models? Just environments and fluids?

EDIT:

Just read the PhysX page; seems they do. Though I have never seen it implemented.

I still don't get the cloth PhysX though. Does anyone actually think it looks real :confused:
Posted on Reply
#2
DarkMatter
Sorry, but some comments in this thread are just stupid. And TBH I call BS on them and trolling up to this point. Sorry, because I'm talking about people who have been on TPU for a long time, but I'm just amazed at how much people can talk about (and bash) a thing without even knowing what it does. PhysX has no collision detection? My GOD! No character physics and ragdoll? :banghead: Of course it does all those things FFS!! Requiring additional hardware? NO!! (Only for massive physics; you don't need it for a very small number of particles, or 50-100 boxes)

You guys are talking too much and you never saw a single PhysX demonstration!! PhysX has everything Havok has ever had, plus many other things like the ones people are mentioning here that they want, like real fluids and massive physics (I suggest you see a pair of demos)... You could have had ALL those things implemented since 2006 if Intel had not tried so hard to ban it from games (yeah, even before the acquisition it was in their interest) OR if you hadn't asked so passionately to ban hardware physics from games. If what you truly wanted is all that, you could have asked AMD to support it instead of bashing a product you know NOTHING about. :shadedshu

EDIT: BTW, examples of PhysX running in software mode (and on crappy console CPUs) are Gears of War and Mass Effect. I don't know about you, but I would say those two games have amazing physics.
Posted on Reply
#3
FordGT90Concept
"I go fast!1!11!1!"
Mass Effect runs on Unreal Engine 3. Unreal Engine 3 has core physics coded by James Golding, and it also offers support for NVIDIA PhysX as middleware. It is present, but that doesn't mean they have to use it.


Still, I have beaten Mass Effect probably 3-5 times already, and not once have I thought to myself "this game looks pretty" or "those are nice physics." Actually, I scolded the physics a few times when a Krogan gets biotic-lifted on Feros and falls down under a scaffolding where he can't be killed. Not once did I praise the physics or graphics because, frankly, I couldn't care less about them.
Posted on Reply
#4
ShadowFold
DarkMatter said:
Sorry, but some comments in this thread are just stupid. And TBH I call BS on them and trolling up to this point. Sorry, because I'm talking about people who have been on TPU for a long time, but I'm just amazed at how much people can talk about (and bash) a thing without even knowing what it does. PhysX has no collision detection? My GOD! No character physics and ragdoll? :banghead: Of course it does all those things FFS!! Requiring additional hardware? NO!! (Only for massive physics; you don't need it for a very small number of particles, or 50-100 boxes)

You guys are talking too much and you never saw a single PhysX demonstration!! PhysX has everything Havok has ever had, plus many other things like the ones people are mentioning here that they want, like real fluids and massive physics (I suggest you see a pair of demos)... You could have had ALL those things implemented since 2006 if Intel had not tried so hard to ban it from games (yeah, even before the acquisition it was in their interest) OR if you hadn't asked so passionately to ban hardware physics from games. If what you truly wanted is all that, you could have asked AMD to support it instead of bashing a product you know NOTHING about. :shadedshu

EDIT: BTW, examples of PhysX running in software mode (and on crappy console CPUs) are Gears of War and Mass Effect. I don't know about you, but I would say those two games have amazing physics.
So where are these AWESOME games that have these awesome PhysX demonstrations? The only game that made me go "hm, nice physics" was Half-Life 2. When I had my 280, nothing with PhysX was any good.

And I have Mass Effect, it's one of my favorite games.
Posted on Reply
#5
imperialreign
ShadowFold said:
So where are these AWESOME games that have these awesome PhysX demonstrations? The only game that made me go "hm, nice physics" was Half-Life 2. When I had my 280, nothing with PhysX was any good.

And I have Mass Effect, it's one of my favorite games.
The only game I can recall that has actively used the new PhysX implementation is Mirror's Edge.

Havok, though, has been the mainstream physics engine for absolute ages . . . and both ATI/AMD and Intel have supported it in the past (and still do). Some of the more popular titles using Havok:

FEAR
FEAR 2
Thief: Deadly Shadows
Timeshift
Assassin's Creed
Bioshock
Company of Heroes
Fallout 3
Half-Life 2
Halo 3
StarCraft II
Diablo 3
Ghost Recon: Advanced Warfighter 2

as well as:

Futuremark 3DMark05
Futuremark 3DMark06
Futuremark 3DMark Vantage

and countless other mainstream titles . . . PhysX itself, based on Ageia's engine, is good . . . but it's not as heavily supported as Havok.

Thing is, if AMD and Intel can come together and start agreeing on implementation of the Havok engine (which, IIRC, Intel bought back in '07), they could quickly and easily drive nVidia out of the physics market . . . Havok is used across both console and PC platforms, and has the bigger market dominance over PhysX. The only thing nVidia has going for them, in regard to their implementation, is their large GPU dominance . . . but Intel and AMD working together could quickly drive them out.
Posted on Reply
#6
hat
Enthusiast
Great... Havok on AMD GPUs, PhysX on NVIDIA GPUs... now what am I supposed to do? I got all excited because I could set my IGP to run PhysX while my 9800GT focuses its undivided attention on my games, but now this comes out. Sigh.
Posted on Reply
#7
Lillebror
imperialreign said:
...Futuremark 3DMark Vantage...
There is a reason why people with nvidia cards score so high ;) It uses PhysX to offload the physics to the CPU - or to the GPU if it's enabled!
Posted on Reply
#8
imperialreign
Lillebror said:
There is a reason why people with nvidia cards score so high ;) It uses PhysX to offload the physics to the CPU - or to the GPU if it's enabled!
There has been a lot of debate over that in the past . . . namely, over whether or not the PhysX scores are legitimate . . .

Even still - although nVidia might be the leader in the GPU market . . . if AMD and Intel ever collaborate and push Havok further, nVidia and their monolithic hardware wouldn't stand a chance in the physics market against the two.

But that all hinges on AMD and Intel ever deciding to work together on Havok implementation.
Posted on Reply
#9
FordGT90Concept
"I go fast!1!11!1!"
hat said:
Great... Havok on AMD GPUs, PhysX on NVIDIA GPUs... now what am I supposed to do? I got all excited because I could set my IGP to run PhysX while my 9800GT focuses its undivided attention on my games, but now this comes out. Sigh.
I think Intel and AMD will be releasing processors with dedicated PPUs on-die (or GPUs that can act as a physics processor). When not used for physics, it could be used for something else.

That's another reason why NVIDIA feels threatened and was starting to talk about making their own x86 CPU.
Posted on Reply
#10
Error 404
PhysX requires 256 MB of RAM and 16 SPUs on an 8x00 series card or higher, at minimum, to work, IIRC.
My 9600 GT has 64 SPUs, so if I enable PhysX on it, that leaves at most 48 SPUs for graphics. That is not good!
ATI cards have up to 800 SPUs. How many of those would Havok-based physics require to run properly? 40, 80, 200? I'd guess 80, because that's what their lowest-end cards usually have. Any other ideas on the SPU count required for this?
Posted on Reply
#11
Kursah
Error 404 said:
PhysX requires 256 MB of RAM and 16 SPUs on an 8x00 series card or higher, at minimum, to work, IIRC.
My 9600 GT has 64 SPUs, so if I enable PhysX on it, that leaves at most 48 SPUs for graphics. That is not good!
ATI cards have up to 800 SPUs. How many of those would Havok-based physics require to run properly? 40, 80, 200? I'd guess 80, because that's what their lowest-end cards usually have. Any other ideas on the SPU count required for this?
The way SPUs are counted between ATI and NV is different; there's a breakdown of it on TPU and on the web. Not that big of a deal. Both took slightly different routes on SPs and types of SPs, which is good IMO; both have shown that each route is quite capable.

I think it's very cool to see Havok getting support like this. What I'd really like to see is the two compared in the same game via a middleware patch or something: show the differences, show the effects of each engine, etc. I think Havok is great stuff since it's been used so long, but I don't know enough about it to know just how well it will work for more realistic games in the future... same with PhysX, though. While it is neat, it's not widely used, and I don't really care either way yet, because quite a few games use CPU-driven proprietary physics engines written for that specific game, and those work fine. Though if we could see a blend of PhysX/Havok, that could be something truly worth having around; that'd be the way to go.

As far as AMD and Intel making Havok a standard, it could happen... whether it will... we'll find out within the next couple of years, I believe. Nonetheless, it's not worth making a big deal out of till there's a big deal to be made from results, IMO. I want to see AMD/ATI cards with physics support on the end-user side, like NV has had for PhysX for months, to make my own judgement... will you notice a difference in HL2 or any other game that uses Havok, with a newer processor being offloaded and the GPU being loaded more? It could be more negative than good depending on how it's executed and just what's going on in the particular scene, I suppose... I'll wait and not really worry about it till there's something more solid out there for end-users.

:toast:
Posted on Reply
#12
DarkMatter
FordGT90Concept said:
Mass Effect runs on Unreal Engine 3. Unreal Engine 3 has core physics coded by James Golding, and it also offers support for NVIDIA PhysX as middleware. It is present, but that doesn't mean they have to use it.


Still, I have beaten Mass Effect probably 3-5 times already, and not once have I thought to myself "this game looks pretty" or "those are nice physics." Actually, I scolded the physics a few times when a Krogan gets biotic-lifted on Feros and falls down under a scaffolding where he can't be killed. Not once did I praise the physics or graphics because, frankly, I couldn't care less about them.
In those games PhysX is the physics engine in use.
Anyway, did you praise the ones in Oblivion?? You can't blame an engine for how it has been used in a game...

ShadowFold said:
So where are these AWESOME games that have these awesome PhysX demonstrations? The only game that made me go "hm, nice physics" was Half-Life 2. When I had my 280, nothing with PhysX was any good.

And I have Mass Effect, it's one of my favorite games.
I have said it already. There's almost no game using it to its full extent because:

a) Intel and AMD have tried so hard to ban PhysX from games.

b) Of the comments from so many people along the lines seen here. If developers see that people don't care about physics, they will not spend their time implementing anything.

My comment was not for those who don't care about physics (good for them); it's for those who seem to want better physics and at the same time are bashing PhysX, which has been delivering exactly what they wanted since its creation, but could never be implemented because of the points above.

And my post was aimed directly at those spilling BS claiming that PhysX can't do this or that. It can do everything Havok can do on the CPU and much, much more when on the GPU (until now, we'll see). I'm in no way saying this Havok GPU implementation is worse than PhysX, but I can almost say it won't be better either. Thing is, we don't know.

DON'T expect this other implementation to be adopted more widely than PhysX, as it will face the same problems, unless Intel really wants it implemented, which would be very suspicious. It's coming 1-2 years later, so it will take time nevertheless.

All in all, my post was regarding the BS about PhysX (that it is flawed, has no collision, etc.), and not saying it's any better than other engines. GPU physics is much better than any CPU-based physics, and PhysX is just a very good engine that has already proven itself. On the other hand, this Havok implementation still needs to demonstrate that it has what it takes. Yet all of you are already praising it as if it were the Godsend and at the same time bashing PhysX with clueless allegations. I wonder if it has anything to do with who is releasing it?? :rolleyes:

I don't care whether it's PhysX or Havok or any other physics implementation that wins, but I want it NOW, and PhysX is the only one that can do it right now. That's why I support it and why I have always supported it, not because of who it belongs to. On the other hand, the bias most of you guys have is pretty clear. GPU physics was a waste of time until yesterday, but it just takes one news post to make it the best thing ever, and now everybody wants massive physics, fluids and whatnot. That is, the same things Ageia was doing 4 years ago and Nvidia has been capable of since the acquisition, but this time in the hands of someone else. You are not happy because this is an open standard, because it's not; nor because it's free for developers, because it's not; nor because it's a better implementation, because you don't know. You are happy because it's AMD, period. And that's plainly and simply biased.

Just to finish, tell me which PhysX demos you have seen, because it's pretty clear to me that you haven't seen any. There are tons of videos on YouTube if you can't watch them directly on an Nvidia GPU.
Posted on Reply
#13
TheMailMan78
Big Member
Ok I don't care about this debate. When will I see some drivers? I have a 4850 just itching to do some physics processing. :rockout:
Posted on Reply
#14
DarkMatter
imperialreign said:
Some of the more popular titles using Havok:

...
Ghost Recon: Advanced Warfighter 2

Futuremark 3DMark Vantage
Excuse me???
Thing is, if AMD and Intel can come together and start agreeing on implementation of the Havok engine (which, IIRC, Intel bought back in '07), they could quickly and easily drive nVidia out of the physics market . . . Havok is used across both console and PC platforms, and has the bigger market dominance over PhysX. The only thing nVidia has going for them, in regard to their implementation, is their large GPU dominance . . . but Intel and AMD working together could quickly drive them out.
That is completely true, but there's nothing good about that. What do you think will happen when PhysX (or Nvidia) is out of the game? Intel will eat AMD with some fish and chips, altogether. AMD is giving Intel the keys to the gaming and GPU markets and Intel will be second to none; AMD can't afford that luxury, at least if they give them such advantages. It's funny, because people think it was smart for AMD not to adopt PhysX because it belonged to Nvidia, but now them supporting Intel's Havok is the best thing ever? And the thing is that the company against which AMD has filed lawsuits for unfair competition is Intel, not Nvidia. Also, while AMD has released many competent CPUs that were just as fast and sometimes faster in the business market, or in mainstream programs, it's been almost invariably lagging behind in games. I wonder why...
Posted on Reply
#15
ShadowFold
DarkMatter said:
In those games PhysX is the physics engine in use.
Anyway, did you praise the ones in Oblivion?? You can't blame an engine for how it has been used in a game...



I have said it already. There's almost no game using it to its full extent because:

a) Intel and AMD have tried so hard to ban PhysX from games.

b) Of the comments from so many people along the lines seen here. If developers see that people don't care about physics, they will not spend their time implementing anything.

My comment was not for those who don't care about physics (good for them); it's for those who seem to want better physics and at the same time are bashing PhysX, which has been delivering exactly what they wanted since its creation, but could never be implemented because of the points above.

And my post was aimed directly at those spilling BS claiming that PhysX can't do this or that. It can do everything Havok can do on the CPU and much, much more when on the GPU (until now, we'll see). I'm in no way saying this Havok GPU implementation is worse than PhysX, but I can almost say it won't be better either. Thing is, we don't know.

DON'T expect this other implementation to be adopted more widely than PhysX, as it will face the same problems, unless Intel really wants it implemented, which would be very suspicious. It's coming 1-2 years later, so it will take time nevertheless.

All in all, my post was regarding the BS about PhysX (that it is flawed, has no collision, etc.), and not saying it's any better than other engines. GPU physics is much better than any CPU-based physics, and PhysX is just a very good engine that has already proven itself. On the other hand, this Havok implementation still needs to demonstrate that it has what it takes. Yet all of you are already praising it as if it were the Godsend and at the same time bashing PhysX with clueless allegations. I wonder if it has anything to do with who is releasing it?? :rolleyes:

I don't care whether it's PhysX or Havok or any other physics implementation that wins, but I want it NOW, and PhysX is the only one that can do it right now. That's why I support it and why I have always supported it, not because of who it belongs to. On the other hand, the bias most of you guys have is pretty clear. GPU physics was a waste of time until yesterday, but it just takes one news post to make it the best thing ever, and now everybody wants massive physics, fluids and whatnot. That is, the same things Ageia was doing 4 years ago and Nvidia has been capable of since the acquisition, but this time in the hands of someone else. You are not happy because this is an open standard, because it's not; nor because it's free for developers, because it's not; nor because it's a better implementation, because you don't know. You are happy because it's AMD, period. And that's plainly and simply biased.

Just to finish, tell me which PhysX demos you have seen, because it's pretty clear to me that you haven't seen any. There are tons of videos on YouTube if you can't watch them directly on an Nvidia GPU.
Tl;dr
Show me some games with full PhysX utilization and maybe I'll think it's OK, but for the time being it's a dead engine.
Posted on Reply
#16
TheMailMan78
Big Member
DarkMatter said:
What do you think will happen when PhysX (or Nvidia) is out of the game? Intel will eat AMD with some fish and chips, altogether. AMD is giving Intel the keys to the gaming and GPU markets and Intel will be second to none; AMD can't afford that luxury, at least if they give them such advantages.
The world losing PhysX will not mean "game over" for AMD or Nvidia. As far as Nvidia being shut down, I don't think we have anything to worry about. It's not like the only thing keeping them alive is PhysX.

Also, Intel CANNOT eat AMD with some "fish and chips". If they could, they would have already. Intel would love nothing more than to be the undisputed king of the hill. AMD taking PhysX on with Havok is just good old competition.

Now where's my damn drivers?
Posted on Reply
#17
FordGT90Concept
"I go fast!1!11!1!"
DarkMatter said:
In those games PhysX is the physics engine in use.
Anyway, did you praise the ones in Oblivion?? You can't blame an engine because of how it has been used in a game...
Oblivion's only strong suit is length of gameplay and the voice acting. If you do everything there is to do in the game with official mods and the expansion pack, you can easily break 100 hours of gameplay. The mechanics of the game weren't really notable (movement seemed a bit awkward, all maps were pretty dumbed down/repetitive, combat is pretty bland and repetitive, etc.).

The only game I'd say that had notably good physics is Freelancer (Havok engine). When you get hit by those disorientation mines, holy $h!t. I can't say any other game impressed me in regard to physics.

The only game that impressed me in regards to graphics was X3: Reunion. It was just awesome getting close to a capital ship and seeing all the details on its surface. They did a brilliant job there and yet, it still ran well on lowly hardware. I am more impressed by them taking the time to really get it right (the models/textures) more so than the "eye-candy."
Posted on Reply
#18
DarkMatter
TheMailMan78 said:
The world losing PhysX will not mean "game over" for AMD or Nvidia. As far as Nvidia being shut down I don't think we have anything to worry about. Its not like the only thing keeping them alive is PhysX.

Also Intel CANNOT eat AMD with some "fish and chips". If they could they would have already. Intel would love nothing more than to be the undisputed king of the hill. AMD taking PhysX on with Havok is just good old competition.

No wheres my damn drivers?
By eat alive I meant that Intel would have >90% of the market share in both the CPU and GPU markets. They don't want AMD to disappear.

On the contrary, it's in Intel's best interest to keep AMD alive, but with the smallest market share possible. Intel could crush AMD whenever they liked. Their CPUs are cheaper to make, so they can actually sell them cheaper, and everybody knows they are faster. That is especially true every time they release a new batch on a lower fab process. When 32nm parts are released, they could put the new processors at a price that AMD would never survive, but as I said, they will never do it, because it's better to have a weak enemy you already know than to let a new player enter (and most probably that new player would acquire AMD just in time anyway).

The only reason there are no other relevant companies in the market is that there's only ever room for two: the leader (which usually offers the best, but at a price) and the alternative to the leader, which is the cheaper option. If a third tries to enter a market, it has to be significantly better than that alternative while staying cheap, or it will never take off. Why? Because most people want products from the leader, and if they can't afford them, they will always pick the cheap alternative they already know; very few will take the cheap, slow and NEW alternative. It's hard to make a new product, so you will rarely make a better product than the incumbents, and because you are new, you will never get enough revenue to keep up with the other two.

AMD is the shield Intel has against other companies that might want to enter the market, even something like IBM. IBM doesn't need to enter the consumer market, and it's not in their best interest to fight against Intel and AMD there. They would be third, even though they are IBM; but without AMD there would be a hole that IBM could very easily fill, and once they entered and obtained AMD's current market share, they could do a lot of things to compete, things that AMD can't do because it is so small.

And apart from that, there is the fact that Intel could face issues regarding monopoly if AMD no longer existed and no other company took over. They could be forced to make x86 free for all, for example.
Posted on Reply
#19
DarkMatter
FordGT90Concept said:
Oblivion's only strong suit is length of gameplay and the voice acting. If you do everything there is to do in the game with official mods and the expansion pack, you can easily break 100 hours of gameplay. The mechanics of the game weren't really notable (movement seemed a bit awkward, all maps were pretty dumbed down/repetitive, combat is pretty bland and repetitive, etc.).

The only game I'd say that had notably good physics is Freelancer (Havok engine). When you get hit by those disorientation mines, holy $h!t. I can't say any other game impressed me in regard to physics.

The only game that impressed me in regards to graphics was X3: Reunion. It was just awesome getting close to a capital ship and seeing all the details on its surface. They did a brilliant job there and yet, it still ran well on lowly hardware. I am more impressed by them taking the time to really get it right (the models/textures) more so than the "eye-candy."
Physics in Oblivion were crappy and they used Havok; that was my only point there. Other games have amazing physics and they use Havok. It's irrelevant which engine you use as long as you use it well. There were tons of crappy games using Unreal Engine 2 that even ran slow and had bad graphics (Postal 2, anyone?), but that doesn't make UE2 a bad engine. On the contrary, it was amazing. PhysX is the same. It lacks support, and it's because of that you don't see games using it. It has nothing to do with how good the engine is.

Crysis has good physics, and the games that use GPU-accelerated PhysX have very good physics too. If GPU Havok is well implemented, it will also offer good physics with a good amount of integration, but I am still skeptical about why Intel would let AMD make Intel's CPUs look like crap at handling Intel's own physics engine. IMO there's something shady there, or this GPU Havok is nothing more than a PR stunt. I vote for the latter.
Posted on Reply
#20
Mussels
Moderprator
PhysX demos are not PhysX games.

Two PhysX items can collide and have merry fun with each other - but non-PhysX entities can't collide. Mirror's Edge as a loose example - you can shoot cloth and have holes appear in it, but you can't go walking on said cloth, or drop a gun on it and expect it to stay there on the realistic *appearing* cloth.
Posted on Reply
#21
DarkMatter
Mussels said:
PhysX demos are not PhysX games.

Two PhysX items can collide and have merry fun with each other - but non-PhysX entities can't collide. Mirror's Edge as a loose example - you can shoot cloth and have holes appear in it, but you can't go walking on said cloth, or drop a gun on it and expect it to stay there on the realistic *appearing* cloth.
:confused: What the hell are you saying? If you make the character walk on a cloth item, the cloth reacts to your body as it would in real life. I have not tried throwing the gun, but if it doesn't react, that's because it hasn't been declared as a physics item in the game. Under PhysX, pretty much every object is a physics object, in the sense that it has all the properties a real object would have.
Posted on Reply
#22
Mussels
Moderprator
DarkMatter said:
:confused: What the hell are you saying? If you make the character walk on a cloth item, the cloth reacts to your body as it would in real life. I have not tried throwing the gun, but if it doesn't react, that's because it hasn't been declared as a physics item in the game. Under PhysX, pretty much every object is a physics object, in the sense that it has all the properties a real object would have.
No it's not. PhysX is CAPABLE of it, but they simply don't do it. I'll try and explain it more simply.

Path 1: Make the game use a generic physics engine for people without CUDA (old NV cards, ATI) - PhysX does as little as possible in this case, so that they don't need to duplicate any coding (two physics engines for the same items) - that's when you have items that don't collide together.

Path 2: Make two engines coded for everything. When PhysX is enabled, everything moves over, and everything can interact with everything else.


If you were strapped for cash and time as a game developer with an unknown, brand-new concept for a game... which would you take?
Posted on Reply
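The two paths described above boil down to whether a developer maintains one lowest-common-denominator physics path or two full ones selected by a runtime capability check. A hypothetical sketch of that selection pattern (all class and function names here are invented for illustration, not from any real engine):

```python
# Hypothetical sketch of "Path 2": two full physics backends, selected at
# runtime by a capability check. All names are invented for illustration.

class CpuPhysics:
    """Fallback path: every object is simulated, but with a small budget."""
    name = "cpu"
    max_dynamic_objects = 200

class GpuPhysics:
    """Hardware path: the same object set, with a much larger budget."""
    name = "gpu"
    max_dynamic_objects = 20000

def pick_backend(gpu_physics_supported):
    # Both backends must simulate the SAME set of objects; otherwise items
    # that collide on one path won't collide on the other - the exact
    # inconsistency complained about above.
    return GpuPhysics() if gpu_physics_supported else CpuPhysics()

backend = pick_backend(gpu_physics_supported=False)
print(backend.name)  # "cpu"
```

The cost of this design is the duplicated coding the post mentions: every gameplay-relevant interaction has to be authored and tested twice, which is why "Path 1" (decorative-only hardware physics) was the common choice.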
#23
btarunr
Editor & Senior Moderator
ShadowFold said:
Tl;dr
Show me some games with full PhysX utilization and maybe I'll think it's OK, but for the time being it's a dead engine.
Cryostasis: Sleep of Reason. Period.

DirectX 10.1 and hardware tessellation are dead for the time being.
Posted on Reply
#24
Haytch
ShadowFold said:
Tl;dr
Show me some games with full PhysX utilization and maybe I'll think it's OK, but for the time being it's a dead engine.
I remember playing CellFactor: Revolution back in the day. Correct me if I'm wrong, but wasn't that all PhysX? I remember running a few tests with alternating hardware and whatnot. Looked to me like that game was all about the Asus Ageia PhysX P1 card.

Might have to look that up when I get home.
Posted on Reply
#25
ShadowFold
CellFactor doesn't work on NVIDIA PhysX. I tried.. And I forgot Cryostasis, but it's really the ONLY game that uses it well.. I remember trying the demo and it was OK, but it ran pretty badly on a GTX 280 and I didn't really see anything cool besides the mercury, I mean water.
Posted on Reply