Friday, March 20th 2009

AMD to Demonstrate GPU Havok Physics Acceleration at GDC

GPU-accelerated physics is turning out to be the one piece of the feature checklist AMD is yearning for. One of NVIDIA's most profitable acquisitions in recent times has been that of Ageia Technologies and its PhysX middleware API. NVIDIA went on to port the API to its proprietary CUDA GPGPU architecture, and now uses it as a significant PR tool as well as a feature that is genuinely grabbing game developers' attention. In response, AMD's initial reaction was to build a strategic technology alliance with PhysX's main competitor, Havok, despite Havok's acquisition by Intel.

At the upcoming Game Developers Conference (GDC), AMD may materialize its plans to bring out a GPU-accelerated version of Havok, which until now has run on the CPU. The API has featured in several popular game titles such as Half-Life 2, Max Payne 2, and other Valve Source-based titles. ATI's Terry Makedon has revealed on his Twitter feed that AMD will put forth its “ATI GPU Physics strategy.” He added that the company will present a tech demonstration of Havok technology working in conjunction with ATI hardware. The physics API is expected to utilize OpenCL and AMD Stream.

Source: bit-tech.net
Add your own comment

226 Comments on AMD to Demonstrate GPU Havok Physics Acceleration at GDC

#1
FordGT90Concept
"I go fast!1!11!1!"
Most likely, AMD discovered it ain't so easy to offload Havok calls to their GPUs, so they gave up on it. That, or Intel did something to shoo them away.

There were talks of Havok FX (Havok on the GPU) a long time ago, but it never happened. Havok just appears to be in limbo. Intel doesn't want to do anything with it because of the issues with Larrabee. They are just expanding the libraries to do more, like AI pathfinding and cloth.


But think about it: what is Havok's specialty? Making physics code believable but not very intensive, so it can run on the CPU without causing problems. There really is no market for them to make a more complex physics engine (like PhysX) that increases the hardware burden substantially when the end user can't tell the difference. Havok is Havok--works great and is hardware-friendly.

Maybe Intel, Havok, and AMD discovered this, so Havok is now just minding its own business, improving on an already successful product?
Posted on Reply
#2
TheMailMan78
Big Member
Meh. I'm going Nvidia in my next build. ATI has a bunch of features that no one uses.
Posted on Reply
#3
erocker
TheMailMan78 said:
Meh. I'm going Nvidia in my next build. ATI has a bunch of features that no one uses.
What? So you can use PhysX for all of those PhysX games? :laugh: I hear Fermi's love Florida this time of year. :D
Posted on Reply
#4
TheMailMan78
Big Member
erocker said:
What? So you can use PhysX for all of those PhysX games? :laugh:
Yeah, the few that offer it. Plus the folding aspect and the TWIMTBP program are both nice. We have Stream/OpenCL that supports.........um?

Oh, and no Fermi. Too hot for my region. I have to wait until the next gen.
Posted on Reply
#5
sneekypeet
Unpaid Babysitter
Don't let erocker pull your chain; he is currently raising funds for said fail PhysX...lol
Posted on Reply
#6
erocker
TheMailMan78 said:
Yeah, the few that offer it. Plus the folding aspect and the TWIMTBP program are both nice. We have Stream/OpenCL that supports.........um?
It doesn't matter to me bud. Do we know folding performance yet? I've been playing plenty of "PhysX" games just fine with my card. None of these features to me are anything to get rid of my current card over. Though, I would gladly replace it with a GTX 480 on performance alone.


sneekypeet said:
Don't let erocker pull your chain; he is currently raising funds for said fail PhysX...lol
OMG NO YOU!!!! :roll:
Posted on Reply
#7
TheMailMan78
Big Member
sneekypeet said:
Don't let erocker pull your chain; he is currently raising funds for said fail PhysX...lol
All I'm saying is I think Ill go Nvidia next time.
Posted on Reply
#8
sneekypeet
Unpaid Babysitter
erocker said:
OMG NO YOU!!!! :roll:
Just waiting for Monday ;) I just hope they don't release while I'm at the dentist and I miss everything.
Posted on Reply
#9
tigger
I'm the only one
I'm just waiting for the free nvidia card with TWIMTBP-branded games, otherwise I will just avoid them in future if I can. I am sick of nvidia giving game companies backhanders to make sure games run like crap on ATI hardware.
Posted on Reply
#10
sneekypeet
Unpaid Babysitter
I'm in it for the free t-shirt Newegg is offering. :roll:
Posted on Reply
#11
Fourstaff
I am waiting for the day Intel bans Havok on AMD chips. Hopefully it doesn't happen, but it's a possibility.
Posted on Reply
#12
FordGT90Concept
"I go fast!1!11!1!"
tigger said:
I'm just waiting for the free nvidia card with TWIMTBP-branded games, otherwise I will just avoid them in future if I can. I am sick of nvidia giving game companies backhanders to make sure games run like crap on ATI hardware.
I play lots, and lots, and lots of games. Overall, I think there are fewer problems with AMD cards than NVIDIA cards for one simple reason: AMD releases drivers monthly, while NVIDIA is doing well to release them bi-annually. If there is a problem, AMD is likely to get it fixed long before NVIDIA does.

Let's also not forget that NVIDIA is the reason DX 10.1 exists (DX 10.1's features were intended to be part of DX 10, but NVIDIA couldn't make Microsoft's deadline, so Microsoft had to release DX 10 and later DX 10.1 with the features NVIDIA couldn't support but AMD could), and even then it took them years to finally adopt it. NVIDIA was also late in releasing DX11 parts by about half a year.

Oh, and the obscenely high failure rates on GeForce 8 series cards. :(


All of the above are the reasons I went back to AMD.


Fourstaff said:
I am waiting for the day Intel bans Havok on AMD chips. Hopefully it doesn't happen, but it's a possibility.
If Intel did that, Havok would be like PhysX with rare implementations. Intel won't do that for the sake of keeping Havok a viable company.
Posted on Reply
#13
Wile E
Power User
tigger said:
I'm just waiting for the free nvidia card with TWIMTBP-branded games, otherwise I will just avoid them in future if I can. I am sick of nvidia giving game companies backhanders to make sure games run like crap on ATI hardware.
This is total and complete BS. I get so sick of this claim. Not even ATI themselves make this claim.

TWIMTBP does not cripple ATI hardware, PERIOD. It just means nVidia took the time to help the devs get the game optimized for their hardware. Optimizing for nVidia is not the same as crippling ATI. ATI has the same opportunities to offer dev help, but usually chooses not to. How is this, in any way, "evil" nVidia crippling ATI?

PS: Sorry I sound snippy, tig. It's not meant to be personal, it's just a frustrating topic to see always popping up.

FordGT90Concept said:
I play lots, and lots, and lots of games. Overall, I think there are fewer problems with AMD cards than NVIDIA cards for one simple reason: AMD releases drivers monthly, while NVIDIA is doing well to release them bi-annually. If there is a problem, AMD is likely to get it fixed long before NVIDIA does.

Let's also not forget that NVIDIA is the reason DX 10.1 exists (DX 10.1's features were intended to be part of DX 10, but NVIDIA couldn't make Microsoft's deadline, so Microsoft had to release DX 10 and later DX 10.1 with the features NVIDIA couldn't support but AMD could), and even then it took them years to finally adopt it. NVIDIA was also late in releasing DX11 parts by about half a year.

Oh, and the obscenely high failure rates on GeForce 8 series cards. :(


All of the above are the reasons I went back to AMD.



If Intel did that, Havok would be like PhysX with rare implementations. Intel won't do that for the sake of keeping Havok a viable company.
Except that for the past year and a half, it's been ATI with the more bug-laden drivers. nVidia's turn will come back around again, though. Both companies go back and forth on driver quality, just like they go back and forth in performance.

And the failure rate on 8 series cards did not seem that high to me. Certainly not much different from ATI's. Unless you meant the big defective batch of mGPUs?
Posted on Reply
#14
a_ump
I agree with Wile E. The way I can relate to it is a recent project I had in English 12: we had a group of four acting out a scene from Macbeth (I hate Eng 12). One of our group wasn't there for the script writing, so we wrote ours and then had to write his. He didn't really care for his, but we were fine with ours. Are we "evil" or wrong for writing his script because he wasn't there? No. Is nvidia wrong for helping devs on their hardware when ATI isn't helping devs on theirs? No.
Posted on Reply
#16
Wile E
Power User
Grings said:
So, just because ATI don't shout about it at every opportunity like Nvidia do, means they don't work with game developers?

Nonsense.

http://www.bit-tech.net/bits/interviews/2010/01/06/interview-amd-on-game-development-and-dx11/1
That wasn't the point at all. The point was TWIMTBP is not anti-ATI.

Besides, nVidia still does it more, and has stronger ties to more devs. ATI (and AMD in general, actually) does not push itself as hard in the market as they could, especially compared to their competitors.

PS: ATI is launching their answer to TWIMTBP this year, from what I understand. So we may be seeing more of them in the dev process of games. It's about damn time, too.
Posted on Reply
#17
Grings
From the interview i posted (discussing Batman Arkham Asylum's Nvidia only anti aliasing):
The part that I totally hold in contempt is the appalling way they added MSAA support that uses standard DirectX calls - absolutely nothing which is proprietary in any useful sense. They just did ordinary stuff, a completely standard recommendation that they make and that we make to developers for how to do MSAA, and they put it in and locked it to their hardware knowing it would run just fine on our hardware. And indeed, if you simply spoof the vendor ID in the driver - which we and other people have documented - it runs absolutely fine on AMD hardware. There's nothing proprietary about it in that sense, nothing new. I think that's exceptionally poor.
Posted on Reply
#19
Wile E
Power User
Grings said:
From the interview i posted (discussing Batman Arkham Asylum's Nvidia only anti aliasing):
nVidia added it to the engine themselves and did not test on ATI hardware (and why would they?). Of course they locked it out.

And his claim that it worked perfectly fine on ATI hardware is a bald-faced lie. It was shown that it doesn't work properly on ATI, even if you spoof the vendor ID. There were screenshots all over the place proving it when the game released: ATI did not apply AA to shadows, among other weird anomalies. They may have since fixed it in drivers, but at launch it was, in fact, broken. And it was not nVidia's job to get it working on ATI hardware either. If they would've left it unlocked, they would've caught hell for it being broken, and we still would've seen people claiming they did it on purpose. They were damned if they did, and damned if they didn't.
Posted on Reply
#20
TheMailMan78
Big Member
Wile E said:
nVidia added it to the engine themselves and did not test on ATI hardware (and why would they?). Of course they locked it out.

And his claim that it worked perfectly fine on ATI hardware is a bald-faced lie. It was shown that it doesn't work properly on ATI, even if you spoof the vendor ID. There were screenshots all over the place proving it when the game released: ATI did not apply AA to shadows, among other weird anomalies. They may have since fixed it in drivers, but at launch it was, in fact, broken. And it was not nVidia's job to get it working on ATI hardware either. If they would've left it unlocked, they would've caught hell for it being broken, and we still would've seen people claiming they did it on purpose. They were damned if they did, and damned if they didn't.
Just to be clear, Batman: AA now does in fact offer AA on ATI drivers. I proved it.....

http://forums.techpowerup.com/showthread.php?t=119242
Posted on Reply
#21
Mussels
Moderprator
TheMailMan78 said:
Just to be clear, Batman: AA now does in fact offer AA on ATI drivers. I proved it.....

http://forums.techpowerup.com/showthread.php?t=119242
via in game settings, or CCC?

edit: read the thread.

The fact you need to FORCE AA means the game does NOT "offer" AA on ATI drivers - it just means there is a way to make it work, even if it is performance-heavy.
Posted on Reply
#22
TheMailMan78
Big Member
Mussels said:
via in game settings, or CCC?

edit: read the thread.

The fact you need to FORCE AA means the game does NOT "offer" AA on ATI drivers - it just means there is a way to make it work, even if it is performance-heavy.
Well, before you couldn't do it. Now you can, and to be honest it's the Unreal 3 engine, on which you could force AA in every other title but Batman.....until now :D
Posted on Reply
#23
Mussels
Moderprator
TheMailMan78 said:
Well, before you couldn't do it. Now you can, and to be honest it's the Unreal 3 engine, on which you could force AA in every other title but Batman.....until now :D
Yeah, but let credit go where credit is due - ATI allows you to do it via CCC, while B:AA and its sponsors are still doing everything they can to block you.
Posted on Reply
#24
Wile E
Power User
Mussels said:
Yeah, but let credit go where credit is due - ATI allows you to do it via CCC, while B:AA and its sponsors are still doing everything they can to block you.
Not unlocking AA for ATI is not the same as blocking ATI. At this point, it's purely up to the dev to unlock it in a patch if ATI does have it working properly.
Posted on Reply
#25
Mussels
Moderprator
Wile E said:
Not unlocking AA for ATI is not the same as blocking ATI. At this point, it's purely up to the dev to unlock it in a patch if ATI does have it working properly.
Yes, it is.

Nvidia gets an AA mode that only applies to what's necessary - ATI users are forced to use a generic AA profile that takes a large performance hit (it applies AA to unnecessary elements). I dunno about you, but it's very clear to me ATI was blocked and had to resort to workarounds just to get this slower AA method working.
Posted on Reply
Add your own comment