Thursday, December 11th 2008

PhysX will Die, Says AMD

In an interview with Bit-Tech.net, Godfrey Cheng, Director of Technical Marketing in AMD's Graphics Products Group, said that standards such as PhysX will die because of their proprietary and closed nature. Says Mr. Cheng:
"There is no plan for closed and proprietary standards like PhysX," said Cheng. "As we have emphasised with our support for OpenCL and DX11, closed and proprietary standards will die."
Bit-Tech.net interviewed Cheng to get the company's take on EA and 2K's decision, earlier this week, to adopt NVIDIA PhysX across all of their worldwide studios. Interestingly, when asked how major publishers such as EA adopting PhysX across all of their studios would affect the propagation of the API, Cheng responded that monetary incentives to publishing houses alone won't do much to propagate the API, that the product (PhysX) must be competitive, and that AMD views Havok and its physics simulation technologies as the leaders. "Games developers share this view. We will also invest in technologies and partnerships beyond Havok that enhances gameplay," he added. PhysX is a proprietary physics simulation API created by Ageia Technologies, which was acquired and is being developed by NVIDIA. You can read the full Bit-Tech.net interview with Godfrey Cheng here.

67 Comments on PhysX will Die, Says AMD

#51
DarkMatter
@brian

I think you don't quite get what I mean about Intel. You have to agree that the CPU + chipset market is still a lot bigger than the GPU one; that's where AMD needs the most help, or else they have to find a way to make the GPU business bigger. The chance that they will ever be able to fight in CPUs without help is very small, and by that I mean reaching, say, 40% share. By saying "small" I'm being very optimistic at this point. I like it as little as anyone, but Intel seems to be two steps ahead.

Hardware physics could make GPUs more important and would help them a lot more. For instance, it would make Spider-like platforms much more appealing. I'm not saying they should adopt PhysX; I'm not naive, I know the clever thing for them is not to adopt it. But not adopting it is one thing, and DOWNPLAYING it at every opportunity is a very different one. And believe me, they are shooting themselves in the foot: they need GPU physics. As I said, they are relying on Havok. Do you honestly believe Intel will not make Havok run better on their own hardware? What is worse, Intel is not making a typical GPU; they are basing it on x86, so it's going to be much, much harder for AMD to compete there.

There's very little left to improve on the pure graphics front. More and more people are heading towards consoles because graphics are good enough; very few people really feel they need or want better graphics. Hardware physics is the future, and it runs better on GPUs or GPU-like massively parallel devices (Larrabee). Because physics is a different thing, nothing is established yet: every company still has a lot to tweak, and the best model has yet to settle. The best solution will be the one with the better hardware + software combination. Still, GPUs are much better suited for physics than any x86-based device, because they can pack much more performance into the same silicon. Intel will never be able to reach the performance density that current GPUs have with x86, so it will certainly try to turn physics into an x86 thing. It's in AMD's best interest that that never happens, but instead of fighting it, they are helping them. With what they are doing now, maybe they will do well against Nvidia, but once Nvidia is out, they are naive if they think they are going to be able to compete against Intel in an industry completely DEFINED by Intel. The advancement of "graphics" in games will come from two major things, physics and ray tracing, and if they let Intel define how those are done (and that's exactly what they are doing), they are lost.

Anyone who plays a lot of three-way free-for-all strategy games knows what I'm saying: if you take out the weak one first, you lose.
brian.ca: It might not have been Nvidia only, but when Nvidia developed this I'd be hard pressed to believe it wasn't developed for their cards and that they wouldn't always retain some upper hand here (it would make sense for them to do this). When you consider that alternatives were coming from parties with less of a vested interest in favoring a specific card vendor, I'm not sure what ATI's motivation to support NV's solution would be.
Sorry for being repetitive, but if that was their motivation (or lack of motivation) for downplaying PhysX, why FFS are they supporting Havok?? And again, not only will Intel ensure Havok runs better on their hardware, but that hardware will be x86-based and very, very different; it will be very hard for AMD to optimize for it. AFAIK the initiatives from AMD in parallel computing are Fusion and OpenCL, both of which are based on the GPU. It just doesn't make sense. They are being stupid, very stupid...
Posted on Reply
#52
leonard_222003
Tatty_One: I agree with much of what you are saying there; however, just one point: DX11 hardware is irrelevant if games don't support it. Now let's briefly look back, say at NVidia and ATI: their first DX10 hardware (the 8800 G80 series and the 2900 for ATI) was on the shelves for a year before there were 10... yes, that's just 10 games with DX10 enabled. Now look how many games today are DX10 enabled and divide them by the number of months DX10 hardware has been available... end result... a cr*p load of money spent by us with VERY little return in gaming terms. Damn, I have owned 7 different DX10 graphics cards and I own 7 DX10-enabled games (OK, I am only a light-to-mid gamer)... DX11 cards will be released, we will wait 6 months for one DX11 game and a year for 5, in which time we will all have spent our hard-earned money to find out that those early DX11 cards just don't cut it, so we buy better ones! Now with PhysX, at least it's here, it's growing whether AMD likes it or not, but most importantly it gives the consumer even more choice, and that's gotta be good in my book.
I think Windows 7 with DX11 will be a success this time. DX10 didn't bring anything big enough for developers to make the change, but Windows 7 will bring what we want from a new Windows, and that is more performance.
The biggest thing that will sell this Windows is DX11, which will let the GPU do things other than compression and video decoding, and if people switch to this Windows for those things, I think developers will have no choice but to use that API. Actually, they never had a choice: if Microsoft enforces a standard, they had better be up to the standard it sets, because uninformed people do care whether their computer is DX10 compatible or not.
Microsoft demolishes everything that threatens its dominance in the software business, and PhysX is the kind of thing that threatens them. They will copy everything Nvidia does in this area and implement it as a standard in DX11. Remember 3dfx and their own API (Glide)? Bam, they whacked that thing. Then OpenGL? They're slowly killing it.
They will massacre PhysX, and if not them, then Intel with Havok. ATI is playing it smart here: why spend money researching this when we can rely on Microsoft, or Intel, to do the job for us? We support them :) but not with the kind of money Nvidia puts behind PhysX.
And there is another thing to consider. Nvidia could let them use it for free for a period of time, but, and this is a big but, let's say PhysX takes off as a standard because ATI uses it and Nvidia uses it, which makes it all the graphics giants :) it's a standard, people. So Nvidia thinks: wow, we worked for this, we spent a lot of money on this, let's charge them if they want to keep using it in ATI products, OK? Then AMD pays Nvidia for using PhysX.
They can't do that; it's suicide for them.
Posted on Reply
#53
DarkMatter
leonard_222003: I think Windows 7 with DX11 will be a success this time. DX10 didn't bring anything big enough for developers to make the change, but Windows 7 will bring what we want from a new Windows, and that is more performance.
The biggest thing that will sell this Windows is DX11, which will let the GPU do things other than compression and video decoding, and if people switch to this Windows for those things, I think developers will have no choice but to use that API. Actually, they never had a choice: if Microsoft enforces a standard, they had better be up to the standard it sets, because uninformed people do care whether their computer is DX10 compatible or not.
Microsoft demolishes everything that threatens its dominance in the software business, and PhysX is the kind of thing that threatens them. They will copy everything Nvidia does in this area and implement it as a standard in DX11. Remember 3dfx and their own API (Glide)? Bam, they whacked that thing. Then OpenGL? They're slowly killing it.
They will massacre PhysX, and if not them, then Intel with Havok. ATI is playing it smart here: why spend money researching this when we can rely on Microsoft, or Intel, to do the job for us? We support them :) but not with the kind of money Nvidia puts behind PhysX.
And there is another thing to consider. Nvidia could let them use it for free for a period of time, but, and this is a big but, let's say PhysX takes off as a standard because ATI uses it and Nvidia uses it, which makes it all the graphics giants :) it's a standard, people. So Nvidia thinks: wow, we worked for this, we spent a lot of money on this, let's charge them if they want to keep using it in ATI products, OK? Then AMD pays Nvidia for using PhysX.
They can't do that; it's suicide for them.
I will repeat myself again, even at the risk of being flamed: Intel is not in any way better than Nvidia; it is MUCH, MUCH worse. The accusations against TWIMTBP are unfounded; if there were any truth to them, someone (ATI?) would have filed a lawsuit ages ago. Intel, on the other hand, has been taken to court many times and HAS LOST some of those cases. They have already been found guilty, so what makes you think they would play fair this time? Especially when they have to introduce their new GPU, which will probably be at a disadvantage? I just don't get why people hate Nvidia so much that, no matter what happens in the future, Nvidia must die. I'd rather see Intel fall...

EDIT: BTW, do you think Intel is letting them use Havok for free? Maybe they are not paying with money, but don't doubt for a moment that they are paying for it. And even if they are not, do you think Intel will never ask for payment? Only Nvidia would? :roll:
Posted on Reply
#54
leonard_222003
Of course Intel will ask, but consider this: AMD will never, ever beat Intel. Intel is just too big; Intel can recover from anything AMD throws at them, so clearly AMD has no chance to improve things there. But Nvidia can be beaten by AMD in the graphics department: they have some great technologies on their hands, fabs that could do wonders for a GPU if they want, and a lot of patents that could come in handy in GPU making, memory optimizations, etc.
From AMD's point of view, graphics can be won, and making it a market only for them could make them very, very rich. If they kill Nvidia, one day they could take on a giant like Intel, but for now they are in no position to cause Intel any problems, just scratch some insignificant profit away from them.
I, for one, don't want anyone to die. Whatever company gets a market all to itself, without competition, will abuse the consumer with high prices and low or mediocre products; progress will be very slow and no one wins but them.
Even now the graphics industry is making very slow progress. They wait for a die shrink but make no change to the architecture: let's wait for TSMC to start making 40 nm GPUs and then we'll have a new generation, hurray; then let's wait for 32 nm and some GDDR6-7 and hurray, another new generation. That's what I see from them right now, both AMD and especially Nvidia.
Well, we are fighting here, but look at what they do:
www.shacknews.com/onearticle.x/54969
Wow, price fixing which ended in a settlement of 1.7 million dollars. Why, if they are innocent, why? Could this be true?
Damn, I'm starting to believe those Zeitgeist conspiracy movies. Maybe Intel, AMD and Nvidia are all the same company, putting on a show so we idiots believe people are working hard there :)
Posted on Reply
#55
Mussels
Freshwater Moderator
brian.ca: I don't mean to seem like I'm picking on you or anything, but I wanted to say something about the DX10 comments... one thing people need to consider is that DX10 was tied to Vista, which a lot of people didn't like. So it shouldn't be any surprise that DX10 had a slow adoption rate if Vista had one as well.
store.steampowered.com/hwsurvey/

Check the DirectX section: 21.43% of Steam gamers are running a DX10 card on Vista. That's over one in five. The adoption rate isn't as low as you think.
Posted on Reply
#56
MilkyWay
I don't think AMD thinks GPU physics will die, just PhysX, because no one will be bothered to program for it.

We need a standard, and not just one like PhysX where it's Nvidia calling the shots. Or maybe we should just have two: PhysX for Nvidia and something else for ATI, though that means a hell of a lot more coding.
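As a rough illustration of that coding cost (a hypothetical sketch with made-up names, not code from any real SDK), this is the kind of indirection a game ends up writing when it has to support two vendor-specific physics paths instead of one common standard:

/* Hypothetical sketch: what "two physics back-ends" costs a game in code.
   The engine talks to one small interface; each vendor path has to implement
   it separately, which is the extra coding being pointed at above. */
#include <stdio.h>

typedef struct {
    const char *name;
    void (*init)(void);
    void (*step)(float dt);   /* advance the simulation by dt seconds */
} physics_backend;

/* Stub standing in for a vendor-A-only path (think PhysX on that vendor's GPUs). */
static void a_init(void)      { puts("init vendor A physics"); }
static void a_step(float dt)  { printf("vendor A step %.3f s\n", dt); }

/* Stub standing in for a vendor-B-only path (think a different SDK on the other GPUs). */
static void b_init(void)      { puts("init vendor B physics"); }
static void b_step(float dt)  { printf("vendor B step %.3f s\n", dt); }

static const physics_backend backends[] = {
    { "vendor A", a_init, a_step },
    { "vendor B", b_init, b_step },
};

int main(void) {
    /* A real game would pick the back-end matching the installed card; here we
       just take the first. A single cross-vendor standard would collapse this
       table, and both stub implementations, into one code path. */
    const physics_backend *p = &backends[0];
    p->init();
    for (int frame = 0; frame < 3; ++frame)
        p->step(1.0f / 60.0f);
    return 0;
}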
Posted on Reply
#57
[I.R.A]_FBi
MilkyWay: I don't think AMD thinks GPU physics will die, just PhysX, because no one will be bothered to program for it.

We need a standard, and not just one like PhysX where it's Nvidia calling the shots. Or maybe we should just have two: PhysX for Nvidia and something else for ATI, though that means a hell of a lot more coding.
hence dx11
Posted on Reply
#58
DarkMatter
leonard_222003: Of course Intel will ask, but consider this: AMD will never, ever beat Intel. Intel is just too big; Intel can recover from anything AMD throws at them, so clearly AMD has no chance to improve things there. But Nvidia can be beaten by AMD in the graphics department: they have some great technologies on their hands, fabs that could do wonders for a GPU if they want, and a lot of patents that could come in handy in GPU making, memory optimizations, etc.
From AMD's point of view, graphics can be won, and making it a market only for them could make them very, very rich. If they kill Nvidia, one day they could take on a giant like Intel, but for now they are in no position to cause Intel any problems, just scratch some insignificant profit away from them.
I, for one, don't want anyone to die. Whatever company gets a market all to itself, without competition, will abuse the consumer with high prices and low or mediocre products; progress will be very slow and no one wins but them.
Even now the graphics industry is making very slow progress. They wait for a die shrink but make no change to the architecture: let's wait for TSMC to start making 40 nm GPUs and then we'll have a new generation, hurray; then let's wait for 32 nm and some GDDR6-7 and hurray, another new generation. That's what I see from them right now, both AMD and especially Nvidia.
Well, we are fighting here, but look at what they do:
www.shacknews.com/onearticle.x/54969
Wow, price fixing which ended in a settlement of 1.7 million dollars. Why, if they are innocent, why? Could this be true?
Damn, I'm starting to believe those Zeitgeist conspiracy movies. Maybe Intel, AMD and Nvidia are all the same company, putting on a show so we idiots believe people are working hard there :)
But you know Intel will release a GPU, don't you? It will probably launch alongside Windows 7. AMD alone will never be able to beat Intel if they support Intel's way of doing things, and that's what they are doing: they are handing Intel the keys to the city, giving them the chance to enter the gaming market big time.

No PhysX doesn't mean Nvidia is out of the game, BTW; it means GPUs will not advance and Intel's model will prevail, in which case both AMD and Nvidia lose. Nvidia could still be stronger than ATI, with ATI ending up as the last monkey, while Nvidia holds just 10% of the market.

ATI will never kill Nvidia; they can't. If anyone does it, Intel will kill both ATI and Nvidia.
Posted on Reply
#59
brian.ca
At times you seem to be of the mind that the death of PhysX as Nvidia is pushing it now equates to the end of physics being handled by the GPU... I doubt this is the case. PhysX is just a physics devkit, same as Havok. What Nvidia is doing is combining it with CUDA to allow the work to be done on the GPU. But physics is not all CUDA is meant to do, and likewise ATI has Stream for their cards, Apple is pushing OpenCL to take advantage of more cores and GPUs from both vendors, and Microsoft is doing the same with DirectX 11. And I believe both NV and ATI support OpenCL, and I would assume support for DX11 is a given. So one way or another, they and other companies are pushing to utilize the GPU more... to think that this won't result in physics being handled on the GPU, if that would be advantageous to devs and consumers, seems off to me.
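To make that vendor-neutral route concrete, here is a minimal sketch, assuming nothing beyond a stock OpenCL installation: one integration step of a toy particle system run on whichever GPU the driver exposes. The kernel, buffer names and constants are made up for illustration and have nothing to do with PhysX, Havok, CUDA or Stream internals.

/* Illustrative only: a toy particle step in OpenCL. Error handling and resource
   releases are stripped to keep the sketch short. */
#include <stdio.h>
#include <stdlib.h>
#include <CL/cl.h>

static const char *kSrc =
"__kernel void step(__global float4 *pos, __global float4 *vel,\n"
"                   const float dt, const int n) {\n"
"    int i = get_global_id(0);\n"
"    if (i >= n) return;\n"
"    float4 g = (float4)(0.0f, -9.81f, 0.0f, 0.0f);  /* gravity */\n"
"    vel[i] += g * dt;       /* integrate velocity */\n"
"    pos[i] += vel[i] * dt;  /* integrate position */\n"
"}\n";

int main(void) {
    enum { N = 1 << 16 };                       /* 65,536 particles */
    cl_float4 *pos = calloc(N, sizeof(cl_float4));
    cl_float4 *vel = calloc(N, sizeof(cl_float4));

    cl_platform_id plat; cl_device_id dev;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);   /* any vendor's GPU */
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

    cl_mem dpos = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 N * sizeof(cl_float4), pos, NULL);
    cl_mem dvel = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                 N * sizeof(cl_float4), vel, NULL);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &kSrc, NULL, NULL);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "step", NULL);

    cl_float dt = 1.0f / 60.0f;
    cl_int n = N;
    clSetKernelArg(k, 0, sizeof(cl_mem), &dpos);
    clSetKernelArg(k, 1, sizeof(cl_mem), &dvel);
    clSetKernelArg(k, 2, sizeof(cl_float), &dt);
    clSetKernelArg(k, 3, sizeof(cl_int), &n);

    size_t global = N;                          /* one work-item per particle */
    clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dpos, CL_TRUE, 0, N * sizeof(cl_float4), pos, 0, NULL, NULL);

    printf("particle 0 y after one frame: %f\n", pos[0].s[1]);
    return 0;
}

The same kernel source runs on either vendor's card, which is the point of the argument above: the physics work itself does not have to be tied to one company's API.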

Also, while Intel will always be strong with lots of resources, I think you might be giving it (or more specifically its ownership of Havok) too much weight here. When Nvidia bought Ageia, AMD was instantly pushed into a pretty interesting situation. PhysX never got very far because of its proprietary nature, but if AMD were to support it, and assuming the resulting ability to do physics on GPUs made for a decent selling point, PhysX overtaking Havok would definitely have been a possibility. With Larrabee still a ways off, I don't think Intel would be too keen on that, so I wouldn't be surprised if AMD got something out of this deal; Intel benefits from them supporting Havok, after all. From what I understand, AMD was supposed to have their cards accelerating Havok by the end of the year... no word on that yet, but it would seem they're definitely working on it, and it wouldn't seem unreasonable that it's being done with some level of concession or support on Intel's part.

Furthermore, I said before that I haven't really seen much coming from PhysX so far that Havok hasn't done... but the article in the original post linked to another article on the same site that linked off to some neat things (www.bit-tech.net/bits/2008/09/25/roy-taylor-on-physics-ai-making-games-fun/1 - specifically, check out the NaturalMotion and AIseek sites and their demos). One important thing to note: you were talking about how graphics can only go so far and that after that the GPU needs to branch out to retain value. While that might be true, it's important to realize that physics, or more specifically PhysX and Havok, are not the only thing and definitely not the be-all and end-all. Check out the NaturalMotion demos... comparing what they're doing vs. ragdoll effects, more glass shards, etc. from Havok and PhysX really knocked my opinion of those SDKs down a peg. The AIseek demos were kinda neat too... maybe not as dramatic, but there are some points with a lot of potential. In one, an NPC blocks the path of an advancing army with a wall of fire... instead of running through the fire, they all stop and start firing arrows through it until it drops. This kind of advanced AI, made possible by the parallel nature of GPUs, is what really seems like the future to me, and currently I'm pretty sure these techs don't belong to any hardware vendor.
DarkMatter: Anyone who plays a lot of three-way free-for-all strategy games knows what I'm saying: if you take out the weak one first, you lose.
This isn't a game, though... the thing that keeps coming back to mind is Microsoft giving Apple money because Apple going under wasn't worth the hassle Microsoft would have faced over having a monopoly. I don't think one last company standing is a real possibility.
Posted on Reply
#60
brian.ca
Mussels: store.steampowered.com/hwsurvey/

Check the DirectX section: 21.43% of Steam gamers are running a DX10 card on Vista. That's over one in five. The adoption rate isn't as low as you think.
Well, remember I said if Vista had a slow adoption rate. I don't follow that stuff closely enough to really know what the actual adoption rate was like, but the point was that the two are directly tied, and I have heard the complaints and MS moaning about people not switching over, and I personally tried Vista and eventually switched back to XP myself. So I can only assume that not nearly as many people switched to Vista (and therefore DX10) as could have, so devs would have had less reason to put out DX10 games sooner.

Also, since that's a Steam poll, the number is probably skewed slightly higher. I'd imagine people who respond there are probably more serious gamers who would be more inclined to make and stick with the switch to Vista for DX10.

But anyway... what I was getting at is that Windows 7 so far sounds like it should be considerably better than Vista. Combine that with MS looking to cut XP support to push people to upgrade, and the result should be a lot more people upgrading to a DX11 OS. Whatever their reason for upgrading, the result should be more people with computers that can run DX11 games = a faster industry adoption rate.
Posted on Reply
#61
DarkMatter
brian.ca: At times you seem to be of the mind that the death of PhysX as Nvidia is pushing it now equates to the end of physics being handled by the GPU... I doubt this is the case. PhysX is just a physics devkit, same as Havok. What Nvidia is doing is combining it with CUDA to allow the work to be done on the GPU. But physics is not all CUDA is meant to do, and likewise ATI has Stream for their cards, Apple is pushing OpenCL to take advantage of more cores and GPUs from both vendors, and Microsoft is doing the same with DirectX 11. And I believe both NV and ATI support OpenCL, and I would assume support for DX11 is a given. So one way or another, they and other companies are pushing to utilize the GPU more... to think that this won't result in physics being handled on the GPU, if that would be advantageous to devs and consumers, seems off to me.

Also, while Intel will always be strong with lots of resources, I think you might be giving it (or more specifically its ownership of Havok) too much weight here. When Nvidia bought Ageia, AMD was instantly pushed into a pretty interesting situation. PhysX never got very far because of its proprietary nature, but if AMD were to support it, and assuming the resulting ability to do physics on GPUs made for a decent selling point, PhysX overtaking Havok would definitely have been a possibility. With Larrabee still a ways off, I don't think Intel would be too keen on that, so I wouldn't be surprised if AMD got something out of this deal; Intel benefits from them supporting Havok, after all. From what I understand, AMD was supposed to have their cards accelerating Havok by the end of the year... no word on that yet, but it would seem they're definitely working on it, and it wouldn't seem unreasonable that it's being done with some level of concession or support on Intel's part.

Furthermore, I said before that I haven't really seen much coming from PhysX so far that Havok hasn't done... but the article in the original post linked to another article on the same site that linked off to some neat things (www.bit-tech.net/bits/2008/09/25/roy-taylor-on-physics-ai-making-games-fun/1 - specifically, check out the NaturalMotion and AIseek sites and their demos). One important thing to note: you were talking about how graphics can only go so far and that after that the GPU needs to branch out to retain value. While that might be true, it's important to realize that physics, or more specifically PhysX and Havok, are not the only thing and definitely not the be-all and end-all. Check out the NaturalMotion demos... comparing what they're doing vs. ragdoll effects, more glass shards, etc. from Havok and PhysX really knocked my opinion of those SDKs down a peg. The AIseek demos were kinda neat too... maybe not as dramatic, but there are some points with a lot of potential. In one, an NPC blocks the path of an advancing army with a wall of fire... instead of running through the fire, they all stop and start firing arrows through it until it drops. This kind of advanced AI, made possible by the parallel nature of GPUs, is what really seems like the future to me, and currently I'm pretty sure these techs don't belong to any hardware vendor.

This isn't a game, though... the thing that keeps coming back to mind is Microsoft giving Apple money because Apple going under wasn't worth the hassle Microsoft would have faced over having a monopoly. I don't think one last company standing is a real possibility.
If they don't manage to push GPU physics soon, namely before Intel releases Larrabee, they'll have a hard time pushing for it later. You are naive if you think Intel will let GPU Havok run better than physics on Larrabee. I find it funny how everyone assumes Nvidia would do that, but not Intel, when it's Intel, and only Intel, that has been found guilty of such behavior.

I support PhysX because it's the only GPU physics out there now and until DX11 catches on, which will be 2010 at the soonest. And I strongly believe GPU physics is and will always be much better than any x86-based physics, just on pure performance. Havok is already the most used physics API, and it is nothing comparable to hardware PhysX. It only needs to be 3x what it is today, and thanks to Intel's marketing skills and the fact that developers will not use anything more advanced if they don't NEED to (see how they are leaning towards consoles), they will own the market, denying us the possibility of having 100x better physics. That's what Intel is doing right now: CPU-based physics like Havok hasn't really improved since the year 2000. It might be more accurate, but the extent to which it is used hasn't changed. That includes NaturalMotion, which has some pretty effects, but a CPU won't ever be able to reproduce as much as a GPU, nor will Larrabee IMO. It's not ultra-accurate physics we need; it's wide use of physics in smoke, fluids, cloth, etc. And don't be naive: unless Nvidia and ATI manage to push GPU physics BEFORE Intel steps in with a 2x better Havok, widely used GPU physics will never be a reality.

Intel maybe wants AMD to implement GPU Havok RIGHT NOW, because it's in their best interest, but as soon as Larrabee is released they will let them down. There is no way they are going to let AMD have a better physics solution than their own. By that point GPU physics will already have lost. Intel doesn't need DX11 or OpenCL; they have x86, which is the same thing developers have been using for ages. By letting AMD use GPU physics only on their terms, they will slowly kill GPU physics.
Posted on Reply
#62
VulkanBros
DarkMatter: Intel maybe wants AMD to implement GPU Havok RIGHT NOW
That is based on the assumption that Larrabee is going to be a real competitor to Nvidia/ATI, which I personally doubt it will be.

- I know some people argue Intel has money coming out of its a**, the know-how, the experience, the people to do it, etc. etc. etc.

We will just have to wait and see... one thing is for sure: where there is competition, there will be discussion!
Posted on Reply
#63
Unregistered
AMD Fusion is their killer chip. While the on-die GPU won't be that powerful to begin with, it will be more than sufficient for in-game physics alone while the main PCI-E card does everything else. Hybrid CPU/GPU super-processors are the future, and both Intel and nVidia know it.
Posted on Reply
#64
Tatty_Two
Gone Fishing
For those who say that AMD can beat NVidia in the GPU wars: I truly hope so, but 2008 has been one of NVidia's worst years in quite a while and it still has considerably more market share than ATI/AMD, even though NVidia took a huge financial hit with the 200-based cards and actually admitted its strategy was completely wrong.

For those who believe that AMD cannot beat Intel in the CPU wars... until 18 months to 2 years ago, AMD was on top for almost 3 years, and Intel was almost (but not quite) only being kept in the black by its OEM sales. How things can change in a couple of years; let's hope the next couple are as competitive!
Posted on Reply
#65
imperialreign
DarkMatter: I support PhysX because it's the only GPU physics out there now and until DX11 catches on, which will be 2010 at the soonest. And I strongly believe GPU physics is and will always be much better than any x86-based physics, just on pure performance. Havok is already the most used physics API, and it is nothing comparable to hardware PhysX. It only needs to be 3x what it is today, and thanks to Intel's marketing skills and the fact that developers will not use anything more advanced if they don't NEED to (see how they are leaning towards consoles), they will own the market, denying us the possibility of having 100x better physics. That's what Intel is doing right now: CPU-based physics like Havok hasn't really improved since the year 2000. It might be more accurate, but the extent to which it is used hasn't changed. That includes NaturalMotion, which has some pretty effects, but a CPU won't ever be able to reproduce as much as a GPU, nor will Larrabee IMO. It's not ultra-accurate physics we need; it's wide use of physics in smoke, fluids, cloth, etc. And don't be naive: unless Nvidia and ATI manage to push GPU physics BEFORE Intel steps in with a 2x better Havok, widely used GPU physics will never be a reality.
I agree here - from everything we've seen over the last few years in regards to physics processing, GPUs are the -ish.

If anyone recalls the first PPU showdown: it was nVidia and ATI having yet another pissing match over whose hardware could run physics APIs better. ATI held the trump card; then Intel showed up, claiming its quads could best both camps. Naturally, the red and green representatives laughed at the blue man group and said "GTFO!"

But they had a point: both GPU designs were superior to Intel's quads in terms of physics processing capabilities.

PPU showdown part dva: Ageia, tooting their own horn over their PPU card, claimed that their solution was the only way to run physics APIs, as it was superior. Mr. red camp offered a challenge and slapped Ageia around so badly you'd think it was a child in need of protective services... and that's when the green camp came to the rescue, claiming their hardware was better than Mr. red's. Strange things were afoot at the Circle-K, as Mr. green never showed up for a match... they called it in; ATI, in response, told them anywhere, anytime... but Mr. green never showed up...

Instead, they purchased Ageia, and they have been quiet in the PPU argument for well over a year now...



This round will get interesting, as nVidia's current GPUs are much better than the older ones they were taking to the showdowns... as evidenced by how well they run F@H clients compared to ATI now. Although ATI, I'm sure, still has some tricks up their sleeve; their new hardware is an absolute beast, and I'm sure they've been feeding it $30 steaks pumped full of steroids... we'll have to wait and see what the next gen delivers.


But look at it this way: nVidia currently has the corner on physics; roll Ageia in with CUDA and you have a ripe source for total PPU domination...

ATI, 6 months ago (or maybe longer), had already stated they're supporting Havok; they felt that's where the future is, as more software developers already support Havok. Intel has also sided with Havok, and that makes for a formidable match.

But, currently, nVidia still has the upper hand. Why? PhysX is designed to work at the hardware level with their GPUs . . . it makes for a much more efficient solution.



Does that mean I think ATI is dead in the PPU arena? Far from it. What did ATI just release this month, embedded into CAT 8.12? A stream processing solution that lets supporting software run on the GPU. How quickly do you think Havok will jump on ATI Stream for physics processing? The means to have code run at ATI's hardware level is there; all we need is the implementation...

And then we need a GPU that can render and process physics at the same time like nVidia's do... I'm sure ATI has been frying the eggs up in the pan on this one. This whole last year has been filled with very strategic, smart, and insightful market maneuvers on ATI's part; I'm sure they have something up their sleeves.
Posted on Reply
#66
leonard_222003
Tatty_One: For those who say that AMD can beat NVidia in the GPU wars: I truly hope so, but 2008 has been one of NVidia's worst years in quite a while and it still has considerably more market share than ATI/AMD, even though NVidia took a huge financial hit with the 200-based cards and actually admitted its strategy was completely wrong.

For those who believe that AMD cannot beat Intel in the CPU wars... until 18 months to 2 years ago, AMD was on top for almost 3 years, and Intel was almost (but not quite) only being kept in the black by its OEM sales. How things can change in a couple of years; let's hope the next couple are as competitive!
Hehe, I remember those times, with Intel having those shitty P4s; Netburst, they called it, if I remember correctly.
Even back then, when AMD clearly had the better product, people still bought Intel. Intel's marketing machine proved its worth: reviewers at top magazines were almost convinced Intel was better until the benchmarks came in, and even then they could find something positive to say about Intel, like stability or bullshit like that. It shows what marketing can do to people's brains :)
In my group there were people who firmly believed Intel was better (marketing) no matter what benchmark or real test they saw. Intel is so big and so smart I think it's right up there with evil Microsoft; no one can demolish these two giants because they have too much money and too many people depending on them, and they can recover from setbacks like the one Intel had with AMD for a few years.
Let's see if AMD can recover from a setback like this. Core Duo really messed up their business, and they already knew it was coming; buying ATI was necessary for their survival because they need some edge. That Fusion CPU+GPU could be the thing, but it seems to be taking forever.
Posted on Reply
#67
eidairaman1
The Exiled Airman
Kuma 7***, Phenom 2: Kuma proves to be overall better than the old Athlon 6*** CPUs, and both are dual cores.
leonard_222003: Hehe, I remember those times, with Intel having those shitty P4s; Netburst, they called it, if I remember correctly.
Even back then, when AMD clearly had the better product, people still bought Intel. Intel's marketing machine proved its worth: reviewers at top magazines were almost convinced Intel was better until the benchmarks came in, and even then they could find something positive to say about Intel, like stability or bullshit like that. It shows what marketing can do to people's brains :)
In my group there were people who firmly believed Intel was better (marketing) no matter what benchmark or real test they saw. Intel is so big and so smart I think it's right up there with evil Microsoft; no one can demolish these two giants because they have too much money and too many people depending on them, and they can recover from setbacks like the one Intel had with AMD for a few years.
Let's see if AMD can recover from a setback like this. Core Duo really messed up their business, and they already knew it was coming; buying ATI was necessary for their survival because they need some edge. That Fusion CPU+GPU could be the thing, but it seems to be taking forever.
Posted on Reply