
Nvidia acquires Ageia

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (1.00/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-Fi Platinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
I highly doubt that.

Anyway, this was bound to happen. Ageia were in it for the money and nothing else, and now they got what they wanted: millions of dollars. The question is: will Nvidia really use this technology, or did they just eliminate some competition?

That is the real question.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,277 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
No, it's more of "Buy GeForce, get PhysX as well". I really don't see the PhysX API running on a GPU doing wonders. In turn, it could slow down graphics.

So a smart buyer could be convinced into "PhysX API with GeForce, Havok API with Intel" and end up with an NVidia + Intel system. Sure, Phenom does Havok too, but people will choose a cheap Intel Q6600 over a Phenom.

Another message would be "Buy ATI, lose PhysX features".

I hate NVidia now. Two good brands, ULi and 3dfx, were killed, and now this :banghead:
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (1.00/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-Fi Platinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
No, it's more of "Buy GeForce, get PhysX as well". I really don't see the PhysX API running on a GPU doing wonders. In turn, it could slow down graphics.

So a smart buyer could be convinced into "PhysX API with GeForce, Havok API with Intel" and end up with an NVidia + Intel system. Sure, Phenom does Havok too, but people will choose a cheap Intel Q6600 over a Phenom.

Another message would be "Buy ATI, lose PhysX features".

I hate NVidia now. Two good brands, ULi and 3dfx, were killed, and now this :banghead:

One of the most disputed features of 3dfx was the "blur" feature. I forget the technical name, but it got a lot of flak back then.

ULi made good chipsets, but I never used them, so I really can't comment on their quality.

However, in what way have we seen ULi or 3dfx in the G80 or G92 video cards? :confused: I really don't see it.
 
Joined
May 10, 2007
Messages
1,420 (0.23/day)
Location
London, England
Exactly. They bought them so that no one else could benefit.
ULi chipsets made good third-party south bridges for ATi motherboards, and their north bridges were getting much better.
3dfx was bought out mostly to get the features for themselves, but I can't remember what else I was going to add... urm... nope...
 

strick94u

New Member
Joined
May 24, 2006
Messages
1,592 (0.24/day)
Location
Texas
System Name CrashMaster 17.2/ crashmaster(M) ROG
Processor CD2 E8400@4.0 ghz/ i5 430@ 2.577 mhz
Motherboard EVGA 680i/ patogram inc intel north bridge
Cooling Monsoon vigor gaming II/ fan and vents
Memory 4 gig OCZ pc8500 DDR2/ 4 gigs ddr3
Video Card(s) 2 X EVGA 8800gts 512 735/1045/nvidia 360 gtsm gddr5
Storage 2 X WD 150 Raptors Raid 0+600gig usb/ 500 gig hitachi
Display(s) Samsung 2232 bw Sync Master 22" HD/16.9 inch asus
Case Enermax uber chakra/ asus republic of gamers
Audio Device(s) Creative SB X-FI 5.1 speakers/creative eax 4.0
Power Supply OCZ extream Gaming 650 watt 4x18a/ external
Software Dos 6.22/ win 3.11 for networks/ms bob server
Benchmark Scores 3dmark06 18,201/10601 3dmark05 23,943/not run Aquamark 218,094/not run
How badass are games going to be that we need three or four GPUs and PPUs and a four-core CPU? All I can say is they'd better look like live people floating in bowls of jello being shot at with bolts of lightning. :eek:
 
Joined
May 10, 2007
Messages
1,420 (0.23/day)
Location
London, England
If games look any more realistic, we'll start viewing life as fake.
Frankly, DX9c already has good-looking games made to those specs, IMO.

Remember that computer hardware just keeps evolving, so we may need two graphics cards and a powerful CPU for Crysis now, but in another year or two we'll be able to use a mid-range, maybe even low-end, system and run Crysis like it was just another retro game.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,277 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
However, in what way have we seen ULi or 3dfx in the G80 or G92 video cards? :confused: I really don't see it.

Nvidia killed those companies. Their engineers most likely now work for NVidia's GPU and chipset divisions. Besides, it will be interesting to see if the PhysX PPU survives at all. Even if the API runs on an NVidia GPU using its shaders, it will clearly hit the GPU's graphics performance.
 
Joined
Aug 9, 2006
Messages
1,065 (0.17/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
I liked ULi for the fact that they always provided Win9x drivers for their stuff, something nVidia, ATI, and later on VIA and SiS stopped doing. I guess they wanted all the edge they could get. They were on the way up, for sure.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi Platinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
I know this post is long, sorry for that, but I think I have some good points worth debating. Please read and tell me what you think, guys. ;)
However, in what way have we seen ULi or 3dfx in the G80 or G92 video cards? :confused: I really don't see it.

In what way has Nvidia used ULi and 3dfx? That's your question?
Well, in G80 and G92, very little, if anything, I guess.
But IIRC, before Nvidia bought ULi they only offered one chipset, and it was an enthusiast one. Later on, Nvidia offered the high-end chipset along with its smaller brother.
3dfx: they designed the GeForce FX series entirely. Any doubts about why we don't see their "touch" on later GPUs? I'm sure there are features in them developed by people who came from 3dfx; it's just not apparent on the surface. I'm not sure, but I think the Lumenex engine has something to do with them. I think I read that somewhere, but I can't find a link to it.
BUT if you want to see 3dfx in Nvidia, you don't have to look far. SLI is a 3dfx invention and was first developed by 3dfx. That's where the 3dfx influence is clearest.

Now, on topic. IMO they bought Ageia because they had already integrated some kind of physics support into their future designs, maybe in conjunction with Havok, and now, without Havok, they had hardware support but no API or compiler to work with. This support would come in the shape of the GPU's instruction set and maybe in how the SPs work internally. Of course, both GPU and PPU internal units are no more than floating-point units; the difference lies in how they operate. As an example, think of AMD's FireStream stream processor based on RV670: it does double-precision (FP64) at 1/4 the speed of single-precision. Or think of the SSE extensions on the CPU.

There are some posts in this thread claiming that physics calculations on the GPU would cripple graphics performance. That's not exactly true. I'm of the opinion that G80 was bottlenecked by its SPs, but that doesn't happen with G92, meaning there's some spare shading power, especially on the GTS. Ageia PhysX has a peak of 50 GFlops; current GPUs are around 500 GFlops. You can see where I'm going. And if we're to believe the leaked info about the 9800GTX, my point is even more feasible:

- 384 SPs and over 1 TFlop
- 96 TMUs
- 32 ROPs
- 512-bit bus
- GDDR4

That translates to:

- Double the SP/TMU ratio compared to the current gen: 384/96 = 4 vs. 128/64 = 2
- A 50% increase over G92 in SP/ROP ratio and a 125% increase over G80: 384/32 = 12 vs. 128/16 = 8 vs. 128/24 = 5.33
- A hell of a lot more memory bandwidth

With those (if true), it's easy to conclude that Nvidia was expecting a big increase in SP usage and data traffic. Are games in the near future going to see such an increase in shader usage for graphics alone? I don't think so, because that would render the current generation, a big install base for their games, useless. But what about a feature like Ageia's physics? Something that would be more of an added feature, a bonus, just like PhysX is now?
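For anyone who wants to check the arithmetic, here's a quick back-of-the-envelope in Python using the numbers quoted above (the 9800GTX figures are rumors, and the 50 vs. 500 GFlops values are the rough peaks claimed in this thread, not confirmed specs):

Code:
# Sanity check of the ratios above. 9800GTX numbers are leaked/rumored;
# the G92 and G80 figures are the ones quoted in the post.
sp_new, tmu_new, rop_new = 384, 96, 32   # rumored 9800GTX
sp_g92, tmu_g92, rop_g92 = 128, 64, 16
sp_g80, rop_g80 = 128, 24

print("SP/TMU:", sp_new / tmu_new, "vs", sp_g92 / tmu_g92)   # 4.0 vs 2.0
print("SP/ROP:", sp_new / rop_new, "vs", sp_g92 / rop_g92,
      "vs", round(sp_g80 / rop_g80, 2))                      # 12.0 vs 8.0 vs 5.33

# Headroom argument: a ~50 GFlops PPU workload on a ~500 GFlops GPU
# would claim roughly a tenth of the shader throughput.
print("PhysX share of a 500 GFlops GPU:", 50 / 500)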

Let me know what you think of this, guys... :toast:
 
Joined
Aug 16, 2007
Messages
7,180 (1.18/day)
Hmm, just read about this, and I think Nvidia wants to get the edge over their competition. I actually think they want AMD to scrap their graphics division. Only one company can be in the lead, and there must be a reason for that; but when there are more than two companies there is choice, and it allows a company to be successful without being in the lead.

In all honesty, it does make sense, because we might see a graphics card with a PPU on it now. Honestly, I'd like to see a graphics card with a multi-core GPU so we don't need SLI.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi Platinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Just another possibility for the near future: Hybrid SLI.

I don't see why they wouldn't build this feature into all of their next chipsets, as it's wonderful for our electric bills, our PCs' heat output, and noise. The thing is that, as it stands, it doesn't bring any real benefit (performance-wise) when paired with a high-end graphics card. But mix the ability to do GPU physics with Hybrid SLI, and voila!

What do you think?
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (1.00/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-Fi Platinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
DarkMatter, you have an interesting POV. It's really hard to say how this will turn out, but time will tell whether we'll actually see the PPU used in future video cards.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,277 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Now I'm confused.

Is it that NVidia will continue production of PPU cards, or will the PPU be scrapped and the API implemented so that the shaders do the physics processing? BTW, just so you know, a PhysX card at full load consumes 25W~30W. source
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (1.00/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-Fi Platinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
Now I'm confused.

Is it that NVidia will continue production of PPU cards, or will the PPU be scrapped and the API implemented so that the shaders do the physics processing? BTW, just so you know, a PhysX card at full load consumes 25W~30W.

Some are speculating that it may be scrapped (to prevent Intel from getting it), while others believe it may be incorporated into future video cards. But if that happens, it's possible it wouldn't be at the capacity we see today. As we discussed earlier, we can't find any evidence of ULi or 3dfx in the G80 or G92. Maybe it's there and its footprint is so minute that we just don't notice it.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,277 (7.69/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Future Forceware drivers with the PhysX API, so even current cards can run it? After all, we're dealing with fully-programmable shaders. The NVidia press release talks about "millions of people being able to use the API", and there is a mention of the 8800 GT, which is current.
 
Joined
Aug 9, 2006
Messages
1,065 (0.17/day)
System Name [Primary Workstation]
Processor Intel Core i7-920 Bloomfield @ 3.8GHz/4.55GHz [24-7/Bench]
Motherboard EVGA X58 E758-A1 [Tweaked right!]
Cooling Cooler Master V8 [stock fan + two 133CFM ULTRA KAZE fans]
Memory 12GB [Kingston HyperX]
Video Card(s) constantly upgrading/downgrading [prefer nVidia]
Storage constantly upgrading/downgrading [prefer Hitachi/Samsung]
Display(s) Triple LCD [40 inch primary + 32 & 28 inch auxiliary displays]
Case Cooler Master Cosmos 1000 [Mesh Mod, CFM Overload]
Audio Device(s) ASUS Xonar D1 + onboard Realtek ALC889A [Logitech Z-5300 Spk., Niko 650-HP 5.1 Hp., X-Bass Hp.]
Power Supply Corsair TX950W [aka Reactor]
Software This and that... [All software 100% legit and paid for, 0% pirated]
Benchmark Scores Ridiculously good scores!!!
Is it that NVidia will continue production of PPU cards, or will the PPU be scrapped and the API implemented so that the shaders do the physics processing?

Either way, it's not going to matter much for a while. Realistically speaking, aside from an occasional big-name title every now and then, the kind of physics Ageia does is not widespread or used that often nowadays. Although 5-10 years from now that will probably change. Heck, I wouldn't be surprised if nVidia did absolutely nothing with it for years. They sat on SLI tech, doing nothing with it, for years before implementing it themselves (and yes, I know 3dfx's SLI was not exactly the same as nVidia's implementation that followed).
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi Platinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
Now I'm confused.

Is it that NVidia will continue production of PPU cards, or will the PPU be scrapped and the API implemented so that the shaders do the physics processing? BTW, just so you know, a PhysX card at full load consumes 25W~30W. source

I would say both. They will probably continue selling PPU cards for a while and, in the meantime, create GPUs with physics shader capabilities, rather than creating an API that runs on existing shaders.

Future Forceware drivers with the PhysX API, so even current cards can run it? After all, we're dealing with fully-programmable shaders. The NVidia press release talks about "millions of people being able to use the API", and there is a mention of the 8800 GT, which is current.

Current GPUs COULD do physics, but they'd do it in software, in drivers, because they lack the specific hardware paths, or shaders as we call them. And once we start needing the CPU too much, GPU physics starts to fade as a desirable solution. PPUs are better suited for this because they have those hardware paths, just as GPUs have theirs for T&L, color blending, etc. It's like using a knife to take out screws: you can, but a screwdriver is much better suited to the job.
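To make that hardware-path vs. software-fallback point concrete, here's a toy sketch in Python. This is an illustration only, not any real PhysX or Forceware interface, and the hw_accelerator object is hypothetical:

Code:
# Toy illustration: the same physics call either hits a dedicated
# hardware path or falls back to a software loop that burns CPU time.
def step_rigid_bodies(bodies, dt, hw_accelerator=None):
    if hw_accelerator is not None:
        # Dedicated path: a PPU (or physics-capable shaders) would take
        # this work off the CPU entirely. Hypothetical interface.
        return hw_accelerator.step(bodies, dt)
    # Software fallback: plain CPU loop; cost grows with body count.
    for b in bodies:
        b["vel"] = [v + a * dt for v, a in zip(b["vel"], b["acc"])]
        b["pos"] = [p + v * dt for p, v in zip(b["pos"], b["vel"])]
    return bodies

bodies = [{"pos": [0.0, 10.0], "vel": [1.0, 0.0], "acc": [0.0, -9.8]}]
print(step_rigid_bodies(bodies, dt=0.016))  # software path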

Implementing those "physics shaders" in GPUs is easy. So what's the problem, then? Even if you implement the shaders, you need an API and a compiler. You could make your own API, but that would mean adding another one to a market that has just started, has very few followers, and where you are the underdog. It's easier to use one of the existing ones (it seems they first looked at Havok). You still have two problems with this: licensing and the compiler. The first is clear. The compiler is another matter, but it's tied to licensing too: will they let you create the compiler to suit your hardware, will they create it for you, or will they create one common to all companies and ask you to change your hardware accordingly? I guess in this situation the best thing you can do is buy the company before it takes off.

Either way, it's not going to matter much for a while. Realistically speaking, aside from an occasional big-name title every now and then, the kind of physics Ageia does is not widespread or used that often nowadays. Although 5-10 years from now that will probably change.

That picture could change in only one PhysX-capable GeForce generation. There are few games using PhysX hardware because there are probably only thousands of owners, not even hundreds of thousands, I'd dare say. But we are talking about millions of graphics cards sold every year. That's a huge user base. It wouldn't be any different from EAX: they could make a hardware version and a software version. If developers can stand out from the rest by using it, they will use it.

Huh! I made another long post... I will never learn. :slap:
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (1.00/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-Fi Platinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
Future Forceware drivers with the PhysX API, so even current cards can run it? After all, we're dealing with fully-programmable shaders. The NVidia press release talks about "millions of people being able to use the API", and there is a mention of the 8800 GT, which is current.

Sure, millions with a decent CPU can use the API; we've seen it already. If you can do it with a CPU, I'm sure a dual-core GPU solution could do the same (or some variant). However, the PhysX API won't work unless the game is made to use it, à la GRAW/GRAW2. We have seen very few games built to use PhysX, and honestly I don't see that changing for now. The best games that use it are the GRAW series and UT3 through a mod pack.

Before Ageia released their product, it was assumed that PhysX could take any existing game and enhance it in one way or another. Once it was released, that turned out not to be true at all. Then we all hoped that certain games would get some sort of mod pack. However, so far only UT3 received such a pack, and then Nvidia bought them. So it's really hard to say how things will turn out now.
 
Joined
Jun 20, 2007
Messages
3,937 (0.64/day)
System Name Widow
Processor Ryzen 7600x
Motherboard AsRock B650 HDVM.2
Cooling CPU : Corsair Hydro XC7 }{ GPU: EK FC 1080 via Magicool 360 III PRO > Photon 170 (D5)
Memory 32GB Gskill Flare X5
Video Card(s) GTX 1080 TI
Storage Samsung 9series NVM 2TB and Rust
Display(s) Predator X34P/Tempest X270OC @ 120hz / LG W3000h
Case Fractal Define S [Antec Skeleton hanging in hall of fame]
Audio Device(s) Asus Xonar Xense with AKG K612 cans on Monacor SA-100
Power Supply Seasonic X-850
Mouse Razer Naga 2014
Software Windows 11 Pro
Benchmark Scores FFXIV ARR Benchmark 12,883 on i7 2600k 15,098 on AM5 7600x
Are DMM and Euphoria technologies in their own right? Who owns the rights to them?

EDIT: Are they both owned by LucasArts?
 
Joined
Nov 29, 2007
Messages
979 (0.16/day)
Location
Netherlands
How would you push the calculations of a PPU and a GPU through one PCIe x16 slot? :confused:
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (1.00/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-Fi Platinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
Are DMM and Euphoria technologies in their own right? Who owns the rights to them?

EDIT: Are they both owned by LucasArts?

They are both owned by LucasArts. Right now they are releasing this on consoles. Listen to this (The Force Unleashed is the game being discussed).
DMM and Euphoria are CPU-intensive, which means a PPU really isn't needed. Read here for more information about it. It looks like we have until Sept 2008 before PC games can be licensed to use DMM and Euphoria. I haven't the slightest clue why they did that. IMHO, DMM and Euphoria should offer the best physics solutions for PC games to date. Make sure you watch the video.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi Platinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
I have seen those videos and there's nothing impressive about them. Ageia has far more impressive demos (which run on the CPU too, but use almost 80% of my CPU) than that DMM thing, not to mention that the whole PPU thing is not about what effects you can create, but about how many you can create. With a CPU you can make a piece of wood break into 30 pieces; with the same CPU utilisation plus a PPU or a physics-capable GPU, you could break it into thousands of pieces, and all of them would follow Newton's laws.
The second video is similar to some demos from Meqon (bought by Ageia) that I saw back in 2003-2004, featuring a similar physics-ragdoll-AI thing, and by no means better, IMHO.

Also, Crysis has something really similar in nature to DMM, but it's obvious that when many particles or objects are affected by physics, the CPU struggles to handle them unless you have a quad core.
It's commonly known that you can run great physics on the CPU; indeed, a general-purpose CPU can do physics a lot better than a PPU or GPU, but it will never catch up to what those can do in terms of the number of effects. Intel's 80-core monstrosity boasted 1 TFlop; you can achieve that with two GPUs today, and a single 9800GTX is supposed to have close to 1.5 TFlops. The PhysX processor has only 50 GFlops in comparison, yet you need the fastest quad-core Core 2 to come close to that number. You would need two of them to match a system with the Ageia processor if it were fully used. A PhysX card is $100, the fastest quad is $1400, and a 500 GFlops GPU is $200, and this difference will always be there. Imagine what they could do if you could use one of the GPUs in an SLI configuration just for physics. :eek:
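Putting the post's own numbers side by side (approximate 2008 prices and the peak-GFlops claims above, not measured figures), a few lines of Python show the gap:

Code:
# Price/throughput comparison using the figures quoted in this post
# (claimed peaks and rough street prices, all approximate).
parts = {
    "Ageia PhysX PPU":   (100, 50),
    "fastest quad-core": (1400, 50),
    "high-end GPU":      (200, 500),
}
for name, (price_usd, gflops) in parts.items():
    print(f"{name}: {gflops / price_usd:.2f} GFlops per dollar")
# GPU ~2.50, PPU 0.50, CPU ~0.04 GFlops/$ -- the gap the post points at.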

Now, physics will always need CPU power to run, as stream processors of any nature can't handle the kind of operations required for gameplay physics (physics that affect gameplay).
But I expect (I want!) great advances in effects physics in the near future. Imagine a game like FEAR: when you shoot at the walls, the chamber fills with smoke, but instead of that smoke being a handful of big particles, it's formed from thousands of smaller particles that move when someone passes through them or when you shoot. That would add to the gameplay (while not being gameplay physics), because in FEAR you couldn't see anything when you filled a room with smoke, but this way you could "see" movement through it. It would be the same difference as the Far Cry/Crysis jungle fights: in Far Cry you couldn't see enemies in the jungle; in Crysis you can, if you notice leaves moving.
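As a rough sketch of what "effects physics" means computationally, here is a minimal Python/NumPy example advancing thousands of smoke particles by one step; the particle count, drag, and buoyancy values are made up for illustration:

Code:
import numpy as np

# One Euler step for a crowd of smoke particles. Each particle is cheap;
# the cost is in the sheer count, which is what a PPU/GPU soaks up.
def step_particles(pos, vel, dt, drag=0.02, buoyancy=0.5):
    vel = vel * (1.0 - drag)        # crude air resistance
    vel[:, 1] += buoyancy * dt      # smoke drifts upward
    return pos + vel * dt, vel

n = 20_000                          # "thousands of smaller particles"
pos = np.random.rand(n, 2).astype(np.float32)
vel = np.zeros((n, 2), dtype=np.float32)
pos, vel = step_particles(pos, vel, dt=1.0 / 60.0)
print(pos.shape)                    # (20000, 2)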

I never had faith in Ageia and thought they would never succeed unless they found a way to integrate the chip into motherboards or graphics cards, or sold it for <$50. But I never questioned the need for some sort of dedicated solution for advanced physics, and I'm not talking about science classes... :p Well, let's see what comes now.
 

EastCoasthandle

New Member
Joined
Apr 21, 2005
Messages
6,885 (1.00/day)
System Name MY PC
Processor E8400 @ 3.80Ghz > Q9650 3.60Ghz
Motherboard Maximus Formula
Cooling D5, 7/16" ID Tubing, Maze4 with Fuzion CPU WB
Memory XMS 8500C5D @ 1066MHz
Video Card(s) HD 2900 XT 858/900 to 4870 to 5870 (Keep Vreg area clean)
Storage 2
Display(s) 24"
Case P180
Audio Device(s) X-Fi Platinum
Power Supply Silencer 750
Software XP Pro SP3 to Windows 7
Benchmark Scores This varies from one driver to another.
I've seen what Ageia can do and was never impressed with it, especially when you have to buy a PPU to see those effects. With DMM/Euphoria you don't have to buy a PPU, which gives them the advantage in my book.
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi Platinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
How would you push the calculations of a PPU and a GPU through one PCIe x16 slot? :confused:

How do you do it with GPU-only cards? It's the same. Plus, current cards don't desperately need more than x8; they benefit a bit from going to x16, a 5% increase or so, but x16 isn't fully used, only a bit more than x8 is needed. And taking into account that PCI Express 2.0 has double the bandwidth of PCIe 1.1, we have almost 4x the bandwidth we need available in stores. ;)
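The arithmetic behind that reply, using standard per-lane PCIe rates (250 MB/s per direction for 1.1, 500 MB/s for 2.0):

Code:
# PCIe bandwidth per direction: 250 MB/s per lane (1.1), 500 MB/s (2.0).
lane_mb_s = {"PCIe 1.1": 250, "PCIe 2.0": 500}
for gen, per_lane in lane_mb_s.items():
    for lanes in (8, 16):
        print(f"{gen} x{lanes}: {per_lane * lanes / 1000:.1f} GB/s")
# If a card is happy on PCIe 1.1 x8 (~2 GB/s), a PCIe 2.0 x16 slot
# (~8 GB/s) gives roughly the "4x the bandwidth we need" headroom.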
 

DarkMatter

New Member
Joined
Oct 5, 2007
Messages
1,714 (0.28/day)
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi Platinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
I've seen what Ageia can do and was never impressed with it, especially when you have to buy a PPU to see those effects. With DMM/Euphoria you don't have to buy a PPU, which gives them the advantage in my book.

Come on, Ageia's cloth simulations are by far more impressive than that breaking wood, especially the cloth you can crumple. And they run on the CPU too, at 30% max utilisation on my 4800+.

EDIT: For the record, I don't own an Ageia card, I never had one, and I never will.
EDIT2: Anyway, how do you know DMM and Euphoria don't still belong to their creators, Pixelux Entertainment and NaturalMotion Ltd, rather than being owned by LucasArts?
 