
PhysX will Die, Says AMD

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
34,335 (9.23/day)
Likes
17,427
Location
Hyderabad, India
System Name Long shelf-life potato
Processor Intel Core i7-4770K
Motherboard ASUS Z97-A
Cooling Xigmatek Aegir CPU Cooler
Memory 16GB Kingston HyperX Beast DDR3-1866
Video Card(s) 2x GeForce GTX 970 SLI
Storage ADATA SU800 512GB
Display(s) Samsung U28D590D 28-inch 4K
Case Cooler Master CM690 Window
Audio Device(s) Creative Sound Blaster Recon3D PCIe
Power Supply Corsair HX850W
Mouse Razer Abyssus 2014
Keyboard Microsoft Sidewinder X4
Software Windows 10 Pro Creators Update
#1
In an interview with Bit-Tech.net, Godfrey Cheng, Director of Technical Marketing in AMD's Graphics Products Group, said that standards such as PhysX would die because of their proprietary and closed nature. Says Mr. Cheng:

"There is no plan for closed and proprietary standards like PhysX," said Cheng. "As we have emphasised with our support for OpenCL and DX11, closed and proprietary standards will die."
 
Joined
Jul 17, 2008
Messages
1,062 (0.31/day)
Likes
68
System Name REBEL R1
Processor Core i7 920
Motherboard ASUS P6T
Cooling Stock
Memory 6GB OCZ GOLD TC LV Kit 1866MHz@1.65V 9-9-9-24
Video Card(s) Two Sapphire HD 5770 Vapor-X Xfire'd and OC'd (920/1330)
Storage Seagate 7200.11 500GB 32MB
Case Antec Three Hundred
Audio Device(s) ASUS Xonar D1 PCI Sound Card
Power Supply OCZ StealthXStream 500W
Software Windows 7 Ultimate 64-bit
Benchmark Scores 16585 Performance Score on 3DMark Vantage
#2
Take that, Ghost Recon!
 

AsRock

TPU addict
Joined
Jun 23, 2007
Messages
15,315 (4.00/day)
Likes
4,724
Location
US
Processor 2500k \ 3770k
Motherboard ASRock Z68 \ Z77
Memory Samsung low profile 1600
Video Card(s) XFX 6770 \ XFX R9 290X
Storage Intel 80Gb (SATA2) WD 250Gb \ Team SSD+Samsung Evo 250Gb+500Gb+ 2xCorsair Force+WD250GbHDD
Display(s) Samsung 1080P \ Toshiba HDTV 1080P
Case HTPC400 \ Thermaltake Armor case ( original ), With Zalman fan controller ( wattage usage ).
Audio Device(s) Yamaha RX-V475 \ Marantz SR5008 Tannoy Mercury MKII Paradigm 5SE + Tannoy Mercury F4
Power Supply PC&Power 750w \ Seasonic 750w MKII
Mouse MS intelimouse \ Logitech G700s + Steelseries Sensei wireless
Keyboard Logitech K120 \ ROCCAT MK Pro ( modded amber leds )
Benchmark Scores Meh benchmarks.
#3

Kreij

Senior Monkey Moderator
Staff member
Joined
Feb 6, 2007
Messages
13,817 (3.48/day)
Likes
5,524
Location
Cheeseland (Wisconsin, USA)
Processor Intel Core 2 Quad QX9650 Extreme @ 3.0 GHz
Motherboard Asus Rampage Formula
Cooling ZeroTherm Nirvana NV120 Premium
Memory 8GB (4 x 2GB) Corsair Dominator PC2-8500
Video Card(s) 2 x Sapphire Radeon HD6970
Storage 2 x Seagate Barracuda 320GB in RAID 0
Display(s) Dell 3007WFP 30" LCD (2560 x 1600)
Case Thermaltake Armor w/ 250mm Side Fan
Audio Device(s) SupremeFX 8ch Audio
Power Supply Thermaltake Toughpower 750W Modular
Software Win8 Pro x64 / Cat 12.10
#4
A bold statement from a company that can afford to be bold at the moment.
We shall see.
 
Joined
Dec 23, 2007
Messages
16,919 (4.64/day)
Likes
1,622
Location
Omaha, NE
System Name The ShadowFold Draconis (Ordering soon)
Processor AMD Phenom II X6 1055T 2.8ghz
Motherboard ASUS M4A87TD EVO AM3 AMD 870
Cooling Stock
Memory Kingston ValueRAM 4GB DDR3-1333
Video Card(s) XFX ATi Radeon HD 5850 1gb
Storage Western Digital 640gb
Display(s) Acer 21.5" 5ms Full HD 1920x1080P
Case Antec Nine-Hundred
Audio Device(s) Onboard + Creative "Fatal1ty" Headset
Power Supply Antec Earthwatts 650w
Software Windows 7 Home Premium 64bit
Benchmark Scores -❶-❸-❸-❼-
#5
You tell 'em AMD. I think they know something about PII we don't :p They are starting to get cocky. That's either a desperate act or an act of "I know we're gonna pwn you".
 

KBD

New Member
Joined
Feb 23, 2007
Messages
2,477 (0.63/day)
Likes
275
Location
The Rotten Big Apple
Processor Intel e8600 @ 4.9 Ghz
Motherboard DFI Lanparty DK X48-T2RSB Plus
Cooling Water
Memory 2GB (2 x 1GB) of Buffalo Firestix DDR2-1066
Video Card(s) MSI Radeon HD 4870 1GB OC (820/950) & tweaking
Storage 2x 74GB Velociraptors in RAID 0; 320 GB Barracuda 7200.10
Display(s) 22" Mitsubishi Diamond Pro 2070SB
Case Silverstone TJ09-BW
Audio Device(s) Creative X-Fi Titanium Fatal1ty Profesional
Power Supply Ultra X3 800W
Software Windows XP Pro w/ SP3
#6
You tell 'em AMD. I think they know something about PII we don't :p They are starting to get cocky. That's either a desperate act or an act of "I know we're gonna pwn you".
I think it's more about their GPU division, not CPU. One reason they are so bold is that, in their view, the Radeon 5000 series will whoop Nvidia's ass again; they may have a surprise on that front. Maybe buying ATI wasn't such a bad move after all; that graphics division is probably helping keep the company afloat.
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (5.94/day)
Likes
3,682
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
#7
Funny, they say proprietary standards will die, yet what is DX?

I think they may be counting their chickens before they hatch.
 

kysg

New Member
Joined
Aug 20, 2008
Messages
1,255 (0.37/day)
Likes
98
Location
Pacoima, CA
System Name Workhorse lappy
Processor AMD A6 3420
Memory 8GB DDR3 1066
Video Card(s) ATI radeon 6520G
Storage OCZ Vertex4 128GB SSD SATAIII
Display(s) 15inch LCD
Software Windows 7 64bit
#8
I think it's more about their GPU division, not CPU. One reason they are so bold is that, in their view, the Radeon 5000 series will whoop Nvidia's ass again; they may have a surprise on that front. Maybe buying ATI wasn't such a bad move after all; that graphics division is probably helping keep the company afloat.
It's not surprising though; the graphics division has been hitting its stride. Hopefully the 5 series won't just be a die-shrunk 4 series, and there will continue to be improvement for the red camp.

Funny, they say proprietary standards will die, yet what is DX?

I think they may be counting their chickens before they hatch.
Wasn't DX the only standard at the time besides OpenGL? Which really wasn't a standard.

whoops dbl post my bad.
 
Joined
Sep 26, 2006
Messages
6,957 (1.70/day)
Likes
331
Location
Australia, Sydney
#9
At least someone is being frank. PhysX has just crippled performance when enabled and not really done anything except give people a false impression of a bigger e-penis. Reviewers haven't really warmed up to it either, and neither have I.

The main issue stems from a lack of developers even bothering to conform to such proprietary standards; they want to do it their own way. CPU-based physics engines in game engines such as CryEngine 2, or even the latest Source engine, are generally sufficient.
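
For what it's worth, the per-frame work such a CPU physics engine does boils down to numerical integration over the simulated bodies. A minimal sketch in Python (illustrative only; the names and constants here are made up, this is not any engine's actual API):

```python
# Minimal sketch of what a CPU physics engine does each frame:
# semi-implicit Euler integration for one particle under gravity,
# with a simple ground-plane bounce. Illustrative only.

GRAVITY = -9.81      # m/s^2, along the y axis
RESTITUTION = 0.5    # fraction of speed kept after a bounce

def step(pos_y, vel_y, dt):
    """Advance one particle by one timestep (semi-implicit Euler)."""
    vel_y += GRAVITY * dt   # update velocity first...
    pos_y += vel_y * dt     # ...then position, using the new velocity
    if pos_y < 0.0:         # hit the ground plane at y = 0
        pos_y = 0.0
        vel_y = -vel_y * RESTITUTION
    return pos_y, vel_y

# Drop a particle from 10 m and simulate one second at 60 Hz.
y, v = 10.0, 0.0
for _ in range(60):
    y, v = step(y, v, 1.0 / 60.0)
print(f"y={y:.2f} m, v={v:.2f} m/s")
```

Real engines add broad-phase collision detection, constraint solvers and fixed timestepping on top, but a CPU physics tick is some elaboration of this loop.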

Sure, AMD is being bold and attempting to scare away Nvidia shareholders, but it's true. Havok basically beats PhysX in terms of how widely it's implemented.

"It should be noted that title support for GPU accelerated physics simulation is NOT the end game. The end game is having GPU physics as an integral part of game play and not just eye candy. If it is optional eye candy, GPU physics will not gain traction. The titles we have seen today with shattering glass and cloth waving in the wind is not integral to game play and the impact on the game's experience is minimal. We are looking for ways to integrate GPU physics better into game play. Or even things like AI instead of focusing on eye candy / effects physics."

Cheng's final words make a lot of sense and I find myself agreeing with him. We said something similar when Nvidia announced that the PC version of Mirror's Edge was delayed because of the PhysX implementation which, following a brief hands-on preview last week, does nothing but add some additional eye candy. None of it influences the actual gameplay experience.
Cannot agree more.
 
Last edited:

kysg

New Member
#10
At least someone is being frank. PhysX has just crippled performance when enabled and not really done anything except give people a false impression of a bigger e-penis. Reviewers haven't really warmed up to it either, and neither have I.

The main issue stems from a lack of developers even bothering to conform to such proprietary standards; they want to do it their own way. CPU-based physics engines in game engines such as CryEngine 2, or even the latest Source engine, are generally sufficient.
Well, this is obvious; they plan to do things their own way. When you've been doing stuff one way for a while, it really ticks people off when something new gets introduced that doesn't do squat and just makes your day longer, when you could have already been done doing it the old way.
 

Wile E

Power User
#11
At least someone is being frank. PhysX has just crippled performance when enabled and not really done anything except give people a false impression of a bigger e-penis. Reviewers haven't really warmed up to it either, and neither have I.

The main issue stems from a lack of developers even bothering to conform to such proprietary standards; they want to do it their own way. CPU-based physics engines in game engines such as CryEngine 2, or even the latest Source engine, are generally sufficient.
In GRAW 2 I found it made the game much more enjoyable, and worth the small performance hit. Sufficient doesn't cut it for me. The GPU can handle PhysX a hell of a lot better than even the fastest quad can. I want GPU-accelerated physics to become the norm. The CPU just doesn't cut it anymore.
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
24,280 (5.51/day)
Likes
10,375
Location
Indiana, USA
Processor Intel Core i7 4790K@4.6GHz
Motherboard AsRock Z97 Extreme6
Cooling Corsair H100i
Memory 32GB Corsair DDR3-1866 9-10-9-27
Video Card(s) ASUS GTX960 STRIX @ 1500/1900
Storage 480GB Crucial MX200 + 2TB Seagate Solid State Hybrid Drive with 128GB OCZ Synapse SSD Cache
Display(s) QNIX QX2710 1440p@120Hz
Case Corsair 650D Black
Audio Device(s) Onboard is good enough for me
Power Supply Corsair HX850
Software Windows 10 Pro x64
#12
Proprietary is used in the loosest way possible here, considering Nvidia has expressed that they are more than willing to help get it running on ATI's hardware. ATI is forcing it to be proprietary by refusing to work with Nvidia to get it working.

The performance hit is going to happen regardless of what API is used to create the physics. If they are both creating the same level of physics, the performance hit will be the same, as the graphics cards are asked to render more on the screen due to the physics.

Edit: Of course I hope AMD realizes that they have kind of screwed themselves by saying that. History shows that when a company says their competition's product will fail, the product usually becomes wildly popular.
 
Last edited:
Joined
May 19, 2007
Messages
7,662 (1.98/day)
Likes
536
Location
c:\programs\kitteh.exe
Processor C2Q6600 @ 1.6 GHz
Motherboard Anus PQ5
Cooling ACFPro
Memory GEiL2 x 1 GB PC2 6400
Video Card(s) MSi 4830 (RIP)
Storage Seagate Barracuda 7200.10 320 GB Perpendicular Recording
Display(s) Dell 17'
Case El Cheepo
Audio Device(s) 7.1 Onboard
Power Supply Corsair TX750
Software MCE2K5
#13
Proprietary is used in the loosest way possible here, considering Nvidia has expressed that they are more than willing to help get it running on ATI's hardware. ATI is forcing it to be proprietary by refusing to work with Nvidia to get it working.

The performance hit is going to happen regardless of what API is used to create the physics. If they are both creating the same level of physics, the performance hit will be the same, as the graphics cards are asked to render more on the screen due to the physics.
You know what they meant... a standard in which none of them has an overly powerful controlling interest.
 

WarEagleAU

Bird of Prey
Joined
Jul 9, 2006
Messages
10,809 (2.59/day)
Likes
529
Location
Gurley, AL
System Name Boddha Getta Boddha Getta Bah!
Processor AMD FX 6100 @ 4.432Ghz @1.382
Motherboard ASUS M5A99X EVO AMD 990X AMD SB950
Cooling Custom Water. EK 240MM Kit, Supreme HSF - Runs 35C
Memory 2 x 4GB Corsair Vengeance White LP @ 1.35V
Video Card(s) XFX Radeon HD 6870 980/1100
Storage WD Caviar Black 1.0TB, WD Caviar Green 1.0TB, WD 160GB
Display(s) Asus VH222/S 22: (21.5" Viewable) 1920x1080p HDMI LCD Monitor
Case NZXT White Switch 810
Audio Device(s) Onboard Realtek 5.1
Power Supply NZXT Hale 90 Gold Cert 750W Modular PSU
Software Windows 8.1 Profession 64 Bit
#14
Proprietary in that it's not easily programmable across a wide range of hardware. Kind of like Dell hardware used to be: you couldn't swap anything out, it had to be Dell-specific (as an example). I think it is bold and cocky, and I like it. Will it succeed? We shall see. I don't think ATI is hurting themselves here either.
 
Joined
Apr 7, 2008
Messages
632 (0.18/day)
Likes
64
Location
Australia
System Name _Speedforce_ (Successor to Strike-X, 4LI3NBR33D-H, Core-iH7 & Nemesis-H)
Processor Intel Core i9 7900X (Lapped) @ 4.9Ghz With XSPC Raystorm (Lapped)
Motherboard Asus Prime X299 Deluxe (XSPC Watercooled) - Custom Heatsinks
Cooling XSPC Custom Water Cooling + Custom Air Cooling (From Delta 120's TFB1212GHE to Spal 30101504&5)
Memory 8x 8Gb Corsair Dominator Platinum 3400MHz @ 3667Mhz (CMU32GX4M4C3466C16)
Video Card(s) 3x Asus GTX1080 Ti (Lapped) With Customised EK Waterblock (Lapped) + Custom heatsinks (Lapped)
Storage 5x Samsung 960 Pro 1Tb M.2 2280 (Hyper M.2 x16 Card), 1x Samsung 850 Pro 1Tb, 6x Samsung EVO 850 4Tb
Display(s) 6x Asus ROG Swift PG27AQ
Case Aerocool Strike X (Modified)
Audio Device(s) Creative Sound BlasterX AE-5 & Aurvana XFi Headphones
Power Supply 2x Corsair AX1500i With Custom Sheilding, Custom Switching Unit. Braided Cables.
Mouse Razer Copperhead + R.A.T 9
Keyboard Ideazon Zboard + Optimus Maximus. Logitech G13.
Software w10 Pro x64.
Benchmark Scores pppft, gotta see it to believe it. . .
#15
I really enjoyed playing co-op GRAW, and ever since I've been awaiting the release of more co-op campaign gameplay for those Friday or Saturday night LAN sessions with the guys. I have to admit GRAW 2's PhysX wasn't the best I've seen, but taking into consideration the game's age and the official release date of the AGEIA PhysX P1 cards, I'm more inclined to think... whatever... What matters was the enjoyable hours of fun played.

Since the GRAW 2 production days, PhysX has come a long way. This can be witnessed via the many examples out there on the internet, whether it be a fluid demo, a particle demo, a ripping flag or my balls bouncing off each other. Either way, the realism it provides is a vital step. EA and 2K seem to think so.

Enabling PhysX reduces performance on lower-end systems and/or systems missing the required hardware. Of course we can get the CPU to run the PhysX stuff, but what's going to run everything else...

Cheng and all of AMD are scared that PhysX will evolve to the only next step it has: becoming part of the AI and the gameplay.
PhysX can't get worse, and we all know this technology will eventually evolve. Simulated shattering and ripping is second grade and will never amount to the best.

I wonder how the 295GTX will cope with all this.
 

Mussels

Moderprator
Staff member
Joined
Oct 6, 2004
Messages
46,120 (9.57/day)
Likes
13,550
Location
Australalalalalaia.
System Name Daddy Long Legs
Processor Ryzen R7 1700, 3.9GHz 1.375v
Motherboard MSI X370 Gaming PRO carbon
Cooling Fractal Celsius S24 (Silent fans, meh pump)
Memory 16GB 2133 generic @ 2800
Video Card(s) MSI GTX 1080 Gaming X (BIOS modded to Gaming Z - faster and solved black screen bugs!)
Storage 1TB Intel SSD Pro 6000p (60TB USB3 storage)
Display(s) Samsung 4K 40" HDTV (UA40KU6000WXXY) / 27" Qnix 2K 110Hz
Case Fractal Design R5. So much room, so quiet...
Audio Device(s) Pioneer VSX-519V + Yamaha YHT-270 / sennheiser HD595/518 + bob marley zion's
Power Supply Corsair HX 750i (Platinum, fan off til 300W)
Mouse Logitech G403 + KKmoon desk-sized mousepad
Keyboard Corsair K65 Rapidfire
Software Windows 10 pro x64 (all systems)
Benchmark Scores Laptops: i7-4510U + 840M 2GB (touchscreen) 275GB SSD + 16GB i7-2630QM + GT 540M + 8GB
#16
A lot of people don't seem to be getting it.

PhysX is an Nvidia-only way of doing this.
DirectX 11 is doing an open (any video card) version of this.

ATI/AMD are saying that Nvidia's closed one will die, and the open version will live on.
 

Swansen

New Member
Joined
Nov 18, 2007
Messages
182 (0.05/day)
Likes
9
#17
EA and 2K seem to think so.
I wonder how the 295GTX will cope with all this.
I tend not to follow anything EA does, as they generally destroy anything they touch. The performance hit is a big deal for most people, as many don't buy bleeding-edge graphics cards because they are too expensive. Eye candy is cool, but I hardly think many will miss a flag blowing in the wind. I think this all goes to further a problem I've seen as of late: game developers focusing on the wrong things. Gameplay should always come first; everything else comes after.

DirectX 11 is doing an open (any video card) version of this.
Lol, no I get what they were saying, it's just a REALLY poor way of wording it, as DX is closed-source software. I think most people are missing the fact that DX11 will have physics support built in, which is very cool... as long as it's done right.
 

eidairaman1

The Exiled Airman
Joined
Jul 2, 2007
Messages
19,224 (5.03/day)
Likes
4,811
System Name PCGOD
Processor AMD FX 8350@ 5.0GHz
Motherboard Asus TUF 990FX Sabertooth R2 2901 Bios
Cooling Scythe Ashura, 2×BitFenix 230mm Spectre Pro LED (Blue,Green), 2x BitFenix 140mm Spectre Pro LED
Memory 16 GB Gskill Ripjaws X 2133 (2400 OC, 10-10-12-20-20, 1T, 1.65V)
Video Card(s) AMD Radeon 290 Sapphire Vapor-X
Storage Samsung 840 Pro 256GB, WD Velociraptor 1TB
Display(s) NEC Multisync LCD 1700V (Display Port Adapter)
Case AeroCool Xpredator Evil Blue Edition
Audio Device(s) Creative Labs Sound Blaster ZxR
Power Supply Seasonic 1250 XM2 Series (XP3)
Mouse Roccat Kone XTD
Keyboard Roccat Ryos MK Pro
Software Windows 7 Pro 64
#18
Proprietary is used in the loosest way possible here, considering Nvidia has expressed that they are more than willing to help get it running on ATI's hardware. ATI is forcing it to be proprietary by refusing to work with Nvidia to get it working.

The performance hit is going to happen regardless of what API is used to create the physics. If they are both creating the same level of physics, the performance hit will be the same, as the graphics cards are asked to render more on the screen due to the physics.

Edit: Of course I hope AMD realizes that they have kind of screwed themselves by saying that. History shows that when a company says their competition's product will fail, the product usually becomes wildly popular.
Sort of like Nvidia saying the 4830 is defective when it's not. And before that, the head of NV said they underestimated the RV770.
 
#19
I tend not to follow anything EA does, as they generally destroy anything they touch. The performance hit is a big deal for most people, as many don't buy bleeding-edge graphics cards because they are too expensive. Eye candy is cool, but I hardly think many will miss a flag blowing in the wind. I think this all goes to further a problem I've seen as of late: game developers focusing on the wrong things. Gameplay should always come first; everything else comes after.

Lol, no I get what they were saying, it's just a REALLY poor way of wording it, as DX is closed-source software. I think most people are missing the fact that DX11 will have physics support built in, which is very cool... as long as it's done right.
Very well spoken there...
 
Joined
Nov 13, 2007
Messages
6,142 (1.67/day)
Likes
1,636
Location
Austin Texas
System Name silen8
Processor Intel i7 7820X Delidded @ 4.64Ghz / 3.1Ghz Mesh
Motherboard MSI X299 Tomahawk
Cooling 240mm Corsair H105 Intake
Memory 32 GB Quad 3434Mhz DDR4 15-16-16-38-300-1T
Video Card(s) Gigabyte GTX 1080 Ti Gaming
Storage 1Tb Samsung 960 Pro m2, 1TB Samsung 850 Pro SSD
Display(s) Dell 24" 2560x1440 144hz, G-Sync @ 165Hz
Case NZXT S340 Elite Black
Audio Device(s) Arctis 7
Power Supply FSP HydroG 750W
Mouse zowie ec-2
Keyboard corsair k65 tenkeyless
Software Windows 10 64 Bit
Benchmark Scores Cb: 2103 Multi, 209 Single, 10450 Timespy - 10150 GPU/11900 CPU, superpi 1M - 7.71s
#20
Is this anything like the time AMD said something about their 'true' quad-core being faster than two Core 2s glued together? :nutkick: I think he's right - but ONLY if the DX11 way ACTUALLY works like it's supposed to... which is a big if.
 
Last edited:
Joined
Aug 23, 2006
Messages
57 (0.01/day)
Likes
1
Processor Intel Core 2 Quad QX9770 Yorkfield 4.00GHz
Motherboard Asus P5E3 Deluxe/WiFi-AP X38 Chipset Motherboard
Cooling Cooler Master Hyper 212 CPU Heatsink| Fans: Intake 1x120mm and 2x140mm| Exhaust 1x120mm and 2x140mm
Memory 4GB OCZ Platinum DDR3 1600 7-7-7-26
Video Card(s) 2 x Diamond Multimedia HD 4870 512MB Graphics Cards in CrossfireX
Storage 2 Western Digital 500GB 32MB Cache Caviar Blacks in RAID 0| 1 500GB 32MB Cache Seagate Barracuda.
Display(s) Sceptre X24WG 24" 1920x1200 4000:1 2ms LCD Monitor
Case Cooler Master CM 690
Audio Device(s) HT Omega HT Claro+
Power Supply Aerocool 750W Horsepower PSU
Software Windows Vista Home Premium x64
#21
Funny, they say proprietary standards will die, yet what is DX?

I think they may be counting their chickens before they hatch.
Well, look at DX pre-9: no one wanted to use it because OpenGL was a lot easier to use to achieve the same results. It wasn't until 9 that it was viewed as a worthy API. There is a difference between when something proprietary is received well and when there is a generally easy-to-use alternative. In this case DX9 onward offered an ease of use and feature set that developers liked. PhysX is just kind of there, offering what can already be done with alternatives. Alternatives that work with more systems and are free.
 
Joined
Jan 31, 2005
Messages
1,625 (0.35/day)
Likes
552
Location
The Pico Mundo Grill
System Name Commercial towing vehicle Nostromo
Processor R7 1700X Base Clock @4.000.000.000 Hz
Motherboard Crosshair Hero VI (BIOS 1701)
Cooling Hydro H110i V2 High Performance
Memory 2x8 GB Dominator CMD16GX4M2B3200C16 v4.31
Video Card(s) 970 STRIX
Storage 960 EVO M2 500 GB w. EKWB EK-M.2 NVMe nickel heatsink / UV400 480 GB / Red PRO 4 TB
Display(s) VG248QE
Case HAF XB
Audio Device(s) Onboard / JDS Labs O2 AMP / K550 headphones
Power Supply AX 860
Mouse KANA
Keyboard K60
Software Win 10 Pro x64 / KIS 2018
Benchmark Scores 2141 lightyears per hour.......
#22
I hope that game engine´s (like the Source engine from Valve) will gain upperhand in this battle - this way no one have to think about buying a specific piece of hardware to get the Physics pling-bing
 
Joined
Oct 5, 2007
Messages
1,714 (0.46/day)
Likes
182
Processor Intel C2Q Q6600 @ Stock (for now)
Motherboard Asus P5Q-E
Cooling Proc: Scythe Mine, Graphics: Zalman VF900 Cu
Memory 4 GB (2x2GB) DDR2 Corsair Dominator 1066Mhz 5-5-5-15
Video Card(s) GigaByte 8800GT Stock Clocks: 700Mhz Core, 1700 Shader, 1940 Memory
Storage 74 GB WD Raptor 10000rpm, 2x250 GB Seagate Raid 0
Display(s) HP p1130, 21" Trinitron
Case Antec p180
Audio Device(s) Creative X-Fi PLatinum
Power Supply 700W FSP Group 85% Efficiency
Software Windows XP
#23
Once again they come with the open-standard excuse and LIES? Congratulations AMD, you finally made me an Nvidia fanboy; at least they are HONEST about their intentions. There's nothing else that I hate more than LIES, and that is a big lie and twisted misinformation. Everything AMD/ATI is saying now amounts to:
"We won't support a standard where Nvidia is faster until we have something to compete with."

And open your eyes, guys: Nvidia IS and will probably stay faster at physics calculations, because they made changes to the architecture, like a more CPU-like caching system, ultra-light branch prediction, etc. Not exactly branch prediction, but something that palliates the effects of lacking one. THAT's why Nvidia cards are faster in F@H, for example, where more than number crunching is required. At simple number crunching ATI is faster: video conversion.

That advantage of Nvidia's would apply to PhysX, OpenCL, DX11 physics or ANY hardware physics API you'd want to throw in. Their claim is just an excuse until they can prepare something. For instance, they say they support Havok. Intel OWNS Havok. Open standards? Yeah, sure.


One more thing: PhysX is a physics API and middleware, with some bits of an engine here and there just like Havok, that can RUN on various platforms unchanged: Ageia PPUs, x86 CPUs, the Cell microprocessor, CUDA and potentially any PowerPC. It does not run directly on Nvidia GPUs; it runs through CUDA on Nvidia cards. Once OpenCL is out, PhysX will be possible through OpenCL just as well as through CUDA. As long as ATI has good OpenCL support there shouldn't be any problems; until then they could make PhysX run through CAL/Stream for FREE, BUT they don't want to, because it would be slower. It's as simple as that.
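
The middleware point is easier to see in code. A toy sketch (invented names, nothing like PhysX's real interfaces): the game talks to one physics API, and the backend that actually executes the math is swapped underneath.

```python
# Toy sketch of physics middleware with swappable backends (invented names;
# not PhysX's real API). The game only talks to PhysicsScene; whether the
# math runs on a CPU loop, a PPU, or a GPU kernel is a backend detail.

class CpuBackend:
    """Plain CPU loop over the bodies."""
    def integrate(self, bodies, dt):
        for b in bodies:
            b["vy"] += -9.81 * dt
            b["y"] += b["vy"] * dt

class FakeGpuBackend:
    """Stand-in for a CUDA/OpenCL backend: a real one would upload the
    batch and launch a kernel; here we do the identical update in one pass."""
    def integrate(self, bodies, dt):
        for b in bodies:
            b["vy"] += -9.81 * dt
            b["y"] += b["vy"] * dt

class PhysicsScene:
    """The API the game sees; it never knows which backend runs the math."""
    def __init__(self, backend):
        self.backend = backend
        self.bodies = []

    def add_body(self, y):
        self.bodies.append({"y": y, "vy": 0.0})

    def step(self, dt):
        self.backend.integrate(self.bodies, dt)

# The same game code works against either backend.
for backend in (CpuBackend(), FakeGpuBackend()):
    scene = PhysicsScene(backend)
    scene.add_body(y=100.0)
    for _ in range(10):
        scene.step(1.0 / 60.0)
    print(type(backend).__name__, round(scene.bodies[0]["y"], 3))
```

Swap in a CUDA, OpenCL or CAL/Stream backend and the game code above it doesn't change; that's the sense in which the API is portable even when one backend is vendor-specific.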

Another lie there, which you have to love, is the claim that PhysX is just being used for eye candy. It IS being used only for that, AMD, yeah, but tell us WHY. Because developers have been told hardware physics will not be supported on ATI GPUs until DX11, that's why. Because they are working hard along with Intel to make that statement true. Nvidia has many demos where PhysX is used for a lot more, so it can be done.

AMD is just double-acting. It's a shame, ATI/AMD, a shame; I remember the days when you were honest. I know bad times and experiences change personalities, but this is inexcusable, as is the fact that your whole advertising campaign has been based on bashing every initiative made by Nvidia instead of making yours better.

This sentence sums it all up (speaking of Havok on their GPUs):

Our guidance was end of this year or early next year but, first and foremost, it will be driven by the milestones that we hit. To put some context behind GPU based physics acceleration, it is really just at the beginning. Like 3D back in the early 1990s. Our competition has made some aggressive claims about support for GPU physics acceleration by the end of this year. I.e. Support in many titles....but we can count the titles on one hand or just one or two fingers.
Like back in the '90s, because they are facing competition in something they can't compete in, they are downplaying it. They know it's a great thing, they know it's the future, but they don't want that future to kick-start yet. YOU SIMPLY CAN'T DOWNPLAY SOMETHING AND SAY IT WILL DIE WHILE AT THE SAME TIME YOU'RE WORKING HARD ON YOUR OWN BEHIND THE CURTAINS!! AND USING INTEL'S HAVOK!!!
We know how the history of accelerated graphics evolved: despite what they said back then, the GPU has become the most important thing, and so will hardware physics. Just as back then, they are just HOLDING BACK the revolution until they can be part of it. Clever from a marketing standpoint, but DISHONEST. You won't have my support, ATI; you already pulled down another thing that I liked a lot: Stereo 3D. You can cheat me once, but not more.
 
Last edited:
Joined
Jan 29, 2006
Messages
241 (0.06/day)
Likes
24
System Name Home
Processor Q6600 @ 3300
Motherboard Gigabyte p31 ds3l
Cooling TRUE Intel Edition
Memory 4 gb x 800 mhz
Video Card(s) Asus GTX 560
Storage WD 1x250 gb Seagate 2x 1tb
Display(s) samsung T220
Case no name
Audio Device(s) onboard
Power Supply chieftec 550w
Software Windows 7 64
#24
You are right, Darkmatter; it's true that when DX11 is available PhysX will become obsolete or slowly die. But until then, pray, AMD/ATI, that Nvidia doesn't get more developers to use PhysX; it's a cool thing, and for the eye candy it adds the performance impact isn't that big - it's worth the performance loss.
I for one think this could kill AMD for good: if 2-3 big games launch with some PhysX feature and the difference between them is big, it could kill AMD's graphics department forever.
Nvidia could continue to battle AMD in 3DMarks and game performance, but that seems set to go on for a long time, and one advantage like this could end the competition a little faster.
I feel sorry for them; Intel is kicking their asses, now Nvidia too. Second place forever for AMD.
 

brian.ca

New Member
Joined
Nov 1, 2007
Messages
71 (0.02/day)
Likes
14
#25
PhysX is not going to kill anyone... it's DX10.1 all over again. No one is going to make/sell a game relying on this thing if it's going to screw over a good portion of the market, or leave them out of any significant part of the game. Unless AMD/ATI supports PhysX it will never be anything more than optional eye candy, and that limited role will limit it as a factor in people's buying decisions (would you pay $500 for a new card instead of $300 so you can get more broken glass?).

Some of you are talking about studios and games adopting PhysX, but what have you seen so far? Mirror's Edge seems to do a bit of showcase work for PhysX, but how many would really buy that game? It being an EA game through and through, I personally have a hard time believing the game would offer anything more than what I could get watching the trailer or some demos. Otherwise I haven't seen PhysX do anything that Havok wasn't already doing. There's no extra edge here. There are probably some incentives from Nvidia - but then that comes back to what this guy was saying in the first place:

"We cannot provide comments on our competitor's business model except that it is awfully hard to sustain support by monetary incentives. The product itself must be competitive. We view Havok technologies and products to be the leaders in physics simulation and this is why we are working with them. Games developers share this view."

If they can market these technologies to all those big goons (Adobe, Microsoft, Sun Micro, Apple...) and create user-friendly apps, boom! ATI would be certified dead in no time
This lends itself to why this stuff won't kill ATI... at the end of that list is Apple: didn't they put together the OpenCL standard? Which do you think they'll be pushing, CUDA or their own standard? Microsoft will be pushing its own thing with DX11 down the road. Adobe recently took advantage of Nvidia's CUDA and ATI's Stream, if I'm not mistaken... but do you think they'll want to keep making two versions of their products when other people are pushing for a unified standard?

Of course I hope AMD realizes that they have kind of screwed themselves by saying that. History shows that when a company says their competition's product will fail, the product usually becomes wildly popular.
I guess this is all moot anyway then... if AMD responding to an Nvidia announcement for a reporter will guarantee success for PhysX, then surely the grandstanding Nvidia took part in vs. Intel will have Larrabee burying all competition.


At the end of the day people may not like what this guy is saying, why, or how, but it's true. AMD is not going to support Nvidia's proprietary APIs (and why the hell would they?), and without that support other companies will have less incentive to get on board unless Nvidia provides it. That requires either a superior product or, more likely, cash incentives. Now realistically, when the immediate alternatives to Nvidia's system are OpenCL (Apple - but open), DirectX 11 (Microsoft), and Havok (Intel), do you think those other standards won't have more resources behind them to provide both of those things than Nvidia does? If you were in AMD's shoes, who would you side with? They could do it all, but seriously... why? It'd just confuse other efforts and probably waste their own resources, and for what? To better prop up competition that they can beat? So they can claim some moral high ground when they recall how NV made DX10.1 completely moot?
 
Last edited: