
NVIDIA Removes Restriction on ATI GPUs with NVIDIA GPUs Processing PhysX

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.18/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W))
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Phillips 32 32M1N5800A (4k144), LG 32" (4K60) | Gigabyte G32QC (2k165) | Phillips 328m6fjrmb (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
That was a looooong time ago. Yes, GPU drivers from 2005 worked on my Radeon 9600, and games from that era too, but try running them today... They may work in many cases, but you are surely going to find a lot of problems. As a company, Nvidia just wants to steer clear of any problems of that nature. Plain and simple.

So come up with a toggle in the driver options to switch a video card from GPU to PPU/CUDA card, so that the video drivers turn off and only CUDA (and the apps that use it) remain.

Build in a safeguard so that it can't be used if a monitor is connected to the card, and away you go, back to the Ageia days.


The only reason nvidia are doing this is that they've done so much dodgy shit disabling features in the name of PhysX (such as with Batman AA) that people might find out, *gasp*, that in fact they just disable it on ATI even if PhysX is working.
 

TheMailMan78

Big Member
Joined
Jun 3, 2007
Messages
22,599 (3.66/day)
Location
'Merica. The Great SOUTH!
System Name TheMailbox 5.0 / The Mailbox 4.5
Processor RYZEN 1700X / Intel i7 2600k @ 4.2GHz
Motherboard Fatal1ty X370 Gaming K4 / Gigabyte Z77X-UP5 TH Intel LGA 1155
Cooling MasterLiquid PRO 280 / Scythe Katana 4
Memory ADATA RGB 16GB DDR4 2666 16-16-16-39 / G.SKILL Sniper Series 16GB DDR3 1866: 9-9-9-24
Video Card(s) MSI 1080 "Duke" with 8Gb of RAM. Boost Clock 1847 MHz / ASUS 780ti
Storage 256Gb M4 SSD / 128Gb Agelity 4 SSD , 500Gb WD (7200)
Display(s) LG 29" Class 21:9 UltraWide® IPS LED Monitor 2560 x 1080 / Dell 27"
Case Cooler Master MASTERBOX 5t / Cooler Master 922 HAF
Audio Device(s) Realtek ALC1220 Audio Codec / SupremeFX X-Fi with Bose Companion 2 speakers.
Power Supply Seasonic FOCUS Plus Series SSR-750PX 750W Platinum / SeaSonic X Series X650 Gold
Mouse SteelSeries Sensei (RAW) / Logitech G5
Keyboard Razer BlackWidow / Logitech (Unknown)
Software Windows 10 Pro (64-bit)
Benchmark Scores Benching is for bitches.
I don't buy any PR crap; I've been saying this for a long, long time, even before they said anything, because I know for a fact that things work that way. I'm not in the GPU or driver business, but I've been there, so I know what it's about. It doesn't matter whether the QA turns out to be that important in the end, or whether it works at all; they have to do it, because in many countries it's obligatory. If they spent time and money and it doesn't work, no worries, but oh friend, if it doesn't work and no QA was done... be prepared.

And they just don't want to spend money on QA for something that is not really in their hands. A lot of that QA has to be done on AMD's end, and they will just not do it. Even when only Nvidia cards are used, every PhysX driver update needs the latest GPU driver as well, or everything soon gets fucked up; that's something I have suffered myself. So a mix of Ati and Nvidia is always going to be worse.

Now, the idea of allowing it in the beta... that could work, but there's still the fact that it would not work on Vista systems, and that's a nightmare to explain to the average Joe, and it wouldn't be very different from the hack anyway. The hack probably has more support than the beta as far as Ati+Nvidia setups go.
Yeah, and one small disclaimer on the box would cover them in most countries, if they are that chicken shit. Like I said, if one hacker can make it work great, then why can't a billion-dollar company? It's PR BS, plain and simple. If you still don't think so, then you should sue every game developer in the world for not making games 100% compatible with EVERY combination of hardware.

The QA was already done... AGEIA PPUs worked on ATI, nvidia, SiS, Matrox, etc.
Yup. Now tell me AGEIA was a better-funded company than Nvidia. :rolleyes:
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.79/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
That was a looooong time ago. Yes, GPU drivers from 2005 worked on my Radeon 9600, and games from that era too, but try running them today... They may work in many cases, but you are surely going to find a lot of problems. As a company, Nvidia just wants to steer clear of any problems of that nature. Plain and simple.

No, they just wanted to force people to use only nVidia hardware, plain and simple. All they had to do was allow PhysX to run in coprocessor mode, like the original PPU, if they were worried about conflicting video drivers.

Hell, they don't even have to go that far. They can simply say mixed-GPU solutions are not officially supported, and wash their hands of the imagined support costs.

And none of that explains why they let it go on for so long before deciding to cut it out.

It has nothing to do with support at all. It's nVidia being unhappy with the situation and taking their ball and going home.

So come up with a toggle in the driver options to switch a video card from GPU to PPU/CUDA card, so that the video drivers turn off and only CUDA (and the apps that use it) remain.

Build in a safeguard so that it can't be used if a monitor is connected to the card, and away you go, back to the Ageia days.


The only reason nvidia are doing this is that they've done so much dodgy shit disabling features in the name of PhysX (such as with Batman AA) that people might find out, *gasp*, that in fact they just disable it on ATI even if PhysX is working.

Stop using Batman as an example. It's a poor one and doesn't support your arguments at all. We've been over this a million times. That is one place nV was not wrong. The AA in Batman does not work properly on ATI, even when you force it.
 
Joined
Jul 19, 2006
Messages
43,587 (6.72/day)
Processor AMD Ryzen 7 7800X3D
Motherboard ASUS TUF x670e
Cooling EK AIO 360. Phantek T30 fans.
Memory 32GB G.Skill 6000Mhz
Video Card(s) Asus RTX 4090
Storage WD m.2
Display(s) LG C2 Evo OLED 42"
Case Lian Li PC 011 Dynamic Evo
Audio Device(s) Topping E70 DAC, SMSL SP200 Headphone Amp.
Power Supply FSP Hydro Ti PRO 1000W
Mouse Razer Basilisk V3 Pro
Keyboard Tester84
Software Windows 11
Stop using Batman as an example. It's a poor one and doesn't support your arguments at all. We've been over this a million times. That is one place nV was not wrong. The AA in Batman does not work properly on ATI, even when you force it.

It does work rather well with a PhysX card. Both the AA and the PhysX. So is the AA being run exclusively through my GT 240?
 

TheMailMan78

Big Member
No, they just wanted to force people to use only nVidia hardware, plain and simple. All they had to do was allow PhysX to run in coprocessor mode, like the original PPU, if they were worried about conflicting video drivers.

Hell, they don't even have to go that far. They can simply say mixed-GPU solutions are not officially supported, and wash their hands of the imagined support costs.

And none of that explains why they let it go on for so long before deciding to cut it out.

It has nothing to do with support at all. It's nVidia being unhappy with the situation and taking their ball and going home.



Stop using Batman as an example. It's a poor one and doesn't support your arguments at all. We've been over this a million times. That is one place nV was not wrong. The AA in Batman does not work properly on ATI, even when you force it.
The AA works fine when you force it. I even made a thread on it.
 


Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
so come up with a toggle in the driver options to switch a video card from GPU to PPU/CUDA card, so that video drivers turn off and only CUDA (and apps that use it) remain.

Set in a safeguard so that it cant be used if a monitor is connected to the card, and away you go, back to the Ageia days.

Like I said, GPU drivers and PhysX drivers are closely tied; they can't do that.

Yeah, and one small disclaimer on the box would cover them in most countries, if they are that chicken shit. Like I said, if one hacker can make it work great, then why can't a billion-dollar company? It's PR BS, plain and simple. If you still don't think so, then you should sue every game developer in the world for not making games 100% compatible with EVERY combination of hardware.

Like I said, because that hacker has never QA'd it. Thousands of people on the internet have done it for free. If the hacker had to pay everyone who helped him make the mod stable, he would need to be a multibillion-dollar company and have a real interest in spending that much on it too. Besides, they do a lot more reverse engineering at NGOHQ than is "socially acceptable" in the business world, if you know what I mean. They have much more (real, applicable) access to Ati hardware than any company will ever have.

Yup. Now tell me AGEIA was a better-funded company than Nvidia. :rolleyes:

yup, and Ageia went bankrupt...

Nvidia struggles to stay in the green. Hell, even Ati struggles, and the reason Stream never took off is that they simply didn't want to put money into it. Same for OpenCL right now, or GPU-accelerated Havok, or countless other examples. Just because Nvidia has money doesn't mean they have to let it go down the drain.
 

Wile E

Power User
Like I said, because that hacker has never QA'd it. Thousands of people on the internet have done it for free. If the hacker had to pay everyone who helped him make the mod stable, he would need to be a multibillion-dollar company and have a real interest in spending that much on it too. Besides, they do a lot more reverse engineering at NGOHQ than is "socially acceptable" in the business world, if you know what I mean. They have much more (real, applicable) access to Ati hardware than any company will ever have.



yup, and Ageia went bankrupt...

Nvidia struggles to stay in the green. Hell, even Ati struggles, and the reason Stream never took off is that they simply didn't want to put money into it. Same for OpenCL right now, or GPU-accelerated Havok, or countless other examples. Just because Nvidia has money doesn't mean they have to let it go down the drain.
It was working fine before nV blocked it. We didn't even need the hacker.
 

Mussels

Freshwater Moderator
Staff member
Stop using Batman as an example. It's a poor one and doesn't support your arguments at all. We've been over this a million times. That is one place nV was not wrong. The AA in Batman does not work properly on ATI, even when you force it.

Not just AA, the items/debris that suddenly appear as well with PhysX on. It's got nothing to do with PhysX since, IIRC, it was in the console versions.


I use it as an example because I keep hearing crap about it from nvidia users... and AA works just fine in it on ATI if you force it via CCC.
 

Benetanegia

New Member
It was working fine before nV blocked it. We didn't even need the hacker.

It was working back then, and it might work now, just like I can take my 9600 Pro out of the closet, grab its driver CD and play many games. Why the hell do they release GPU drivers every month? The previous ones work just fine...

They don't want to have to worry EVER. Period. That's why you just cut it off. The other option is to enable it and watch the web flood with complaints. And don't say there would be no complaints, because there have been many complaints about far more irrelevant things than poor fps in a certain game, which is the first symptom that would be noticed.

Not just AA, the items/debris that suddenly appear as well with PhysX on. It's got nothing to do with PhysX since, IIRC, it was in the console versions.


I use it as an example because I keep hearing crap about it from nvidia users... and AA works just fine in it on ATI if you force it via CCC.

When forced from CCC it's not Nvidia's AA, it's the normal supersampling AA that is always possible.
 

Mussels

Freshwater Moderator
Staff member
It was working back then, and it might work now, just like I can take my 9600 Pro out of the closet, grab its driver CD and play many games. Why the hell do they release GPU drivers every month? The previous ones work just fine...

They don't want to have to worry EVER. Period. That's why you just cut it off. The other option is to enable it and watch the web flood with complaints. And don't say there would be no complaints, because there have been many complaints about far more irrelevant things than poor fps in a certain game, which is the first symptom that would be noticed.



When forced from CCC it's not Nvidia's AA, it's the normal supersampling AA that is always possible.

No one cares that nvidia's special optimised AA mode is disabled, just that they blocked AA on ATI cards in the first place (why not allow ATI to have supersampling AA in the in-game options?).
 

Wile E

Power User
No one cares that nvidia's special optimised AA mode is disabled, just that they blocked AA on ATI cards in the first place (why not allow ATI to have supersampling AA in the in-game options?).

Doesn't work properly on the hardware. That was proven a long time ago with back-to-back screenshots with it enabled on ATI, when people first started complaining about it.

Or if you mean the regular SSAA, it's because it's not in the engine at all. It would just be the equivalent of forcing it through the CCC anyway.
 

Mussels

Freshwater Moderator
Staff member
Doesn't work properly on the hardware. That was proven a long time ago with back-to-back screenshots with it enabled on ATI, when people first started complaining about it.

Screenshots aren't the best way to confirm AA is working; if it's done in post-processing it doesn't show up in screenies. I've heard that many times before as the reason screenshots look worse than in-game.
 

Benetanegia

New Member
No one cares that nvidia's special optimised AA mode is disabled, just that they blocked AA on ATI cards in the first place (why not allow ATI to have supersampling AA in the in-game options?).

MY GOD!!! This has been discussed thousands of times. Unreal Engine 3 has no AA, and no other UE3 game besides Batman has an in-game AA option. If you want to enable it, it has to be done from CCC... why does Batman have to be any different? Nvidia didn't block anything, they added their own AA. Period.
 

Wile E

Power User
Screenshots aren't the best way to confirm AA is working; if it's done in post-processing it doesn't show up in screenies. I've heard that many times before as the reason screenshots look worse than in-game.

Even the people running it said there was no difference.

They didn't disable anything for ATI. They added a feature for themselves. Two entirely different things. If they disabled shit for ATI, we would already be hearing about anti-trust/anti-competitive lawsuits or investigations. Nothing supports your theory, Mussels.
 

TheMailMan78

Big Member
yup, and Ageia went bankrupt...

Nvidia struggles to stay in the green. Hell, even Ati struggles, and the reason Stream never took off is that they simply didn't want to put money into it. Same for OpenCL right now, or GPU-accelerated Havok, or countless other examples. Just because Nvidia has money doesn't mean they have to let it go down the drain.

Ageia went bankrupt because they had no financial backing to push PhysX into the mainstream. It's not because they couldn't support the QA monetarily. Nvidia bought them out because they could. So now you're basically saying PhysX is a waste of time, and that is why Nvidia doesn't do the "QA".

Come on man, admit it! Nvidia blocked a feature so that you would buy their hardware exclusively. All that hacker did was re-enable it. This has nothing to do with QA and everything to do with investment. If Nvidia were smart and REALLY wanted PhysX to go mainstream, they would sell dedicated PPUs like AGEIA did. However, they won't. Why? Because they think they bought the golden goose with PhysX. Problem is, none of the developers seem to agree unless you toss a bucket of money at them through the TWIMTBP program.
 

Mussels

Freshwater Moderator
Staff member
I wasn't the one who brought up AA! I said Batman AA as in Batman: Arkham Asylum...

I was talking about the other stuff, the random debris and effects that get disabled without PhysX, when they don't need it for those effects.
 

Wile E

Power User
I wasn't the one who brought up AA! I said Batman AA as in Batman: Arkham Asylum...

I was talking about the other stuff, the random debris and effects that get disabled without PhysX, when they don't need it for those effects.

So it's just an anti-PhysX post then? Coming from someone who considers you a friend, that seems a bit trollish to me, Mussels.

PhysX is capable of much more. It doesn't have the market share for devs to use it for anything more, though. Non-nV users still need to want to play the games, and using PhysX too heavily counts them out of the super-advanced features. That IS nV's fault, however, and this blocking of PhysX on systems with ATI cards is one of the prime reasons.
 

Benetanegia

New Member
I wasn't the one who brought up AA! I said Batman AA as in Batman: Arkham Asylum...

I was talking about the other stuff, the random debris and effects that get disabled without PhysX, when they don't need it for those effects.

PhysX is hardware accelerated on at least the PS3; maybe that's why they have certain features. The Xbox can handle 3 threads, so they might have one just for PhysX too. As much as you might disagree, NO, that cannot be done on the PC, because most people don't have quads. And games using PhysX, when running it on the CPU, do use 2 threads, although how many to use is something the developer decides, always based on the lowest common denominator. For comparison, Havok games use only one CPU core; I have tested that myself, with NFS: Shift, Batman and Mass Effect for PhysX, and Fallout 3, L4D and HL2: Ep2 for Havok.
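To make that "lowest common denominator" threading decision concrete, here is a minimal sketch, assuming a made-up physics_worker job and a hard design-time cap of two workers; it is an illustration only, not actual PhysX or Havok code:

```cpp
// Illustrative only: an engine-side cap on software physics worker threads.
#include <algorithm>
#include <thread>
#include <vector>

// Hypothetical per-thread job: step one slice of the simulation.
void physics_worker(int id) { (void)id; /* integrate bodies, resolve contacts for this slice */ }

int main() {
    const unsigned cores   = std::max(1u, std::thread::hardware_concurrency());
    const unsigned workers = std::min(cores, 2u);  // design-time cap, the "lowest common denominator"

    std::vector<std::thread> pool;
    for (unsigned i = 0; i < workers; ++i)
        pool.emplace_back(physics_worker, static_cast<int>(i));
    for (std::thread& t : pool)
        t.join();
    return 0;
}
```

On a quad-core the extra cores simply go unused by the physics step, which is the behaviour being described above.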
 

lyndonguitar

I play games
Joined
Apr 1, 2010
Messages
1,878 (0.37/day)
Location
Philippines
System Name X6 | Lyndon-ROG
Processor Intel Core i7-8700k | Intel Core i7 6700HQ
Motherboard Gigabyte Z370 Aorus Gaming 5 | Asus ROG-GL552VX
Cooling Deepcool Captain 240EX
Memory 16GB Corsair Vengeance LED | 8 GB
Video Card(s) NVIDIA GTX 1080 8 GB GDDR5X | NVIDIA GeForce GTX 950M 4GB
Storage SSDs: 500GB, HDDs: 2TB, 2TB, 3TB | SSD: 250GB, HDD: 1TB
Display(s) Samsung 49" CHG90 3840x1080@144Hz, Panasonic 32" HDTV, | 15.6"1080p
Case Cougar Panzer Max
Audio Device(s) HyperX Cloud II | Corsair Gaming H1500 7.1 | ROCCAT Kave 5.1 | Edifier M3200
Power Supply EVGA 750GQ
Mouse Logitech G403 | Razer Deathadder Chroma | Logitech G302 | Mad Catz Cyborg R.A.T. 5
Keyboard Corsair Vengeance K70 Cherry MX Red
Software Windows 10
I have a question: what company is supporting Crysis 2 right now, Nvidia or ATI? I saw some demos of Crysis 2 running on Eyefinity, hosted by ATi, and I'm wondering because if it's Nvidia again, PhysX will be there and I don't have a PhysX card (yet). If it's Nvidia, I might as well buy a cheap Nvidia card.
 

Benetanegia

New Member
If Nvidia were smart and REALLY wanted PhysX to go mainstream, they would sell dedicated PPUs like AGEIA did.

They don't do that because that business model was proven to be a failure. Ageia already did it, and everybody at the time agreed that a PPU could be a good idea as long as it was integrated, either into the motherboard or the GPU, and Nvidia did exactly that: integrated it into the GPU.

Problem is, none of the developers seem to agree unless you toss a bucket of money at them through the TWIMTBP program.

That is not a problem with PhysX. That is a problem that affects PhysX, affects the inclusion of dedicated servers, affects the optimization of the PC port, affects the UI... Without the pushing from PC-centric companies, most games would lack dedicated servers, would look like a game from 2000 while running so badly that it would seem they were running off a Pentium 2, and you would have to use the arrow keys to aim and press triangle to shoot and circle to jump (not that it isn't happening already... e.g. Dead Space? Street Fighter 4?).
 

newtekie1

Semi-Retired Folder
Joined
Nov 22, 2005
Messages
28,472 (4.23/day)
Location
Indiana, USA
Processor Intel Core i7 10850K@5.2GHz
Motherboard AsRock Z470 Taichi
Cooling Corsair H115i Pro w/ Noctua NF-A14 Fans
Memory 32GB DDR4-3600
Video Card(s) RTX 2070 Super
Storage 500GB SX8200 Pro + 8TB with 1TB SSD Cache
Display(s) Acer Nitro VG280K 4K 28"
Case Fractal Design Define S
Audio Device(s) Onboard is good enough for me
Power Supply eVGA SuperNOVA 1000w G3
Software Windows 10 Pro x64
Not just AA, the items/debris that suddenly appear as well with PhysX on. It's got nothing to do with PhysX since, IIRC, it was in the console versions.


I use it as an example because I keep hearing crap about it from nvidia users... and AA works just fine in it on ATI if you force it via CCC.

No, it wasn't in the console versions, at least not the PS3 version. The console versions looked the same as the PC version with PhysX turned off.

The AA forced through CCC is FSAA, not MSAA, which is why the AA enabled through CCC comes with a huge performance hit and the AA enabled through the game menu doesn't.

Also, the AA that nVidia added to UE3 for Batman doesn't work on ATi hardware as it wasn't coded for ATi hardware. This is identical to your argument about CUDA not working on ATi hardware. It wasn't coded for ATi hardware, so it doesn't work on ATi hardware.
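A rough back-of-the-envelope sketch of why that forced path hurts so much more, assuming 1920x1080 and 4x AA (illustrative arithmetic only, not a Batman benchmark): super-sampling runs the pixel shader for every sub-sample, while multisampling shades roughly once per pixel.

```cpp
// Rough shading-cost comparison of 4x super-sampling vs 4x multisampling.
// Illustrative arithmetic only; real cost also depends on bandwidth, edge coverage, etc.
#include <cstdio>

int main() {
    const long long width = 1920, height = 1080, samples = 4;
    const long long pixels      = width * height;    // 2,073,600 pixels
    const long long ssaa_shaded = pixels * samples;  // SSAA: every sub-sample runs the pixel shader
    const long long msaa_shaded = pixels;            // MSAA: ~1 shader run per pixel; extra samples
                                                     // mostly add coverage/depth and resolve work
    std::printf("4x SSAA: %lld shader invocations, 4x MSAA: ~%lld (%.0fx more for SSAA)\n",
                ssaa_shaded, msaa_shaded,
                static_cast<double>(ssaa_shaded) / static_cast<double>(msaa_shaded));
    return 0;
}
```

Running it shows roughly a 4x gap in shading work, which is the kind of difference people see between driver-forced AA and an in-engine MSAA path.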
 

GenL

New Member
Joined
May 30, 2010
Messages
11 (0.00/day)
Guys, just to let you know... about a month ago I made a fix for Batman which makes the game use its native MSAA code on any hardware.
You can find more info about it here: http://www.ngohq.com/graphic-cards/17716-batman-arkham-asylum-msaa-fix.html

Thoughts such as...
doesn't work on ATi hardware as it wasn't coded for ATi hardware
are invalid.
This case has nothing to do with nvidia hardware or technology, just two VendorID checks: one in the launcher, another in the game.

As for CUDA... it's just badly programmed applications. They are supposed to choose the CUDA GPU by themselves, but some apps just ignore anything but the primary GPU.
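A minimal sketch of the kind of explicit device selection being described here, using the public CUDA runtime API; the function name and the watchdog-timer heuristic are illustrative assumptions, not code from NVIDIA or from any shipping game:

```cpp
// Sketch: pick a CUDA device explicitly instead of defaulting to device 0.
#include <cuda_runtime.h>
#include <cstdio>

int pick_compute_device() {
    int count = 0;
    if (cudaGetDeviceCount(&count) != cudaSuccess || count == 0)
        return -1;  // no CUDA-capable GPU present

    int chosen = 0;  // fall back to the primary device
    for (int i = 0; i < count; ++i) {
        cudaDeviceProp prop{};
        if (cudaGetDeviceProperties(&prop, i) != cudaSuccess)
            continue;
        // Heuristic: a device without the kernel execution watchdog is usually
        // not driving a display, so prefer it for compute/physics work.
        if (!prop.kernelExecTimeoutEnabled) { chosen = i; break; }
    }
    cudaSetDevice(chosen);  // subsequent CUDA calls in this thread target it
    return chosen;
}

int main() {
    int dev = pick_compute_device();
    std::printf("Selected CUDA device %d for compute work\n", dev);
    return 0;
}
```

The point is simply that picking the compute GPU takes a handful of lines; an app that hard-codes device 0 will always land on the primary display card.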
 

the54thvoid

Intoxicated Moderator
Staff member
Joined
Dec 14, 2009
Messages
12,460 (2.38/day)
Location
Glasgow - home of formal profanity
Processor Ryzen 7800X3D
Motherboard MSI MAG Mortar B650 (wifi)
Cooling be quiet! Dark Rock Pro 4
Memory 32GB Kingston Fury
Video Card(s) Gainward RTX4070ti
Storage Seagate FireCuda 530 M.2 1TB / Samsumg 960 Pro M.2 512Gb
Display(s) LG 32" 165Hz 1440p GSYNC
Case Asus Prime AP201
Audio Device(s) On Board
Power Supply be quiet! Pure POwer M12 850w Gold (ATX3.0)
Software W10
The QA issue is nonsense. Nvidia does not guarantee the 'safety' of its own drivers, as clearly stated at the very end of the product info for all its WHQL driver releases. The following is a direct quote:

"The software is provided 'as is', without warranty of any kind, express or implied, including but not limited to the warranties of merchantability, fitness for a particular purpose and noninfringement. In no event shall the contributors or copyright holders be liable for any claim, damages or other liability, whether in an action of contract, tort or otherwise, arising from, out of or in connection with the software or the use or other dealings with the software"

I'm sure ATI has the same disclaimer, effectively saying that if the drivers are dodgy, tough, you installed them. It's this sort of small print at the end of the release notes that makes a mockery of the notion that, just because they're official, you can use them with absolute certainty that you have recourse to legal action if things go wrong.

So this kind of nullifies any argument about QA for mixing graphics cards and PhysX where system damage is the end result.
 

Benetanegia

New Member
Joined
Sep 11, 2009
Messages
2,680 (0.50/day)
Location
Reaching your left retina.
The QA issue is nonsense. Nvidia does not guarantee the 'safety' of its own drivers, as clearly stated at the very end of the product info for all its WHQL driver releases. The following is a direct quote:

"The software is provided 'as is', without warranty of any kind, express or implied, including but not limited to the warranties of merchantability, fitness for a particular purpose and noninfringement. In no event shall the contributors or copyright holders be liable for any claim, damages or other liability, whether in an action of contract, tort or otherwise, arising from, out of or in connection with the software or the use or other dealings with the software"

I'm sure ATI has the same disclaimer, effectively saying that if the drivers are dodgy, tough, you installed them. It's this sort of small print at the end of the release notes that makes a mockery of the notion that, just because they're official, you can use them with absolute certainty that you have recourse to legal action if things go wrong.

So this kind of nullifies any argument about QA for mixing graphics cards and PhysX where system damage is the end result.

That disclaimer is as useless as the EULA in games. Maybe it has legal weight in the US, but outside of the US it's useless. They can say as much as they want, but at least in the EU they have legal responsibility no matter what they say. Laws always take precedence over any contract.

Those disclaimers and the EULA are put there to make people think they can't do anything if something goes wrong, and from what I see, it works.
 