
Nvidia PhysX 8.07.18 driver (GeForce 8/9/GT200 series)

Kursah

Super Moderator
Staff member
Joined
Oct 15, 2006
Messages
14,680 (2.29/day)
Location
Missoula, MT, USA
System Name Kursah's Gaming Rig 2018 (2022 Upgrade) - Ryzen+ Edition | Gaming Laptop (Lenovo Legion 5i Pro 2022)
Processor R7 5800X @ Stock | i7 12700H @ Stock
Motherboard Asus ROG Strix X370-F Gaming BIOS 6203| Legion 5i Pro NM-E231
Cooling Noctua NH-U14S Push-Pull + NT-H1 | Stock Cooling
Memory TEAMGROUP T-Force Vulcan Z 32GB (2x16) DDR4 4000 @ 3600 18-20-20-42 1.35v | 32GB DDR5 4800 (2x16)
Video Card(s) Palit GeForce RTX 4070 JetStream 12GB | CPU-based Intel Iris XE + RTX 3070 8GB 150W
Storage 4TB SP UD90 NVME, 960GB SATA SSD, 2TB HDD | 1TB Samsung OEM NVME SSD + 4TB Crucial P3 Plus NVME SSD
Display(s) Acer 28" 4K VG280K x2 | 16" 2560x1600 built-in
Case Corsair 600C - Stock Fans on Low | Stock Metal/Plastic
Audio Device(s) Aune T1 mk1 > AKG K553 Pro + JVC HA-RX 700 (Equalizer APO + PeaceUI) | Bluetooth Earbuds (BX29)
Power Supply EVGA 750G2 Modular + APC Back-UPS Pro 1500 | 300W OEM (heavy use) or Lenovo Legion C135W GAN (light)
Mouse Logitech G502 | Logitech M330
Keyboard HyperX Alloy Core RGB | Built in Keyboard (Lenovo laptop KB FTW)
Software Windows 11 Pro x64 | Windows 11 Home x64
PhysX is all about the number of shaders and the speed they run at. More shaders and/or faster shaders = more PhysX power.

That's how I've come to understand the CUDA PhysX implementation, but it's not just about raw shader count; it also depends on how efficiently they can tune things for PhysX, through drivers no less! If PhysX really takes off, this could be a huge thing IMO. It's got enough buzz already.
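
For what it's worth, here's a toy sketch of why that scaling holds (a hedged example of my own, not Nvidia's actual PhysX code): every particle in a simple CUDA integration kernel updates independently, so throughput grows with how many shader (CUDA) cores the card has and how fast they're clocked.

// Toy particle-integration kernel: each particle is independent, so more
// CUDA cores run more of these threads at once, and faster clocks finish
// each thread sooner -- roughly why shader count/speed maps to PhysX power.
#include <cstdio>
#include <cuda_runtime.h>

__global__ void integrate(float *pos, float *vel, float dt, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        vel[i] += -9.81f * dt;   // apply gravity
        pos[i] += vel[i] * dt;   // advance position
    }
}

int main()
{
    const int n = 1 << 20;       // one million particles
    float *pos, *vel;
    cudaMallocManaged(&pos, n * sizeof(float));
    cudaMallocManaged(&vel, n * sizeof(float));
    for (int i = 0; i < n; ++i) { pos[i] = 0.0f; vel[i] = 0.0f; }

    integrate<<<(n + 255) / 256, 256>>>(pos, vel, 1.0f / 60.0f, n);
    cudaDeviceSynchronize();

    printf("pos[0] after one step: %f\n", pos[0]);
    cudaFree(pos);
    cudaFree(vel);
    return 0;
}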

It would be completely illogical if Nvidia added dedicated shaders for PhysX, because that would be a waste of power.
Yes, but has either NV or ATI always done things the way we would consider logical? I think it could be kind of cool: a new 55nm Ageia PPU with 128-256MB of memory built onto a graphics card, or an Ageia PPU built into the GPU itself, could be interesting. What would it do to die size? Heat output? Overclocking? Who knows! Will it ever happen? Probably not, but it could.

But if NV designed shaders specifically for PhysX processing that aren't counted among the mainstream shaders, yeah, it may be kind of "shady," but I could totally see it happening.

And you have to think about what you're saying: two GTX 280 cards should always be faster than one.

I don't necessarily think he meant a second GTX 280, but rather a card like the 8600GT that p_o_s_pc was referring to. :rolleyes:

Whether or not it's more than hearsay, I dunno. I'm not worried; I enjoy my games ATM, PhysX or not. Sure, it's a neat gimmick, but to me it has more promise than DX10 did at launch, or than DX11 does from hype alone, because we can actually experience it and have been able to (unofficially) for quite a while. All these technologies will eventually lead to the consumer winning with better gaming experiences; it just takes time to get everything dialed in, I suppose, and to set all the facts straight.


:toast:
 
Joined
Jan 24, 2008
Messages
888 (0.15/day)
System Name Meshify C Ryzen 2019
Processor AMD Ryzen 3900X
Motherboard X470 AORUS ULTRA GAMING
Cooling AMD Wraith Prism LED Cooler
Memory 32GB DDR4 ( F4-3200C16D-32GTZKW, 16-16-16-36 @ 3200Mhz )
Video Card(s) AMD Radeon RX6800 ( 2400Mhz/2150Mhz )
Storage Samsung Evo 960
Display(s) Pixio PX275h
Case Fractal Design Meshify C – Dark TG
Audio Device(s) Sennheiser GSP 300 ( Headset )
Power Supply Seasonic FOCUS Plus Series 650W
Mouse Logitech G502
Keyboard Logitech G 15
Software Windows 10 Pro 64bit
I really don't care about it all that much. My next card will be an ATI card for sure.
 

Kursah

Super Moderator
Staff member
I really don't care about it all that much. My next card will be an ATI card for sure.

That's cool. I suppose it's nice that the thread you started has enough people who care to keep it going, then! :D
I can't make a call like that for my "next" upgrade... I've had plenty of ATI and NV cards over the years. I go with what fits the budget at the time, gets the performance I want at that particular time, has the positives I want, and the negatives I'm willing to deal with. The NV GTX 260 won this round. My next could be ATI, Intel, NV... who really knows? Not a big deal ATM.

:toast:
 

p_o_s_pc

F@H&WCG addict
Joined
May 2, 2007
Messages
13,006 (2.09/day)
Location
Newark, Ohio
System Name el'lappy|Cruncher | Cruncher 2
Processor intel C2D T6400 | i7 3770k @4.2ghz | AII X2 220 @3.4ghz
Motherboard some Acer | Asus P8Z77-V Pro |Gigabyter GA-M61p-S3
Cooling dual 80mm cooling fan | WC'ing ) |cheapo
Memory 2x2gb ddr3 | 2x2gb Ripjaws 1600 |4x512mb D9s
Video Card(s) onboard | 60 1GB(hd7770 ) |8800GTS
Storage Momuntus xt 320gb |Kingston Hyper X 120gb SATA III|500gb WD
Display(s) 17in | 42" 1080P HD 3D TV
Case lappy case | CM HAF XB |none yet
Audio Device(s) onboard | onboard | onboard
Power Supply Dell brick w/ acer end end | Antec EW 650w |Antec SP 350w(upg.soon)
Software Windows 7 Ult. 64bit---->------->
I'm with Kursah: I can't say if I'm getting NV or ATI next. I've had 6 NV cards and 4 ATI cards.
 

Mussels

Freshwater Moderator
Staff member
Joined
Oct 6, 2004
Messages
58,413 (8.17/day)
Location
Oystralia
System Name Rainbow Sparkles (Power efficient, <350W gaming load)
Processor Ryzen R7 5800x3D (Undervolted, 4.45GHz all core)
Motherboard Asus x570-F (BIOS Modded)
Cooling Alphacool Apex UV - Alphacool Eisblock XPX Aurora + EK Quantum ARGB 3090 w/ active backplate
Memory 2x32GB DDR4 3600 Corsair Vengeance RGB @3866 C18-22-22-22-42 TRFC704 (1.4V Hynix MJR - SoC 1.15V)
Video Card(s) Galax RTX 3090 SG 24GB: Underclocked to 1700Mhz 0.750v (375W down to 250W)
Storage 2TB WD SN850 NVME + 1TB Samsung 970 Pro NVME + 1TB Intel 6000P NVME USB 3.2
Display(s) Philips 32" 32M1N5800A (4K144), LG 32" (4K60) | Gigabyte G32QC (2K165) | Philips 328M6FJRMB (2K144)
Case Fractal Design R6
Audio Device(s) Logitech G560 | Corsair Void pro RGB |Blue Yeti mic
Power Supply Fractal Ion+ 2 860W (Platinum) (This thing is God-tier. Silent and TINY)
Mouse Logitech G Pro wireless + Steelseries Prisma XL
Keyboard Razer Huntsman TE ( Sexy white keycaps)
VR HMD Oculus Rift S + Quest 2
Software Windows 11 pro x64 (Yes, it's genuinely a good OS) OpenRGB - ditch the branded bloatware!
Benchmark Scores Nyooom.
PhysX is all about the number of shaders and the speed they run at. More shaders and/or faster shaders = more PhysX power.

It would be completely illogical if Nvidia added dedicated shaders for PhysX, because that would be a waste of power.

And you have to think about what you're saying: two GTX 280 cards should always be faster than one.

Way to misquote me and take my comments completely out of context.

CUDA is all about the shaders; PhysX is not. CUDA emulates the PhysX hardware on the 8x/9x series cards, while the GT200 cards could easily have a dedicated PhysX chip, or a dedicated part of the main chip (who says it has to be shaders? They own Ageia and could have put a new block on there).

You totally misquoted me on the 'two is faster than one' point, as I was talking about PhysX. Since shaders are taken away from, say, a G92 card to process PhysX, you get a small FPS hit (and a massive reduction in CPU usage). Using a second card for PhysX would free that GPU power back up. On a card with dedicated PhysX hardware, adding another card for *PhysX* will not improve FPS; adding it for *SLI* would.
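
To make that concrete, here's a minimal hedged sketch (not the actual PhysX driver, and the kernel body is just a placeholder) of how physics work could be steered to a second GPU with CUDA device selection, so the primary card keeps all of its shaders for rendering:

// Run a placeholder physics kernel on a second GPU if one is present;
// otherwise fall back to device 0 and share shaders with rendering
// (the small FPS hit described above).
#include <cstdio>
#include <cuda_runtime.h>

__global__ void physicsStep(float *state, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) state[i] += 0.001f;   // placeholder physics update
}

int main()
{
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);

    int physicsDevice = (deviceCount > 1) ? 1 : 0;   // prefer the second card
    cudaSetDevice(physicsDevice);

    const int n = 1 << 16;
    float *state;
    cudaMalloc(&state, n * sizeof(float));
    cudaMemset(state, 0, n * sizeof(float));

    physicsStep<<<(n + 255) / 256, 256>>>(state, n);
    cudaDeviceSynchronize();

    printf("physics ran on device %d of %d\n", physicsDevice, deviceCount);
    cudaFree(state);
    return 0;
}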
 
Joined
Jan 24, 2008
Messages
888 (0.15/day)
I'm with Kursah: I can't say if I'm getting NV or ATI next. I've had 6 NV cards and 4 ATI cards.

The reason I am ditching Nvidia after 7 Nvidia GPUs is that the company has done nothing but leech off gamers for the last couple of years.

I remember a webcast where Michael Haro or Hara (I don't remember the exact name) from Nvidia announced that the G92 was going to have 1 teraflop of power, and that the next-gen cards were going to be three times as fast. Not only that, the guy clearly said they would be released by the end of 2007.

I am sure that Nvidia was already at a pretty advanced stage at that time, because otherwise they wouldn't have made such announcements. The specs were probably identical to the GTX 280, besides the fact that it would have been based on a G9 core. But there is nothing a couple of extra shaders couldn't solve. After all, they said 1 teraflop.

But when ATI released such crappy cards, Nvidia thought, "Whoa, let's back off here; ATI can't compete with our GeForce 8 products, so why release new products when we can still make a lot of money with this old crap?" Then they released the 8800GT and the 8800GTS to compete with the slightly better HD3 cards, and Nvidia won the battle once again by doing nothing but shrinking the chip.

Then Nvidia released the GeForce 9 cards. Each and every one of them is a GeForce 8 product with a new label on it. The only card I actually consider "new" is the 9800GX2, but the correct name would have been 8800GX2. How many thousands of gamers bought a GeForce 9 card thinking they were buying a new product? Come on now, Nvidia had to lock the 9600GT so it couldn't beat the 8800GT in benchmarks.

I actually know a guy who upgraded from an 8800GT to a GeForce 9800GTX (and he doesn't overclock); talk about a waste of money. And this is not a guy who has a lot of money; he actually had to save for quite some time to afford it. The local shop recommended this "brand new product" to him.

Now look at all the news about the crappy products Nvidia has sold over the years. It's a good thing they lost a lot of money because of it. And now there are rumors going around that even the G92 cards have the same problem, which means it's only a matter of time before a lot of G92 products start to overheat (if the rumors are true, of course).

And I really LOVE the latest stunt Nvidia pulled: releasing their "pack" on the same day ATI releases their 4870X2, so that ATI gets less media attention. This is what I am going to do from now on: I am going to buy an HD4 card; Nvidia can release their GT300 or GeForce 10, but I'll wait until ATI releases their HD5 card.

First Intel pays hundreds of companies to prevent AMD sales, they get sued by AMD, and the trials have been delayed, delayed, and delayed once again... Now Nvidia has been pulling off a lot of stunts to leech money from consumers.

We need AMD/ATI to survive, because if they don't, we can say "bye bye" to PC gaming. There is no way in hell we will be able to afford new hardware if Intel and Nvidia have no one to compete with. Not only that, it will slow down the pace of development a lot. You don't have to be an AMD fanboy to know this is true.

Since I know about Creative's strategy with their faulty drivers, I have decided to stop buying Creative products. I sold my Creative Zen, and I will not buy an X-Fi card.

This is why I don't buy Apple hardware/software:
- XP is now, what, 6 years old? And all the latest software still runs 100% fine on it. Not only that, new software will work on it for years to come.
- Ever tried installing the latest software, like CS3, on a Mac OS that is a couple of years old? A lot of professional editing software requires the latest OS, and that will never change.
- Have you seen the latest Apple advertisements? http://www.youtube.com/watch?v=oaN1Nz1Dyls

Hey, I might be a little extreme about this, but I know that many people feel the same way...
Sure, every company screws the consumers every once in a while, but once it becomes a consistent strategy, I say, "bye bye, I'll buy somewhere else...".

Some people forget that consumers can control what companies do, and not the other way around.
 