
NVIDIA GeForce Driver Version 177.79 Released

malware
Along with the release of the new NVIDIA GeForce 9800 GTX+, 9800 GT, and 9500 GT GPUs comes a new beta driver. GeForce driver release 177.79 adds nothing but these three GPUs to the database of supported GeForce 8-series, 9-series, and 200-series GPUs, though it may also bring performance improvements in various games and applications. Check the release notes for more information.

DOWNLOAD: Windows XP 32-bit|64-bit, Windows Vista 32-bit|64-bit

View at TechPowerUp Main Site
 
Hopefully these will be a lot better than those crappy 175.19's!
 
177.39 are still my faves atm :toast:
 
So still no official ones that give PhysX, or have I missed them? Still on 174.70, which have been fantastic, but I'll change motherboards in a couple of days and might try something newer.

edit: ah it was in the news here, coming in a week.
http://forums.techpowerup.com/showthread.php?t=66626
 
Last edited:
So still no official ones that give PhysX, or have I missed them? Still on 174.70, which have been fantastic, but I'll change motherboards in a couple of days and might try something newer.


Why would you process PhysX on your GPU?
We already have struggling GPUs.
A sub-$200 quad core can handle the job pretty well.
That CUDA thing is actually a battle between NVIDIA's AGEIA and Intel's Havok.
The more your GPU spends processing one thing, the less it has left for everything else.
 
Hopefully these will be a lot better than those crappy 175.19's!

Dude, why on earth would you still be using 175 series drivers? The 177 series has been out for months.

Why would you process PhysX on your GPU?

Because the GPU is better at it than the CPU. Every gamer has a GPU, not every gamer has a quad-core.

We already have struggling gpus.

Correct, we do.

A sub-$200 quad core can handle the job pretty well.

A sub $100 GPU can do the job better.

That CUDA thing is actually a battle between NVIDIA's AGEIA and Intel's Havok.

CUDA really has nothing to do with it. CUDA is far more than just physics. PhysX uses a few aspects of CUDA, but CUDA has far better uses than just video games.
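To illustrate the point (a hypothetical sketch in plain Python, nothing to do with NVIDIA's actual API): a physics step is the same independent update applied to every body in the world, a data-parallel "map". That is exactly the shape of workload a many-core GPU runs across thousands of threads at once, and it's why CUDA-style offload makes sense for far more than games.

```python
# Hypothetical sketch: a physics step is one independent update applied
# to every body -- a data-parallel "map", the workload shape that
# CUDA-class GPUs spread across thousands of threads at once.

def step_body(body, dt=0.016):
    """Semi-implicit Euler update for one body: (pos, vel, acc)."""
    pos, vel, acc = body
    vel = vel + acc * dt          # integrate velocity
    pos = pos + vel * dt          # integrate position
    return (pos, vel, acc)

def step_world(bodies, dt=0.016):
    # Each body's update is independent of the others, so on a GPU
    # every iteration of this loop could be its own thread.
    return [step_body(b, dt) for b in bodies]

world = [(0.0, 0.0, -9.8)] * 4    # four bodies dropped from rest
world = step_world(world)
print(world[0])                   # first body after one 16 ms step
```

The serial CPU version above does the bodies one at a time; the whole argument for GPU physics is that nothing in `step_world` forces that ordering.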

The more your GPU spends processing one thing, the less it has left for everything else.

This argument is the same as asking why we process AA or AF on the GPU. If the GPU didn't have to process AA/AF, it could process something else better. Everything is a trade-off.

Of course, in some cases there is absolutely no trade-off, like the GT200 GPUs that have a dedicated section of the GPU die specifically for PhysX. When you run PhysX on those GPUs, it doesn't take any power away from the graphics rendering.

Of course, there is also the fact that your main GPU doesn't have to do the PhysX processing at all. You can have a super high-end graphics card doing the graphical rendering and a $50 GPU doing the PhysX, and you will be perfectly fine, with no power taken away from the graphics GPU.
 
Yeah, they fixed the downclocking issue by totally eliminating 2D clocks; my card no longer goes into 2D mode and stays at full clocks. NVIDIA is getting pathetic now, breaking one of the most important features to fix an issue I never had. In the EVGA forum, Jacob confirmed to me that 2D clocks are not disabled in the driver, but that is not the case for me: the card stopped idling at 2D clocks and now sits at full 3D clocks all the time.
 
Yeah, they fixed the downclocking issue by totally eliminating 2D clocks; my card no longer goes into 2D mode and stays at full clocks. NVIDIA is getting pathetic now, breaking one of the most important features to fix an issue I never had.

What card is that with? There are several ways to get 2D clocks if you really want them, both via software and via BIOS flashes. Most nVidia cards no longer have 2D clocks in the BIOS, which is why they don't downclock; it has nothing to do with the drivers. If you want 2D clocks, set them yourself. There is no reason 2D clocks are "one of the most important features" though. They didn't even help save power.
 
Dude, why on earth would you still be using 175 series drivers? The 177 series has been out for months.

Cus these are the first beta drivers which support my dad's 8800 GTS.

nvidia.jpg
 
Cus these are the first beta drivers which support my dad's 8800 GTS.

nvidia.jpg

The first according to nVidia's site, maybe. The 177 series has supported the 8800 series for a very long time. 177.72, released a couple of weeks ago, supported pretty much every card out. I've been running 177.39 since June 20th on my 8800GS...
 
What card is that with? There are several ways to get 2D clocks if you really want them, both via software and via BIOS flashes. Most nVidia cards no longer have 2D clocks in the BIOS, which is why they don't downclock; it has nothing to do with the drivers. If you want 2D clocks, set them yourself. There is no reason 2D clocks are "one of the most important features" though. They didn't even help save power.

Are you kidding me? The GTX 280 makes a big difference in power saving; I'm sure you've seen the results online. I think you might be talking about the 9800 GTX series, which definitely didn't have any 2D clocks in the BIOS, but the GTX 200 series uses lower voltage and really low clocks in 2D mode, which results in big power savings.
 
Are you kidding me? The GTX 280 makes a big difference in power saving; I'm sure you've seen the results online. I think you might be talking about the 9800 GTX series, which definitely didn't have any 2D clocks in the BIOS, but the GTX 200 series uses lower voltage and really low clocks in 2D mode, which results in big power savings.

You're right, the GTX 280 does benefit from 2D clocks. How do you know these drivers are the cause of your loss of 2D clocks? A quick scan through the release notes says nothing about losing 2D clocks.
 
Because I never had the 2D clock issue with any of the betas, and the last one I was using was 177.70.
 
Sweet. I'll be playing with these drivers come tonight!
 
Because I never had the 2D clock issue with any of the betas, and the last one I was using was 177.70.

That is a faulty assumption.
 
Alright, I reinstalled the driver, and the 2D issue is gone now.
 
nice

This argument is the same as asking why we process AA or AF on the GPU. If the GPU didn't have to process AA/AF, it could process something else better. Everything is a trade-off.

Of course, in some cases there is absolutely no trade-off, like the GT200 GPUs that have a dedicated section of the GPU die specifically for PhysX. When you run PhysX on those GPUs, it doesn't take any power away from the graphics rendering.

Of course, there is also the fact that your main GPU doesn't have to do the PhysX processing at all. You can have a super high-end graphics card doing the graphical rendering and a $50 GPU doing the PhysX, and you will be perfectly fine, with no power taken away from the graphics GPU.

Firstly, the GPU doesn't do PhysX better, just faster. But will games actually benefit from that?
Secondly, it is a battle between Intel and NVIDIA, because using the GPU to process what's left to the CPU in games (physics and artificial intelligence) means you don't even need a mid-range CPU to play games. That leads to less money for Intel, or more money for NVIDIA.
Thirdly, a CPU can't process pixels/geometry/textures via rasterization; we need a parallel processor, the GPU, to do that. That's why a CPU can't process AA.
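For what it's worth, AA is the same kind of per-pixel, data-parallel job. Here's a toy supersampling resolve in Python (purely illustrative, not how any real driver implements it): every output pixel is just the average of its sub-samples, independent of every other pixel, which is why the massively parallel GPU handles it so well.

```python
# Toy illustration of supersampling AA: each output pixel averages its
# N sub-sample coverage values. Every pixel is independent of the rest,
# so on a GPU each one can be resolved by its own thread.

def resolve_pixel(subsamples):
    """Average the sub-sample values for one output pixel."""
    return sum(subsamples) / len(subsamples)

def resolve_frame(frame):
    # frame: a list of per-pixel sub-sample lists (e.g. 4x SSAA)
    return [resolve_pixel(px) for px in frame]

frame_4x = [
    [1.0, 1.0, 0.0, 0.0],   # edge pixel, half covered  -> 0.5 (smoothed edge)
    [1.0, 1.0, 1.0, 1.0],   # interior pixel, all covered -> 1.0
]
print(resolve_frame(frame_4x))   # [0.5, 1.0]
```

A CPU *can* run this loop; it's just far slower at millions of pixels per frame than hardware built to run them all at once.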
 
Hmmm, just installed these and had Crysis crash on me for the first time ever in all my time playing the game. I think I liked the .66s better...


Firstly, the GPU doesn't do PhysX better, just faster. But will games actually benefit from that?
Secondly, it is a battle between Intel and NVIDIA, because using the GPU to process what's left to the CPU in games (physics and artificial intelligence) means you don't even need a mid-range CPU to play games. That leads to less money for Intel, or more money for NVIDIA.
Thirdly, a CPU can't process pixels/geometry/textures via rasterization; we need a parallel processor, the GPU, to do that. That's why a CPU can't process AA.

Of course games will benefit from faster physics processing. If you can dedicate faster processing to complex physics in the world around you, you get much, much more realism. You don't really need a great CPU to play games now; it's been mostly GPU for a while. The CPU matters most in gaming because a CPU that isn't fast enough will bottleneck the GPU. Everything, after all, goes through the CPU on some level.
 
Why would you process PhysX on your GPU?
We already have struggling GPUs.
A sub-$200 quad core can handle the job pretty well.
The more your GPU spends processing one thing, the less it has left for everything else.

Maybe I wanna see the extra physics stuff in Warmonger and CellFactor? Well, I've tried CellFactor before with physics enabled and it was horridly slow; now it might actually run.

Why would I get a quad when no game needs it? It's not like you get hardware physics from a CPU in AGEIA games. My dual core handles software physics already; no slowdowns on any Crysis explosion.

And yes, I'm not expecting physics for free; it will cost me framerate or eye candy. So? The last level of Crysis is the only place my GPU has struggled; everything else flies. I'm going from 1280x1024 to 1680x1050 once my new LCD arrives, and that will probably be a bigger performance hit than these physics :)

It's something new and I can get it for FREE. No need to go out and buy a PPU or a quad. And it's not like it's in every game, just these: http://www.nzone.com/object/nzone_physxgames_home.html
If the going gets too tough I won't cry, I'll just disable it :p
 
Firstly, the GPU doesn't do PhysX better, just faster. But will games actually benefit from that?
Secondly, it is a battle between Intel and NVIDIA, because using the GPU to process what's left to the CPU in games (physics and artificial intelligence) means you don't even need a mid-range CPU to play games. That leads to less money for Intel, or more money for NVIDIA.
Thirdly, a CPU can't process pixels/geometry/textures via rasterization; we need a parallel processor, the GPU, to do that. That's why a CPU can't process AA.

The GPU is far better at calculating physics; it does it faster, which is better. Plenty of games will benefit from this.

The difference between PhysX and Havok is definitely a fight between Intel and nVidia, but you talked about CUDA. CUDA is a completely different technology, used for far more than PhysX. It's still kind of a fight between the two companies, but not really relevant to this discussion.

Yes, I know a CPU can't process AA; that wasn't my point. My point was that any time you add some eye candy to a game, it takes processing power away from something else. Everything is a trade-off. Turn AA down from 16x to 8x and run PhysX with little to no frame rate loss, or leave AA at 16x and don't use PhysX.
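As a back-of-envelope sketch of that trade-off (all millisecond costs below are made up for illustration, not measured numbers): everything the GPU renders competes for the same per-frame time budget, so swapping one effect's cost for another's can leave the frame rate unchanged.

```python
# Back-of-envelope frame-budget math. All timings are hypothetical,
# chosen only to illustrate that GPU work trades off within one budget.

def fps(ms_per_frame):
    """Convert a per-frame cost in milliseconds to frames per second."""
    return 1000.0 / ms_per_frame

base_ms  = 12.0   # geometry + shading
aa16x_ms = 6.0    # pretend cost of 16x AA
aa8x_ms  = 3.0    # pretend cost of 8x AA
physx_ms = 3.0    # pretend cost of GPU PhysX

# 16x AA, no PhysX vs. 8x AA + PhysX: the same 18 ms frame either way.
print(round(fps(base_ms + aa16x_ms), 1))             # -> 55.6
print(round(fps(base_ms + aa8x_ms + physx_ms), 1))   # -> 55.6
```

With these (invented) numbers, dropping AA one notch buys the PhysX time back exactly, which is the "little to no frame rate loss" trade described above.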
 
Works great with my dad's 9800 GTX+!
 
Well, the 2D clock issue is still there. If you don't run any 3D apps it's fine, but once you run a 3D app and then go back to the desktop, the card fails to return to 2D clocks. I am not making faulty assumptions; this is after reinstalling the driver. Every time I boot I see 2D clocks, and right after running a 3D app and going back to the desktop the clocks never return to 2D mode. This happens each and every time, in both XP and Vista.
 
These are excellent drivers, other than the fact that my GTX 280 is always sucking juice at 3D clocks even when I'm not playing games.
 