Tuesday, July 29th 2008

NVIDIA GeForce Driver Version 177.79 Released

Along with the release of the new NVIDIA GeForce 9800 GTX+, 9800 GT, and 9500 GT GPUs comes a new beta driver. Release 177.79 adds little more than these three GPUs to the list of supported GeForce 8-series, 9-series, and 200-series products, though it may also bring performance improvements in various games and applications. See the release notes for more information.

DOWNLOAD: Windows XP 32-bit | 64-bit, Windows Vista 32-bit | 64-bit

Source: NVIDIA

32 Comments on NVIDIA GeForce Driver Version 177.79 Released

#1
alexp999
Staff
Hopefully these will be a lot better than those crappy 175.19s!
Posted on Reply
#2
marsey99
177.39 are still my faves atm :toast:
Posted on Reply
#3
OnBoard
So still no official ones that enable PhysX, or have I missed them? I'm still on 174.70, which has been fantastic, but I'll be changing motherboards in a couple of days and might try something newer.

edit: ah it was in the news here, coming in a week.
http://forums.techpowerup.com/showthread.php?t=66626
Posted on Reply
#5
robspierre6
by: OnBoard
So still no official ones that enable PhysX, or have I missed them? I'm still on 174.70, which has been fantastic, but I'll be changing motherboards in a couple of days and might try something newer.


Why would you process PhysX on your GPU?
We already have struggling GPUs.
A sub-$200 quad core can handle the job pretty well.
That CUDA thing is actually a battle between NVIDIA's Ageia and Intel's Havok.
The more your GPU processes one thing, the less it has left for everything else.
Posted on Reply
#6
newtekie1
Semi-Retired Folder
by: alexp999
Hopefully these will be a lot better than those crappy 175.19s!
Dude, why on earth would you still be using 175 series drivers? The 177 series has been out for months.

by: robspierre6
Why would you process PhysX on your GPU?
Because the GPU is better at it than the CPU. Every gamer has a GPU, not every gamer has a quad-core.
We already have struggling GPUs.
Correct, we do.
A sub-$200 quad core can handle the job pretty well.
A sub-$100 GPU can do the job better.
That CUDA thing is actually a battle between NVIDIA's Ageia and Intel's Havok.
CUDA really has nothing to do with it. CUDA is far more than just physics. PhysX uses a few aspects of CUDA, but CUDA has far better uses than just video games.
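For anyone curious what that looks like in practice, here is a minimal sketch of general-purpose CUDA (a made-up toy example, not from NVIDIA's SDK and nothing to do with PhysX): a kernel that adds two big arrays, with each GPU thread handling one element.

#include <stdio.h>
#include <stdlib.h>
#include <cuda_runtime.h>

/* Each thread adds one pair of elements; thousands of threads run in parallel. */
__global__ void add(const float *a, const float *b, float *c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        c[i] = a[i] + b[i];
}

int main(void)
{
    const int n = 1 << 20;                    /* one million floats */
    size_t bytes = n * sizeof(float);

    float *ha = (float *)malloc(bytes);
    float *hb = (float *)malloc(bytes);
    float *hc = (float *)malloc(bytes);
    for (int i = 0; i < n; i++) { ha[i] = 1.0f; hb[i] = 2.0f; }

    float *da, *db, *dc;
    cudaMalloc((void **)&da, bytes);          /* allocate on the GPU */
    cudaMalloc((void **)&db, bytes);
    cudaMalloc((void **)&dc, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    add<<<(n + 255) / 256, 256>>>(da, db, dc, n);   /* 4096 blocks of 256 threads */

    cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
    printf("c[0] = %f\n", hc[0]);             /* prints 3.000000 */

    cudaFree(da); cudaFree(db); cudaFree(dc);
    free(ha); free(hb); free(hc);
    return 0;
}

Swap the one-line kernel body for collision math and you have the basic idea behind GPU physics; the same pattern drives folding, video encoding, and the rest.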
The more your GPU processes one thing, the less it has left for everything else.
This argument is the same as asking why we process AA or AF on the GPU. If the GPU didn't have to process AA/AF, it could process something else better. Everything is a trade-off.

Of course, in some cases there is absolutely no trade-off, like the GT200 GPUs that have a dedicated section of the GPU die specifically for PhysX. So when you run PhysX on those GPUs, it doesn't take any power away from the graphics rendering.

Of course, there is also the fact that your main GPU doesn't have to do the PhysX processing at all. You can have a super high-end graphics card doing the graphical rendering and a $50 GPU doing the PhysX, and you will be perfectly fine, with no power taken away from the graphics GPU.
Posted on Reply
#7
Nkd
Yeah, they fixed the downclocking issue by totally eliminating 2D clocks; my card no longer goes into 2D mode and stays at full clocks. NVIDIA is getting pathetic now, breaking one of the most important features to fix an issue I never had. On the EVGA forum, Jacob confirmed to me that 2D clocks are not disabled in the driver, but that is not the case for me: the card stopped idling at 2D clocks and now sits at full 3D clocks all the time.
Posted on Reply
#8
newtekie1
Semi-Retired Folder
by: Nkd
Yeah, they fixed the downclocking issue by totally eliminating 2D clocks; my card no longer goes into 2D mode and stays at full clocks. NVIDIA is getting pathetic now, breaking one of the most important features to fix an issue I never had.
What card is that with? There are several ways to get 2D clocks back if you really want them, both via software and via BIOS flashes. Most nVidia cards no longer have 2D clocks in the BIOS, which is why they don't downclock; it has nothing to do with the drivers. If you want 2D clocks, set them yourself. There is no reason to call 2D clocks "one of the most important features", though. They didn't even help save power.
Posted on Reply
#9
alexp999
Staff
by: newtekie1
Dude, why on earth would you still be using 175 series drivers? The 177 series has been out for months.
Cus these are the first beta drivers that support my dad's 8800 GTS.

Posted on Reply
#10
Nkd
I got the GTX 280.
Posted on Reply
#11
newtekie1
Semi-Retired Folder
by: alexp999
Cus these are the first beta drivers that support my dad's 8800 GTS.


The first according to nVidia's site. The 177 series has supported the 8800 series for a very long time. 177.72, released a couple of weeks ago, supported pretty much every card out. I've been running 177.39 since June 20th on my 8800 GS...
Posted on Reply
#12
Nkd
by: newtekie1
What card is that with? There are several ways to get 2D clocks back if you really want them, both via software and via BIOS flashes. Most nVidia cards no longer have 2D clocks in the BIOS, which is why they don't downclock; it has nothing to do with the drivers. If you want 2D clocks, set them yourself. There is no reason to call 2D clocks "one of the most important features", though. They didn't even help save power.
Are you kidding me? The GTX 280's 2D clocks make a big difference in power saving; I am sure you have seen the results online. I think you might be thinking of the 9800 GTX series, which definitely had no 2D clocks in the BIOS, but the GTX 200 series uses lower voltage and really low clocks in 2D mode, which results in big power savings.
Posted on Reply
#13
newtekie1
Semi-Retired Folder
by: Nkd
Are you kidding me? The GTX 280's 2D clocks make a big difference in power saving; I am sure you have seen the results online. I think you might be thinking of the 9800 GTX series, which definitely had no 2D clocks in the BIOS, but the GTX 200 series uses lower voltage and really low clocks in 2D mode, which results in big power savings.
You're right, the GTX 280 does benefit from 2D clocks. But how do you know these drivers are the cause of your loss of 2D clocks? A quick scan through the release notes says nothing about losing 2D clocks.
Posted on Reply
#14
Nkd
Because I never had the 2D clock issue with any of the betas, and the last one I was using was 177.70.
Posted on Reply
#15
Cold Storm
Battosai
Sweet. I'll be playing with these drivers come tonight!
Posted on Reply
#16
newtekie1
Semi-Retired Folder
by: Nkd
Because I never had the 2D clock issue with any of the betas, and the last one I was using was 177.70.
That is a faulty assumption.
Posted on Reply
#17
Nkd
Alright, I reinstalled the driver, and the 2D issue is gone now.
Posted on Reply
#18
robspierre6
Nice.

by: newtekie1
This argument is the same as asking why we process AA or AF on the GPU. If the GPU didn't have to process AA/AF, it could process something else better. Everything is a trade-off.

Of course, in some cases there is absolutely no trade-off, like the GT200 GPUs that have a dedicated section of the GPU die specifically for PhysX. So when you run PhysX on those GPUs, it doesn't take any power away from the graphics rendering.

Of course, there is also the fact that your main GPU doesn't have to do the PhysX processing at all. You can have a super high-end graphics card doing the graphical rendering and a $50 GPU doing the PhysX, and you will be perfectly fine, with no power taken away from the graphics GPU.

Firstly, the GPU doesn't do PhysX better, just faster. But will games actually benefit from that?
Secondly, it is a battle between Intel and NVIDIA, because using the GPU to process what's left to the CPU in games, which is physics and artificial intelligence, means you don't even need a mid-range CPU to play games. That leads to less money for Intel and more money for NVIDIA.
Thirdly, a CPU can't process pixels/geometry/textures via rasterization; we need a parallel processor, the GPU, to do that. That's why a CPU can't process AA.
Posted on Reply
#19
farlex85
Hmmm, just installed these and had Crysis crash on me for the first time ever in my time playing the game. I think I liked the .66s better...


by: robspierre6

Firstly, the GPU doesn't do PhysX better, just faster. But will games actually benefit from that?
Secondly, it is a battle between Intel and NVIDIA, because using the GPU to process what's left to the CPU in games, which is physics and artificial intelligence, means you don't even need a mid-range CPU to play games. That leads to less money for Intel and more money for NVIDIA.
Thirdly, a CPU can't process pixels/geometry/textures via rasterization; we need a parallel processor, the GPU, to do that. That's why a CPU can't process AA.
Of course games will benefit from faster physics processing. If you can dedicate faster processing to complex physics in the world around you, you get much, much more realism. You don't really need a great CPU to play games now; it's been mostly GPU for a while. The CPU is most important in gaming because a CPU that isn't fast enough will bottleneck the GPU. Everything, after all, goes through the CPU on some level.
Posted on Reply
#20
OnBoard
by: robspierre6
Why would you process PhysX on your GPU?
We already have struggling GPUs.
A sub-$200 quad core can handle the job pretty well.
The more your GPU processes one thing, the less it has left for everything else.
Maybe I wanna see the extra physics stuff in Warmonger and CellFactor? I've tried CellFactor before with physics enabled and it was horribly slow; now it might actually run.

Why would I get a quad when no game needs it? It's not like you get hardware physics with a CPU in AGEIA games. My dual core handles software physics already; no slowdowns on any Crysis explosion.

And yes, I'm not expecting physics for free; it will cost me framerate or eye candy, so? The last level of Crysis is the only place my GPU has struggled; everything else flies. I am going from 1280x1024 to 1680x1050 once my new LCD arrives, and that will probably be a bigger performance hit than these physics :)

It's something new and I can get it for FREE. No need to go out and buy a PPU or a quad. And it's not like it's in every game, just these: http://www.nzone.com/object/nzone_physxgames_home.html
If the going gets too tough I won't cry, I'll just disable it :p
Posted on Reply
#21
newtekie1
Semi-Retired Folder
by: robspierre6
Firstly, the GPU doesn't do PhysX better, just faster. But will games actually benefit from that?
Secondly, it is a battle between Intel and NVIDIA, because using the GPU to process what's left to the CPU in games, which is physics and artificial intelligence, means you don't even need a mid-range CPU to play games. That leads to less money for Intel and more money for NVIDIA.
Thirdly, a CPU can't process pixels/geometry/textures via rasterization; we need a parallel processor, the GPU, to do that. That's why a CPU can't process AA.
The GPU is far better at calculating physics; it does it faster, which is better. Plenty of games will benefit from this.

The difference between PhysX and Havok is definitely a fight between Intel and nVidia, but you talked about CUDA. CUDA is a completely different technology, used for far more than PhysX. It's still kind of a fight between the two companies, but not really relevant to this discussion.

Yes, I know a CPU can't process AA; that wasn't my point. My point was that any time you add some eye candy to a game, it takes away processing power from something else. Everything is a trade-off. Turn AA down from 16x to 8x and run PhysX with little to no frame rate loss. Or leave AA at 16x and don't use PhysX.
Posted on Reply
#23
Nkd
Well, the 2D clock issue is still there. If you don't run any 3D apps it's fine, but once you run a 3D app and then go back to the desktop, the card fails to return to 2D clocks. I am not making faulty assumptions; this is after reinstalling the driver. Every time I boot I see 2D clocks, but right after running a 3D app and going back to the desktop the clocks never return to 2D mode. This happens each and every time, in both XP and Vista.
Posted on Reply
#24
Nkd
These are excellent drivers, other than the fact that my GTX 280 is always sucking juice at those 3D clocks when I'm not even playing games.
Posted on Reply
#25
Kursah
I loaded them up and my GTX 260 works just fine: it switches to low 3D clocks after gaming for about a minute like usual, then down to 2D... and it boots into Vista @ 2D clocks.

It does seem like my OCs are a little more unstable, but then I was pushing pretty far with this card... granted, it seems some 280s are getting further.

Nkd, what are you watching your clocks with? EVGA Precision 1.3.1 came out the other day, made by the same dude who created RivaTuner, so either works great. I've used RivaTuner 2.09 and the EVGA tool; for this card I prefer the EVGA tuner since it has only what I need in the first place. I don't need much beyond GPU, shader, and memory clocks, fan speed, and a few profiles, so it fits perfectly.

I am considering going back to the 177.70s; those seemed more stable while my card was OC'd. I did learn something funny, though: when I had my GPU OC'd too far beyond the shader link, it'd kick down to 2D speeds... playing Combat Arms @ 1440x900, all settings maxed, 8x AA got me 22-35 FPS. It was actually playable; it'd get a little hitchy here and there, but I was pretty amused by that! Thought I'd share!

:toast:
Posted on Reply