
NVIDIA GeForce Driver Version 177.79 Released

Discussion in 'News' started by malware, Jul 29, 2008.

  1. malware New Member

    Joined:
    Nov 7, 2004
    Messages:
    5,476 (1.51/day)
    Thanks Received:
    956
    Location:
    Bulgaria
    Along with the release of the new NVIDIA GeForce 9800 GTX+, 9800 GT, and 9500 GT GPUs comes a new beta driver. The GeForce 177.79 release adds nothing but these three GPUs to the database of supported GeForce 8-series, 9-series, and 200-series GPUs. It may also bring performance improvements in various games and applications. Check the release notes for more information.

    DOWNLOAD: Windows XP 32-bit|64-bit, Windows Vista 32-bit|64-bit

    Source: NVIDIA
     
    Skitzo, pagalms, Kursah and 3 others say thanks.
  2. alexp999

    alexp999 Staff

    Joined:
    Jul 28, 2007
    Messages:
    8,045 (3.04/day)
    Thanks Received:
    862
    Location:
    Dorset, UK
    Hopefully these will be a lot better than those crappy 175.19's!
     
  3. marsey99

    marsey99

    Joined:
    Jul 18, 2007
    Messages:
    1,590 (0.60/day)
    Thanks Received:
    299
    177.39 are still my faves atm :toast:
     
  4. OnBoard

    OnBoard New Member

    Joined:
    Sep 16, 2006
    Messages:
    3,044 (1.03/day)
    Thanks Received:
    379
    Location:
    Finland
    So still no official ones that give PhysX, or have I missed them? I'm still on 174.70, which have been fantastic, but I'll be changing motherboards in a couple of days and might try something newer.

    edit: ah it was in the news here, coming in a week.
    http://forums.techpowerup.com/showthread.php?t=66626
     
    Last edited: Jul 29, 2008
  5. Arctucas

    Arctucas

    Joined:
    Jul 14, 2006
    Messages:
    1,772 (0.59/day)
    Thanks Received:
    290
  6. robspierre6 New Member

    Joined:
    Jul 27, 2008
    Messages:
    48 (0.02/day)
    Thanks Received:
    3
    So still no official ones that give PhysX, or have I missed them? I'm still on 174.70, which have been fantastic, but I'll be changing motherboards in a couple of days and might try something newer.


    Why would you process PhysX on your GPU?
    We already have struggling GPUs.
    A sub-$200 quad core can handle the job pretty well.
    That CUDA thing is actually a battle between NVIDIA's AGEIA and Intel's Havok.
    The more your GPU has to process one thing, the less it can process
    everything else.
     
  7. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,032 (6.15/day)
    Thanks Received:
    6,095
    Dude, why on earth would you still be using 175 series drivers? The 177 series has been out for months.

    Because the GPU is better at it than the CPU. Every gamer has a GPU, not every gamer has a quad-core.

    Correct, we do.

    A sub $100 GPU can do the job better.

    CUDA really has nothing to do with it. CUDA is far more than just physics. PhysX uses a few aspects of CUDA, but CUDA has far better uses than just video games.

    This argument is the same as asking why we process AA or AF with the GPU. If the GPU didn't have to process AA/AF, it could process something else better. Everything is a trade off.

    Of course, in some cases there is absolutely no trade off, like the GT200 GPUs that have a dedicated section of the GPU die specifically for PhysX. So when you run PhysX on those GPUs, it doesn't take any power away from the graphics rendering.

    Of course, there is also the fact that your main GPU doesn't have to do the PhysX processing at all. You can have a super high end graphics card doing the graphical rendering and a $50 GPU doing the PhysX, and you will be perfectly fine, with no power being taken away from the graphics GPU.
     
    Last edited: Jul 29, 2008
    Crunching for Team TPU 50 Million points folded for TPU
  8. Nkd New Member

    Joined:
    Sep 15, 2007
    Messages:
    93 (0.04/day)
    Thanks Received:
    18
    Yeah, they fixed the downclocking issue by totally eliminating 2D clocks; my card no longer goes into 2D mode and stays at full clocks. NVIDIA is getting pathetic now, breaking one of the most important features to fix an issue I never had. On the EVGA forum, Jacob confirmed to me that 2D clocks are not disabled in the driver, but that is not the case for me: the card stopped idling at 2D clocks and is now at full 3D clocks all the time.
     
    Last edited: Jul 29, 2008
  9. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,032 (6.15/day)
    Thanks Received:
    6,095
    What card is that with? There are several ways to get 2D clocks if you really want them, both via software and via BIOS flashes. Most nVidia cards no longer have 2D clocks in the BIOS, which is why they don't downclock; it has nothing to do with the drivers. If you want 2D clocks, set them yourself. There is no reason 2D clocks are "one of the most important features" though. They didn't even help save power.
     
  10. alexp999

    alexp999 Staff

    Joined:
    Jul 28, 2007
    Messages:
    8,045 (3.04/day)
    Thanks Received:
    862
    Location:
    Dorset, UK
    Cus these are the first beta drivers which support my dad's 8800GTS.

     
  11. Nkd New Member

    Joined:
    Sep 15, 2007
    Messages:
    93 (0.04/day)
    Thanks Received:
    18
    I got the gtx280
     
  12. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,032 (6.15/day)
    Thanks Received:
    6,095
    The first according to nVidia's site. The 177 series has supported the 8800 series for a very long time. 177.72, released a couple of weeks ago, supported pretty much every card out. I've been running 177.39 since June 20th on my 8800GS...
     
  13. Nkd New Member

    Joined:
    Sep 15, 2007
    Messages:
    93 (0.04/day)
    Thanks Received:
    18
    Are you kidding me? The GTX 280 makes a big difference in power saving; I am sure you have seen the results online. I think you might be talking about the 9800 GTX series, which did not have any 2D clocks in the BIOS for sure, but the GTX 200 series uses less voltage and really low clocks in 2D mode, which results in big power savings.
     
  14. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,032 (6.15/day)
    Thanks Received:
    6,095
    You're right, the GTX 280 does benefit from 2D clocks. How do you know these drivers are the cause of your loss of 2D clocks? A quick scan through the release notes says nothing about losing 2D clocks.
     
  15. Nkd New Member

    Joined:
    Sep 15, 2007
    Messages:
    93 (0.04/day)
    Thanks Received:
    18
    Because I never had the 2D clock issue with any of the betas, and the last one I was using was 177.70.
     
  16. Cold Storm

    Cold Storm Battosai

    Joined:
    Oct 7, 2007
    Messages:
    15,014 (5.84/day)
    Thanks Received:
    2,999
    Location:
    In a library somewhere on this earth
    Sweet. I'll be playing with these drivers come tonight!
     
  17. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,032 (6.15/day)
    Thanks Received:
    6,095
    That is a faulty assumption.
     
  18. Nkd New Member

    Joined:
    Sep 15, 2007
    Messages:
    93 (0.04/day)
    Thanks Received:
    18
    Alright, I reinstalled the driver, and the 2D issue is gone now.
     
  19. robspierre6 New Member

    Joined:
    Jul 27, 2008
    Messages:
    48 (0.02/day)
    Thanks Received:
    3
    nice

    This argument is the same as asking why we process AA or AF with the GPU. If the GPU didn't have to process AA/AF, it could process something else better. Everything is a trade off.

    Of course, in some cases there is absolutely no trade off, like the GT200 GPUs that have a dedicated section of the GPU die specifically for PhysX. So when you run PhysX on those GPUs, it doesn't take any power away from the graphics rendering.

    Of course, there is also the fact that your main GPU doesn't have to do the PhysX processing at all. You can have a super high end graphics card doing the graphical rendering and a $50 GPU doing the PhysX, and you will be perfectly fine, with no power being taken away from the graphics GPU.

    Firstly, the GPU doesn't do PhysX better, just faster. But will games actually benefit from that?
    Secondly, it is a battle between Intel and NVIDIA, because using the GPU to process what's left for the CPU in games, namely physics and artificial intelligence, means you don't even need a mid-range CPU to play games. That leads to less money for Intel, or more money for NVIDIA.
    Thirdly, a CPU can't process pixels/geometry/textures via rasterization. We need a parallel processor, the GPU, to do that. That's why a CPU can't process AA.
     
  20. farlex85

    farlex85 New Member

    Joined:
    Mar 29, 2007
    Messages:
    4,830 (1.75/day)
    Thanks Received:
    638
    Hmmm, just installed these and had Crysis crash on me for the first time ever in my playing the game. I think I liked the .66s better...


    Of course games will benefit from faster physics processing. If you can dedicate faster processing to complex physics in the world around you, you get much more realism. You don't really need a great CPU to play games now; it's been mostly GPU for a while. The CPU matters in gaming mostly because a slow one can bottleneck the GPU. Everything, after all, goes through the CPU on some level.
     
    Last edited: Jul 30, 2008
  21. OnBoard

    OnBoard New Member

    Joined:
    Sep 16, 2006
    Messages:
    3,044 (1.03/day)
    Thanks Received:
    379
    Location:
    Finland
    Maybe I wanna see the extra physics stuff in Warmonger and Cell Factor? I've tried Cell Factor before with physics enabled and it was horridly slow; now it might actually run.

    Why would I get a quad when no game needs it? It's not like you get hardware physics with a CPU in AGEIA games. My dual handles software physics already; no slowdowns on any Crysis explosion.

    And yes, I'm not expecting physics for free; it will cost me framerate or eye candy. So? The last level of Crysis is the only place my GPU has struggled; everything else flies. I am going from 1280x1024 to 1680x1050 once my new LCD arrives, and that will probably be a bigger hit on performance than these physics :)

    It's something new and I can get it for FREE. No need to go out and buy a PPU or a quad. And it's not like it's in every game, just these: http://www.nzone.com/object/nzone_physxgames_home.html
    If the going gets too tough I won't cry, I'll just disable it :p
     
  22. newtekie1

    newtekie1 Semi-Retired Folder

    Joined:
    Nov 22, 2005
    Messages:
    20,032 (6.15/day)
    Thanks Received:
    6,095
    The GPU is far better at calculating physics; it does it faster, which is better. Plenty of games will benefit from this.

    The difference between PhysX and Havok is definitely a fight between Intel and nVidia, but you talked about CUDA. CUDA is a completely different technology, used for far more than PhysX. It's still kind of a fight between the two companies, but not really relevant to this discussion.

    Yes, I know a CPU can't process AA; that wasn't my point. My point was that any time you add some eye candy to a game, it takes away processing power from something else. Everything is a trade off. Turn AA down from 16x to 8x and run PhysX with little to no frame rate loss. Or leave AA at 16x and don't use PhysX.
     
  23. ShadowFold

    ShadowFold New Member

    Joined:
    Dec 23, 2007
    Messages:
    16,921 (6.78/day)
    Thanks Received:
    1,644
    Location:
    Omaha, NE
    Works well with my dad's 9800GTX+!
     
  24. Nkd New Member

    Joined:
    Sep 15, 2007
    Messages:
    93 (0.04/day)
    Thanks Received:
    18
    Well, the 2D clock issue is still there. If you don't run any 3D apps it's fine, but once you run a 3D app and then go back to the desktop, the card fails to return to 2D clocks. I am not making faulty assumptions; this is after reinstalling the driver. Every time I boot I see 2D clocks, and right after running a 3D app and going back to the desktop the clocks never return to 2D mode. This happens each and every time, in both XP and Vista.
     
  25. Nkd New Member

    Joined:
    Sep 15, 2007
    Messages:
    93 (0.04/day)
    Thanks Received:
    18
    These are excellent drivers, other than the fact that my GTX 280 is always sucking juice at those 3D clocks even when I'm not playing games.
     
