Monday, December 21st 2015

NVIDIA Releases GeForce 361.43 WHQL Drivers

NVIDIA released one of its last GeForce drivers for the year, version 361.43 WHQL. Surprisingly, these drivers aren't "Game Ready," and as such aren't built around optimizations for any new game releases. They instead bring GameWorks VR 1.1 support, including SLI support for VR applications that use OpenGL. Support for the latest version of the Oculus SDK is also included. The drivers add/update SLI profiles for DayZ, Dungeon Defenders 2, Elite Dangerous (64-bit executable), Hard West (DirectX 11 renderer), and Bless.
DOWNLOAD: NVIDIA GeForce 361.43 WHQL

19 Comments on NVIDIA Releases GeForce 361.43 WHQL Drivers

#1
Nelson Ng
Just finished setting up my system and everything was perfect!

Powered on my system in the morning (before I go to work) and was prompted to install this new driver by the Nvidia GeForce Experience application. Okayed the installation....

Halfway through, the installation failed at the driver upgrade stage. Selected retry; it failed again. Now my Win 10 three-display setup defaults back to a low-resolution single display.

Restarted the system and tried the driver upgrade again... still failed. Dug out the previous driver and tried... also failed! Luckily I saved a System Restore point only yesterday. Rolled back to that restore point and my perfect configuration is back again.

Immediately turned OFF the check-for-updates and auto-download options. So much thanks to Nvidia for almost destroying my setup. Also downloaded Win Updates Disabler and disabled Windows Updates. Not giving any of these s***holes another chance to turn my system into another guinea pig...

Rushed off to work (late)

Nelson Ng Yeng Wai
(Singapore)
#2
NC37
The most important part about this is that it's the WDDM 2.0 driver for Fermi. It isn't a perfect one, as SLI doesn't work under it, but nVidia came through as promised and delivered it before the end of the year.

Sadly, too late for me. Lovin my 390, see ya in another few years nVidia!
#3
john_
NC37 said:
The most important part about this is that it's the WDDM 2.0 driver for Fermi. It isn't a perfect one, as SLI doesn't work under it, but nVidia came through as promised and delivered it before the end of the year.

Sadly, too late for me. Lovin my 390, see ya in another few years nVidia!
They promised it for the day Windows 10 officially launched, not for before the end of the year. Before Windows 10 in July, everyone was betting their house that AMD wouldn't be able to support DX12 with GCN 1.0 cards and that Nvidia would be ready with support for Fermi cards at Windows 10's official launch. In the end, AMD offered DX12 support on time for GCN 1.0, while Nvidia took 5 more months to offer partial support. But even that partial support is at least something. It's good to see that they also fixed most instances of the 144Hz high power consumption bug (the GPU still seems to run at full speed when using two monitors, based on a comment I read).
#4
renz496
john_ said:
They promised it for the day Windows 10 officially launched, not for before the end of the year. Before Windows 10 in July, everyone was betting their house that AMD wouldn't be able to support DX12 with GCN 1.0 cards and that Nvidia would be ready with support for Fermi cards at Windows 10's official launch. In the end, AMD offered DX12 support on time for GCN 1.0, while Nvidia took 5 more months to offer partial support. But even that partial support is at least something. It's good to see that they also fixed most instances of the 144Hz high power consumption bug (the GPU still seems to run at full speed when using two monitors, based on a comment I read).
Why compare GCN to Fermi? At the very least Fermi will be able to run DX12 games, unlike the AMD cards from the same generation as Fermi, where AMD refuses to support DX12 despite the hardware being fully capable.
#5
john_
renz496 said:
Why compare GCN to Fermi? At the very least Fermi will be able to run DX12 games, unlike the AMD cards from the same generation as Fermi, where AMD refuses to support DX12 despite the hardware being fully capable.
Fermi was made available in 2010 and was a completely new architecture. In the same period AMD had the 6000 series, which was in fact an evolution of the 5000 series that came out in 2009. So Fermi is the newer design. The first AMD design targeting GPU compute that also offers the features necessary to support DX12 is GCN 1.0. And I am comparing them for the reason I wrote. Everyone was certain 6 months ago that AMD would fail to support GCN 1.0, while Nvidia would come up with full DX12 support on Fermi cards thanks to their miraculous driver team. 5 months later, it offers partial support. That's much better than nothing; I never said the opposite.
#6
renz496
john_ said:
Fermi was made available in 2010 and was a completely new architecture. In the same period AMD had the 6000 series, which was in fact an evolution of the 5000 series that came out in 2009. So Fermi is the newer design. The first AMD design targeting GPU compute that also offers the features necessary to support DX12 is GCN 1.0. And I am comparing them for the reason I wrote. Everyone was certain 6 months ago that AMD would fail to support GCN 1.0, while Nvidia would come up with full DX12 support on Fermi cards thanks to their miraculous driver team. 5 months later, it offers partial support. That's much better than nothing; I never said the opposite.
So? If your point is that Fermi is newer just because it came out a bit later than AMD's 5k series, then GCN also came much later, at the end of 2011; that means the GCN design is much 'newer' and much more advanced than Fermi. Also, AMD did change the design for the 6900 series so those cards would have better compute capability; it was just short-lived once AMD moved to GCN. And don't mix up DirectX support and compute. The 5k and 6k series ARE DX11-capable GPUs; compute or not, they are compliant with the DX11 spec just like Fermi was. What is actually preventing AMD from supporting DX12 on both series? Do you think Fermi is able to support DX12 because of compute? And for the record, Nvidia never said Fermi would be able to fully support the DX12 spec. What they actually said is quite the opposite: that newer hardware will be needed to support some of the DX12 features.
#7
rtwjunkie
PC Gaming Enthusiast
I'm wondering if this fixes the Afterburner OSD issue. The last driver made my GPU read as permanently running at 400MHz on three different systems. Rolling back to the previous one (so I am now 2 versions back today) fixed that issue.
#8
RejZoR
The update process for this driver is bugged. I had to use DDU and clean out the old driver before I could upgrade. Rather silly...
#9
renz496
rtwjunkie said:
I'm wondering if this fixes the Afterburner OSD issue. The last driver made my GPU read as permanently running at 400MHz on three different systems. Rolling back to the previous one (so I am now 2 versions back today) fixed that issue.
I have the OSD enabled all the time and I see nothing weird with this driver. Though I did skip a few driver releases because I was too lazy to do driver updates every week or so; my previous driver was 358.87.
#10
Red_Machine
This driver causes a ridiculous amount of load on my CPU, causing the OS and programs to lag, and eventually leading to overheating. I've submitted a bug report to nVIDIA support, so hopefully it'll be fixed soon.
#11
c2DDragon
Installed the updated drivers, though I don't know why... I think I act like a bot; nothing interesting for me in this release.
No issue/bug so far.
#12
purplekaycee
Had the same problem last week when trying to upgrade Nvidia drivers for Black Ops 3.
What are the best and latest drivers I can download for my GTX 780 to play Black Ops 3 on Windows 10 64-bit?
#13
Slizzo
purplekaycee said:
Had the same problem last week when trying to upgrade Nvidia drivers for Black Ops 3.
What are the best and latest drivers I can download for my GTX 780 to play Black Ops 3 on Windows 10 64-bit?
I've been using the latest drivers all along for my GTX 780s. The game runs great in SLI with G-Sync (can't say that about all titles; Fallout 4 in SLI runs like garbage with G-Sync. Force it to single-card mode and it's fine).

I'm using the previous WHQL Game Ready drivers currently.

GUYS, also please be aware that if you're trying to update your driver through GeForce Experience, there is an update for the application as well that fixes a lot of the issues with installing drivers that were mentioned here. Make sure you're running the latest version of GFE!!
#14
efikkan
"Front buffer rendering", does not sound wise.
#15
R-T-B
efikkan said:
"Front buffer rendering", does not sound wise.
Where did you read anything resembling that?
#18
xorbe
Red_Machine said:
This driver causes a ridiculous amount of load on my CPU, causing the OS and programs to lag, and eventually leading to overheating. I've submitted a bug report to nVIDIA support, so hopefully it'll be fixed soon.
I had this (desktop lag) problem with a past driver. When I uninstalled and reinstalled, it worked. Try that.
#19
efikkan
R-T-B said:
Where did you read anything resembling that?
Ah, you caught me sir! I did not read the article yet. ;)
I found some explanation in the Oculus docs.
Since VR renders two virtual screens, it leverages one buffer to update one eye while the other is rendering, instead of waiting for both to be done. The term "front buffer rendering" sounds a bit confusing, since the buffer is used as a virtual back buffer.
For anything like this to work, the game needs strict control over the frame rate. It will also limit the ability to optimize the geometry workload, which will be the same for both eyes.
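The alternating-buffer idea above can be sketched in plain Python. This is only an illustrative simulation of the scheduling pattern, not the Oculus SDK API; every name here (EyeBuffer, render_eye, run_frames) is hypothetical:

```python
# Hypothetical sketch: each eye owns its own buffer, and the loop submits
# one eye at a time rather than waiting for a full stereo pair to finish.
# None of these names come from the actual Oculus SDK.

class EyeBuffer:
    def __init__(self, eye):
        self.eye = eye          # "left" or "right"
        self.frame = None       # last frame rendered into this buffer

def render_eye(eye, frame_no):
    # Stand-in for the real per-eye draw call.
    return f"{eye}-frame-{frame_no}"

def run_frames(n_frames):
    buffers = {"left": EyeBuffer("left"), "right": EyeBuffer("right")}
    submitted = []              # order in which eye buffers reach the display
    for frame_no in range(n_frames):
        for eye in ("left", "right"):
            # Render this eye's buffer while the other eye's buffer is
            # (conceptually) being scanned out -- no wait for both eyes.
            buffers[eye].frame = render_eye(eye, frame_no)
            submitted.append(buffers[eye].frame)
    return submitted

# Eyes alternate every half-frame instead of swapping as a pair:
print(run_frames(2))
# → ['left-frame-0', 'right-frame-0', 'left-frame-1', 'right-frame-1']
```

This also shows why strict frame-rate control matters: each per-eye render must complete within half a refresh interval, or the eye currently being displayed will show a stale frame.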