Wednesday, April 25th 2018

NVIDIA Releases GeForce 397.31 WHQL Drivers

NVIDIA today released GeForce 397.31 WHQL drivers. With this release, NVIDIA discontinues regular support for 32-bit versions of Windows and drops support for GPUs based on the NVIDIA "Fermi" architecture (GeForce 400 series and 500 series). The drivers also add the first official support for NVIDIA RTX real-time ray-tracing technology. To use it, you'll need a GPU based on NVIDIA's next-generation "Volta" architecture (such as the $3,000 TITAN V), the latest major version of Windows 10, and Microsoft's DXR developer package. The drivers also add support for the Vulkan 1.1 API. Besides the above, GeForce 397.31 WHQL is game-ready for "BattleTech" and "Frostpunk." Grab it from the link below.

The change-log follows.

Discontinued Support
  • 32-bit Operating Systems: Beginning with Release 396, NVIDIA is no longer releasing Game Ready drivers for 32-bit operating systems for any GPU architecture.
  • NVIDIA Fermi GPUs: Beginning with Release 396, the NVIDIA Game Ready driver no longer supports NVIDIA GPUs based on the Fermi architecture.
Game Ready
Provides the optimal gaming experience for BattleTech and Frostpunk.

New Features
NVIDIA RTX Technology
Developer preview for NVIDIA RTX ray tracing technology for DirectX 12. NVIDIA RTX supports the Microsoft DirectX Raytracing (DXR) API on NVIDIA Volta GPUs.
In order to get started with developing DirectX Raytracing applications accelerated by RTX, you'll need the following:
  • NVIDIA Volta GPU
  • NVIDIA driver version 396 or higher
  • Windows 10 RS4
  • Microsoft's DXR developer package, consisting of DXR-enabled D3D runtimes, HLSL compiler, and headers
Vulkan 1.1
This driver release provides full support for the new Vulkan 1.1 API and passes the Vulkan Conformance Test Suite (CTS).
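Vulkan reports the instance version as a single packed 32-bit integer (major in bits 22-31, minor in bits 12-21, patch in bits 0-11), following the `VK_MAKE_VERSION` macro from `vulkan.h`. As a hedged sketch, this pure-Python illustration shows how an application would decode the value returned by the real `vkEnumerateInstanceVersion` entry point and confirm at least Vulkan 1.1 support; it models the packing scheme only and is not a binding to the actual loader:

```python
# Vulkan packs versions as (major << 22) | (minor << 12) | patch,
# mirroring the VK_MAKE_VERSION / VK_VERSION_* macros in vulkan.h.

def make_version(major: int, minor: int, patch: int) -> int:
    return (major << 22) | (minor << 12) | patch

def decode_version(packed: int) -> tuple:
    return (packed >> 22, (packed >> 12) & 0x3FF, packed & 0xFFF)

def supports_vulkan_1_1(instance_version: int) -> bool:
    # vkEnumerateInstanceVersion (new in the 1.1 loader) returns such a
    # packed value; a 1.0-only loader lacks the entry point entirely.
    return instance_version >= make_version(1, 1, 0)

print(decode_version(make_version(1, 1, 73)))       # (1, 1, 73)
print(supports_vulkan_1_1(make_version(1, 0, 65)))  # False
```

In practice, a 1.0-era loader does not export `vkEnumerateInstanceVersion` at all, so applications probe for the symbol first and assume Vulkan 1.0 when it is missing.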

HD Audio
This driver adds new sample rates of 32 kHz, 88.2 kHz, and 176.4 kHz to the HDMI device for all GPUs.

Display Driver Standalone Installer
The standalone display driver installer now removes extracted files after installing the driver, leaving a smaller footprint on the hard drive.

Display Driver Libraries
Added new libraries (nvdlist.dll and nvdlistx.dll) to support Optimus and MSHybrid notebooks.

Added a new API that lets the client reconfigure the decoder resolution and other post processing parameters (such as the display resolution, cropping rectangle, and aspect ratio of the decoded frame) without having to destroy and recreate the decoder instance. This API is useful in scenarios where the decoder instance initialization time takes up a significant portion of the overall decode execution time; for example, in back-to-back decoding of multiple short clips of different resolutions.

The new API will be included in Video Codec SDK 8.2, which is expected to release in late Q2 2018.
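The value of such a reconfigure API is that per-clip parameter changes become cheap updates instead of full teardown/re-init cycles. The sketch below illustrates that pattern only; the class and method names (`Decoder`, `reconfigure`) are hypothetical stand-ins, not the actual Video Codec SDK 8.2 API:

```python
# Hypothetical illustration of reconfigure-vs-recreate: one expensive
# initialization is reused across clips of different resolutions.

class Decoder:
    init_count = 0  # tracks how many expensive initializations ran

    def __init__(self, width, height):
        Decoder.init_count += 1  # expensive: allocate surfaces, contexts, ...
        self.width, self.height = width, height

    def reconfigure(self, width, height):
        # cheap: update parameters on the live instance, no teardown
        self.width, self.height = width, height

# Back-to-back short clips at different resolutions share one instance:
clips = [(1920, 1080), (1280, 720), (3840, 2160)]
dec = Decoder(*clips[0])
for w, h in clips[1:]:
    dec.reconfigure(w, h)

print(Decoder.init_count)  # 1 initialization instead of 3
```

For many short clips, this turns decoder setup from a per-clip cost into a one-time cost, which is exactly the scenario the changelog describes.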

Application SLI Profiles
Added or updated the following SLI profiles:
  • Descenders
  • Frostpunk
  • Warhammer: Vermintide 2
  • Far Cry 5
3D Vision Profiles
Added or updated the following 3D Vision profiles:
  • Descenders - Good
  • EVE Valkyrie - Warzone - Good
Fixed Issues in this Release
  • [GeForce GTX 1080 Ti][Doom]: The game crashes due to the driver reverting to OpenGL 1.1 when HDR is enabled. [2049623]
  • [GeForce GTX 1060][Far Cry 5]: The game crashes after a few minutes of game play. [2096077]
  • NvfbcPluginWindow temporarily prevents Windows from shutting down after launching a Steam game. [2068833]
  • [Firefox]: Driver TDR error may occur when using Firefox. [2049523]
  • [GeForce GTX 1060][Rise of the Tomb Raider]: Flickering/corruption occurs when opening the in-game options UI. [200351146]
  • [NVIDIA Control Panel][SLI][Diablo III]: With V-Sync on and SLI enabled, the game freezes after switching windows (ALT+TAB) a few times. [1951584]
Windows 10 Issues
  • [Microsoft Edge][HDR]: With HDR turned ON, video playback in full-screen mode on an HDR display may cause corruption of the video and desktop. To recover, manually turn the monitor OFF and then back ON. A future driver release will address this issue.
  • [Far Cry 5]: Green flickering occurs in the game when using HDR with non-native resolution. To work around, either quit and then restart the game while in the desired resolution, or press [Alt+Tab] away from and then back to the game, or press [Alt+Enter] to switch to windowed mode and then back to full-screen mode.
  • [NVIDIA TITAN V][G-Sync]: G-Sync displays may go blank when switching between different overclocked memory clocks multiple times. [200361272]
  • [SLI][GeForce GTX 780 Ti]: There is no display output when connecting the DisplayPort and two DVI monitors. [1835763]
  • [GeForce TITAN (Kepler-based)]: The OS fails after installing the graphics card on a Threadripper-enabled motherboard. [1973303]
  • [Pascal GPUs][Gears of War 4]: Blue-screen crash may occur while playing the game. [2008731]
  • [GeForce GTX 1080 Ti][Warhammer: Vermintide 2][DirectX 12]: TDR errors may occur when changing resolutions in game. [200395335] To work around, use the DirectX 11 game option.
  • [NVIDIA Control Panel][Surround]: NVIDIA Surround hot keys do not work. [200394749]
  • [GeForce Experience][ShadowPlay]: The "In-Game Overlay" option cannot be enabled, nor does ShadowPlay recording work. [200390642] The Microsoft Media Foundation library must be installed in order to use these features; be sure to install the Media Foundation package first.
15 Comments on NVIDIA Releases GeForce 397.31 WHQL Drivers

I wonder if dumping of 32-bit and Fermi resolves into any speed improvements.
Posted on Reply
RejZoR: I wonder if dumping of 32-bit and Fermi resolves into any speed improvements.
Actually, the lazybones left the Fermi functions in the kernel-mode layer handler (nvlddmkm.sys).
They removed only the CUDA 2.0 architecture from nvcompiler*.dll.
Posted on Reply
So, the answer is no...
Posted on Reply
RejZoR: I wonder if dumping of 32-bit and Fermi resolves into any speed improvements.
It might in the long term, if legacy support prevents them from doing certain changes to the design.

But primarily it's about resource management. They have to draw the line somewhere in terms of legacy support, or the overall quality of the product will ultimately decline. Usually it's not even about money, but how to manage a large team efficiently. An ever-growing set of platforms and features will impact release cycles, resulting in new features arriving later and/or sacrificing QA. Even if they hire the best developers and QA personnel money can buy, and even with a good modular design and good unit tests etc. in place, any project will ultimately get to a point where more resources will just increase the overhead, resulting in having to choose between lowering quality, slowing down development or deprecating stuff.

I have never worked at either Nvidia or AMD, but I have great respect for their driver teams, since I know the pain of maintaining a code base across many platforms with an ever-expanding feature set. Deprecating stuff is usually not something developers do lightly; it can be really painful for a development team to have to disappoint part of their user base, but ultimately something has to give. Even if all the developers work on isolated branches and do basic verification of their changes, many new problems/bugs occur when merging these together, resulting in extra team members adding to development and QA overhead. I know Nvidia's driver team has a huge backlog of feature requests from enterprise customers, game developers and end users alike, in addition to their own wishes. I assume AMD's driver team has a long backlog too, like most large projects do. One concrete example I know of from sources was the launch of DirectX 12, when Nvidia already had many resources tied up in CUDA development, leaving them to prioritize stability and quality of the basic implementation and delay some performance tuning for a few months. They also chose to bring many of the driver-level changes of DirectX 12 to the core of the driver, resulting in "less gains from DirectX 12" in the public's view. From a purely technical point of view, Nvidia set the right priorities, but from a PR perspective it has resulted in a negative impression lasting to this day.

This fundamental problem of balancing feature requests, product maintenance, PR and customer requests is not unique to Nvidia; it is probably something all developers of larger projects have experienced. I've been there many times myself, trying to negotiate with product management and (important) customers: the customer wants feature X immediately, the developers want feature Y because it's important for the product in the long run, and they know doing X first will make Y twice as hard and force X to be redone afterwards, resulting in three times the effort, which in turn impacts the next feature request. Over many years I've seen many "stitched together" features accomplished through "quick fixes" that have ultimately ruined code bases, making products unmaintainable, new features "impossible" and resulting in notoriously broken products, sometimes even in developers resigning. I think nearly every long-term developer will know what I'm talking about here; perhaps you have seen this yourself.

So the answer is it's complicated…
I don't know if I answered your question :)
Posted on Reply
RejZoR: I wonder if dumping of 32-bit and Fermi resolves into any speed improvements.
There are different drivers for 32-bit and 64-bit. So no, the 32-bit driver cannot hold back the 64-bit driver, because the former is not even installed when you use the latter.
That said, with 32-bit gone, it could become possible to enable some optimizations and such that weren't possible before (the drivers are packaged separately, but they do share most of the code). Even so, there won't be any significant gains to be had.
Posted on Reply
Just a small FYI: there are multiple threads over at the official forums mentioning some issues with this driver. It seems to be mainly with 1060 cards, but I saw a 1080 mentioned in one thread. Sounds like it is best to hold off until they fix it.
Posted on Reply
Why the big version number jump?
Posted on Reply
newtekie1: Why the big version number jump?
Source control (git) auto-versioning schemes when creating a new branch?
It would mean the new branch got a lot more development (and bugs) than the parent branch, which got a couple of game optimizations.
Posted on Reply
newtekie1: Why the big version number jump?
Shedding 32-bit support means at least the build system was seriously simplified. That's new-branch material, and Nvidia's drivers have always been versioned after their development branches.
Posted on Reply
Well, gaming was OK with this latest driver, but somehow Windows became sluggish :wtf:
I went back to the previous one, 391.35; all good again now.
Posted on Reply
Let's hope they finally fixed the Firefox TDR this time. The last driver claimed to have fixed it but just reduced the occurrence rate (a lot, from once a day to once a week).
Posted on Reply
The new driver lowered my Far Cry 5 benchmark result by 8 to 10 FPS with GTX 1080 SLI at the 4K Ultra preset, from 68 to 58 FPS average. So we can say that the SLI correction is useless, even harmful.
Posted on Reply
bogami: The new driver lowered my Far Cry 5 benchmark result by 8 to 10 FPS with GTX 1080 SLI at the 4K Ultra preset, from 68 to 58 FPS average. So we can say that the SLI correction is useless, even harmful.
Or, we could go out on a limb and say 1 sample is not statistically relevant :P
It still sucks for you, because you're the only "sample" you care about.
Posted on Reply
It seems the desktop monitor refresh rate isn't working properly with this driver, with or without G-Sync.
Posted on Reply
went back to 389.10
Posted on Reply