
NVIDIA Adds DirectX 12 Support to GeForce "Fermi" Architecture

Well, the mobile GM108 is Maxwell gen 1 too, but for desktop graphics only GM107 derivatives were made.
True, forgot about the GM108
 
Could this be the reason I had so many problems installing/running Windows 10 on an older (2008) Asus laptop with a GT 425M before?
It would simply freeze up all the time, both during installation and in the short moments I actually made it into Windows...
 
Could this be the reason I had so many problems installing/running Windows 10 on an older (2008) Asus laptop with a GT 425M before?
It would simply freeze up all the time, both during installation and in the short moments I actually made it into Windows...
I doubt it. Win10 supports DX12; it does not require it. It was probably Asus and their "famous" ACPI implementations.
 
On topic, this quiet Fermi rollout mystifies me. Why would Nvidia not tell Fermi owners? It makes no sense to release the capability and not publicize it.

There was someone harassing the GeForce forum for most of the last 2 years, I don't remember who, but he would always ask about Fermi getting DX12 support in driver threads.

Also 3DMark says my GTX470 doesn't have enough memory and may not complete TimeSpy.
 
It's not 12_0 capable, it's 11_0 capable. The only real change is that it can now run the API itself.
(for reference, even Kepler and Maxwell gen 1 are only 11_0; not even 11_1 like GCN1 and Haswell/Broadwell, let alone 12_0 like GCN2-4 or 12_1 like GCN5, Maxwell gen 2, Pascal, and Skylake/Kaby Lake)

edit:

No they didn't. GCN1 is 11_1, 2-4 are 12_0, 5 is 12_1.
D3D12 supports feature levels 11_0, 11_1, 12_0, 12_1

So? That makes it totally capable of running DX12 code. The problem here is that Nvidia took a couple of years to fulfill the promise of bringing DX12 to Fermi. They promised Vulkan too, then later decided it was "not worth it".
AMD was very clear from the beginning that only GCN would get it, and best of all is Intel, supporting Ivy Bridge and up.
 
Tried it on an old Fermi card I had laying around.

untitled.png
 
So? That makes it totally capable of running DX12 code. The problem here is that Nvidia took a couple of years to fulfill the promise of bringing DX12 to Fermi. They promised Vulkan too, then later decided it was "not worth it".
AMD was very clear from the beginning that only GCN would get it, and best of all is Intel, supporting Ivy Bridge and up.
I have to repeat my question: AMD left GCN 1.0 out initially. When did they add DX12 support for GCN 1.0?
 
This is like back in the old days, when companies would regularly add support for newer versions of DirectX in software. My old GeForce2 MX was a DX7 card, but the latest drivers made it report DX9 in dxdiag.
 
Last time I checked, I couldn't even launch the API test :)
Also, I'm sorry W1zzard.... ("fix" needs to be fixed) : LINK.

Scores for my GTX 580 Lightning (1.5 GB):
Time Spy : LINK (Score : 962)
API Overhead test 2.0 : LINK
Gears of War 4* :
FnJUu7s.png

*have a problem with resolution scaling lock... (trying to fix).
 
I have to repeat my question: AMD left GCN 1.0 out initially. When did they add DX12 support for GCN 1.0?

From day one, same with Vulkan.
 
Two pages of posts, and most of them negative. How can Direct3D 12 support for Fermi be a bad thing? If the shoe were on the other foot, the tone would be completely different. Nvidia is offering great driver support for >10 years for their hardware, and even backporting some features to their legacy drivers, which far beats the driver support of their competitor, who has been known to drop support for 2-3 year old hardware.

Many still use Fermi hardware; even I have one in daily use. But it seems that all of you have missed the most important part of this news: accelerating the adoption of new APIs. Even those who don't own a Fermi card should be really happy about this, since it removes the last reason for developers to spend resources on pre-Direct3D 12 APIs. Accelerating the adoption of new APIs serves everyone, even if the new APIs don't show their full potential on old hardware. This is something worth celebrating.
 
From day one, same with Vulkan.
Thanks for the source. I know for a fact that at least on Linux Vulkan was only supported on GCN 2.0 at launch. It has trickled down since, but it still requires the closed source driver.
 
On Linux it's different: AMD is building a new open driver in parallel with a closed one. AMDGPU is based on the current open Mesa radeonsi stack, while AMDGPU-PRO reuses the Windows OpenGL and Vulkan components, the idea being to carry over the profile optimizations that professional programs need. The open driver has alpha support for GCN 1.0 and beta support for GCN 1.1; AMDGPU-PRO has full support, but is a worse driver than the open AMDGPU. Vulkan support on AMDGPU-PRO is a given, as it is the same component used on Windows, but with AMDGPU you have to use the open RADV driver, which is not made by AMD (they did say they will open-source their Vulkan driver and merge it with RADV, along with OpenCL). Since GCN 1.0 is not compatible with AMDGPU by default, you don't get a Vulkan driver unless you use AMDGPU-PRO.
 
Two pages of posts, and most of them negative. How can Direct3D 12 support for Fermi be a bad thing? If the shoe were on the other foot, the tone would be completely different. Nvidia is offering great driver support for >10 years for their hardware, and even backporting some features to their legacy drivers, which far beats the driver support of their competitor, who has been known to drop support for 2-3 year old hardware.

Many still use Fermi hardware; even I have one in daily use. But it seems that all of you have missed the most important part of this news: accelerating the adoption of new APIs. Even those who don't own a Fermi card should be really happy about this, since it removes the last reason for developers to spend resources on pre-Direct3D 12 APIs. Accelerating the adoption of new APIs serves everyone, even if the new APIs don't show their full potential on old hardware. This is something worth celebrating.

It IS great news, but it's too late, and on an API that is not as good as the one they already say they are not implementing (Vulkan).
Don't forget how Nvidia drops performance optimizations for older generations of "supported" cards: you get a new driver with Fermi or Kepler, but don't expect those 50% improvements in the release notes to apply to your card. Personal experience with a then-"supported" 7600GT.
 
I have to repeat my question: AMD left GCN 1.0 out initially. When did they add DX12 support for GCN 1.0?
I guess at least since Windows 10 was released, or even before then, actually. I've been able to run DX12 programs on my 280X since I first installed one of the earliest pre-release Insider Preview builds. One of the first things I tried to run was Star Swarm in D3D12 mode to see how it compared to D3D11 and Mantle modes. Long story short, it worked. And I found out D3D12 was ~2x as fast as D3D11, but still ~50% slower than Mantle on my card.
 
Did the GeForce 200 series ever get DX11, or is it still stuck on DX10/10.1?
 
It IS great news, but it's too late, and on an API that is not as good as the one they already say they are not implementing (Vulkan).
Don't forget how Nvidia drops performance optimizations for older generations of "supported" cards: you get a new driver with Fermi or Kepler, but don't expect those 50% improvements in the release notes to apply to your card. Personal experience with a then-"supported" 7600GT.
Nobody is expecting major architecture specific optimizations on >5 year old architectures, but the fact that you still get support and some new features and optimizations is far better than the competition.

Did the GeForce 200 series ever get DX11, or is it still stuck on DX10/10.1?
No, pre-Fermi never got Direct3D 11 or OpenGL 4.x, since major hardware features are lacking. Nvidia does still support all the parts of OpenGL 4.x it can on GeForce 8000->300, though.

The reason Nvidia is able to implement Direct3D 12 on Fermi, even though it's very different from Kepler, is that there are very few new hardware requirements going from Direct3D 11 to 12 (base requirements). Since Fermi uses different modules internally in the driver, most of the support had to be rewritten from scratch, and since it's a low-priority issue it has taken some time. But Direct3D 12 will never work as well on Fermi as on newer architectures.
 
Did the GeForce 200 series ever get DX11, or is it still stuck on DX10/10.1?

If the GPU is designed for DX10/OGL3, there is no changing that. You can ask the Linux Mesa team about emulating OpenGL 4.0; good luck with tessellation.
DX12 is DX11 with a low-overhead design, which is why it can be implemented on old DX11 hardware; likewise, Vulkan can be implemented on anything that can run OpenGL 4.2 (we are at 4.5+ now). DX11, on the other hand, is a bunch of extensions on top of DX10: to run DX11 apps on a card designed for DX10, you would have to emulate those features on the CPU, with the corresponding performance penalty. That is why it's not possible to make an 8800GT or an HD3850 run DX11.

Nobody is expecting major architecture specific optimizations on >5 year old architectures, but the fact that you still get support and some new features and optimizations is far better than the competition.

Old architectures don't even get game optimizations; they're practically zombies in the drivers. It's no different from keeping a fixed old driver. DX12 is not a "new feature": it was promised years ago, and it's very late compared to the rest of the supported architectures.
 
Old architectures don't even get game optimizations; they're practically zombies in the drivers.
The usefulness of game optimizations is exaggerated; they usually only help edge cases. In fact, I would prefer vendors to stop doing them altogether.
Nevertheless, they add support for new operating system versions, for Windows, Linux, and BSD.

It's no different from keeping a fixed old driver. DX12 is not a "new feature": it was promised years ago, and it's very late compared to the rest of the supported architectures.
Fermi uses a different code path from Kepler and newer, so it is a new feature for the Fermi driver. There is a reason Nvidia's Direct3D 12 performance has improved a lot since its first introduction: tweaking their API implementation in the specific driver code paths. Creating optimal code paths for Vulkan, Direct3D, and OpenGL for each architecture is the optimization work that matters; it benefits all software, unlike random (stupid) tweaks to a specific game's shader program which might help 7% in that edge case.
 
Yeah, the thing with Linux or BSD is that the closed drivers are always playing catch-up with kernel changes. But yeah, if you want proper support outside of Windows, go Nvidia; there is no other way.
 
Last time I checked, I couldn't even launch the API test :)
Also, I'm sorry W1zzard.... ("fix" needs to be fixed) : LINK.

Scores for my GTX 580 Lightning (1.5 GB):
Time Spy : LINK (Score : 962)
API Overhead test 2.0 : LINK
Gears of War 4* :
FnJUu7s.png

*have a problem with resolution scaling lock... (trying to fix).

I have one of those somewhere in the basement too. I got it second hand :-)

To everyone saying AMD 7xxx supported DX12 from the start, unlike Fermi: Fermi was released around the same time as the AMD 5xxx series, which still does not support DX12, fwiw.
 
There was someone harassing the GeForce forum for most of the last 2 years, I don't remember who, but he would always ask about Fermi getting DX12 support in driver threads.
At that rate, Vulkan support for XP isn't far behind!

On a more serious note, it's nice to see hardware that's been out of production for more than five years get a feature boost. It understandably doesn't happen often (poor return on investment). Now to find a decent DX12 game that can run on a 560 Ti... oh wait, I'm not on Win10. So much for that I guess. Still nice to see this though.
 
Who uses those hot bricks still?

Imho they rank with the FX 5800 and the HD 2900.
 