
Nvidia 7 Series Doesn't Support DirectX 11.2

Windows 8.1 is right around the corner. I can be annoyed by a "new generation" of GPUs not supporting current standards, when they probably knew about DX 11.2 long before any of us did.

Just seen your edit.

It's not a current standard. It's an upcoming standard, i.e. it's not been released, and therefore cannot possibly be supported by current hardware. End of story. I cannot believe that you can't grasp such an obvious and simple thing.

It's been explained by myself and others, quite exhaustively, in the post above, too.
 
Wow, this thread needs to be locked down; it's like tapping on an empty can.
 
If the Xbox One is using all AMD hardware, and none of AMD's hardware currently has 11.2 support, I have a hard time believing that the Xbox One will have 11.2 support.

The Xbox One will support DX11.2; I'm sure the custom APU that AMD is building for it supports DX11.2. However, since none of the current PC hardware supports it, that means none of the devkits support it, so I can guarantee none of the launch games will use it.
 
Nvidia is working hard on its GL API; it supports more than 50 more extensions than AMD. Right now, with AMD's codenaming and marketing confusion, all I can imagine is AMD's offices full of flying papers and paper airplanes going from cubicle to cubicle.

AMD vs NV comparison is irrelevant. What's important is that OpenGL 4.3 is light years ahead of DX11.1 or DX11.2.

One of the biggest differences has to be bindless resources/extensions. This speeds up everything and brings OpenGL close to the consoles (bind to metal) with 25k calls...

DirectX 11 tops out at ~16k calls.


The driver doesn't need to search for a texture and ask "is A really A, or is it B?", unlike DirectX, which needs to do that on every call.
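To make that a bit more concrete, here's a rough host-side sketch of what bindless texturing looks like with the GL_ARB_bindless_texture extension (just my illustration, not from the article below; prog and uTex are placeholder names, and I'm assuming an extension loader like GLEW or glad exposes the entry points): you grab a 64-bit handle once, make it resident, and never touch glBindTexture again per draw.

// Create the texture as usual (done once at load time).
GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
glTexStorage2D(GL_TEXTURE_2D, 1, GL_RGBA8, 256, 256);

// Query a 64-bit handle and make it resident; after this, no per-draw
// glBindTexture and no name lookup/validation in the driver.
GLuint64 handle = glGetTextureHandleARB(tex);
glMakeTextureHandleResidentARB(handle);

// Hand the handle to the shader like any other uniform
// (uTex is a bindless sampler2D in the fragment shader).
glUniformHandleui64ARB(glGetUniformLocation(prog, "uTex"), handle);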




Bindless resources
The Modern CPU Bottleneck

OpenGL has evolved in a way that allows applications to replace many of the original state machine variables with blocks of user-defined data. For example, the current vertex state has been augmented by vertex buffer objects, fixed-function shading state and parameters have been replaced by shaders/programs and constant buffers, etc. Applications switch between coarse sets of state by binding objects to the context or to other container objects (e.g. vertex array objects) instead of manipulating state variables of the context. In terms of the number of GL commands required to draw an object, this enables applications to be an order of magnitude more efficient. However, this explosion of objects bound to other objects has led to a new bottleneck - pointer chasing and CPU L2 cache misses in the driver, and general L2 cache pollution.

Recent OpenGL graphics applications tend to change state at roughly these frequencies:
for (...) { // cold
    data downloads, render target changes, etc.
    for (...) { // warm
        bind textures
        for (...) { // hot
            bind constants
            bind vertex buffers
            Draw();
        }
    }
}
The most frequent state changes are binding vertex buffer objects (every draw), followed closely by binding constant buffers. Vertex buffer and constant buffer binds are significantly more expensive than one might expect. These binds require several reads from the driver internal object data structure to accomplish what the driver actually needs to do. In an OpenGL driver, it looks like this:

name->obj (lookup object by name)
obj->{refcount, GPU address, state, etc.} (dereference object to reference count it, to get its GPU virtual address, and validate its state).

Each of these dereferences has a high probability of causing a CPU L2 cache miss due to the inherently LRU-eviction-unfriendly nature of graphics applications (each frame starts over at the beginning). These L2 cache misses are a huge bottleneck in modern drivers, and a penalty paid for every frame rendered.
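In other words (a purely illustrative C++ sketch, not actual driver code; the struct and function names are made up), every bound name costs at least two dependent loads before the driver even has the address the GPU needs:

#include <cstdint>
#include <unordered_map>

// Hypothetical driver-internal bookkeeping, for illustration only.
struct BufferObject {
    int      refcount;    // bumped for every draw that references the buffer
    uint64_t gpuAddress;  // the virtual address the GPU actually needs
    uint32_t stateBits;   // validation state
};

// name -> obj lookup table (the first dereference)
std::unordered_map<unsigned, BufferObject*> nameTable;

void bindBuffer(unsigned name) {
    BufferObject* obj = nameTable[name];  // pointer chase #1: likely an L2 miss
    obj->refcount++;                      // pointer chase #2: likely another miss
    // ...then validate obj->stateBits and hand obj->gpuAddress to the GPU...
}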

Bindless Graphics has the following desirable properties:

- The driver need not dereference a vertex buffer or constant buffer on the CPU in order for the GPU to use it.
- Relieves the limits on how many buffer objects can be accessed at once by shaders.
- Buffer objects are accessed as C-style pointer dereferences in the shading language.
- Allows for dependent pointer fetches, enabling more complex scene graph structures to be built into buffer objects, providing significant new flexibility in the use of shaders.

Measurements have shown that bindless graphics can result in more than 7x speedup!
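Roughly, the vertex-buffer side of that looks like this (a sketch assuming the NV_vertex_buffer_unified_memory / NV_shader_buffer_load entry points are available through your extension loader; vbo, vboSize and vertexCount are placeholders): make the buffer resident once, fetch its GPU address, then per draw you just point the attribute at an address instead of re-binding the buffer name.

// One-time setup: make the buffer resident and grab its 64-bit GPU address.
glBindBuffer(GL_ARRAY_BUFFER, vbo);
glMakeBufferResidentNV(GL_ARRAY_BUFFER, GL_READ_ONLY);
GLuint64EXT vboAddr = 0;
glGetBufferParameterui64vNV(GL_ARRAY_BUFFER, GL_BUFFER_GPU_ADDRESS_NV, &vboAddr);

// Switch attribute 0 to address-based (unified memory) vertex pulling.
glEnableClientState(GL_VERTEX_ATTRIB_ARRAY_UNIFIED_NV);
glEnableVertexAttribArray(0);
glVertexAttribFormatNV(0, 3, GL_FLOAT, GL_FALSE, 3 * sizeof(float));

// Per draw: feed the GPU address directly - no name lookup, no refcounting,
// no pointer chasing in the driver.
glBufferAddressRangeNV(GL_VERTEX_ATTRIB_ARRAY_ADDRESS_NV, 0, vboAddr, vboSize);
glDrawArrays(GL_TRIANGLES, 0, vertexCount);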

Where is your source for this (out of curiosity)?

And does the Red camp support these features...?

AMD's 7000 series has full DX11.1 support at the hardware level.

Nvidia supports only the basic DX11.1 features; they skipped the rest.

NVIDIA has confirmed that their Kepler GPUs (GTX 600) do not support DX11.1. The feature level is stuck at 11_0.

DX11_1 features not supported:

- Target-Independent Rasterization (2D-Rendering)
- 16xMSAA Rasterization (2D-Rendering)
- Orthogonal Line Rendering Mode
- UAV in non-pixel-shader stages
http://www.tomshardware.com/news/Ke...sco-Feature-Set-Graphics-Core-Next,19839.html

http://www.heise.de/newsticker/meld...endig-zu-DirectX-11-1-kompatibel-1754119.html
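If anyone wants to check this on their own card, the quick way is to ask for feature level 11_1 when creating the D3D11 device and see what the runtime hands back (rough sketch, error handling trimmed; on a pre-11.1 runtime the first call simply fails and you retry without the 11_1 entry):

#include <d3d11.h>

// Ask for 11_1 first; the runtime returns the highest level the GPU/driver
// actually supports in 'got'. On Kepler this reportedly comes back as 11_0.
const D3D_FEATURE_LEVEL wanted[] = { D3D_FEATURE_LEVEL_11_1, D3D_FEATURE_LEVEL_11_0 };
D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_9_1;
ID3D11Device* dev = nullptr;
ID3D11DeviceContext* ctx = nullptr;

HRESULT hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                               wanted, 2, D3D11_SDK_VERSION, &dev, &got, &ctx);
if (FAILED(hr))  // an older runtime rejects an array containing 11_1 outright
    hr = D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                           &wanted[1], 1, D3D11_SDK_VERSION, &dev, &got, &ctx);
// 'got' now tells you whether the card is exposed as a true feature level 11_1 part.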
 
Yeah, they jumped the gun by releasing a product that was produced 8 months ago that doesn't support an API that isn't even officially released yet, and won't be for another 4-5 months. Oh, and when it is released, it will only be available to a small percentage of users and there likely won't be any software that uses it. Yeah, nVidia really jumped the gun, that's for sure.

So Nvidia releasing DX10-capable cards before the launch of Windows Vista never happened? It is jumping the gun for someone like me who plans to spend around $500 to $600 on a card when I only update every 2 generations. I will skip these cards based solely on that, because while it may not be important to you, it is to me due to the length of time I keep a GPU.
 
Most of DX 11.2 is backwards compatible anyway.
 
So Nvidia releasing DX10-capable cards before the launch of Windows Vista never happened?

It was actually around the same time that Vista was finalized. Vista was released in late 2006 to businesses, with the full commercial release happening in early 2007 (Jan/Feb I think). The important point is that DX10 was fully ratified some time before the official release date of Vista, so nvidia could release a card based on it.

I explained to you in the post above exactly why nvidia or amd can't possibly release a card based on an unreleased standard and it's really simple and obvious, but of course you've chosen to ignore it, lol.

For someone that doesn't want to understand something so basic and obvious, you're doing a really good job of it. :rolleyes:
 
The future is OpenGL. I think DirectX, whatever its version, will become less relevant.
 
Looks like someone got kidnapped by M$ marketeers. I wouldn't be concerned until there is a DX 11.2 game out. There aren't even any in development (that anyone knows of) right now. So, say, in three to five years when there are actual DX 11.2 titles out, the card will most likely be obsolete anyways. Oh, remember to shell out for Windows 8.1 as well.
 
Does it come with shader model 5.2 or 6? If it has neither of those, then it doesn't bother me much. OpenGL makes more sense; it's well past due for developers to move away from the DX fiasco.
 
So Nvidia releasing DX10-capable cards before the launch of Windows Vista never happened?

No, it really didn't. Technically, if you must, nVidia released their DX10-capable G80-based cards 1 day before Vista was officially released: the cards came out Nov. 7 and Vista was officially released Nov. 8. However, it was a paper launch, and actual retail availability of the cards didn't happen until the end of Nov. But even still, that is 1 day before the OS was released. With DX11.2 you're asking for support in a product that was available almost a year before release. That's asinine.

It is jumping the gun for someone like me who plans to spend around $500 to $600 on a card when I only update every 2 generations. I will skip these cards based solely on that, because while it may not be important to you, it is to me due to the length of time I keep a GPU.

Not winning any points with me; my cards before the GTX670s were GTX470s. I generally upgrade every 2 generations as well. My GTX470s didn't support DX11.1, and it didn't make one bit of difference, just like the GTX700 series not supporting DX11.2. If no game is going to use it, what is the point of skipping a card just because it doesn't support it? Even if you do keep them for several years, and even if a few games eventually do support DX11.2, the improvements are extremely minor. But as I already pointed out, in the PC market it isn't likely any games will really support DX11.2, because Win7 doesn't support DX11.2. And as I pointed out already, it doesn't look like your choices from AMD are going to be any better, as they don't support DX11.2 either.
 
Looks like someone got kidnapped by M$ marketeers. I wouldn't be concerned until there is a dx 11.2 game out. There aren't even any in development (that anyone knows of) right now. So.. say in three to five years when there are actual dx 11.2 titles out, the card will most likely be obsolete anyways. Oh, remember to shell out for Windows 8.1 as well.

Whilst I agree entirely, I have to say it's something Nvidia do need to work to resolve, at least 11.1 via a driver update.
Only because the likes of me, who upgrade graphics cards years apart and with the future in mind, might steer towards AMD, especially in half a year to a year's time, when the first hints of a possible in-game use (of DX11.1/11.2) come out.
 
DX10 > DX10.1, same scenario all over again with DX11. I seriously doubt that incredibly minor .1 / .2 extends to more than some DX API enhancements, which all current DX11 hardware should be capable of.
 
Whilst I agree entirely, I have to say it's something Nvidia do need to work to resolve, at least 11.1 via a driver update.
Only because the likes of me, who upgrade graphics cards years apart and with the future in mind, might steer towards AMD, especially in half a year to a year's time, when the first hints of a possible in-game use (of DX11.1/11.2) come out.

Well, at least with DX11.1 they do support the important features; the ones they don't support are likely never going to be used in a game, and if they are, using the DX11 method isn't going to matter (really, who cares about 2D rendering improvements at this point?).
 
DirectX 11.2 allows the CPU and GPU to share memory space. It's for the APUs in the consoles, and next-gen AMD chips like Kaveri, which will be out at the end of the year.

That's why the update is required: this is a very different way of doing games, since the CPU and GPU will not have their own separate memory spaces.

It's also not really backwards compatible, and what it is for is also why NVidia doesn't support it... they don't have a device that has both the CPU and GPU sharing a memory buffer.

Interesting comments, however, about nothing that is even related to DX 11.2...it's pretty ridiculous that anyone expects a discrete card to need it.
 
And the number of games released as of Jun 29 2013 that require DX 11.2 support is? :wtf:
 
Great little article. I can see how the tech was developed for consoles and spread back to the PC.


The article says some large number of Nvidia GPUs support tiled resources, not that they support DX 11.2. One can assume tiled resources can run on most/all DX11 hardware.

Pretty much. I don't get why some people get / got all in an uproar over DX10.1 and DX11.1/.2 when their GPU "only" supports DX10/11... it just doesn't seem worth the time, to me, to get in an uproar over something like that.
 
Great little article. I can see how the tech was developed for consoles and spread back to the PC.


The article says some large number of Nvidia GPUs support tiled resources, not that they support DX 11.2. One can assume tiled resources can run on most/all DX11 hardware.

Well this Tiled Resources tech is part of the DX11.2 spec, which obviously adds to the confusion. :laugh:
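For what it's worth, with the 8.1 runtime the Tiled Resources support is queried as an optional cap, separately from the feature level, which is probably why the two get conflated (sketch assuming the Windows 8.1 SDK headers; dev is an existing ID3D11Device):

#include <d3d11_2.h>

// Tiled Resources is an optional capability, not a new feature level.
D3D11_FEATURE_DATA_D3D11_OPTIONS1 opts1 = {};
HRESULT hr = dev->CheckFeatureSupport(D3D11_FEATURE_D3D11_OPTIONS1,
                                      &opts1, sizeof(opts1));
if (SUCCEEDED(hr) && opts1.TiledResourcesTier != D3D11_TILED_RESOURCES_NOT_SUPPORTED) {
    // The GPU/driver pair exposes some tier of tiled resources.
}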
 
The Xbox One will support DX11.2; I'm sure the custom APU that AMD is building for it supports DX11.2. However, since none of the current PC hardware supports it, that means none of the devkits support it, so I can guarantee none of the launch games will use it.

I wonder if it's a case that the GCN 2.0 architecture technically supports the "new" features of DX11.2 but doesn't officially support it? Kind of like the 10.1 features in Assassin's Creed.

Of course, in the real world it will make bugger all difference to any of us.
 