Monday, September 17th 2018

NVIDIA Segregates Turing GPUs; Factory Overclocking Forbidden on the Cheaper Variant

While working on GPU-Z support for NVIDIA's RTX 20-series graphics cards, we noticed something curious. Each GPU model has not one, but two device IDs assigned to it. A device ID is a unique identifier that tells Windows which specific device is installed, so it can select and load the relevant driver software. It also tells the driver which commands to send to the chip, as these vary between generations. Last but not least, the device ID can be used to enable or lock certain features, for example in the professional space. Two device IDs per GPU is very unusual. For example, all GTX 1080 Ti cards, whether reference or custom design, are marked as 1B06. Titan Xp on the other hand, which uses the same physical GPU, is marked as 1B02. NVIDIA has always used just one ID per SKU, no matter whether custom design, reference, or Founders Edition.
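To illustrate what such an ID looks like in practice, here is a minimal Python sketch (not GPU-Z's actual code) that parses the vendor and device ID out of a Windows PNP hardware-ID string; the SUBSYS value in the example is made up:

```python
import re

def parse_pci_hardware_id(hardware_id: str) -> dict:
    """Extract the vendor and device IDs from a Windows PNP hardware-ID
    string such as 'PCI\\VEN_10DE&DEV_1B06&SUBSYS_...'. Windows uses
    these strings to match an installed device to its driver."""
    match = re.search(r"VEN_([0-9A-F]{4})&DEV_([0-9A-F]{4})", hardware_id, re.I)
    if not match:
        raise ValueError(f"not a PCI hardware ID: {hardware_id!r}")
    return {"vendor_id": match.group(1).upper(),
            "device_id": match.group(2).upper()}

# A GTX 1080 Ti, reference or custom, reports device ID 1B06
# (vendor 10DE is NVIDIA; the SUBSYS portion here is a placeholder):
print(parse_pci_hardware_id(r"PCI\VEN_10DE&DEV_1B06&SUBSYS_85E51043"))
```

The point of the sketch is simply that the device ID is a fixed field baked into the hardware's identification string, which is why two IDs per SKU is immediately visible to tools like GPU-Z.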

We reached out to industry sources and confirmed that for Turing, NVIDIA is creating two device IDs per GPU to correspond to two different ASIC codes per GPU model (for example, TU102-300 and TU102-300-A for the RTX 2080 Ti). The Turing -300 variant is designated for use on cards targeting the MSRP price point, while the -300-A variant is for use on custom-design, overclocked cards. Both are the same physical chip, separated only by binning and pricing, which means NVIDIA pre-tests all GPUs and sorts them by properties such as overclocking potential, power efficiency, etc.
When a board partner uses a -300 Turing GPU variant, factory overclocking is forbidden; only the more expensive -300-A variants are meant for that scenario. Both can still be overclocked manually by the user, but it's likely that the overclocking potential of the lower bin won't be as high as that of the higher-rated chips. Separate device IDs could also prevent consumers from buying the cheapest card, with reference clocks, and flashing it with the BIOS from a faster factory-overclocked variant of that card (think buying an MSI Gaming card and flashing it with the BIOS of the Gaming X).
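How a flashing tool could use these IDs to block such cross-flashing can be sketched as follows. This is a hypothetical check, not NVIDIA's actual nvflash logic, and the IDs used are placeholders:

```python
def bios_flash_allowed(card_device_id: str, bios_device_id: str) -> bool:
    """Hypothetical pre-flash check: accept a vBIOS only if its PCI
    device ID matches the installed card's device ID."""
    return card_device_id.upper() == bios_device_id.upper()

# With one ID per SKU (pre-Turing), the cheap and the factory-OC card
# share an ID, so a cross-flash passes. With separate IDs per bin,
# the cross-flash is rejected ("ABCD"/"ABCE" are made-up IDs):
print(bios_flash_allowed("ABCD", "ABCD"))  # same bin: True
print(bios_flash_allowed("ABCD", "ABCE"))  # different bin: False
```

Under this scheme the device ID effectively becomes a bin fence: the same physical chip, but the lower bin's vBIOS ecosystem is walled off from the higher one.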

All Founders Edition and custom designs that we could look at so far use the same -300-A GPU variant, which means the device ID is not used to separate Founders Edition from custom-design cards.

90 Comments on NVIDIA Segregates Turing GPUs; Factory Overclocking Forbidden on the Cheaper Variant

#76
Vayra86
Captain_Tom said:
No matter how you dice it, the 2070 is on the lower end of the bracket pal. Period. Also this is a new node, so you can't compare die sizes to the other nodes (And like you said, much of the die size is wasted for non-gaming uses).

That would be like saying the 1080 Ti is almost the same as the GTX 970 since they have similar die sizes - except you can't say that because you are comparing 16nm dies to 28nm dies. They are different processes with different capabilities.

Fact:

-100
-102
-104 Midrange
-106 Low End
-108
Just noticed this now, but if this is what you want to believe, be my guest, not stopping ya. It's just completely off from reality. 'It's a new node' - yes, but it's not some new alien dimension we speak of, it's simple math and measurement. The node got x% smaller, the die got x% bigger, and the transistor counts have skyrocketed. Result: the die is more expensive / difficult to make. If you want to argue that... the asylum is the other way, "pal".

Naming schemes, product stacks and pricing are all artificial and abstract. You need to look at absolutes: die size, transistor count, bus width and VRAM system, and the board design. Those are indicators that tell you how much it costs to produce a GPU, and they affect yields. A great example of how things can change is the way Nvidia used the first Titan. In the end, it became a 'budget friendly' 780, barely losing performance. That was an x80 product using a Gx100 chip, while the x80 Ti was essentially a full-fat Gx100. Just a year earlier, the same company used a 104 to create an x70 and x80. See how these things shift?
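The die-size cost argument can be made concrete with the standard first-order dies-per-wafer approximation. The die areas below are the public figures for GP104 (GTX 1080) and TU104 (RTX 2080); everything else is illustrative, not Nvidia's actual cost data:

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order candidate-die count: usable wafer area divided by die
    area, minus an edge-loss term proportional to the wafer circumference."""
    radius = wafer_diameter_mm / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Illustrative comparison on a 300 mm wafer:
for name, area in [("GP104 ~314 mm²", 314), ("TU104 ~545 mm²", 545)]:
    print(name, dies_per_wafer(area))  # roughly 187 vs. 101 candidates
```

Fewer candidate dies per wafer, on top of the fact that a larger die is statistically more likely to contain a defect, is why the cost per good die climbs faster than the area itself.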

You need to get your 'facts' straight.
Posted on Reply
#77
Captain_Tom
Vayra86 said:
Just noticed this now, but if this is what you want to believe, be my guest, not stopping ya.
You think I "want" this reality? I "want" a xx106 card being sold for $600 to morons who make it sell out overnight?

Yeah, I will say it again: rationalize Nvidia selling low end for $600 all you want. Drink dat Koolaid Bro! Hope that 10-20% performance gain is worth it for those who rationalize this as a "high end" card lol.

High = top of the totem pole. The 2070 won't be half as strong as a full Turing card lol. That means it is barely mid range at best...
Posted on Reply
#78
Vayra86
Captain_Tom said:
You think I "want" this reality? I "want" a xx106 card being sold for $600 to morons who make it sell out overnight?

Yeah, I will say it again: rationalize Nvidia selling low end for $600 all you want. Drink dat Koolaid Bro! Hope that 10-20% performance gain is worth it for those who rationalize this as a "high end" card lol.

High = top of the totem pole. The 2070 won't be half as strong as a full Turing card lol. That means it is barely mid range at best...
Once again: price is abstract, and strangely you still only focus on that and the name they've given it. And the reason it is very high now is *also* because there is no competition in the segment RTX operates in. The ratio I just used for Nvidia's large GPUs is the same one we ALL used for Intel's large monolithic CPU designs - we ALL noticed how Zen completely changed the game with a different kind of design. And those Intel CPUs aren't even remotely as large as what you see on GPUs.

I know this is hard to swallow, but the reality is that large dies are costly, and that means a top-end GPU can and will see price changes depending on its size; it can even be pushed out of the gaming market altogether because it simply isn't profitable to make one for gamers (the history of Titan in a nutshell). Gamers, mind you, who are more concerned with 'top of the totem pole epeen' than they are with realistic numbers and facts.

There is a difference between pointing out why something is the way it is, and agreeing with it. I've always said Turing and its large dies are a wasteful practice with questionable returns. I would have much rather seen Pascal ported to 12nm and the die size used for raw performance. Then we could justify the current price point.
Posted on Reply
#79
Captain_Tom
Vayra86 said:
Once again: price is abstract and strangely you still only focus on that and the name they've given it.
I am not focusing on names, I am focusing on the positioning of the names.

-815=V100/T100 (815 is a bigger number than the others!)
-715=TU102
-545=TU104
-445=TU106 (V100 is 80% bigger)
-300=TU116 (This is less than half as big as the biggest number)
-<200=TU118

See the 2070 at the bottom of midrange? Let me look up the definition of "middle" for you: "at an equal distance from the extremities of something; central."

That is my entire point, that the 2070 is half of the performance Nvidia could be bringing to the table right now. I do not call that anything short of what it literally is: half of an Enthusiast card. At best you could compare this to cards like the GTX 660 Ti and R9 380X - half of the top card.

Your argument that "things got more expensive" is also complete BS. It is not this much more expensive. The bloody GTX 580 sold for $499 with a die almost as big as the TU104, and it had TERRIBLE yields on 40nm at the time. 12nm has no such yield problems, and in fact it was built for good yields on large cards.

You cannot compare die size between nodes! Middle is Middle.
Posted on Reply
#80
Vayra86
Captain_Tom said:
I am not focusing on names, I am focusing on the positioning of the names.

-815=V100/T100 (815 is a bigger number than the others!)
-715=TU102
-545=TU104
-445=TU106 (V100 is 80% bigger)
-300=TU116 (This is less than half as big as the biggest number)
-<200=TU118

See the 2070 at the bottom of midrange? Let me look up the definition of "middle" for you: "at an equal distance from the extremities of something; central."

That is my entire point, that the 2070 is half of the performance Nvidia could be bringing to the table right now. I do not call that anything short of what it literally is: half of an Enthusiast card. At best you could compare this to cards like the GTX 660 Ti and R9 380X - half of the top card.

Your argument that "things got more expensive" is also complete BS. It is not this much more expensive. The bloody GTX 580 sold for $499 with a die almost as big as the TU104, and it had TERRIBLE yields on 40nm at the time. 12nm has no such yield problems, and in fact it was built for good yields on large cards.

You cannot compare die size between nodes! Middle is Middle.
With the minor exception that the V100 was never a consumer card. GPUs have evolved and are used for different purposes now. All of the previous generations used a 104 for the x70; Turing (along with Volta) is the first generation to use special cores on top of it, which requires a larger variation in die size. We're still talking about large dies, up to and including the 106.
Posted on Reply
#81
Jism
It's all about maximizing profits: stop the practice of consumers buying the cheaper Turing GPUs and overclocking them to match the premium models.

Nvidia is starting to look like Apple. Charging $1,200 for a PHONE!
Posted on Reply
#82
TopHatProductions115
FreedomEclipse said:
So....

In short, we are going to be needing software mods to get overclocking back?? Challenge accepted

(not by me of course, I wouldn't know where to start, but there are lots of smart people on the internet)


Ahhh I get it now.
XD What's next? Will nVIDIA go to war with driver/vBIOS modders through driver/device updates (GeForce Experience cringe to the max) like Apple did with the jailbreaking community before Cydia passed on?
Posted on Reply
#83
theoneandonlymrk
yakk said:
Yup, and nvidia controls which dies, quality, and can charge the AIB more. Same formula the AIBs were charging on the end user, except now nvidia also gets some more of that extra profit.
Whilst all that's true, I don't like it as a consumer, and Nvidia won't be getting my money this year. I'm fine with the middle being high end; they are in a different feature ballpark after all, with a higher DX level. But further segregation is too much: a 2070 should just be a 2070 chip, not an -A or B grade, etc.

With auto-clocking you sort of get the performance you directly pay for, with regard to the cooling attached and the VRM design, so overclocking is mostly for benchmarks now, but it's nice to have the option on my personal possessions.
Posted on Reply
#84
RealNeil
theoneandonlymrk said:
I don't like it as a consumer
This,.....

I think that they were spoiled by the mining-trend profit avalanche and that they're doing everything they can to milk us for more cash now that it's over.
They rightly noticed that gamers were still buying their products when prices were so grossly inflated, and they're counting on us to continue doing just that.

The gouge is the new norm. (It's bullshit too.)

I'm ready to never buy into 20-series GPUs (and beyond) as a way to protest, because voting with your wallet is most effective.
Their hobbling of mid-range cards' SLI capabilities was another step in the rape of the gaming market.

I really don't have to own the very best GPUs on the market. Good GPUs will be fine for me. Ones that I can Crossfire together are key for me.

NVIDIA can kiss my ass.
Posted on Reply
#85
yakk
RealNeil said:
This,.....

I think that they were spoiled by the mining-trend profit avalanche and that they're doing everything they can to milk us for more cash now that it's over.
They rightly noticed that gamers were still buying their products when prices were so grossly inflated, and they're counting on us to continue doing just that.

The gouge is the new norm. (It's bullshit too.)

I'm ready to never buy into 20-series GPUs (and beyond) as a way to protest, because voting with your wallet is most effective.
Their hobbling of mid-range cards' SLI capabilities was another step in the rape of the gaming market.

I really don't have to own the very best GPUs on the market. Good GPUs will be fine for me. Ones that I can Crossfire together are key for me.

NVIDIA can kiss my ass.
Not only mining; I would even say mining is probably far from their main reason.... Cards like the xx80 Ti & Titan have always purposefully pushed the upper limit of current pricing.

They are priced disproportionately high, and yet... people buy them! So is it that these cards are priced too high, or are the average cards priced too low?

Well... IMO Nvidia is testing this new high pricing. With the glut of inventory they always have, they really have nothing to lose by testing increased pricing, Apple-style. If their sales don't suffer, then you'd better bet this will be the new pricing norm.

So consumers are determining for themselves what price is fair market value. Not buying into the new high prices is the only way to avoid massive, permanent price hikes.

Heck, even AMD saw this and apparently changed their plans for their (very expensive to produce) Vega 2 Instinct cards, which they seemingly never thought of selling as consumer cards until they saw Nvidia pulling off *massive* margins on its new parts.
Posted on Reply
#86
RealNeil
yakk said:
Not buying into the new high prices is the only way to avoid massive, permanent price hikes.
Agreed.
I'm doing pretty well for GPUs right now. I have one 1080 Ti, one Vega 56, two 1080 FEs, two 1070 Tis, and two Vega 64s.
Any new purchases will be AMD-based until NVIDIA stops with their reindeer games.
Posted on Reply
#87
hat
Enthusiast
AMD isn't just gonna let nVidia buttplunder the high-end market. There are lots of people who don't buy these cards, or buy them begrudgingly, who would prefer a competitive product from AMD that doesn't cost as much. This would be the perfect time for AMD to strike back. nVidia wants an arm, a leg and 1.5 kidneys for a high-end card... if AMD can sell one for less, they'll make a good profit on it.
Posted on Reply
#88
TheHunter
And what's funny is that all the cards with -A were blower-style GPUs; Geizhals had them listed, but then removed its sub-directory.
Posted on Reply
#89
RealNeil
hat said:
nVidia wants an arm, a leg and 1.5 kidneys for a high end card... if AMD can sell one for less, they'll make a good profit on it.
I'll surely be buying them if they make them. They don't have to be the best, all they have to do is be GOOD.
Posted on Reply
#90
hat
Enthusiast
RealNeil said:
I'll surely be buying them if they make them. They don't have to be the best, all they have to do is be GOOD.
I think a lot of people are in that boat. nVidia's high-end products are just too expensive, and a lot of people are unable or unwilling to pay those prices to play video games.
Posted on Reply