Tuesday, October 13th 2020

NVIDIA Reportedly Moving Ampere to 7 nm TSMC in 2021

A report straight from DigiTimes claims that NVIDIA is looking to move its Ampere consumer GPUs from Samsung's 8 nm node to TSMC's 7 nm. According to the source, the volume of this transition should be "very large", but most likely wouldn't cover the entirety of Ampere's consumer-facing product stack. The report claims that TSMC has become more "friendly" towards NVIDIA. This could be because TSMC now has spare 7 nm manufacturing capacity as some of its clients move to the company's 5 nm node, or simply because TSMC previously didn't believe NVIDIA would actually treat Samsung as a viable foundry alternative - which it clearly now does - and has lowered its pricing accordingly.

Various reasons are being put forward for this, none with substantial grounds beyond "reported from industry sources". NVIDIA chasing better yields is one of the cited reasons, as is its history as a TSMC customer. Porting manufacturing to TSMC shouldn't cost NVIDIA much in terms of silicon-level design changes needed to accommodate the characteristics of TSMC's 7 nm, because the company's GA100 GPU (Ampere for the non-consumer market) is already manufactured at TSMC. The next part of this post is mere (relatively informed) speculation, so take it with an even saltier disposition than what came before.
NVIDIA tiering its manufacturing process across the product stack (7 nm for the high-end, 8 nm for the rest) would become a headache for both the company and consumers if NVIDIA simply kept two suppliers for the same graphics products. There would likely be changes needed to power delivery designs, a range of new quality assurance tests would have to be carried out on the new silicon, and NVIDIA would set itself up for legal trouble if it silently updated the manufacturing process on high-end models - not only would early adopters be understandably miffed that their product had quietly evolved over time, but there could also be claims from buyers who ended up with 8 nm-based models after the 7 nm ones had launched. And if NVIDIA were to put a sticker on retail boxes flagging the change from 8 nm to 7 nm, then any user could simply decline to purchase the 8 nm cards and hold out for the 7 nm versions, which might leave NVIDIA with a stock of cards it couldn't move.

We don't think that will happen. If this report checks out, NVIDIA will likely launch the newly produced top-end Ampere cards (we're thinking RTX 3090, RTX 3080 and RTX 3070) as 7 nm Super versions, taking a page from its RTX 20-series book. Introducing higher-performing 7 nm products as a whole new series would shield NVIDIA from legal trouble while allowing it to publicly announce the transition. It would also keep early adopters "happy" in the sense that this is a whole new product launch - users would take that far better than feeling like beta testers for NVIDIA's tango with Samsung, as the reception of the 20-series Super cards showed. The new process would also allow NVIDIA to push performance further beyond the original 30-series cards, thanks to lower leakage and higher attainable operating frequencies - perhaps on top of repeating NVIDIA's 20-series strategy of trickling bigger GPU designs down the product stack.
Source: DigiTimes

120 Comments on NVIDIA Reportedly Moving Ampere to 7 nm TSMC in 2021

#51
Exyvia
Colddecked: Use both Samsung and TSMC in the same SKU? That's impossible, they just bought the damn wafers lol... These TSMC 7nm ampere's, if true, will almost certainly carry a super tag or even 4000 series if improvements are big...
You cannot use the same core design for both Samsung and TSMC due to contracts; they'll need to develop another - albeit similar - design to migrate to TSMC.

Samsung 8nm is equivalent to TSMC 10nm which is equivalent to Intel 14nm.
Posted on Reply
#52
quadibloc
Given how hard it has been for Nvidia to meet demand for the 3080 (and even the 3090!) no wonder they want to make use of as much capacity as possible. Whether the 7nm GPUs will have names like 3080 SUPER instead of, say, 3080+ or 3080A with maybe only a very slight performance increase, though, is something we'll have to wait and see.
Posted on Reply
#53
Nephilim666
Unless there is full disclosure of shipped units the whole demand/supply debate is fruitless. I would assume that they are severely supply constrained looking at the shipped numbers that have leaked in specific regions.
I'm upgrading from Vega 64 so I've saved and ordered a 3090. If AMD announce a GPU that is better than a 3080 in terms of performance at 4k with RT on then I'll just cancel my 3090 preorder and wait for the AMD card since it will likely be vastly more efficient.
Posted on Reply
#54
Camm
I've wondered how Nvidia could respond to RDNA2, as the 3080/3090 really don't have any way to make a Super/Ti version.

Guess we just found out: move to TSMC 7nm, and clocks above 2 GHz should become achievable - there's your Super parts, phase out non-Super.

That being said, how much capacity does TSMC 7nm actually have for Nvidia, since WSAs (wafer supply agreements) are usually negotiated a year or so in advance?
Posted on Reply
#55
Minus Infinity
Hard to have sympathy for early adopters; they regularly get screwed over and probably repeat the same mistake every time there's a new product. Roll on Big Navi.
Posted on Reply
#56
Imsochobo
quadibloc: Given how hard it has been for Nvidia to meet demand for the 3080 (and even the 3090!) no wonder they want to make use of as much capacity as possible. Whether the 7nm GPUs will have names like 3080 SUPER instead of, say, 3080+ or 3080A with maybe only a very slight performance increase, though, is something we'll have to wait and see.
It's not a capacity issue.
Look at the etailers that sometimes disclose incoming stock and how little they actually receive - is 200 GPUs in a month really "a lot of demand" when they were selling 1,000 a month of a single SKU every month prior to the announcement?

Samsung 8nm might not be yielding these behemoth chips, maybe G6X production is struggling - there are a lot of new things here for Nvidia, and they rushed it a bit too.
Posted on Reply
#57
Mussels
Freshwater Moderator
Shit like this makes me wonder if one day I'll regret my 3080 purchase...

...if it's even arrived before 2021, that is.
Posted on Reply
#58
wolf
Performance Enthusiast
This was bound to happen sooner or later IMO, not fussed at all. Products get iterated on; it's not the first time we've seen it and it will be far from the last.
Vayra86: Stick that 10GB POS where the sun don't shine tyvm
Damn it's a pity it's so fast and runs so cool and quiet, otherwise I guess I'd be pissed.
Posted on Reply
#59
Prima.Vera
Vayra86: Yep...

I'm going to sit comfortably on this 1080 for another year, it just got confirmed. Stick that 10GB POS where the sun don't shine tyvm

Smoke > Fire. Always
What's wrong with 10GB of VRAM? You are playing on a 1080p monitor bro :laugh: :laugh: :laugh: :laugh: :slap:
You only need that much if you play at 4K with resolution scaling and all that BS
Posted on Reply
#60
Mussels
Freshwater Moderator
I mean, as someone who'd have got a 3090 if they were in stock, even I accept 10GB is enough for a very long time... games will simply be optimised for it. 8GB is still gonna be the sweet spot for a long while yet; since it's so common, games will simply target that.
Posted on Reply
#61
Minus Infinity
Prima.Vera: What's wrong with 10GB of VRAM? You are playing on a 1080p monitor bro :laugh: :laugh: :laugh: :laugh: :slap:
You only need that much if you play at 4K with resolution scaling and all that BS
What sort of knob would buy a 3080 for 1080p gaming? It's a 1440p ultra-high-settings or 4K card all the way. 1080p will be handled with ease by a 3060 for $300 less.
Posted on Reply
#62
Mussels
Freshwater Moderator
Minus Infinity: What sort of knob would buy a 3080 for 1080p gaming? It's a 1440p ultra-high-settings or 4K card all the way. 1080p will be handled with ease by a 3060 for $300 less.
High-refresh 1080p totally makes sense for them? Or a 1080p user + VR...

WE DON'T JUDGE AROUND HERE. EXCEPT APPLE PRODUCTS.
Posted on Reply
#63
renz496
Camm: I've wondered how Nvidia could respond to RDNA2, as the 3080/3090 really don't have any way to make a Super/Ti version.

Guess we just found out: move to TSMC 7nm, and clocks above 2 GHz should become achievable - there's your Super parts, phase out non-Super.

That being said, how much capacity does TSMC 7nm actually have for Nvidia, since WSAs (wafer supply agreements) are usually negotiated a year or so in advance?
Just look at how big Nvidia's A100 chip is. If they move to TSMC, the limit for their biggest chip goes up to around 800 mm²; right now GA102 is 628 mm² on Samsung's 8nm process node. Need more performance? Just make a bigger chip. Capacity-wise, TSMC could give some to Nvidia - AMD is most likely not going to get all of the space being left by Apple and Huawei. Even if WSAs need to be negotiated much earlier, Nvidia has been one of TSMC's longest-standing partners, going back to the late 90s. In 2012 Nvidia complained about not getting enough capacity to meet demand, and a few weeks later TSMC publicly announced that it was giving some of AMD's and Qualcomm's 28nm capacity to Nvidia. They're not going to take AMD's capacity this time, but TSMC could reserve some of the space left by Apple and Huawei for Nvidia. Nvidia, after all, still makes its A100 at TSMC.
Posted on Reply
#64
Camm
renz496: Just look at how big Nvidia's A100 chip is. If they move to TSMC, the limit for their biggest chip goes up to around 800 mm²
A completely new design, sure, but I highly doubt Nvidia can bring a new chip to market in a year. It'll be an iteration on what they have. A100 is also too far removed from GA102 to be usable. A non-cut GA102 would be the largest Nvidia could make, which, with enhanced yields, is certainly possible, but as we've seen from the 3080 -> 3090, there's fuck all performance left in terms of size increase.
renz496: AMD is most likely not going to get all of the space being left by Apple and Huawei.
AMD has repeatedly been shown by TSMC to be a priority customer for 7nm capacity, which is likely due to AMD's wafer agreements having options for extra capacity as it becomes available, whereas Nvidia likely does not have those options in place.
Posted on Reply
#66
renz496
Camm: A completely new design, sure, but I highly doubt Nvidia can bring a new chip to market in a year. It'll be an iteration on what they have. A100 is also too far removed from GA102 to be usable. A non-cut GA102 would be the largest Nvidia could make, which, with enhanced yields, is certainly possible, but as we've seen from the 3080 -> 3090, there's fuck all performance left in terms of size increase.

AMD has repeatedly been shown by TSMC to be a priority customer for 7nm capacity, which is likely due to AMD's wafer agreements having options for extra capacity as it becomes available, whereas Nvidia likely does not have those options in place.
Even so, TSMC cannot give AMD all of that extra capacity. TSMC, for its part, also needs to look after its relationships with other chip makers - hence, when Apple offered to make TSMC its exclusive foundry, TSMC refused. TSMC can't burn bridges with other chip makers just to satisfy AMD's needs. And look at what's being said in this article in the first place: TSMC is becoming more "friendly" towards Nvidia - that's more or less TSMC asking Nvidia to fab its stuff at TSMC. It's not just about relationships with various chip makers; TSMC is also wary of its competitor in the fab business, namely Samsung. With TSMC overcrowded, some of the big names in the industry (like Qualcomm) have started looking at Samsung as an alternative to fab their chips. Nvidia ending up using Samsung for all its gaming Ampere chips is a big win for Samsung, because Nvidia is giving Samsung experience it never had before: making big chips, which in the past only TSMC was capable of. If Samsung keeps attracting those big chip makers to its fabs, in a few years Samsung will close the gap with TSMC - Samsung might well win the Apple contract once again. Working with the likes of Nvidia and Qualcomm could increase Samsung's expertise in the future. TSMC is well aware of this, because some of the success it has today can be attributed to the crazy work it did with Nvidia going back to the late 90s.
Posted on Reply
#67
PerfectWave
Luminescent: Suureee, reality hits when you look at Steam hardware survey.
The Steam survey even asked to collect data from my virtual machine lol... some proof that is!
Posted on Reply
#68
Vayra86
ODOGG26: Can't believe people really believe this. For one, there's not enough 7nm wafers for them now. Two, do you know how long it would take for this if even possible to come to market (guess is like 6-8 months) and then be obsolete months later once Hopper comes out. Some people will believe anything these days without even stopping to think for a second. AMD has TSMC 7nm pretty much locked up. Their first in line for any additional wafers that comes available. Plus that would be too costly for NVIDIA to change to TSMC for only a few months to then change to maybe 5nm for hopper months later. :banghead:.
Refresh / SUPER line up. Just wait for it.
Posted on Reply
#69
Mastakony
ZoneDymo: wonder how rich the people at TSMC must be, also poor samsung
Samsung won't take offense...
Nvidia is just ridiculous for this move, and it proves that RTX 3000 was a rushed release because they fear AMD...
AMD has a REAL new architecture...

The only reason I could buy an RTX is my Nvidia G-Sync screen
Posted on Reply
#70
Flanker
Vayra86: Refresh / SUPER line up. Just wait for it.
Reminds me of the 8800GTX --> 9800GTX days
Posted on Reply
#71
Shatun_Bear
A 320 W, 10GB 3080 will always be a joke. This is another slap in the face from Nvidia to their loyal fans. "How much more can they take?" seems to be Huang's game.

Because a 20GB 3080 on TSMC will not only clock higher but do so whilst consuming ~30% less power.
Chrispy_: Samsung's nodes aren't up to TSMC's standards, but what about GloFo, are they now such a mess that nobody in the CPU or GPU business dares touch them?
GloFo couldn't get their 7nm up and running, so they abandoned chasing the cutting edge of node development and are concentrating on 12nm and older nodes for customers not at the bleeding edge.
Posted on Reply
#72
Vayra86
Shatun_Bear: A 320 W, 10GB 3080 will always be a joke. This is another slap in the face from Nvidia to their loyal fans. "How much more can they take?" seems to be Huang's game.

Because a 20GB 3080 on TSMC will not only clock higher but do so whilst consuming ~30% less power.

GloFo couldn't get their 7nm up and running, so they abandoned chasing the cutting edge of node development and are concentrating on 12nm and older nodes for customers not at the bleeding edge.
I wonder what pixie dust you're on lately but 30%? Lmao. Don't get too enthusiastic now
Posted on Reply
#73
Chrispy_
Vayra86: I wonder what pixie dust you're on lately but 30%? Lmao. Don't get too enthusiastic now
The only same-product, different-process jump I can think of was Vega64 > Radeon VII and that was about 30% faster for the same power draw (or you could save 30% power to match the Vega64's performance).

It's not a simple apples-to-apples number, as this was GloFo to TSMC, 8GB to 16GB, and 64 CU vs 60 CU, but it's probably the closest 14nm-to-7nm comparison there is, because at least the class of product and architecture remain completely unchanged (both a ~300W, 60-ish CU, HBM2, GCN 5 design running on identical drivers and software stack, both available on the market simultaneously).

30% may not be accurate but it's kind of in line with TSMC's claims about their 7nm node - when it first launched TSMC were saying that it could go 30% faster at the same power budget or up to 50% lower power at the same performance levels. "Up to" comes with all the usual marketing-speak caveats, ofc.
Posted on Reply
#74
EarthDog
Vayra86: I wonder what pixie dust you're on lately but 30%? Lmao. Don't get too enthusiastic now
Whatever it is... it's clearly uncut. :p
Posted on Reply
#75
Colddecked
Minus Infinity: What sort of knob would buy a 3080 for 1080p gaming? It's a 1440p ultra-high-settings or 4K card all the way. 1080p will be handled with ease by a 3060 for $300 less.
Sometimes people don't upgrade their monitors until they have the graphics card to do so... that's what I would do honestly.
Posted on Reply