Sunday, August 13th 2023

NVIDIA Blackwell Graphics Architecture GPU Codenames Revealed, AD104 Has No Successor

The next-generation GeForce RTX 50-series graphics cards will be powered by the Blackwell graphics architecture, named after American mathematician David Blackwell. kopite7kimi, a reliable source for NVIDIA leaks, revealed what the lineup of GPUs behind the series could look like. It will reportedly be led by the GB202, followed by the GB203, then the GB205 and GB206, and finally the GB207 at the entry level. What's surprising here is the lack of a "GB204" to succeed the AD104, GA104, TU104, and a long line of successful performance-segment GPUs from NVIDIA.

The GeForce Blackwell ASIC series begins with "GB" (for GeForce Blackwell) followed by a 200-series number. The last time NVIDIA used 200-series ASIC numbers for GeForce GPUs was with "Maxwell," as those GPUs ended up being built on a more advanced node, and with a few more advanced features, than the architecture was originally conceived for. For "Blackwell," the GB202 logically succeeds the AD102, GA102, TU102, and a long line of "big chips" that have powered the company's flagship client graphics cards. The GB203 succeeds the AD103 as a high SIMD-count GPU with a narrower memory bus than the GB202, powering the #2 and #3 SKUs in the series. Curiously, there is no "GB204."
NVIDIA's xx04 ASICs have powered a long line of successful performance-segment through high-end SKUs, such as the TU104 behind the RTX 2080, and the GP104 behind the immensely popular GTX 1080 and GTX 1070 series. That class of chip has, however, been missing the mark for the past two generations. The "Ampere" based GA104 powering the RTX 3070 may have sold in volume, but its maxed-out variant, the RTX 3070 Ti, hasn't quite sold in numbers, and missed the mark against the similarly priced Radeon RX 6800. Even with Ada, while the AD104 powering the RTX 4070 may be selling in numbers, the maxed-out chip powering the RTX 4070 Ti misses the mark against the similarly priced RX 7900 XT. This has caused NVIDIA to introduce the AD103 in the desktop segment, a high CUDA core-count silicon with a mainstream 256-bit memory bus, to justify high-end pricing; that approach will continue in the GeForce Blackwell generation with the GB203.
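To put rough numbers on that trend, the share of the flagship x02 die that the x04 chip represents has been shrinking generation over generation. The sketch below is a back-of-the-envelope illustration using the commonly cited full-die CUDA core counts for each chip (figures from public spec listings, not from this leak):

```python
# Back-of-the-envelope look at how much of the flagship (x02) silicon the
# performance-segment (x04) chip has offered in recent generations.
# Core counts are the commonly cited full-die CUDA core figures.
full_die_cores = {
    "Pascal": {"GP102": 3840, "GP104": 2560},
    "Turing": {"TU102": 4608, "TU104": 3072},
    "Ampere": {"GA102": 10752, "GA104": 6144},
    "Ada":    {"AD102": 18432, "AD104": 7680},
}

for gen, dies in full_die_cores.items():
    big, mid = sorted(dies.values(), reverse=True)
    print(f"{gen:7s}: x04 offers {mid / big:.0%} of the x02 die")
```

On those figures, the x04 slips from roughly two-thirds of the flagship in the Pascal and Turing generations to well under half with Ada, which is consistent with the positioning problems described above.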

As with the AD103, NVIDIA will leverage the high SIMD count of the GB203 to power high-end mobile SKUs. The introduction of the GB205 ASIC could be an indication that NVIDIA's performance-segment GPU will come with a feature set that avoids the kind of controversy NVIDIA faced when trying to carve out the original "RTX 4080 12 GB" from the AD104 and its narrow 192-bit memory interface.
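Piecing the leak together, a minimal sketch of how the rumored Blackwell dies line up against their Ada counterparts is given below. The GB202/AD102 and GB203/AD103 pairings come from the report itself; the GB206/AD106 and GB207/AD107 pairings are inferred from the numbering, and the segment labels are this article's interpretation rather than anything confirmed by NVIDIA:

```python
# Rumored GeForce "Blackwell" dies, the Ada dies they notionally follow, and the
# segment each is expected to cover. Pairings below GB203 are inferred from the
# numbering; segment labels are interpretive, not official.
blackwell_lineup = [
    ("GB202", "AD102", "flagship / enthusiast"),
    ("GB203", "AD103", "high SIMD count, narrower memory bus than GB202"),
    ("GB205", None,    "performance segment (the AD104 gets no direct GB204 successor)"),
    ("GB206", "AD106", "mainstream"),
    ("GB207", "AD107", "entry level"),
]

for die, predecessor, segment in blackwell_lineup:
    follows = f"follows {predecessor}" if predecessor else "new tier"
    print(f"{die:6s} {follows:14s} {segment}")
```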

Given NVIDIA's 2-year cadence for new client graphics architectures, one can expect Blackwell to debut toward Q4-2024, to align with mass-production availability of the 3 nm foundry node.
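For what it's worth, the two-year cadence alone points the same way: a trivial projection from the recent GeForce debut years (2018, 2020, 2022) lands on 2024, which is also where the 3 nm timing argument leads.

```python
# Recent GeForce architecture debut years, extrapolated one step along the
# two-year cadence noted above.
launches = {"Turing (RTX 20)": 2018, "Ampere (RTX 30)": 2020, "Ada (RTX 40)": 2022}
cadence_years = 2
projected_blackwell = max(launches.values()) + cadence_years
print(f"Projected Blackwell (RTX 50) debut: {projected_blackwell}")  # -> 2024
```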
Source: VideoCardz

71 Comments on NVIDIA Blackwell Graphics Architecture GPU Codenames Revealed, AD104 Has No Successor

#26
ViperXZ
ARFCan be a refresh of the brand new arch, with the first iteration cancelled to see the light of the day. They are skipping GB100 and go straight to GB200...
It makes sense because Ada isn't selling that well right now; the covid times are over, and the fat selling times are gone with them. Otherwise, both are possible: GB200 could be a refresh, or it could be more than a refresh, a 2nd-gen Ada, something I kinda overlooked earlier. GM200, for example, wasn't a refresh of GM100; it was a 2nd-generation Maxwell product. That could very well be the case with Ada as well, just renamed to Blackwell for whatever (probably marketing) reasons.
Posted on Reply
#27
Assimilator
ARFIf indeed the new gens both from AMD and nvidia are delayed till 2025 | 2026, then you can safely assume that's the end of the graphics market as we know it.
2018 - RTX 2000
2020 - RTX 3000
2022 - RTX 4000
2025 | 2026 - RTX 5000

3 | 4 years to launch a new generation ?
Quite possible, since die shrinks can no longer provide the massive performance increases and power decreases we've been used to, due to reaching the limitations of both physics and economics. Physics because silicon becomes susceptible to quantum effects below ~7nm and has doubtful scaling below 1nm, while copper at such small sizes has massive leakage issues; and economics because processes (EUV) needed to etch chips at such tiny sizes are expensive and slow, and any replacements to copper or silicon are going to make chip production incredibly expensive.

There are technologies like chiplets and vertical stacking that can mitigate some of these fundamental limitations and allow us to eke a little more life out of copper and silicon, but those are really just band-aids and side-steps.

GTX 900 - 28nm
GTX 1000 - 16nm/14nm
RTX 2000 - 12nm
RTX 3000 - 8nm
RTX 4000 - 4nm
RTX 5000 - 3nm if 2024, which may not be worth it if TSMC has managed to get 2nm to volume production by end of 2024 (assuming Apple doesn't snipe all that capacity)
Posted on Reply
#28
ViperXZ
AssimilatorQuite possible, since die shrinks can no longer provide the massive performance increases and power decreases we've been used to, due to reaching the limitations of both physics and economics. Physics because silicon becomes susceptible to quantum effects below ~7nm and has doubtful scaling below 1nm, while copper at such small sizes has massive leakage issues; and economics because processes (EUV) needed to etch chips at such tiny sizes are expensive and slow, and any replacements to copper or silicon are going to make chip production incredibly expensive.

There are technologies like chiplets and vertical stacking that can mitigate some of these fundamental limitations and allow us to eke a little more life out of copper and silicon, but those are really just band-aids and side-steps.

GTX 900 - 28nm
GTX 1000 - 16nm/14nm
RTX 2000 - 12nm
RTX 3000 - 8nm
RTX 4000 - 4nm
RTX 5000 - 3nm if 2024, which may not be worth it if TSMC has managed to get 2nm to volume production by end of 2024 (assuming Apple doesn't snipe all that capacity)
Silicon has limits; other materials can surpass those limits and still shrink further. That debate is old, and tech will never stop or slow down.
Posted on Reply
#29
blacksea76
ir_cowWith AMD dropping out and Intel not anywhere close to NVIDIA, I don't think we will see the 50 series until 2025 at the earliest. NVIDIA even cancelled the 4090 Ti because it has no competition.
You must have some insider knowledge if you know AMD is dropping out? From where? There was a rumour they won't try to compete with 4090 anymore, is that your insider knowledge?
Posted on Reply
#30
Assimilator
ViperXZSilicon has limits; other materials can surpass those limits and still shrink further. That debate is old, and tech will never stop or slow down.
Tell us, captain obvious, how long do you think it'll take for the whole semiconductor industry plus the entirety of its supply chain to completely retool for completely different materials and completely different ways of fabricating them together?

What, exactly, is going to be the socio-economic impact when we're told that the chips we've been used to getting faster and cheaper for decades, can only keep becoming faster if we're willing to pay orders of magnitude more for them because faster is only possible with those new, vastly more expensive, materials and methods?
Posted on Reply
#31
ViperXZ
AssimilatorTell us, captain obvious, how long do you think it'll take for the whole semiconductor industry plus the entirety of its supply chain to completely retool for completely different materials and completely different ways of fabricating them together?
They have already been researching that for many, many years now, even if you haven't heard about it. No need to get angry, btw. Technological advancement is no myth.
AssimilatorWhat, exactly, is going to be the socio-economic impact when we're told that the chips we've been used to getting faster and cheaper for decades, can only keep becoming faster if we're willing to pay orders of magnitude more for them because faster is only possible with those new, vastly more expensive, materials and methods?
It's funny that you think pricing will slow down; it has been increasing since 7 nm, so why would I say anything else? That being said, it's very possible that some marvelous technological breakthrough delivers us what silicon can't anymore: good pricing, while being better. One of the reasons new silicon is so expensive right now is that manufacturing has become too complicated on the advanced nodes; that's not necessarily the case for a new technological approach. Oh yes, I'm being optimistic here.
Posted on Reply
#32
Dr. Dro
TheinsanegamerNWhen you take a chip typically put on a 104 die, move it to a 106 die, but sell it for 104 die prices, you deserve to be called out for it. "but muh chip names dont matter" is a very poor excuse for this behavior.


HARD disagree. The last time AMD tried this, it gave nvidia the upper mid range and high end on a silver platter, leading not only to price increases, but nvidia getting more money from sweet high end margins to continue pulling ahead. When you drop out of a market, its HARD to break back in, as everyone will have moved on.
If you're referring to Vega 20 or Navi 10 vs. Turing, they always had it... Price aside, the Titan RTX could pull almost twice the frame rates of the Radeon VII; it just cost about as much as four VIIs, though.
Posted on Reply
#33
Vayra86
BorisDG1. We already knew it's Blackwell
2. It's said from numerous sources, that it comes at earliest 2025.
Yep, and nothing stops Nvidia from getting a new cut later down the line. No 104? I don't see it; why leave such a massive gap between the high end and midrange? Or perhaps this is just Nvidia reshuffling the tiers again. Whatever it will be, the x04 is generally 2/3 to 3/4 of the 'full' die at the top. Maybe the x03 is taking that spot, because the current gen's x70 and x80 positioning was problematic.
TheinsanegamerNWhen you take a chip typically put on a 104 die, move it to a 106 die, but sell it for 104 die prices, you deserve to be called out for it. "but muh chip names dont matter" is a very poor excuse for this behavior.


HARD disagree. The last time AMD tried this, it gave nvidia the upper mid range and high end on a silver platter, leading not only to price increases, but nvidia getting more money from sweet high end margins to continue pulling ahead. When you drop out of a market, its HARD to break back in, as everyone will have moved on.
Yep... if AMD abandons the high end, we could well see them take 2-3 generations to recover again. If you don't play ball in the top or sub-top, you're just not playing much. AMD will find Intel in the midrange, so they stand to gain nothing; they've just dropped a segment where there's but one competitor and now have two everywhere else.
Posted on Reply
#34
Dr. Dro
Vayra86Yep, and nothing stops Nvidia from getting a new cut later down the line. No 104? I don't see it; why leave such a massive gap between the high end and midrange? Or perhaps this is just Nvidia reshuffling the tiers again. Whatever it will be, the x04 is generally 2/3 to 3/4 of the 'full' die at the top. Maybe the x03 is taking that spot, because the current gen's x70 and x80 positioning was problematic.
We've kind of reached an inflection point, imo. GPUs have gotten plenty fast, and the battle has shifted to feature set. Except that has been standardized as well, so as long as you have a Pascal card (GTX 1060 or above) or an equivalent AMD RDNA card, smooth 1080p 60 gaming is well within your reach.

By Turing you're already fully compliant with DirectX 12 Ultimate, and from there you're looking strictly at RT performance and general efficiency gains; Ada's advanced features like SER aren't used by games yet.

So for the eSports segment, GPUs have been good enough for the past 6 years; for AAA flagships, the past 4. And Ada has all but made currently released RT games easy to run, so it makes more sense than ever to issue this surcharge only to those who want to go above and beyond in performance... because otherwise your experience is probably going to be the same.
Posted on Reply
#35
Vayra86
ViperXZThey have already been researching that for many, many years now, even if you haven't heard about it. No need to get angry, btw. Technological advancement is no myth.

It's funny that you think pricing will slow down; it has been increasing since 7 nm, so why would I say anything else? That being said, it's very possible that some marvelous technological breakthrough delivers us what silicon can't anymore: good pricing, while being better. One of the reasons new silicon is so expensive right now is that manufacturing has become too complicated on the advanced nodes; that's not necessarily the case for a new technological approach. Oh yes, I'm being optimistic here.
Dies can just go bigger, and nodes can be refined for better yield and reduced cost as well; we're already seeing this, in fact. Shrinking isn't the only path forward on silicon: DUV required lots of patterning steps to make a chip on a small (like 7 nm) node, and EUV has now reduced that number significantly. This is all a gradual process. If anything new is going to replace silicon, it will be a very slow process too, and silicon will remain alongside it for a long time.
Posted on Reply
#36
ARF
ViperXZThat being said, it's very possible that some marvelous technological breakthrough delivers us what silicon can't anymore: good pricing, while being better.
1. Online (cloud-streamed) gaming, but every computer would need a fibre-optic connection with as low latency as possible, and servers in every country, located as densely across the globe as possible.
2. Technology that gets rid of traditional polygon rasterisation: a transition to unlimited detail with unlimited zoom, using points | atoms in 3D instead of polygons and triangle strips.


Something close to this is actually used in Unreal Engine 5's Nanite.
Posted on Reply
#37
ir_cow
blacksea76You must have some insider knowledge if you know AMD is dropping out? From where? There was a rumour they won't try to compete with 4090 anymore, is that your insider knowledge?
I know nothing besides the rumor mill
Posted on Reply
#38
R-T-B
AssimilatorNobody cares about Linux.
Like it or not, the industry very much does.
Posted on Reply
#39
AusWolf
ARFIf indeed the new gens both from AMD and nvidia are delayed till 2025 | 2026, then you can safely assume that's the end of the graphics market as we know it.
I disagree. AMD and Nvidia have probably realised that gamers don't need yearly upgrades, as game graphics aren't advancing quickly enough these days. If they can keep selling the same line of products for 3-4 years, that's less R&D cost and more focus on generational uplift. I mean, please don't tell me that one desperately needs to upgrade a 6600 XT to a 7600, or a 3060 to a 4060.
Posted on Reply
#40
Legacy-ZA
Meh, can't get excited anymore. The performance increases, but the average person can no longer afford it.
Posted on Reply
#41
Minus Infinity
Space LynxJust a rumor AMD is dropping out. AMD is just delayed cause Apple bought 100% of the 3nm node from TSMC in 2024. AMD will be back in winter 2025 or spring 2026. I'd bet money on it. Apple is just whoring itself for the best node for the entirety of 2024 cause they rich as fuck and want to leave everyone else in the dust temporarily
More than a rumour. And to be clear, it's only N41 and N42 that are being canned, to focus on what I presume will be called N51 and N52. RDNA5 is already looking good apparently, but high-end RDNA4 is lacklustre, offering only small gains over the 7900 cards. Sources inside AMD have said it would require delays to both the RDNA4 and RDNA5 launch dates to get it right. Given Blackwell isn't coming out until 2025, AMD doesn't need to worry too much. Just focus on the 7700 and 7800 this year, and they can get the 8600 out by September next year, apparently. Hope they can make the 8600 a true generational uplift, unlike this year's crap.
Posted on Reply
#42
Prima.Vera
ir_cowWith AMD dropping out and Intel not anywhere close to NVIDIA, I don't think we will see the 50 series until 2025 at the earliest. NVIDIA even cancelled the 4090 Ti because it has no competition.
That's actually a good thing. You can keep your very expensive video cards longer, and don't feel the temptation to buy a new generation just because it's 50% stronger than what you currently have. Even if there are no quality games out there worthy of a new GPU investment. ;)
Posted on Reply
#43
Space Lynx
Astronaut
Prima.VeraThat's actually a good thing. You can keep your very expensive video cards longer, and don't feel the temptation to buy a new generation just because it's 50% stronger than what you currently have. Even if there are no quality games out there worthy of a new GPU investment. ;)
I'm not sure why this is even news. We have always been on a 2 to 3 year cycle for high-end cards.
Posted on Reply
#44
ir_cow
Space LynxI'm not sure why this is even news. We have always been on a 2 to 3 year cycle for high-end cards.
It was every 6 months at one point when ATi rocked NVIDIA.
Posted on Reply
#45
Prima.Vera
Interesting comments to read, but I was wondering: maybe 3dFX had a great idea 20 years ago, but a bad implementation.
Instead of using one big monolithic GPU, why not use multiple small ones, which are easier and faster to produce due to better yields?
They call it "chiplets" nowadays, so why would AMD abandon the high-end and enthusiast segment if you can stack 4 or more of those, basically doubling the performance of the video card with each pair added? I thought they were going to smoke enVideea with this...
Confusing times.
Posted on Reply
#46
Space Lynx
Astronaut
ir_cowIt was every 6 months at one point when ATi rocked NVIDIA.
maybe the most recent gpu's is all i can remember, huh. interesting.

radeon 6800 xt launch date was november 2020, so it took 3 years for 7800 xt to replace it. its prob fair to say its not just covid, but we are on a 3 year cycle.

im glad i got my 7900 xt for the price i did, im set for many years. i was playing assassins creed brotherhood at 160 fps ultra setting last night (one of the few ac games that supports high refresh) and the fans didn't even kick on on the gpu, cause the card itself is so powerful. absolutely blew me away. lol

not a single frame drop either, was 160 the entire time. same as in uncharted 4. unbelievable the power this thing has, now in uncharted 4 the fans do kick into high gear and she gets got. its about the only game that makes my card extra hot.
Posted on Reply
#47
ViperXZ
Prima.VeraThat's actually a good thing. You can keep your very expensive video cards longer, and don't feel the temptation to buy a new generation just because it's 50% stronger than what you currently have. Even if there are no quality games out there worthy of a new GPU investment. ;)
Why? Is your buying decision dependent on what the fastest GPU is right now, or on what Nvidia told you "you have to have, because of feature X"? Please don't.
Prima.VeraInteresting comments to read, but I was wondering: maybe 3dFX had a great idea 20 years ago, but a bad implementation.
Instead of using one big monolithic GPU, why not use multiple small ones, which are easier and faster to produce due to better yields?
They call it "chiplets" nowadays, so why would AMD abandon the high-end and enthusiast segment if you can stack 4 or more of those, basically doubling the performance of the video card with each pair added? I thought they were going to smoke enVideea with this...
Confusing times.
Yes, but those GPUs were very different back then. 3DFX SLI (the original SLI, not Nvidia's SLI, whose name essentially came from buying up the remains of 3DFX) worked by using multiple GPUs in tandem that did different parts of the work, perfectly synchronized, not like Crossfire / SLI now (I mean it's dead anyway), because the hardware was specifically built for it from the start. Newer GPUs don't work like that; they are built to work alone, and it's just optional to let them work with a partner card. That was a lot of work and they abandoned it, and it was also mostly multiple cards, not multiple GPUs on one board. TLDR: way different tech than what 3DFX used back then, but of course it's a shame 3DFX is gone. It's one of the biggest regrets of all time.
Space Lynxmaybe the most recent gpu's is all i can remember, huh. interesting.

radeon 6800 xt launch date was november 2020, so it took 3 years for 7800 xt to replace it. its prob fair to say its not just covid, but we are on a 3 year cycle.

im glad i got my 7900 xt for the price i did, im set for many years. i was playing assassins creed brotherhood at 160 fps ultra setting last night (one of the few ac games that supports high refresh) and the fans didn't even kick on on the gpu, cause the card itself is so powerful. absolutely blew me away. lol

not a single frame drop either, was 160 the entire time. same as in uncharted 4. unbelievable the power this thing has, now in uncharted 4 the fans do kick into high gear and she gets got. its about the only game that makes my card extra hot.
The 7800 XT isn't even the real successor to the 6800 XT; the 7900 XT is (they renamed everything one tier up), more or less, though the chip is more cut down in comparison. Just like Nvidia gave us a fat chip, the 3080, for just $700 (if you were lucky to get one), the 6800 XT was a slightly cut-down part of the big 6900 XT chip to compete with it, all for $650-700, something we can't imagine now. Nvidia's big chip is $1600 now and still not the full thing, which is ridiculous, but whatever I guess. The 4080 isn't that big but still costs $1200. In comparison, AMD didn't increase prices that much, so I'm not surprised people don't like this gen, mainly because Nvidia got too greedy. At first they did it to clear old inventory of the 3090 and below, but that was never the real reason. The real reason was that they saw during the covid times that people will pay ridiculous prices, and they kept it up.
Posted on Reply
#48
Assimilator
Prima.VeraInteresting comments to read, but I was wondering: maybe 3dFX had a great idea 20 years ago, but a bad implementation.
Instead of using one big monolithic GPU, why not use multiple small ones, which are easier and faster to produce due to better yields?
They call it "chiplets" nowadays, so why would AMD abandon the high-end and enthusiast segment if you can stack 4 or more of those, basically doubling the performance of the video card with each pair added? I thought they were going to smoke enVideea with this...
Confusing times.
Chiplets are nothing like 3dfx's approach. You are comparing apples to pigs.
ViperXZit's a shame 3DFX is gone
Nope, they were a poorly-managed company that made many questionable decisions and ultimately contributed little long-term value to the graphics card industry, except nostalgia.
Posted on Reply
#49
ViperXZ
AssimilatorNope, they were a poorly-managed company that made many questionable decisions and ultimately contributed little long-term value to the graphics card industry, except nostalgia.
What a terrible take that unfortunately doesn't have much to do with reality. Poorly managed at the end, yes, but not in general. Otherwise you're spot-on wrong.
Posted on Reply
#50
mb194dc
There's almost no market for the very high end? That's my guess as to why there's no 4090 Ti, or AMD high end next gen.

Most users are still stuck on 1080p, which cards from two generations back can handle.

Something like a 6800 XT can handle 4K 60 and can be found for circa 500 these days.

After the leaps ahead of that gen, looks like we're in for stagnation for a while?
Posted on Reply