
NVIDIA Blackwell Graphics Architecture GPU Codenames Revealed, AD104 Has No Successor

A refresh, because why else name it "2xx" if it's a brand-new arch? Of course, I'm just speculating like everyone else here. But it's "just" 3 years.

It could be a refresh of the brand-new arch, with the first iteration cancelled before it ever saw the light of day. They are skipping GB100 and going straight to GB200...
 
It makes sense because right now Ada isn't selling that well; the COVID times are over, and the fat selling times are gone. Otherwise, both are possible: GB200 could be a refresh, or it could be more than a refresh, a second-gen Ada, which I kind of overlooked earlier. GM200, for example, wasn't a refresh of GM100; it was a second-generation Maxwell product. This could very well be the case with Ada as well, just renamed to Blackwell for whatever (probably marketing) reasons.
 
If the new gens from both AMD and NVIDIA are indeed delayed till 2025 | 2026, then you can safely assume that's the end of the graphics market as we know it.
2018 - RTX 2000
2020 - RTX 3000
2022 - RTX 4000
2025 | 2026 - RTX 5000

3 | 4 years to launch a new generation?
Quite possible, since die shrinks can no longer provide the massive performance increases and power reductions we've been used to; we're hitting the limits of both physics and economics. Physics, because silicon becomes susceptible to quantum effects below ~7 nm and its scaling below 1 nm is doubtful, while copper at such small sizes has massive leakage issues. Economics, because the processes (EUV) needed to etch chips at such tiny sizes are expensive and slow, and any replacement for copper or silicon is going to make chip production incredibly expensive.

There are technologies like chiplets and vertical stacking that can mitigate some of these fundamental limitations and allow us to eke a little more life out of copper and silicon, but those are really just band-aids and side-steps.

GTX 900 - 28nm
GTX 1000 - 16nm/14nm
RTX 2000 - 12nm
RTX 3000 - 8nm
RTX 4000 - 4nm
RTX 5000 - 3nm if 2024, which may not be worth it if TSMC has managed to get 2nm to volume production by end of 2024 (assuming Apple doesn't snipe all that capacity)
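For a rough sense of what each of those jumps would buy in the ideal case, here's a minimal Python sketch that naively treats the node names above as real linear dimensions. That's a deliberately generous assumption: modern node names are marketing labels rather than physical gate lengths (and the "3" for RTX 5000 is just the rumoured option from the list), so real density gains per step are considerably smaller than this math implies.

```python
# Naive ideal-scaling illustration only: node "nm" figures are marketing names,
# not measured feature sizes, so these ratios are an optimistic upper bound.
nodes = [
    ("GTX 900",  28),
    ("GTX 1000", 16),
    ("RTX 2000", 12),
    ("RTX 3000", 8),
    ("RTX 4000", 4),
    ("RTX 5000", 3),  # assuming the rumoured 3 nm option
]

for (prev_name, prev_nm), (name, nm) in zip(nodes, nodes[1:]):
    ideal_area = (nm / prev_nm) ** 2  # area shrinks with the square of the linear scale
    print(f"{prev_name} -> {name}: same design would ideally occupy "
          f"~{ideal_area:.2f}x the die area")
```

The point isn't the exact numbers, just that the "free" gains per jump keep shrinking while wafer costs on EUV nodes keep climbing.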
 
Silicon has limits; other materials can surpass those limits and still shrink further. That debate is old, and tech will never stop or slow down.
 
With AMD dropping out and Intel not anywhere close to NVIDIA, I don't think we will see the 50 series until 2025 at the earliest. NVIDIA even cancelled the 4090 Ti because it has no competition.
You must have some insider knowledge if you know AMD is dropping out. From where? There was a rumour they won't try to compete with the 4090 anymore; is that your insider knowledge?
 
Silicon has limits; other materials can surpass those limits and still shrink further. That debate is old, and tech will never stop or slow down.
Tell us, captain obvious, how long do you think it'll take for the whole semiconductor industry plus the entirety of its supply chain to completely retool for completely different materials and completely different ways of fabricating them together?

What, exactly, is going to be the socio-economic impact when we're told that the chips we've been used to getting faster and cheaper for decades can only keep getting faster if we're willing to pay orders of magnitude more for them, because faster is only possible with those new, vastly more expensive materials and methods?
 
They have already been researching that for many, many years now, even if you haven't heard about it. No need to get angry, btw. Technological advancement is no myth.
It's funny that you think pricing will slow down; it's been increasing since 7 nm, so why should I claim anything else? That being said, it's very possible that some marvelous technological breakthrough delivers what silicon can't anymore: good pricing while being better. One of the reasons new silicon is so expensive right now is that manufacturing has become too complicated on the advanced nodes. That's not necessarily the case for a new technological approach. Oh yes, I'm being optimistic here.
 
When you take a chip typically put on a 104 die, move it to a 106 die, but sell it for 104 die prices, you deserve to be called out for it. "but muh chip names dont matter" is a very poor excuse for this behavior.


HARD disagree. The last time AMD tried this, it gave nvidia the upper mid-range and high end on a silver platter, leading not only to price increases, but to nvidia getting more money from sweet high-end margins to continue pulling ahead. When you drop out of a market, it's HARD to break back in, as everyone will have moved on.

If you're referring to Vega 20 or Navi 10 vs. Turing, they always had it... Price aside, the Titan RTX could pull almost twice the frame rates of the Radeon VII; it just cost like 4 VIIs, though.
 
1. We already knew it's Blackwell.
2. Numerous sources say it's coming in 2025 at the earliest.
Yep, and nothing stops Nvidia from introducing a new cut later down the line. No 104? I don't see it; why leave such a massive gap between the high end and the midrange? Or perhaps this is just Nvidia reshuffling the tiers again. Whatever it turns out to be, the x04 is generally 2/3 to 3/4 of the 'full' top die. Maybe the x03 is taking that spot, because the current gen's x70 and x80 positioning was problematic.

When you take a chip typically put on a 104 die, move it to a 106 die, but sell it for 104 die prices, you deserve to be called out for it. "but muh chip names dont matter" is a very poor excuse for this behavior.


HARD disagree. The last time AMD tried this, it gave nvidia the upper mid-range and high end on a silver platter, leading not only to price increases, but to nvidia getting more money from sweet high-end margins to continue pulling ahead. When you drop out of a market, it's HARD to break back in, as everyone will have moved on.
Yep... if AMD abandons the high end, we could well see them take 2-3 generations to recover again. If you don't play ball in the top or sub-top, you're just not playing much. AMD will find Intel in the midrange, so they stand to gain nothing; they've just dropped a segment where there's but one competitor and now have two everywhere else.
 
Yep, and nothing stops Nvidia from introducing a new cut later down the line. No 104? I don't see it; why leave such a massive gap between the high end and the midrange? Or perhaps this is just Nvidia reshuffling the tiers again. Whatever it turns out to be, the x04 is generally 2/3 to 3/4 of the 'full' top die. Maybe the x03 is taking that spot, because the current gen's x70 and x80 positioning was problematic.

We've kind of reached an inflection point, imo. GPUs have gotten plenty fast, and the battle has shifted to feature set. Except that has been standardized as well, so as long as you have a Pascal card (1060 and above) or an equivalent AMD RDNA card, smooth 1080p60 gaming is well within your reach.

By Turing you're already fully compliant with DirectX 12 Ultimate, so beyond that you're looking strictly at RT performance and general efficiency gains, and Ada's advanced features like SER aren't used by games yet.

So for the eSports segment, GPUs have been good enough for the past 6 years; for AAA flagships, the past 4. And Ada has all but made currently released RT games easy to run, so it makes more sense than ever to charge this premium if you want to go above and beyond in performance... because otherwise your experience is probably going to be the same.
 
They have already been researching that for many, many years now, even if you haven't heard about it. No need to get angry, btw. Technological advancement is no myth.

It's funny that you think pricing will slow down; it's been increasing since 7 nm, so why should I claim anything else? That being said, it's very possible that some marvelous technological breakthrough delivers what silicon can't anymore: good pricing while being better. One of the reasons new silicon is so expensive right now is that manufacturing has become too complicated on the advanced nodes. That's not necessarily the case for a new technological approach. Oh yes, I'm being optimistic here.
Dies can just go bigger and nodes can be refined for better yield and reduced cost as well. We're already seeing this in fact. Shrinking isn't the only path forward on silicon - DUV required lots of patterning/steps to make a chip for a small (like 7nm) node, and EUV now has reduced that number significantly. This is all a gradual process. If anything new is going to replace silicon it will be a very slow process too, and silicon will still remain alongside it for a long time.
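To make the patterning point concrete, here's a tiny Python sketch of why fewer exposures per layer matters so much for throughput and cost. The layer and exposure counts are made-up placeholders purely for illustration; real process recipes are far more involved and not public.

```python
def total_exposures(critical_layers, exposures_per_layer):
    """Crude proxy for lithography time/cost: total exposure passes on the critical layers."""
    return critical_layers * exposures_per_layer

# Hypothetical numbers for illustration only
duv_multi_patterned = total_exposures(critical_layers=20, exposures_per_layer=4)  # e.g. quad patterning
euv_single_exposure = total_exposures(critical_layers=20, exposures_per_layer=1)

print(f"DUV multi-patterning: {duv_multi_patterned} passes")
print(f"EUV single exposure:  {euv_single_exposure} passes")
```

Every extra pass also adds alignment steps and defect opportunities, which is why cutting the exposure count helps yield as well as speed.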
 
That being said, it's very possible that some marvelous technological breakthrough delivers what silicon can't anymore: good pricing while being better.

1. Online (cloud) gaming, but every computer would need a fibre-optic connection with as low latency as possible, and servers in every country, located as densely across the globe as possible.
2. Technology that gets rid of traditional polygon rasterisation: a transition to unlimited detail with unlimited zoom, using points | atoms in 3D instead of polygons and triangle strips.


Something close to it is actually used in Unreal Engine 5 - Nanite.
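For what it's worth, Nanite-style renderers don't literally draw infinitely many points; per frame they pick the coarsest version of each geometry cluster whose simplification error would still be invisible on screen. Here's a toy Python sketch of that selection idea, with a hypothetical pinhole-camera projection and made-up error values; it illustrates the concept only and is not Unreal's actual algorithm.

```python
import math

def projected_error_px(error_world, distance, fov_y_rad, screen_height_px):
    """Approximate on-screen size (pixels) of a cluster's simplification error,
    using a simple pinhole-camera projection."""
    return (error_world / distance) * (screen_height_px / (2.0 * math.tan(fov_y_rad / 2.0)))

def pick_lod(lod_errors_world, distance,
             fov_y_rad=math.radians(60), screen_height_px=1080, threshold_px=1.0):
    """lod_errors_world: simplification error per LOD, finest first (errors grow as LODs coarsen).
    Returns the coarsest LOD whose error projects to roughly less than one pixel."""
    chosen = 0
    for lod, err in enumerate(lod_errors_world):
        if projected_error_px(err, distance, fov_y_rad, screen_height_px) <= threshold_px:
            chosen = lod  # still visually indistinguishable, so prefer the coarser LOD
    return chosen

# Hypothetical per-LOD errors in metres for one object
errors = [0.001, 0.004, 0.016, 0.064]
print(pick_lod(errors, distance=2.0))   # up close -> forced to the finest LOD (0)
print(pick_lod(errors, distance=50.0))  # far away -> a much coarser LOD (2) is enough
```

The "unlimited detail" feel comes from doing this per small cluster rather than per whole mesh, so the rendered triangle (or point) budget stays roughly tied to screen pixels no matter how dense the source asset is.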
 
You must have some insider knowledge if you know AMD is dropping out. From where? There was a rumour they won't try to compete with the 4090 anymore; is that your insider knowledge?
I know nothing besides the rumor mill
 
If the new gens from both AMD and NVIDIA are indeed delayed till 2025 | 2026, then you can safely assume that's the end of the graphics market as we know it.
I disagree. AMD and Nvidia have probably realised that gamers don't need yearly upgrades, as game graphics aren't advancing quickly enough these days. If they can keep selling the same line of products for 3-4 years, that's less R&D cost and more focus on generational uplift. I mean, please don't tell me that one desperately needs to upgrade a 6600 XT to a 7600, or a 3060 to a 4060.
 
Meh, can't get excited anymore. The performance increases, but the average person can no longer afford it.
 
Just a rumor that AMD is dropping out. AMD is just delayed because Apple bought 100% of TSMC's 3 nm capacity for 2024. AMD will be back in winter 2025 or spring 2026; I'd bet money on it. Apple is just hogging the best node for the entirety of 2024 because they're rich as fuck and want to leave everyone else in the dust temporarily.
More than a rumour. And to be clear, it's only N41 and N42 that are being canned, to focus on what I presume will be called N51 and N52. RDNA5 is already looking good, apparently, but high-end RDNA4 is lacklustre, offering only small gains over the 7900 cards. Sources inside AMD have said it would require delays to both the RDNA4 and RDNA5 launch dates to get it right. Given Blackwell isn't coming out until 2025, AMD doesn't need to worry too much. Just focus on the 7700 and 7800 this year, and they can get the 8600 out by September next year, apparently. Hope they can make the 8600 a true generational uplift, unlike this year's crap.
 
With AMD dropping out and Intel not anywhere close to NVIDIA, I don't think we will see the 50 series until 2025 at the earliest. NVIDIA even cancelled the 4090 Ti because it has no competition.
That's actually a good thing. You can keep your very expensive video card longer and not feel the temptation to buy a new generation just because it's 50% stronger than what you currently have, even if there are no quality games out there worthy of a new GPU investment. ;)
 

I'm not sure why this is even news. We have always been on a 2-to-3-year cycle for high-end cards.
 
Interesting comments to read, but I was wondering: maybe 3dFX had a great idea 20 years ago, just a bad implementation.
Instead of using one big monolithic GPU, why not use multiple small ones that are easier and faster to produce due to better yields?
They call it "chiplets" nowadays, so why would AMD abandon the high-end and enthusiast segment if you can stack 4 or more of those, basically doubling the performance of the video card with each pair added? I thought they were going to smoke enVideea with this...
Confusing times.
 
It was every 6 months at one point when ATi rocked NVIDIA.

Maybe the most recent GPUs are all I can remember, huh. Interesting.

The Radeon 6800 XT's launch date was November 2020, so it took 3 years for the 7800 XT to replace it. It's probably fair to say it's not just COVID; we are on a 3-year cycle.

I'm glad I got my 7900 XT for the price I did; I'm set for many years. I was playing Assassin's Creed Brotherhood at 160 fps on ultra settings last night (one of the few AC games that supports high refresh) and the fans on the GPU didn't even kick on, because the card itself is so powerful. Absolutely blew me away. lol

Not a single frame drop either; it was 160 the entire time, same as in Uncharted 4. Unbelievable the power this thing has. Now, Uncharted 4 does make the fans kick into high gear and she gets hot; it's about the only game that makes my card run extra hot.
 
That's actually a good thing. You can keep your very expensive video card longer and not feel the temptation to buy a new generation just because it's 50% stronger than what you currently have, even if there are no quality games out there worthy of a new GPU investment. ;)
Why? Is your buying decision dependent on what the fastest GPU is right now, or on what Nvidia told you "you have to have, because of feature X"? Please don't.
Interesting comments to read, but I was wondering: maybe 3dFX had a great idea 20 years ago, just a bad implementation.
Instead of using one big monolithic GPU, why not use multiple small ones that are easier and faster to produce due to better yields?
They call it "chiplets" nowadays, so why would AMD abandon the high-end and enthusiast segment if you can stack 4 or more of those, basically doubling the performance of the video card with each pair added? I thought they were going to smoke enVideea with this...
Confusing times.
Yes, but these GPUs were way different back then. 3DFX SLI (the original SLI, not Nvidia's SLI, whose name essentially came from buying up the remains of 3DFX) worked by using multiple GPUs in tandem that did different parts of the work in perfect synchronization, not like CrossFire / SLI now (I mean, it's dead anyway), because the hardware was specifically built for it from the start. Newer GPUs don't work like that; they are built to work alone, and letting them work with a partner card is just optional. That was a lot of work, and they abandoned it; also, it was mostly multiple cards, not multiple GPUs on one board. TL;DR: way different tech than what 3DFX used back then, but of course it's a shame 3DFX is gone. It's one of the biggest regrets of all time.
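As a back-of-the-envelope for why "each extra pair roughly doubles performance" is optimistic even when multi-GPU does work: any slice of the frame that can't be split across chips (sync, frame setup, shared resources) caps the total gain. Here's a generic Amdahl's-law style sketch in Python, with an assumed 90% splittable share chosen purely for illustration:

```python
def multi_gpu_speedup(n_gpus, parallel_fraction):
    """Amdahl-style upper bound: only `parallel_fraction` of the frame time
    divides across GPUs; the rest stays serial (sync, setup, shared work)."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_gpus)

# Assuming (optimistically) that 90% of each frame splits cleanly across chips:
for n in (1, 2, 4, 8):
    print(f"{n} GPU(s): ~{multi_gpu_speedup(n, 0.90):.2f}x")
# 2 -> ~1.82x, 4 -> ~3.08x, 8 -> ~4.71x: nowhere near doubling per added pair.
```

Chiplet designs attack exactly that serial/sync share with much faster on-package links, which is why they look more promising than card-level SLI/CrossFire ever did, but the scaling still isn't free.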
Maybe the most recent GPUs are all I can remember, huh. Interesting.

The Radeon 6800 XT's launch date was November 2020, so it took 3 years for the 7800 XT to replace it. It's probably fair to say it's not just COVID; we are on a 3-year cycle.

I'm glad I got my 7900 XT for the price I did; I'm set for many years. I was playing Assassin's Creed Brotherhood at 160 fps on ultra settings last night (one of the few AC games that supports high refresh) and the fans on the GPU didn't even kick on, because the card itself is so powerful. Absolutely blew me away. lol

Not a single frame drop either; it was 160 the entire time, same as in Uncharted 4. Unbelievable the power this thing has. Now, Uncharted 4 does make the fans kick into high gear and she gets hot; it's about the only game that makes my card run extra hot.
The 7800 XT isn't even the real successor to the 6800 XT; the 7900 XT is, more or less (they renamed everything one tier up), but its chip is more cut down in comparison. Just like Nvidia gave us a fat chip, the 3080, for just $700 (if you were lucky to get one, I mean), the 6800 XT was a slightly cut-down version of the big 6900 XT chip, made to compete with it, all for $650-700, something we can't imagine now. Nvidia's big chip is $1,600 now and still not the full thing; it's ridiculous, but whatever, I guess. The 4080 isn't that big but still costs $1,200. In comparison, AMD didn't increase prices that much, so I'm not surprised people don't like this gen, mainly because Nvidia got too greedy. First they did it to clear old inventory of the 3090 and below, but that was never the real reason. The real reason is that they saw during the COVID times that people would pay ridiculous prices, and they kept it up.
 
Interesting comments to read, but I was wondering: maybe 3dFX had a great idea 20 years ago, just a bad implementation.
Instead of using one big monolithic GPU, why not use multiple small ones that are easier and faster to produce due to better yields?
They call it "chiplets" nowadays, so why would AMD abandon the high-end and enthusiast segment if you can stack 4 or more of those, basically doubling the performance of the video card with each pair added? I thought they were going to smoke enVideea with this...
Confusing times.
Chiplets are nothing like 3dfx's approach. You are comparing apples to pigs.

it's a shame 3DFX is gone
Nope, they were a poorly-managed company that made many questionable decisions and ultimately contributed little long-term value to the graphics card industry, except nostalgia.
 
Nope, they were a poorly-managed company that made many questionable decisions and ultimately contributed little long-term value to the graphics card industry, except nostalgia.
What a terrible take, and one that unfortunately doesn't have much to do with reality. Poorly managed at the end, yes, but not in general. Otherwise, you're spot-on wrong.
 