Tuesday, August 29th 2023

Reports Suggest AMD Ending Production of Navi 23 GPU

ITHome has picked up on interesting retail activity in China, where AMD Radeon RX 6650 XT graphics cards are deeply discounted. This seems to correspond to a possible discontinuation of Team Red's Navi 23 XT GPU—a Board Channel source stated: "AMD factory has stopped production of a certain GPU. At the present time, shipments from all AIB brands have stopped with inventory being cleared. AMD has stopped production for the Radeon RX 6650 XT, and nearly all brands will have their inventory cleared by the end of September." Board partners in China appear to be running sales promotions, with cards reduced from an original MSRP of 3099 RMB (~$425) down to as low as 1739 RMB (~$240), although these adjusted prices are mostly hovering around the 2000 RMB (~$275) mark.

AMD recently declared that its Radeon RX 7000 desktop lineup is now complete, following the unveiling of the mid-range RX 7800 XT and RX 7700 XT cards at last week's Gamescom trade fair in Cologne, Germany. Its low-to-mid-tier Radeon RX 7600 card, based on the Navi 33 XL GPU, is the sole successor to multiple RDNA 2 predecessors (RX 6600, 6600 XT & 6650 XT). AMD and its board partners are likely prioritizing larger-scale RDNA 3 production, so the latest batch of GPU industry insider information is not all that surprising. Tom's Hardware points out that "there is hardly any point for AMD to continue production of Navi 23. The company's RDNA 3-based Navi 33 GPU integrates 13.3 billion transistors, has 2048 SPs, and performs better than its direct predecessor. Meanwhile, it has a smaller die size (204 mm² vs 237 mm²) and is made on TSMC's N6 process technology (as opposed to N7 in the case of Navi 23), so it may well be cheaper to produce."
Sources: VideoCardz, Tom's Hardware, ITHome

11 Comments on Reports Suggest AMD Ending Production of Navi 23 GPU

#1
MrDweezil
Is that weird? They have a new chip that covers the same price/performance segment.
#2
c2DDragon
Almost nothing to see on the AMD shop in France.

The 7600 is too expensive I think.
#3
80-watt Hamster
c2DDragonThe 7600 is too expensive I think.
Agreed, and Micro Center's current in-store promotion for $230 suggests it could have been cheaper. They also declined entirely to launch a sub-$200 model. Then again, their last attempt at that segment got us the 6500 XT, so maybe that's for the best.
#4
TheinsanegamerN
80-watt HamsterAgreed, and Micro Center's current in-store promotion for $230 suggests it could have been cheaper. They also declined entirely to launch a sub-$200 model. Then again, their last attempt at that segment got us the 6500 XT, so maybe that's for the best.
The 6500 wouldn't have been bad had AMD not cheaped out so much. 32 MB of cache, a 96-bit bus with 6 GB, and tuning for 75 W would have made it a far more attractive product.

This, of course, wouldn't have been as immediately profitable, so AMD didn't do it.

The RX 7600 should be under $200, IMO under $150 for an 8 GB product in 2023.
#5
Denver
c2DDragonAlmost nothing to see on the AMD shop in France.

The 7600 is too expensive I think.
Not really; I saw it on sale for US$229 (correcting for inflation, that would be about US$200).
#6
Random_User
Greetings!

Nothing strange. Scan.co.uk had these EOL about a year ago.

Sadly though, these (6650 XT) cards have never been truly available here. And the AMD store doesn't exist here at all. There were times of some bumps in sales through third-party stores, but nothing serious. I'm not even talking about discounts; in this country, they are non-existent. Never were.
On the rare occasion sellers here do discount something, it's probably a dead item that most likely won't even boot or pass the POST screen.

It was $500 two years ago, and it still is (about what a 4060 Ti currently costs). The 7600 Pulse and others are a minimum of $400+ here, which is also the price of a 3060/4060. But mostly, like all other VGAs, the 6650 XT was probably sold to miners by the hundreds if not thousands, notably the Sapphire versions.
They were all heading to mining farms by the pallet. Local (in-country) forums and YT videos were full of mining pics and comments long after Ethereum went bust. And the cards have been available in extremely low quantities (almost non-existent) since release, even in stores that claim they were/are official "distributors" and "partners" of both AMD and Sapphire Tech.

And there's no guarantee these cards aren't ones previously used for mining. It's hard to find a single seller in my country that hasn't been caught mass-selling broken used stuff over the past two decades, refusing to accept returns as "defects caused by the user" or "loss of tradable appearance". So either buy in the EU/US (where most shops not only refuse to deliver to my country, but even block access to their sites by the country's IP altogether), or pay the same price to local scalpers and still get no warranty anyway (no, I'm not from Russia, China, etc.).

So yet another generation of cards goes EOL without appearing on shelves or coming down to sane price levels.
#7
notaburner
80-watt HamsterAgreed, and Micro Center's current in-store promotion for $230 suggests it could have been cheaper. They also declined entirely to launch a sub-$200 model. Then again, their last attempt at that segment got us the 6500 XT, so maybe that's for the best.
Genuine question here: do sub-$200 cards (like ~$150-175 MSRP) still have a place in the market? It seems like for the last few generations, any card in this territory ends up woefully underpowered in terms of price/perf, with the last-gen cards needing PCIe 4.0 to even be remotely competitive. I suppose it's useful for those who can't exceed that budget and are essentially willing to take any level of performance. But at that point, you're getting closer to being able to justify an iGPU at ~$20 (the difference between an F and non-F SKU) for "in the door" levels of performance. It seems like cards sold in the ~$150 range would need to be much closer to $100 to make sense, but that seems unlikely given the fixed costs of producing all the board components. Not to mention competition from used cards from the next tier up.
#8
80-watt Hamster
notaburnerGenuine question here: do sub-$200 cards (like ~$150-175 MSRP) still have a place in the market? It seems like for the last few generations, any card in this territory ends up woefully underpowered in terms of price/perf, with the last-gen cards needing PCIe 4.0 to even be remotely competitive. I suppose it's useful for those who can't exceed that budget and are essentially willing to take any level of performance. But at that point, you're getting closer to being able to justify an iGPU at ~$20 (the difference between an F and non-F SKU) for "in the door" levels of performance. It seems like cards sold in the ~$150 range would need to be much closer to $100 to make sense, but that seems unlikely given the fixed costs of producing all the board components. Not to mention competition from used cards from the next tier up.
One would think so. The strongest IGP, AFAIR, is the 5700G's, still half the performance of a 1060 or 6500 XT. The latter performed in contemporary titles at about the same relative level as the RX 470 did in its day, at a fairly similar launch MSRP ($200 vs. $180). But somehow the 470 was the value champ of its generation (and continued to be, along with its Polaris siblings, for a weirdly long time), while the 6500 XT got excoriated as a waste of resources that never should have seen the light of day; according to the folks around here, anyway. (It was something of a bastard child, saddled with too few PCIe lanes and a memory bus that hamstrung its core, but that's not really the topic here.)

I don't know how well the 6500 XT sold. But if that class of performance, which is still way beyond integrated, was acceptable in 2016, why shouldn't it be now? Maybe you're right, and a manufacturer can't make a profit on a $150 card, either because the margin's too thin no matter what, or the current market won't allow them to shift enough units to make the numbers work. If it's the former, why is that? $100 cards were viable five years ago. Even accounting for inflation, $150 seems like it should work. Are both Green and Red holding AIBs' feet to the fire on chip prices? Seems plausible. Could be that all but the hardcore have moved to consoles; it's no secret that the PC is a vanishing sight in the general user's home. So again, you could be correct that the market for that class of card simply isn't there. Which is a shame, because that wouldn't bode well for PC gaming as a whole, as the higher bar of entry will turn away an ever larger proportion of a (presumably) shrinking potential customer base.

The "budget" cards this generation are the 4060 and 7600, both launched for $300. Does that seem absurd to anyone else? All things considered, they do pretty well for themselves at 1080p and even 1440p, save for in the Crysis-wannabes like TLoU1 and CP77. But that's at Ultra, and Ultra is mostly dumb, generally entailing a significant performance hit for an insignificant bump in visuals. That suggests there's room for an even lesser variant that would perform well at 1080p Med-High, which is where that class of hardware usually played. But with the aforementioned cards being the price they are, it's hard to imagine the 50 variants coming in under two bills.
#9
notaburner
Well said, totally agree with you. It seems to me like the "big" iGPU style would have a huge cost advantage over any lower-tier card, just because it's basically adding more silicon without requiring any board components. Something like the Xbox Series S APU, but in desktop form. It makes me wonder (at least on AMD's side) whether they think a part like that would eat into sales of the $250 "budget" cards. Or whether consumers will just pony up the $250 if that's the only option, so they aren't losing any sales by not having a cheaper option out there.

Another thing on the AIB side could just be that a $150 card allows minimal room for high-end models. They seem reluctant to release cards near MSRP (could be due to high silicon costs), so having those STRIX-type cards is probably necessary for them. Once you start adding $30-50 on top of a $150 card, you're so close to the next tier that they become a horrible proposition. At MSRP, you'd have to move so much volume to see any decent return that it's probably not that attractive. Though I doubt the AIBs have a huge say in what Nvidia/AMD decide to roll out.

One final issue I'm not entirely educated on could just be low-end laptops. The consumers who in the past might have gone for those $150 cards might just be getting laptops in the lowest-end configurations. IIRC, the lowest-end Nvidia mobile chips are basically in the same performance class as a hypothetical 50-class card, so maybe they're able to squeeze out more margin by focusing there instead of on desktop cards.
#10
80-watt Hamster
notaburnerWell said, totally agree with you. It seems to me like the "big" iGPU style would have a huge cost advantage over any lower-tier card, just because it's basically adding more silicon without requiring any board components. Something like the Xbox Series S APU, but in desktop form. It makes me wonder (at least on AMD's side) whether they think a part like that would eat into sales of the $250 "budget" cards. Or whether consumers will just pony up the $250 if that's the only option, so they aren't losing any sales by not having a cheaper option out there.

Another thing on the AIB side could just be that a $150 card allows minimal room for high-end models. They seem reluctant to release cards near MSRP (could be due to high silicon costs), so having those STRIX-type cards is probably necessary for them. Once you start adding $30-50 on top of a $150 card, you're so close to the next tier that they become a horrible proposition. At MSRP, you'd have to move so much volume to see any decent return that it's probably not that attractive. Though I doubt the AIBs have a huge say in what Nvidia/AMD decide to roll out.

One final issue I'm not entirely educated on could just be low-end laptops. The consumers who in the past might have gone for those $150 cards might just be getting laptops in the lowest-end configurations. IIRC, the lowest-end Nvidia mobile chips are basically in the same performance class as a hypothetical 50-class card, so maybe they're able to squeeze out more margin by focusing there instead of on desktop cards.
I didn't think about the modest gaming laptop angle; you may be on to something there. Stronger IGPs, though, have some non-trivial challenges in a PC. See Iris Xe and Broadwell as case studies. The thermal concerns are surmountable: 60 W of processor (like a 12100/F) plus 80 W of graphics (about 2/3 of a 4060's 115 W) comes to 140 W, easily handled by a modest air cooler. Packaging the VRAM is the technical stumbling block. System RAM is too slow at the performance level we're talking about, and 8 GB is considered the bare minimum these days. Now we need to fit that in along with all the required traces: valuable PCB real estate that will be wasted on any processor sold without the IGP, which will be most of them. This is what we're looking at adding to the CPU (4060 shown):

[image: RTX 4060 board shot]

The AD107 die is 159 mm²; let's say a die that does what we want can be ~25% smaller, so ~120 mm². Each RAM package looks pretty similar in size to the die, so 120 + 4 × 160 = 760 mm². A 12100 die comes in at 163 mm², which we'll round down to 150 for lopping off its IGP and to make the math easy. That plus our hypothetical graphics silicon comes to ~900 mm², a 6x increase. Cutting the VRAM in half and doubling the density would get us down to one VRAM package and ~430 mm², still twice a 13900 or 5700G. If one sticks to shared system memory, though... 120 + 150 = 270 mm² is big, but doable. The question then is, of course, will enough people buy it. Seems like the answer is no.
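For anyone who wants to poke at the assumptions, here's the same napkin math as a quick Python sketch; every figure is just the rough estimate above, not a measured value:

# Napkin math for the hypothetical big-iGPU package, using the rough
# figures from this post. All areas in mm^2; nothing here is measured.
AD107_DIE    = 159                # published AD107 die size
GPU_DIE      = AD107_DIE * 0.75   # assume our cut-down die is ~25% smaller (~120)
VRAM_PACKAGE = 160                # eyeballed: one GDDR6 package is about die-sized
CPU_DIE      = 150                # 12100 die (163) rounded down for its lopped-off IGP

full    = CPU_DIE + GPU_DIE + 4 * VRAM_PACKAGE   # discrete-card-equivalent VRAM
reduced = CPU_DIE + GPU_DIE + 1 * VRAM_PACKAGE   # half the VRAM at double density
shared  = CPU_DIE + GPU_DIE                      # shared system memory only

print(f"4x VRAM packages: {full:.0f} mm^2")      # ~910
print(f"1x VRAM package:  {reduced:.0f} mm^2")   # ~430
print(f"shared memory:    {shared:.0f} mm^2")    # ~270
print(f"thermal budget:   {60 + 80} W")          # 60 W CPU + 80 W graphics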
#11
notaburner
80-watt HamsterI didn't think about the modest gaming laptop angle; you may be on to something there. Stronger IGPs, though, have some non-trivial challenges in a PC. See Iris Xe and Broadwell as case studies. The thermal concerns are surmountable: 60 W of processor (like a 12100/F) plus 80 W of graphics (about 2/3 of a 4060's 115 W) comes to 140 W, easily handled by a modest air cooler. Packaging the VRAM is the technical stumbling block. System RAM is too slow at the performance level we're talking about, and 8 GB is considered the bare minimum these days. Now we need to fit that in along with all the required traces: valuable PCB real estate that will be wasted on any processor sold without the IGP, which will be most of them. This is what we're looking at adding to the CPU (4060 shown):

[image: RTX 4060 board shot]

The AD107 die is 159 mm²; let's say a die that does what we want can be ~25% smaller, so ~120 mm². Each RAM package looks pretty similar in size to the die, so 120 + 4 × 160 = 760 mm². A 12100 die comes in at 163 mm², which we'll round down to 150 for lopping off its IGP and to make the math easy. That plus our hypothetical graphics silicon comes to ~900 mm², a 6x increase. Cutting the VRAM in half and doubling the density would get us down to one VRAM package and ~430 mm², still twice a 13900 or 5700G. If one sticks to shared system memory, though... 120 + 150 = 270 mm² is big, but doable. The question then is, of course, will enough people buy it. Seems like the answer is no.
Seems like the most realistic packaging option is basically shoving on an extra-large graphics die and just absorbing the hit from the slower shared memory, to get the equivalent performance of a hypothetical lower-tier discrete card. Like an AD107 die + shared memory to make an "AD108"-class discrete-level performer.

On the power side, from what I've seen, most motherboards seem to have only a single VRM phase for the iGPU. Boards would probably need beefing up to allow 80+ watts for the iGPU, which would drive up cost and limit compatibility.

I can see it being hard to market something like a $350-400 APU plus a more expensive motherboard to people shopping in the absolute budget range. I could absolutely see a nightmare of people buying the APU, then grabbing the cheapest board they can find and ending up with no support/throttling, not to mention buying the cheapest RAM they can find. Maybe in the end the value checks out even with the more expensive parts, but it would probably only be a viable product for a system integrator who can hide that complexity and sell it as a "system that supports the latest games" for $100 under their lowest-tier offering with a discrete card.