
AMD Radeon RX 6600 XT, 6600 to Feature Navi 23 Chip With up to 2048 Stream Processors

So, we get a $1,000 card that performs at the level of a 5700 XT from 2019 that was $400. Great.
$1,000 is optimistic; on launch day they might go well over $1,000.

Yep, 5700 XTs go for ~$1,100 on the second-hand market in my country at the moment, and I don't really expect this to be any cheaper.

Tbh I'm not even sure what the point of releasing new cards in the current situation is. :rolleyes: I mean, people like me who are waiting for normally priced mid-range cards can't afford these anyway.
 
Pro 5600M is Navi 12 based, and was made for Apple to use in its MacBooks. Navi 12 has the same shader count and CU count as Navi 10, plus some new instructions, and uses HBM instead of GDDR. The Pro 5600M and RX 5600M use different GPUs.
What I am trying to say is that AMD has the capability of making low-power GPUs, but there needs to be a market for them. If OEMs and people only push Nvidia and ignore AMD, AMD will also ignore that market segment. There is the 50 W RX 550, which is faster than the GT 1030, but it always gets ignored.
The RX 550 gets ignored because it gets obliterated by the 1650 in absolute performance, and by the GT 1030 in perf/watt. When it came out it was losing to the 1050 in the 50 W category while costing as much as the 1050 Ti. Before the current GPU nonsense the RX 550 was commonly going for $140-150, which is just too much for what it offers.
 
The RX 550 gets ignored because it gets obliterated by the 1650 in absolute performance, and by the GT 1030 in perf/watt. When it came out it was losing to the 1050 in the 50 W category while costing as much as the 1050 Ti.
The RX 550 is not competing with the GTX 1050/1050 Ti/1650. It is/was competing with the GT 1030, at a similar price and with more memory. And the 1650 is slower than 2017's RX 570.
 
The relatively low ETH hashrate performance might make them available to gamers on a budget. If priced correctly ($250-300), those GPUs could save the day for gamers and vastly lower the prices on the used market, which are crazy at the moment. AMD must make many of them to help the situation. Let's see...
 
For what it's worth, almost everything AMD makes is on N7P now, which is the successor to the original N7(FF) node. I could be wrong, but it looks like the N7+ node (EUV lithography) is being sold only to mobile chip makers like Qualcomm and HiSilicon (Huawei), and almost all customers who originally used N7 have shifted to N7P - possibly because TSMC is converting N7 to N7P as it phases out N7.

Wait, hold up, where did you hear that they shifted to N7P? There was a whole lot of speculation that the 3000XT and 5000 CPUs had made the jump due to the improved voltage/frequency curve, but I thought AMD put the matter to rest when they specifically confirmed that the chips were still made on N7FF?

Maybe I missed something about the new Navi 2x parts on N7P but I can't find anything beyond pre-release speculation about the CPUs being on N7P.
 
The RX 550 is not competing with the GTX 1050/1050 Ti/1650. It is/was competing with the GT 1030, at a similar price and with more memory.
The RX 550 WAS competing with the 1650, since at least in America it was the same price. The 1050 Ti was $150 and the 1650 was commonly available at $160.
And the 1650 is slower than 2017's RX 570.

The RX 570 is in a totally different class of card, and requires extra power.
 
The RX 550 WAS competing with the 1650, since at least in America it was the same price. The 1050 Ti was $150 and the 1650 was commonly available at $160.
The RX 550 had a launch price of $80, the same as the GT 1030 GDDR5. You need to look at TPU's GPU database.
The RX 570 is in a totally different class of card, and requires extra power.
The RX 550, compared to the 1650, is also a different class of card: $80 vs. $150 MSRP, 50 W vs. 75 W, plus almost half of those cards have a 6-pin power connector.
 
Those scores place these cards in the same ballpark as the RDNA-based RX 5700 XT and RX 5700.

Decent upgrade, IF the 6600 were priced at $249 and the 6600 XT at $299 (or lower). Anything above that is a hard pass.
Not a chance.

I picked up a 5700 XT Raw II in April last year for my new build at around $370 from Best Buy. 5700 XTs are currently going for around $1k on eBay.

Assuming the performance of the 6600 XT and 6600 is comparable to, if not slightly better than, their forebears', I can't see AMD pricing them that low while still maintaining the 6700 XT at its MSRP. That's too big of a gulf, and if their performance is significantly closer to the 6700 XT than to the 5700 XT, pricing them that low definitely wouldn't make sense.
 
Wait, hold up, where did you hear that they shifted to N7P? There was a whole lot of speculation that the 3000XT and 5000 CPUs had made the jump due to the improved voltage/frequency curve, but I thought AMD put the matter to rest when they specifically confirmed that the chips were still made on N7FF?

Maybe I missed something about the new Navi 2x parts on N7P but I can't find anything beyond pre-release speculation about the CPUs being on N7P.
Honestly, it's confusing and there's no direct statement from either AMD or TSMC specifically naming AMD's process.

The clearest information probably comes from Dr. Ian Cutress, who spoke to AMD directly to clarify that the "N7+" shown in AMD's Zen 3 product slides referred to "better than N7FF" and not the coincidentally named 3rd-gen N7FF+ EUV process at TSMC. That leaves only one possibility, N7P:

"In speaking with AMD, the company confirmed that its next generations of 7nm products are likely to use process enhancements and the best high-performance libraries for the target market, however it is not explicity stating whether this would be N7P or N7+, just that it will be ‘better’ than the base N7 used in its first 7nm line.​

There was also an article about one of TSMC's investor calls, about 18 months ago now, that mentioned they were upgrading most of their 1st-gen N7FF production to 2nd gen after the successful launch of the 2nd-gen process. If you connect the dots, I believe that means TSMC converted existing N7FF production into N7P, since those are the node names for 1st-gen and 2nd-gen 7nm at TSMC respectively. I don't know if TSMC has any 1st-gen fabs left, or whether all 1st-gen fabs are now 2nd gen. All I can say for certain is that a lot of the 1st-gen production capability was removed, making any remaining 1st gen much more scarce than it was during the Zen 2/Navi launch period, if it even exists at all now.

As I said, there's no direct statement, but there are credible sources like Ian Cutress and TSMC's own investor calls.
 
Pro 5600M is Navi 12 based, and was made for Apple to use in its MacBooks. Navi 12 has the same shader count and CU count as Navi 10, plus some new instructions, and uses HBM instead of GDDR. The Pro 5600M and RX 5600M use different GPUs.
What I am trying to say is that AMD has the capability of making low-power GPUs, but there needs to be a market for them. If OEMs and people only push Nvidia and ignore AMD, AMD will also ignore that market segment. There is the 50 W RX 550, which is faster than the GT 1030, but it always gets ignored.
The RX 550 is not really a GT 1030 competitor. In Australia the RX 550 is hard to purchase and expensive at AU$200, and low-profile RX 550 cards are usually out of stock. For non-gaming HTPC use, the air-cooled GT 1030 is hard to beat; I have two. However, the AMD Ryzen 4000 series (Renoir) mobile processors with Radeon Vega 7 graphics are a better solution, though very hard/impossible to obtain for self-builds. I also have an Asus PN50 and it is very capable.
 
We'll likely see one sooner or later. It's likely we'll get one with 1280 cores or so; the RX 560 was an 896/1024-core card.

I need a 75 W low-profile GPU to replace the RX 560 in my low-profile media PC. The 560 is long in the tooth, and Nvidia's Linux drivers leave a lot to be desired. With the RTX 3050 being a 90 W card, all hopes are that AMD makes a 75 W card, likely an RX 6500 or RX 6400.
AMD usually pushes out a 75 W single-slot Radeon Pro card, so if they haven't yet for RDNA2 I'm pretty certain it'll come eventually. Those tend to be cheaper, more affordable Radeon Pro cards than the higher-end workstation cards, since they target the low-end workstation segment. I definitely think they could just reduce clock speeds and voltage to get a 75 W card, since power scales with the square of voltage. I'm not sure it would be a major step forward for cards in that TDP/form factor, though. It would be in some ways, I suppose, but as others pointed out, the ray tracing silicon would eat into a portion of the rasterization improvements. It would still bring improvements, though, like Infinity Cache's impact on bandwidth, plus all the architectural improvements of RDNA2 over RDNA, like the upscaling that's coming around the corner.
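To put rough numbers on that voltage-squared point, here's a quick back-of-the-envelope sketch. It uses the common dynamic-power approximation P ≈ C·f·V²; the 130 W baseline and the scaling factors are made-up illustrative figures, not real RDNA2 specs:

```python
# Rough sketch of the voltage-squared argument: dynamic power scales
# roughly as P ~ C * f * V^2, so undervolting plus downclocking shrinks
# board power quickly. All figures are illustrative assumptions,
# not real RDNA2 numbers.

def scaled_power(base_power_w: float, f_scale: float, v_scale: float) -> float:
    """Estimate dynamic power after scaling frequency and voltage."""
    return base_power_w * f_scale * v_scale ** 2

base = 130.0  # hypothetical board power in watts
print(f"{scaled_power(base, 0.85, 0.85):.0f} W")  # ~80 W at -15% clocks/volts
print(f"{scaled_power(base, 0.80, 0.80):.0f} W")  # ~67 W, under a 75 W slot budget
```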

I actually think AMD could combine mixed-GPU usage with a low-end card like that alongside an APU and get some reasonable results, using one for rendering and the other for post-process improvements and upscaling. The sooner we get a real ecosystem supporting secondary-GPU usage for post-process image-quality improvements, the better. They could be discrete or APU-based, but a lot of people have access to a secondary, slower GPU these days that could be put to better use. It's mostly untapped potential, though for 3D modeling spare GPU resources have been put to use for a while. I'm a real fan of what Lucid Hydra aimed to achieve but for technical reasons couldn't pull off well enough at that point in time and on their budget. The mCable is another example, and if they can do that with a tiny little dongle chip, obviously more advanced post-processing can be achieved by something like an APU or a spare older discrete GPU.
 
In these times of shortages, I really wish AMD and Nvidia would allocate more wafers to mid-range GPUs. While I know the high-end ones generate more revenue, there is also something to be said for satisfying the customer base.

I'm a bit of a 75 W GPU fanatic - no power cables. I would love to see AMD implement one from their camp, but with the latest trend of increasing power consumption on cards, I'm resigned about the possibility of it.
Having low-TDP GPUs is great, but I think anything beyond ~50 W should have a separate power connector.
I believe the PCIe spec only allows for 66 W on the 12 V rail through the PCIe slot, and remember that the power draw of graphics cards tends to fluctuate a bit.
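For what it's worth, here's a quick sanity check of where that slot budget comes from. The rail limits below are the PCIe CEM figures as I understand them (5.5 A on +12 V and 3 A on +3.3 V for a full-size 75 W-class card), so treat them as assumptions rather than gospel:

```python
# Quick check of the PCIe slot power budget mentioned above. The rail
# limits are the PCIe CEM spec values as I understand them (assumptions):
# a full-height 75 W-class card may draw up to 5.5 A on +12 V and 3 A on +3.3 V.

RAIL_LIMITS = {"+12V": (12.0, 5.5), "+3.3V": (3.3, 3.0)}  # (volts, max amps)

for rail, (volts, amps) in RAIL_LIMITS.items():
    print(f"{rail}: {volts * amps:.1f} W max")  # 66.0 W and 9.9 W

total = sum(volts * amps for volts, amps in RAIL_LIMITS.values())
print(f"slot total: {total:.1f} W")  # ~75.9 W, hence the '75 W' class
```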

We'll likely see one sooner or later. It's likely we'll get one with 1280 cores or so; the RX 560 was an 896/1024-core card.

I need a 75 W low-profile GPU to replace the RX 560 in my low-profile media PC. The 560 is long in the tooth, and Nvidia's Linux drivers leave a lot to be desired. With the RTX 3050 being a 90 W card, all hopes are that AMD makes a 75 W card, likely an RX 6500 or RX 6400.
There is nothing comparable to the quality of Nvidia's official Linux driver.

So what's the issue with your RX 560? Is it worn out? Because I don't think any new low-end GPU is going to offer that much of a performance uplift.
And since you are focusing so much on TDP, what's your restriction? Is it cooling, PSU?
Most of today's GPUs consume very little outside of gaming, so is it really a problem to put in a 150 W GPU? (if you could find a low-profile version)
 
That's why I said "120 W" and not 75 W. The old 1060 was a TSMC 16 nm GPU, and it sucks that Nvidia just sucks hard now. They forgot how to make GPUs in that kind of power envelope and instead increased TDPs to this 180/350/450 W BS (AMD is even worse).
 
The RX 550 is not really a GT 1030 competitor. In Australia the RX 550 is hard to purchase and expensive at AU$200, and low-profile RX 550 cards are usually out of stock. For non-gaming HTPC use, the air-cooled GT 1030 is hard to beat; I have two. However, the AMD Ryzen 4000 series (Renoir) mobile processors with Radeon Vega 7 graphics are a better solution, though very hard/impossible to obtain for self-builds. I also have an Asus PN50 and it is very capable.
The RX 550 and GT 1030 both launched with an $80 MSRP; that is why they are competitors to each other. In my country, before the mining craze, the GT 1030 and RX 550 were priced similarly. Some versions of the GT 1030 2GB GDDR5 were/are more expensive than the 4GB RX 550, and some are cheaper than the 2GB RX 550.
 
In these times of shortages, I really wish AMD and Nvidia would allocate more wafers to mid-range GPUs. While I know the high-end ones generate more revenue, there is also something to be said for satisfying the customer base.

Yeah. Those short-term profits come at the expense of the majority of their customer base, who will either:

1) Be more inclined to try another brand. Hell, that could even be Intel in the near future.
2) Give up and switch to consoles or other HTPC/entry-level alternatives.
3) Give up and buy something on the used market instead.

All three of those scenarios are lost sales and two of them are lost future customers too.
 
In these times of shortages, I really wish AMD and Nvidia would allocate more wafers to mid-range GPUs. While I know the high-end ones generate more revenue, there is also something to be said for satisfying the customer base.
Sadly I agree; I/we feel completely neglected here. I would like to upgrade, but not at those ludicrous prices. To be honest, in my mind and on paper the RX 480 and RDNA1 were very similar spec-wise, but RDNA1 was a lot more expensive. As time passes, AMD becomes more and more expensive - not a trend I like to see.

1) Be more inclined to try another brand. Hell, that could even be Intel in the near future.
I like ATI/AMD colors better, so I would like to stick with them for now.
2) Give up and switch to consoles
The games I play (other than some AAA titles) are on PC, so that's not an option.
3) Give up and buy something on the used market instead.
Sadly yes, that's an alternative and cheaper option. That's how I bought most of my cards and CPUs, with warranty though.
 
Yeah, 2080 Ti/3070 performance, but with 12GB and an MSRP of $479 - such a "terrible value". Almost like the 3070s. :D
Bro, it IS terrible value, because the 2080 Ti WAS and IS an absolute ultra RIP-OFF. Compared to the 2080 Ti, the 3070 SEEMS like the super deal of the century, but guess what, it is also overpriced. You should not forget past generations - or did the Turing generation erase everyone's memory?

Add $100 to those prices, unfortunately.
I'm really curious about prices. If these offer the same performance as the 5700/5700 XT, and those debuted at $350/$400, then the 6600/6600 XT should be $250/$300 - if not, why would anyone buy last-gen performance at the same price?

On a side note, the 5600 XT debuted at $280...
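For the sake of argument, here's the perf-per-dollar math behind that expectation. It assumes, as above, that the new cards exactly match 5700/5700 XT performance; the $250/$300 entries are the hoped-for prices from this thread, not official MSRPs:

```python
# Back-of-the-envelope perf/$ comparison using the MSRPs quoted above.
# Performance is normalized to 1.0 because the cards are assumed equal;
# the 6600/6600 XT prices are hoped-for figures, not official ones.

cards = {
    "RX 5700":        (1.0, 350),  # (relative perf, debut MSRP in $)
    "RX 5700 XT":     (1.0, 400),
    "RX 6600 (?)":    (1.0, 250),
    "RX 6600 XT (?)": (1.0, 300),
}

for name, (perf, price) in cards.items():
    print(f"{name}: {perf / price * 1000:.2f} perf per $1000")

# At $250/$300 that works out to roughly 40% / 33% better perf/$ than
# the 2019 cards; at $350/$400 it would be zero progress in two years.
```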
 
Bro, it IS terrible value, because the 2080 Ti WAS and IS an absolute ultra RIP-OFF. Compared to the 2080 Ti, the 3070 SEEMS like the super deal of the century, but guess what, it is also overpriced. You should not forget past generations - or did the Turing generation erase everyone's memory?

We bought 2080 Ti cards because they had 11GB of VRAM. They were a rip-off for gaming, but when your alternative was €2,000 for an RTX 5000 that performs significantly worse, you pay the €1,250 for a 2080 Ti.

For gaming, the 5700 series really were the kings of that generation. At $400 they basically killed it for anyone except the 1% who were in the market for a $1,000 GPU. No, they didn't have RTX, but we're now almost three years on from the RTX launch and reviewers/streamers/regular people are still undecided on whether it's any good, with an extremely limited number of supported titles. I have bought three RTX cards for personal use, still have two of them, and not once have I found the RTX features to be of real value. An interesting novelty that's worth looking at? Sure, but not something I've ever left turned on for more than a couple of hours, as the tradeoffs in performance AND texture quality disappoint me too much to accept (DLSS 2.0 is good, but I can count the titles where it's better than being left off on the fingers of one hand).

The 5700 series was priced to end Nvidia's monopoly pricing on higher-end GPUs, and it succeeded. Not only did leaked performance/pricing of the 5700/5700 XT force Nvidia to slash their RTX card prices by around 30% with the Super re-brand of the RTX lineup, it also made a mockery of Nvidia's overpriced 16 series, because the performance/$ of the 5700 was better than that of the lower-end 16-series cards. When the 5600 XT came out at the same price as the 1660 Ti, it was pretty clear Nvidia was ripping you off.

Turing was a rip-off, both at launch and after the Super mid-life refresh.
 
the 3070 SEEMS like the super deal of the century
It is refreshing to see two GPUs with the same ballpark price and performance, but one being the super deal of the century, while the other (curiously, the one with more VRAM) is terrible value.

It's as if someone was biased or something... :D
 
It is refreshing to see two GPUs with the same ballpark price and performance, but one being the super deal of the century, while the other (curiously, the one with more VRAM) is terrible value.

It's as if someone was biased or something... :D
I'm talking MSRP :)
 