
AMD Radeon RX 6600

Man, I just can't shake the feeling that in a normal market environment this would have been an RX 570 successor priced around $200...
 
From a technical standpoint I find it interesting how Navi 23 competes against the 3060 with a smaller die, a narrower memory bus, and lower power consumption. It should be notably cheaper to manufacture than a 3060. I wonder how it would be priced in a normal market.
$150 - $200
 
Yes. The vanilla 6600 might be the single most efficient ETH mining card on the market right now, with reports of 30MH/s at 50W

The 6600XT at 32MH/s at 55-60W was already pretty damn desirable.

I'm running a farm of 24x RX5700 with 56MH/s at 130W per card, and those are really good mining cards that sell for almost $1000. From a pure efficiency standpoint, a vanilla 6600 is 40% better. There are downsides in that you use more cards in total which requires more overhead in management and more motherboards/mining rigs but overall I think these are worth it for the efficiency tradeoff.
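Quick back-of-the-envelope check of that efficiency claim, as a rough Python sketch using only the figures quoted above (real results will vary with clocks, memory timings, and how you measure power):

```python
# Efficiency comparison using the figures from this post:
# RX 6600: ~30 MH/s at ~50 W, RX 5700: ~56 MH/s at ~130 W.
cards = {
    "RX 6600": {"hashrate_mhs": 30, "power_w": 50},
    "RX 5700": {"hashrate_mhs": 56, "power_w": 130},
}

efficiency = {
    name: spec["hashrate_mhs"] / spec["power_w"]  # MH/s per watt
    for name, spec in cards.items()
}

for name, eff in efficiency.items():
    print(f"{name}: {eff:.2f} MH/s per watt")

# Relative improvement of the RX 6600 over the RX 5700 (~39%, i.e. roughly 40%)
gain = efficiency["RX 6600"] / efficiency["RX 5700"] - 1
print(f"RX 6600 is about {gain:.0%} more efficient")
```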
So you're the one causing the exasperating problems lol
 
I have no idea what MH stands for; I'm assuming mining per hour or something along those lines.
Pretty much yes.

The Ethereum blockchain calculation rate for GPUs is usually in the range of 10-90 million hashes per second, so it's just abbreviated to megahashes per second (MH/s).
Hashrate itself is basically "mining performance" and the 30MH/s of an RX 6600 currently makes about $60/month profit.

At $330 an RX 6600 has paid for itself in 5.5 months, after which you have a free RX6600 making you $60/month.
At $600 it takes 10 months to break even.

If you can procure enough cards (say 100), that's $6,000 a month from mining and you can comfortably quit your day job. And if you then make mining a full-time job, you can justify buying thousands of cards, which is why mining is so disruptive to the GPU supply.
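For anyone who wants to play with the payback maths themselves, here's a rough sketch (Python; the $60/month figure is just today's snapshot and moves with ETH price, difficulty, and electricity costs):

```python
# Break-even and farm-scale estimates from the figures above.
MONTHLY_PROFIT_USD = 60.0  # approximate profit per RX 6600 at ~30 MH/s (snapshot)

def months_to_break_even(card_price_usd: float) -> float:
    """Months until the card has paid for itself at the current profit rate."""
    return card_price_usd / MONTHLY_PROFIT_USD

print(months_to_break_even(330))  # 5.5 months at ~MSRP
print(months_to_break_even(600))  # 10.0 months at scalper pricing

# Scaling up: 100 cards at $60/month each
print(100 * MONTHLY_PROFIT_USD)   # 6000.0 -> $6,000/month
```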

So you're the one causing the exasperating problems lol
I wish! Small-time hobby miners who are just getting their feet wet to see what the mining fuss is all about use way more cards than they would as gamers or regular PC enthusiasts, but we're still buying cards at retail and paying the same price as everyone else.

The real problem is the huge mining farms in China and Russia that intercept cards by the thousand before they even reach distributors. I'm in a mining pool with my paltry 1.3GH/s and there are several people in that pool with ~5TH/s each (roughly 90,000 cards each). This is just one mining pool, and there are at least six large mining pools. I am not exaggerating when I say that there are tens of millions of cards mining ETH right now - Ethereum's total network hashrate is closing in on 1 PetaHash/second. That's the equivalent of 33 million RX 6600 cards mining ETH at any one time.
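A quick sanity check on that last equivalence (rough sketch; 1 PH/s and 30 MH/s per card are the figures quoted above):

```python
# RX 6600 equivalents implied by a ~1 PH/s Ethereum network hashrate.
NETWORK_HASHRATE_MHS = 1_000_000_000  # 1 PH/s expressed in MH/s
RX6600_HASHRATE_MHS = 30

equivalent_cards = NETWORK_HASHRATE_MHS / RX6600_HASHRATE_MHS
print(f"~{equivalent_cards / 1e6:.0f} million RX 6600 equivalents")  # ~33 million
```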

Gamers are screwed, and I'm saying that as an avid gamer.
 
Yeah, no hope of getting it at MSRP. They're over 600 EUR here in Finland.
 
The Fighter 6600 card was available in the US for $329, a pretty good deal considering the market conditions.
Also, the 3060 costs way more; even if this card goes up to $400, compared to $650 for the 3060 it's still a better buy.
 
Honestly, the only impressive thing is that it's not much slower than the 5700 XT but uses two-thirds the power. Efficiency is way up with RDNA2 over RDNA, as promised.

Look, the 7600 will no doubt be a good buy because it will easily be a strong 1440p card, and I suspect RT will no longer be an embarrassment for AMD. But given this card's woeful RT and a street price that will no doubt be north of US$500, keep what you have.
 
Welp, sold out at a price well above what I'm willing to pay. I guess I'll see how the 7300XT performs, maybe some time in 2023.
 
Kinda funny reading all the comments - it's almost like everyone forgot about the pandemic and a very restricted supply chain, both of which have led to inflation in every country.

Even without mining, this would never be a $199 card. Those days are gone. Cutting-edge 7 nm TSMC, 8 GB of GDDR6, vastly higher architecture complexity...

People are happy to drop $1k+ on a new iPhone yet expect 2015 pricing for GPUs...
 
Why did you change your game benchmark suite? The RX 6800 XT is not faster than the 3080 at 4K.
 
Sigh.
At anything above $200 this card is an insult. Rip.
 
Nice card - though you are right that it should be used to game at 1080p. The relative performance chart clearly shows it falling behind at higher resolutions.

Why is this? VRAM lacking? 8GB should be enough for even 1440p currently, assuming you don't want like 120 or 144 FPS.

Now all we need is an RTX 3060 for $279+$50 available anywhere :) I'd like to buy 1,000 pieces please, so I can resell at 100% profit.

You got a spare 300 grand lying around?
 
128-bit, 8 GB, and it's 500€ here :kookoo:
 
I like the card, but in reality this is a ~$200 card. I can't justify paying $500 for these.
I can see this card being sold for $600 like the current 1660 Ti, or even more...
 
Kinda funny reading all the comments - it's almost like everyone forgot about the pandemic and a very restricted supply chain... Even without mining - this would never be a $199 card.
People didn't "forget". It's just that:

1. People are getting tired of new "paper releases" and the related gushing marketing hype for cards that are instantly out of stock, instead of seeing more effort to increase stock of existing cards (i.e., if you're short on manufacturing capacity, don't keep diluting what you've got with more variants that aren't that much cheaper).

2. "Vastly higher architecture complexity" still gets us a 128-bit 1080p card that's selling for 2x what 192-bit peer cards sold for and is barely +30% faster than the two-year-old 1660 Super, which was selling for half the price pre-cryptopsychosis. "Vastly higher architecture complexity" = +15% per year is not particularly impressive if it isn't priced correctly (the 128-bit GTX 1650 Super was +70-80% faster than the GTX 1050 Ti at the same price point after three years...).
 
People didn't "forget". It's just that 1. People are getting tired of new "paper releases"... and 2. "Vastly higher architecture complexity" = +15% per year is not particularly impressive if it isn't priced correctly...
1 - AMD created a lower-performance model to salvage defective dies. That literally is the definition of 'more effort to increase stock'. They can't magically have TSMC make more... Not sure what your point is here?

2 - RDNA2 development didn't happen in a shed with four blokes. This new architecture took thousands of engineers years - far more development time than what was required previously. Let me guess, you could have done so much better.
So again, not sure what your point is here?

Finally... my original reply regarding the reasons for the current pricing seems to have completely flown over your head. Reminds me of a Simpsons quote: "Old man yells at cloud".
 
People didn't "forget". It's just that 1. People are getting tired of new "paper releases" and related gushing marketing hype of cards that are instantly out of stock instead of seeing more effort to increase stock of existing cards (ie, if you're short on manufacturing capacity, don't keep diluting what you have got with more variants that aren't that much cheaper)...
While I agree with you on the paper-release point, increasing stock of existing cards isn't necessarily related to having too many variants.
The product stack is built from one die design: defective parts of individual chips are disabled so that as many chips per wafer as possible can be put to use.

For example, AMD makes a wafer of Navi 23 dies and then tests the chips; all chips that pass become 6600 XTs. Chips that don't will have some parts disabled and become a 6600 or something even lower.

If the chips used in the 6600 could pass 6600 XT tests, it would make more financial sense for AMD to sell them as 6600 XTs and increase its profit margin.
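As a toy illustration of that binning logic (purely hypothetical pass/fail rules; the real CU counts are 32 for the 6600 XT and 28 for the 6600, but actual binning also depends on clock/voltage screening and demand per SKU):

```python
# Toy model of die binning: each tested die is sold as the highest SKU
# whose requirements it meets. Rules here are illustrative, not AMD's.
def bin_navi23_die(working_cus: int, passes_clock_target: bool) -> str:
    if working_cus >= 32 and passes_clock_target:
        return "RX 6600 XT"        # fully working die that also meets clocks
    if working_cus >= 28:
        return "RX 6600"           # salvaged die with some CUs disabled
    return "lower SKU or scrap"    # too many defects to salvage as a 6600

print(bin_navi23_die(32, True))    # RX 6600 XT
print(bin_navi23_die(32, False))   # RX 6600 (fails clock screening)
print(bin_navi23_die(29, True))    # RX 6600
```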
 
About 500€+ here with "orders fulfilled in chronological order", which means "pay now, get your card next year... maybe... or maybe we will offer you something worse for the same amount of money".

Edit: Make it 600€+ since the "MSRP" one is not in stock.
 
Finally... my original reply regarding the reasons for the current pricing seems to have completely flown over your head.
It didn't. Quite the opposite. Die salvaging is certainly a valid point, but your whining that people shouldn't complain about pricing because that's just the way it is misses the point: there were people who bought the competition's cards at MSRP (an opportunity that doesn't exist for this card), and those of us who did are of course going to compare buying this as a potential 'upgrade' against what we actually spent, not against what we would spend if we had no GPU at all. It's the same reason many 1660S / 1070 / 1080 / 1080 Ti owners are in no rush to upgrade here when the "budget GPU" value just isn't there versus holding out until things cool down. At the end of the day, comparing 2019 vs 2021 prices is entirely valid for someone who paid 2019 prices and is ultimately deciding "Do I spend hundreds upgrading or keep my 2019 card?" for the sake of just 30% more fps, rather than deciding "Do I buy an RX 6600 or an RTX 3060?"

Reminds me of a Simpsons quote: "Old man yells at cloud"
Ironically, I thought the same thing reading your "Everyone who criticises overpriced budget GPUs must buy an $1100 iPhone" strawman. ;)
 
Yeah, it's very fishy...
I wouldn't call it that. Obviously you can put in a larger cache, but that costs die size = money -- eventually your money. To me it rather looks like AMD took a hard look at cache size vs. performance and decided that the current size is sufficient for a 1080p target, which makes perfect sense to me.
 
I wouldn't call it that. Obviously you can put in a larger cache, but that costs die size = money -- eventually your money. To me it rather looks like AMD took a hard look at cache size vs. performance and decided that the current size is sufficient for a 1080p target, which makes perfect sense to me.
The AMD-sponsored titles are skewing your benchmarks... The RX 6800 XT is not faster than a 3080, plus the RX 6900 XT is way too close to a 3090... The games you added are AMD hardware friendly... Just my thoughts.
 
Hashrate itself is basically "mining performance" and the 30MH/s of an RX 6600 currently makes about $60/month profit.
My MH/s is 0 :)

Can you sell me one of those 5700 cards for $200 CAD? ;)
 
The AMD-sponsored titles are skewing your benchmarks... The RX 6800 XT is not faster than a 3080, plus the RX 6900 XT is way too close to a 3090... The games you added are AMD hardware friendly... Just my thoughts.
Hmm... your comment made me curious, so I looked at the charts just for you: Days Gone is clearly NVIDIA-friendly, Deathloop and F1 2021 are mostly balanced, Far Cry 6 actually seems pro-NVIDIA, and Resident Evil favors AMD. Seems like a reasonable mix? I really don't pick games like that; I add interesting, important new games and try to cover all the major game engines.

And I'm always open to suggestions for benchmarks, games, anything.
 