
AMD Radeon RX 7600

This is a great card for the price, and one that improves in virtually every facet over its predecessor the RX 6600 all while starting at a lower MSRP. The RX 7600 is ~25% faster than the RX 6600, a substantial improvement. Well done AMD, a great card for the price which will become even sweeter with the deep discounts and free games AMD offers.
~25% faster and ~35% more expensive in my country. An improvement? A sidegrade at best. It would have started much cheaper if it were a real 7600, but the performance was unfortunately too bad to build it from a cut-down Navi 33, so the full die (the same shader count as the 6600 XT) had to be used.
 
At $270 the 7600 is worse than last gen's $330 3060 12GB, let alone the upcoming $300 4060.
 
Correct, but MLID was citing a discussion with AMD partners and being told that the margins were still very healthy at $269.
If that's true, then $300 was just a pure cash grab trying to do the bare minimum to bring anything new to the market.

The disappointing performance (it's a lot less than the 10-15% faster than a 6650XT that everyone expected) means that they have to adjust the price downwards to compete with all the unsold new stock at $250 and all of the used cards at under $200. Nvidia's not their competitor, Nvidia is riding the exploitation train this gen - The real competition is all their own RDNA2 stock both on the store shelves and also the used market.

At this point, I think hardware companies are charging practically whatever they want, really. I understand the need for margins - software development costs, marketing costs, and profit - but I can't realistically see a 7600 having a BoM over $130, even in the worst case.
 
Clearly AMD talks to journalists to get feedback, and there's a lot of back-and-forth with journalists before the final prices are decided on.
Confirmed. NVIDIA does this too

But I don't think it's part of the pricing process, but rather to get sentiment from the reviewers and to identify potential issues (mostly technical in nature)
 
Just no! The MSRP of the 6600 was that high only because it launched during the crypto boom and GPU shortage. In a normal situation, it wouldn't have sold at all. The 3060 12GB had the same MSRP but higher performance, more VRAM, and all the NVIDIA goodies, together with better drivers. For the past two weeks I've seen the 6600 on sale at just below $200, so that's a much better reference point.


In other words, you're saying that one should under no circumstances compare similarly priced GPUs like 7600 and 6650 XT, but according to the seller's intent, compare against a worse-performing 6600. Preferably ignoring its $200 market price and sticking to its crypto-inflated MSRP from 2 years ago.

Also, you seem completely unaware of the fact that the 7600 is a rebranded 7600 XT using the full Navi 33 die, just like its 6600 XT predecessor. It was an emergency decision, taken when RDNA 3 performance was so far off target that, as you can see, the full Navi 33 can only just barely beat the 6650 XT. The 7600 is basically a mostly unsuccessful 6650 XT refresh on 6 nm. The efficiency gain from 5+6 nm that comes with the 7900 XTX and XT (and is still inferior to Ada) is nowhere to be found.
How much faster is the 6650XT vs the 6600?
 
This shows how stupid people can be and how they lack critical thinking. We've known Navi 33's die size and manufacturing node for a while now.

How would AMD raise or maintain IPC while shrinking the die on what is largely the same manufacturing process, while doubling the shader count (and TFLOP count)? Particularly without the X-factor that chiplets present(ed at the time), and with barriers that should limit performance. People somehow expected the impossible to be possible.

Some of these impossibilities included violating the laws of physics (doubling transistor density without running into heat issues on the same node), fixing some glaring flaw that made RDNA2 inefficient in performance per die area (sarcasm), and magically getting past bandwidth barriers with an L3 cache only a quarter of the size.

No one thought of these potential and obvious roadblocks, which should have tempered expectations (at least in the AMD fanbase). It should have been obvious that AMD was doing something very cheap in terms of silicon, which meant IPC (gaming performance per teraflop) was going way down. More than doubling performance without increasing die area, on the same node, with some severe bottlenecks in place, is impossible, particularly given how well RDNA2 was designed.

But people ignored this, and all I can blame is the tribalism effect: people just following each other, believing a YouTuber, without evaluating whether it's true. Simultaneously not listening to valid reasoning, because they would rather feel good and be stupid.

AMD have form for doing it.

RV670 -> RV730 is a great example of the old high end being on par with the new low end.
Also RV770 -> Juniper (5770) was similar but that had a die shrink as well.

So the idea that N33 could be to N21 as RV730 was to RV670 is not exactly without precedent. I also remember the 4000 series hype train and that was a lot like RDNA 3 except it had a payoff at the end. If only AMD had made a 1,200 shader version to take the performance crown, they would have probably gained a lot more market share and who knows what things would look like now.
 
Confirmed. NVIDIA does this too

But I don't think it's part of the pricing process, but rather to get sentiment from the reviewers and to identify potential issues (mostly technical in nature)
This is the first GPU I've seen where journalists are stating that AMD asked them for feedback about their originally $299 MSRP.
It's a new thing by the sounds of it, and a good thing, since most journalists are writing for the consumer, and consumers are their revenue stream.

How much faster is the 6650XT vs the 6600?

About 20%

 
This is the first GPU I've seen where journalists are stating that AMD asked them for feedback about their originally $299 MSRP.
It's a new thing by the sounds of it, and a good thing, since most journalists are writing for the consumer, and consumers are their revenue stream.



About 20%

So I am going to bring back my initial thought: if the 6600 exists, why are people not using that as the comparison? Are we all AMD employees who know for a fact that they are not releasing a 7600 XT?
 
where journalists are stating
It's basically standard and has happened all the time, for decades. Some reviewers are happy to give feedback, others would rather keep everything secret until publishing. Some companies accept the feedback, some are too lazy to do anything with it, some feel offended.
 
At $270 the 7600 is worse than last gen's $330 3060 12GB, let alone the upcoming $300 4060.
Until you watch the Level 1 video. What you are saying is not true for the purpose of this card. Of course, then come the DLSS and ray tracing arguments.
 
The name change was definitely earlier. We've been seeing box leaks without the XT for a while now (weeks).
Yeah that's what I was referencing in the second paragraph - I can't remember exactly when the box leaks first hit, IIRC it was at the beginning of this month when the 40-series was announced, not launched.

The cards themselves have been in production for months, so dropping the XT from the name in the last few weeks is definitely a bit of a last-minute move in the grand scheme of things.
 
since most journalists are writing for the consumer, and consumers are their revenue stream.
While that is certainly true, I also consider samples received from vendors an opportunity to provide my feedback that doesn't get lost in multiple layers of support, and that eventually will benefit the consumer because the next product might be better.
 
Look at it this way, if stagnation is the name of the game then your GPU purchase will last longer!

Keeping a GPU for 6 years was madness a decade ago. Now it's normal.
People forget this far too easily.

Stalling performance gen-to-gen enabled me to get 150,- for a 6-year-old GTX 1080. Back in the Kepler days I bought a 780 Ti for 175,-, and that was 3 years post-release... it's like a completely different world.

Now more than ever does it REALLY matter that you buy cards that are specced for longer usage. So... yep... here he is again... with enough VRAM...

Buying anything midrange, by that notion, is actually off the table unless you love burning money for mediocre-to-subpar gaming performance.
Basically, high end or bust, regardless of your budget. Anything x70 Ti and below is basically pointless at green, and anything below 16GB is pointless on RDNA2/3. This was my conclusion the moment the x70s were released. Lower wasn't getting better perf/$... so you should stay far away from it, simple enough. Reselling midrange more than 2-3 years after its release is difficult, and if you do, its value has plummeted, barely making it worth the effort.

Look at the current Ampere situation and you know gen-to-gen upgrading or reselling is an absolute no-go. Nobody wants last gen's performance and feature set at the current gen's price, and you don't want to halve the price of your Ampere GPU either, because you haven't owned it long enough. You're literally stuck between a rock and a hard place then, whether you have 8GB or more.
 
So I am going to bring back my initial thought: if the 6600 exists, why are people not using that as the comparison? Are we all AMD employees who know for a fact that they are not releasing a 7600 XT?
The 6600 is being used as a comparison. It's a $199 part with a free game and the 7600 is 35% more expensive and only 24% faster, therefore worse value for money and not very interesting. Also, there's been no mention of a free game with the 7600, though I cannot see retail listings yet to confirm or deny that.

Why would you buy the 7600 unless you wanted to pay extra just for the AV1 encoding - and if you're the sort of person willing to pay extra for an encoder, why on earth are you looking at AMD cards? NVENC has a significant advantage still across all three major codecs.

As for knowing for a fact that they are not releasing the 7600XT, that's been covered plenty in this thread already, but we DO know for a fact that this is full-die Navi33. You don't need to be an AMD employee to know that there's nothing they can release in this product family with more shaders, memory channels, or bus-width. At best they can double the VRAM and overclock the snot out of it, but that'll hurt the power efficiency and it's not exactly rocking the boat in that department at the moment so killing the power efficiency just ruins the one small victory it has over the RDNA2 cards.
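As a back-of-the-envelope sanity check on that value claim, using the prices and the ~24% uplift cited in this thread, the perf-per-dollar math works out like this (a sketch, not an official comparison):

```python
# Rough perf-per-dollar comparison using the figures quoted in the thread.
rx6600_price = 199          # 6600 street price (with a free game)
rx7600_price = 269          # 7600 launch MSRP
uplift = 1.24               # 7600 is ~24% faster than the 6600

price_ratio = rx7600_price / rx6600_price   # ~1.35, i.e. ~35% more expensive
value_ratio = uplift / price_ratio          # performance gained per dollar spent

print(f"{(price_ratio - 1) * 100:.0f}% more expensive")   # 35%
print(f"{(value_ratio - 1) * 100:.0f}% perf/$ vs 6600")   # about -8%
```

So even with the full uplift, the 7600 delivers roughly 8% less performance per dollar than the discounted 6600.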
 
How much faster is the 6650XT vs the 6600?
Basically the same as the 7600, minus 3-4%. I can get a 6650 XT for $226 - a sale from yesterday, and it's still in stock. Normally it wouldn't last long, but people are quite saturated with cheap 8 GB GPUs.
 
So I am going to bring back my initial thought: if the 6600 exists, why are people not using that as the comparison? Are we all AMD employees who know for a fact that they are not releasing a 7600 XT?

Here's the deal: this card was meant to be a 7600 XT all along. Realizing its performance, they knocked it down one tier and shipped the full die as a non-XT anyway. Then they decided to pitch it against the 6600 and not the 6650 XT, because the entire industry would have laughed at them had it not been for that SKU knockdown + price cut. A 20% gain is marketable; a 4% one is not. No one would care.

Also looks like the Gigabyte card went up for sale (pre-order?) in the first EU store for... 460 euros :roll:

Geizhals link
 
MSRPs for the 6000-series were universally criticised in every review as BS, and that entire generation failed to hold MSRP even before the ETH mining boom was over, simply because older-gen cards like the 5700 XT were much better miners.

AMD tried to cash-grab because of the mining boom, and all it did was make their lineup look overpriced at launch, and get scathing reviews because of the high MSRP.

Now that their RDNA3 cards are underwhelming and underperforming, they're going back to their unjustifiably (and unrealistically) high MSRPs from the bad old days and waving those around as their defense! HUB on YouTube does great monthly GPU pricing updates, and in data going back nearly three years you can see that the real price of the 6600 has been under $250 (Newegg) for about 9 months since ETH mining ended, and under $220 for the last 7 months. That $330 price was bogus from day one, even at the height of ETH mining, so you cannot trust the marketing lies!


Correct, but MLID was citing a discussion with AMD partners and being told that the margins were still very healthy at $269.
If that's true, then $300 was just a pure cash grab trying to do the bare minimum to bring anything new to the market.

The disappointing performance (it's a lot less than the 10-15% faster than a 6650XT that everyone expected) means that they have to adjust the price downwards to compete with all the unsold new stock at $250 and all of the used cards at under $200. Nvidia's not their competitor, Nvidia is riding the exploitation train this gen - The real competition is all their own RDNA2 stock both on the store shelves and also the used market.
I don't disagree one bit. The only point I was trying to make is that the trend of prices going downward is a good thing for consumers. I think part of the issue is a crap ton of features that probably aren't needed or used (often) in a lower-end card. RT is usually meh at this level anyway, the AI stuff eats up transistors, etc. If you converted all those transistors strictly to gaming FPS and spent about $10 on x16 PCIe and a 256-bit bus (even if the gains are negligible), it would probably gain a decent amount of performance overall (FPS-wise).

I guess my question is: what would be a great mid-ish card for a great price today, with rampant greedflation?

And before you answer, remember: the ATI HD 4850 came out in 2008 for ~$200 USD, and it was a pretty well-acclaimed mid-ish card for its increase in performance over the previous generation.

So 15 years later, what would the expectation be for a mid-ish card and price? An RX 6700 XT for $250 USD? (I mean obviously from a consumer standpoint, because we all know what companies would like to charge.)
 
I guess my question is: what would be a great mid-ish card for a great price today, with rampant greedflation?

And before you answer, remember: the ATI HD 4850 came out in 2008 for ~$200 USD, and it was a pretty well-acclaimed mid-ish card for its increase in performance over the previous generation.

So 15 years later, what would the expectation be for a mid-ish card and price? An RX 6700 XT for $250 USD? (I mean obviously from a consumer standpoint, because we all know what companies would like to charge.)

One of the first gaming GPUs I owned was the GTX 275, which was a slightly lower-cost version of the GTX 285 on a GTX 260 PCB + memory configuration. It came with a fully enabled GT200B (470 mm²), a 219-watt power spec, and a 448-bit memory interface; you can't even argue that it was a cheap, low-complexity PCB. It was effectively a single-core GTX 295, with the same 896 MB and everything. MSRP was $249 back in the day.


Adjusting for US inflation, it'd be a $350 card today.



Let's throw in rising R&D costs, the cost of the best, newest components to build such a card... and let's be really generous here: it'd be a $499 GPU at best today. And remember, back then they shipped free game copies on actual physical discs (I still have the physical copy of GRID that came with it), tons of display adapters, power cables, quality packaging, etc., none of which are expenses they have today.
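For anyone wanting to redo the inflation math on that $249 MSRP, here's a minimal sketch; the ~40% cumulative US CPI increase from mid-2009 is an assumption, and a proper calculation would pull the exact CPI-U figures from the BLS:

```python
# Inflation adjustment sketch for the GTX 275's $249 MSRP.
# The ~40% cumulative US inflation figure (2009 -> today) is an assumption;
# use actual BLS CPI-U data for the exact months for a real calculation.
msrp_2009 = 249
cumulative_inflation = 1.40  # assumed ~40% since mid-2009

adjusted = msrp_2009 * cumulative_inflation
print(f"${adjusted:.0f} in today's dollars")  # roughly $349
```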
 
I have a 2060 6GB. Is it worth upgrading to this 7600, or is another GPU better?

I only play at 1080p and single-player games.
 
I have a 2060 6GB. Is it worth upgrading to this 7600, or is another GPU better?

I only play at 1080p and single-player games.
Go for a 6700 XT/4070, or wait to see if an RX 7700 is ever released.
 
I have a 2060 6GB. Is it worth upgrading to this 7600, or is another GPU better?

I only play at 1080p and single-player games.

I would wait.

I would not be surprised if 7600 drops in price when 4060 releases. I would not be surprised if there is a 16GB variant in the wings either.

If you really need something right now, then the 6700 XT or 6800 XT are very good options depending on budget. The 6800 is also pretty good, and in the UK it can be picked up for £460 right now - just 11% more than the 4060 Ti goes for, and you get 16GB of VRAM and a better-performing part overall.
 
I have a 2060 6GB. Is it worth upgrading to this 7600, or is another GPU better?

I only play at 1080p and single-player games.

If the 2060 is still breathing life into your gaming performance targets/tolerance, stick with it for a while longer and see how things shape up over the next few months (price cuts, newer model releases, increased VRAM SKUs, etc).

If you've already hit the graphics tolerance threshold and need a replacement ASAP: although the 7600 sees a decent performance uplift over the 2060, I wouldn't touch it at the given asking price and skimped spec sheet. There are similar-performing previous-gen models available for "less" money, or for a little more cash on top there are better-performing, more credibly specced offerings to carry forward. The latter is the more fitting option for a more appealing upgrade, but that's down to your budget.
 
I don't disagree one bit. The only point I was trying to make is that the trend of prices going downward is a good thing for consumers. I think part of the issue is a crap ton of features that probably aren't needed or used (often) in a lower-end card. RT is usually meh at this level anyway, the AI stuff eats up transistors, etc. If you converted all those transistors strictly to gaming FPS and spent about $10 on x16 PCIe and a 256-bit bus (even if the gains are negligible), it would probably gain a decent amount of performance overall (FPS-wise).

I guess my question is: what would be a great mid-ish card for a great price today, with rampant greedflation?

And before you answer, remember: the ATI HD 4850 came out in 2008 for ~$200 USD, and it was a pretty well-acclaimed mid-ish card for its increase in performance over the previous generation.

So 15 years later, what would the expectation be for a mid-ish card and price? An RX 6700 XT for $250 USD? (I mean obviously from a consumer standpoint, because we all know what companies would like to charge.)
Mid-tier cards were traditionally half the resources of the flagship - so when the flagships are 384-bit, 24GB cards with ~6,000 shaders, the mid-tier cards should be 192-bit, 12GB cards with ~3,000 shaders. That's not a million miles away from the 6700 XT, and probably what Navi 32 will be - but the outlier is the cost; $200 in 2008 is $340 in today's market, adjusting for inflation and tariffs only.

It's telling that AMD almost tried to sell the entry-level 7600(XT) at $330 before hastily renaming it and reducing the price twice to $269. We should have been getting the 7700XT for that sort of money if 'rampant greedflation' wasn't gripping the GPU market.
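The "half the flagship" rule of thumb above can be sketched directly; the flagship figures assume the 7900 XTX's actual specs (384-bit bus, 24 GB, 6,144 shaders):

```python
# "Half the flagship" rule of thumb for a traditional mid-tier spec.
flagship = {"bus_bits": 384, "vram_gb": 24, "shaders": 6144}  # 7900 XTX class

# Halve each resource to get the mid-tier target.
mid_tier = {k: v // 2 for k, v in flagship.items()}
print(mid_tier)  # {'bus_bits': 192, 'vram_gb': 12, 'shaders': 3072}
```

Which lands very close to the 6700 XT's 192-bit bus and 12 GB, as the post notes.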
 
What this proves for the first time since RDNA3 launched, is that RDNA3 is worthless.

We have a near perfect comparison with the 6650XT - the only significant difference being the architecture - and it achieves precisely nothing.
Same thing I thought. RDNA 3 IPC is only about 7-10% higher at best. I think this round they mostly prioritized decoupling the chip into chiplets and making that work architecturally, rather than going for maximum performance and efficiency.
 
is it worth upgrading to this 7600, or is another GPU better?
You must be really hypersensitive to notice this ~40 percent performance uplift. As in, you now play at ~30 FPS, and +40 percent gives you a "whopping" 42 FPS, which is the opposite of impressive.

The minimum reasonable upgrade is twice the performance of your current card. Otherwise you're just investing money in bull cake.

Wait till cards like 6900 XT or 3080-12 (or even 4070 Ti) become affordable for you and make a purchase. There is no such thing as excessive performance.

Oh and make sure your PSU can handle these extremely high TDP monsters. I really hate modern GPUs for being like that.
 