
When will GPU prices return to normal?

I am not looking at them - you brought them into the discussion.

The 6800 XT is double the performance of the 6600.
Uhm... Your memory seems to be failing. Someone compared its value to the RTX 3050, after which you jumped in with "it's still terrible value", which you "argued for" by quoting some silly 2160p benchmarks.

Also, the 6800 XT is more than double the price of the 6600. So, if it's more than double the price, for double the performance, that's actually worse value, no?
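To make that arithmetic explicit, here's a trivial perf-per-dollar sketch in Python. The prices and relative-performance numbers are illustrative placeholders only, not figures from any review:

```python
def perf_per_dollar(perf: float, price: float) -> float:
    """Relative performance per dollar: higher is better."""
    return perf / price

# Placeholder numbers for illustration: plug in a review's
# relative-performance summary and current street prices.
rx_6600   = perf_per_dollar(perf=100, price=250)  # baseline
rx_6800xt = perf_per_dollar(perf=200, price=550)  # ~2x perf, >2x price

print(f"RX 6600:    {rx_6600:.3f} perf/$")   # 0.400
print(f"RX 6800 XT: {rx_6800xt:.3f} perf/$") # 0.364
```

If the price more than doubles while performance only doubles, perf/$ falls, i.e. the value is worse.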
 
While I mostly agree with you overall, Pascal was a generation dominated by a stretching and shifting of product segments - with the "60 tier" stretching into not one, not two, but ... four? five? GPUs? The RTX-GTX split ensured that, with the sudden appearance of 1660, 1660 Ti, 1660 Super, 2060, and 2060 Super. And arguably a sixth with the 2060 12GB, though that was much later. The 1660 slotted in at the traditional "60-tier" price level, while the 2060 was the one everyone actually noticed, as it was presented as far more attractive (and performed much better despite nominally sharing a tier). So things got confusing real fast there. One would think Ampere would smooth that out with GTX cards disappearing, but it sure doesn't seem that way.
Yeah, to be 100% honest, I completely forgot those cards even existed. Those cards, though, were still in the 300-350 euro range at launch, so that's way off from the "$179-229 price range" and "should cost no more than $200" nonsense he's spouting.
 
Uhm... Your memory seems to be failing. Someone compared its value to the RTX 3050, after which you jumped in with "it's still terrible value", which you "argued for" by quoting some silly 2160p benchmarks.

Also, the 6800 XT is more than double the price of the 6600. So, if it's more than double the price, for double the performance, that's actually worse value, no?

The 6800 XT is a better value. The 6600 is a piece of junk because it's also a badly designed product, and its price makes things much worse.

[attached: Control benchmark charts from the linked reviews]

AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble - Control | TechPowerUp
MSI Radeon RX 6600 XT Gaming X Review - Control | TechPowerUp
 
You’re a halo user. The market isn’t there yet. The vast (vaaaaaast) majority of users don’t play at 4K and, even if they did, all GPUs are garbage (and that’s not entirely true, as you get much better performance margins, according to reviews, with their caveats, with more expensive products). All of your claims about the economics underlying the actual limits of technology for your xxxtreme requirements don’t have any actual foundation in how the economy is organized. You are designed to be fucked by the market. That’s consumer choice.
 
The 6800 XT is a better value. The 6600 is a piece of junk because it's also a badly designed product, and its price makes things much worse.

[attached: Control benchmark charts from the linked reviews]
AMD Radeon RX 6800 XT Review - NVIDIA is in Trouble - Control | TechPowerUp
MSI Radeon RX 6600 XT Gaming X Review - Control | TechPowerUp
You seem awfully fond of using Control as an illustration; why might that be? Oh, right, it's a major outlier that greatly exaggerates the performance difference between these two GPUs. Go figure. What you're saying here is pure nonsense. The 6600 is much better value than the 6800 XT.
 
You’re a halo user. The market isn’t there yet. The vast (vaaaaaast) majority of users don’t play at 4K and, even if they did, all GPUs are garbage (and that’s not entirely true, as you get much better performance margins, according to reviews, with their caveats, with more expensive products). All of your claims about the economics underlying the actual limits of technology for your xxxtreme requirements don’t have any actual foundation in how the economy is organized. You are designed to be fucked by the market. That’s consumer choice.

That's not true. Every card above the Radeon RX 6800 - the 6800 XT, 6900 XT, and 6950 XT - not to mention the new generation due in several weeks, supports 4K gaming perfectly fine.
It's just that the 6600/XT is heavily limited, be it by low VRAM, low shader performance, low PCIe bandwidth, etc.
 
That's not true. Every card above the Radeon RX 6800 - the 6800 XT, 6900 XT, and 6950 XT - not to mention the new generation due in several weeks, supports 4K gaming perfectly fine.
It's just that the 6600/XT is heavily limited, be it by low VRAM, low shader performance, low PCIe bandwidth, etc.
... or, maybe, it's the entirely common and normal fact that as you step down in price, you also step down in performance, and that 2160p is still unequivocally a high-end resolution? You keep going "waa, waa, the 6600 is terrible value because it doesn't game well at 2160p", yet ... value compared to what? There is nothing that provides a notably better value at that resolution, only things that deliver better absolute performance. Value is a function of price and performance. You're using the word 'value' in a way that just doesn't make sense. You want more than GPUs of that class can deliver. That's fine, but it doesn't make them poor value, it just makes them unsuited to your use. And, again, your examples are ludicrously obvious cherry-picking. The base RX 6600 does decently at 2160p for its class, even at stupid Ultra settings:
[chart: RX 6600 average fps at 3840x2160]

The 3060 is marginally faster, with the 3060 Ti pulling clearly ahead, just barely cracking 60fps average.
[chart: average fps at 3840x2160, 3060 / 3060 Ti comparison]

What does this tell us?
- That the 6600 can play 2160p perfectly fine as long as it's not forced to run at Ultra. If it does 40fps average at Ultra, it'll do >60 at medium-high easily. That doesn't mean it'll hit 60 in the outlier titles, but then, seeing how the 3070 didn't crack 30 in CP2077, that's hardly surprising.
- For its price, it's a decently competitive card even at 2160p despite RDNA2 not scaling that well to higher resolutions. You need to step up to the much more expensive 3060 Ti for a meaningful increase.
- The class of performance that you seem to be asking for is simply not where this card sits in the product stack, and it is priced accordingly. This makes your claims about it being poor value fall apart entirely.
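To put a rough number on the Ultra-to-medium estimate in the first bullet above, here's a minimal sketch; the 1.5x scaling factor is an assumption for illustration and varies by game and engine:

```python
# Back-of-the-envelope preset scaling. The 1.5x Ultra -> medium-high
# gain is an assumed illustrative figure; the real gain is game- and
# engine-dependent.
ULTRA_TO_MEDIUM_GAIN = 1.5

def estimated_medium_fps(ultra_fps: float) -> float:
    return ultra_fps * ULTRA_TO_MEDIUM_GAIN

print(estimated_medium_fps(40))  # 60.0 -> in line with the bullet above
```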
 
... or, maybe, it's the entirely common and normal fact that as you step down in price, you also step down in performance, and that 2160p is still unequivocally a high-end resolution? You keep going "waa, waa, the 6600 is terrible value because it doesn't game well at 2160p", yet ... value compared to what? There is nothing that provides a notably better value at that resolution, only things that deliver better absolute performance. Value is a function of price and performance. You're using the word 'value' in a way that just doesn't make sense. You want more than GPUs of that class can deliver. That's fine, but it doesn't make them poor value, it just makes them unsuited to your use. And, again, your examples are ludicrously obvious cherry-picking. The base RX 6600 does decently at 2160p for its class, even at stupid Ultra settings:
[chart: RX 6600 average fps at 3840x2160]

The 3060 is marginally faster, with the 3060 Ti pulling clearly ahead, just barely cracking 60fps average.
[chart: average fps at 3840x2160, 3060 / 3060 Ti comparison]

What does this tell us?
- That the 6600 can play 2160p perfectly fine as long as it's not forced to run at Ultra. If it does 40fps average at Ultra, it'll do >60 at medium-high easily. That doesn't mean it'll hit 60 in the outlier titles, but then, seeing how the 3070 didn't crack 30 in CP2077, that's hardly surprising.
- For its price, it's a decently competitive card even at 2160p despite RDNA2 not scaling that well to higher resolutions. You need to step up to the much more expensive 3060 Ti for a meaningful increase.
- The class of performance that you seem to be asking for is simply not where this card sits in the product stack, and it is priced accordingly. This makes your claims about it being poor value fall apart entirely.

Yeah, let's agree that the 6600/XT is a card for 1080p medium.
Which is pathetic for its current price range.
 
I'm too much of a GPU elitist to really chime in on this one, but Navi 23, while not really a 4K card, shouldn't put on too ugly a show at 1440p targeting 60 fps, imo. The 6650 XT (fully enabled and enhanced core) might even handle some of the lighter games at 4K 60 just fine, I guess. The mobile RTX 3050 (so GA107 with 16 SMs out of 20 enabled), at 80W, generally does 1080p60 perfectly well in most games, the 4 GB VRAM being its real issue IMO. The RX 6600 can't really be worse than that.
 
I'm still waiting for a GTX 970 level of perf/$$$
 
Yeah, let's agree that the 6600/XT is a card for 1080p medium.
Which is pathetic for its current price range.
Like ... what? Man, your reality distortion field is strong, clearly.

First off: The 6600 and 6600 XT are not "a card", they are two distinct SKUs with quite distinct performance levels.

Second:
[chart: RX 6600 average fps at 1920x1080]

That's the 6600, not the XT. 1080p medium, you say? At what refresh rate, 240Hz? 'Cause that review suite quite clearly shows it delivering an average of 114fps at Ultra. It's still well above 60fps average at 1440p.

You're talking out of your rear end here, and your claims about "value" are just plain ridiculous. Please just stop making a fool out of yourself.
 
I'm still waiting for a GTX 970 level of perf/$$$

That's a difficult metric to compare to at this point, since none of the benchmarks from the 970 launch are in use anymore. But here I go anyway. If we look at the 970's $330 launch price, the only card that matches right now (in the US) is the 6600 XT, which averages over 100fps (and at least 60) in TPU's test battery at 1080p, and nearly 100 at 1440p (most games above 60). It falls on its face a bit at 4K, but so did the 970. The power envelope is even similar, at 160W for the Radeon vs. the GeForce's 150W.

Granted, the 6600 XT isn't as close to the high-end cards as the 970 was in its day. But the ceiling has been raised by a TON over the last couple of generations. Top single-GPU dog that gen was the 980 Ti, a 250W design. The 3080 Ti by contrast pulls 350W, while a 3090 Ti will suck down around 450W. If we limit ourselves to the TDP of the 980 Ti, the highest current-gen performer is the 6800 on the AMD side and the 3070 for Nvidia (yes, it's actually 220W, but the next-step-up 3070 Ti is 290W). By the TPU 980 Ti FE review, the 970 was 25-30% behind overall. The 6600 XT is 25-50% down at 1080/1440 depending on whether you're comparing the 3070 or 6800; the 6800 pulls further ahead as resolution scales, and is actually double at 4K.

TPU wasn't doing their AVGfps chart back when Maxwell launched, so I quickly made my own, choosing 1440p because someone inevitably scoffs at 1080p and 4K isn't a reasonable target at this price point. The dashed line represents the ~99fps average of the 6600 XT @ 1440p. Looking at the two side by side now, I'd conclude that the 6600 XT is actually a better P/P card than the GTX 970.

[charts: homemade 1440p average-fps comparison - GTX 970 launch-era suite vs. RX 6600 XT, dashed line at ~99fps]


TL;DR: If you're looking for high-end-adjacent performance for $350 or less, that's probably never happening. If what you want is equivalent or better performance in current titles for that same money, AMD has a 6600 XT they'd love to sell you.
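For anyone who wants to redo this comparison, the arithmetic is just average fps divided by launch price. A minimal sketch: the GTX 970 fps value is a hypothetical placeholder to fill in from a launch-era 1440p review, while the 6600 XT's ~$330 / ~99fps figures come from the post above:

```python
# (launch price USD, average fps at 1440p) pairs.
cards = {
    "GTX 970 (2014 suite)":    (330, 60.0),  # fps value is a placeholder
    "RX 6600 XT (2022 suite)": (330, 99.0),  # figures cited above
}

for name, (price_usd, avg_fps) in cards.items():
    print(f"{name}: {avg_fps / price_usd:.3f} fps/$")
```

Same money, each card measured against the games of its own day.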

What about ray tracing? Can't it enable it?

If one searches for a reason to dislike something, one will always find it. The 6600 and XT suck by the metrics that matter to you. We get it. They're excellent cards for users with other priorities. Personally, I'd love to have either one.
 
If one searches for a reason to dislike something, one will always find it. The 6600 and XT suck by the metrics that matter to you. We get it. They're excellent cards for users with other priorities. Personally, I'd love to have either one.

Actually, the problem that I see is the one that you mentioned - the gigantic performance difference between it and the higher end cards.

It is a very risky purchase, with no future-proofing.
New games will launch, and you will observe under 50 FPS everywhere regardless of the settings.

Granted, the 6600 XT isn't as close to the high-end cards as the 970 was in its day. But the ceiling has been raised by a TON over the last couple of generations. Top single-GPU dog that gen was the 980 Ti, a 250W design. The 3080 Ti by contrast pulls 350W, while a 3090 Ti will suck down around 450W. If we limit ourselves to the TDP of the 980 Ti, the highest current-gen performer is the 6800 on the AMD side and the 3070 for Nvidia (yes, it's actually 220W, but the next-step-up 3070 Ti is 290W). By the TPU 980 Ti FE review, the 970 was 25-30% behind overall. The 6600 XT is 25-50% down at 1080/1440 depending on whether you're comparing the 3070 or 6800; the 6800 pulls further ahead as resolution scales, and is actually double at 4K.
 
Actually, the problem that I see is the one that you mentioned - the gigantic performance difference between it and the higher end cards.
As the post you quoted said, you're looking - and very, very hard - for a reason to dislike this card. Is it further behind the top than the 970 was? Yes, because the 970 was a third-tier card, while the 6600 is 6th (if you count only the first round of RDNA2 SKUs) on AMD's tier list. This has been talked about for years: as more resolutions become usable, the range of what counts as "usable" performance widens considerably, necessitating an increasing number of SKUs - especially as production costs also rise as we come ever closer to running into various production/engineering/lithography walls that have yet to be worked around. Does the existence of many more SKUs make the 6600 a worse deal? Not at all, as those SKUs are much more expensive, and much worse value, even if they're also faster. The 6600, let alone the XT, is a great deal, delivering excellent value for money and great performance at 1080p and 1440p, and nothing of what you're saying comes even close to being an argument against that, let alone a convincing one.
It is a very risky purchase, with no future-proofing.
No GPU is future proof in any way, shape or form. This is nonsense.
New games will launch, and you will observe under 50 FPS everywhere regardless of the settings.
As has been the case with every GPU ever made. As time passes, its ability to keep up with new launches will diminish. There is no reason to expect the 6600 or 6600 XT to be outliers in this regard.
You are trying really, really, really hard here, so I guess kudos for the effort if nothing else? Sadly it isn't working though. You can't expect passable RT performance at this price level. I mean, you could buy a 3050 - which costs more in many places - and get ... let's see:
[chart: Watch Dogs: Legion RT, 1920x1080]

Oh, right, it's worse than both the 6600 and 6600 XT. So, to get passable 1080p RT performance in that title, you need a 6700 XT or 3060 Ti - both of which are also much more expensive. So ... the value proposition is still there if you have to pay more to get more, no?

Are you perhaps seeing a pattern here? Something like "if you want more, you pay more"? 'Cause that's what your examples are illustrating - no matter how much work you put into picking as selectively as possible or framing them in extremely biased ways. Also, I thought you didn't care about anything but 2160p? So why are you looking at 1080p testing, all of a sudden? Oh, right, you're trying desperately to cherry-pick a defense of your ludicrous "this is a 1080p medium card" stance, right. 'Cause when you said that, what you meant was "this card can't handle RT". Makes perfect sense.
 
The recent back-and-forth conversation in this thread between two TPU forum participants is a good illustration of several points.

You can typically make a stance (about graphics cards) by cherry picking through various benchmark results. That's one reason why I like 10-game, 15-game, 20-game average scores. While I never play all of the games, the averages do even out any particular game's preferred architecture or idiosyncrasies.
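As an aside on how such multi-game summaries can be built: averaging per-game relative scores is a one-liner, and a geometric mean is often preferred for ratios because a 2x win and a 2x loss cancel out, which an arithmetic mean doesn't do. A toy sketch with made-up fps numbers:

```python
from statistics import fmean, geometric_mean

# Made-up per-game fps for two hypothetical cards; the third
# title is a deliberate outlier (think Control in this thread).
card_a = [114, 98, 41, 77, 130]
card_b = [100, 95, 80, 70, 110]

# Per-game relative score of A vs. B, summarized two ways.
rel = [a / b for a, b in zip(card_a, card_b)]
print(f"arithmetic mean: {fmean(rel):.3f}")
print(f"geometric mean:  {geometric_mean(rel):.3f}")  # symmetric in ratio space
```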

This leads us to another point: most people play more than one game over their ownership of a given gaming device. It's possible to build a single-purpose PC optimized for one game (like Hollow Knight), but that's not a real-world use case for Joe Consumer or even the typical DIY PC builder.

There are details that these game average FPS scores don't always reveal in an obvious way.

One example is display resolution. One manufacturer's cards don't do so well at higher resolutions compared to the equivalent competition.

Ray tracing performance is superior in one. DLSS is available in one. It's important to note that graphics card reviews generally run separate tests for these two features.

From a pure rasterization standpoint, AMD cards offer better value (performance per dollar), particularly at lower gameplay resolutions. However, if you play games that use ray tracing (more are added as time goes on) and take advantage of DLSS (same), there are benefits to Nvidia cards that AMD's current lineup doesn't offer.

There are other weird little features that might favor an Ampere card. I happen to use Nvidia Broadcast for cleaning up live audio and video. The Tensor cores apparently do most of the heavy lifting here. While this undoubtedly is not in everyone's use case, it is a real-world task that my recent GeForce cards can offer.

In the end, the best strategy is to buy from a reputable merchant with a reasonable return policy and to use the card heavily in its expected usage situations during that return window to determine whether or not the product works for your specific needs.

I don't play Cyberpunk 2077 so I really don't care about cards that perform exceptionally well with that title. That's why the aggregate average game scores are more useful than a single game comparison. I do play Control but I'm certainly not going to base a graphics card purchase on that one game benchmark.
 
When GPU prices return to normal I shall be 84ish.
 
A report from graphics card channel dealers published by MyDrivers states that ASUS, Gigabyte, and MSI are having a difficult time trying to convince retailers and distributors to buy Radeon RX 6000 series cards for sale in the consumer segment. The reason is the same as with NVIDIA's GPUs, which saw a huge price jump during the mining boom but whose pricing has now plummeted. AMD's Radeon RX 6000 series graphics card prices have fallen even harder than NVIDIA's, and there is little to no demand for gaming cards right now.

GPU Price Crash Is Making It Hard For AIBs To Offload AMD Radeon Graphics Cards Too, RX 6700 XT Drops Below $400 US, RX 6600 Below $260 US (wccftech.com)
 
Actually, the problem that I see is the one that you mentioned - the gigantic performance difference between it and the higher end cards.

It is a very risky purchase, with no future-proofing.
New games will launch, and you will observe under 50 FPS everywhere regardless of the settings.

I agree that there's a problem, but disagree on what that problem is. Both manufacturers have been raising the bar on what's considered high-end by pushing power consumption. I'm personally of a mind that 300W+ for graphics is absurd. This particular arms race has gotten way out of hand. The 980 Ti (since I was already talking about it above), the biggest, baddest card of its day that wasn't two cards bolted to one board, launched at $650. Its successor stuck with the same power envelope, wiped the floor with it performance-wise, and asked $700. Today, that power envelope will get you another 50% performance on top of that in a 6800 or 3070, and cost you $600-700. Price/performance isn't the issue, IMO. It's expectations.
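Indexing everything to the 980 Ti makes that point concrete. The performance indices below are rough placeholders reflecting the post's claims (same ~250W envelope, a floor-wiping successor, another ~50% on top today), not measured numbers:

```python
# (name, performance index vs. 980 Ti = 1.0, envelope W, launch USD).
# Indices and the flat 250 W envelope are illustrative assumptions.
cards = [
    ("980 Ti (2015)",               1.0, 250, 650),
    ("Its same-envelope successor", 1.7, 250, 700),  # placeholder index
    ("6800 / 3070-class (today)",   2.5, 250, 650),  # placeholder index
]

for name, perf, watts, usd in cards:
    print(f"{name}: perf/W = {perf / watts:.4f}, perf/$ = {perf / usd:.4f}")
```

By either ratio the newer cards come out ahead at a fixed envelope; what has moved is the price and power of the very top of the stack.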
 
...and? Is this supposed to support your claim that the 6600 (and XT) is poor value? 'Cause "no demand" is not the same as "everyone thinks things are too expensive". After all, a huge portion of the glut of products now is due to overproduction after a period of unprecedented demand, which is now being followed by (increased) recession and economic anxiety across much of the wealthier parts of the world.
 
...and? Is this supposed to support your claim that the 6600 (and XT) is poor value? 'Cause "no demand" is not the same as "everyone thinks things are too expensive".

There is no bad product; there is only a bad price.

Give me the Radeon RX 6800 XT for 300 euro, and I will create that demand right now and buy the card.
 
There is no bad product; there is only a bad price.

Give me the Radeon RX 6800 XT for 300 euro, and I will create that demand right now and buy the card.

I want one at these charity prices too, big guy. Get in line, I'll flash my cred as a Vanguard beta tester group member and say I need it more than you :eek:

The recent back-and-forth conversation in this thread between two TPU forum participants is a good illustration of several points.

You can typically make a stance (about graphics cards) by cherry picking through various benchmark results. That's one reason why I like 10-game, 15-game, 20-game average scores. While I never play all of the games, the averages do even out any particular game's preferred architecture or idiosyncrasies.

This leads us to another point: most people play more than one game over their ownership of a given gaming device. It's possible to build a single-purpose PC optimized for one game (like Hollow Knight), but that's not a real-world use case for Joe Consumer or even the typical DIY PC builder.

There are details that these game average FPS scores don't always reveal in an obvious way.

One example is display resolution. One manufacturer's cards don't do so well at higher resolutions compared to the equivalent competition.

Ray tracing performance is superior in one. DLSS is available in one. It's important to note that graphics card reviews generally run separate tests for these two features.

From a pure rasterization standpoint, AMD cards offer better value (performance per dollar), particularly at lower gameplay resolutions. However, if you play games that use ray tracing (more are added as time goes on) and take advantage of DLSS (same), there are benefits to Nvidia cards that AMD's current lineup doesn't offer.

There are other weird little features that might favor an Ampere card. I happen to use Nvidia Broadcast for cleaning up live audio and video. The Tensor cores apparently do most of the heavy lifting here. While this undoubtedly is not in everyone's use case, it is a real-world task that my recent GeForce cards can offer.

In the end, the best strategy is to buy from a reputable merchant with a reasonable return policy and to use the card heavily in its expected usage situations during that return window to determine whether or not the product works for your specific needs.

I don't play Cyberpunk 2077 so I really don't care about cards that perform exceptionally well with that title. That's why the aggregate average game scores are more useful than a single game comparison. I do play Control but I'm certainly not going to base a graphics card purchase on that one game benchmark.

I agree. At the end of the day, having an abundance of options to pick from is an excellent thing. There is always a product that will fit your personal needs best.
 
With the decrease in crypto, a lot of garbage ex-miner cards are being sold on eBay right now; most are labeled honestly as "nonfunctional", "parts", or "repair". With that said, a good portion of the "used" and "refurbished" cards are probably nonfunctional as well.
 