
GPU Pricing and Performance

How did we get here? How can a new-gen High End card (say a 5060) be worse in performance than an Enthusiast Class (say a 3080) from 2 generations prior? It just seems wrong.

Am I nuts for thinking so?
 
I could spend many words on this, but fundamentally it's because they can charge whatever they want and people will still pay. What are people going to do, not buy stuff? Not gonna happen.
 
Stagnant market and the fact that it gets harder and harder to develop new nodes and get more architectural performance gains gen over gen. NV is at fault for keeping the prices high, sure, but they aren’t sandbagging on performance, at least I don’t think they are. The times when a new x60 card was matching or beating a previous architecture top dog are gone and they are not coming back.
 
Yeah, monkey see, monkey do. Late stage capitalism 101.
 
Most of the gains will come from using a new node. Who knows what they will use for the 6000 series, 3nm or 2nm, but the jump will be bigger than it was going from 4000 to 5000, though not as big as going from 3000 to 4000.
 
but they aren’t sandbagging on performance, at least I don’t think they are. The times when a new x60 card was matching or beating a previous architecture top dog are gone and they are not coming back.

Why are they gone? Because Nvidia has made it so.
 
Nitpicking: The 5060 is high end? Back in the day, we used to call *60s "midrange."

Stagnant market and the fact that it gets harder and harder to develop new nodes and get more architectural performance gains gen over gen. NV is at fault for keeping the prices high, sure, but they aren’t sandbagging on performance, at least I don’t think they are. The times when a new x60 card was matching or beating a previous architecture top dog are gone and they are not coming back.
People keep assuming growth to be infinitely linear, despite every historical, real life example proving that this is impossible.

Node shrinks are one thing, but avenues for improving the logic itself are not infinite either. Lack of competition most likely has an impact, but even with it, it's not guaranteed we would get to the good ol' days. At the very least, not without some revolutionary breakthrough in electronics, software/math, or both.

Why are they gone? Because Nvidia has made it so.
Accusations are easy, proving them rarely is.
 
How can a new-gen High End card (say a 5060) be worse in performance than an Enthusiast Class (say a 3080) from 2 generations prior? It just seems wrong.

The 5060 is entry level now. 1060 ~ 980 + 2GB; 2060 ~ 1080 minus 2GB; 3060 ~ 2070 + 4GB; 4060 ~ 2080; 5060 ~ 2080 Ti minus 3GB. At that rate the 6060 will provide 3080 performance. Nothing wrong with it, just awkward or inconsistent.
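Spelling that ladder out, here's a throwaway Python sketch; the pairings are lifted straight from the list above, and the 6060 line is pure extrapolation, not a real product or benchmark:

# Rough "which older card does each new x60 roughly match" ladder from the post above.
x60_equivalents = {
    "GTX 1060": "GTX 980 (+2 GB)",
    "RTX 2060": "GTX 1080 (-2 GB)",
    "RTX 3060": "RTX 2070 (+4 GB)",
    "RTX 4060": "RTX 2080",
    "RTX 5060": "RTX 2080 Ti (-3 GB)",
    "RTX 6060": "RTX 3080 (extrapolated, not a real product)",
}

for new_card, old_card in x60_equivalents.items():
    print(f"{new_card:9s} ~ {old_card}")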
 
Why are they gone? Because Nvidia has made it so.
Yes, because famously NV is the only player in the market and the 7600 was beating the 6900XT or maybe the 6800XT. Oh wait, that has not happened. The LAST time we had a mid-range GPU match or overtake the flagship parts was Pascal in 2016. That was 10 years ago. And that had the benefit of both a new node and a new architecture. It was a once-in-a-lifetime jump. Since then, development and R&D just got harder. The closest we have come since then is, I suppose, the base 9070 matching the 7900XT and the 9070XT overtaking it, but both are hardly "mid-range" cards even at the fantastical MSRP.

People keep assuming growth to be infinitely linear, despite every historical, real life example proving that this is impossible.

Node shrinks are one thing, but avenues for improving the logic itself are not infinite either. Lack of competition most likely has an impact, but even with it, it's not guaranteed we would get to the good ol' days. At the very least, not without some revolutionary breakthrough in electronics, software/math, or both.
People seem to memory-hole the fact that this has ALREADY happened with CPUs. Core count growth and innovations like 3D cache happen exactly because growing ST performance just got harder and harder. Remember, in the mid-2000s Intel was planning for 10 GHz single-core CPUs and software developers were expecting ST performance to grow more and more and MT techniques were seen as niche to the extreme. It never panned out.
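As a quick illustration of why MT was seen as such a poor substitute back then, here is a tiny Amdahl's-law sketch in Python; the 70% parallel fraction is an arbitrary example value, not a measurement of any real workload:

# Amdahl's law: overall speedup = 1 / ((1 - p) + p / n),
# where p is the parallelisable fraction of the work and n is the core count.
def amdahl_speedup(p: float, n: int) -> float:
    return 1.0 / ((1.0 - p) + p / n)

# p = 0.70 is an arbitrary illustrative value, not a measured figure.
p = 0.70
for cores in (2, 4, 8, 16, 64):
    print(f"{cores:3d} cores -> {amdahl_speedup(p, cores):.2f}x speedup")
# Even with 64 cores the speedup only approaches 1 / (1 - p) ~ 3.3x,
# which is why ever-faster single-thread performance was the preferred path.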
 
software developers were expecting ST performance to grow more and more and MT techniques were seen as niche to the extreme. It never panned out.
Nvidia seems to have learned from that, at least. Shifting to ML and RTRT/PT is a good strategy. Newer, largely unexplored fields = a greater probability of finding that one optimisation trick that doubles your throughput and gives you those %%% you can plaster all over your presentation slides/product packaging.

We've been doing (practical) raster graphics for more than half a century now. People also tend to forget that.
 
Accusations are easy, proving them rarely is.
I'm just saying Nvidia could have done it.
Yes, because famously NV is the only player in the market and the 7600 was beating the 6900XT or maybe the 6800XT. Oh wait, that has not happened. The LAST time we had a mid-range GPU match or overtake the flagship parts was Pascal in 2016. That was 10 years ago. And that had the benefit of both a new node and a new architecture. It was a once-in-a-lifetime jump. Since then, development and R&D just got harder. The closest we have come since then is, I suppose, the base 9070 matching the 7900XT and the 9070XT overtaking it, but both are hardly "mid-range" cards even at the fantastical MSRP.

AMD is a follower. Also I'm just saying things could have been different and the main reason they're not is money.
 
AMD is a follower. Also I'm just saying things could have been different and the main reason they're not is money.
Yes, the for-profit company is being conscious of its bottom-line. In other news - sky is blue, water is wet and bald people have no hair.
 
What we have here is a failuya to communicate....

"So we get what we had here last week, which is the way he wants it"......."And he gets it" -The Captain to Cool hand Luke
 
Yes, the for-profit company is being conscious of its bottom-line. In other news - sky is blue, water is wet and bald people have no hair.

And hence my first statement, "what are people going to do?" How much profit is too much profit? There is no such thing. Nvidia could have dropped the prices on all their gaming GPUs a whole bunch without it having a significant impact on their bottom line, in exchange for goodwill from gamers. But they don't have to: no matter how expensive their stuff is, people will buy it and say please and thank you, and only losers (forum posters) will complain. In a way it's beautiful; I do like it when gamers get shafted and angry.
 
The 5060 is entry level now. 1060 ~ 980 + 2GB; 2060 ~ 1080 minus 2GB; 3060 ~ 2070 + 4GB; 4060 ~ 2080; 5060 ~ 2080 Ti minus 3GB. At that rate the 6060 will provide 3080 performance. Nothing wrong with it, just awkward or inconsistent.
So it's really the pricing that is out of whack with the performance, combined with a sort of strange numbering scheme?
 
How did we get here? How can a new-gen High End card (say a 5060) be worse in performance than an Enthusiast Class (say a 3080) from 2 generations prior? It just seems wrong.
The market is stagnating, and generational uplift is getting harder and harder to deliver, and less compelling for NVIDIA to chase, whereas AMD is still providing compelling amounts of gen-over-gen uplift (but again, they're going to follow the market leader; they don't seem to want to try and exceed the market leader again). The fact remains that the whole market, GPU performance uplift per gen, etc. are all stagnating to a halt. And pricing is increasing, mostly thanks to a combination of my homeland of cheeseburgerland wrecking MANY people's things, NVIDIA not really wanting to price stuff well/correctly because they don't really care and it'll sell anyway (seriously, every reviewer had next to nothing nice to say about the 5070 and it's still sold out lol), and a lack of competition for NVIDIA. AMD and Intel can try as much as they want, but NVIDIA is really too big to fail now unless they misstep hard.

Also, AI AI AI AI AI AI AI AI AI AI AI. Sorry, had to meet the buzzword quota.
Am I nuts for thinking so?
You're not the only one.
 
I think the card is worse than or similar to my previous Radeon 6600XT. I see it as a basic graphics card with maybe 4 display outputs and dedicated memory, maybe a few encoders and decoders, and no PhysX. Such stuff belongs in the below-100€ range with 3 fans, and in the below-40€ range with one or two fans. I bought the 6600XT when it was new - the performance is dated. About as dated as the Nvidia 3070, which was available at that time with 8 GiB and too little VRAM for that period. I bought AMD as I instantly saw the VRAM was too low. The Radeon 6600XT was a bad product for its high price of 420€ during the mining hype.

-- Almost everyone buys NVIDIA. I bought 3 AMD graphics cards (6600XT / 6800 non-XT / 7800XT) as the first owner. When 95% of cards sold are Nvidia, the market is what it is.

-- I would not buy anything below a Radeon 9070. In 2023 I bought a 7800XT; that card is also dated. Anything below that is a very bad price-to-performance deal for windows 11 pro gaming.

-- You may read those gaming card / graphics card topics. Most often the topic poster then buys an Nvidia graphics card anyway. Regardless of .. .. .. .. .. ..
The few users who use software for windows 11 pro which only supports Nvidia graphics cards don't make the situation any better. AMD could have spent money, effort, or hardware to get more software support.
 
If I were to buy today, I would not get any card with below 16 GB of VRAM. And I would feel better about my purchase if it had 20+ GB.

Maybe the Super/Ti series 5080 will rectify that.
 
I think the card is worse than or similar to my previous Radeon 6600XT. I see it as a basic graphics card with maybe 4 display outputs and dedicated memory, maybe a few encoders and decoders, and no PhysX. Such stuff belongs in the below-100€ range with 3 fans, and in the below-40€ range with one or two fans. I bought the 6600XT when it was new - the performance is dated. About as dated as the Nvidia 3070, which was available at that time with 8 GiB and too little VRAM for that period. I bought AMD as I instantly saw the VRAM was too low. The Radeon 6600XT was a bad product for its high price of 420€ during the mining hype.

-- Almost everyone buys NVIDIA. I bought 3 AMD graphics cards (6600XT / 6800 non-XT / 7800XT) as the first owner. When 95% of cards sold are Nvidia, the market is what it is.

-- I would not buy anything below a Radeon 9070. In 2023 I bought a 7800XT; that card is also dated. Anything below that is a very bad price-to-performance deal for windows 11 pro gaming.

-- You may read those gaming card / graphics card topics. Most often the topic poster then buys an Nvidia graphics card anyway. Regardless of .. .. .. .. .. ..
The few users who use software for windows 11 pro which only supports Nvidia graphics cards don't make the situation any better. AMD could have spent money, effort, or hardware to get more software support.
No offence, but you are just a tech snob/elitist.
Take it how you want, but your preferences aren't reflective of average real-world needs and use cases in general.
 
I think the card is worse than or similar to my previous Radeon 6600XT. I see it as a basic graphics card with maybe 4 display outputs and dedicated memory, maybe a few encoders and decoders, and no PhysX. Such stuff belongs in the below-100€ range with 3 fans, and in the below-40€ range with one or two fans. I bought the 6600XT when it was new - the performance is dated.
Christ, the takes are getting more unhinged by the minute. 100 euros/dollars wouldn't even cover the BOM on a triple fan 6600XT, even leaving aside that a triple-fan cooler is unnecessary on such a low-TBP card. 40 euros for a dual-fan one? Are you genuinely insane?

I would not buy anything below a Radeon 9070. In 2023 I bought a 7800XT; that card is also dated. Anything below that is a very bad price-to-performance deal for windows 11 pro gaming.
A 2023 card being outdated in 2023 sure is a take. And what in the name of God does Windows 11 have to do with anything? And what is "pro gaming"? Do you attend many CS Majors, by any chance? Because that's the only type of "pro gaming" I know. Or did you mean Pro as in Windows 11 Professional? Because that makes even less sense - for gaming, all Windows editions are effectively identical.

The few users who use software for windows 11 pro which only supports Nvidia graphics cards don't make the situation any better. AMD could have spent money, effort, or hardware to get more software support.
So you also don't understand what CUDA is or what it does. Cool. It's not about "just" software support. AMD has been working on improving and cracking into the professional workload market for years now, and there is a very good reason why they struggle. It's not easy.
Also, saying that "a few" users use GPUs for compute and/or rendering professionally is… a statement.
 
Nitpicking: The 5060 is high end? Back in the day, we used to call *60s "midrange."


People keep assuming growth to be infinitely linear, despite every historical, real life example proving that this is impossible.

Node shrinks are one thing, but avenues for improving the logic itself are not infinite either. Lack of competition most likely has an impact, but even with it, it's not guaranteed we would get to the good ol' days. At the very least, not without some revolutionary breakthrough in electronics, software/math, or both.


Accusations are easy, proving them rarely is.
That may be true, but what you see in most industries is that when improvements plateau, market entry for new players gets easier and products cheaper.
 
That may be true, but what you see in most industries is that when improvements plateau, market entry for new players gets easier and products cheaper.
Semiconductor chips are an exception to the rule - R&D gets exponentially more expensive, and so do new nodes. The latest player to enter, based on the rumors, is Xiaomi, with plans to design their own SoCs, and they are not what I would call a poor upstart company. Prior to them we had Apple doing the same, and, well, same thing. A new, unknown player just stepping in and starting to make competitive CPUs/GPUs/SoCs is simply not feasible these days.
 
100 euros/dollars wouldn’t even cover the BOM on a triple fan 6600XT

Whataboutism: have you ever read those iPhone BOM articles? Are you saying a graphics card costs more than an iPhone? Just to give you an idea about costs.
43€ - far overpriced, the seller earns a lot: https://geizhals.at/inno3d-geforce-....html?hloc=at&hloc=de&hloc=eu&hloc=pl&hloc=uk
100€ is generous - I considered that already.

Or is that graphics card so advanced? https://www.amd.com/en/products/pro...tion-9004-and-8004-series/amd-epyc-9684x.html
 
@_roman_
You realize that NV and AMD sell the GPU and memory package to AIBs for a set price, right? That price for the 6600XT is nowhere near a point where a card could be sold for 100 dollars, let alone 40. And that is disregarding everything else the AIB has to make themselves, like the cooling and the PCB. I have no idea what the GT 710 even has to do with any of it; it wasn't the card you brought up. And the raw BOM isn't the only thing that counts toward the final pricing. The very idea of a factory-new 6600XT for 40 dollars is in itself laughable to anyone who even remotely understands the market.

tl;dr - Meds, now.
 