
AMD Confirms New "Enthusiast-class" Radeon 7000-series Graphics Cards This Quarter

A 7950 XTX?
Maybe the originally rumored multi-GPU-die package?
2 × 96 CUs, running at 2 GHz?
Hell yes... fill in the full stack, AMD.
 
Sucks that there are still only minor leaks about the RX 7800 series. That's what interests me.

How about making a proper RX 7970 instead?
Funny that I have a Matrix HD 7970 and its triple-slot cooler doesn't look so beefy these days. :laugh:
 
Prices on the 6700/6800/6950 cards have actually inched back up a bit over the last 2 months compared to the sales they were having in May, so I guess they've sold through the excess inventory and need something to sell.
 
They should "unlaunch" RDNA3 and re-release the cards like this:
7900 XTX to 7900 XT (799€)
7900 XT to 7800 XT (599€)
7900 GRE to 7800 (499€)

People tend to forget that new hardware is supposed to be faster than the old without you paying 1:1 for the performance increase... otherwise, compared to an FX 5900 Ultra, a 4090 would cost 30 grand.
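A quick back-of-the-envelope sketch of that point in C (the ~60x performance ratio is back-derived from the poster's own "30 grand" figure, not from any benchmark):

```c
#include <stdio.h>

/* "Paying 1:1 for performance" illustrated: if price scaled linearly with
   performance, what would a 4090 cost next to an FX 5900 Ultra?
   The ~60x ratio is back-derived from the poster's "30 grand" figure and
   the FX 5900 Ultra's ~$499 launch price; it is illustrative only. */
int main(void) {
    const double fx5900_price = 499.0; /* approx. 2003 launch MSRP, USD */
    const double perf_ratio   = 60.0;  /* hypothetical 4090 vs FX 5900 gap */

    printf("Implied 1:1 price for a 4090: ~$%.0f\n",
           fx5900_price * perf_ratio); /* ~$29940, i.e. roughly 30 grand */
    return 0;
}
```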
 
If it's true, I wonder if nVidia won't revive the 4090 Ti. :p
 
I have a theory that we are going to a four-year cycle in gaming GPUs. I believe that Ada Next might be just a refresh of Ada, with cards that fix some of the current line's issues. Maybe this is one more reason why Nvidia released such a flawed series: to make the 5000 series look superior to the 4000 series while being the same architecture. So maybe the 4090 Ti will be the 5090 with a 512-bit data bus, the 4080 Ti will end up as the 5080, and all the cards underneath will get extra CUDA cores and VRAM to look superior to the current line of 4060s and 4070s. AMD did that in the past, in their gloom-and-doom days when they didn't have much money to spend, when they went from the 7000 series to the 200 series.

Why am I saying this? Because AMD is not really in a hurry to come out with these new RDNA3 cards. Maybe they are not better than the 6000 series, or are even worse in some cases (video playback power usage); maybe they are more expensive to produce while offering nothing really new in performance; maybe there are still too many 6000-series cards on the market. We have all speculated about those reasons. But what if they KNOW they have time? What if they know that Nvidia will greatly decelerate gaming card development? We see low-end graphics cards that are 10 or more years old still being sold on the market. Who says we will keep getting a TRUE new architecture every two years? In the past we were expecting a new architecture every year, or even sooner than that. Now we are on a two-year cycle. Who says we can't go to a four-year cycle with a simple refresh in the middle of that period?

Maybe, but they risk running into the same problem as the current 4000 series, where the 3000 series is just more attractive. We're at a performance point where there's not much further to go: we're getting high refresh rates at 4K, and bumping the resolution further will require a lot more compute and doesn't seem worth it, not only in processing but also in the storage needed for all those high-res assets.

Before we go to a four-year cycle there's also the three-year option, which could work, but they could also jump to something similar to what CPUs did with a "tick-tock" of a new architecture followed by a refined, improved version (4000 → 5000 in this case).

Hopefully on Ada Next/the 5000 series/whatever they remember to implement DP 2.1, unlike on the 4000 series. That was a dick move that is also helping monitors justify staying on the older standard, in what looks like a chicken-and-egg problem.
 
Maybe, but they risk running into the same problem as the current 4000 series, where the 3000 series is just more attractive.
I don't see it as a problem for Nvidia, if the buyer ends up paying Nvidia anyway.

If companies like Nvidia have hit a ceiling and can't produce much stronger GPUs without building something that needs a 2 kW PSU and a 360 mm water cooler just to work, they will have to hit the brakes and slow down. They've done it already this last decade, going from a one-year cycle to a two-year cycle, but I guess they have to slow down even more. We have probably reached the limit of current technologies for gaming GPUs.
 
You do realize the price of everything does in fact slowly go up over time, right? Older generations almost always fare better when it comes to perf/dollar.
You do realize that perf/$ is supposed to improve, right? Not stay the same or, worse, decrease between generations?

"but muh inflation/price increases" completely deflects the argument that price gouging has become normalized.
I could compare today's generation to the ATI 4000 and GTX 200 series in terms of perf/price, but it would be totally worthless unless it's adjusted for how prices have changed since then.
You could absolutely do that, and the current gen would blow the doors off of those cards.

Doesn't change the reality that this gen has been a largely sideways move in perf/$.
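For what it's worth, "sideways perf/$" looks like this in numbers (a minimal C sketch; the prices and performance figures are hypothetical placeholders, not benchmark data):

```c
#include <stdio.h>

/* "Sideways perf/$" illustrated: if the new card is ~30% faster but also
   ~30% more expensive, performance per dollar barely moves.
   All numbers are hypothetical placeholders, not benchmark data. */
int main(void) {
    const double old_perf = 100.0, old_price = 499.0; /* last-gen card */
    const double new_perf = 130.0, new_price = 649.0; /* new-gen card */

    printf("old perf/$: %.4f\n", old_perf / old_price); /* ~0.2004 */
    printf("new perf/$: %.4f\n", new_perf / new_price); /* ~0.2003 */
    return 0;
}
```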

I have a theory that we are going to a four-year cycle in gaming GPUs. I believe that Ada Next might be just a refresh of Ada, with cards that fix some of the current line's issues. Maybe this is one more reason why Nvidia released such a flawed series: to make the 5000 series look superior to the 4000 series while being the same architecture. So maybe the 4090 Ti will be the 5090 with a 512-bit data bus, the 4080 Ti will end up as the 5080, and all the cards underneath will get extra CUDA cores and VRAM to look superior to the current line of 4060s and 4070s. AMD did that in the past, in their gloom-and-doom days when they didn't have much money to spend, when they went from the 7000 series to the 200 series.

Why am I saying this? Because AMD is not really in a hurry to come out with these new RDNA3 cards. Maybe they are not better than the 6000 series, or are even worse in some cases (video playback power usage); maybe they are more expensive to produce while offering nothing really new in performance; maybe there are still too many 6000-series cards on the market. We have all speculated about those reasons. But what if they KNOW they have time? What if they know that Nvidia will greatly decelerate gaming card development? We see low-end graphics cards that are 10 or more years old still being sold on the market. Who says we will keep getting a TRUE new architecture every two years? In the past we were expecting a new architecture every year, or even sooner than that. Now we are on a two-year cycle. Who says we can't go to a four-year cycle with a simple refresh in the middle of that period?
That would be a very dumb move. Yes, AI is the current moneymaker, but gaming is still a multi-billion-dollar industry, and unlike EVGA, nVidia isn't dumb enough to abandon a lucrative market. They are putting the resources into new architectures anyway; not making gaming versions of them is leaving money on the table.
 
I have a theory that we are going to a four-year cycle in gaming GPUs. I believe that Ada Next might be just a refresh of Ada, with cards that fix some of the current line's issues. Maybe this is one more reason why Nvidia released such a flawed series: to make the 5000 series look superior to the 4000 series while being the same architecture. So maybe the 4090 Ti will be the 5090 with a 512-bit data bus, the 4080 Ti will end up as the 5080, and all the cards underneath will get extra CUDA cores and VRAM to look superior to the current line of 4060s and 4070s. AMD did that in the past, in their gloom-and-doom days when they didn't have much money to spend, when they went from the 7000 series to the 200 series.

Why am I saying this? Because AMD is not really in a hurry to come out with these new RDNA3 cards. Maybe they are not better than the 6000 series, or are even worse in some cases (video playback power usage); maybe they are more expensive to produce while offering nothing really new in performance; maybe there are still too many 6000-series cards on the market. We have all speculated about those reasons. But what if they KNOW they have time? What if they know that Nvidia will greatly decelerate gaming card development? We see low-end graphics cards that are 10 or more years old still being sold on the market. Who says we will keep getting a TRUE new architecture every two years? In the past we were expecting a new architecture every year, or even sooner than that. Now we are on a two-year cycle. Who says we can't go to a four-year cycle with a simple refresh in the middle of that period?
I think that cutting-edge node availability and excess inventory are the real determinants of the slower releases in the current era: since the TSMC 28 nm issues, the only real new architectures we got on the same node were Maxwell and RDNA 2, IIRC? It seems that has become the exception rather than the norm. (Even Apple could only do a refresh in 2022, since they couldn't use TSMC 3 nm.)

Both AMD and Nvidia had a slower-than-usual rollout because of their overstock, and from what I've seen at my national/continental tech retailers, AMD still has a fair amount of RDNA 2 GPUs to get rid of. AMD scheduled RDNA 4 for a 2024 release, so if Nvidia planned Blackwell for a 2026 release, it would mean they don't think RDNA 4 will be a threat in any shape or form.
Current rumors say that Nvidia isn't planning anything new before 2025... so has the three-year release cycle started? I'm honestly not mad about that. It gives things more time to mature.
[Attached image: AMD RDNA architecture roadmap slide]
 
That looks like 2023 to me, though it could be delayed a bit for sure.
 
That looks like 2023 to me, though it could be delayed a bit for sure.
That's just how AMD launch graphs are; they... shouldn't be read like graphs :D. The first year is the launch of the first product, the last year the launch of the last product.
[Attached image: example AMD launch roadmap graph]
 
Dear AMD (or AMD marketing),
Please define what the hell you mean by an "enthusiast-class" card. :confused:
$999 was the price of the last-generation enthusiast-class card you sold, the RX 6900 XT. What the hell is the 7900 XTX, then, if it's not an "enthusiast-class" card?
 
Dear AMD (or AMD marketing),
Please define what the hell you mean by an "enthusiast-class" card. :confused:
$999 was the price of the last-generation enthusiast-class card you sold, the RX 6900 XT. What the hell is the 7900 XTX, then, if it's not an "enthusiast-class" card?
I'm enthusiastic about everything - so it can mean anything, really. :laugh:
 
It wasn't clock speeds IIRC; it probably had something to do with drivers?

Maybe they fixed the software issues finally!
It's both. They missed their clock targets, and the driver can't see dual-issue opportunities staring it in the face. Honestly, fixing the clock issues, if they had the time and resources to do it, would be easier than fixing the compiler. Compilers are dumber than expert human programmers at optimizing hot code.
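To make the dual-issue point concrete, here's a minimal C sketch (assuming RDNA 3's VOPD pairing of two independent FP32 ops, which is exactly the kind of opportunity the compiler has to find):

```c
/* Two independent FP32 multiplies: a compiler aware of RDNA 3's VOPD
   dual-issue encoding could pair these into a single issue slot, since
   neither result depends on the other. */
float pairable(float a, float b, float c, float d) {
    float x = a * b; /* op 1 */
    float y = c * d; /* op 2: independent of x, so it can co-issue */
    return x + y;
}

/* A dependent chain: op 2 needs op 1's result, so nothing can be paired
   and the second issue slot goes unused. */
float not_pairable(float a, float b, float c) {
    float x = a * b; /* op 1 */
    return x * c;    /* op 2: must wait for x */
}
```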
 
But doesn't enthusiast mean a 7950 XTX? I mean, the term "enthusiast" is reserved for the people who invest the most...
When I google "enthusiast GPU", I get the 6900 XT, 7900 XTX, 3090, and 4090 as results...
Or does Lisa mean the 7900 GRE?

I think people read too much into it.
 
You do realize that perf/$ is supposed to improve, right? Not stay the same or, worse, decrease between generations?

"but muh inflation/price increases" completely deflects the argument that price gouging has become normalized.
Yes, you should get more performance each generation for the same money. I'd say at least +20-30% for one GPU generation, or for a CPU over two generations or two years.
Nvidia gave literally >60-70% more raster performance with the 4090, but every class under that gets about 10% less per tier: the 4080 is ~50%, the 4070 Ti ~40%, the 4070 ~30%, and the 4060 Ti... 5-10%... you get why the 4060 Ti is bad... :D

Inflation IS very high right now: what cost $100 in 2020 is now around $118, that is 18% inflation... So a 3080 that cost $699 would cost ~$825 now.
In that case, the $100 more for a 4070 DOES make sense. Since the 4060 Ti stayed at the same $399, you can see how much they are saving with that card.
The 4070 Ti was supposed to be the REAL 4080, with a $100 higher price + a $100 inflation adjustment, making it $899 up from $699. The current 4080 16 GB is something in between... they should have called it the 4080 Ti, but Jensen wanted to save the Ti moniker for later, for after AMD released their products.
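A quick sanity check of that 3080 number (a minimal C sketch using the poster's ~18% figure, which is an estimate, not an official CPI number):

```c
#include <stdio.h>

/* Inflation-adjustment sketch using the ~18%-since-2020 figure from the
   post above (the poster's estimate, not an official CPI number). */
int main(void) {
    const double inflation = 1.18;  /* $100 in 2020 ~= $118 now */
    const double msrp_3080 = 699.0; /* RTX 3080 launch MSRP, USD */

    printf("3080 MSRP in today's dollars: ~$%.0f\n",
           msrp_3080 * inflation); /* ~$825 */
    return 0;
}
```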
 
Aren't they referring to the Navi 32 cards, the RX 7700 12 GB and the RX 7800 16 GB?
We're yet to see anything with that chip, and they'll have taken at least 10 months between the release of the first N31 card and the first N32 card, which is really weird.
 
Could be an improved RDNA3, as it seems RDNA3 didn't achieve its performance targets.
 
I hope it's the 7800 series. A 7950 XTX would be dumb: an overclocked 7900 XTX will never catch the 4090, and it would only increase prices further for no reason.

The 7900 series has been called the "Ultra-Enthusiast" series by AMD reps for a long time, so the "Enthusiast" line will be the 7800 series.
 
Aren't they referring to the Navi 32 cards, the RX 7700 12 GB and the RX 7800 16 GB?
We're yet to see anything with that chip, and they'll have taken at least 10 months between the release of the first N31 card and the first N32 card, which is really weird.

If the 7700 12 GB and 7800 16 GB are considered "enthusiast" according to AMD, then what is the 7600 8 GB? High-end? Ultra-high-end? :rolleyes:
 
I think that cutting-edge node availability and excess inventory are the real determinants of the slower releases in the current era: since the TSMC 28 nm issues, the only real new architectures we got on the same node were Maxwell and RDNA 2, IIRC? It seems that has become the exception rather than the norm. (Even Apple could only do a refresh in 2022, since they couldn't use TSMC 3 nm.)

Both AMD and Nvidia had a slower-than-usual rollout because of their overstock, and from what I've seen at my national/continental tech retailers, AMD still has a fair amount of RDNA 2 GPUs to get rid of. AMD scheduled RDNA 4 for a 2024 release, so if Nvidia planned Blackwell for a 2026 release, it would mean they don't think RDNA 4 will be a threat in any shape or form.
Current rumors say that Nvidia isn't planning anything new before 2025... so has the three-year release cycle started? I'm honestly not mad about that. It gives things more time to mature.
Turing was also on the same node as Pascal; 12 nm was essentially a tweaked TSMC 16 nm.
 