
AMD to Skip a Radeon RX 7700 Series Launch For Now, Prioritize RX 7600 Series, Computex Unveiling Expected

Yet there are 9 different models of the 6800 XT still available on Newegg, while the 3080, the card it originally competed with, is all sold out. Same with the 6950 XT: tons of stock, yet its original competitor, the 3090 Ti, is all sold out. So it's more like AMD is being forced to drop prices because nobody wants the 6800 XT through 6950 XT.
Well, that depends.

If AMD is still making Navi 21 chips and Nvidia is no longer making any GA102 dies, one can still be in stock while selling more than the other.

I'm not saying one sold more than the other, but the evidence you base your argument on is flawed.

Also, when a new generation gets released, prices have to drop because the price/performance ratio drops with it. You can't sell a GPU for $1,000 if the competitor sells a newer, more powerful GPU for $1,000.
 
No need for AMD to launch a 7700 or 7800 when their current offerings beat the competition in that segment.
You could say the same thing about the 7600.

6600 currently sells for as low as $199 and the 6650 XT for as low as $255 (both of which include a $60 game), so I doubt an entry-level RDNA3 card is going to beat those two in value while still being able to make a profit.

I'd rather they continue selling the 6600/6650 at those prices and focus on the upper midrange segment, which badly needs help (the 4060 Ti and 4070 are overpriced, and the 6800 XT and 6900 need a replacement with lower power consumption).
 
There's a huge market left dry by both Nvidia and Intel - and any hardware survey will show it quite clearly:

sub-$300 dGPU buyers waiting for a compelling upgrade over their RX 480, RX 580, GTX 1060 and GTX 1650 cards.

All they want is a card that isn't 7 years old, and obviously the RX 6500 XT and RTX 3050 failed to interest them, likely due to a lack of VRAM, poor performance/$, or awkward PCIe lane-count handicaps.
And then you see a 4060 Ti with 8 GB already. A 6 GB x50 Ti next?

Yes, it absolutely does - by failing to meet its performance target while demanding more energy than the 4070 does. Remember, price is about where the negatives of the 4070 end. The power efficiency is off the charts; it's the first single-8-pin GPU since the 1080!
It's not the most efficient Ada GPU; the 4080 is.

Also, a 200 W x70 isn't a novelty; Ampere was just gloriously shite.
 
AMD doesn't engage in big price competition because it's pointless for now. Right now, AMD's mindshare just isn't high enough, and Nvidia's mindshare is simply dominant.


People who are willing to buy AMD will do it whether the GPU is 10-15% cheaper or 50-60% cheaper. Waging a price war just to get the same number of people buying their products doesn't really make sense. Would a much cheaper GPU really move the needle? I mean, they were always the best bang for the buck and people still chose Nvidia.

If people were far more willing to buy AMD, a price war would make sense. But right now, not so much.

As is often said, of all the people wanting cheaper prices from AMD, the vast majority just want that so they can buy their Nvidia cards for cheaper.
Yeah, that tends to happen when you cede the high end of your market to Nvidia for years on end, arrive nearly 2 years late with non-competitive products (Vega, Fury, etc.) and gain yourself a reputation for terrible support (one they are still fighting to this day, often an own goal, as seen with the recent RDNA2 driver fiasco).
Oh jeez, someone fails to understand the facts.
My business pays a higher tariff percentage than the GPU business does, yet our prices have not increased anywhere near the GPU jump.
Our manufacturing costs have also gone up, but again nowhere near the jump of the GPUs.
Our two best years in business were 2020 & 2021, just like AMD (a 70% increase in Q2 2022 over the previous year's Q2) & Nvidia (record-breaking profits, up 46% from Q1 2022).
And we are in a far more competitive business than GPUs (more brands), yet we have never been sued for price fixing like Nvidia & AMD (ATI) have.
It is also worth pointing out that Nvidia's margins peaked with the 4090 release, and since then margins have been on a steady decline:


Nvidia, at the moment, is seeing margins on par with Pascal's launch in 2016. Not bad margins by any means, but hardly in a position to make major price cuts. Clearly the price floor of the market has shifted. It's not just Nvidia, and there is not much they can do about it.
So you're asking them to do what Nvidia is doing, and to do it better. Do you think that's reasonable?
Yes. We live in a capitalistic society. If AMD wishes to make billions of dollars from GPU sales, it will need to compete with the company that is currently dominant. That typically means offering a better product for a lower cost.
AMD has more leeway to drop prices than Nvidia. In fact, prices on the 7900 XT are dropping exactly because resellers have more margin on AMD cards than on Nvidia ones. The chiplet approach allows them to keep prices competitive in a recession. I think they have the right long-term strategy: keep healthy growth and healthy margins as a company. Should they spend 4x more to make something stronger than the 4090? I don't think that would be a good strategy, even if many users want it.
The current strategy of giving Nvidia an uncontested halo card every generation has done diddly squat for AMD. The last time their market share was impressive, they had a halo competitor. That's not a coincidence.

I just explained to you that there is no reason to believe they couldn't make a card as fast as a 4070; that's absurd, I don't know why you insist on it. And the power consumption is hardly relevant: a 4070 is roughly 80% of a 7900 XT according to TPU, so extrapolating linearly, 80% of the power consumption would put an AMD equivalent at around 255 W (probably even less, because of fewer memory chips).
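The extrapolation above is just linear scaling. A quick sketch of the arithmetic, using an assumed ~320 W gaming draw for the 7900 XT (a round illustrative number, not an official spec):

```python
# Linear power-scaling estimate from the post above.
# Assumptions (illustrative, not official specs): the 7900 XT draws ~320 W
# in gaming, and the 4070 delivers ~80% of its performance per TPU's
# relative-performance chart.
xt_power_w = 320
perf_ratio = 0.80

# If performance-per-watt stayed constant, an AMD card matching the 4070's
# performance would need roughly this much power:
estimated_power_w = xt_power_w * perf_ratio
print(round(estimated_power_w))  # 256, close to the ~255 W quoted
```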

No one would care about 50 W, certainly not to the extent of it being a deal breaker; they would care if it was cheaper. Be real.
I mean, I agree with you, but have you SEEN GPU reviews lately? Half the commenters are whining about 55 W of extra power use on $600+ GPUs (as was the case with the 4070 vs. the 7900 XT recently) and how much it'll cost them. I've had arguments with them in this very forum. Go look at any of the recent GPU review threads, or the forum discussion of the RX 6000 series as a viable 4000-series alternative, and you will find it on the first page.

Power usage is a great selling point to people who are bad at financial math, which is the majority of people. Nvidia heard the complaints about Ampere and reined themselves in a bit with Ada, and the power use has been praised near universally.
 
OTOH, AMD hasn't made much of a profit battling Nvidia either. They've chosen to carve out their own garden instead: consoles.
The dGPU presence on PC is just there to keep the console deal alive at this point. I have no illusions... But console and PC gaming are as close together as they'll ever be.

We need to read between the lines here. AMD has said on multiple occasions that they're in no rush with RT. Look where we are today: they're stoic in that sense; even if the hardware does it, they don't really push it. The same thing happens with GPU availability and pricing/undercutting. They simply don't care anymore. They have the consoles, and therefore command the majority of the gaming market and its direction. Even mobile is closer to a console than to a PC if you look at how it's controlled and therefore which games can be ported over. AMD has found a pretty comfortable spot, I think.
 
Disappointing. I expect nothing out of Navi 33 except Navi 23 on steroids.
It comes out 2 years later with an absurdly uninteresting amount of VRAM, a price which I don't expect to provide much of an upgrade over RDNA2, and performance that, unless some miracle happens, will neither impress nor justify any price above $200.

It should be a simple rule, really:
8 GB VRAM: <$300
12 GB VRAM: <$500
16 GB VRAM: <$800

Low Budget, Minimal, Safe. And "Minimal" starting at $300 is already pretty ridiculous.
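As a sketch, the rule above can be written as a simple lookup; the tier ceilings are this post's opinion, not market data:

```python
# The poster's rule of thumb: VRAM capacity -> maximum reasonable price.
# The ceilings (and the Low Budget / Minimal / Safe tier names) are the
# post's own opinion, nothing official.
MAX_PRICE_BY_VRAM = {8: 300, 12: 500, 16: 800}

def within_rule(vram_gb: int, price_usd: int) -> bool:
    """True if the card's price is under the ceiling for its VRAM tier."""
    ceiling = MAX_PRICE_BY_VRAM.get(vram_gb)
    return ceiling is not None and price_usd < ceiling

print(within_rule(8, 269))   # True: an 8 GB card under $300
print(within_rule(12, 549))  # False: 12 GB but over the $500 ceiling
```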

This isn't even against either brand. I expect that both Nvidia and AMD will produce sub-$300 cards that offer no particular value and no future-proofing of any kind. Just bad buys across the board under $300, with only AMD providing good buys under $500. And that'll be Navi 32, which is apparently being left in the lab for however many more months.

This entire year is a poor one for good buys...
 
OTOH, AMD hasn't made much of a profit battling Nvidia either. They've chosen to carve out their own garden instead: consoles.
The dGPU presence on PC is just there to keep the console deal alive at this point. I have no illusions... But console and PC gaming are as close together as they'll ever be.
Consoles are a major benefit, but the margins on them are VERY thin compared to juicy PC sales. Even with their limited sales, I'd bet good money the actual net income from PC is very close to what the consoles provide.

Back when AMD was genuinely competitive, in the good old days of Evergreen, AMD was making big bucks on GPUs. Their CPU division is what kneecapped them, with the late and slow Phenoms, the cheap alternative that was Phenom II, and the utter trainwreck that was the construction cores (barf). All at the same time as Intel's glorious Core, Nehalem, and Sandy Bridge were strutting up the street with sweet OC performance. AMD's desperation led to the Rebrandeon era, and while thankfully Keller and Su managed to save AMD from failure, the result now is the opposite: the CPU division is proceeding well with strong leadership and clear value in newer products, while the GPU division is floundering, seemingly lost on what to make next.
 
while the GPU division is floundering, seemingly lost on what to make next.
Lol, let's relax with the theatrics please.

RTG is going exactly where Ryzen went: chiplets. They've been beating that drum for 4 years; at this point you'd think everyone's well aware of the plan.
The plan is to chiplet everything until competitors have lost any chance of fighting the scalability of chiplets. The problem is that chiplets in Zen 1 needed Zen+ and later Zen 2 to truly materialise.

RDNA3 is, for now anyway, a total failure. It didn't really challenge Nvidia, has terrible power draw, still lags severely in RT, hasn't got any killer feature that Nvidia doesn't have, etc., and it lost several of RDNA2's strong suits, namely the power draw, the VRAM advantage, and the price-to-performance advantage. The advantages are still present, but less strongly than last gen, although the feature set has generally greatly improved (RT is weak but usable, and AV1 means Nvidia no longer wins by default against AMD's awful AVC encoder...).

The cause of RDNA3's disappointing performance may or may not be chiplets. But rocking one of their cards right now, I can say it's got a good amount of power, but it's... lacking mastery, I suppose, is the term. Power control, drivers, everything necessary to make the card REALLY as great as they promised is severely lacking, even 5 months after launch.
It feels like AMD knows exactly what they want: to scale smaller dies to a ridiculous degree, to not produce one 4090-type card but three 6800-type chips connected into one supercard. This is a sound way of scaling; it's worked for Ryzen, for Threadripper, and for Epyc. But it's hard. It's particularly hard with GPUs, which have extreme latency requirements. AMD is going where they want to go. But for all the value in the hardware, the software, drivers, and general state of AMD just aren't following. Some people who bought XTXs day one are still waiting for official ROCm support, FFS, almost 6 months later. CUDA worked on the 4090 before it even went to reviewers.

AMD has a great goal ahead. What they don't currently have is the ability to support it in a timely manner with the right software and drivers. Hopefully, considering that they went from $9B of revenue in 2020 to $21B in 2022, the money pot they've been feeding is now going to be redistributed to RTG. As you said, RTG helped AMD survive back when the CPUs were sinking hard. Now the CPUs are striking Intel all over the stack. It's time to send the elevator back down to RTG with all that money Ryzen is making. But as usual, the money spent today will show results in 3-4 years, not less.
 
AMD has the advantage of having their hardware in the consoles. But some developers still use the PC as the primary target and then port the game to the consoles. Meanwhile, native console games have just found their way to PC, and they are terribly unoptimized (there's no excuse for the requirements they demand given the graphics they deliver).

Anyway, AMD should push/help developers utilize their hardware better, so they can gain an extra uplift in the PC space.

AMD's RDNA 3 seems to be competitive in UE5 with Lumen/Nanite.
Most games over the next 3-5 years will be developed in UE5. AMD should just not let this slip out of their hands.

And regarding RT, performance, etc.: I read that AMD has to be competitive on all fronts. Yes, that would be nice, but they are not.
By lowering prices, though, they could easily become the first choice for gamers. Nvidia doesn't want to, and I could even say they cannot lower their prices, because they sell the absolute gaming package and that has to be paid back.
 
Focusing on the 6600 replacement in this market makes every bit of sense. They need a real budget option. It needs to be $300 or below. It also needs to offer something, and not just be cut-down everything. It doesn't really matter if performance is just on par with, say, the 6600, as long as they actually cut the price. If they keep the TBP low, in the current market, that is a really, really good thing: a smaller cooler, a lighter VRM, less noise, a smaller power supply. Increasing the VRAM to 12 GB would be really welcome too.
What is supposed to replace the 6600, and what would be an actual improvement in your eyes?
Because I'm all out of ideas for a card with 8 GB of VRAM in a market that is realising just how insufficient that is.
 
AMD has the advantage of having their hardware in the consoles. But some developers still use the PC as the primary target and then port the game to the consoles. Meanwhile, native console games have just found their way to PC, and they are terribly unoptimized (there's no excuse for the requirements they demand given the graphics they deliver).

Anyway, AMD should push/help developers utilize their hardware better, so they can gain an extra uplift in the PC space.

AMD's RDNA 3 seems to be competitive in UE5 with Lumen/Nanite.
Most games over the next 3-5 years will be developed in UE5. AMD should just not let this slip out of their hands.

And regarding RT, performance, etc.: I read that AMD has to be competitive on all fronts. Yes, that would be nice, but they are not.
By lowering prices, though, they could easily become the first choice for gamers. Nvidia doesn't want to, and I could even say they cannot lower their prices, because they sell the absolute gaming package and that has to be paid back.
UE5 will not save AMD any more than Mantle, DX12, or being in the PS4 did. AMD needs better RT hardware.
What is supposed to replace the 6600, and what would be an actual improvement in your eyes?
Because I'm all out of ideas for a card with 8 GB of VRAM in a market that is realising just how insufficient that is.
Ideally, 6700 XT-tier performance with 12 GB needs to be made into a 7600. AKA, what should have happened with the 6600.
 
6600 currently sells for as low as $199 and the 6650 XT for as low as $255 (both of which include a $60 game), so I doubt an entry-level RDNA3 card is going to beat those two in value while still being able to make a profit.
Navi 33, used for the 7600 XT, is smaller than Navi 23, used for the 6600 XT: 204 vs. 237 mm². Manufacturing costs will be lower and performance will be a bit higher (10 to 15%) than the older RDNA2 equivalent.
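To illustrate why the smaller die matters for cost, here's a rough dies-per-wafer estimate using a common first-order approximation (300 mm wafer assumed, defect yield and scribe lines ignored; illustrative numbers, not foundry data):

```python
import math

def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300.0) -> int:
    """First-order dies-per-wafer approximation (ignores yield and scribe lines)."""
    radius = wafer_diameter_mm / 2
    wafer_area = math.pi * radius ** 2
    # Subtract an edge-loss term for partial dies along the wafer circumference.
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

# Smaller die -> more candidate dies per wafer -> lower cost per chip.
print(dies_per_wafer(204))  # Navi 33-sized die
print(dies_per_wafer(237))  # Navi 23-sized die
```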
 
Ideally, 6700 XT-tier performance with 12 GB needs to be made into a 7600. AKA, what should have happened with the 6600.
Impossible, since Navi 33 is on a 128-bit bus, which means 8 GB of VRAM.
That's why I expect nothing out of it.

The RX 7700 XT relies on a 256-bit bus, which means 16 GB. THAT is where the absolute "good budget card" will be, and if it can be sold for $450-500, AMD will make a killing.
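The bus-width-to-VRAM link comes from GDDR6 packaging: each memory chip has a 32-bit interface, and 2 GB per chip is the common density this generation. A minimal sketch of that arithmetic (ignoring clamshell mode, which doubles capacity by putting two chips on each 32-bit channel):

```python
def vram_gb(bus_width_bits: int, gb_per_chip: int = 2) -> int:
    """VRAM implied by bus width, with one 32-bit GDDR6 chip per channel."""
    chips = bus_width_bits // 32  # one chip per 32 bits of bus
    return chips * gb_per_chip

print(vram_gb(128))  # 8 GB  -> Navi 33-class cards
print(vram_gb(192))  # 12 GB -> e.g. the 6700 XT
print(vram_gb(256))  # 16 GB -> the hoped-for 7700 XT configuration
```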
 
The whole GPU market reeks of cartel agreements.
 
The whole GPU market reeks of cartel agreements.
Ridiculous. AMD's been fighting tooth and nail to take down Intel. You think they're going easy on Nvidia? Why?
AMD's position simply isn't good in the GPU space. They are still forced into a policy of deciding which corners to cut.

My guess is that if they want headroom, they need better overall support/software, and IMO that absolutely starts with third-party ROCm and HIP support. Internally, for drivers, they just need more personnel, dammit. AMD feels like they're only acting out the fight, because their strikes are so weak that Nvidia never takes damage.
 
Impossible, since Navi 33 is on a 128-bit bus, which means 8 GB of VRAM.
That's why I expect nothing out of it.

The RX 7700 XT relies on a 256-bit bus, which means 16 GB. THAT is where the absolute "good budget card" will be, and if it can be sold for $450-500, AMD will make a killing.
If the 7700 XT (or say they make a 7750 or XTX variant) is 16 GB @ 256-bit and in that $450-500 range as you say, I will totally sell my Merc 6750 XT to get it. :respect::pimp::peace::lovetpu:
 
If the 7700 XT (or say they make a 7750 or XTX variant) is 16 GB @ 256-bit and in that $450-500 range as you say, I will totally sell my Merc 6750 XT to get it. :respect::pimp::peace::lovetpu:
Because it would be drawing less power? Performance-wise, I doubt the 7700 XT will be more than 22% faster than the 6700 XT; there is just no room. The 7800 XT is probably 12-15% faster than the 6800 XT, and the vanilla 7800 almost as fast as the 6800 XT. That puts the 7700 XT at 3070 Ti level.
 
Navi 33, used for the 7600 XT, is smaller than Navi 23, used for the 6600 XT: 204 vs. 237 mm². Manufacturing costs will be lower and performance will be a bit higher (10 to 15%) than the older RDNA2 equivalent.
Problem is, the original 6600 MSRP was $330 and the original 6600 XT MSRP was $380, so even if RDNA3 turns slightly cheaper production costs into a slightly cheaper launch price, it will still be a significantly worse value.
 
Because it would be drawing less power? Performance-wise, I doubt the 7700 XT will be more than 22% faster than the 6700 XT; there is just no room. The 7800 XT is probably 12-15% faster than the 6800 XT, and the vanilla 7800 almost as fast as the 6800 XT. That puts the 7700 XT at 3070 Ti level.
If the VRAM is 16 GB @ 256-bit and the specs are pretty good, yes. Right now I have 12 GB @ 192-bit. It's great, but we will see.
 
So who really cares about an RX 7600? ... but hey, what do I know.
There are so many different users. Many of us don't care about anything above the ~75 W to ~150 W TDP range. Personally, I'm waiting for an RX 7600 at that level.
 
Problem is, the original 6600 MSRP was $330 and the original 6600 XT MSRP was $380, so even if RDNA3 turns slightly cheaper production costs into a slightly cheaper launch price, it will still be a significantly worse value.
It doesn't mean a thing in this case; it came out at the height of the mining hell. They certainly didn't expect the price to stay there forever.

The 6600, IMO, is the definition of a $250 card, and it slumped down to the $200 where it belongs.
If the VRAM is 16 GB @ 256-bit and the specs are pretty good, yes. Right now I have 12 GB @ 192-bit. It's great, but we will see.
Also this. If the 6700 XT was my go-to advice for anyone wanting 1440p on a budget, the 7700 XT will be the exact same advice, but with the added bonus of near-certain longevity through the PS5 era. It's a card that won't shine, but will easily stand the test of time until about 2027, and if you can get it for $500 as a 1440p card, I think it'll be a great deal.
 
if you can get it for $500
Wait, wait: the 7800 should be the one at $500. When the 6700 XT came out at $480, that was a semi-ripoff. With the 4070 at $600, I don't see the 7800 XT costing any more than that, with the 7800 at $520 and the 7700 XT at $420.
 
Impossible, since Navi 33 is on a 128-bit bus, which means 8 GB of VRAM.
That's why I expect nothing out of it.

The RX 7700 XT relies on a 256-bit bus, which means 16 GB. THAT is where the absolute "good budget card" will be, and if it can be sold for $450-500, AMD will make a killing.
Current leaked specs say the 7700 XT is on a 192-bit bus with 12 GB of memory, like the 6700 XT. The 7800 XT is on a 256-bit bus. And Navi 32 is only 60 CUs max, too.
 
The 3080 selling more doesn't mean it's better... same for the 4070. People are stupid.
Does that apply to CPUs as well, with AMD having the majority of DIY builders? Or do they become smart then?
 