Wednesday, May 3rd 2023

AMD CEO Dr Lisa Su Confirms Mainstream RDNA3 GPUs in Q2-2023

AMD CEO Dr Lisa Su, in her Q1-2023 financial results call with investors and analysts, confirmed that the company plans to expand the Radeon RX 7000 series with new "mainstream" GPUs based on the RDNA3 graphics architecture this quarter (Q2-2023). This confirms the launch of the Radeon RX 7600 XT later this month, but could also hint at other SKUs the company considers mainstream, such as the RX 7500 XT. AMD has long considered the RX x700 series performance-segment, and the RX 7600 XT launching right after the high-end RX 7900 series hints that the company is still figuring out the economics of its RX 7700 and RX 7800 series.

"In gaming graphics, channel sell-through of our Radeon 6000 and Radeon 7000 series GPUs increased sequentially. We saw strong sales of our high-end Radeon 7900 XTX GPUs in the first quarter, and we're on track to expand our RDNA 3 GPU portfolio with the launch of new mainstream Radeon 7000 series GPUs this quarter," said Dr Lisa Su. With GPU prices in free-fall since the GPU-accelerated crypto-mining crash, AMD is in the process of clearing out its Radeon RX 6000 series inventory as it creates room for the RX 7000 series. Enthusiast-segment SKUs of the yesteryear, such as the RX 6900 series, could be had at prices under $600.
Sources: Seeking Alpha, VideoCardz

29 Comments on AMD CEO Dr Lisa Su Confirms Mainstream RDNA3 GPUs in Q2-2023

#1
ModEl4
I would love to be proved wrong, but it seems the 6700 XT 12 GB should be equal in 1080p raster performance, 4% faster at QHD, and 15% faster at 4K vs the full Navi33 8 GB, or around that range (±5% depending on the actual frequency that Navi33 can hit). I hope AMD has taken notice of the RTX 4070 launch and prices Navi33 accordingly...
#2
Squared
This may be the second launch of a mainstream graphics card to a non-cryptocrazed market in a long time, after Intel Arc. And even today the RX 6600 is a disappointment because of its limited PCIe lanes.
#3
AusWolf
Squared: And even today the RX 6600 is a disappointment because of its limited PCIe lanes.
Really?


On topic: the market is in deep need of true mainstream cards at mainstream prices. Nvidia has been focused on the high end for so long that they've even forgotten what the word mainstream means. I hope AMD can show them.
#4
mechtech
Squared: This may be the second launch of a mainstream graphics card to a non-cryptocrazed market in a long time, after Intel Arc. And even today the RX 6600 is a disappointment because of its limited PCIe lanes.
I think that's the 6500 XT & 6400 with 4 lanes. I believe the 6600 has 16 lanes.

I got an RX 6600 during Black November for $270 CAD ($199 USD)... probably should have gotten two.

www.techpowerup.com/review/msi-radeon-rx-6400-aero-itx/31.html

At 1080p, the 6600 is only ~11% behind the 3060... which was $525 CAD at the time I got the 6600.
#5
AusWolf
mechtech: I think that's the 6500 XT & 6400 with 4 lanes. I believe the 6600 has 16 lanes.
It's got 8 lanes, but it's not as restricted as the 6400 and 6500 XT are by their 4.
#6
Avro Arrow
Squared: This may be the second launch of a mainstream graphics card to a non-cryptocrazed market in a long time, after Intel Arc. And even today the RX 6600 is a disappointment because of its limited PCIe lanes.
Uh, no... that's the RX 6500 XT. The RX 6400 is also only x4, but it's only a glorified video adapter, so nobody cares about that. The RX 6600, 6600 XT and 6650 XT are some of the most beloved RDNA2 cards ever made, with the best value on the market for over a year now. They're only x8, but this has been shown to have no significant effect on their performance.
#7
AusWolf
Avro Arrow: Uh, no... that's the RX 6500 XT. The RX 6400 is also only x4, but it's only a glorified video adapter, so nobody cares about that.
Nearly RX 570 level performance with way less power consumption, no power plug, and availability of half-height variants? I wouldn't write it off so quickly (I had one and it was awesome).
Avro Arrow: The RX 6600, 6600 XT and 6650 XT are some of the most beloved RDNA2 cards ever made, with the best value on the market for over a year now. They're only x8, but this has been shown to have no significant effect on their performance.
That I agree with.
#8
Squared
AusWolf: Really?
I'm almost certainly exaggerating the problem. But this average frame rate across games at 1440p is optimistic; the average includes games that saw no performance drop. Moreover, the worst-case resolution tested was 1080p. I game at 2560x1080, and results like this had me concerned that a few games might have bad enough performance to lower the value of the RX 6600 XT.

The 6600 XT was over $300 until this year; that just seems like a lot of money for something with this handicap, considering I paid a lot less for the RX 480 and, more recently, the 5600 XT, and they both have 16 lanes.
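To illustrate the averaging point with a minimal sketch (the per-game ratios below are made up for illustration, not measured):

```python
# Toy example of how an average can hide a bad worst case: most games lose
# nothing from the narrower link while one loses a lot. Numbers are invented.
x8_vs_x16_scaling = [1.00, 1.00, 0.99, 0.98, 0.85]  # hypothetical per-game FPS ratios

average = sum(x8_vs_x16_scaling) / len(x8_vs_x16_scaling)
worst = min(x8_vs_x16_scaling)
print(f"average: {average:.1%}, worst case: {worst:.1%}")
# -> average: 96.4%, worst case: 85.0%
```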
#9
erocker
Well, a 6600XT is almost 20% faster than a 6600 non-XT at 1440p.
#10
bug
Squared: I'm almost certainly exaggerating the problem. But this average frame rate across games at 1440p is optimistic; the average includes games that saw no performance drop. Moreover, the worst-case resolution tested was 1080p. I game at 2560x1080, and results like this had me concerned that a few games might have bad enough performance to lower the value of the RX 6600 XT.

The 6600 XT was over $300 until this year; that just seems like a lot of money for something with this handicap, considering I paid a lot less for the RX 480 and, more recently, the 5600 XT, and they both have 16 lanes.
Honestly, do you know someone who games on PCIe 4 x8?
I believe 6500 was the neutered one, 6600 is fine.
#11
AusWolf
bug: Honestly, do you know someone who games on PCIe 4 x8?
I believe 6500 was the neutered one, 6600 is fine.
The RX 6600 series has x8 lanes.
#12
KV2DERP
bug: Honestly, do you know someone who games on PCIe 4 x8?
I believe 6500 was the neutered one, 6600 is fine.
Yeah, the 6500 XT was the neutered one. Not only does it lack PCIe bandwidth on Gen 3 boards, but the memory is also constrained by its 4 GB, 64-bit configuration with around 144 GB/s of bandwidth.

To put that into perspective, the RX 580 had 256 GB/s of bandwidth and double the VRAM available to the GPU.
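For what it's worth, both figures drop out of the usual peak-bandwidth formula (effective data rate × bus width / 8); a minimal sketch, assuming the reference-spec data rates for each card:

```python
# Peak memory bandwidth in GB/s = per-pin data rate (Gbps) * bus width (bits) / 8.
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(18, 64))   # RX 6500 XT: 18 Gbps GDDR6 on a 64-bit bus -> 144.0
print(bandwidth_gb_s(8, 256))   # RX 580:      8 Gbps GDDR5 on a 256-bit bus -> 256.0
```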
#13
tugrul_SIMD
Since the 7600 XT has 1/4 the compute power of the RTX 4090, its price should not go beyond 1/4 the price of the RTX 4090.
#14
Avro Arrow
AusWolf: Nearly RX 570 level performance with way less power consumption, no power plug, and availability of half-height variants? I wouldn't write it off so quickly (I had one and it was awesome).
Sorry, I should've been more clear... It's a glorified video adapter with any motherboard that's less than PCI-Express 4.0 because the x4 connection just chokes it.

When I say "glorified video adapter" it just means that it's very weak in gaming and, to be fair, the RX 6400 is pretty weak in gaming by today's standards. The RX 6400 and RX 6500 XT both have a silver lining though... When you upgrade your system, your card will be automatically faster! How many other cards can claim that? :peace:

I actually just bought a Powercolor RX 6500 XT ITX Gaming, not because I think it's a good card but because it cost only $161.39CAD ($118.73USD) brand-new (I honestly don't know how I got it so cheap) from Canada Computers.

It just proves that there's no such thing as a bad video card, just bad prices. I bought it for my mom's HTPC so it really will be a glorified video adapter! In a world where you can't get a decent used RX 580 for less than $200CAD, this was an absolute bargain! :laugh:
#15
AusWolf
Avro Arrow: Sorry, I should've been more clear... It's a glorified video adapter with any motherboard that's less than PCI-Express 4.0 because the x4 connection just chokes it.

When I say "glorified video adapter" it just means that it's very weak in gaming and, to be fair, the RX 6400 is pretty weak in gaming by today's standards. The RX 6400 and RX 6500 XT both have a silver lining though... When you upgrade your system, your card will be automatically faster! How many other cards can claim that? :peace:

I actually just bought a Powercolor RX 6500 XT ITX Gaming, not because I think it's a good card but because it cost only $161.39CAD ($118.73USD) brand-new (I honestly don't know how I got it so cheap) from Canada Computers.

It just proves that there's no such thing as a bad video card, just bad prices. I bought it for my mom's HTPC so it really will be a glorified video adapter! In a world where you can't get a decent used RX 580 for less than $200CAD, this was an absolute bargain! :laugh:
I love the expression "by today's gaming standards" because there's a hundred different meanings to it. :D

If you assume that one only plays the newest AAA games at high settings, then the 6400 / 6500 XT really aren't for you. But if you enjoy slightly older games, or you don't mind decreasing your graphics settings, then they really aren't that bad. Like I said, I had one, and I gamed on it, so I know. Keeping your expectations in check is just as much a thing as buying a more expensive graphics card is.
#16
64K
AusWolf: I love the expression "by today's gaming standards" because there's a hundred different meanings to it. :D
I've watched the words "Gaming" and "Gamer" become very fluid over time. A Gamer used to mean a specific type of video game player. Now it just applies to anyone who plays games. Even people who play Candy Crush type games on their cell are called Gamers, and you have the branching: Casual Gamer, Avid Gamer, Hardcore Gamer.

As far as hardware goes, manufacturers just slap on the word "Gaming" to a piece of hardware and charge more for it.
#17
Avro Arrow
AusWolf: I love the expression "by today's gaming standards" because there's a hundred different meanings to it. :D
It's true but I couldn't think of a better way to say it without taking half a screen of text. :laugh:
AusWolf: If you assume that one only plays the newest AAA games at high settings, then the 6400 / 6500 XT really aren't for you. But if you enjoy slightly older games, or you don't mind decreasing your graphics settings, then they really aren't that bad. Like I said, I had one, and I gamed on it, so I know. Keeping your expectations in check is just as much a thing as buying a more expensive graphics card is.
Absolutely. There are no bad cards, just bad prices. I got that RX 6500 XT for the equivalent of about $117USD ($161CAD) and, while I know that it's relatively terrible compared to pretty much all other new cards (mostly because of that PCI-e4 x4 connector and the lack of a hardware encoder), for my mother's HTPC, it's perfect.

I still can't believe that people on eBay are trying to sell their old RX 580 cards for well over $200CAD. It felt like I had found the holy grail when I saw this listing. Having said that, this card is so cheaply made that it makes me laugh. It has a cheap plastic shroud and a single fan. The way I see it though, is when a card that is a "bottom-of-the-barrel" model like the RX 6500 XT (I still don't know why it has an "XT" suffix) is made as cheaply as possible, it's not a flaw, but a feature. The less you pretty the thing up, the less expensive the card will be. Let's face it, someone looking to buy an RX 6500 XT is looking to get a card for as little as possible without having to resort to the RX 6400. Having said that, this card was less expensive from Canada Computers than ALL of their RX 6400 models. Go figure, eh?

Just look at this thing, it's so cheaply made that it's actually cute:

The pure cheapness that Powercolor put into this card even extends to the box art (meaning the lack thereof).

I admit it, I cracked up when I read "Unleash the Gaming Power" on this box. :D
64K: I've watched the words "Gaming" and "Gamer" become very fluid over time. A Gamer used to mean a specific type of video game player. Now it just applies to anyone who plays games. Even people who play Candy Crush type games on their cell are called Gamers, and you have the branching: Casual Gamer, Avid Gamer, Hardcore Gamer.
I use the terms "PC Gamer" (desktop), "Craptop Gamer" (laptop), "Console Gamer" and "Mobile Gamer" (tablet/phone).
64K: As far as hardware goes, manufacturers just slap on the word "Gaming" to a piece of hardware and charge more for it.
Yup. That's why I avoid motherboards that say anything about gaming. I look for boring/vague motherboard names like "ASRock X570 Pro4".
#18
AusWolf
Avro Arrow: It's true but I couldn't think of a better way to say it without taking half a screen of text. :laugh:

Absolutely. There are no bad cards, just bad prices. I got that RX 6500 XT for the equivalent of about $117USD ($161CAD) and, while I know that it's relatively terrible compared to pretty much all other new cards (mostly because of that PCI-e4 x4 connector and the lack of a hardware encoder), for my mother's HTPC, it's perfect.

I still can't believe that people on eBay are trying to sell their old RX 580 cards for well over $200CAD. It felt like I had found the holy grail when I saw this listing. Having said that, this card is so cheaply made that it makes me laugh. It has a cheap plastic shroud and a single fan. The way I see it though, is when a card that is a "bottom-of-the-barrel" model like the RX 6500 XT (I still don't know why it has an "XT" suffix) is made as cheaply as possible, it's not a flaw, but a feature. The less you pretty the thing up, the less expensive the card will be. Let's face it, someone looking to buy an RX 6500 XT is looking to get a card for as little as possible without having to resort to the RX 6400. Having said that, this card was less expensive from Canada Computers than ALL of their RX 6400 models. Go figure, eh?

Just look at this thing, it's so cheaply made that it's actually cute:

The pure cheapness that Powercolor put into this card even extends to the box art (meaning the lack thereof).

I admit it, I cracked up when I read "Unleash the Gaming Power" on this box. :D
That was a good deal. :)

Honestly, I don't understand why everybody bashes the 6500 XT for lacking a video encoder. I mean, if you stream your gameplay on a low-end budget card at min/med details, you're doing it wrong. :laugh:

I still think it's a pretty decent entry level gaming card. I enjoyed using mine. I agree that its initial price was quite bad, but it has improved enough recently so that it can actually be recommended for some light, low-spec gaming. Mine was the opposite of yours: the Asus Tuf is probably the most overbuilt, over-engineered version out there. I didn't like the proprietary RGB control software, but I loved how quiet it was. Heck, if the Sapphire Pulse 8 GB version was available, I'd probably buy one just as a backup. :D
Avro Arrow: Yup. That's why I avoid motherboards that say anything about gaming. I look for boring/vague motherboard names like "ASRock X570 Pro4".
The problem with those boards is that they're usually just as badly built as your Powercolor 6500 XT. A lot of research has to be done before buying one. Other than that, I agree - I'm not a fan of overstuffed "gamery" designs of our times, either.
#19
Avro Arrow
AusWolf: That was a good deal. :)

Honestly, I don't understand why everybody bashes the 6500 XT for lacking a video encoder. I mean, if you stream your gameplay on a low-end budget card at min/med details, you're doing it wrong. :laugh:
Personally, I don't really care if it has a hardware encoder or not because software encoding usually produces a better result. The problem is that the result isn't a lot better but it sure takes the CPU a lot longer to do. I laugh about it because hardware encoders have been included on video cards for over a decade so a video card without it seems.... incomplete.
AusWolf: I still think it's a pretty decent entry level gaming card. I enjoyed using mine. I agree that its initial price was quite bad, but it has improved enough recently so that it can actually be recommended for some light, low-spec gaming. Mine was the opposite of yours: the Asus Tuf is probably the most overbuilt, over-engineered version out there. I didn't like the proprietary RGB control software, but I loved how quiet it was. Heck, if the Sapphire Pulse 8 GB version was available, I'd probably buy one just as a backup. :D
It seems like it would be decent as long as you had PCI-Express 4.0 on your motherboard. Every review that I've seen shows that its performance suffers significantly from bandwidth loss when used on a motherboard with PCI-e3 or older.
AusWolf: The problem with those boards is that they're usually just as badly built as your Powercolor 6500 XT.
I don't agree. The only "flagship" motherboard that I ever bought is the only one that ever failed on me (and I do mean ever). Imagine buying an MSi K9A2 Platinum with (at the time) AMD's flagship 790FX chipset, a board designed to handle Quad-Crossfire, a board that cost an arm and a leg:

Then imagine that it fails 15 months after purchase despite the fact that you never overclocked it nor did you ever have more than two HD 4870s mounted on it. Then imagine that, because these things only had a 1-year warranty back then, MSi tells you to go pound sand (I haven't bought a single MSi product since). The icing on the cake is that I bought this when I worked at Tiger Direct (I also never sold another MSi product after that so I more than got my revenge on them) and it failed while I was away at university and could only afford the cheapest motherboard that I could find, an ECS/Elitegroup model number IC780M-A2 (or something similar) that cost me about $80. It was the cheapest and most dinky-looking motherboard that I have ever seen to date:

It looks exactly like this but it's an AM2+ board, not an AM3 board. I was able to find the IC780M-A which is AM2+ but the colours are wrong and my motherboard has the i-Cafe logo on it. In any case, this was back in late 2008/early 2009 so about 14-15 years ago. What if I told you that this cheap and dinky-looking motherboard that resembles something that a kid would make with Lego blocks still works perfectly to this day? It was at the core of my mother's HTPC until I upgraded her to the FX-8350 last Christmas so that she didn't have to use W7 any more (since many programs are starting to drop support for W7). I now use it sporadically for an old game that only works with GeForce cards (I use my old 8500 GT for that).

Never make an assumption on the build quality of an item based on its price, how it has been named or how it looks because any board can die young and any board can be more or less immortal.

As for the Powercolor RX 6500 XT Gaming ITX, it's not a bad design, it's just a plain design. It seems that this card was just made to work (and little else). There's something to be said for that because that's all anyone really needs and if it costs less as a result, it becomes more accessible to more people. Honestly, I've never cared if a card had lots of RGB on it because that tends to make cards more expensive. I'd rather have a faster card without RGB than a slower card with RGB.
AusWolf: A lot of research has to be done before buying one. Other than that, I agree - I'm not a fan of overstuffed "gamery" designs of our times, either.
I always research the hell out of any PC part before I buy it, partly because I want the best for my money and partly because I enjoy researching PC parts. I bought the K9A2 Platinum because it had great reviews, but sometimes you just get unlucky. C'est la vie! :rolleyes:
#20
AusWolf
Avro Arrow: Personally, I don't really care if it has a hardware encoder or not because software encoding usually produces a better result. The problem is that the result isn't a lot better but it sure takes the CPU a lot longer to do. I laugh about it because hardware encoders have been included on video cards for over a decade so a video card without it seems.... incomplete.
I don't care about video encoding at all, but each to their own. :)
Avro Arrow: It seems like it would be decent as long as you had PCI-Express 4.0 on your motherboard. Every review that I've seen shows that its performance suffers significantly from bandwidth loss when used on a motherboard with PCI-e3 or older.
That is true.
Avro Arrow: I don't agree. The only "flagship" motherboard that I ever bought is the only one that ever failed on me (and I do mean ever). Imagine buying an MSi K9A2 Platinum with (at the time) AMD's flagship 790FX chipset, a board designed to handle Quad-Crossfire, a board that cost an arm and a leg:

Then imagine that it fails 15 months after purchase despite the fact that you never overclocked it nor did you ever have more than two HD 4870s mounted on it. Then imagine that, because these things only had a 1-year warranty back then, MSi tells you to go pound sand (I haven't bought a single MSi product since). The icing on the cake is that I bought this when I worked at Tiger Direct (I also never sold another MSi product after that so I more than got my revenge on them) and it failed while I was away at university and could only afford the cheapest motherboard that I could find, an ECS/Elitegroup model number IC780M-A2 (or something similar) that cost me about $80. It was the cheapest and most dinky-looking motherboard that I have ever seen to date:

It looks exactly like this but it's an AM2+ board, not an AM3 board. I was able to find the IC780M-A which is AM2+ but the colours are wrong and my motherboard has the i-Cafe logo on it. In any case, this was back in late 2008/early 2009 so about 14-15 years ago. What if I told you that this cheap and dinky-looking motherboard that resembles something that a kid would make with Lego blocks still works perfectly to this day? It was at the core of my mother's HTPC until I upgraded her to the FX-8350 last Christmas so that she didn't have to use W7 any more (since many programs are starting to drop support for W7). I now use it sporadically for an old game that only works with GeForce cards (I use my old 8500 GT for that).

Never make an assumption on the build quality of an item based on its price, how it has been named or how it looks because any board can die young and any board can be more or less immortal.
Those were different times. CPUs didn't consume nearly as much power as they do now, so boards were generally worse-built, even the high-end ones. Look at that "dinky" AM3 board of yours with its seemingly 4-phase power delivery, or the high-end MSi with... is it 4+1 phases? Nearly every entry-level LGA-1700 or AM5 board laughs at them from a distance. There is also a much greater difference between an entry-level and a high-end board these days. It really makes a difference now, but back then, it didn't matter much unless you needed those extra SATA ports or the second PCI-e x16 or something.
#21
Avro Arrow
AusWolf: Those were different times. CPUs didn't consume nearly as much power as they do now, so boards were generally worse-built, even the high-end ones.
I don't know what you're talking about. The TDP of the Phenom II X4 940 was 125W. That's 20W more than my R7-5800X3D and almost double the TDP of the R7-5700X. The FX-8350 was also 125W and don't forget that the K9A2 Platinum was designed to handle four video cards so that's another 300W through the PCI-Express slots (75W x 4). That's far more juice than you're going to find in modern motherboards because multi-GPU uses more juice than any CPU (except maybe the i9-13900K).
AusWolf: Look at that "dinky" AM3 board of yours with its seemingly 4-phase power delivery, or the high-end MSi with... is it 4+1 phases?
That's not the actual board, the actual board was AM2+ (otherwise it looks identical). And when I bought it, I only did so because it was literally all I could afford (I was at uni) and didn't care about the power delivery because I knew that it was made to handle a 125W CPU and I was only going to be running a single HD 4870 since it only had one slot.
AusWolf: Nearly every entry-level LGA-1700 or AM5 board laughs at them from a distance. There is also a much greater difference between an entry-level and a high-end board these days. It really makes a difference now, but back then, it didn't matter much unless you needed those extra SATA ports or the second PCI-e x16 or something.
I don't think that's true. I think that it would be much more challenging to build a board that can handle 4 video cards at once than what we have today. I can say that the 990FX motherboard handled my RX 5700 XT and RX 6800 XT in a makeshift gaming rig for 6 months and my ASRock X370 Killer SLI handled twin R9 Furies in 3DMark without issue.
#22
AusWolf
Avro Arrow: I don't know what you're talking about. The TDP of the Phenom II X4 940 was 125W. That's 20W more than my R7-5800X3D and almost double the TDP of the R7-5700X. The FX-8350 was also 125W and don't forget that the K9A2 Platinum was designed to handle four video cards so that's another 300W through the PCI-Express slots (75W x 4). That's far more juice than you're going to find in modern motherboards because multi-GPU uses more juice than any CPU (except maybe the i9-13900K).
125 W back then meant 125 W. Nowadays, you have to multiply it by 1.35 on AMD to get the actual power consumption figure. The highest TDP rating is that of the 7950X at 170 W; in actual power consumption, that's 170 × 1.35 ≈ 230 W. 12th and 13th gen Intel chips can easily exceed even this. High-end motherboards have to handle this without issues.
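A minimal sketch of that rule of thumb, assuming the 1.35x package-power-to-TDP ratio quoted above holds:

```python
# On recent AMD desktop parts, sustained package power (PPT) is commonly taken
# as TDP * 1.35 -- the multiplier used in the comment above.
def ppt_from_tdp(tdp_watts: float, multiplier: float = 1.35) -> float:
    return tdp_watts * multiplier

print(ppt_from_tdp(170))  # 7950X: 170 W TDP -> 229.5 W, i.e. the ~230 W cited
```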

Not to mention, I had a mid-range Asus board with my FX-8150 because I thought it was gonna be enough. Well, it sort of was, but its VRM ran so hot that its heatsink could burn your fingers if you touched it.
Avro Arrow: That's not the actual board, the actual board was AM2+ (otherwise it looks identical). And when I bought it, I only did so because it was literally all I could afford (I was at uni) and didn't care about the power delivery because I knew that it was made to handle a 125W CPU and I was only going to be running a single HD 4870 since it only had one slot.

I don't think that's true. I think that it would be much more challenging to build a board that can handle 4 video cards at once than what we have today. I can say that the 990FX motherboard handled my RX 5700 XT and RX 6800 XT in a makeshift gaming rig for 6 months and my ASRock X370 Killer SLI handled twin R9 Furies in 3DMark without issue.
GPUs get their slot power through the motherboard's 24-pin ATX cable, at 75 W per GPU. It's not the same as 230+ Watts on a single socket.
#23
kapone32
Avro Arrow: I don't think that's true. I think that it would be much more challenging to build a board that can handle 4 video cards at once than what we have today. I can say that the 990FX motherboard handled my RX 5700 XT and RX 6800 XT in a makeshift gaming rig for 6 months and my ASRock X370 Killer SLI handled twin R9 Furies in 3DMark without issue.
You mentioned the 990FX, and if you had the Sabertooth, it actually has more PCIe lanes than the current crop. It's not just that, but also how the boards are programmed. As an example, my current board has 3 PCIe x16 slots, and though that might seem good, what matters is how the board is actually wired. The first slot is x16, with 4 lanes shared with the 2nd PCIe slot and another 4 with the M.2 slot beside it. The 3rd slot gets 4 lanes from the first chipset. In the days of PLX chips, I am pretty sure the 990FX had 2 x16 and an x8 wired, with a PCI slot thrown in. The thing is, my board is expensive, and that is exactly why. I doubt there is a B650 that is as flexible as the Asus B550-XE, so we are not getting better, and it is in no way the same. The truly uber platforms were X299 and X399 for normal consumers, and TRX40 was for content creators. The best board in terms of flexibility is the MSI Ace X670E, but that board is four figures after tax. The next best is the Carbon, but that is also $700 before tax, so you have to pay for PCIe flexibility on modern platforms. Z790 is no better, as it seems that the first M.2 slot and the first x16 slot share lanes, meaning it is difficult to get 2 x8-wired slots on those boards.
#24
Avro Arrow
AusWolf: 125 W back then meant 125 W. Nowadays, you have to multiply it by 1.35 on AMD to get the actual power consumption figure. The highest TDP rating is that of the 7950X at 170 W; in actual power consumption, that's 170 × 1.35 ≈ 230 W. 12th and 13th gen Intel chips can easily exceed even this. High-end motherboards have to handle this without issues.
I don't know what you're talking about because that has never been true. The Phenom II X4 940 drew almost 220W at max load while the FX-8350 drew over 250W at max load.

Check this out:

AM2+ Era (Techspot):

AM3+ Era (Techspot):


And check out the FX-9590's numbers from AnandTech!

AM4 Era (Techspot):

(Note that the Ryzen 7 5700X consumes 32 fewer watts than the Ryzen 7 5800X so it would be at 174W total system draw.)

AM5 Era:

In the AM5 era, Intel's CPUs just look like hyper-OC versions of their previous gens, while AMD has that stupid "race to 95°C" thing to max out power use, because both of them just want their performance numbers maxed out for review benchmark charts like these. IIRC, the R7-5800X3D uses a bit more power than the R7-7700X in Eco Mode. I get the feeling that Eco Mode is the same as AMD Cool'n'Quiet, a setting that was turned on by default in all AMD CPUs and APUs before Zen4.

Other than that, it doesn't appear that CPU power usage has appreciably gone up over the years. They're (almost) all in the 150-275 W total-system-power range between the AM2+ and AM4 eras, with power consumption in the AM5 era being artificially inflated to produce greater performance numbers. So, no, 125 W did not mean 125 W any more than it does today (unless you're Intel and say that the i9-13900K has a TDP of 125 W). Tech advancement not only increases performance, it also increases efficiency.

The most power-hungry consumer-grade CPU before the i9-13900K was the FX-9590 from the AM3+ era. It didn't perform even close to the R7-5950X but it used a crap-tonne more power.

Hell, even with the insanely-powerful video cards of today, the most power-hungry video card ever made was made nine years ago in 2014 with a TDP of 580W. The suggested PSU for this card was 950W.
Powercolor Radeon R9 290x2 Devil 13 4GB

Things aren't nearly as bad today with regard to power use as it appears. It's just that, with the war in Ukraine and the resultant spike in energy costs across the EU (caused by terrible energy decisions made by clueless politicians), power usage has come under more of a microscope than it ever had before. Couple that with the artificially-inflated power consumption numbers caused by AMD and Intel wanting to occupy the "top spot" on benchmark charts. Let's face it, people are just plain stupid sometimes. They behave like the top-spot CPU or GPU is somehow relevant to them even if they're not buying that specific product. Like, sure, the RTX 4090 is the fastest card in the world but what does that have to do with the noob who bought an RTX 4070 because he assumed that it must be faster than an RX 7900 XT because "It's nVidia, just like the RTX 4090!".

This is the kind of guano-insane mindset that has brought us to where we are now.
AusWolf: Not to mention, I had a mid-range Asus board with my FX-8150 because I thought it was gonna be enough. Well, it sort of was, but its VRM ran so hot that its heatsink could burn your fingers if you touched it.
That's because ASUS is easily the most overrated motherboard brand in history. I never had that problem with my Gigabyte "Ultra-Durable" 990FX motherboard and my FX-8350. Now that I think about it, ASUS is the only mainstream motherboard brand that I have never owned (Although I do own an ASUS Vivobook craptop).
AusWolf: GPUs get their slot power through the motherboard's 24-pin ATX cable, at 75 W per GPU. It's not the same as 230+ Watts on a single socket.
Don't you just hate it when people repeat the exact same thing that you said and try to use it to argue against you? :roll:
Avro Arrow: I don't know what you're talking about. The TDP of the Phenom II X4 940 was 125W. That's 20W more than my R7-5800X3D and almost double the TDP of the R7-5700X. The FX-8350 was also 125W and don't forget that the K9A2 Platinum was designed to handle four video cards so that's another 300W through the PCI-Express slots (75W x 4). That's far more juice than you're going to find in modern motherboards because multi-GPU uses more juice than any CPU (except maybe the i9-13900K).
^^^ From the post that you were responding to. Please note the bold/italic text. ^^^
#25
bug
@Avro Arrow If anything, more complex parts are all but expected to fail sooner. They're pricier so one would assume they undergo more thorough testing. I somehow doubt that, since added functionality/parts increase test scenarios exponentially.

Also, TigerDirect... the latest to bite the dust :cry: