
AMD Ryzen 5 7600

The Raptor stepping can be significantly better. The gaming IPC increase for Raptor in 1% lows was 6%, tested across 6 games with both CPUs locked at 5 GHz, 13900K vs 12900K, RTX 4090 at 1080p. On top of that, the memory controller has the same architecture but supposedly performs better because of some claimed improvements in the Intel 7 lithography.
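As a back-of-the-envelope illustration of why a clock-matched test isolates IPC (a minimal sketch; the naive perf = IPC × frequency model and the normalization to 1.0 are assumptions for illustration, not part of the original test):

```python
# Naive first-order model: gaming performance ~ IPC * frequency.
# Locking the 13900K and 12900K to the same 5.0 GHz makes frequency cancel,
# so the measured 6% delta in 1% lows can be attributed to IPC alone.

def perf(ipc: float, freq_ghz: float) -> float:
    """Relative performance under the perf = IPC * frequency model."""
    return ipc * freq_ghz

alder_ipc = 1.00    # normalize Alder Lake (12900K) to 1.0
raptor_ipc = 1.06   # +6% in 1% lows per the clock-matched test above

# At matched 5.0 GHz clocks, the whole gap is architecture (IPC):
print(perf(raptor_ipc, 5.0) / perf(alder_ipc, 5.0))  # -> 1.06
```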

Is that IPC increase not mainly because of cache? The amount of L2 cache is the same on the 12400 and 13400 (for both P-cores and E-cores), and the L3 is only slightly larger (by just 2 MB).

And official memory support is DDR5-4800, just like Alder Lake.

The 13400 tops out at 4.6 GHz, which will be its biggest problem, especially on B chipsets.
It has exactly the same specs as the 12600K, but with lower clocks.

And it is a dead-end platform. Yes, many people upgrade after many years, so it does not affect them, but the fact is you will be able to easily replace the 7600 with something newer, faster and more efficient.


The 12400 was a fantastic CPU, because it was 1/3 cheaper than a 5600X. And AM4 was already a dead-end platform as well.
In my country the 12400 costs ~$176, the 13400 costs ~$232. Why would anyone choose the newer one?

Look at the 12600K in the picture below (Hardware Unboxed review). The 13400 will be a bit slower than that, because its max boost clock is 300 MHz lower, while the rest of the specs are identical.
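A rough sketch of that estimate, assuming identical silicon and naive linear clock scaling (an upper bound; real games scale less than linearly with frequency):

```python
# Upper bound on the 13400's gaming deficit vs the 12600K, assuming the only
# difference is the 4.6 GHz vs 4.9 GHz max boost clock.
boost_12600k_ghz = 4.9
boost_13400_ghz = 4.6

deficit = 1 - boost_13400_ghz / boost_12600k_ghz
print(f"13400 at most ~{deficit:.1%} behind the 12600K")  # ~6.1%
```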

photo_2023-01-09_20-52-57.jpg
 
I don't think that platform cost should be a negative anymore because right now, an R5-7600X with a B650 mobo only costs $460. That's pretty damn reasonable compared to LGA1700 if you ask me, because on Newegg, the only B660 motherboard they have in stock costs $200, which pushes the platform price to $520. This means the AM5 platform price could now be considered a positive.

In any case, this CPU is less than 3% weaker in gaming than the R5-7600X, and gaming is the most hardware-intensive thing that anyone with a 6-core CPU would do. I do lament the lack of an R5-7600X3D, but this CPU sure looks like a winner to me.
 
I don't think that platform cost should be a negative anymore because right now, an R5-7600X with a B650 mobo only costs $460. That's pretty damn reasonable compared to LGA1700 if you ask me, because on Newegg, the only B660 motherboard they have in stock costs $200, which pushes the platform price to $520. This means the AM5 platform price could now be considered a positive.

In any case, this CPU is less than 3% weaker in gaming than the R5-7600X, and gaming is the most hardware-intensive thing that anyone with a 6-core CPU would do. I do lament the lack of an R5-7600X3D, but this CPU sure looks like a winner to me.
Well, as a DDR5 platform, the real cost comparison needs to be an R5 7600 on B650 vs an i5-13400(F) on a DDR5 B660 board.

I agree that the price comparison between those two now looks evenly matched, but we need reviews to answer the question of whether DDR5 is even worth bothering with at this price point for now.

At the very least, these non-X CPUs offer the best AM5 value yet, and while they're not as fast as Intel, they are vastly more power-efficient, which is an increasingly important metric to judge a CPU by.
 
At $230/£200-ish, the 7600 definitely looks promising. Love the efficiency, but I'm more interested in the 3+ years of platform support.

Just wondering, are we expecting to see AM5 boards drop in price? It's a tough one... for gaming, it's either a 5800X3D on an existing AM4 board (I have a couple available) for cost effectiveness, or switching up to DDR5 AM5 (7600/7700) for the 3-5 years of same-platform upgrades.

I'm not feeling Intel this time around. It sucks having been glued to a single platform for too long... first the 7700K on Z270 and currently the 9700K on Z390. I prefer upgrading more often (every 1.5-2 years), and seeing how AM4 exceeded all expectations, I fancy having a punt on AM5.
 
Wow, I thought I was the only one who had this problem and that it was something I should've known about. I complained about this on Reddit, someone pointed it out, and wham, the AM5 experience improved so much. It's especially annoying because fan speeds are set to 100% (WHY???). ASUS should really get blasted for this, bunch of fools.
Now we wait for the A620 boards to release so the AM5 puzzle is finally complete and I can stop suggesting AM4.
The 7600 is a winner if it's at least $30 cheaper.

I do think that the cooler is e-waste as long as the 5000 series is an option and cheaper. If you think the stock cooler is acceptable on a 7600X, you can probably save a lot more by going AM4...
I have memory context restore enabled on my board and all unnecessary controllers disabled, and the boot time still feels way longer than any other system I've used outside of a server room.
 
Any i5-13400(F) / 13500 reviews coming soon?
 
You can just set any value in PBO Advanced.
The score of the i5-12600 in Cinebench R23 multi is definitely a mistake. It's probably 14000, not 11000.
The 12500 runs at 4.1 GHz (300 MHz less) on all cores and gets a higher multi score (see the attached capture).

Intel is in trouble with their locked, low-clocked Alder Lake rehash. And Meteor Lake for desktop is basically irrelevant.
How do you know? I have only seen comparisons of Zen 4 against the locked 12th-gen Intel parts. There are no reviews of the locked 13th-gen parts yet.
PS: the locked 12th-gen i5s do not have E-cores. The 13400 has 4, and the 13500/13600 have 8. The conclusion of all reviewers is that they bring a substantial multi-core boost and help in other applications (including gaming) by taking over the background tasks.
 

Attachments

  • cine 12500.jpg
The price is not bad and the included cooler is okay; too bad B650 motherboards are still very expensive compared to B450/B550 and B660.
 
The stock cooler (both AMD's and Intel's) is a bad joke. It is only useful as an emergency solution.
 
The stock cooler (both AMD's and Intel's) is a bad joke. It is only useful as an emergency solution.
The Wraith Prism is the best stock cooler currently available and superior to the Intel cooler. It performs only slightly worse than a CM Hyper 212. It does a decent job up to a 100 W load.
Screenshot_20230110_091140_YouTube.jpg
 
I rely on the impressions of those who have used them. Both Intel and AMD ship coolers designed right at the limit, and they are very noisy.
I doubt that cooler is decent if a Noctua NH-U14S barely keeps the CPU at decent temperatures in gaming. There you can also find the temperatures with the stock cooler. Do you think they are decent?
 

Attachments

  • Clipboard01.jpg
I don't think that platform cost should be a negative anymore because right now, an R5-7600X with a B650 mobo only costs $460.

$460 is a reasonable cost for a low-tier CPU and a mobo?!

What, do you drive a Lambo?
 
At the very least, these non-X CPUs offer the best AM5 value yet, and while they're not as fast as Intel, they are vastly more power-efficient, which is an increasingly important metric to judge a CPU by.
Nice try. Besides the superb E-cores (the 13900K draws only 3 W at idle), the reality is... a bit more complicated. Enjoy!
Key words: real-world applications
P.S. A 13500 + DDR5 mobo = $450. If you want to keep your old DDR4 memory, you can make a combo for ~$350 with this processor. The 13400 is even cheaper.
 
How do you know? I have only seen comparisons of Zen 4 against the locked 12th-gen Intel parts. There are no reviews of the locked 13th-gen parts yet.
PS: the locked 12th-gen i5s do not have E-cores. The 13400 has 4, and the 13500/13600 have 8. The conclusion of all reviewers is that they bring a substantial multi-core boost and help in other applications (including gaming) by taking over the background tasks.

Have you read my next post? I do know, because the 13400 has exactly the same specs as the 12600K, but with a 300 MHz lower boost clock.

And let us be honest here, productivity performance on a CPU like this is completely irrelevant. It makes literally no difference whether something takes 10% longer to complete. The end result is identical. But in games, any performance difference might result in hitting or missing your framerate target. And you will notice that immediately.
And as can be seen in reviews, productivity performance is a mixed bag anyway. Some apps do better on Intel (mainly rendering), some on AMD (most apps benefit from single-threaded performance).
Also, continuing to quote the E-core marketing is getting tiresome. There is only one situation where it is valid: maxed-out P-core utilization. But no games ever do that (outside of shader compilation in the menus). And if the P-cores are slower than the competition, E-cores will never make up the difference; games do not work that way.

I praised Alder Lake when it came out (except for the stock efficiency of the K SKUs, which was horrible). But Raptor Lake is a drastically worse product across the board. More expensive, with higher performance only possible by increasing power consumption. And these locked CPUs are not even Raptor Lake, but prices went up anyway.
 
Those E-cores take over all the background tasks, leaving the P-cores to deal exclusively with games. That's how they help in gaming. In the reviews, you will not find the reality of users' computers: reviewers test on clean Windows installs. Most of us have various apps installed (Skype, Discord, Magician, etc.) that these E-cores will take care of when you run a game.
Tests are tests, in their simplicity; the real world is... a little more complicated.
 
I praised Alder Lake when it came out (except for the stock efficiency of the K SKUs, which was horrible). But Raptor Lake is a drastically worse product across the board. More expensive, with higher performance only possible by increasing power consumption. And these locked CPUs are not even Raptor Lake, but prices went up anyway.
But when I look at the TPU review of the 13600K, the gaming power consumption (12-game average) gives these figures (chip power only):

12600K: 56 W
13600K (power limits removed): 73 W
13600K (stock): 74 W
10600K: 76 W
11600K: 93 W

So despite the significant increase in performance over the 12600K, the 13600K does use more power, but less than 20 W more. And both products use less power than the two older Intel CPUs that preceded them. We will have to see how the 13400 and 13500 perform; maybe a bit less than 56 W for the 13400, and a bit more than that for the 13500. There is a 13600 as well, which again might use a touch more. If you overclock the 13600K to the max, power consumption will increase dramatically, but that is a choice, not a requirement.
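To put those figures in relative terms, a quick sketch using the chip-only numbers from the list above (the percentage framing is mine, not TPU's):

```python
# Gaming power draw relative to the 12600K (TPU 12-game average, chip only).
gaming_power_w = {
    "12600K": 56,
    "13600K (no limits)": 73,
    "13600K (stock)": 74,
    "10600K": 76,
    "11600K": 93,
}

base = gaming_power_w["12600K"]
for cpu, watts in gaming_power_w.items():
    print(f"{cpu:20s} {watts:3d} W  ({(watts - base) / base:+.0%} vs 12600K)")
# The stock 13600K lands at +32%: a large relative jump, but under 20 W absolute.
```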

It is correct that the 7600 turns in decent figures, TPU lists the maximum power consumption as 57 W with PBO Max. So better than any of the Intel CPUs apart from the 12600K. But the differences are small.

The main conclusion I come to is that Alder Lake/Raptor Lake and Zen 4 as a group are more power efficient than Comet Lake, Rocket Lake and Zen 3. But I think it would be stretching it a bit to label Raptor Lake as dramatically worse in anything...
 
I have an 11600KF, and it's not its consumption that scares me. By default, the 3070 Ti has a TDP of 290 W; through optimization, my model consumes a maximum of 230 W while offering the same performance. However optimized the new processors are, I don't see the drama in 20-30 W when video cards exceed even 400 W.
What are we talking about? Did I save the consumption of a light bulb while the boiler in the other room sets off the alarms at the power plant?
When I'm next at that computer, I will post a screenshot of its consumption in PCMark, because it logs the CPU consumption for the entire duration.

i5-12600 + iGPU.

----
And... the i5-12500 at 3.5 W (package: CPU + iGPU) in multi-monitor mode, while we are arguing here and watching videos on YouTube. Should I ask the government for help paying the bill?
 

Attachments

  • 12600.jpg
  • 12500 multimonitor.jpg
I rely on the impressions of those who have used them. Both Intel and AMD ship coolers designed right at the limit, and they are very noisy.
I doubt that cooler is decent if a Noctua NH-U14S barely keeps the CPU at decent temperatures in gaming. There you can also find the temperatures with the stock cooler. Do you think they are decent?
I was talking about the Wraith Prism; that is a much better cooler than the Wraith Stealth. Sorry, I thought the 7600 also came with the Prism.
cpu-temperature-gaming.png

78 °C in gaming vs 68 °C with the U14S is not that bad, quite okay for me at least.
 
Should I ask the government for help paying the bill?

If only the government could help me get rid of the heat output.

My undervolted 3080 turns my room into a sauna during the summer, together with the rest of my equipment (TV, receiver). I am never buying a 300+ W card again.

Power consumption is the main thing I will be looking at from now on, for both CPUs and GPUs.
 
Nice try. Besides the superb E-cores (the 13900K draws only 3 W at idle), the reality is... a bit more complicated. Enjoy!
Key words: real-world applications
P.S. A 13500 + DDR5 mobo = $450. If you want to keep your old DDR4 memory, you can make a combo for ~$350 with this processor. The 13400 is even cheaper.
I'm not sure why you linked that; the only relevant graph in that entire article is the 105 W PL1 vs the 7950X at Eco 105W, and this topic doesn't involve the 7950X at all because there is currently no non-X variant of the 7950X.

Additionally, a 105 W PL1/PL2 isn't the same limitation as AMD's Eco 105W setting. Eco 105 means a 142 W PPT, which is why it draws more power than the hard 105 W limit of Intel's PL1/PL2 option in that PCW video.

If you want to compare apples-to-apples between AMD and Intel, you have to set the PPT limit manually to 105W.
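For reference, AMD's desktop socket power limit (PPT) is roughly 1.35× the advertised TDP, which is where the 142 W and 88 W figures come from. A quick sanity check (a minimal sketch; the 1.35 factor is AMD's published TDP-to-PPT ratio for recent Ryzen desktop parts):

```python
# AMD desktop PPT is ~1.35x the advertised TDP, so an "Eco" TDP setting
# implies a noticeably higher actual socket power limit.
def amd_ppt(tdp_w: float) -> int:
    return round(tdp_w * 1.35)

print(amd_ppt(105))  # 142 W -> what "Eco 105W" actually allows
print(amd_ppt(65))   # 88 W  -> why the "65W" AMD run drew ~20 W more than Intel's 65 W PL1
print(amd_ppt(170))  # 230 W -> stock PPT of a 170 W TDP part like the 7950X
```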

Edit:
Oh man, I just watched that video with the sound on rather than just seeking to the graphs for each chapter. My opinion of PCW's testing experience and professionalism wasn't very high, but if you listen to Gordon talking about the eco modes, it really shows they are totally ignorant of how AMD and Intel calculate TDP. He talks about how the Intel system set to 65 W was using 20 W less power than the AMD system set to "65 W". He clearly does not understand that PL1/PL2's equivalent on AMD is the PPT, not the TDP. The graph was 20 W higher because he'd set the AMD system to 88 W and the Intel one to 65 W, and he was completely oblivious to his mistake.

Personally, I don't think someone in his profession should be reviewing hardware without this basic understanding of what he's reviewing. It's not difficult to search the web for AMD's TDP/PPT information, and AMD has stated it officially in multiple locations, from verified accounts on both Twitter and Reddit. There are countless web search results for articles and interviews discussing the difference between TDP and PPT for AMD, and this has been common, essential knowledge for anyone testing or reviewing AMD hardware in the last 6 years. IMO, such ignorance is worse than FUD, because consumers trust independent reviews, at least more than they trust manufacturers' marketing claims. There are 2-month-old comments under that PCW video on YouTube, but PCW haven't bothered issuing a reply, retraction, edit to the video, or anything. It just sits there as misinformation that looks bad for AMD and good for Intel, which is absolutely ludicrous considering that Intel's "TDP" is a disingenuous 125 W for a CPU that is sucking down 253 W by design, and given the opportunity will very easily use over 300 W.
 
Don't completely trust HWiNFO; buy a wattmeter and test from the wall outlet. A well-optimized 13600K is more power-efficient than almost any AMD processor. My 13600K system, overclocked to a 5.4 GHz 2-core boost, draws 62-63 W from the wall during single-threaded Cinebench R23, which is less than any Zen 4 system draws at idle, without any load! And this is without any power limits applied.
 
If only the government could help me get rid of the heat output.

My undervolted 3080 turns my room into a sauna during the summer, together with the rest of my equipment (TV, receiver). I am never buying a 300+ W card again.

Power consumption is the main thing I will be looking at from now on, for both CPUs and GPUs.
3070 Ti: 13 W idle (according to HWiNFO). Under load, it skyrockets.
12500 with iGPU: 25 W idle for the whole system, according to the wattmeter.
Apart from brief spikes when loading pages, the whole system consumes ~30 W while browsing, 35 W on YouTube 1080p, 45 W on YouTube 4K, and at most 70 W in any game the iGPU can run. At the other extreme: 146 W max in Cinebench R23 multi, whole system.
The 10500, 11500 and 12500 are not spectacular. Reviewers barely take them into account and people don't really look for them, but they do an excellent job and you can often find them at attractive prices.
Going by the TPU review with an EVGA GeForce RTX 3080 FTW3 Ultra, I place it 6% below the 7700X, very close to the 7600(X). We are talking about 1080p, because at 1440p and 4K the differences almost disappear.
Not bad for a processor that cost me 210 euros last year, when the 5600X cost 320 euros and the 5800X3D almost 500. Bonus: the iGPU.
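A toy model of why those 1080p gaps shrink at 1440p/4K (illustrative numbers only; the point is that frame rate is capped by whichever of the CPU or GPU is the bottleneck):

```python
# Frame rate is limited by the slower of the CPU and GPU for a given scene.
def fps(cpu_fps: float, gpu_fps: float) -> float:
    return min(cpu_fps, gpu_fps)

cpu_fast, cpu_slow = 200.0, 188.0   # a ~6% CPU-bound gap (illustrative)

# 1080p: the GPU has headroom, so the CPU gap shows up in full.
print(fps(cpu_fast, gpu_fps=400), fps(cpu_slow, gpu_fps=400))  # 200.0 188.0

# 4K: both systems are GPU-bound, and the CPU gap disappears.
print(fps(cpu_fast, gpu_fps=150), fps(cpu_slow, gpu_fps=150))  # 150.0 150.0
```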
 