
AMD Ryzen 5 7600

Joined
Mar 22, 2010
Messages
11 (0.00/day)
The cover photo is classic clickbait, but otherwise it's worth watching.
 
Joined
May 17, 2021
Messages
3,005 (2.82/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200 MHz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
If only the government could help me get rid of the heat output.

My undervolted 3080 turns my room into a sauna during the summer, together with the rest of my equipment (TV, receiver). I am never buying a 300+ W card again.

Power consumption is the main thing I will be looking at from now on, for both CPUs and GPUs.

Put the PC in another room; it's amazing for thermals and noise... and even space.
 
Joined
Jun 6, 2022
Messages
621 (0.91/day)
System Name Common 1/ Common 2/ Gaming
Processor i5-10500/ i5-13500/ i7-14700KF
Motherboard Z490 UD/ B660M DS3H/ Z690 Gaming X
Cooling TDP: 135W/ 200W/ AIO
Memory 16GB/ 16GB/ 32GB
Video Card(s) GTX 1650/ UHD 770/ RTX 3070 Ti
Storage ~12TB inside + 6TB external.
Display(s) 1080p@75Hz/ 1080p@75Hz/ 1080p@165Hz+4K@60Hz
Case Budget/ Mini/ AQIRYS Aquilla White
Audio Device(s) Razer/ Xonar U7 MKII/ Creative Audigy Rx
Power Supply Cougar 450W Bronze/ Corsair 450W Bronze/ Seasonic 650W Gold
Mouse Razer/ A4Tech/ Razer
Keyboard Razer/ Microsoft/ Razer
Software W10/ W11/ W11
Benchmark Scores For my home target: all ok
As promised, here is the 11600KF's power consumption in the "real world". Reviews test the extremes, but the real world is different. You can push the processor over 100 W, but in most cases it stays quiet and draws a reasonable amount, even in games with the 3070 Ti, which is hardly entry-level among video cards.
Bonus: 12500 power consumption in the same PCMark 10 run.
P.S. The results are inconclusive because the video card has a huge impact in this test; you can't compare an RTX card against an iGPU.
 

Attachments

  • 11600KF PCMark10 power consumption.jpg (162.3 KB)
  • 12500 PCMark10 power consumption.jpg (152.7 KB)
Joined
Mar 22, 2010
Messages
11 (0.00/day)
13600K
P-cores: 5400 (2 cores), 5300 (4 cores), 5200 (5 cores), 5100 (all cores), AVX: 5000
E-cores: 4100
Ring: 4500

RAM: 3600 CL16 Gear 1

RAM: 4200 CL19 Gear 1

Express
 
Last edited:
Joined
Feb 18, 2005
Messages
5,238 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
Please stop posting 3DMark graphs of power consumption as if they prove something. The only accurate measure of power consumption is physically at the connectors and the wall.
 
Last edited by a moderator:
Joined
Mar 22, 2010
Messages
11 (0.00/day)
No problem.
I have 36-37 W at idle.
About 62 W in single-threaded Cinebench R23 (about 2,100 points).
A little under 200 W (about 145 W CPU power according to HWiNFO) in multi-threaded Cinebench R23 (about 25,000 points).
Those are physical readings from a wattmeter at the wall, from a system with an optimized and slightly overclocked 13600K.

Let someone with Zen 4 post his readings.
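For anyone who wants to turn wall readings like these into a rough efficiency figure, here is a minimal sketch. The scores and wattages are the ones quoted above; subtracting idle draw to approximate the CPU's share is my own assumption, not how reviews isolate CPU-only power.

```python
# Rough points-per-watt estimate from the wall readings quoted above.
# Subtracting idle draw to approximate the CPU's share is an assumption,
# not the methodology reviews use for CPU-only power.

IDLE_W = 36.5  # midpoint of the reported 36-37 W idle draw

def points_per_watt(score: float, load_wall_w: float) -> float:
    """Cinebench points per watt of wall power above idle."""
    return score / (load_wall_w - IDLE_W)

print(f"1T: {points_per_watt(2100, 62):.0f} pts/W above idle")
print(f"nT: {points_per_watt(25000, 200):.0f} pts/W above idle")
```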
 
Last edited:
Joined
Dec 10, 2022
Messages
464 (0.94/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
Well, as a DDR5 platform, the real cost comparison needs to be an R5 7600 on B650 vs an i5-13400(F) on B660 with DDR5.

I agree that the price comparison between those two now looks evenly matched, but we need reviews to answer the question of whether DDR5 is even worth bothering with at this price point for now.
I couldn't agree more, because I don't think that DDR5 is worth it. I think that AMD made a mistake by offering the AM5 platform with DDR5 support only. Then again, DDR4 and DDR3 were also premature at launch: when DDR3 came out, DDR2 was still perfectly viable, and the same could be said of DDR4 vis-à-vis DDR3. Hell, just think of the people out there still using DDR3 with their FX processors. For example, Steve Burke, the Tech Jesus himself, still uses an FX-8350 in his home PC.
At the very least, these non-X CPUs offer the best AM5 value yet, and while they're not as fast as Intel, they are vastly more power-efficient, which is an increasingly-important metric to judge a CPU by.
You know what's funny? Ten years ago, people were quoting the high power usage of AMD CPUs and Radeon GPUs as reasons they didn't buy them. Now those people have been proven to be full of donkey doo-doo because suddenly such things don't seem to matter to them anymore. :laugh:

$460 is a reasonable cost for a low-tier CPU and a mobo?!

What, do you drive a Lambo?
How much did the Intel DDR5 platform cost again?

What do you game on, Sandy Bridge?
 
Joined
Jun 6, 2022
Messages
621 (0.91/day)
System Name Common 1/ Common 2/ Gaming
Processor i5-10500/ i5-13500/ i7-14700KF
Motherboard Z490 UD/ B660M DS3H/ Z690 Gaming X
Cooling TDP: 135W/ 200W/ AIO
Memory 16GB/ 16GB/ 32GB
Video Card(s) GTX 1650/ UHD 770/ RTX 3070 Ti
Storage ~12TB inside + 6TB external.
Display(s) 1080p@75Hz/ 1080p@75Hz/ 1080p@165Hz+4K@60Hz
Case Budget/ Mini/ AQIRYS Aquilla White
Audio Device(s) Razer/ Xonar U7 MKII/ Creative Audigy Rx
Power Supply Cougar 450W Bronze/ Corsair 450W Bronze/ Seasonic 650W Gold
Mouse Razer/ A4Tech/ Razer
Keyboard Razer/ Microsoft/ Razer
Software W10/ W11/ W11
Benchmark Scores For my home target: all ok
Please stop posting 3DMark graphs of power consumption as if they prove something. The only accurate measure of power consumption is physically at the connectors and the wall.
It's a problem with AMD video cards, because they report much less than their actual consumption. That's how they're designed; you can find the explanation in every video card review. For processors, consumption is accurately reported, via VRM sensors.

You know what's funny? Ten years ago, people were quoting the high power usage of AMD CPUs and Radeon GPUs as reasons they didn't buy them. Now those people have been proven to be full of donkey doo-doo because suddenly such things don't seem to matter to them anymore. :laugh:
They consumed too much and offered little; that was the problem with FX. Now it's funny that many AMD fans bring up the consumption of Intel processors in programs they don't even use, without realizing that the same processor consumes less in many other workloads than an AMD does at idle.
I would like to see HWiNFO captures here of Zen 4 consumption after 2-3 hours of use: minimum, maximum, average. For Zen 3, I have seen a few.
 
Last edited by a moderator:
Joined
Feb 18, 2005
Messages
5,238 (0.75/day)
Location
Ikenai borderline!
System Name Firelance.
Processor Threadripper 3960X
Motherboard ROG Strix TRX40-E Gaming
Cooling IceGem 360 + 6x Arctic Cooling P12
Memory 8x 16GB Patriot Viper DDR4-3200 CL16
Video Card(s) MSI GeForce RTX 4060 Ti Ventus 2X OC
Storage 2TB WD SN850X (boot), 4TB Crucial P3 (data)
Display(s) 3x AOC Q32E2N (32" 2560x1440 75Hz)
Case Enthoo Pro II Server Edition (Closed Panel) + 6 fans
Power Supply Fractal Design Ion+ 2 Platinum 760W
Mouse Logitech G602
Keyboard Logitech G613
Software Windows 10 Professional x64
For processors, consumption is accurately reported, via VRM sensors.
If it was accurately reported, we wouldn't have TPU and GN showing us that it's not.
 
Last edited:
Joined
Feb 20, 2019
Messages
7,270 (3.86/day)
System Name Bragging Rights
Processor Atom Z3735F 1.33GHz
Motherboard It has no markings but it's green
Cooling No, it's a 2.2W processor
Memory 2GB DDR3L-1333
Video Card(s) Gen7 Intel HD (4EU @ 311MHz)
Storage 32GB eMMC and 128GB Sandisk Extreme U3
Display(s) 10" IPS 1280x800 60Hz
Case Veddha T2
Audio Device(s) Apparently, yes
Power Supply Samsung 18W 5V fast-charger
Mouse MX Anywhere 2
Keyboard Logitech MX Keys (not Cherry MX at all)
VR HMD Samsung Oddyssey, not that I'd plug it into this though....
Software W10 21H1, barely
Benchmark Scores I once clocked a Celeron-300A to 564MHz on an Abit BE6 and it scored over 9000.
It's a problem with AMD video cards, because they report much less than their actual consumption. That's how they're designed; you can find the explanation in every video card review. For processors, consumption is accurately reported, via VRM sensors.
It's only a problem if you don't read the labels of the values you're given.
  • Radeons report the GPU Chip Power Draw
  • GeForces report both the GPU Chip Power Draw and the Board Power Draw.
The difference is that Nvidia boards typically have shunt resistors to measure the exact power draw from each of the power connectors and the PCIe slot. Just like on Radeons, if you only look at the GPU Chip Power Draw on a GeForce, you'll see it reporting about 80-85% of the Board Power Draw.

Would it be nice to have shunt resistors on Radeons to measure Board Power Draw? Yes.
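As a rough illustration of the gap described above, here is a minimal sketch that back-calculates board power from a reported chip power figure. The 0.80-0.85 chip-to-board ratio comes from the post itself and varies by card and load, so treat it as an assumption, not a constant; the reading used is made up.

```python
# Rough conversion from a reported "GPU Chip Power Draw" to estimated board power.
# The 0.80-0.85 chip-to-board ratio is the figure mentioned above; it is an
# assumption that varies by card model and load, not a measured constant.

def estimate_board_power(chip_power_w: float, chip_share: float = 0.825) -> float:
    """Estimate total board power from the chip-only sensor reading."""
    return chip_power_w / chip_share

if __name__ == "__main__":
    reported = 255.0  # hypothetical chip power reading, in watts
    print(f"Chip power reported: {reported:.0f} W")
    print(f"Estimated board power: {estimate_board_power(reported):.0f} W")
```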
 
Joined
May 17, 2021
Messages
3,005 (2.82/day)
Processor Ryzen 5 5700x
Motherboard B550 Elite
Cooling Thermalright Peerless Assassin 120 SE
Memory 32GB Fury Beast DDR4 3200 MHz
Video Card(s) Gigabyte 3060 ti gaming oc pro
Storage Samsung 970 Evo 1TB, WD SN850x 1TB, plus some random HDDs
Display(s) LG 27gp850 1440p 165Hz 27''
Case Lian Li Lancool II performance
Power Supply MSI 750w
Mouse G502
Joined
Jun 6, 2022
Messages
621 (0.91/day)
System Name Common 1/ Common 2/ Gaming
Processor i5-10500/ i5-13500/ i7-14700KF
Motherboard Z490 UD/ B660M DS3H/ Z690 Gaming X
Cooling TDP: 135W/ 200W/ AIO
Memory 16GB/ 16GB/ 32GB
Video Card(s) GTX 1650/ UHD 770/ RTX 3070 Ti
Storage ~12TB inside + 6TB external.
Display(s) 1080p@75Hz/ 1080p@75Hz/ 1080p@165Hz+4K@60Hz
Case Budget/ Mini/ AQIRYS Aquilla White
Audio Device(s) Razer/ Xonar U7 MKII/ Creative Audigy Rx
Power Supply Cougar 450W Bronze/ Corsair 450W Bronze/ Seasonic 650W Gold
Mouse Razer/ A4Tech/ Razer
Keyboard Razer/ Microsoft/ Razer
Software W10/ W11/ W11
Benchmark Scores For my home target: all ok
If it was accurately reported, we wouldn't have TPU and GN showing us that it's not.
OK, so I'm the one who's wrong. The 25 W reported by the wattmeter for the entire system at idle is apparently the consumption of the processor alone. And it's only correct for AMD, at 20 W, processor only, at idle and 50 degrees, which would mean the AMD processor generates power rather than consuming it.

----
The average is 6.1 hours/day since the system was installed (upgraded to W11, to be exact). It was certainly higher in January 2023 because I have been on leave since December 23. In 11 days it consumed 5.5 kWh, about 0.5 kWh per day for a minimum of 6.1 h/day of use. The whole system. Bankruptcy, what else!
The catch is that this machine has my WoT install and some older games, so I do play on it, but the system is intended mainly for office work, web browsing, movies and other activities that eat up my time.
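As a sanity check of those figures, here is the arithmetic spelled out; the inputs are the numbers from the post above, and everything else is plain unit conversion.

```python
# Back-of-the-envelope check of the wall-meter figures above.
# Inputs are the numbers from the post; the rest is unit conversion.

energy_kwh = 5.5      # total energy over the period, from the wattmeter
days = 11             # length of the period
hours_per_day = 6.1   # reported average daily use

per_day_kwh = energy_kwh / days                    # ~0.5 kWh/day
avg_power_w = per_day_kwh * 1000 / hours_per_day   # ~82 W average while in use

print(f"Energy per day: {per_day_kwh:.2f} kWh")
print(f"Average whole-system draw while in use: {avg_power_w:.0f} W")
```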
 

Attachments

  • 11 ian 2023.jpg (4.4 MB)
Last edited:
Joined
Dec 28, 2012
Messages
3,475 (0.84/day)
System Name Skunkworks
Processor 5800x3d
Motherboard x570 unify
Cooling Noctua NH-U12A
Memory 32GB 3600 mhz
Video Card(s) asrock 6800xt challenger D
Storage Sabrent Rocket 4.0 2TB, MX500 2TB
Display(s) Asus 1440p144 27"
Case Old arse cooler master 932
Power Supply Corsair 1200w platinum
Mouse *squeak*
Keyboard Some old office thing
Software openSUSE tumbleweed/Mint 21.2
This pricing makes ZERO sense... lowest rung non-X CPU (7600) is a whopping $70 cheaper than its X counterpart, yet the next one up (7700) is a mere $15 cheaper? Surely it should be the other way around?

It doesn't matter though, people are still not going to pay a premium for a 6-core 7600 + motherboard + DDR5 when you can get an 8-core 5800X3D + motherboard + DDR4 for less and have better gaming performance. I don't understand which genius at AMD thought that people would be happy to pay more for CPU performance, after years of the same company charging less. A smart company would be selling its newer CPUs for less than its old ones to encourage consumers to upgrade... maybe AMD is just hoping that AM4 stock will run out and consumers will be forced to buy AM5... or, maybe those consumers will just buy Intel...

Everything about AM5 has felt like the Bulldozer-era AMD TBH - clueless about the actual value proposition of their product and expecting people to buy it solely due to brand loyalty.
It's AMD's old cycle. They were doing alright with CPUs before Bulldozer, and in GPUs they were great, reaching 49% market share with Evergreen... then they just rebranded everything into the 6000 series, shoved it at us, and got caught with their pants down when Fermi 2.0 came out.

Or before that, when the Athlon 64 was kicking Intel's arse and AMD went and bought ATI while pushing back their new architecture and charging $1,000 for the FX-62, only for Conroe to show up and obliterate them from orbit.
The Wraith Prism is the best stock cooler currently available and superior to the Intel cooler. It performs slightly worse than a CM Hyper 212. It does a decent job up to a 100 W load.
That's A) on a Ryzen 5 3600, not a 7600, and B) the new Wraith Stealth cooler (which is what comes with the 7600) is hot garbage.
 
Joined
Aug 9, 2019
Messages
1,520 (0.89/day)
Processor Ryzen 5600X@4.85 CO
Motherboard Gigabyte B550m S2H
Cooling BeQuiet Dark Rock Slim
Memory Patriot Viper 4400cl19 2x8@4000cl16 tight subs
Video Card(s) Asus 3060ti TUF OC
Storage WD blue 1TB nvme
Display(s) Lenovo G24-10 144Hz
Case Corsair D4000 Airflow
Power Supply EVGA GQ 650W
Software Windows 10 home 64
Benchmark Scores CB20 4710@4.7GHz Aida64 50.4ns 4.8GHz+4000cl15 tuned ram SOTTR 1080p low 263fps avg CPU game
It's AMD's old cycle. They were doing alright with CPUs before Bulldozer, and in GPUs they were great, reaching 49% market share with Evergreen... then they just rebranded everything into the 6000 series, shoved it at us, and got caught with their pants down when Fermi 2.0 came out.

Or before that, when the Athlon 64 was kicking Intel's arse and AMD went and bought ATI while pushing back their new architecture and charging $1,000 for the FX-62, only for Conroe to show up and obliterate them from orbit.

That's A) on a Ryzen 5 3600, not a 7600, and B) the new Wraith Stealth cooler (which is what comes with the 7600) is hot garbage.
As I replied later, the Prism works fine on the 7700, and I thought the 7600 also came with the Prism. The Stealth is crap.
 
Joined
Dec 10, 2022
Messages
464 (0.94/day)
System Name The Phantom in the Black Tower
Processor AMD Ryzen 7 5800X3D
Motherboard ASRock X570 Pro4 AM4
Cooling AMD Wraith Prism, 5 x Cooler Master Sickleflow 120mm
Memory 64GB Team Vulcan DDR4-3600 CL18 (4×16GB)
Video Card(s) ASRock Radeon RX 7900 XTX Phantom Gaming OC 24GB
Storage WDS500G3X0E (OS), WDS100T2B0C, TM8FP6002T0C101 (x2) and ~40TB of total HDD space
Display(s) Haier 55E5500U 55" 2160p60Hz
Case Ultra U12-40670 Super Tower
Audio Device(s) Logitech Z200
Power Supply EVGA 1000 G2 Supernova 1kW 80+Gold-Certified
Mouse Logitech MK320
Keyboard Logitech MK320
VR HMD None
Software Windows 10 Professional
Benchmark Scores Fire Strike Ultra: 19484 Time Spy Extreme: 11006 Port Royal: 16545 SuperPosition 4K Optimised: 23439
so it's fine because Intel. Got it.
The AM5 platform, including CPU and motherboard, is now about the same price as Intel LGA 1700 with DDR5. If both platforms cost about the same, NEITHER of them can really be singled out as the bad-value option, and it has nothing to do with Intel or AMD. I didn't think it was that complicated a concept, but I guess I was mistaken, because you clearly proved me wrong in that regard.
 
Last edited by a moderator:
Joined
Jun 6, 2022
Messages
621 (0.91/day)
System Name Common 1/ Common 2/ Gaming
Processor i5-10500/ i5-13500/ i7-14700KF
Motherboard Z490 UD/ B660M DS3H/ Z690 Gaming X
Cooling TDP: 135W/ 200W/ AIO
Memory 16GB/ 16GB/ 32GB
Video Card(s) GTX 1650/ UHD 770/ RTX 3070 Ti
Storage ~12TB inside + 6TB external.
Display(s) 1080p@75Hz/ 1080p@75Hz/ 1080p@165Hz+4K@60Hz
Case Budget/ Mini/ AQIRYS Aquilla White
Audio Device(s) Razer/ Xonar U7 MKII/ Creative Audigy Rx
Power Supply Cougar 450W Bronze/ Corsair 450W Bronze/ Seasonic 650W Gold
Mouse Razer/ A4Tech/ Razer
Keyboard Razer/ Microsoft/ Razer
Software W10/ W11/ W11
Benchmark Scores For my home target: all ok
Off topic and on topic at the same time (the impact of the processor in gaming).
I replaced the 11600KF with a 10500. The reason: two processors, one motherboard, and the 10500 is practically impossible to sell. I got rid of the 11600KF immediately, at a reasonable price.
I did some tests beforehand so that I wouldn't regret it later. To my surprise, the 10500 does very well next to the 3070 Ti. I even have a 3% BCLK reserve (120 MHz more on all cores) and the Gaming profile in the BIOS (another 100 MHz on all cores), but I'll try those when I have time; I have no serious reason to now. The eye can't tell the difference.
In synthetic tests, the differences are huge. They exist in gaming too (see the Forza screenshots), but the eye doesn't see them. With the downgrade to PCIe 3.0 for the video card and NVMe, you'd expect a catastrophe, but the reality is completely different.
So, if we invoke gaming, what is the processor's real impact in gaming? Reviews use powerful, top-of-the-line video cards. Most users, hmm... I think entry-to-mid-range video cards definitely dominate.
So what are we really talking about? How to spend money and how to encourage them to sell to us at high prices?
 

Attachments

  • 10500 nfs unbound.jpg (312.6 KB)
  • forza 5 10500 + 3070 Ti.jpg (355.7 KB)
  • Forza Horizon 5 11600KF_3070Ti.jpg (209.4 KB)
  • Forza Horizon 5 10500_3070Ti.png (839.9 KB)
Joined
Jan 17, 2018
Messages
372 (0.16/day)
Processor Ryzen 7 5800X3D
Motherboard MSI B550 Tomahawk
Cooling Noctua U12S
Memory 32GB @ 3600 CL18
Video Card(s) AMD 6800XT
Storage WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB)
Display(s) Samsung Odyssey G9
Case Be Quiet! Silent Base 802
Power Supply Seasonic PRIME-GX-1000
Off topic and on topic at the same time (the impact of the processor in gaming).
I replaced the 11600KF with a 10500. The reason: two processors, one motherboard, and the 10500 is practically impossible to sell. I got rid of the 11600KF immediately, at a reasonable price.
I did some tests beforehand so that I wouldn't regret it later. To my surprise, the 10500 does very well next to the 3070 Ti. I even have a 3% BCLK reserve (120 MHz more on all cores) and the Gaming profile in the BIOS (another 100 MHz on all cores), but I'll try those when I have time; I have no serious reason to now. The eye can't tell the difference.
In synthetic tests, the differences are huge. They exist in gaming too (see the Forza screenshots), but the eye doesn't see them. With the downgrade to PCIe 3.0 for the video card and NVMe, you'd expect a catastrophe, but the reality is completely different.
So, if we invoke gaming, what is the processor's real impact in gaming? Reviews use powerful, top-of-the-line video cards. Most users, hmm... I think entry-to-mid-range video cards definitely dominate.
So what are we really talking about? How to spend money and how to encourage them to sell to us at high prices?
Are you just kind of rambling to yourself on the thread? Oh well, I'll engage this time.

You're comparing two generations that were barely different from one another. The 10th and 11th Generation Intel parts were something like 0-5% apart; there was basically no meaningful difference between the two. On top of that, you're comparing Forza 5, a game that barely shows a meaningful FPS difference at 1080p in this very review: 20 of the CPUs in Forza 5 are within 2 FPS of one another.

There are some games where there is a huge, meaningful improvement from CPU upgrades. Typically they are not graphically intensive games but simulation- and calculation-heavy ones such as Factorio, Paradox Interactive grand strategy games (Crusader Kings, Stellaris, etc.), X4: Foundations, and so on. However, Far Cry 6, Borderlands 3 and Age of Empires 4 all show fairly meaningful FPS gains from a 10600K to a 13600K.
 
Joined
Jun 6, 2022
Messages
621 (0.91/day)
System Name Common 1/ Common 2/ Gaming
Processor i5-10500/ i5-13500/ i7-14700KF
Motherboard Z490 UD/ B660M DS3H/ Z690 Gaming X
Cooling TDP: 135W/ 200W/ AIO
Memory 16GB/ 16GB/ 32GB
Video Card(s) GTX 1650/ UHD 770/ RTX 3070 Ti
Storage ~12TB inside + 6TB external.
Display(s) 1080p@75Hz/ 1080p@75Hz/ 1080p@165Hz+4K@60Hz
Case Budget/ Mini/ AQIRYS Aquilla White
Audio Device(s) Razer/ Xonar U7 MKII/ Creative Audigy Rx
Power Supply Cougar 450W Bronze/ Corsair 450W Bronze/ Seasonic 650W Gold
Mouse Razer/ A4Tech/ Razer
Keyboard Razer/ Microsoft/ Razer
Software W10/ W11/ W11
Benchmark Scores For my home target: all ok
You are seriously mistaken. The 10th series was the last of the Skylake line, while the 11th series was a stopgap created by backporting a 10 nm design to 14 nm lithography. The differences between them are very big, both single- and multi-threaded. According to the benchmarks, the 11600K beats the 10500 by more than 30%. In reviews. In real life, you don't die with a 10500 in the system.
On topic:
Yes, I know, it's all wrong. It's not just my wattmeter that generates errors, but everything that shows Intel as consumption-friendly. Like in this video.
 

Attachments

  • what_lightroom.jpg (700.4 KB)
  • what_photoshop.jpg (701.6 KB)
  • what_premiere.jpg (700.5 KB)
Last edited:
Joined
Jan 17, 2018
Messages
372 (0.16/day)
Processor Ryzen 7 5800X3D
Motherboard MSI B550 Tomahawk
Cooling Noctua U12S
Memory 32GB @ 3600 CL18
Video Card(s) AMD 6800XT
Storage WD Black SN850(1TB), WD Black NVMe 2018(500GB), WD Blue SATA(2TB)
Display(s) Samsung Odyssey G9
Case Be Quiet! Silent Base 802
Power Supply Seasonic PRIME-GX-1000
You are seriously mistaken. The 10th series was the last of the Skylake line, while the 11th series was a stopgap created by backporting a 10 nm design to 14 nm lithography. The differences between them are very big, both single- and multi-threaded. According to the benchmarks, the 11600K beats the 10500 by more than 30%. In reviews. In real life, you don't die with a 10500 in the system.
On topic:
Yes, I know, it's all wrong. It's not just my wattmeter that generates errors, but everything that shows Intel as consumption-friendly. Like in this video.
You're all over the place. Your last post was about games, not productivity. In games, the 11600k & 10600k were separated by a mere 0.8% at 720p, and it actually was the 10600k that was that little bit faster than the 11600k. I'm just sticking with your own talking points.

We get it, you like Intel... a lot apparently. Until we see a 13400, 13500, 13600 or 13700 non-K, the 7600 non-X, the subject of this thread, is slightly more power efficient than Intel 13th gen and on par with the 12600k.
 
Joined
Jun 6, 2022
Messages
621 (0.91/day)
System Name Common 1/ Common 2/ Gaming
Processor i5-10500/ i5-13500/ i7-14700KF
Motherboard Z490 UD/ B660M DS3H/ Z690 Gaming X
Cooling TDP: 135W/ 200W/ AIO
Memory 16GB/ 16GB/ 32GB
Video Card(s) GTX 1650/ UHD 770/ RTX 3070 Ti
Storage ~12TB inside + 6TB external.
Display(s) 1080p@75Hz/ 1080p@75Hz/ 1080p@165Hz+4K@60Hz
Case Budget/ Mini/ AQIRYS Aquilla White
Audio Device(s) Razer/ Xonar U7 MKII/ Creative Audigy Rx
Power Supply Cougar 450W Bronze/ Corsair 450W Bronze/ Seasonic 650W Gold
Mouse Razer/ A4Tech/ Razer
Keyboard Razer/ Microsoft/ Razer
Software W10/ W11/ W11
Benchmark Scores For my home target: all ok
I will like AMD too when I see 1 W at idle from a hexacore, not 20+ W. I don't think the average consumption for web browsing, movies, office work, multimedia and other light activities exceeds what an AMD draws at idle.
Regarding the gaming system, I'm not worried, because the video card eats 80% of the combined GPU+CPU power. There, it's the price that matters.

At least one hour of WoT on the iGPU. According to the wattmeter, ~60 W average in this game, peaking towards 70 W. The whole system!
You can also capture 3-4 hour sessions with a Ryzen; let's see the consumption in real life.
 

Attachments

  • over 3h.jpg (691.8 KB)
Last edited:

rabidhole

New Member
Joined
Jan 29, 2023
Messages
1 (0.00/day)
There's something I'd really like to know: were the emulation benchmarks done with AVX-512-enabled/capable 12th-gen chips?
 
Joined
Aug 10, 2021
Messages
166 (0.17/day)
System Name Main
Processor 5900X
Motherboard Asrock 570X Taichi
Memory 32GB
Video Card(s) 6800XT
Display(s) Odyssey C49G95T - 5120 x 1440
At least one hour of WoT on the iGPU. According to the wattmeter, ~60 W average in this game, peaking towards 70 W. The whole system!
You can also capture 3-4 hour sessions with a Ryzen; let's see the consumption in real life.
What do you count as "consumption in real life"?
W1zzard already has 3 different power consumption test cases that actually measure CPU power rather than relying on the software sensors (a combination of those is fairly close to my real usage):
All power measurements on this page are based on a physical measurement of the voltage, current and power flowing through the 12-pin CPU power connector(s), which makes them "CPU only," not "full system." We're not using the software sensors inside the processor, as these can be quite inaccurate and will vary between manufacturers.
The 7600 is also at 18 W in the single-thread test case, which, last time I checked, is under 20+ W.
Strangely enough(?), the lowest-consumption single-thread CPU is the 5700G...
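For reference, the physical approach quoted above boils down to sampling voltage and current at the CPU power connector and multiplying. A minimal sketch, with invented sample values:

```python
# Minimal sketch of the physical measurement described in the quoted review text:
# power is computed from voltage and current sampled at the CPU power connector(s),
# independent of any software sensor. The sample values below are invented.

samples = [
    (12.05, 5.2),  # (voltage in V, current in A) at the EPS connector
    (12.03, 5.6),
    (12.04, 5.4),
]

powers_w = [v * i for v, i in samples]       # instantaneous power, P = V * I
avg_power_w = sum(powers_w) / len(powers_w)

print(f"Average CPU-only power at the connector: {avg_power_w:.1f} W")
```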
 
Joined
Dec 12, 2012
Messages
717 (0.17/day)
Location
Poland
System Name THU
Processor Intel Core i5-13600KF
Motherboard ASUS PRIME Z790-P D4
Cooling SilentiumPC Fortis 3 v2 + Arctic Cooling MX-2
Memory Crucial Ballistix 2x16 GB DDR4-3600 CL16 (dual rank)
Video Card(s) MSI GeForce RTX 4070 Ventus 3X OC 12 GB GDDR6X (2610/21000 @ 0.91 V)
Storage Lexar NM790 2 TB + Corsair MP510 960 GB + PNY XLR8 CS3030 500 GB + Toshiba E300 3 TB
Display(s) LG OLED C8 55" + ASUS VP229Q
Case Fractal Design Define R6
Audio Device(s) Yamaha RX-V381 + Monitor Audio Bronze 6 + Bronze FX | FiiO E10K-TC + Sony MDR-7506
Power Supply Corsair RM650
Mouse Logitech M705 Marathon
Keyboard Corsair K55 RGB PRO
Software Windows 10 Home
Benchmark Scores Benchmarks in 2024?
What do you count as "consumption in real life"?
W1zzard already has 3 different power consumption test cases that actually measure CPU power rather than relying on the software sensors (a combination of those is fairly close to my real usage)

The 7600 is also at 18 W in the single-thread test case, which, last time I checked, is under 20+ W.
Strangely enough(?), the lowest-consumption single-thread CPU is the 5700G...

Yeah, so many people say that Ryzens consume ~50 W in idle. If that's so, how are there so many apps that consume less than 20 W in TPU testing?

Someone has to be right. Do some reviewers/users have power saving options disabled? Like C-states or whatever they have on AMD.
 