
AMD to Launch FX-9590 Refresh Package

This got me intrigued, because in the reviews I looked at, the i7-3820 seems to consume more power than the old 940.

(attached screenshot: SBE_3820_LGA2011_54.jpg)


It is understandable that the 940 consumes, what, 10-15 W more than the 945? But even if we add that in, it would still leave a far from negligible gap to the i7-3820, so I'm curious where you pulled that 80 W difference from.

Okay, first of all, you can't just pull screenshots from anywhere and expect that you're talking about the right thing. If you read what they were actually doing:
For our overall system load test, we ran Prime 95 In-place large FFTs on all available threads for 15 minutes, while simultaneously loading the GPU with OCCT v3.1.0 GPU:OCCT stress test at 1680x1050@60Hz in full screen mode.

Also are those overclocked power draws? Yeah, I don't think so. Those are stock. So consider for a moment that SB-E idles like a champ even when overclocking.

So okay, assuming they did the same amount of work, they're the same. Oh wait, how much slower is the 945 compared to the 3820 again? Look at the numbers: most of them show the 3820 to be roughly twice as fast as the 945, according to the very review you took that screenshot from. So let's assume you have both CPUs and the 3820 spends half as much time on the same job because it does twice as much work in the same amount of time (more or less, but on average I'd say that's about right).

So let's assume we record a 2-hour window of mixed use where the 3820 spends 0.75 hours at load instead of the 1.5 hours the 945 would need. That's the 3820 running at 166 W for 0.75 hours (124.5 Wh) plus 1.25 hours of idle at 64 W (80 Wh), for a total of 204.5 Wh.

Take the 945, loaded for twice as long (1.5 hours) and idle for only 0.5 hours. That's 173 W for 1.5 hours (259.5 Wh) plus 0.5 hours of idle at 79 W (39.5 Wh), which totals ~300 Wh.

So the actual difference in energy used is right there. If the 945 really draws more, subtracting the 3820's total from the 945's should yield a positive number.

~300 Wh - 204.5 Wh ≈ 95 Wh of difference for the same workload, measured over a window long enough for the slower CPU to finish the job.
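If you want to sanity-check that arithmetic, here's a minimal sketch in Python using the same figures; the 2-hour window and the energy_wh helper are just for illustration.

Code:
# Load/idle draws quoted from the review; the window is chosen so the slower
# chip spends 1.5 h at load and the faster one 0.75 h for the same job.
def energy_wh(load_w, idle_w, load_hours, window_hours):
    """Energy over the window: load phase plus the remaining idle time."""
    return load_w * load_hours + idle_w * (window_hours - load_hours)

window = 2.0  # hours

e_3820 = energy_wh(load_w=166, idle_w=64, load_hours=0.75, window_hours=window)
e_945 = energy_wh(load_w=173, idle_w=79, load_hours=1.50, window_hours=window)

print(f"i7-3820: {e_3820:.1f} Wh")             # 204.5 Wh
print(f"X4 945:  {e_945:.1f} Wh")              # 299.0 Wh
print(f"difference: {e_945 - e_3820:.1f} Wh")  # ~95 Wh in the 3820's favour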

Now, that's the calculated difference from the Canucks numbers. Hilbert Hagedoorn would disagree with the 3820 power consumption figures there. It also doesn't help that they're stressing the GPU at the same time, which has nothing to do with CPU load if you're already maxing the CPU out.

What they leave out is what happens when you overclock the 945: it eats power in a similar manner to what Hilbert's graph shows for the 8150. Not as bad, but it gets up there.
(attached graph: Untitled-1.png)


So not only am I talking about machines that are overclocked; my point about saving power holds true even at stock speeds, and the power gap only becomes more apparent the more you overclock.

I thank you for the challenge, but a little more research might be in order before making such claims.
 
My i5-2500K with a 40% OC at 1.35 V idles at 15.xx W.
I wish I had bothered to check what my Q6600 idled at (but it was always running at 3.7 GHz and never clocked down, the way I had it set up).
The i5 at full OCCT load gets to 111.54 W
(which represents a decent load, a bit more than you get from gaming or CPU-intensive PC tasks).

These measurements are taken from AIDA64 and only measure the CPU power draw, not the system draw.

Maybe we need to compile a list somewhere, a "post your AIDA64 CPU power consumption, idle and load" thread, and after a set amount of time turn it into a nice little graph.
This would help us know just how much you stand to save over a given period at your electricity tariff.

I am certain that I am making a substantial saving with the i5 vs. the Q6600, but only because my tariff is close to 50c per kWh.
However, I can't find power consumption figures for the Q6600 alone, only system power usage, which makes it a bit more difficult to calculate properly.

-=edit=-
Found someone else's results, and they had these numbers:
47.23 W idle
145.6 W load
So it seems there is a considerable difference, and I probably should have had my Q6600 clock down from 3.7 GHz when it wasn't under load
(over 100% difference in idle consumption and about 25% difference under load).
I will call my Q6600's idle usage 75 W as a guess, so you guys are right: even at my tariff, the main bulk of my savings must have come from swapping the TV.
But at idle this i5 saves a considerable amount, ~60 W compared to the Q6600, and 34 W at load, so it does add up over the average upgrade course of 3 years at my tariff of 48c per kWh, and it does mean the CPU will pay for itself with ease.
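To put numbers on the "pays for itself" part, here's a rough sketch using the figures above; the daily duty cycle is my own assumption, so plug in your own hours.

Code:
# Savings of the i5-2500K over the Q6600, using the idle/load deltas above.
TARIFF = 0.48        # $/kWh, the ~48c per kWh tariff mentioned above
IDLE_SAVING_W = 60   # ~75 W (Q6600, guessed) minus ~15 W (i5-2500K)
LOAD_SAVING_W = 34   # 145.6 W minus 111.5 W

idle_hours_per_day = 8   # assumption, adjust to taste
load_hours_per_day = 2   # assumption
years = 3                # the "average upgrade course" above

daily_kwh = (IDLE_SAVING_W * idle_hours_per_day
             + LOAD_SAVING_W * load_hours_per_day) / 1000
saving = daily_kwh * TARIFF * 365 * years
print(f"~${saving:.0f} saved over {years} years")  # ~$288 with these assumptions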
 
Do you have home PCs in your houses, or servers where all the neighbors are uploading tasks, forcing your system to run constantly at 80-100% utilization?

There is a difference between Intel and AMD, but things are not so melodramatic when it comes to power bills. It depends on the person and how much they use the system: whether the system is constantly loaded or idle, and whether it is constantly on, in standby, or off most of the time. We can create scenarios where the PC is overclocked and runs at high load all the time and come up with numbers that greatly favor Intel, or create other scenarios where a person has 5 PCs in his house with usually only one or two of them on and not doing much, where AMD is the better choice once we factor in how much cheaper 5 AMD systems are compared with 5 Intel ones. But these are extreme scenarios that only exist to manufacture an advantage for Intel or AMD and nothing else.
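Just to illustrate how much the usage scenario dominates the answer, here's a throwaway sketch with made-up wattages, hours, and tariff; none of these are measurements.

Code:
# Purely hypothetical numbers: an "efficient" build vs. a "hungry" one,
# compared under a near-constant-load scenario and a mostly-idle one.
def yearly_cost(load_w, idle_w, load_h, on_h, tariff=0.30):
    """Yearly electricity cost in dollars for one machine."""
    kwh_per_day = (load_w * load_h + idle_w * (on_h - load_h)) / 1000
    return kwh_per_day * 365 * tariff

# Crunching 20 h/day, powered on 24 h/day:
busy_gap = yearly_cost(220, 70, 20, 24) - yearly_cost(140, 60, 20, 24)
# Loaded 1 h/day, powered on 4 h/day:
idle_gap = yearly_cost(220, 70, 1, 4) - yearly_cost(140, 60, 1, 4)

print(f"near-constant load: ~${busy_gap:.0f}/year difference")  # ~$180
print(f"mostly idle:        ~${idle_gap:.0f}/year difference")  # ~$12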
 
Well, 5 i7s of the same price and performance as the 9590 would be cheaper than 5 AMD systems (they need less cooling and less expensive motherboards) and would also run using half the power...
but I was comparing Intel to Intel.
 

Really? Well next time we compare power efficiency and performance/watt between AMD and Nvidia cards let's start with Titan Z.
 
Would kind of be pointless. How about a 290X vs. a 770?
 

Yup, I'm just clarifying, because you never mentioned you were doing productivity tasks. My charts may be a bit off because, as I said, I assumed all you did was gaming. I just want to emphasize the difference in total consumption minus the productivity factor. Thanks for clarifying.
 
AMD plans to deliver 25x APU energy efficiency gains, but apparently also plans to deliver 25x CPU energy consumption and heat gains. :roll:

Now compare:
Code:
           i7-4790K    FX-9590
5 GHz      air         water
process    22 nm       32 nm
price      $340        $360

So you are paying the same price for old tech. :rockout:

AMD counts the GPU as part of the performance. I've seen a few articles from "AMD fan" review sites written from that viewpoint, where AMD put up benchmarks to say their APU is, performance-wise, as fast as an i7. But the benchmarks in question were BasemarkCL, 3DMark, and I forget the other one; all three are ones that can use both the CPU and GPU at the same time. Most everyone knows that only a handful of programs at best do that in real-world use.

Edit: found the article. The title:
"AMD goes punch-for-punch with Intel's top-end i7 processors"

Yet they used an i7-4500U for the comparison and claim that is a "top-end i7", when it's only a dual-core i7.

I can support AMD including water coolers with their top CPUs. It differentiates them from Intel. Whether they are great CPUs, that's debatable.

They had to use a water cooler; nothing else could cool the damn thing.

FX chips are severely overvolted. I've tuned core voltage down to just over 1.3 V on my 8350s and 1.225 V on my 8320s without crashes or any other problems (running BOINC 100% of the time they're on). Heck, at that VCORE, my 8320s run on 95 W boards (ASRock 880GM-LE FX) without any problem.

Why do they ship with a 1.4 V VCORE? My guess is that some of the worse chips really do need the 1.4 V and AMD chose to play it safe.

Not like that would suddenly turn them into power-sipping powerhouses, but every bit helps :toast:
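As a back-of-the-envelope estimate of what that undervolt is worth (a rule-of-thumb sketch, not a measurement: dynamic CPU power scales roughly with V² at a fixed clock, and this ignores leakage):

Code:
# Rough rule of thumb: dynamic power ~ f * V^2, so at the same clock an
# undervolt from the stock 1.4 V scales power by (V_new / 1.4)^2.
def dynamic_power_scale(v_new, v_old=1.4):
    return (v_new / v_old) ** 2

for chip, vcore in [("FX-8350 at ~1.3 V", 1.300), ("FX-8320 at 1.225 V", 1.225)]:
    saving = (1 - dynamic_power_scale(vcore)) * 100
    print(f"{chip}: ~{saving:.0f}% less dynamic power than at 1.4 V")
# ~14% and ~23% respectively; every bit helps, as above.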

It's the silicon lottery: some CPUs can hit the same clocks at lower voltage, but a lot of them can't, so the stock voltage has to be set to what works on every chip. The same thing happens with Intel CPUs; my 4770K can do 4.5 GHz at 1.20 V, but a lot of other chips are closer to 1.30-1.35 V or even higher.
 
C'mon AMD... adding the AIO will not "wow" anyone, not even die-hard AMD fans (like me). As others have stated, lower the TDP on the 9000-series FX chips, and then we may bite. But an AIO... thanks but no thanks, I will be sticking with my 8350. o_O
 

That wouldn't be possible without a whole new CPU arch to replace it. The 9590 is nothing more than an overclocked 8350.
 
Yeah, I know... I can still wish this into reality, right? ;)
 