Monday, June 23rd 2014
AMD to Launch FX-9590 Refresh Package
AMD is preparing a new retail package of its feisty FX-9590 eight-core processor, in a bid to woo crowds away from Intel's Core i7-4790K "Devil's Canyon" processor. The package combines an FX-9590, which until now was sold chip-only (without a cooling solution), with an Asetek-made liquid CPU cooler, for US $359. Given that the FX-9590 costs $319 without the cooler, the extra $40 for a liquid cooler adds great value. Based on the 32 nm "Vishera" silicon, the FX-9590 features eight CPU cores based on the "Piledriver" micro-architecture, clocked at 4.70 GHz with Turbo Core speeds of up to 5.00 GHz; a dual-channel DDR3 integrated memory controller that natively supports DDR3-1866; 8 MB of total L2 cache; 8 MB of L3 cache; and modern instruction sets such as AVX, AES, and FMA3.
Source:
HardwareCanucks
60 Comments on AMD to Launch FX-9590 Refresh Package
Now compare:
So you are paying the same price for old tech. :rockout:
I'm not yelling... not sure why my font got all wacky... probably that user error thing I keep hearing about... ;)
Let me put it a little differently. The 9590 (this one, or the original, it doesn't matter) is NOT a refresh of the 8350.
Let me go one step further: if the 8320 were the top FX processor and one year later AMD showed the 8350 with better thermal paste, no one but AMD's marketing department would be calling it a refresh.
If you run CPU-Z on a 4770K and a 4790K you will get identical info, with the only differences being in watts and multiplier. I don't call that a refresh. I call it an overclock.
TechPowerUp misquoted HardwareCanucks.com.
Hardware Canucks said "It's interesting and "newish" but not a brand new processor or even a refreshed architecture."
That said, TPU did say "refreshed packaging". Maybe to avoid confusion they should have said "revised packaging" or "updated packaging".
An AMD employee calling something new when it's not is either intentionally misleading or clueless about his own products...
Also keep in mind if AMD goes under and Intel does become a monopoly in the market, antitrust laws may force them to make changes that would enable other companies to better compete, so I don't buy that argument. It's like saying banks that are doing badly shouldn't be able to fail to make room for ones that won't. It doesn't make any sense. I should reiterate that Intel is based out of the United States and is subject to our antitrust laws and in the past they've worked in AMD's favor.
OK, you may save $30 per five-year upgrade cycle on electricity by going Intel, which is negated by the fact that the Intel rig was likely more than $30 extra to purchase to begin with. I disagree; as a consumer I want the fastest for my budget, whether Intel or AMD.
If Intel and AMD have similarly performing CPUs within my budget I may opt for the smaller company, but I'm not obliged to.
That said, most CPUs from Intel or AMD, whether mainstream, midrange or high end, are enough to fulfil and exceed the average user's expectations, so buying the fastest CPU for the sake of it is becoming illogical. It's only us enthusiasts on forums, and users with specific specialist needs, who worry about the latest i7 and FX ranges.
OK, for ME: I would get an 8350 rather than this. The money saved could buy either a custom loop, or a good AIO and a decent board.
Either way you could get that 8350 up to these specs, and possibly further, for the same money.
What benefit you would get running at 5 GHz(+) compared to 4.7 or so is pretty debatable though.
So really it's not the AIO that would annoy me, nor the fact that this is the same CPU re-released. What would annoy me is that they ever made the thing to start with.
As for the "give AMD your money so they can make better stuff" argument, quite a few people here do line AMD's pockets, although not via enthusiast CPU purchases. APUs, laptops, consoles, and A LOT of graphics cards all fund AMD nicely.
It really isn't our fault that they took that money and said:
"Let's make a 5 GHz CPU that runs cool, has 8 cores, and we will call it Bulldozer. Well, let's not do this by hand, let's cut this corner... I'm sure it will be fine if we do this. No, we don't need to just keep working on the Phenom."
And in the end they ended up with this...
Bulldozer did more damage to AMD than anything Intel could have done, and this is just a swan song for enthusiast CPUs.
Someone said AM3+ is dead. Well, I think and HOPE it is.
AMD should now concentrate on GPUs and APUs, rake in some money from the console monopoly, and then in 3-4 years' time come back and do it properly.
P.S.
On average we are spending £6 a week less on electricity now with the i5-2500K vs the Q6600.
This is entirely down to the cost of electricity where I live and with my provider, but you can't just say you won't notice a difference unless you're running 100+ PCs. Not everywhere has regulated prices with a maximum cap. I wish we had the same tariff as Ireland does, but we don't; we just have them jacking up the price every year.
P.P.S.
That's a measly $2649 saved in 5 years after conversion, provided the tariffs don't go up, which they will.
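As a sanity check, the five-year figure can be sketched in a couple of lines (the $1.70 per £1 rate is the one quoted later in this thread; the small gap versus the $2649 above is just the exact exchange rate used):

```python
# £6/week saved, over a 5-year upgrade cycle, converted to USD.
weekly_gbp = 6
gbp_saved = weekly_gbp * 52 * 5      # £1560 over five years
usd_saved = gbp_saved * 1.70         # assumed rate of $1.70 per £1
print(gbp_saved, round(usd_saved))   # 1560 2652
```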
Why do they ship with a 1.4 V VCORE? My guess is that some of the worse chips really do need the 1.4 V, and AMD chose to play it safe.
Not that lowering it would suddenly turn them into power-sipping powerhouses, but every bit helps :toast:
That's 80 watts all the time, as my tower is typically on all the time.
24 hours in a day, 365 days in a year, which is 8760 hours.
We want watt-hours, because that is how at least I get billed for power.
So 80 watts × 8760 hours is 700,800 watt-hours, or about 700 kWh. That is almost a month of electricity for me over the course of one year, somewhere around 1/15th of my total power consumption, and electricity costs me ~130 USD every month. That's me saving over 100 dollars every year. I've had my tower for 2 years (assuming I lived in the same place; I used to pay almost 50% more for electricity where I used to live), so that's around 200 dollars saved over the old tower in 2 years, and if you factored in the cost of electricity where I used to live, it would be higher.
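The arithmetic above can be written out in a few lines. The $0.15/kWh rate is an assumption, back-of-envelope from the ~$130/month bill mentioned, not a figure from the post:

```python
# Annual energy and cost of a constant 80 W draw, always on.
watts = 80
hours_per_year = 24 * 365                      # 8760 hours
kwh_per_year = watts * hours_per_year / 1000   # 700.8 kWh

rate_usd_per_kwh = 0.15                        # assumed rate, illustrative only
annual_cost_usd = kwh_per_year * rate_usd_per_kwh
print(kwh_per_year, round(annual_cost_usd, 2))  # 700.8 105.12
```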
The simple point is, it doesn't have to be hundreds of computers for it to add up. It depends on how much you already use and how often the tower is on and drawing power. You underestimate how much power computers use, even more so if they're always on.
I know crunchers can attest to this: with machines under load 24/7, a 100-watt reduction can make a significant difference to your power bill.
Let's use realistic numbers and say there's a 100 W difference in platform power consumption (motherboard + CPU + memory) between an overclocked i5-2500K system and an overclocked Q6600 system. Even running your PC 24 hours per day, over 168 hours (one week) you will only have saved 16.8 kWh of electricity. That means you would have to pay at least £0.36/kWh ($0.61/kWh) to save £6 per week. If you also shut off your computer while you are asleep (say 6 hours per night), you need an electricity rate of at least £0.48/kWh ($0.82/kWh) to save £6 per week.
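The break-even rates above can be checked with a quick sketch (the 100 W platform difference is the assumption from that post):

```python
# Electricity rate (per kWh) needed to save £6/week from a 100 W reduction.
watts_saved = 100
kwh_per_week_24_7 = watts_saved * 24 * 7 / 1000   # 16.8 kWh, PC on 24/7
kwh_per_week_18h = watts_saved * 18 * 7 / 1000    # 12.6 kWh, off 6 h/night

target_gbp = 6.0
print(round(target_gbp / kwh_per_week_24_7, 2))   # 0.36 £/kWh needed
print(round(target_gbp / kwh_per_week_18h, 2))    # 0.48 £/kWh needed
```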
I think you changed more than just the processor and that's where you see the power savings. If you change from a CRT to an LCD, use a lower power GPU, use your computer less, or even change some other usage pattern in your household, then the £6/week is possible, but that savings from a CPU change alone seems exaggerated.
Note: I'm not disputing that buying a processor that consumes less electricity can make financial sense, just that saving $2649 over five years from a CPU change alone is unrealistic. Aquinus's scenario of $100/year is much more realistic.
With Britain being the 4th highest per unit for electricity in the EU, whilst the USA is THE least expensive of any OECD nation,
perhaps that has more of a factor than you imagine? Also, with conversion rates you get almost $2 for £1.
Also, the cheapest tariff I can get is 17.62p per kWh, but that's only if I have an annually or quarterly paid bill. For reasons which are obvious to shareholders, our provider won't give you that unless you are (a) already using it or (b) have a medical reason for having it.
So we have to go out and buy pre-paid cards, and for the benefit of that they also add on a rental charge for the meter.
So I actually have an 18.56p per kWh unit price, plus an extra payment taken from my electricity credit for meter rental.
I will however concede that I have changed from a CRT TV to an LCD in the living room, which could well impact the bill,
but the wife told me we now spend on average £6 less per week on electricity,
and the PC is on 24/7.
Also, you exaggerate again. The exchange rate is 1.7 USD to £1.
And my actual tariff is...
25.060p per kWh, with a 24.950p per-day rental charge.
I think this all factors in a bit,
so I am certain that a lower-power CPU will make a pretty substantial annual difference to me, lol.
That would work out at the equivalent of 43.48c per kWh if the USA had to pay it.
(I would call $1.70 for £1 almost $2, lol. I wasn't going to go check the actual rates but knew it was close.)
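One way to see how that standing charge "factors in" is to fold it into an effective per-kWh rate. A minimal sketch using the tariff quoted above; the daily-usage figures are illustrative assumptions, not from the post:

```python
# Effective rate once the fixed daily meter charge is spread over usage.
UNIT_RATE_P = 25.060         # pence per kWh, from the tariff above
STANDING_P_PER_DAY = 24.950  # pence per day meter/rental charge

def effective_rate_p_per_kwh(kwh_per_day):
    """Unit rate plus the standing charge amortised over daily usage."""
    return UNIT_RATE_P + STANDING_P_PER_DAY / kwh_per_day

# A heavier user dilutes the standing charge; a lighter user feels it more.
for usage in (5, 10, 20):    # assumed daily consumption in kWh
    print(usage, round(effective_rate_p_per_kwh(usage), 2))
```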
AMD was recording record revenue - i.e. plenty of people buying their CPUs and chipsets - and what did they do with those revenues?
- Paid twice the book value for ATI, when they were the only interested buyer.
- Realized, after spending $2.7 billion too much, that they had better try to balance the books by selling their mobile IP for a paltry $65 million to Qualcomm. $65 million + some R&D = Adreno (shuffle the letters around and you get... Radeon).
- Not content to toss money away like a drunk Dallas Cowboy at a strip club, they then got themselves into a position where they have to pay their OWN former foundry business NOT to make chips for them.
I'm all for supporting the underdog ( How about VIA?), but giving money to the financially irresponsible is called "enabling" not "supporting". I'd like to think that AMD have turned over a new leaf, but then I look at the BoD and note how many of that group still draw a salary despite presiding over the financial clusterf_ck that caused AMD's present problems.
It's understandable that the 940 consumes what, around 10-15 W more than the 945? But even adding that up, it would still leave a far-from-negligible gap to the i7-3820, so I'm curious where you pulled that 80 W difference from.
So okay, assuming they did the same amount of work, they're the same. Oh wait, how much slower is the 945 against the 3820 again? Look at the numbers: most of them show the 3820 to be twice as fast as the 945, according to the review that you took that screenshot from. So let's assume you have both CPUs and the 3820 spends half as much time doing the same job, because it does twice as much work in the same amount of time (more or less, but on average I would say that is correct).
So let's record over a two-day window of work, where the 3820 needs 0.75 days of load to finish what takes the 945 1.5 days. That's the 3820 running at 166 watts for 0.75 days (124.5 watt-days), plus idling at 64 watts for the remaining 1.25 days (80 watt-days), for a total of 204.5 watt-days.
Take the 945, loaded for twice as long (1.5 days) and idle for only 0.5 days: 173 watts × 1.5 days is 259.5 watt-days, plus idle of 0.5 days at 79 watts (39.5 watt-days), which totals 299 watt-days.
So the actual power-used difference is right there. If the 945 draws more power, subtracting the 945's figure from the 3820's should yield a negative number.
299 watt-days − 204.5 watt-days = 94.5 watt-days (about 2.3 kWh) difference for the same workload, recorded over the run time of the longest-running CPU.
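The two-day accounting can be written out explicitly. The per-phase wattages and durations are the ones from the post above; this keeps exact totals rather than rounding the 945's figure up to 300:

```python
# Work-normalised comparison: both CPUs do one job over a 2-day window.
def watt_days(load_w, load_days, idle_w, idle_days):
    """Total energy in watt-days: load phase plus idle phase."""
    return load_w * load_days + idle_w * idle_days

i7_3820 = watt_days(166, 0.75, 64, 1.25)   # 204.5 watt-days
x4_945 = watt_days(173, 1.50, 79, 0.50)    # 299.0 watt-days

diff_watt_days = x4_945 - i7_3820
# Convert watt-days to kWh for scale: × 24 hours, ÷ 1000.
print(diff_watt_days, round(diff_watt_days * 24 / 1000, 2))  # 94.5 2.27
```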
Now, that's the calculated difference from Canucks' figures. Hilbert Hagedoorn would disagree with the 3820 power consumption numbers there, and it doesn't help that they're stressing the GPU, which has nothing to do with CPU load if you're already maxing it out.
What they leave out is what happens when you overclock the 945: it ate power in a similar manner to Hilbert's graph for the 8150 (not as bad, but it got up there).
So not only am I talking about machines that are overclocked; my point about saving power holds true even at stock speeds, and the power gap only becomes more apparent the more you overclock.
I thank you for the challenge, but a little more research might be in order before making such claims.