Wednesday, July 3rd 2013

AMD Readies a Pair of Energy-Efficient Socket FM2 APUs

AMD's A-series "Richland" APUs are great for HTPC builds, but their 65W to 100W TDPs can be a put-off for some. The company is working on a pair of energy-efficient socket FM2 APUs based on the same silicon, with TDPs rated at 45W. The first of the two is the A10-6700T, which features a significantly lower CPU clock speed of 2.50 GHz, an unknown TurboCore speed, and a GPU clock speed of 720 MHz. It's not known whether the chip is dual-core or quad-core, but given that the quad-core, non-K A10-6700 is rated at 65W, it's not improbable for the A10-6700T to be quad-core as well. Also unknown is the stream processor count. Moving on, we have the A8-6500T. Its CPU clock speed is lowered further, to 2.10 GHz, while its GPU runs at 720 MHz, like the A10-6700T's. Its CPU core and stream processor counts are likewise under wraps.

Source: Guru3D

80 Comments on AMD Readies a Pair of Energy-Efficient Socket FM2 APUs

#1
newtekie1
Semi-Retired Folder
by: Ikaruga
I do not have any idea (none whatsoever tbh) how this came up in this thread (or from where), but an i3 PC which eats 95W when it's idle? Seriously..... 95W when it's idle? Are you trolling?
It has a 7970 in the system, did you even bother to read?
Posted on Reply
#2
arbiter
by: Ikaruga
I do not have any idea (none whatsoever tbh) how this came up in this thread (or from where), but an i3 PC which eats 95W when it's idle? Seriously..... 95W when it's idle? Are you trolling?
Well, what I guess is that since a 7970 GPU was paired in, and AMD does do things in their chipsets to help performance and power usage with their own GPUs and CPUs, that is probably the reason why idle power usage is higher. The reviewer should have dropped in an NVIDIA card to see whether the results are in the same range or not.

With that being said, NO ONE would pair one of the LOWEST-end i3 CPUs with a top-of-the-line GPU. It's an oxymoron.
Posted on Reply
#3
drdeathx
by: Ikaruga
I do not have any idea (none whatsoever tbh) how this came up in this thread (or from where), but an i3 PC which eats 95W when it's idle? Seriously..... 95W when it's idle? Are you trolling?
Ummm no, I did a head to head with the i3. Those are MY results...:nutkick:.


Granted, if you read my response, it was done with a 7970..... Just showing the difference between the i3-3220 and the 5800K. Now, the APUs AMD is releasing have lower core speeds and lower voltage, so the difference between the i3 and the ones posted in this thread will be minimal, hence, again, the smoking comment.

by: arbiter
Well, what I guess is that since a 7970 GPU was paired in, and AMD does do things in their chipsets to help performance and power usage with their own GPUs and CPUs, that is probably the reason why idle power usage is higher. The reviewer should have dropped in an NVIDIA card to see whether the results are in the same range or not.

With that being said, NO ONE would pair one of the LOWEST-end i3 CPUs with a top-of-the-line GPU. It's an oxymoron.
Agreed. I only had a 7970 at the time of the head-to-head and was looking only at CPU vs. APU performance, not graphics, in real-world benchmarks... The 5800K performed pretty much head to head with the i3-3220, with the 5800K being slightly slower in single-threaded benchmarks and ahead in multi-threaded ones. Overclocked, the 5800K pretty much owned the 3220. Now, that was Ivy; Haswell will surely be a bit more efficient, and as we have recently seen, single-threading is the same as Ivy but multi-threading is a bit better than Ivy with Haswell.


by: newtekie1
It has a 7970 in the system, did you even bother to read?
Thanks Teckie.. obviously the answer is he didn't.....
Posted on Reply
#4
Ikaruga
by: drdeathx
Ummm no, I did a head to head with the i3. Those are MY results...:nutkick:.


Granted, if you read my response, it was done with a 7970..... Just showing the difference between the i3-3220 and the 5800K. Now, the APUs AMD is releasing have lower core speeds and lower voltage, so the difference between the i3 and the ones posted in this thread will be minimal, hence, again, the smoking comment.
I obviously saw that you used a 7970; I still don't get why you replied that to me or why you would post it here (or anywhere tbh).. and 95W is still too much.

by: drdeathx
Agreed. I only had a 7970 at the time of the head-to-head and was looking only at CPU vs. APU performance, not graphics, in real-world benchmarks... The 5800K performed pretty much head to head with the i3-3220, with the 5800K being slightly slower in single-threaded benchmarks and ahead in multi-threaded ones. Overclocked, the 5800K pretty much owned the 3220. Now, that was Ivy; Haswell will surely be a bit more efficient, and as we have recently seen, single-threading is the same as Ivy but multi-threading is a bit better than Ivy with Haswell.
Again, as soon as you overclock that thing, you might as well go for an i5, since you will end up with the same cost in the long run. Hell, you actually made me Google things I was sure about and knew already:/

etc.....
Posted on Reply
#5
cdawall
where the hell are my stars
by: Ikaruga
I love that everybody takes only the very last sentence of my comment(s) and goes on with that. I was talking about the subject in my previous post, namely that AMD CPUs are simply leaking crap. They eat a ton of electricity and also heat up quite easily (but the IGP parts are decent ones tho). They are basically selling stuff in 2013 which is on par with what Intel had 5 years(!) ago, that's a lot of time in computer technology terms tbh, and it should be half the price at least. The problem is that people keep buying from them only because they are cheaper, so they are not really forced to start producing better chips.

ps.: It must also be noted that the i5+GTX650 combo is not really twice as expensive, because you have to OC the AMD chip while you can undervolt/underclock the Intel+NV combo at the same time, and if you add up the difference over (let's say) three years of electricity costs, you might not end up that far off.
First off, the electricity cost will take well over a decade to even out with my current pricing. So that is moot.

Second off, I don't think you actually know what high-leakage chips are. It has nothing to do with wattage. There are high-leakage chips from any and all manufacturers. You might want to sit down and do some more research on how a transistor works. You also may want to look into what an AMD chip is better at. There are people that use them (like the entire server industry) because in a highly multithreaded environment they perform better.

Not aimed at you, but I just want to put it out there: "real world benchmarks" is an oxymoron. Use both this and the i3 in the real world and try to spot the difference. There isn't an office program or internet browser that will be mystically faster on anything beyond a 5-year-old Core 2 Duo. Remember, that's what most PC owners use their PC for.
Posted on Reply
#6
Ikaruga
by: cdawall
First off, the electricity cost will take well over a decade to even out with my current pricing. So that is moot.

Second off, I don't think you actually know what high-leakage chips are. It has nothing to do with wattage. There are high-leakage chips from any and all manufacturers. You might want to sit down and do some more research on how a transistor works. You also may want to look into what an AMD chip is better at. There are people that use them (like the entire server industry) because in a highly multithreaded environment they perform better.

Not aimed at you, but I just want to put it out there: "real world benchmarks" is an oxymoron. Use both this and the i3 in the real world and try to spot the difference. There isn't an office program or internet browser that will be mystically faster on anything beyond a 5-year-old Core 2 Duo. Remember, that's what most PC owners use their PC for.
I know what "leakage chips" are and I also know well how transistors work. I also know where AMD chips are better, but we are talking about APUs here (and to stay on topic, I do think that AMD APUs have a great future, as I stated on this forum quite a long time ago). The thing is that these chips are just not good enough to replace the CPU+vidcard combo in desktops (well, almost there, but not yet) and - in my opinion - AMD seriously needs to work on their CPU cores, both IPC and power consumption wise, while I also think that their GPUs and IGPs are already great.


ps.: I know what you mean, for example I'm typing this from a 775 rig. It's not my main computer ofc, but this is what I have hooked up to the big screen; it's dead silent and looks awesome, so I love it. It's already more than enough for browsing the net, so I refuse to replace it while it still works;)
Posted on Reply
#7
de.das.dude
Pro Indian Modder
by: cdawall
Not aimed at you, but I just want to put it out there: "real world benchmarks" is an oxymoron. Use both this and the i3 in the real world and try to spot the difference. There isn't an office program or internet browser that will be mystically faster on anything beyond a 5-year-old Core 2 Duo. Remember, that's what most PC owners use their PC for.
amen
Posted on Reply
#8
cdawall
where the hell are my stars
by: Ikaruga
I know what "leakage chips" are and I also know well how transistors work. I also know where AMD chips are better, but we are talking about APUs here (and to stay on topic, I do think that AMD APUs have a great future, as I stated on this forum quite a long time ago). The thing is that these chips are just not good enough to replace the CPU+vidcard combo in desktops (well, almost there, but not yet) and - in my opinion - AMD seriously needs to work on their CPU cores, both IPC and power consumption wise, while I also think that their GPUs and IGPs are already great.


ps.: I know what you mean, for example I'm typing this from a 775 rig. It's not my main computer ofc, but this is what I have hooked up to the big screen; it's dead silent and looks awesome, so I love it. It's already more than enough for browsing the net, so I refuse to replace it while it still works;)
Really, you know what a high-leakage chip is? Well, explain away. You have not used the term correctly in a single one of your posts. So what you know and what you think you know might need some realigning.

Also, I think you fail to see the reason behind HSA and AMD's new cores. You might want to go look into them before calling them bad designs. Realize that AMD has made a package it can add and remove x86, ARM and GPU cores to willy-nilly to suit whatever needs it has. Welcome to the bigger picture. :shadedshu
Posted on Reply
#9
Ikaruga
by: cdawall
Really, you know what a high-leakage chip is? Well, explain away. You have not used the term correctly in a single one of your posts. So what you know and what you think you know might need some realigning.

Also, I think you fail to see the reason behind HSA and AMD's new cores. You might want to go look into them before calling them bad designs. Realize that AMD has made a package it can add and remove x86, ARM and GPU cores to willy-nilly to suit whatever needs it has. Welcome to the bigger picture. :shadedshu
I think I only said once that it's "leaking all over the place" or something similar, and I never called them a bad design, nor did I ever think such a thing. Having thought about and worked with/on computers every day since the '80s, I do understand all the aspects you mentioned, and more than that, I also think its scalability is well beyond its competition. My point was that it's not good enough in its current state to replace the gfx-card+Intel combo in desktops (please note that this is a PC enthusiast site), and whoever says otherwise is just excited about its powerful IGP part imo.
Instead of improving them, AMD keeps selling these APUs which consume too much power and lack single-threaded speed compared to the competition. Yes, I understand they "found" some bins now which don't eat that much power if underclocked, but that's not something I would call great "progress". I do root for AMD to make something which stomps Intel into the ground, because fierce competition would be the best thing to happen to us, but I just don't see it CPU-wise in Trinity/Richland. It's still not enough to run AAA titles, so an i3-3225 is more than enough where you would want an APU like this. Perhaps the next one will finally be much better, and believe me, I will be the first to praise and acknowledge a job well done.
Posted on Reply
#10
drdeathx
by: Ikaruga
I know what "leakage chips" are and I also know well how transistors work. I also know where AMD chips are better, but we are talking about APUs here (and to stay on topic, I do think that AMD APUs have a great future, as I stated on this forum quite a long time ago). The thing is that these chips are just not good enough to replace the CPU+vidcard combo in desktops (well, almost there, but not yet) and - in my opinion - AMD seriously needs to work on their CPU cores, both IPC and power consumption wise, while I also think that their GPUs and IGPs are already great.


ps.: I know what you mean, for example I'm typing this from a 775 rig. It's not my main computer ofc, but this is what I have hooked up to the big screen; it's dead silent and looks awesome, so I love it. It's already more than enough for browsing the net, so I refuse to replace it while it still works;)
APUs are replacing desktops with GPUs ATM. Heck, a huge part of the gaming community now uses laptops to game. It will not be long before APUs catch up to the graphics in high-end laptops, making the dedicated GPU market shrink even more. The performance of these APUs is more than good enough and their CPU cores are fine. That is what I criticized you about earlier...
Posted on Reply
#11
Ikaruga
by: drdeathx
APUs are replacing desktops with GPUs ATM. Heck, a huge part of the gaming community now uses laptops to game. It will not be long before APUs catch up to the graphics in high-end laptops, making the dedicated GPU market shrink even more. The performance of these APUs is more than good enough and their CPU cores are fine. That is what I criticized you about earlier...
Yes, it always happens like this, and we've talked about this many times, like we did more than a year ago in this other thread:
by: Ikaruga
This is just progression and it's been happening for as long as I've known computers.
As technology advances, they put more and more components into one chip to make it more efficient, but as those components get more and more complex (or when new components are introduced), they have to make them separate, because they get too big.
And the cycle continues... :)
by: Ikaruga
No, I meant something like the "Wheel of Reincarnation", the thing that happened with the coprocessors, the CPU instruction extensions and the various controllers, or perhaps what will happen (or is already happening?) with the graphics (GPU) or audio chips in the future.
That's why I think that AMD is on the right track with the APUs, which are really great; I just want them to finally make good cores, and I get a little sad when they come out with things like these. And their CPU cores are not "fine" by my standards.
Posted on Reply
#12
drdeathx
by: Ikaruga
Yes, it always happens like this, and we've talked about this many times, like we did more than a year ago in this other thread:



That's why I think that AMD is on the right track with the APUs, which are really great; I just want them to finally make good cores, and I get a little sad when they come out with things like these. And their CPU cores are not "fine" by my standards.
The Piledriver cores are pretty damn good. Heck, Trinity with Piledriver cores is pretty much on par with Phenom II performance, with an on-die GPU, for $140. That is a great milestone in my opinion. If AMD keeps getting 10-15% better performance and continues to shrink the die with more transistors, APUs may kill the CPU market, which I think they are doing. Intel does not call them APUs, but the GPU is on the die up through the 4770K.
Posted on Reply
#13
Fourstaff
by: cdawall
First off, the electricity cost will take well over a decade to even out with my current pricing. So that is moot.
Consider yourself lucky; if you are paying less than £1/watt/year (based on a 12hr/day usage pattern) over on this side of the pond, I think you have signed up with the cheapest electricity provider ever.
Posted on Reply
#14
newtekie1
Semi-Retired Folder
by: Fourstaff
Consider yourself lucky; if you are paying less than £1/watt/year (based on a 12hr/day usage pattern) over on this side of the pond, I think you have signed up with the cheapest electricity provider ever.
Currently I pay ~$0.12/kWh. So let's assume I'm using the computer for 12 hours a day and the power consumption difference is 50W. The money saved by using the lower-powered computer is about ~$26 a year. If I only use the computer 8 hours a day, the cost savings drop to ~$17.50 a year.
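For anyone who wants to plug in their own numbers, here is a rough Python sketch of that math (the rate, wattage gap and hours per day below are just the assumptions from this post; swap in your own):

# rough annual-savings calculator (inputs are assumptions, not measurements)
def annual_savings(rate_per_kwh, watt_difference, hours_per_day):
    kwh_per_year = watt_difference / 1000.0 * hours_per_day * 365
    return kwh_per_year * rate_per_kwh

print(annual_savings(0.12, 50, 12))  # ~$26 a year at 12 hours/day
print(annual_savings(0.12, 50, 8))   # ~$17.50 a year at 8 hours/day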
Posted on Reply
#15
Fourstaff
by: newtekie1
Currently I pay ~$0.12/kWh. So let's assume I'm using the computer for 12 hours a day and the power consumption difference is 50W. The money saved by using the lower-powered computer is about ~$26 a year. If I only use the computer 8 hours a day, the cost savings drop to ~$17.50 a year.
16p/kWh here, for 70p/W/yr on a 12hr day. For a 20W difference it's £14/year; judging by how long-lasting i7 920 rigs are, we are looking at at least 5 years of usage. So that gives you £70 in difference, or the price difference between getting an i5 3570K and a 5800K. Now, I know that the £70 is on the high side (and quite unrealistic given that the difference is less than 20W), but even at half that value I think people would appreciate a much faster chip for a bit more money spent overall. That is before overclocking.
Posted on Reply
#16
cdawall
where the hell are my stars
by: Fourstaff
16p/kWh here, for 70p/W/yr on a 12hr day. For a 20W difference it's £14/year; judging by how long-lasting i7 920 rigs are, we are looking at at least 5 years of usage. So that gives you £70 in difference, or the price difference between getting an i5 3570K and a 5800K. Now, I know that the £70 is on the high side (and quite unrealistic given that the difference is less than 20W), but even at half that value I think people would appreciate a much faster chip for a bit more money spent overall. That is before overclocking.
That doesn't include a GPU. The i5 3570K isn't even in the same league for onboard graphics, and that is the entire point of the A10-5800K/6800K.
Posted on Reply
#17
Fourstaff
by: cdawall
That doesn't include a GPU.
If the IGP of the 5800K is powerful enough, and the IGP of the 3570K is not powerful enough, obviously the 5800K is the clear winner. However, if we are going for a discrete solution, I think it's only fair that both use the same card (and consequently should have the same power consumption).
Posted on Reply
#18
cdawall
where the hell are my stars
by: Fourstaff
If the IGP of the 5800K is powerful enough, and the IGP of the 3570K is not powerful enough, obviously the 5800K is the clear winner. However, if we are going for a discrete solution, I think it's only fair that both use the same card (and consequently should have the same power consumption).

http://images.anandtech.com/graphs/graph6347/50410.png
Looking at this graph from above, that 20W you are "saving" is only under load. So let me ask you: how long is your CPU under 100% load? Because at idle AMD is doing better, and you should be comparing the FX-6300 or FX-8320 if you want to compare an actual CPU that competes with the 3570K; why you would compare it to an APU is beyond me. You do know the 5800K directly compares with the FX-4XX0 series, right?

So, comparing on that side, the 3570K is $220, the FX-4300 is $100 and the FX-6300 is $120. Let's compare CPUs to CPUs and not APUs, since you are running a discrete card in your scenario.
Posted on Reply
#19
Fourstaff
by: cdawall
http://images.anandtech.com/graphs/graph6347/50410.png

Looking at this graph from above, that 20W you are "saving" is only under load. So let me ask you: how long is your CPU under 100% load? Because at idle AMD is doing better, and you should be comparing the FX-6300 or FX-8320 if you want to compare an actual CPU that competes with the 3570K; why you would compare it to an APU is beyond me. You do know the 5800K directly compares with the FX-4XX0 series, right?

So, comparing on that side, the 3570K is $220, the FX-4300 is $100 and the FX-6300 is $120. Let's compare CPUs to CPUs and not APUs, since you are running a discrete card in your scenario.
Yes, you are right. How I arrived at the conclusion that the 3570K idles less than 20W below the 5800K is beyond me, but it's probably from some obscure test which involves undervolting and underclocking (and consequently not an apples-to-apples comparison). Anyway, I was trying to highlight the fact that power consumption can indeed make a difference in overall cost (and, situationally, cooling costs), something which people should be aware of rather than brush aside. It works perfectly fine in the States, where electricity is cheap and plentiful, but over on this side of the pond electricity is quite a lot more expensive.
Posted on Reply
#20
cdawall
where the hell are my stars
by: Fourstaff
Yes, you are right. How I arrived at the conclusion that the 3570K idles less than 20W below the 5800K is beyond me, but it's probably from some obscure test which involves undervolting and underclocking (and consequently not an apples-to-apples comparison). Anyway, I was trying to highlight the fact that power consumption can indeed make a difference in overall cost (and, situationally, cooling costs), something which people should be aware of rather than brush aside. It works perfectly fine in the States, where electricity is cheap and plentiful, but over on this side of the pond electricity is quite a lot more expensive.
Your CPU is still not under load that often; unless you are gaming/encoding 8hrs a day, I fail to see when that 3570K's power savings will pay for themselves. Even then, if you are encoding, there is a good chance the AMD chip will be faster at encoding, so it may be a moot point.
Posted on Reply
#21
Fourstaff
by: cdawall
Your CPU is still not under load that often; unless you are gaming/encoding 8hrs a day, I fail to see when that 3570K's power savings will pay for themselves. Even then, if you are encoding, there is a good chance the AMD chip will be faster at encoding, so it may be a moot point.
No, the 3570K will not save enough power to justify the increase in cost against the 5800K; if anything, it draws more power. I was thinking about the hypothetical (and off-topic) 3770K vs. 8350 comparison, both overclocked.
Posted on Reply
#22
cdawall
where the hell are my stars
by: Fourstaff
No, the 3570K will not save enough power to justify the increase in cost against the 5800K; if anything, it draws more power. I was thinking about the hypothetical (and off-topic) 3770K vs. 8350 comparison, both overclocked.
There was a review done on that. It would take several years running your PC at 100% load for 8-12hrs a day to make a difference.

Both of these have good idle states, which covers browsing the web and what most people do for the vast majority of the time.
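Just to put rough numbers behind the "several years" point, here's a quick back-of-the-envelope sketch along the same lines as the one earlier in the thread; the price premium, wattage gap and loaded hours here are purely hypothetical placeholders, not figures from that review:

# hypothetical break-even estimate (all inputs are made-up placeholders)
def break_even_years(price_premium, rate_per_kwh, watt_difference, load_hours_per_day):
    yearly_savings = watt_difference / 1000.0 * load_hours_per_day * 365 * rate_per_kwh
    return price_premium / yearly_savings

# e.g. a $100 price premium, a 40W gap under load, 8 loaded hours a day at $0.12/kWh
print(break_even_years(100, 0.12, 40, 8))  # roughly 7 years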
Posted on Reply
#23
drdeathx
by: Fourstaff
Yes, you are right. How I arrived at the conclusion that the 3570K idles less than 20W below the 5800K is beyond me, but it's probably from some obscure test which involves undervolting and underclocking (and consequently not an apples-to-apples comparison). Anyway, I was trying to highlight the fact that power consumption can indeed make a difference in overall cost (and, situationally, cooling costs), something which people should be aware of rather than brush aside. It works perfectly fine in the States, where electricity is cheap and plentiful, but over on this side of the pond electricity is quite a lot more expensive.
With all due respect, it doesn't idle lower.
Posted on Reply
#24
Fourstaff
by: drdeathx
With all due respect, it doesn't idle lower.
With all due respect, I agreed with that in the first line of the post.
Posted on Reply
#25
newtekie1
Semi-Retired Folder
I finally managed to find my Kill-a-Watt:


This is Rig2 in my sig so:
4.4GHz CPU Overclock
1013MHz GPU Overclock

Idle... 50W! And this is driving 3 monitors.
Posted on Reply