Friday, January 17th 2014

AMD Readies 16-core Processors with Full Uncore

AMD released developer documentation for a new processor it's working on, and the way it's worded describes a chip with 8 modules, working out to 16 cores, on a single piece of silicon, referred to as Family 15h Models 30h - 3fh. This is not to be confused with the company's Opteron 6300-series "Abu Dhabi" chips, which are multi-chip modules of two 8-core dies, in the G34 package.

What's more, unlike the current "Abu Dhabi" and "Seoul" chips, the new silicon features a full-fledged uncore, complete with a PCI-Express gen 3.0 root complex integrated into the processor die. In further proof that this is a single die with 8 modules, and not an MCM of two dies with 4 modules each, the document describes the die as featuring four HyperTransport links, letting it pair with three other processors in 4P multi-socket configurations. Such systems would feature a total core count of 64. There's no clarity on which exact micro-architecture the CPU modules are based on. Without doubt, AMD is designing this chip for its Opteron enterprise product stack, but it should also give us a glimmer of hope that AMD could continue to serve up high-performance client CPUs, albeit ones that can't be based on socket AM3+.
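As a quick back-of-the-envelope check of those numbers (a hypothetical sketch; the module, core, and socket counts are taken from the description above, not from AMD documentation):

```python
# Core-count arithmetic for the described chip (figures from the article above).
MODULES_PER_DIE = 8    # single die with 8 modules
CORES_PER_MODULE = 2   # each AMD module packs two integer cores
SOCKETS = 4            # 4P configuration linked via HyperTransport

cores_per_socket = MODULES_PER_DIE * CORES_PER_MODULE
total_cores = cores_per_socket * SOCKETS

print(cores_per_socket)  # 16
print(total_cores)       # 64
```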

Source: Planet3DNow.de

92 Comments on AMD Readies 16-core Processors with Full Uncore

#1
Thefumigator
by: arbiter
AMD single-thread is good enough? Yeah, really? How many games at this point in time use more than 4 threads? ...because I don't think too many games will span much past 4 cores even at the top end, which puts AMD behind a bit. On top of that, they use 50% more power than the competitor's CPU.
As the PS4 and Xbox One are based on 8-core CPUs, there's the possibility that in the near future games will be using more than 4 cores.

by: arbiter
Heck, even a lot of programs don't really use much more than 2.
If I open all the programs I use every day, I get 100% CPU usage on my FX8320. That includes a lot of Excel, encoding audio and video, burning, browsing (like 50 tabs or so), and some calculus programs for other stuff. Sometimes I even do some gaming on this thing while doing all those kinds of things in the background.

by: arbiter
Yeah, an AMD CPU looks good because of the cheaper initial cost, but over a year or 2 that cost evens out when it adds up on the electric bill.
Disagree. An electric bill is made up of several electrical appliances; an oven alone may draw 3000 watts unless you use propane, and an air conditioner peaks at around 1200 watts. Then you have a microwave oven, all the light bulbs, and the water heater tank, and 90% of the bill will be appliances. I don't believe the choice of an AMD CPU over Intel will really change the game on the bill enough to make it a useless choice after just 2 years. If what you need is to browse the internet and visit Facebook, get an AMD C-60, it's just 8 watts, and you'll be choosing correctly. If you want to save the planet, and our wallets, we should stop gaming altogether, but choosing an Intel over an AMD will not "save our wallets" in the long run; the difference in power consumption and efficiency is not the catastrophe you described.
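To put rough numbers on that argument (a minimal sketch; the wattage gap, daily hours of load, and electricity price below are illustrative assumptions, not measured figures):

```python
# Yearly bill impact of a CPU drawing `extra_watts` more than a rival (all inputs assumed).
def yearly_cost_delta(extra_watts, hours_per_day, price_per_kwh):
    """Extra cost per year of `extra_watts` of additional draw, at `price_per_kwh` $/kWh."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# Example: 50 W more under load, 4 hours a day, $0.15 per kWh
print(round(yearly_cost_delta(50, 4, 0.15), 2))  # 10.95
```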
#2
NeoXF
by: Thefumigator
As the PS4 and Xbox One are based on 8-core CPUs, there's the possibility that in the near future games will be using more than 4 cores.
I don't remember exactly what they said about Mantle, but it seemed it can scale beyond 8 cores as well, independently of how the game engine is coded. And it'd be about time; it might be tricky, but it makes a lot more sense to make the game engine/API future-proof so as to take advantage of the tech of tomorrow as well, doesn't it? Anyway, here's a bit of a mind-twister... imagine, somewhere down the HSA line, having games and their APIs doing draw calls off of the APU itself, rather than just the x86 cores... GPUception... O_O
#3
BiggieShady
by: lilhasselhoffer
1) AMD chips have a higher TDP than Intel, so they must be more efficient. No. AMD and Intel do measure chips differently. Between chips being measured differently, and completely differing architecture, efficiency cases can be made for both sides. There is no clear winner here.
Actually, there is a clear winner here and it's Intel. AMD CPU efficiency comes close to Intel's only for the kinds of tasks that the AMD architecture favors. Meaning, you would need heavily multi-threaded code with no floating-point instructions at all (only integer SIMD and SISD instructions) to have somewhat comparable efficiencies.
#4
Thefumigator
by: BiggieShady
Actually, there is a clear winner here and it's Intel. AMD CPU efficiency comes close to Intel's only for the kinds of tasks that the AMD architecture favors. Meaning, you would need heavily multi-threaded code with no floating-point instructions at all (only integer SIMD and SISD instructions) to have somewhat comparable efficiencies.
Still not clear enough to care. Everybody is talking as if there were a massacre of a difference, and it's not that deep.
#5
Blue-Knight
by: Thefumigator
Everybody is talking as if there were a massacre of a difference, and it's not that deep.
For me, even 1 watt is enough to choose Intel instead of AMD. Why? Energy costs a lot where I live; a few watts less is better than a few watts more on my energy bill.

The less the better, no matter if 1 or 1/x. Just my stupid opinion.
#6
Thefumigator
by: Blue-Knight
For me, even 1 watt is enough to choose Intel instead of AMD. Why? Energy costs a lot where I live; a few watts less is better than a few watts more on my energy bill.

The less the better, no matter if 1 or 1/x. Just my stupid opinion.
I don't find anything stupid in your opinion; it's just your opinion. However, take into consideration that if that is your reason for choosing one over the other, then I assume you do the same for the rest of your home appliances: an LED TV over LCD, as an example to begin with. If you only care about the CPU and ignore the rest of your home, then your opinion wouldn't be honest enough. But I don't think it's stupid.
#7
eidairaman1
Bring on the hexadecimal core for desktop
#8
Melvis
by: Blue-Knight
For me, even 1 watt is enough to choose Intel instead of AMD. Why? Energy costs a lot where I live; a few watts less is better than a few watts more on my energy bill.

The less the better, no matter if 1 or 1/x. Just my stupid opinion.
You must live in Australia then?
#9
Frick
Fishfaced Nincompoop
by: Blue-Knight
For me, even 1 watt is enough to choose Intel instead of AMD. Why? Energy costs a lot where I live; a few watts less is better than a few watts more on my energy bill.

The less the better, no matter if 1 or 1/x. Just my stupid opinion.
I take it you remove all LEDs from everything you have. You should get night vision goggles so you don't need lights at all. Imagine how much you would save!
#10
Aquinus
Resident Wat-man
Meanwhile, Intel is busy making things like low power SoCs with 8 cores. When it comes to servers, power efficiency can mean a lot if you're running a cluster, a lot of servers, or if your resources are limited. I guess the point I'm trying to make is that these are changes AMD should have started making a long time ago. What boggles me is why they didn't do it sooner. APUs have been designed this way since Llano.
#11
Blue-Knight
by: Thefumigator
an LED TV over LCD, as an example to begin with. If you only care about the CPU and ignore the rest of your home, then your opinion wouldn't be honest enough.
Ah, no. I can watch TV on the computer and eliminate the TV. But I am still using a tube TV.

Why? LED and LCD TVs cost a lot here, and I can't invest in one at the moment. But I would buy one if I could, no doubt. Sometimes saving watts costs a lot too. It's not the same with computers, because I can build a low-power mini-ITX system for less money, not more. But I prefer micro-ATX, as mini-ITX has some disadvantages...

by: Melvis
You must live in Australia then?
No.

by: Frick
You should get night vision goggles so you don't need lights at all. Imagine how much you would save!
I do not live alone. Other people prefer light... What can I do?!
#12
Thefumigator
by: Aquinus
Meanwhile, Intel is busy making things like low power SoCs with 8 cores. When it comes to servers, power efficiency can mean a lot if you're running a cluster, a lot of servers, or if your resources are limited. I guess the point I'm trying to make is that these are changes AMD should have started making a long time ago. What boggles me is why they didn't do it sooner. APUs have been designed this way since Llano.
AMD did a great job with its Kabini lineup; the A4-5000 is a good contender: 4 cores, Radeon GPU, 15 watts.
But that Atom (8 cores!) looks nice on paper. I would like to try it out; it seems a good one, but still at 20 watts.

I found on Amazon an ECS KBN motherboard with an A6-5200: quad core, Radeon GPU, 25 watts.

by: Blue-Knight

I do not live alone. Other people prefer light... What can I do?!
get an intel...!! :lovetpu:
#13
Vinska
by: Blue-Knight
I do not live alone. Other people prefer light... What can I do?!
Change all Your lightbulbs 'n sh*t, including those "energy saving" ones, into LED lights and enjoy a >10x reduction in energy consumption for lighting. Plus, these even cost less than those "power saving" ones, too. Thus, there's no excuse not to get those and start enjoying comically low power consumption.

If You don't already do this, I'd say Your "Intel because every single watt counts" is hypocrisy.

Back on topic:
An 8-core Atom? Please do enlighten me, because I kinda remember that Atoms were friggin' slow compared to AMD's low-power equivalents.
#14
Aquinus
Resident Wat-man
by: Vinska
Change all Your lightbulbs 'n sh*t, including those "energy saving" ones, into LED lights and enjoy a >10x reduction in energy consumption for lighting. Plus, these even cost less than those "power saving" ones, too. Thus, there's no excuse not to get those and start enjoying comically low power consumption.

If You don't already do this, I'd say Your "Intel because every single watt counts" is hypocrisy.

Back on topic:
An 8-core Atom? Please do enlighten me, because I kinda remember that Atoms were friggin' slow compared to AMD's low-power equivalents.
I agree, but Atoms used to also be 32-bit dual cores with only a single memory channel. This thing has two memory channels running DDR3-1600, and 8 cores at 2.4 GHz with a 2.6 GHz boost, all wrapped into a 20-watt-TDP SoC. It also has 16 PCI-E lanes and support for 6 SATA ports and 4 NICs off the CPU. Clearly it's aimed at being a server product.

Someone has to tell me why this doesn't look awesome as a cheap server.
ASRock C2750D4I Mini ITX Server Motherboard FCBGA1283 DDR3 1600/1333
#15
Blue-Knight
by: Thefumigator
AMD did a great job with its Kabini lineup; the A4-5000 is a good contender: 4 cores, Radeon GPU, 15 watts.
I do not like processors with integrated graphics, so I will never use their integrated graphics, only in "emergency" cases.

And I prefer low-end NVIDIA chips, as I never had any problems with their drivers and I am sure they will work for my needs, so why change?!

by: Vinska
"Intel because every single watt counts"
Not only every single watt but also every single instruction per second. Single-core performance is crucial, as I like to disable all the unnecessary cores to use even less power.

And maybe even underclocking (the opposite of overclocking) in extreme cases. I did that already! :D

EDIT:
And to mention another DECISIVE factor for me to choose Intel instead of AMD: CPU pins. I dropped my E2200 onto the ground in December 2013 and I was happy it was not an AMD. Otherwise, it would have bent many pins and given me a lot of headache and a non-working processor.
#16
Melvis
by: Blue-Knight
No.
Then it's considered cheap. Australia is officially the hottest and most expensive place to live on the planet at the moment.
#17
TRWOV
by: Vinska
Change all Your lightbulbs 'n sh*t, including those "energy saving" ones, into LED lights and enjoy a >10x reduction in energy consumption for lighting. Plus, these even cost less than those "power saving" ones, too. Thus, there's no excuse not to get those and start enjoying comically low power consumption.

If You don't already do this, I'd say Your "Intel because every single watt counts" is hypocrisy.

Back on topic:
An 8-core Atom? Please do enlighten me, because I kinda remember that Atoms were friggin' slow compared to AMD's low-power equivalents.
The newer Atoms are 64-bit, out-of-order designs. I don't know how they stack up against Jaguar, but they're not the same Atoms we knew (and loathed).
#18
Blue-Knight
by: Melvis
Australia is officially the hottest and most expensive place to live on the planet at the moment.
Maybe because it is isolated from everything. It makes sense.
#19
Aquinus
Resident Wat-man
by: Blue-Knight
And to mention another DECISIVE factor for me to choose Intel instead of AMD: CPU pins. I dropped my E2200 onto the ground in December 2013 and I was happy it was not an AMD. Otherwise, it would have bent many pins and given me a lot of headache and a non-working processor.
That's nothing. Back when I had a Phenom II 940, the stock cooler was stuck on so badly (the thermal paste had dried right out) that I accidentally ripped the CPU out of the socket, bending a good 100-ish pins. I straightened them out, and while getting it back into the socket took a little work, it eventually went in just fine and ran without an issue.

Ever bent a pin on a motherboard with LGA? It's a bitch to fix, and more often than not you can't fix it. While I like LGA in general because of how the CPU is secured, I'm more worried about bending a pin on an LGA motherboard than on a CPU.
#20
Blue-Knight
by: Aquinus
Ever bent a pin on a motherboard with LGA?
No, that would require no care at all. Bending pins on a CPU is infinitely easier.
#21
theoneandonlymrk
by: Blue-Knight
No, that would require no care at all. Bending pins on a CPU is infinitely easier.
For what purpose have you even replied in this thread.
Your a big telly owning amd hating fool who should stick to using a pda or pad imho, efficiency wtf and as for this last reply he was removing a heatsink harshly .

You dropped ya soddin chip er your worse.

And what the feck any of this has to do with the Op is beyond me.
#22
Blue-Knight
by: theoneandonlymrk
Your a big telly owning amd hating fool
I hate nothing. I am just stating the facts. And I didn't understand ~90% of what you said.

by: theoneandonlymrk
For what purpose have you even replied in this thread.
Because I am stupid and useless.
#23
Aquinus
Resident Wat-man
by: Blue-Knight
No, that would require no care at all. Bending pins on a CPU is infinitely easier.
No way. Over-tightening a cooler alone can cause LGA pins to bend. It's part of the reason why you hear people saying to re-seat the CPU when memory or PCI-E is acting funky, and not to over-tighten the cooler.
#24
Frick
Fishfaced Nincompoop
by: Blue-Knight
I do not like processors with integrated graphics, so I will never use their integrated graphics. Only in "emergency" cases.

And I prefer low-end NVIDIA chips as I never had any problems with their drivers and I am sure they will work for my needs, so why to change?!
Well, a high-end APU's GPU would be quicker than a low-end Nvidia card. It would be quicker than a low-end Intel chip paired with a low-end Nvidia card as well, with the added benefit of Hybrid CrossFire if you find a GPU that works with it. I mean, if you're happy with your current setup you shouldn't change it just because (penis measuring is a terrible thing), but it is a viable alternative should you ever need to upgrade.

by: theoneandonlymrk
For what purpose have you even replied in this thread.
Your a big telly owning amd hating fool who should stick to using a pda or pad imho, efficiency wtf and as for this last reply he was removing a heatsink harshly .

You dropped ya soddin chip er your worse.

And what the feck any of this has to do with the Op is beyond me.
Drunken Englishmen, they all sound so cute. :D
#25
Blue-Knight
by: Aquinus
No way. Over-tightening a cooler alone can cause LGA pins to bend.
My CPU cooler will not allow over-tightening (not that it could bend the pins anyway), and it was quite cheap.

I think extreme violence would be required for that to happen with my cooler. You just need to follow simple rules and it'll be fine.