
AMD Readies 16-core Processors with Full Uncore

Allow me to make the performance arguments, so this thread doesn't become a pissing contest.

1) AMD chips have a higher TDP than Intel, so they must be less efficient.
No. AMD and Intel measure TDP differently. Between that and the completely different architectures, efficiency cases can be made for both sides. There is no clear winner here.
2) Intel chips don't clock as high as AMD ones, so AMD makes better chips.
This one is generally true. If you're looking for bragging rights about the highest clock, then AMD wins. The reality is that both manufacturers' chips take huge amounts of power to do this (see the quick power math after this list). You don't run a CPU at peak frequencies constantly unless you want a huge bill and a rapidly deteriorating chip. For everyday use, either manufacturer produces a relatively solid-performing chip.
3) Intel and AMD don't measure cores the same.
Absolutely. Intel has traditional cores, while AMD decided to share a component among the cores. A four-core Intel chip doesn't match a four-core AMD chip, a two-core chip with Hyper-Threading doesn't match a four-core AMD chip, and none of this matters. This is not a move for the consumer CPU market. In that market, only a handful of programs use more than a couple of cores. People using more cores are doing server-related work, crunching, or running encoding software.
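On the "huge amounts of power" point in argument 2: dynamic CPU power scales roughly with frequency times voltage squared, so clock bumps that need extra voltage get expensive fast. A minimal sketch with made-up round numbers (not measurements from any particular chip):

Code:
# Dynamic CPU power scales roughly as P ~ C * V^2 * f.
# Example figures below are assumed, not measured.
f0, v0 = 3.5, 1.20  # baseline clock (GHz) and voltage (V)
f1, v1 = 4.5, 1.40  # overclocked clock and voltage

power_ratio = (f1 / f0) * (v1 / v0) ** 2
print(f"{(f1 / f0 - 1) * 100:.0f}% more clock costs "
      f"{(power_ratio - 1) * 100:.0f}% more power")
# -> 29% more clock costs 75% more power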


Now that the silly arguments have been made, can we get back on topic? AMD looks to be aiming for the server market without any bashfulness. Assuming this is the case, it seems like they are making a large step back into competing with Intel. This bodes well for more reasonably priced servers, but more importantly it could be parlayed into something interesting on the desktop CPU front. Anyone care to comment on that, rather than on how much they think the current parts are either awesome or terrible?
 
AMD single-thread is good enough? Yeah, really? How many games at this point in time, for example, use more than 4 threads? I don't think too many games will span much past 4 cores at best, which puts AMD behind a bit. On top of that, they use 50% more power than the competitor's CPU.
As the PS4 and Xbox One are based on 8-core CPUs, there's the possibility that in the near future games will be using more than 4 cores.

Heck, even a lot of programs don't really use much more than 2.

If I open all the programs I use every day, I get 100% CPU on my FX-8320; that includes a lot of Excel, encoding audio and video, burning, browsing (50 tabs or so), and some calculus programs for other stuff. Sometimes I even do some gaming on this thing while doing all of that in the background.

Yeah, an AMD CPU looks good because of the cheaper initial cost, but over a year or two that cost evens out when it adds up in the electric bill.

Disagree. An electric bill is composed of several electrical appliances. An oven alone may draw 3000 watts unless you use propane, an air conditioner peaks at around 1200 watts, and then you have a microwave oven, all the light bulbs, and the water heater tank; 90% of the bill will be appliances. I don't believe the choice of an AMD CPU over Intel will really change the game on the bill enough to make it a useless choice after just 2 years... If what you need is to browse the internet and visit Facebook, get an AMD C-60, it's just 8 watts, and you'll be choosing correctly. If you want to save the planet and our wallets, we should stop gaming altogether, but choosing an Intel over an AMD will not "save our wallets" in the long run; the difference in power consumption and efficiency is not the catastrophe you described.
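To put rough numbers behind that, here's a back-of-the-envelope sketch assuming a 50 W average draw difference, 4 hours of load per day, and $0.15/kWh (all made-up round figures; plug in your own):

Code:
# Yearly cost of a CPU power-draw difference.
# All figures assumed: 50 W delta, 4 h/day at load, $0.15/kWh.
watt_delta = 50
hours_per_day = 4
price_per_kwh = 0.15

kwh_per_year = watt_delta * hours_per_day * 365 / 1000
print(f"{kwh_per_year:.0f} kWh/year -> "
      f"${kwh_per_year * price_per_kwh:.2f}/year")
# -> 73 kWh/year -> $10.95/year

About eleven dollars a year under those assumptions; real money, but hardly enough to "even out" the purchase price difference in a year or two.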
 
As the PS4 and Xbox One are based on 8-core CPUs, there's the possibility that in the near future games will be using more than 4 cores.
I don't remember exactly what they said about Mantle, but it seemed it can scale beyond 8 cores as well, independently of how the game engine is coded. And it'd be about time; it might be tricky, but it makes a lot more sense to make the game engine/API future-proof so as to take advantage of the tech of tomorrow as well, doesn't it? Anyway, here's a bit of a mind-twister... imagine somewhere down the HSA line having games and their APIs doing draw calls off of the APU itself, rather than just the x86 cores... GPUception... O_O
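Nobody outside AMD has published exactly how Mantle spreads submission work, so here's only the general shape of the idea: a toy Python sketch of per-thread command-list building (every name in it is invented for illustration; none of them are Mantle or any real graphics API calls):

Code:
import threading

# Toy sketch: each thread builds its own command list; the lists
# are merged at the end and "submitted" in one go.
def build_command_list(objects):
    return [("draw", obj) for obj in objects]

def worker(chunk, results, i):
    results[i] = build_command_list(chunk)

scene = [f"mesh_{n}" for n in range(32)]
num_threads = 8  # e.g. one per core on an 8-core console CPU
chunks = [scene[i::num_threads] for i in range(num_threads)]
results = [None] * num_threads

threads = [threading.Thread(target=worker, args=(chunks[i], results, i))
           for i in range(num_threads)]
for t in threads:
    t.start()
for t in threads:
    t.join()

command_buffer = [cmd for lst in results for cmd in lst]
print(len(command_buffer), "draw calls queued")  # -> 32

The point is simply that draw-call generation parallelizes naturally once the API stops forcing a single submission thread.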
 
1) AMD chips have a higher TDP than Intel, so they must be less efficient. No. AMD and Intel measure TDP differently. Between that and the completely different architectures, efficiency cases can be made for both sides. There is no clear winner here.

Actually, there is a clear winner here and it's Intel. AMD CPU efficiency comes close to Intel's only for the kinds of tasks that AMD's architecture favors. Meaning, you would need heavily multi-threaded code with no floating-point instructions at all (only integer SIMD and SISD instructions) to have somewhat comparable efficiencies.
 
Actually, there is a clear winner here and it's Intel. AMD CPU efficiency comes close to Intel's only for the kinds of tasks that AMD's architecture favors. Meaning, you would need heavily multi-threaded code with no floating-point instructions at all (only integer SIMD and SISD instructions) to have somewhat comparable efficiencies.
Still not clear enough to care. Everybody is talking as if there were a massacre of a difference, and it's not that deep.
 
Everybody is talking as if there were a massacre of a difference, and it's not that deep.
For me, even 1 watt is enough to choose Intel instead of AMD. Why? Energy costs a lot where I live; a few watts less is better than a few watts more on my energy bill.

The less the better, no matter if it's 1 or 1/x. Just my stupid opinion.
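For the curious, here is what a single watt running around the clock actually amounts to (using an assumed $0.20/kWh; substitute your local rate):

Code:
# One watt, 24/7, for a year (assumed $0.20/kWh).
kwh = 1 * 24 * 365 / 1000
print(f"{kwh} kWh/year, ${kwh * 0.20:.2f}/year")
# -> 8.76 kWh/year, $1.75/year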
 
For me, even 1 watt is enough to choose Intel instead of AMD. Why? Energy costs a lot where I live; a few watts less is better than a few watts more on my energy bill.

The less the better, no matter if it's 1 or 1/x. Just my stupid opinion.

I don't find anything stupid in your opinion; it's just your opinion. However, take into consideration that if that is your point for choosing one over the other, then I assume you do the same for the rest of your home appliances: an LED TV over an LCD, as an example to begin with. If you only care about the CPU alone and not the rest of your home, then your opinion wouldn't be honest enough. But I don't think it's stupid.
 
Bring on the hexadeca-core (16-core) for the desktop
 
For me, even 1 watt is enough to choose Intel instead of AMD. Why? Energy costs a lot where I live; a few watts less is better than a few watts more on my energy bill.

The less the better, no matter if it's 1 or 1/x. Just my stupid opinion.

You must live in Australia then?
 
For me, even 1 watt is enough to choose Intel instead of AMD. Why? Energy costs a lot where I live; a few watts less is better than a few watts more on my energy bill.

The less the better, no matter if it's 1 or 1/x. Just my stupid opinion.

I take it you removed all the LEDs from everything you have. You should get night vision goggles so you don't need lights at all. Imagine how much you would save!
 
Meanwhile, Intel is busy making things like low-power SoCs with 8 cores. When it comes to servers, power efficiency can mean a lot if you're running a cluster, a lot of servers, or if your resources are limited. I guess the point I'm trying to make is that these are changes AMD should have started making a long time ago. What boggles me is why they didn't do it sooner; APUs have been designed this way since Llano.
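To put a rough number on the cluster point, a quick sketch with assumed figures (500 nodes, 30 W saved per node, $0.10/kWh, running 24/7):

Code:
# Fleet-level savings from a per-node power reduction.
# All figures assumed: 500 nodes, 30 W each, $0.10/kWh, 24/7.
nodes, watts_saved, price_per_kwh = 500, 30, 0.10

kwh_per_year = nodes * watts_saved * 24 * 365 / 1000
print(f"{kwh_per_year:,.0f} kWh/year -> "
      f"${kwh_per_year * price_per_kwh:,.0f}/year")
# -> 131,400 kWh/year -> $13,140/year

And that's before cooling overhead, which scales with the same watts.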
 
An LED TV over an LCD, as an example to begin with. If you only care about the CPU alone and not the rest of your home, then your opinion wouldn't be honest enough.
Ah, no. I can watch TV on the computer and eliminate the TV. But I am still using a tube TV.

Why? LED and LCD TVs cost a lot here; I can't invest in one at the moment, but I would buy one if I could, no doubt. Sometimes saving watts costs a lot too. It's not the same with computers, because I can build a low-power mini-ITX machine for much less instead of more. But I prefer micro-ATX, as mini-ITX has some disadvantages...

You must live in Australia then?
No.

You should get night vision goggles so you don't need lights at all. Imagine how much you would save!
I do not live alone. Other people prefer light... What can I do?!
 
Meanwhile, Intel is busy making things like low-power SoCs with 8 cores. When it comes to servers, power efficiency can mean a lot if you're running a cluster, a lot of servers, or if your resources are limited. I guess the point I'm trying to make is that these are changes AMD should have started making a long time ago. What boggles me is why they didn't do it sooner; APUs have been designed this way since Llano.

AMD did a great job with its Kabini lineup; the A4-5000 is a good contender: 4 cores, a Radeon GPU, 15 watts.
But that Atom (8 cores!) looks nice on paper. I would like to try it; it seems a good one, but it's still at 20 watts.

I found an ECS KBN motherboard on Amazon with an A6-5200: quad core, Radeon GPU, 25 watts.

I do not live alone. Other people prefer light... What can I do?!

Get an Intel...!! :lovetpu:
 
I do not live alone. Other people prefer light... What can I do?!
Change all your lightbulbs 'n sh*t, including those "energy saving" ones, to LED lights and enjoy >10x lower energy consumption for lighting. Plus, these even cost less than those "power saving" ones, too. Thus, there's no excuse not to get them and start enjoying comically low power consumption.

If you don't already do this, I'd say your "Intel because every single watt counts" is hypocrisy.
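Quick math on the bulbs, assuming ten 60 W incandescents swapped for 6 W LEDs running 4 hours a day (round numbers, not a measured household):

Code:
# Incandescent vs LED, per the >10x claim.
# Assumed: ten 60 W bulbs replaced by 6 W LEDs, 4 h/day.
bulbs, old_w, new_w, hours = 10, 60, 6, 4

saved_kwh = bulbs * (old_w - new_w) * hours * 365 / 1000
print(f"{saved_kwh:.0f} kWh/year saved")  # -> 788 kWh/year

That dwarfs any Intel-vs-AMD CPU delta by an order of magnitude.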

Back on topic:
8-core Atom? Please do enlighten me, because I kinda remember that Atoms were friggin' slow compared to AMD's low-power equivalents.
 
Change all your lightbulbs 'n sh*t, including those "energy saving" ones, to LED lights and enjoy >10x lower energy consumption for lighting. Plus, these even cost less than those "power saving" ones, too. Thus, there's no excuse not to get them and start enjoying comically low power consumption.

If you don't already do this, I'd say your "Intel because every single watt counts" is hypocrisy.

Back on topic:
8-core Atom? Please do enlighten me, because I kinda remember that Atoms were friggin' slow compared to AMD's low-power equivalents.
I agree, but Atoms used to also be 32-bit dual cores with only a single memory channel. This thing has two memory channels running DDR3-1600, and 8 cores at 2.4 GHz with a 2.6 GHz boost, all wrapped into a 20-watt-TDP SoC. It also has 16 PCI-E lanes and support for 6 SATA ports and 4 NICs off the CPU. Clearly it's aimed at being a server product.

Someone has to tell me why this doesn't look awesome as a cheap server.
ASRock C2750D4I Mini ITX Server Motherboard FCBGA1283 DDR3 1600/1333
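If you did grab one, a quick sanity check from Linux would look something like this (nothing Avoton-specific, just standard /proc and sysfs reads; the cpufreq files only exist if the kernel driver is loaded):

Code:
# Quick core/frequency sanity check on a Linux box.
import os

print("logical CPUs:", os.cpu_count())  # expect 8 on a C2750

for cpu in range(os.cpu_count()):
    path = f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/scaling_cur_freq"
    if os.path.exists(path):  # only present with cpufreq support
        with open(path) as f:
            print(f"cpu{cpu}: {int(f.read()) // 1000} MHz")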
 
AMD did a great job with its Kabini lineup; the A4-5000 is a good contender: 4 cores, a Radeon GPU, 15 watts.
I do not like processors with integrated graphics, so I will never use their integrated graphics, only in "emergency" cases.

And I prefer low-end NVIDIA chips, as I never had any problems with their drivers and I am sure they will work for my needs, so why change?!

"Intel because every single watt counts"
Not only every single watt, but also every single instruction per second. Single-core performance is crucial, as I like to disable all the unnecessary cores to use even less power.

And maybe underclock (the opposite of overclocking) in extreme cases. I did that already! :D
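On Linux, parking a core really is a one-line sysfs write (needs root, and cpu0 typically cannot be taken offline on x86); a minimal sketch:

Code:
# Take a core offline on Linux via sysfs (run as root).
def set_core_online(core: int, online: bool) -> None:
    with open(f"/sys/devices/system/cpu/cpu{core}/online", "w") as f:
        f.write("1" if online else "0")

set_core_online(1, False)  # park core 1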

EDIT:
And to mention another DECISIVE factor for me to choose Intel instead of AMD: CPU pins. I let my E2200 fall onto the ground in December 2013 and I was happy it was not an AMD. Otherwise, it would have bent many pins and given me a lot of headache and a non-working processor.
 
Change all your lightbulbs 'n sh*t, including those "energy saving" ones, to LED lights and enjoy >10x lower energy consumption for lighting. Plus, these even cost less than those "power saving" ones, too. Thus, there's no excuse not to get them and start enjoying comically low power consumption.

If you don't already do this, I'd say your "Intel because every single watt counts" is hypocrisy.

Back on topic:
8-core Atom? Please do enlighten me, because I kinda remember that Atoms were friggin' slow compared to AMD's low-power equivalents.

The newer Atoms are 64-bit and out-of-order now. I don't know how they stack up against Jaguar, but they're not the same Atoms we knew (and loathed).
 
Australia is officially the hottest and the most expensive place to live on the planet at the moment.
Maybe because it is isolated from everything. It makes sense.
 
And to mention another DECISIVE factor for me to choose Intel instead of AMD: CPU pins. I let my E2200 fall onto the ground in December 2013 and I was happy it was not an AMD. Otherwise, it would have bent many pins and given me a lot of headache and a non-working processor.

That's nothing. Back when I had a Phenom II 940, the stock cooler was stuck on so badly (the thermal paste had dried out) that I accidentally ripped the CPU out of the socket while removing it, bending a good 100-ish pins. I straightened them out, and while getting it back into the socket took a little work, it eventually went in just fine and ran without an issue.

Ever bent a pin on a motherboard with LGA? It's a bitch to fix, and more often than not you can't fix it. While I like LGA in general because of how the CPU is secured, I'm less worried about bending a pin on a CPU than on an LGA motherboard.
 
No, that would require no care at all. Bending CPU pins is infinitely easier.
For what purpose have you even replied in this thread.
Your a big telly owning amd hating fool who should stick to using a pda or pad imho, efficiency wtf and as for this last reply he was removing a heatsink harshly .

You dropped ya soddin chip er your worse.

And what the feck any of this has to do with the Op is beyond me.
 
No, that would require no care at all. Bending CPU pins is infinitely easier.

No way. Over-tightening a cooler alone can cause LGA pins to bend. It's part of the reason you hear people say to re-seat their CPU when memory or PCI-E is acting funky, and to not over-tighten it.
 
I do not like processors with integrated graphics, so I will never use their integrated graphics, only in "emergency" cases.

And I prefer low-end NVIDIA chips, as I never had any problems with their drivers and I am sure they will work for my needs, so why change?!

Well, a high-end APU's GPU would be quicker than a low-end Nvidia card. It would be quicker than a low-end Intel chip with a low-end Nvidia chip as well, with the added benefit of Hybrid CrossFire if you find a GPU that works with it. I mean, if you're happy with your current setup you shouldn't change it just because (penis measuring is a terrible thing), but it is a viable alternative should you ever need to upgrade.

For what purpose have you even replied in this thread.
Your a big telly owning amd hating fool who should stick to using a pda or pad imho, efficiency wtf and as for this last reply he was removing a heatsink harshly .

You dropped ya soddin chip er your worse.

And what the feck any of this has to do with the Op is beyond me.

Drunken Englishmen, they all sound so cute. :D
 