Friday, February 12th 2016

AMD "Zen" Processors to Feature SMT, Support up to 8 DDR4 Memory Channels

CERN engineer Liviu Valsan, in a recent presentation on datacenter hardware trends, showed a curious-looking slide that highlights some of the key features of AMD's upcoming "Zen" CPU architecture. We know from a recent story that the architecture is scalable up to 32 cores per socket, and that AMD is building these chips on the 14 nanometer FinFET process.

Among the other key features detailed on the slide is simultaneous multi-threading (SMT). Implemented for over a decade by Intel as Hyper-Threading Technology, SMT exposes each physical core to software as two logical CPUs, letting the operating system make better use of the core's hardware resources. The slide also mentions support for up to eight DDR4 memory channels, which could mean that AMD is readying a product to compete with Intel's Xeon E7 series. Lastly, the slide mentions that "Zen" could deliver IPC (instructions per clock) gains of 40 percent over the current architecture.
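To make the SMT point concrete, here is a minimal sketch (Python, using the third-party psutil package) of how software sees the logical CPUs an SMT core exposes; the counts printed depend entirely on the machine it runs on:

```python
# Minimal sketch of how SMT appears to software: the OS enumerates two
# logical CPUs per physical core, and code sizing a thread pool can
# query both counts. Requires the third-party psutil package.
import psutil

logical = psutil.cpu_count(logical=True)    # logical CPUs (hardware threads)
physical = psutil.cpu_count(logical=False)  # physical cores

print(f"{physical} physical cores exposed as {logical} logical CPUs")
if logical and physical and logical > physical:
    print(f"SMT is active: {logical // physical} threads per core")
```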
Source: HotHardware

130 Comments on AMD "Zen" Processors to Feature SMT, Support up to 8 DDR4 Memory Channels

#51
cracklez
No need to clash fists here; we can all keep it civil, can't we?

The fact of the matter is that the companies AMD competes with have a bigger budget and more resources at their disposal. This means that most of the time they will have the upper hand in price, performance, marketing, business deals, partnerships, etc.
This does not mean that AMD can't have their niche and survive, if not flourish, in their segments. It's just that they have been in the red for so long that at this point they really are struggling to survive. If Jim Keller has worked a miracle with the design of his Zen, it will be a success on that front. If it fails, it will be 100% AMD's fault. Once the engine gets revved up and these chips start landing on shelves, they need to do some HEAVY marketing to appeal to the masses, otherwise I'm afraid it won't look good on that balance sheet of theirs.
#52
cdawall
where the hell are my stars
Kurt MaverickAnd here I thought I was talking about prices... of course, people take whatever context they damn want to suit their needs.

BTW, AMD doesn't offer full DX12 (12_1) support, Nvidia has offered 10-bit support since the GeForce 200 Series (nvidia.custhelp.com/app/answ...-bit-per-color-support-on-nvidia-geforce-gpus), and AFAIK Nvidia has beaten AMD in almost every DX11 game out there. I don't know (nor care) what DX12 will mean for the current gen, but my argument is that transitional generations suck, and that the real DX12 battle begins this year.

So that only leaves you with the HBM argument. Woohoo! Enjoy it for the few more months that remain, fanboy.
In your link it states that you need a Quadro card to use 10-bit color. My exact wording was "consumer card," and I used it for a reason.

If you want to be really specific about performance, the AMD dual-GPU card from two generations ago is still the fastest single-"card" solution. If you look past 1080p, the Fury series beats everything but the Titan, and does beat the Titan in some games. Again, it's exactly what I already posted.

And again, if you look just a little closer into DX12, you will notice some things.

I am no fanboy; I have run both sets of cards and buy based on value/feature set. My last set of cards was three water-cooled GTX 470s. Those are to this day my favorite cards, followed by my original Ti 4200, which currently holds some of the highest clock speed records ever recorded.
#53
Kurt Maverick
cdawallIn your link it states that you need a Quadro card to use 10-bit color. My exact wording was "consumer card," and I used it for a reason.

If you want to be really specific about performance, the AMD dual-GPU card from two generations ago is still the fastest single-"card" solution. If you look past 1080p, the Fury series beats everything but the Titan, and does beat the Titan in some games. Again, it's exactly what I already posted.

And again, if you look just a little closer into DX12, you will notice some things.

I am no fanboy; I have run both sets of cards and buy based on value/feature set. My last set of cards was three water-cooled GTX 470s. Those are to this day my favorite cards, followed by my original Ti 4200, which currently holds some of the highest clock speed records ever recorded.
And how many "non-professional" 10-bit monitors do you currently see on the market, huh? And how much would they be overshadowed by future HDR monitors anyway?

Whatever. Like I said, it's not my point, and I'm absolutely tired of fanboyisms. Don't expect me to reply to any more of them. The same goes for anyone else.
#54
newtekie1
Semi-Retired Folder
cdawallEh, if you look beyond the Titan, no they don't. AMD offers 10-bit color in consumer cards while Nvidia offers 8, and offers full DX12 support on the GPU; the Fury cards have HBM, Nvidia has GDDR5; the Nano offers better performance per watt; GCN offers better performance at resolutions higher than 1080p, etc. Actually, once you get past the hype, AMD cards should be wiping the floor with Nvidia in sales.
10-bit color has no effect on the consumer market. 10-bit panels are expensive, though they are getting cheaper, but you're still looking at $350 for a GW2765HT, and that is an exception to the rule; the next cheapest panel after that is the $500 ASUS PB287Q. And 10-bit isn't noticeably better than 8-bit to the average consumer (that is why almost no 4K TVs support it). 10-bit is a professional feature that only professionals will notice if lacking. And I know someone is going to post those BS pictures with the one on the left being 256-color and the one on the right being 8-bit, and claim "look at the difference 10-bit makes!" Don't waste your time, those pictures all grossly exaggerate the actual difference. On top of that, AMD cards lack HDMI 2.0, a far more consumer-friendly feature than 10-bit. There are a lot of people that like to connect their computer to their TV. I'm sure there are a lot of consumers, a lot more than those that care about 10-bit support, that can only afford to buy one 4K device, and that is their TV, but also want to connect their high-end gaming computer to it to play games in 4K. Not with AMD cards you aren't, because most 4K TVs don't have DisplayPort. So it's an extra $30 for an adapter if you want to do that.

DX12 right now doesn't matter, and probably isn't going to matter for this generation of cards. Tomb Raider is the first AAA game we are likely to see utilize it, maybe, and it is likely the nVidia GPUs will support every feature necessary.

HBM shouldn't be a marketing bullet when HBM cards are still losing in performance to GDDR5 cards. So does it really make a difference? Are people with the Fury cards going "Hey look, my card is so much better because it has HBM...even though it still performs worse...but it is so much better because HBM!"? And what happens when the next generation of nVidia cards come out with HBM2 and AMD is still using HBM1? Are you going to say the nVidia card is now better because it is using the better form of HBM, the next generation of HBM? Somehow I doubt it. HBM isn't a value added feature.

The Nano doesn't offer better performance per watt. It is worse than the 980 and ties the 980Ti at 4K, it is worse than the 980Ti and 980 at 1440p, and is worse than the 970, 980, and 980Ti at 1080p.

GCN only manages to close the gap at higher resolutions, it doesn't offer better performance. The Fury X is the best GCN card available, and it merely ties with the 980Ti at 4K, and the pre-overclocked 980Ti's are actually crushing the Fury X with 15%+ better performance at 4k. And since the Fury X's overclocking is lackluster, to say the least, there is no making that difference up by overclocking the Fury X.
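For scale on the 8-bit versus 10-bit debate above, the raw numbers are easy to compute; whether the extra shades are visible to an average consumer is the actual point of contention. A quick sketch:

```python
# Shades per channel and total displayable colors for 8-bit vs. 10-bit
# panels. The numbers are large either way; visibility to the average
# consumer is what the thread is arguing about.
for bits in (8, 10):
    shades = 2 ** bits       # levels per color channel
    colors = shades ** 3     # R x G x B combinations
    print(f"{bits}-bit: {shades:>4} shades/channel, {colors:>13,} total colors")
# 8-bit:  256 shades/channel,    16,777,216 total colors
# 10-bit: 1024 shades/channel, 1,073,741,824 total colors
```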
#55
Kurt Maverick
newtekie1DX12 right now doesn't matter, and probably isn't going to matter for this generation of cards. Tomb Raider is the first AAA game we are likely to see utilize it, maybe, and it is likely the nVidia GPUs will support every feature necessary.
According to my data, the first full DX12 game (no betas or any other crap) will probably be Hitman (March 11 release date), with Quantum Break being the first DX12-only game (April 5). That is unless Tomb Raider's DX12 patch is released sooner than that...
#56
xfia
Kurt MaverickAnd here I thought I was talking about prices... of course, people take whatever context they damn want to suit their needs.

BTW, AMD doesn't offer full DX12 (12_1) support, Nvidia has offered 10-bit support since the GeForce 200 Series (nvidia.custhelp.com/app/answers/detail/a_id/3011/~/10-bit-per-color-support-on-nvidia-geforce-gpus), and AFAIK Nvidia has beaten AMD in almost every DX11 game out there. I don't know (nor care) what DX12 will mean for the current gen, but my argument is that transitional generations suck, and that the real DX12 battle begins this year.

So that only leaves you with the HBM argument. Woohoo! Enjoy it for the few more months that remain, fanboy.



Maybe because, like I said, there's a market for them? How dense can you be? The fact that I said that APUs have no use for gamers or other resource-hungry apps doesn't mean that they wouldn't be sold at all.

And the absurd argument of "OMG INTEL/AMD INVENTED IT FIRST, SO IT'S BETTER/THEY'RE MORALLY BETTER THAN THE OTHER!" is just the last resort of fanboys to justify their purchasing choices. Fortunately I'm smart enough to make my choices based on raw performance and price/performance ratios, and not out of "loyalty" to a CORPORATION I don't even work for.

Seriously, fanboys of any kind are a nuisance, but I swear AMD fanboys are a pest. You can't say the slightest thing against 'their' brand, or in favor of the competition, without them crying out in opposition.
#57
newtekie1
Semi-Retired Folder
Kurt MaverickAccording to my data, the first full DX12 game (no betas or any other crap) will probably be Hitman (March 11 release date), with Quantum Break being the first DX12-only game (April 5). That is unless Tomb Raider's DX12 patch is released sooner than that...
And it will really be interesting to see how much of a difference it will actually make. Obviously Microsoft is doing the same thing with Quantum Break as they did with Halo, making it DX12-only to try to force people onto Windows 10 (maybe I'll finally get around to formatting my main computer and installing it).

And when you look at the Steam survey, you've got almost 60% of users running Windows 7 or 8 and only 35% running Windows 10. So is DX12 going to be a game changer this year? (pun not intended) No, I don't think so.
#58
ZeDestructor
xfia
Eventually, even the people who try to avoid the arguments snap from the constant shitty, uninformed circlejerk. I did so myself several threads ago when people were whining about how Intel wasn't innovating on the CPU side (here, and later on in the same thread, here). Here, it looks like @Kurt Maverick had his fill of the constant "AMD can do no bad and has never done any anti-consumer behaviour" circlejerk (spoiler: they're just as bad as Intel and nVidia when they're in the lead; they're just not in the lead more often than not).
#59
rruff
newtekie1GCN only manages to close the gap at higher resolutions, it doesn't offer better performance. The Fury X is the best GCN card available, and it merely ties with the 980Ti at 4K, and the pre-overclocked 980Ti's are actually crushing the Fury X with 15%+ better performance at 4k. And since the Fury X's overclocking is lackluster, to say the least, there is no making that difference up by overclocking the Fury X.
I suspect that will be as good as it gets for Zen also. Almost competitive, late to market, only makes sense if sold cheap. Hard to make money that way.

AMD has dug such a deep hole in the last 10 years, it would take a miracle for them to crawl out of it. Even if they manage to put out something that *beats* Intel and Nvidia on price/performance, Intel and Nvidia can just drop the price on competing products to retain market share, and keep AMD where they are. That would be nice for consumers while it lasts, but it probably isn't going to make AMD profitable. They'd need to keep that going for years.

The only way out that I see is AMD hitting home runs for the next few years to demonstrate that the company has potential, and then merging with a company that has cash to invest. Or is that even possible with the licenses?
#60
rtwjunkie
PC Gaming Enthusiast
Kurt MaverickYeah, because APUs are soooo useful outside of budget and office builds...
It doesn't matter. That is 98% of the CPU/APU business...not enthusiasts like us. So yeah, AMD's APUs are extremely useful.
#61
Kurt Maverick
newtekie1And it will really be interesting to see how much of a difference it will actually make. Obviously Microsoft is doing the same thing with Quantum Break as they did with Halo, making it DX12-only to try to force people onto Windows 10 (maybe I'll finally get around to formatting my main computer and installing it).

And when you look at the Steam survey, you've got almost 60% of users running Windows 7 or 8 and only 35% running Windows 10. So is DX12 going to be a game changer this year? (pun not intended) No, I don't think so.
It's still the future. I think that if there has ever been a chance of a new API being a real game-changer, it's DX12 / Vulkan.
rtwjunkieIt doesn't matter. That is 98% of the CPU/APU business...not enthusiasts like us. So yeah, AMD's APUs are extremely useful.
And 99% of the people on the Internet think that a given percentage is false. Especially when it suits your argument so conveniently.

Anyway, I never denied that there's a market for them.
#62
rtwjunkie
PC Gaming Enthusiast
Kurt MaverickAnd 99% of the people on the Internet think that a given percentage is false. Especially when it suits your argument so conveniently.
I don't have an argument. I merely point out that the enthusiasts are an EXTREME minority. The majority of CPUs and APUs sold go to business, followed by regular users who don't do anything special.

I'm sorry if you're disappointed that your desires and mine don't count for anything with either AMD or Intel.
#63
xfia
rtwjunkieI don't have an argument. I merely point out that the enthusiasts are an EXTREME minority. The majority of CPUs and APUs sold go to business, followed by regular users who don't do anything special.

I'm sorry if you're disappointed that your desires and mine don't count for anything with either AMD or Intel.
Gaming is a lot of money though, and gaming laptops sell well. A lot of people actually know that you want to see an AMD or Nvidia logo somewhere. That's about it though, and MSI and Asus have been making other laptop OEMs put up a better fight to stay in business.
#64
ZeDestructor
xfiaGaming is a lot of money though, and gaming laptops sell well. A lot of people actually know that you want to see an AMD or Nvidia logo somewhere. That's about it though, and MSI and Asus have been making other laptop OEMs put up a better fight to stay in business.
Gaming is big money for GPU manufacturers, because on the consumer-facing side of things, that's the only thing still needing serious power. The rest of the time, an iGPU is just fine. For Intel, on the other hand, gaming and enthusiast sales are but a tiny bit of market share. It's mostly the same story for AMD's CPU side, just slightly less skewed, because AMD has no mobile CPUs worth talking about (when was the last time you saw a major laptop vendor ship AMD?).
#65
xfia
ZeDestructorGaming is big money for GPU manufacturers, because on the consumer-facing side of things, that's the only thing still needing serious power. The rest of the time, an iGPU is just fine. For Intel, on the other hand, gaming and enthusiast sales are but a tiny bit of market share. It's mostly the same story for AMD's CPU side, just slightly less skewed, because AMD has no mobile CPUs worth talking about (when was the last time you saw a major laptop vendor ship AMD?).
I think I just saw that maybe Dell or HP is shipping something with a 380M for gaming, but other than that... yeah, it wasn't a good situation before Maxwell, and then it only got worse. But I do recall some pretty nice gaming laptops being made with the 290M, and the 5K Mac uses one as well.
#66
cdawall
where the hell are my stars
newtekie110-bit color has no effect on the consumer market. 10-bit panels are expensive, though they are getting cheaper, but you're still looking at $350 for a GW2765HT, and that is an exception to the rule; the next cheapest panel after that is the $500 ASUS PB287Q. And 10-bit isn't noticeably better than 8-bit to the average consumer (that is why almost no 4K TVs support it). 10-bit is a professional feature that only professionals will notice if lacking. And I know someone is going to post those BS pictures with the one on the left being 256-color and the one on the right being 8-bit, and claim "look at the difference 10-bit makes!" Don't waste your time, those pictures all grossly exaggerate the actual difference. On top of that, AMD cards lack HDMI 2.0, a far more consumer-friendly feature than 10-bit. There are a lot of people that like to connect their computer to their TV. I'm sure there are a lot of consumers, a lot more than those that care about 10-bit support, that can only afford to buy one 4K device, and that is their TV, but also want to connect their high-end gaming computer to it to play games in 4K. Not with AMD cards you aren't, because most 4K TVs don't have DisplayPort. So it's an extra $30 for an adapter if you want to do that.

DX12 right now doesn't matter, and probably isn't going to matter for this generation of cards. Tomb Raider is the first AAA game we are likely to see utilize it, maybe, and it is likely the nVidia GPUs will support every feature necessary.

HBM shouldn't be a marketing bullet when HBM cards are still losing in performance to GDDR5 cards. So does it really make a difference? Are people with the Fury cards going "Hey look, my card is so much better because it has HBM...even though it still performs worse...but it is so much better because HBM!"? And what happens when the next generation of nVidia cards come out with HBM2 and AMD is still using HBM1? Are you going to say the nVidia card is now better because it is using the better form of HBM, the next generation of HBM? Somehow I doubt it. HBM isn't a value added feature.

The Nano doesn't offer better performance per watt. It is worse than the 980 and ties the 980Ti at 4K, it is worse than the 980Ti and 980 at 1440p, and is worse than the 970, 980, and 980Ti at 1080p.

GCN only manages to close the gap at higher resolutions, it doesn't offer better performance. The Fury X is the best GCN card available, and it merely ties with the 980Ti at 4K, and the pre-overclocked 980Ti's are actually crushing the Fury X with 15%+ better performance at 4k. And since the Fury X's overclocking is lackluster, to say the least, there is no making that difference up by overclocking the Fury X.
A lot of this is very true. My only arguments would be that those adapters aren't $30 for me, and that I wouldn't buy any Fury other than the Nano. Overclock it and you have a cheaper Fury X.

I also haven't really seen any benchmark showing GDDR5 beating HBM?
#67
Pumper
If AMD's past PR and real-world performance figures are any indication, the claimed 40% improvement will just be a best-case figure in one out of 50 different benchmarks, while the average performance increase will be 20% tops.
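Pumper's point about headline figures can be illustrated with a toy calculation (the numbers below are hypothetical, not measurements): a single best-case benchmark can show +40% while the average across a suite lands much lower.

```python
# Toy example with hypothetical per-benchmark speedups: one +40%
# outlier among otherwise modest gains. The geometric mean is the
# conventional way to average speedups across a benchmark suite.
from math import prod

speedups = [1.40, 1.22, 1.18, 1.15, 1.20, 1.17, 1.25, 1.19]  # hypothetical

geomean = prod(speedups) ** (1 / len(speedups))
print(f"best case: +{max(speedups) - 1:.0%}, geometric mean: +{geomean - 1:.0%}")
# best case: +40%, geometric mean: +22%
```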
#68
cdawall
where the hell are my stars
PumperIf AMD's past PR and real-world performance figures are any indication, the claimed 40% improvement will just be a best-case figure in one out of 50 different benchmarks, while the average performance increase will be 20% tops.
This is normally the case. I am hoping for once (and considering who designed the CPU) that this is not the case.
#69
newtekie1
Semi-Retired Folder
cdawallA lot of this is very true. My only arguments would be that those adapters aren't $30 for me, and that I wouldn't buy any Fury other than the Nano. Overclock it and you have a cheaper Fury X.

I also haven't really seen any benchmark showing GDDR5 beating HBM?
I haven't seen a DisplayPort to HDMI 2.0 adapter yet that was cheaper than $30.

Just look at the latest 980 Ti Matrix benchmark here on TPU. It beats the Fury X by 17% at 4K. Sure, HBM provides more memory bandwidth than GDDR5, but when the overall card is still slower, what's the point?
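For reference, the raw-bandwidth side of that comparison follows directly from the two cards' public memory specs; a quick sketch of the arithmetic:

```python
# Peak memory bandwidth = bus width (bits) x effective data rate per pin
# (Gbps), divided by 8 to get GB/s. Public specs: the Fury X uses HBM1
# on a 4096-bit bus at 1 Gbps/pin (500 MHz DDR); the 980 Ti uses GDDR5
# on a 384-bit bus at 7 Gbps/pin.
def peak_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8

print(f"Fury X (HBM1):  {peak_bandwidth_gbs(4096, 1.0):.0f} GB/s")  # 512 GB/s
print(f"980 Ti (GDDR5): {peak_bandwidth_gbs(384, 7.0):.0f} GB/s")   # 336 GB/s
```

The bandwidth gap is real; newtekie1's point is that it doesn't translate into overall wins.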
#70
xfia
newtekie1I haven't seen a DisplayPort to HDMI 2.0 adapter yet that was cheaper than $30.

Just look at the latest 980 Ti Matrix benchmark here on TPU. It beats the Fury X by 17% at 4K. Sure, HBM provides more memory bandwidth than GDDR5, but when the overall card is still slower, what's the point?
It's changing though, and Fiji gets better scaling with CrossFire. It's about more than the bandwidth that HBM provides; it's the low-latency architecture that comes with it. That benefits VR, among a long list of other things. Dual Fiji will be shown to blow away 980 Ti SLI in some situations.
About the article...
It looks like some serious architecture is backing up those shiny new 14 nm cores. Not just low latency but high bandwidth too.
Along with DX12, it's like turning a 4-lane highway into a 20-lane one and making the speed limit 100% faster. The async highway.
#71
newtekie1
Semi-Retired Folder
xfiaIt's changing though, and Fiji gets better scaling with CrossFire. It's about more than the bandwidth that HBM provides; it's the low-latency architecture that comes with it. That benefits VR, among a long list of other things. Dual Fiji will be shown to blow away 980 Ti SLI in some situations.
Whenever someone says "some situations" I immediately add "but those situations are the exception to the norm" in my head.
#72
xfia
newtekie1Whenever someone says "some situations" I immediately add "but those situations are the exception to the norm" in my head.
It does happen a lot when VRAM becomes the processing bottleneck, and that does happen at 4K and up.
#73
alucasa
Me is not gonna say anything until I see an ES chip review (or leak). Right now, we have nothing other than words.
#74
Jermelescu
For the sake of competition I hope Mr. Keller delivered magic.
#75
newtekie1
Semi-Retired Folder
Here is the thing with news about Zen: what we see isn't anything that is going to be available on the desktop market. "Up to" 32 cores? We'll never see that on the desktop. Eight DDR4 channels? We'll never see that on the desktop either. There are already 16-core Bulldozer processors, but we don't see them on the desktop market.
JermelescuFor the sake of competition I hope Mr. Keller delivered magic.
I don't think he had to deliver magic; he just had to do what he is good at. AMD doesn't have to top Intel, and it probably won't. If they can get something out that is competitive with a 115X i7, then they will be in a good position, and I think (or hope) Jim Keller is capable of that.

What I think we will see on the desktop market is:

Up to 8 Zen cores with SMT for 16 threads
Up to 4 DDR4 memory channels

But I don't think they are going to break the market up into the mainstream and HEDT like Intel does. Instead I think they will go with some kind of middle ground. So we'll likely see:

2-Core w/ SMT
4-Core w/out SMT
4-Core w/ SMT
8-Core w/out SMT
8-Core w/ SMT

I also think we'll see motherboards that look more like the standard ATX boards we are used to with 115X, with only 4 RAM slots, even if the boards support 4-channel DDR4. You just have to populate all 4 slots if you want 4-channel; if you only populate 2 slots, you get dual-channel (with a not-so-big performance hit, I'm guessing). Of course, I'm sure we'll see the big players release HEDT motherboards with 8 RAM slots too, like on HEDT 2011; the difference will be that they will still be using the same socket.

And I think that is the key for AMD: no matter what, they have to keep their desktop market all on the same socket. They can't try to break it up like Intel and AMD have been doing in the past. They tried to break it up with Bulldozer, with the HEDT market on AM3+ and the APU/mainstream desktop market on FM2/FM2+, and it didn't work. AMD has marketed on upgradability in the past; that is part of what made them a good choice. You would buy an AM2+ or even AM2 motherboard, and when AM3 processors came out you didn't have to replace your entire motherboard. When AM3+ came out, you could replace your motherboard and keep your AM3 processor. This allowed people to upgrade in steps instead of needing to replace the motherboard and processor all at once. You could buy a low-end Zen board, stick one of the cheap processors in it to start, and then, when you save up a little more, upgrade to the 8-core monster.
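On the memory-channel speculation above, peak DDR4 bandwidth scales linearly with channel count, which is what makes an eight-channel part a server story rather than a desktop one. A rough sketch, assuming DDR4-2400 (the speed grade is an assumption for illustration):

```python
# Rough peak DDR4 bandwidth: each channel is 64 bits (8 bytes) wide,
# so GB/s per channel = transfer rate (MT/s) x 8 bytes / 1000.
# DDR4-2400 is an assumed speed grade, chosen only for illustration.
def ddr4_peak_gbs(channels: int, mt_per_s: int = 2400) -> float:
    return channels * mt_per_s * 8 / 1000

for channels in (2, 4, 8):
    print(f"{channels}-channel DDR4-2400: {ddr4_peak_gbs(channels):.1f} GB/s")
# 2-channel: 38.4 GB/s, 4-channel: 76.8 GB/s, 8-channel: 153.6 GB/s
```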