
AMD "Zen" Processors to Feature SMT, Support up to 8 DDR4 Memory Channels

Eh, if you look beyond the Titan, no they don't. AMD offers 10-bit color in consumer cards while Nvidia offers 8-bit, and offers full DX12 on the GPU; the Fury cards have HBM while Nvidia has GDDR5; the Nano offers better performance per watt; GCN offers better performance at resolutions higher than 1080p, etc. Actually, once you get past the hype, AMD cards should be wiping the floor with Nvidia in sales.

And here I thought I was talking about prices... Of course, people take whatever context they damn want to suit their needs.

BTW, AMD doesn't offer full DX12 (12_1) support, Nvidia has offered 10-bit support since the GeForce 200 series (http://nvidia.custhelp.com/app/answ...-bit-per-color-support-on-nvidia-geforce-gpus), and AFAIK Nvidia has beaten AMD in almost every DX11 game out there. I don't know (nor care) what DX12 will mean for the current gen, but my argument about that is that transitional generations suck, and that the real DX12 battle begins this year.

So that only leaves you with the HBM argument. Woohoo! Enjoy it for the few more months that remain, fanboy.

Nothing. Intel didn't invent SMT or use it first lol. Intel used it purely for marketing early on (it wasn't worth a crap until the Core i series). And if you knew anything about Bulldozer, its FPU uses SMT (although this was basically a cost-cutting measure).
It didn't make fiscal sense for AMD to use it before. They've never had the budget to make the chips even more complex (they could barely get them out the door as it was). Zen has been in the works for a long time, and with the node shrinks they have more room to implement better features.
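
To make the SMT point concrete, here's a minimal sketch (assuming the third-party psutil package is installed) that compares logical and physical core counts; on an SMT-enabled chip the logical count is typically double the physical one.

```python
# Minimal sketch: detect whether SMT appears to be enabled.
# Assumes psutil is installed (pip install psutil).
import psutil

physical = psutil.cpu_count(logical=False)  # physical cores
logical = psutil.cpu_count(logical=True)    # hardware threads

print(f"Physical cores: {physical}, logical threads: {logical}")
if physical and logical and logical > physical:
    print(f"SMT appears enabled: {logical // physical} threads per core")
else:
    print("SMT appears disabled or unsupported")
```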

Keep on blabbing. If APUs are so worthless, then why did Intel copy them? Is your foot tasty?

Maybe because, like I said, there's a market for them? How dense can you be? The fact that I said that APUs are of no use to gamers or other resource-hungry apps doesn't mean that they wouldn't be sold at all.

And the absurd argument of "OMG INTEL/AMD INVENTED IT FIRST, SO IT'S BETTER/THEY'RE MORALLY BETTER THAN THE OTHER!" is just the last resort of fanboys to justify their purchase choice. Fortunately I'm smart enough to make my choices based on raw performance and price/performance ratios, and not out of "loyalty" to a CORPORATION you don't even work for.

Seriously, fanboys of any kind are a nuisance, but I swear AMD fanboys are a pest. You can't say the slightest thing against 'their' brand, or in favor of the competition, without them crying in opposition.
 
No need to clash fists here; we can all keep it civil, can't we?

The fact of the matter is that the companies AMD competes with have a bigger budget and more resources at their disposal. This means that most of the time they will have the upper hand in price, performance, marketing, business deals, partnerships, etc.
This does not mean that AMD can't have their niche and survive, if not flourish, in their segments. It's just that they have been in the negative for so long that at this point they really are struggling to survive. If Jim Keller has worked a miracle on the design of Zen, it will be a success on that front. If it fails, it will be 100% AMD's fault. Once the engine gets revved up to start getting these chips out on shelves, they need to do some HEAVY marketing to appeal to the masses, otherwise I'm afraid it won't look good on that balance sheet of theirs.
 
In your link it states that you need a Quadro card to use 10-bit color. My exact wording was "consumer card", and I used it for a reason.

If you want to be really specific on performance, the AMD dual-GPU card from two generations ago is still the fastest single-"card" solution. If you look past 1080p, the Fury series beats everything but the Titan, and does beat the Titan in some games. Again, it's exactly what I already posted.

And again, if you would look just a little closer into DX12, you will notice some things.

I am no fanboy; I have run both brands of cards and buy based on value/feature set. My last set of cards was three water-cooled GTX 470s. Those are to this day my favorite cards, followed by my original Ti 4200, which still holds some of the highest clock speed records ever recorded.
 
And how many "non-professional" 10-bit monitors do you currently see on the market, huh? And how much would they be overshadowed by future HDR monitors anyway?

Whatever. Like I said, it's not my point, and I'm absolutely tired of fanboyisms. Don't expect me to reply to any more of them. The same goes for anyone else.
 
10-bit color has no effect on the consumer market. 10-bit panels are expensive, though they are getting cheaper, but you're still looking at $350 for a GW2765HT, and that is an exception to the rule; the next cheapest panel after that is the $500 ASUS PB287Q. And 10-bit isn't noticeably better than 8-bit to the average consumer (that is why almost no 4K TVs support it). 10-bit is a professional feature that only professionals will notice if it's lacking. And I know someone is going to post those BS pictures with the one on the left being 256 colors and the one on the right being 8-bit, and claim "look at the difference 10-bit makes!" Don't waste your time, those pictures all grossly exaggerate the actual difference.

On top of that, AMD cards lack HDMI 2.0, a far more consumer-friendly feature than 10-bit. There are a lot of people that like to connect their computer to their TV. I'm sure there are a lot of consumers, a lot more than those that care about 10-bit support, that can only afford to buy one 4K device, and that is their TV, but also want to connect their high-end gaming computer to it to play games in 4K. Not with AMD cards you aren't, because most 4K TVs don't have DisplayPort. So it's an extra $30 for an adapter if you want to do that.
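
For reference, the raw numbers behind the 8-bit vs. 10-bit comparison; a quick back-of-the-envelope calculation, not something from the post above:

```python
# Shades per channel and total colors for 8-bit vs. 10-bit per-channel color.
for bits in (8, 10):
    shades = 2 ** bits   # levels per color channel
    total = shades ** 3  # R * G * B combinations
    print(f"{bits}-bit: {shades} shades per channel, {total:,} total colors")

# 8-bit:  256 shades per channel,  16,777,216 total colors
# 10-bit: 1024 shades per channel, 1,073,741,824 total colors
```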

DX12 right now doesn't matter, and probably isn't going to matter for this generation of cards. Tomb Raider is the first AAA game we are likely to see utilize it, maybe, and it is likely the nVidia GPUs will support every feature necessary.

HBM shouldn't be a marketing bullet point when HBM cards are still losing in performance to GDDR5 cards. So does it really make a difference? Are people with the Fury cards going "Hey look, my card is so much better because it has HBM... even though it still performs worse... but it is so much better because HBM!"? And what happens when the next generation of nVidia cards comes out with HBM2 and AMD is still using HBM1? Are you going to say the nVidia card is now better because it is using the better form of HBM, the next generation of HBM? Somehow I doubt it. HBM isn't a value-added feature.

The Nano doesn't offer better performance per watt. It is worse than the 980 and ties the 980 Ti at 4K; it is worse than the 980 Ti and 980 at 1440p; and it is worse than the 970, 980, and 980 Ti at 1080p.

GCN only manages to close the gap at higher resolutions; it doesn't offer better performance. The Fury X is the best GCN card available, and it merely ties the 980 Ti at 4K, while the pre-overclocked 980 Tis are actually crushing the Fury X with 15%+ better performance at 4K. And since the Fury X's overclocking is lackluster, to say the least, there is no making that difference up by overclocking the Fury X.
 
According to my data, the first full DX12 game (no betas or any other crap) will probably be Hitman (March 11 release date), with Quantum Break being the first DX12-only game (April 5). That is unless Tomb Raider's DX12 patch is released sooner than that...
 
And it will really be interesting to see how much of a difference it will actually make. Obviously Microsoft is doing the same thing with Quantum Break as they did with Halo, making it DX12-only to try to force people to Windows 10 (maybe I'll finally get around to formatting my main computer and installing it).

And when you look at the Steam survey, you've got almost 60% of users running Windows 7 or 8 and only 35% running Windows 10. So is DX12 going to be a game changer this year? (Pun not intended.) No, I don't think so.
 

Eventually, even the people who try to avoid the arguments snap from the constant shitty, uninformed circlejerk. I did so myself several threads ago when people were whining about how Intel wasn't innovating on the CPU side (here, and later on in the same thread, here). Here, it looks like @Kurt Maverick had his fill of the constant "AMD can do no wrong and has never done any anti-consumer behaviour" circlejerk (spoiler: they're just as bad as Intel and nVidia when they're in the lead... they're just not in the lead, more often than not).
 
I suspect that the Fury X situation will be as good as it gets for Zen also: almost competitive, late to market, and only making sense if sold cheap. Hard to make money that way.

AMD has dug such a deep hole in the last 10 years, it would take a miracle for them to crawl out of it. Even if they manage to put out something that *beats* Intel and Nvidia on price/performance, Intel and Nvidia can just drop the price on competing products to retain market share, and keep AMD where they are. That would be nice for consumers while it lasts, but it probably isn't going to make AMD profitable. They'd need to keep that going for years.

The only way out that I see is AMD hitting home runs for the next few years to demonstrate that the company has potential, and then merging with a company that has cash to invest. Or is that even possible with the licenses?
 
Yeah, because APUs are soooo useful outside of budget and office builds...

It doesn't matter. That is 98% of the CPU/APU business... not enthusiasts like us. So yeah, AMD's APUs are extremely useful.
 
It's still the future. I think that if there has ever been a chance of a new API being a real game-changer, it's DX12 / Vulkan.

And 99% of the people on the Internet think that any given percentage is false. Especially when it suits your argument so conveniently.

Anyway, I never denied that there's a market for them.
 
I don't have an argument. I merely point out that enthusiasts are an EXTREME minority. The majority of CPUs and APUs sold go to businesses, followed by regular users who don't do anything special.

I'm sorry if you're disappointed that your desires and mine don't count for anything with either AMD or Intel.
 
Gaming is a lot of money though, and gaming laptops sell well. A lot of people actually know that you want to see an AMD or Nvidia logo somewhere. That's about it though, and MSI and Asus have been making other laptop OEMs put up a better fight to stay in business.
 
Gaming is big money for GPU manufacturers, because on the consumer-facing side of things, that's the only thing still needing serious power. The rest of the time, an iGPU is just fine. For Intel, on the other hand, the gaming and enthusiast segment is but a tiny bit of market share. It's mostly the same story for AMD's CPU side, just slightly less skewed because AMD has no mobile CPUs worth talking about (when was the last time you saw a major laptop vendor ship AMD?).
 
I think I just saw that maybe Dell or HP is shipping something with a 380M for gaming, but other than that, yeah, it wasn't a good situation before Maxwell and it only got worse. I do recall some pretty nice gaming laptops being made with the 290M though, and so is the 5K Mac.
 
A lot of this is very true. My only arguments would be that, one, those adapters aren't $30 for me, and I wouldn't buy any Fury other than the Nano. Overclock it and you have a cheaper Fury X.

I also haven't really seen any benchmark showing GDDR5 beating HBM?
 
If AMD's past PR and real-world performance figures are any indication, the claimed 40% improvement will just be a best-case-scenario figure in one out of 50 different benchmarks, while the average performance increase will be 20% tops.
 
This is normally the case. I am hoping for once (and considering who designed the CPU) that this is not the case.
 
I haven't seen a DisplayPort to HDMI 2.0 adapter yet that was cheaper than $30.

Just look at the latest 980 Ti Matrix benchmark here on TPU. It beats the Fury X by 17% at 4K. Sure, HBM provides more memory bandwidth than GDDR5, but when the overall card is still slower, what's the point?
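
For context on the bandwidth side of that argument, the gap can be worked out from the advertised specs; a rough sketch assuming the Fury X's 4096-bit / ~1 Gbps-per-pin HBM and the 980 Ti's 384-bit / ~7 Gbps-per-pin GDDR5 (advertised figures, actual numbers vary slightly):

```python
# Rough memory-bandwidth comparison from advertised specs.
def bandwidth_gb_per_s(bus_width_bits, data_rate_gbps_per_pin):
    # GB/s = (bus width in bits / 8) * per-pin data rate in Gb/s
    return bus_width_bits / 8 * data_rate_gbps_per_pin

fury_x_hbm = bandwidth_gb_per_s(4096, 1.0)       # 4096-bit bus, ~1 Gbps/pin HBM
gtx_980_ti_gddr5 = bandwidth_gb_per_s(384, 7.0)  # 384-bit bus, ~7 Gbps/pin GDDR5

print(f"Fury X (HBM):   {fury_x_hbm:.0f} GB/s")        # ~512 GB/s
print(f"980 Ti (GDDR5): {gtx_980_ti_gddr5:.0f} GB/s")  # ~336 GB/s
```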
 
It's changing though, and Fiji gets better scaling with CrossFire. It's about more than the bandwidth that HBM provides; it's the low-latency architecture that comes with it. It benefits VR, among a long list of everything else. Dual Fiji will be shown to blow away 980 Ti SLI in some situations.
About the article...
It looks like there's some serious architecture backing up those shiny new 14 nm cores. Not just low latency but high bandwidth.
Along with DX12, it's like turning a 4-lane highway into a 20-lane one and making the speed limit 100% faster. The async highway.
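
Loosely illustrating that "more lanes" idea, here is a toy sketch (plain Python, not actual DirectX 12 code) showing why independent work submitted to several parallel lanes finishes sooner than the same work waiting in one serial lane:

```python
# Toy analogy for overlapping independent work on separate "queues".
# This is NOT DirectX 12 code; it just shows that independent tasks
# finish sooner when they can run on parallel lanes instead of one.
import time
from concurrent.futures import ThreadPoolExecutor

def task(name, seconds):
    time.sleep(seconds)  # stand-in for GPU work (graphics, compute, copy)
    return name

jobs = [("graphics", 0.2), ("compute", 0.2), ("copy", 0.2)]

start = time.perf_counter()
for name, secs in jobs:  # one lane: strictly serial
    task(name, secs)
serial = time.perf_counter() - start

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=3) as pool:  # three lanes: overlapped
    list(pool.map(lambda j: task(*j), jobs))
overlapped = time.perf_counter() - start

print(f"serial: {serial:.2f}s, overlapped: {overlapped:.2f}s")
```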
 
Whenever someone says "some situations" I immediately add "but those situations are the exception to the norm" in my head.
 
It does happen a lot when VRAM becomes the processing bottleneck, and it does happen at 4K and up.
 
Me is not gonna say anything until I see an ES chip review (or leak). Right now, we have nothing other than words.
 