Monday, January 4th 2016

AMD Demonstrates Revolutionary 14 nm FinFET Polaris GPU Architecture

AMD provided customers with a glimpse of its upcoming 2016 Polaris GPU architecture, highlighting a wide range of significant architectural improvements, including HDR monitor support and industry-leading performance per watt. AMD expects shipments of Polaris architecture-based GPUs to begin in mid-2016.

AMD's Polaris architecture-based 14 nm FinFET GPUs deliver a remarkable generational jump in power efficiency. Polaris-based GPUs are designed for fluid frame rates in graphics, gaming, VR, and multimedia applications running in compelling small-form-factor, thin-and-light computer designs.

"Our new Polaris architecture showcases significant advances in performance, power efficiency and features," said Lisa Su, president and CEO, AMD. "2016 will be a very exciting year for Radeon fans driven by our Polaris architecture, Radeon Software Crimson Edition and a host of other innovations in the pipeline from our Radeon Technologies Group."

The Polaris architecture combines AMD's fourth-generation Graphics Core Next (GCN) design with a next-generation display engine supporting HDMI 2.0a and DisplayPort 1.3, and next-generation multimedia features including 4K H.265 (HEVC) encoding and decoding.


AMD has an established track record for dramatically increasing the energy efficiency of its mobile processors, targeting a 25x improvement by the year 2020.

88 Comments on AMD Demonstrates Revolutionary 14 nm FinFET Polaris GPU Architecture

#1
Steevo
Compared to a 950, but with no mention of price, and at only 0.85 V according to the specs, so that is good I suppose, unless it's a very cherry-picked sample with a custom-tuned BIOS for power consumption.

It shows 850E before the voltage, perhaps meaning an 850 MHz core clock in an energy-saving or efficiency mode?

Considering the 950 in one review, with some overclock, was pulling roughly 100 W by itself, and they say the whole tower in the video was pulling 150-160 W for the Nvidia 950 system, compared to their 86 W total... that means the Polaris card was only pulling 30-40 W.
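
For what it's worth, here's a quick back-of-the-envelope sketch of that estimate (all the inputs are the review/video figures quoted above, so treat the result as a rough ballpark rather than a measurement):

```python
# Back-of-the-envelope GPU power estimate from the figures quoted above.
# All inputs are assumed review/video numbers, not measurements.

gtx950_system_w = (150, 160)  # reported wall power of the GTX 950 system (W)
gtx950_card_w = 100           # rough GTX 950 board power under load (W)
polaris_system_w = 86         # reported wall power of the Polaris system (W)

# Assume the rest of the platform (CPU, board, PSU losses) draws the same in both rigs.
platform_w = [total - gtx950_card_w for total in gtx950_system_w]   # ~50-60 W
polaris_gpu_w = sorted(polaris_system_w - p for p in platform_w)    # ~26-36 W

print(f"Estimated platform-only draw: {platform_w[0]}-{platform_w[1]} W")
print(f"Estimated Polaris board power: {polaris_gpu_w[0]}-{polaris_gpu_w[1]} W")
```

That lands in roughly the same ~30 W ballpark, with the caveat that the ~100 W GTX 950 figure comes from one overclocked sample.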
#2
the54thvoid
Intoxicated Moderator
Too early for anything but PR speculation. HBM2 and the halved node size will bring massive efficiency gains regardless. But happily, it seems the head man is doing the talking, not some marketing type, so this could be a very good portent of things to come.
#3
Casecutter
It sure seems RTG has started a constant drum beat... and it feels a little too early.

Can they keep what seems like a three-week cadence (or quicker) going until there are products to show? I'm hoping RTG has a new PR initiative to build a slow, constant beat... there was Crimson, the 380X, the whole better-pixels/HDR stuff, FreeSync monitors, and now the Polaris architecture. All of a sudden it seems they're letting stuff go too rapidly, and will there be enough to keep this beat up for 4-6 months? I'd hate to see a period of silence where rumors and forums start crafting information just to get hits, leaving AMD either having to plug holes (deny) or bring things forward to cover them.

It sounds like they wanted this released now, freely outing their intentions, which isn't any big thing, but leading with "performance-per-watt" seems a bit before its time. They could've covered Polaris and how it differentiates from earlier GCN with the display engine supporting HDMI 2.0a and DP 1.3 and the multimedia features for 4K H.265, without coming out with an actual system-to-system perf/W comparison against the competition; that seemed to offer a little too much at this stage.
#4
the54thvoid
Intoxicated Moderator
Casecutter: It sure seems RTG has started a constant drum beat... and it feels a little too early.

Can they keep what seems like a three-week cadence (or quicker) going until there are products to show? I'm hoping RTG has a new PR initiative to build a slow, constant beat... there was Crimson, the 380X, the whole better-pixels/HDR stuff, FreeSync monitors, and now the Polaris architecture. All of a sudden it seems they're letting stuff go too rapidly, and will there be enough to keep this beat up for 4-6 months? I'd hate to see a period of silence where rumors and forums start crafting information just to get hits, leaving AMD either having to plug holes (deny) or bring things forward to cover them.

It sounds like they wanted this released now, freely outing their intentions, which isn't any big thing, but leading with "performance-per-watt" seems a bit before its time. They could've covered Polaris and how it differentiates from earlier GCN with the display engine supporting HDMI 2.0a and DP 1.3 and the multimedia features for 4K H.265, without coming out with an actual system-to-system perf/W comparison against the competition; that seemed to offer a little too much at this stage.
Well, Nvidia have an event tomorrow, so perhaps AMD jumped first for the PR? 2016 could be very dog-eat-dog.
#5
lilhasselhoffer
I'm too wary to take this at face value.

If you'll note, the power consumption figures are for the entire system driving a single 1080p monitor. That means the theoretical savings from the GPU should be 154 - 86 = 68 watts. That seems a little high, especially considering that none of the new features of that GPU are being utilized. Given those numbers, the extra 6/8-pin power connector is nearly unnecessary. What I find even funnier is that instead of showing their own progress (say, a 300-series card versus this new one), they compare against what will be outdated Nvidia tech before they come to market.

This is depressing fluff. Instead of showing what they've got, they're measuring against an old stick. It could well be fear that Nvidia will release truly amazing cards with Pascal, but I'd hazard that this is more smokescreen than outright fear. Say that your cards are great before the competition can respond, and then when they bring out numbers you can position your lineup and its pricing to compete well. I'm saddened to think that AMD PR thinks this is necessary.


If AMD came forward with a demonstration where we could see two or three monitors running games, I might be happier. I would even accept 4K video as a reasonable demonstration (especially over a single cable, rather than the painful setups we have to make now). What they've shown us right now is that an unknown card, in an unknown segment, can compete well against a relatively low-end card from the competition that has been available for some time.

Sorry, but you don't buy a 950 for gaming unless you've got a very tight budget. I'd like to see two cards that we can say are direct price competitors in the $200-300 range square off. That's where the computer enthusiast looks to spend their money, not on something just adequate for current gaming.
#6
Xzibit
Casecutter: It sure seems RTG has started a constant drum beat... and it feels a little too early.

Can they keep what seems like a three-week cadence (or quicker) going until there are products to show? I'm hoping RTG has a new PR initiative to build a slow, constant beat... there was Crimson, the 380X, the whole better-pixels/HDR stuff, FreeSync monitors, and now the Polaris architecture. All of a sudden it seems they're letting stuff go too rapidly, and will there be enough to keep this beat up for 4-6 months? I'd hate to see a period of silence where rumors and forums start crafting information just to get hits, leaving AMD either having to plug holes (deny) or bring things forward to cover them.

It sounds like they wanted this released now, freely outing their intentions, which isn't any big thing, but leading with "performance-per-watt" seems a bit before its time. They could've covered Polaris and how it differentiates from earlier GCN with the display engine supporting HDMI 2.0a and DP 1.3 and the multimedia features for 4K H.265, without coming out with an actual system-to-system perf/W comparison against the competition; that seemed to offer a little too much at this stage.
CES is going to be HDR-heavy, with HDMI 2.0a and all the TV manufacturers pushing it this year. PC folks will finally have DP 1.3. All that's left is product support from TVs and monitors once the GPUs are made available. HDMI 2.0 was short-lived and horribly supported, limited to only a few expensive TVs.

Expect most news to be beneficial for TV and Mobile. Both will save their in-depth reveals for their own shows later on.
#7
Assimilator
From AnandTech:
As for RTG’s FinFET manufacturing plans, the fact that RTG only mentions “FinFET” and not a specific FinFET process (e.g. TSMC 16nm) is intentional. The group has confirmed that they will be utilizing both traditional partner TSMC’s 16nm process and AMD fab spin-off (and Samsung licensee) GlobalFoundries’ 14nm process, making this the first time that AMD’s graphics group has used more than a single fab. To be clear here there’s no expectation that RTG will be dual-sourcing – having both fabs produce the same GPU – but rather the implication is that designs will be split between the two fabs. To that end we know that the small Polaris GPU that RTG previewed will be produced by GlobalFoundries on their 14nm process, meanwhile it remains to be seen how the rest of RTG’s Polaris GPUs will be split between the fabs.

Unfortunately what’s not clear at this time is why RTG is splitting designs like this. Even without dual sourcing any specific GPU, RTG will still incur some extra costs to develop common logic blocks for both fabs. Meanwhile it's also not clear right now whether any single process/fab is better or worse for GPUs, and what die sizes are viable, so until RTG discloses more information about the split order, it's open to speculation what the technical reasons may be. However it should be noted that on the financial side of matters, as AMD continues to execute a wafer share agreement with GlobalFoundries, it’s likely that this split helps AMD to fulfill their wafer obligations by giving GlobalFoundries more of AMD's chip orders.
Seems that TSMC still has some secret GPU sauce.
#8
the54thvoid
Intoxicated Moderator
Xzibit: CES is going to be HDR-heavy, with HDMI 2.0a and all the TV manufacturers pushing it this year. PC folks will finally have DP 1.3. All that's left is product support from TVs and monitors once the GPUs are made available. HDMI 2.0 was short-lived and horribly supported, limited to only a few expensive TVs.

Expect most news to be beneficial for TV and Mobile. Both will save their in-depth reveals for their own shows later on.
HDR a go-go. I was looking at 4K OLED TVs but read that the current crop isn't HDR compatible, which is what the next Blu-ray (4K UHD) format will use. So yeah, I think that will be a huge focus this year, as you say.
#9
Xzibit
the54thvoid: HDR a go-go. I was looking at 4K OLED TVs but read that the current crop isn't HDR compatible, which is what the next Blu-ray (4K UHD) format will use. So yeah, I think that will be a huge focus this year, as you say.
Another thing to look out for is that HDR requires higher bandwidth; HDMI 2.0a doesn't look capable of handling 4K Rec. 2020 10-bit HDR at 60 Hz. I suspect all of this year's models (with the exception of some $12,000+ sets) will suffer the same drawback almost all 4K TVs still have: just being a resolution increase with no proper support.

4K HDR TVs will be limited to 30 Hz over HDMI 2.0a for full support, but manufacturers will likely dumb it down to 8-bit, as they currently do for 4K, until HDMI gets better. DisplayPort 1.3 should have no problem with full 4K HDR support at 60 Hz.
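
A rough link-budget sketch shows where that limit comes from. The link rates and the 594 MHz 4K60 pixel clock below are the commonly published figures and the 8b/10b overhead is approximated, so this is a ballpark check rather than a spec-exact calculation:

```python
# Rough check: does 3840x2160 @ 60 Hz with 10-bit RGB (4:4:4) fit on each link?
# 594 MHz is the standard 4K60 pixel clock (4400x2250 total, incl. blanking);
# link payloads assume 8b/10b coding overhead. Ballpark figures only.

PIXEL_CLOCK_4K60_HZ = 594e6
BITS_PER_PIXEL_10BIT_RGB = 30  # 10 bits x 3 channels, full 4:4:4 chroma

required_gbps = PIXEL_CLOCK_4K60_HZ * BITS_PER_PIXEL_10BIT_RGB / 1e9  # ~17.8 Gbps

links_payload_gbps = {
    "HDMI 2.0/2.0a": 18.0 * 8 / 10,    # 18 Gbps TMDS -> ~14.4 Gbps of video data
    "DisplayPort 1.3": 32.4 * 8 / 10,  # 4 x 8.1 Gbps HBR3 -> ~25.9 Gbps of data
}

for name, payload in links_payload_gbps.items():
    verdict = "fits" if payload >= required_gbps else "does NOT fit"
    print(f"{name}: {payload:.1f} Gbps available vs {required_gbps:.1f} Gbps needed -> {verdict}")
```

Dropping to 4:2:2/4:2:0 chroma or 8-bit depth shrinks the requirement enough to squeeze through HDMI 2.0a, which is exactly the "dumbing down" described above.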
#10
arbiter
Steevo: It shows 850E before the voltage, perhaps meaning an 850 MHz core clock in an energy-saving or efficiency mode?

Considering the 950 in one review, with some overclock, was pulling roughly 100 W by itself, and they say the whole tower in the video was pulling 150-160 W for the Nvidia 950 system, compared to their 86 W total... that means the Polaris card was only pulling 30-40 W.
It's AMD, so you can't take anything they say at 100% face value until it's proven by independent reviews.
Casecutter: It sure seems RTG has started a constant drum beat... and it feels a little too early.
Yeah, typical AMD starting up the hype train, which could come to a screeching halt if they poke the green monster too much.
lilhasselhoffer: What I find even funnier is that instead of showing their own progress (say, a 300-series card versus this new one), they compare against what will be outdated Nvidia tech before they come to market.
This is depressing fluff. Instead of showing what they've got, they're measuring against an old stick. It could well be fear that Nvidia will release truly amazing cards with Pascal,
I think AMD should stick to comparing against their own cards instead of Nvidia's. If you look at Nvidia's site, they just compare their cards to their own cards, not AMD's.
#11
PP Mguire
Blah blah blah, beat Nvidia so I can play with team Red again. Please.
#12
FordGT90Concept
"I go fast!1!11!1!"
0:39

HDMI 2.0a
DisplayPort 1.3
4K h.265 encode/decode

YAY! :D I want it NOW! :D
#13
deemon
FordGT90Concept: 0:39

HDMI 2.0a
DisplayPort 1.3
4K h.265 encode/decode

YAY! :D I want it NOW! :D
Now if only they can give us a card that can pull off 4K HDR @ 60+ fps at ultra settings in any game, as a single-card solution, in a Fury Nano-sized card. That would be something! :)
#14
FordGT90Concept
"I go fast!1!11!1!"
I think 4K HDR @ 60+ FPS is entirely plausible. I think the PCB will be nano-sized (because HBM) but the air coolers on it would have to be larger to prevent thermal throttling. A little more than Fury X performance from a 380-sized card is likely.
#15
R-T-B
So much for the "Arctic Islands" naming scheme?

I was looking forward to owning "Greenland," awesome place. Not that it's really relevant, but oh well.

EDIT: Never mind; according to another news link, this is simply the name for the 4th-gen GCN architecture. They still have Arctic Islands.
#16
a_ump
I'm not super TV/monitor savvy beyond the basics. I keep reading HDR... the only HDR I know of is the lighting setting in games.
#17
deemon
a_ump: I'm not super TV/monitor savvy beyond the basics. I keep reading HDR... the only HDR I know of is the lighting setting in games.
en.wikipedia.org/wiki/High-dynamic-range_imaging

What it technically means is 10/12 bits per color channel, instead of the currently widely used 8 bits. So instead of 256*256*256 = 16.7M different colors, you get, in the case of 10 bits, 1024*1024*1024 = ~1,073M colors, and in the case of 12 bits, 4096*4096*4096 = ~68,719M colors.

Maybe easier to understand: right now you have 256 gradient steps from black to white (256 shades of grey :D), and if you made such a gradient across your 1080p screen from left to right, each shade would be a 7.5-pixel-wide stripe. With 10-bit color you would have 1024 different shades and each stripe would be 1.875 pixels wide => a much, much smoother gradient. With 12 bits you could make such a gradient across a 4K screen and each pixel column would have its own shade.
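
If it helps, the arithmetic behind those numbers is just powers of two; here's a quick sketch (the screen widths are simply the horizontal resolutions used in the example above):

```python
# Colour counts and grey-ramp step widths for different per-channel bit depths.

def total_colours(bits_per_channel: int) -> int:
    """All RGB combinations at a given per-channel bit depth."""
    return (2 ** bits_per_channel) ** 3

def step_width_px(bits_per_channel: int, screen_width_px: int) -> float:
    """Stripe width if a full black-to-white ramp is spread across the screen."""
    return screen_width_px / (2 ** bits_per_channel)

for bits, width in [(8, 1920), (10, 1920), (12, 3840)]:
    print(f"{bits}-bit: {total_colours(bits):,} colours, "
          f"{step_width_px(bits, width):.3f} px per grey step on a {width}-px-wide screen")
```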

However, I must say that those "HDR photos" I have seen on the interwebs and in journals, for example:
stuckincustoms.smugmug.com/Portfolio/i-8mFWsjn/1/900x591/953669278_349a6a9897_o-900x591.jpg
www.imgbase.info/images/safe-wallpapers/photography/hdr/41108_hdr_hdr_landscape.jpg

although they look beautiful, don't exactly look natural... so I am a bit puzzled what this AMD HDR stuff would mean for the picture itself.

If anyone has a better explanation, please correct me :)
#18
a_ump
deemon: ~
What it technically means is 10/12 bits per color channel, instead of the currently widely used 8 bits. So instead of 256*256*256 = 16.7M different colors, you get, in the case of 10 bits, 1024*1024*1024 = ~1,073M colors, and in the case of 12 bits, 4096*4096*4096 = ~68,719M colors.

Maybe easier to understand: right now you have 256 gradient steps from black to white (256 shades of grey :D), and if you made such a gradient across your 1080p screen from left to right, each shade would be a 7.5-pixel-wide stripe. With 10-bit color you would have 1024 different shades and each stripe would be 1.875 pixels wide => a much, much smoother gradient. With 12 bits you could make such a gradient across a 4K screen and each pixel column would have its own shade.
~
If anyone has a better explanation, please correct me :)
Ahh, thanks, so that would explain why in very dark scenes my TV tends to have somewhat noticeable black "shade" lines, as I call them, where one area of black on my screen is distinguishable from another instead of a smooth transition. Of course, my TV is from 2012. Here I thought they called 16.7 million colors "true color" because that was all the colors recognizable to the human eye.
#19
Xzibit
deemon: en.wikipedia.org/wiki/High-dynamic-range_imaging

What it technically means is 10/12 bits per color channel, instead of the currently widely used 8 bits. So instead of 256*256*256 = 16.7M different colors, you get, in the case of 10 bits, 1024*1024*1024 = ~1,073M colors, and in the case of 12 bits, 4096*4096*4096 = ~68,719M colors.

Maybe easier to understand: right now you have 256 gradient steps from black to white (256 shades of grey :D), and if you made such a gradient across your 1080p screen from left to right, each shade would be a 7.5-pixel-wide stripe. With 10-bit color you would have 1024 different shades and each stripe would be 1.875 pixels wide => a much, much smoother gradient. With 12 bits you could make such a gradient across a 4K screen and each pixel column would have its own shade.

However, I must say that those "HDR photos" I have seen on the interwebs and in journals, for example:
stuckincustoms.smugmug.com/Portfolio/i-8mFWsjn/1/900x591/953669278_349a6a9897_o-900x591.jpg
www.imgbase.info/images/safe-wallpapers/photography/hdr/41108_hdr_hdr_landscape.jpg

although they look beautiful, don't exactly look natural... so I am a bit puzzled what this AMD HDR stuff would mean for the picture itself.

If anyone has a better explanation, please correct me :)
HDR on cameras is multiple exposures blended into one image.
HDR on TVs seems to be a reference for 4K standards plus improved contrast ratio on the panels.
#20
geon2k2
deemon: although they look beautiful, don't exactly look natural... so I am a bit puzzled what this AMD HDR stuff would mean for the picture itself.
Another question is how those HDR 10-bit-per-channel pictures would look on a "gaming" TN panel, which in most cases has 6 bits per channel.

I think these TN monitors should just disappear, or they should only be used in entry-level products, more like what has already happened with phones. Most decent phones have IPS/AMOLED or similar tech.

On the performance side, I can't wait to see what FinFET brings to the table. There should be an amazing improvement over the last generation. And I really like that they're continuing with GCN, which means most GCN cards will still get support.
#21
RejZoR
Yet another one bashing TN panels because reasons. Good luck finding a 144 Hz IPS screen that doesn't cost a fucking 1k €. But you can get TN ones with otherwise the same specs for a quarter of the price. And color- and angle-wise they aren't that much worse. Stop thinking of TN panels from 2005 and comparing them to those released in 2015...

Source: I own a 144Hz TN gaming monitor...
#22
deemon
geon2k2: Another question is how those HDR 10-bit-per-channel pictures would look on a "gaming" TN panel, which in most cases has 6 bits per channel.
99% of currently available monitors don't benefit from this, be they TN, IPS or MVA. There are a few uber-expensive "pro" displays... that you can use with Quadro/FirePro-class GPUs, I believe.

Hopefully this new generation of AMD GPUs will bring new displays to market (AMD hinted at cooperation with various display manufacturers on "HDR displays") with 10/12-bit support that don't cost an arm and a leg. And have FreeSync support. And are IPS... and are OLED... and are 21:9... and are 144+ Hz... and are curved. Too many things to look for when shopping for displays.
#23
geon2k2
RejZoR: Yet another one bashing TN panels because reasons. Good luck finding a 144 Hz IPS screen that doesn't cost a fucking 1k €. But you can get TN ones with otherwise the same specs for a quarter of the price. And color- and angle-wise they aren't that much worse. Stop thinking of TN panels from 2005 and comparing them to those released in 2015...

Source: I own a 144Hz TN gaming monitor...
:)

I obviously have an IPS at 60 Hz, and a pretty old one as well, but I still like it very much despite its pitfalls, which are mostly the lack of 120 Hz and FreeSync/G-Sync.

I don't know what to say; I occasionally visit electronics stores, and every time I pass the monitor shelf I can instantly tell which is TN and which is IPS, just from the viewing angles. My guess is that they sell quite new monitors there; in fact, some of the IPS units on display are those crazy-wide monitors, which are a very new gimmick.

I would also like more hertz and a tearing-free experience, but not at the expense of color fidelity and viewing angles. If I cannot have both, then I prefer IPS.
I think companies should stop investing in TN and focus more on making the better technology affordable and available to everybody.

BTW, prices are not as you say from what I see: if you compare the same brand, a high-refresh TN monitor is about 70% of the price of a good 144 Hz IPS (MG279Q), and some TN gaming monitors, like the PG278Q, are even more expensive than the IPS.
#24
RejZoR
Viewing angles are meaningless for gaming, imo. For comfortable gaming you're facing the monitor dead-on anyway. Besides, even if you lean a bit to either side, trust me, in the heat of battle you'll NEVER notice tiny color gradients that are a bit off. And with pixel response times of 1 ms (TN) compared to 5 ms (IPS), there's zero shadowing. When I first brought it home, the image was so sharp in motion it was weird to look at, even during insane motion (Natural Selection 2). Or the road in NFS Hot Pursuit 2010: I could actually see the road texture sharply where on my old monitor it was just a blurry mess, and that was a 2 ms, 75 Hz gaming screen. But it was an older TN panel and it showed its age a bit.
#25
geon2k2
RejZoR: Viewing angles are meaningless for gaming, imo. For comfortable gaming you're facing the monitor dead-on anyway. Besides, even if you lean a bit to either side, trust me, in the heat of battle you'll NEVER notice tiny color gradients that are a bit off. And with pixel response times of 1 ms (TN) compared to 5 ms (IPS), there's zero shadowing. When I first brought it home, the image was so sharp in motion it was weird to look at, even during insane motion (Natural Selection 2). Or the road in NFS Hot Pursuit 2010: I could actually see the road texture sharply where on my old monitor it was just a blurry mess, and that was a 2 ms, 75 Hz gaming screen. But it was an older TN panel and it showed its age a bit.
True and true.
For sure you will not notice the viewing angles during gaming. For example, I don't care about anti-aliasing and I game with it disabled most of the time, even though in some games I could easily enable it without dropping under 60 frames. If you look carefully at a static image you will notice it, but during movement and action... not really.

I'm also pretty sure that the low response time does make a difference, and I do plan to move to 120 Hz myself, but somehow I still find it hard to let go of my old monitor, which still works perfectly and has served me well for so many years.