Monday, January 4th 2016

AMD Demonstrates Revolutionary 14 nm FinFET Polaris GPU Architecture

AMD provided customers with a glimpse of its upcoming 2016 Polaris GPU architecture, highlighting a wide range of significant architectural improvements, including HDR monitor support and industry-leading performance per watt. AMD expects shipments of Polaris architecture-based GPUs to begin in mid-2016.

AMD's Polaris architecture-based 14nm FinFET GPUs deliver a remarkable generational jump in power efficiency. Polaris-based GPUs are designed for fluid frame rates in graphics, gaming, VR, and multimedia applications running on compelling small-form-factor, thin-and-light computer designs.

"Our new Polaris architecture showcases significant advances in performance, power efficiency and features," said Lisa Su, president and CEO, AMD. "2016 will be a very exciting year for Radeon fans driven by our Polaris architecture, Radeon Software Crimson Edition and a host of other innovations in the pipeline from our Radeon Technologies Group."

The Polaris architecture features AMD's 4th-generation Graphics Core Next (GCN) architecture, a next-generation display engine with support for HDMI 2.0a and DisplayPort 1.3, and next-generation multimedia features including 4K H.265 encoding and decoding.
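To put the display-engine claims in rough numbers (a back-of-the-envelope sketch using the standard CTA-861 4K60 timing, not figures from AMD's announcement): a 4K 60 Hz stream at 10 bits per channel needs on the order of 18 Gbit/s, which exceeds HDMI 2.0a's effective payload but fits comfortably within DisplayPort 1.3's.

```python
# Back-of-the-envelope check (illustrative, not from AMD's announcement):
# does 3840x2160 @ 60 Hz with 10 bits per channel (HDR) fit in DisplayPort 1.3?

# The standard CTA-861 4K60 timing uses a 4400 x 2250 total raster -> 594 MHz pixel clock.
pixel_clock_hz = 4400 * 2250 * 60        # 594,000,000
bits_per_pixel = 3 * 10                  # RGB at 10 bits per channel

required_gbps = pixel_clock_hz * bits_per_pixel / 1e9    # ~17.8 Gbit/s

# Effective payload rates after 8b/10b encoding:
dp13_gbps = 4 * 8.1 * 0.8                # DisplayPort 1.3: 4 lanes x HBR3 -> 25.92 Gbit/s
hdmi20_gbps = 18.0 * 0.8                 # HDMI 2.0/2.0a: 18 Gbit/s TMDS -> 14.4 Gbit/s

print(f"4K60 RGB 10-bit needs ~{required_gbps:.1f} Gbit/s")
print(f"DisplayPort 1.3 payload: {dp13_gbps:.2f} Gbit/s -> {'fits' if required_gbps < dp13_gbps else 'does not fit'}")
print(f"HDMI 2.0a payload: {hdmi20_gbps:.1f} Gbit/s -> {'fits' if required_gbps < hdmi20_gbps else 'does not fit'}")
```

In practice, HDMI 2.0a carries 4K60 HDR by falling back to 4:2:2 or 4:2:0 chroma subsampling, while DisplayPort 1.3 has headroom for full RGB.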


AMD has an established track record for dramatically increasing the energy efficiency of its mobile processors, targeting a 25x improvement by the year 2020.

88 Comments on AMD Demonstrates Revolutionary 14 nm FinFET Polaris GPU Architecture

#51
FordGT90Concept
"I go fast!1!11!1!"
Wishing I didn't look at that. Man my monitor is crap! :laugh:

On the viewing-angle-and-gamma test it's red at the bottom, gets choppy about 2/3 of the way up, and turns cyan above that. On the viewing-angle-and-brightness test I see shades of violet instead of a solid color.
arbiterYeah, I don't know if they are just keeping it a secret or if they have issues. But keeping it a secret is more likely; if there were issues, we would probably have heard.
I doubt there are "issues" with the TSMC process; they are just way behind Samsung, which is why AMD can demo (and likely deliver) the next generation of GPUs first.
Posted on Reply
#52
arbiter
FordGT90ConceptI doubt there are "issues" with the TSMC process; they are just way behind Samsung, which is why AMD can demo (and likely deliver) the next generation of GPUs first.
That, or AMD is feeling the pressure of the rumored April release of Pascal? If it does release in under 3 months, AMD likely won't launch for another 3 after that, so they are trying to create some buzz and get people to hold out.
Posted on Reply
#53
PP Mguire
deemonwww.lagom.nl/lcd-test/viewing_angle.php
Put your browser in fullscreen (F11) and scroll down a bit so your entire display is covered with the "lagom" text on a grey background.
I see a huge color distortion on my TN panel even when viewing dead-on, so much that at the top of the screen the red text is actually already cyan, whereas on my IPS and MVA panels I do not see this color distortion in the same picture.
That's a neat test.
Posted on Reply
#54
newtekie1
Semi-Retired Folder
I'm just going to say this: 14nm FinFET is already sort of a letdown, and has already been shown to consume more power than TSMC's 16nm process, in literally the most apples-to-apples comparison possible... see what I did there?

So I'm not holding my breath about power consumption; AMD has already kind of shot themselves in the foot by picking a less power-efficient process.
Posted on Reply
#55
RejZoR
deemonwww.lagom.nl/lcd-test/viewing_angle.php
Put your browser in fullscreen (F11) and scroll down a bit so your entire display is covered with the "lagom" text on a grey background.
I see a huge color distortion on my TN panel even when viewing dead-on, so much that at the top of the screen the red text is actually already cyan, whereas on my IPS and MVA panels I do not see this color distortion in the same picture.
Text looks exactly the same even from very extreme left and right angles. It also looks the same from a very extreme top-down angle. The only time the text becomes cyan is when looking at it from an extreme bottom angle, meaning my beard would have to touch the desk. Let me just say I NEVER look at the display from that angle. From the other 3 extreme angles there is only a slight brightness change of the entire image.

In a normal position, maybe 1-2cm of the top left corner turns slightly cyan. When you factor in the fact that you never have a monochrome image like this in games or in Windows, it becomes an entirely irrelevant "problem".

My monitor is an ASUS VG248QE with a TN panel, 144Hz refresh rate and 1ms pixel response. Yeah, that's how far TNs have come. Stop bloody comparing them to TNs from 2005 already.
Posted on Reply
#56
Xzibit
RejZoRText looks exactly the same even from very extreme left and right angles. It also looks the same from a very extreme top-down angle. The only time the text becomes cyan is when looking at it from an extreme bottom angle, meaning my beard would have to touch the desk. Let me just say I NEVER look at the display from that angle. From the other 3 extreme angles there is only a slight brightness change of the entire image.

In a normal position, maybe 1-2cm of the top left corner turns slightly cyan. When you factor in the fact that you never have a monochrome image like this in games or in Windows, it becomes an entirely irrelevant "problem".

My monitor is an ASUS VG248QE with a TN panel, 144Hz refresh rate and 1ms pixel response. Yeah, that's how far TNs have come. Stop bloody comparing them to TNs from 2005 already.
Actually, in-game the color shifting is worse because games ignore any such color calibration profiles and settings. People just seem to tolerate it because they buy them for gaming or as a budget monitor. VA panels have slight color shifts too, but nowhere near the degree you'll find on TNs.
Posted on Reply
#57
FordGT90Concept
"I go fast!1!11!1!"
newtekie1I'm just going to say this: 14nm FinFET is already sort of a letdown, and has already been shown to consume more power than TSMC's 16nm process, in literally the most apples-to-apples comparison possible... see what I did there?

So I'm not holding my breath about power consumption; AMD has already kind of shot themselves in the foot by picking a less power-efficient process.
Got linky?
Posted on Reply
#58
newtekie1
Semi-Retired Folder
FordGT90ConceptGot linky?
All over the net. Apple used TSMC's 16nm FinFET and Samsung's 14nm FinFET to make the A9 chips, then put those chips into identical phones. The phones with Samsung chips get 20-30% less battery life.

arstechnica.com/apple/2015/10/samsung-vs-tsmc-comparing-the-battery-life-of-two-apple-a9s/

The interesting thing is that the real difference seems to come out in very heavy, CPU-intensive tasks. So, if you make a GPU, there is a good chance the load power consumption will be higher using Samsung's 14nm than TSMC's 16nm.
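As a rough sketch of what that battery-life gap implies (simple arithmetic only, assuming the same battery capacity and workload in both phones):

```python
# Rough arithmetic only: for a fixed battery capacity, average power draw scales
# inversely with runtime, so a 20-30% shorter battery life under a sustained load
# implies roughly 25-43% higher average power for that workload.
def implied_power_increase(runtime_drop_pct: float) -> float:
    """Percent increase in average power for a given percent drop in runtime."""
    remaining = 1.0 - runtime_drop_pct / 100.0
    return (1.0 / remaining - 1.0) * 100.0

for drop in (20, 30):
    print(f"{drop}% less battery life -> ~{implied_power_increase(drop):.0f}% more average power")
```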
Posted on Reply
#60
FordGT90Concept
"I go fast!1!11!1!"
Samsung never made a Cortex-A9. Well they did, but the lowest process it was on was 32nm. :confused:

Is there a specific model I should be looking at? The newer Samsungs (e.g. Galaxy S6) run on Cortex-A53 or Cortex-A57.

Considering their operating systems are completely different and Apple does its own thing for GPUs (for Metal), I don't think it's really a 1:1 comparison.


Edit: The Samsung Galaxy S6 has 8 cores in a big.LITTLE configuration--nothing like Apple. Actually, it appears the Apple A9 is only a dual-core. Samsung's processor should have substantially more performance when power saving and when under heavy load.

Edit: It would be best to compare APL0898 to APL1022. Here we go: bgr.com/2015/10/08/iphone-6s-a9-processor-samsung-tsmc-batterygate/
Posted on Reply
#61
newtekie1
Semi-Retired Folder
FordGT90ConceptEdit: It would be best to compare APL0898 to APL1022.
The Apple A9, made by Samsung, to the Apple A9, made by TSMC...

The identical chip, in identical phones, the only difference is 14nm FinFET vs. 16nm FinFET. The Samsung chip clearly uses more power. It doesn't get more identical than that.
XzibitIt's a 3 watt difference

It could be that the reviewer they keep quoting got a high-leakage one. Bad luck.
It isn't just one reviewer; the results have been confirmed all over the net. Even Apple confirmed there is a 2-3% difference in normal use.

It also isn't a 3 watt difference; not sure where that came from... I don't even think these chips use 3 watts at full load.
Posted on Reply
#62
Xzibit
newtekie1It isn't just one reviewer; the results have been confirmed all over the net. Even Apple confirmed there is a 2-3% difference in normal use.

It also isn't a 3 watt difference; not sure where that came from... I don't even think these chips use 3 watts at full load.
I was wrong; I was looking at something else. Serves me right for playing around with 4K scaling on a 24" monitor. It was tolerable on a 27", though.
Posted on Reply
#63
FordGT90Concept
"I go fast!1!11!1!"
If the chips are truly identical in design, you'd think the only way they'd get ~2% different performance is if the Samsung part is running at a ~2% lower clock (e.g. 1.813 GHz). Even so, that's not good on the power consumption side of things.
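For reference, a quick sketch of that clock arithmetic (assuming a nominal A9 clock of about 1.85 GHz):

```python
# Quick sketch of the clock arithmetic above, assuming the A9's ~1.85 GHz nominal clock.
nominal_ghz = 1.85
reduced_ghz = nominal_ghz * 0.98         # a ~2% lower clock
print(f"{nominal_ghz} GHz x 0.98 = {reduced_ghz:.3f} GHz")   # ~1.813 GHz
```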
Posted on Reply
#64
medi01
So AMD going with Samsung isn't really confirmed yet.

Remember that in the case of Apple's chips, 16nm TSMC is significantly better (I recall 20%-ish less power consumption, correct me if I'm wrong) than 14nm Samsung.
arbiterI think AMD should stick to comparing to their own cards instead of NVIDIA's. If you look on NVIDIA's site, they just compare their cards to their own cards, not AMD's.
Logical reasoning is strong within you.

AMD's power consumption was badmouthed into oblivion even though the mainstream cards were nowhere near as bad; e.g. the 380X consumed about 20% more total power while also being faster.
If they only compared to their own cards, the uneducated public wouldn't figure out that it is much better than NVIDIA's.
lilhasselhofferI've got to ask a fundamental question here. Do more colors really matter?
It's also brightness/contrast, so, yep, it does.
Posted on Reply
#65
deemon
lilhasselhofferI've got to ask a fundamental question here. Do more colors really matter?
(images comparing simulated 8-bit vs. 10-bit color gradients)
Posted on Reply
#66
newtekie1
Semi-Retired Folder
deemon(images comparing simulated 8-bit vs. 10-bit color gradients)
Except it isn't actually that bad; those are extreme exaggerations, not what it really looks like. The funny thing is, every single one of those images is an 8-bit image...
Posted on Reply
#67
Xzibit
newtekie1Except it isn't actually that bad; those are extreme exaggerations, not what it really looks like. The funny thing is, every single one of those images is an 8-bit image...
Those are simulated images to illustrate the difference.

A 10-bit image will show banding on an 8-bit monitor. Most people, if they haven't recently bought a monitor, are likely on a 6-bit+FRC panel and don't even know it.
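A quick way to see why the banding appears at lower bit depths is to simulate the quantization directly (an illustrative sketch only, not a model of any particular monitor's pipeline):

```python
# Illustrative simulation of banding: quantize a smooth horizontal gradient to
# 6-, 8- and 10-bit per-channel precision and count the distinct shades left.
import numpy as np

width = 1920
gradient = np.linspace(0.0, 1.0, width)       # ideal, continuous ramp

def quantize(signal, bits):
    levels = 2 ** bits
    return np.round(signal * (levels - 1)) / (levels - 1)

for bits in (6, 8, 10):
    q = quantize(gradient, bits)
    distinct = len(np.unique(q))              # how many visible bands remain
    print(f"{bits}-bit: {distinct} distinct shades across {width} px "
          f"(~{width / distinct:.1f} px per band)")
```

At 6 bits per channel the 1920-pixel ramp collapses into roughly 30-pixel-wide bands; at 10 bits the bands are under 2 pixels wide and effectively disappear.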
Posted on Reply
#68
lilhasselhoffer
deemon(images comparing simulated 8-bit vs. 10-bit color gradients)
Help me understand here.

First, I'm going to give you the fact that the monitor I'm looking at right now isn't 10-bit, so the premise on its face is silly. If it can render the "better" image, then 8-bit is capable of it now.

Now, what you've shown is banding. The color bands you've shown represent only 26 distinct colors (and that's really pushing it). 26 distinct color values would be 3 bit color (approximately) if my math isn't off. 26/3 = 8.67, two cubed is 8. The difference between 8 bit and 10 bit above is then akin to the difference between 3 bit and 8 bit. I'm having an immensely difficult time buying that little bit of math.

Moreover, you've started equating grayscale with color scale. Sorry, but that's silly too. If you're including grayscale as a measure then your 8 bit colors are now 3(2^8)*2^8 = 4(2^8). Can you seriously tell that many distinct colors apart? I'll give you a hint, biology says you can't.


So what I'm seeing in the "10 bit" color spectrum is the 8 bit spectrum I'd already be able to see. There is some minor banding, but not really enough to notice unless intensely focused upon. If I already can't tell the difference between color values, what exactly is the difference between having more of them to utilize?


What you've demonstrated is marketing BS, that people would use to sell me a TV. This is like arguing that a TV is better because it supports a refresh rate of 61 Hz, above a TV with 60 Hz. While technically correct, the difference is entirely unappreciable with the optical hardware we were born with. I can see the difference between 24 and 48 Hz resolutions (thank you Peter Jackson). I can't tell the difference between the RGB color of 1.240.220 and 1.239.220. I don't see how adding an extra couple of values between colors I already can't differentiate is particularly useful.


This is why I'm asking why 10 bit color is useful. It's another specification that makes precious little sense when you understand the math, and even less when you understand the biology. Tell me, did you buy it when people said adding a yellow LED to the TV screen produced "better" yellows (Sharp's Aquos)? Did you suddenly rush out and get a new data standard that included an extra value to accommodate that new LED? I'd say no. I'd say that it was a cheap tactic to sell a TV on a feature that was impossible to discern. That's largely what 10 bit is to me, a feature that can't reasonably improve my experience, but that I will be told is why I should buy this new thing.

Please sell me a GPU that can run two 1080p monitors with all of the eye candy on high, not a chunk of silicon which requires a couple of thousand dollars of monitor replacement to potentially be slightly better than what I've got now. Especially not when the potential improvement is functionally impossible for my eyeballs to actually see.
Posted on Reply
#69
deemon
lilhasselhofferHelp me understand here.

First, I'm going to give you the fact that the monitor I'm looking at right now isn't 10-bit, so the premise on its face is silly. If it can render the "better" image, then 8-bit is capable of it now.

Now, what you've shown is banding. The color bands you've shown represent only 26 distinct colors (and that's really pushing it). 26 distinct color values would be 3 bit color (approximately) if my math isn't off. 26/3 = 8.67, two cubed is 8. The difference between 8 bit and 10 bit above is then akin to the difference between 3 bit and 8 bit. I'm having an immensely difficult time buying that little bit of math.

Moreover, you've started equating grayscale with color scale. Sorry, but that's silly too. If you're including grayscale as a measure then your 8 bit colors are now 3(2^8)*2^8 = 4(2^8). Can you seriously tell that many distinct colors apart? I'll give you a hint, biology says you can't.


So what I'm seeing in the "10 bit" color spectrum is the 8 bit spectrum I'd already be able to see. There is some minor banding, but not really enough to notice unless intensely focused upon. If I already can't tell the difference between color values, what exactly is the difference between having more of them to utilize?


What you've demonstrated is marketing BS, that people would use to sell me a TV. This is like arguing that a TV is better because it supports a refresh rate of 61 Hz, above a TV with 60 Hz. While technically correct, the difference is entirely unappreciable with the optical hardware we were born with. I can see the difference between 24 and 48 Hz resolutions (thank you Peter Jackson). I can't tell the difference between the RGB color of 1.240.220 and 1.239.220. I don't see how adding an extra couple of values between colors I already can't differentiate is particularly useful.


This is why I'm asking why 10 bit color is useful. It's another specification that makes precious little sense when you understand the math, and even less when you understand the biology. Tell me, did you buy it when people said adding a yellow LED to the TV screen produced "better" yellows (Sharp's Aquos)? Did you suddenly rush out and get a new data standard that included an extra value to accommodate that new LED? I'd say no. I'd say that it was a cheap tactic to sell a TV on a feature that was impossible to discern. That's largely what 10 bit is to me, a feature that can't reasonably improve my experience, but that I will be told is why I should buy this new thing.

Please sell me a GPU that can run two 1080p monitors with all of the eye candy on high, not a chunk of silicon which requires a couple of thousand dollars of monitor replacement to potentially be slightly better than what I've got now. Especially not when the potential improvement is functionally impossible for my eyeballs to actually see.
Are you really so obtuse that you cannot understand that those pictures, although OBVIOUSLY being 8-bit in nature, are only meant to illustrate the kind of difference an 8-bit and a 10-bit picture would have if you put a 10-bit monitor next to your 8-bit one and looked at a 10-bit picture on it (vs. an 8-bit picture on your 8-bit monitor)? Really?

(24Hz and 48Hz are not resolutions, but refresh rates. Yes, I cannot see a difference between 1,240,220 and 1,239,220 either, but I can see a clear difference between 0,128,0 and 0,129,0, for example. Also, when I make a gradient from 0,0,0 to 255,255,255 on my monitor, I do see distinct lines between colors - true, some blend more smoothly than others, but there are still too many visible lines in the picture and the gradient is not smooth. I don't buy a TV because it has better stats on paper - I have to see the difference myself, and it has to be big enough to convince me - quite often it has been clearly visible. In the same way, I want to see a 10-bit picture with my own eyes in a shop, compared to 8-bit, before I buy anything (unless of course the price difference is so small that it doesn't matter anyway, but I doubt that will be the case when consumer 10-bit displays launch). You mistake me for a hardware vendor - I am not (read: I don't sell you anything, no new displays, no new GPUs, nor new eyeballs that can actually see different colors).)
Posted on Reply
#70
Frick
Fishfaced Nincompoop
deemonAre you really so obtuse that you cannot understand that those pictures, although OBVIOUSLY being 8-bit in nature, are only meant to illustrate the kind of difference an 8-bit and a 10-bit picture would have if you put a 10-bit monitor next to your 8-bit one and looked at a 10-bit picture on it (vs. an 8-bit picture on your 8-bit monitor)? Really?

(24Hz and 48Hz are not resolutions, but refresh rates. Yes, I cannot see a difference between 1,240,220 and 1,239,220 either, but I can see a clear difference between 0,128,0 and 0,129,0, for example. Also, when I make a gradient from 0,0,0 to 255,255,255 on my monitor, I do see distinct lines between colors - true, some blend more smoothly than others, but there are still too many visible lines in the picture and the gradient is not smooth. I don't buy a TV because it has better stats on paper - I have to see the difference myself, and it has to be big enough to convince me - quite often it has been clearly visible. In the same way, I want to see a 10-bit picture with my own eyes in a shop, compared to 8-bit, before I buy anything (unless of course the price difference is so small that it doesn't matter anyway, but I doubt that will be the case when consumer 10-bit displays launch). You mistake me for a hardware vendor - I am not (read: I don't sell you anything, no new displays, no new GPUs, nor new eyeballs that can actually see different colors).)
Obviously he isn't; he questions whether there is such a huge difference between 8 and 10 bits, and the answer, as you hint, will depend on the person.
Posted on Reply
#71
deemon
FrickObviously he isn't; he questions whether there is such a huge difference between 8 and 10 bits, and the answer, as you hint, will depend on the person.
Time will tell. I did see a clear difference when going from monochrome (1-bit) to CGA (2-bit), then again when I moved to EGA (4-bit)... then again when I moved to VGA (8-bit)... then again when I moved to "TrueColor" (3 × 8-bit)... I don't see why this would be any different this time.
Posted on Reply
#72
Xzibit
FrickObviously he isn't; he questions whether there is such a huge difference between 8 and 10 bits, and the answer, as you hint, will depend on the person.
There are more variables to it. Web images are in 8-bit. Monitor gamut coverage varies a lot, ranging from 65/72/75/92/98/99/100+ percent. The average monitor is between 65-75%. As an example, newtekie1's is 72%.
Posted on Reply
#73
medi01
As I said earlier, it's more about contrast than color banding, and, yep, it does make one hell of a difference.

  High-dynamic-range imaging (HDRI or HDR) is a technique used in imaging and photography to reproduce a greater dynamic range of luminosity than is possible with standard digital imaging or photographic techniques.

  The aim is to present the human eye with a similar range of luminance as that which, through the visual system, is familiar in everyday life. The human eye, through adaptation of the iris (and other methods) adjusts constantly to the broad dynamic changes ubiquitous in our environment. The brain continuously interprets this information so that most of us can see in a wide range of light conditions. Most cameras, on the other hand, cannot.

  HDR images can represent a greater range of luminance levels than can be achieved using more 'traditional' methods, such as many real-world scenes containing very bright, direct sunlight to extreme shade, or very faint nebulae.
In other words:
  • Expanded color gamut ("no banding" is only about color accuracy!!!)
  • Higher contrast (rough illustration below)
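To make "higher contrast" concrete, dynamic range can be expressed in photographic stops (doublings of luminance). The figures below are illustrative assumptions, not measurements of any product:

```python
# Illustrative numbers only (not measurements of any product): dynamic range
# expressed in photographic stops, i.e. doublings of luminance.
from math import log2

displays = {
    "typical SDR monitor (~0.5 to 250 nits)": (0.5, 250.0),
    "HDR-capable display (~0.05 to 1000 nits)": (0.05, 1000.0),
}

for name, (black, peak) in displays.items():
    print(f"{name}: ~{log2(peak / black):.1f} stops")
```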
Posted on Reply
#74
lilhasselhoffer
deemonAre you really so obtuse that you cannot understand that those pictures, although OBVIOUSLY being 8-bit in nature, are only meant to illustrate the kind of difference an 8-bit and a 10-bit picture would have if you put a 10-bit monitor next to your 8-bit one and looked at a 10-bit picture on it (vs. an 8-bit picture on your 8-bit monitor)? Really?

(24Hz and 48Hz are not resolutions, but refresh rates. Yes, I cannot see a difference between 1,240,220 and 1,239,220 either, but I can see a clear difference between 0,128,0 and 0,129,0, for example. Also, when I make a gradient from 0,0,0 to 255,255,255 on my monitor, I do see distinct lines between colors - true, some blend more smoothly than others, but there are still too many visible lines in the picture and the gradient is not smooth. I don't buy a TV because it has better stats on paper - I have to see the difference myself, and it has to be big enough to convince me - quite often it has been clearly visible. In the same way, I want to see a 10-bit picture with my own eyes in a shop, compared to 8-bit, before I buy anything (unless of course the price difference is so small that it doesn't matter anyway, but I doubt that will be the case when consumer 10-bit displays launch). You mistake me for a hardware vendor - I am not (read: I don't sell you anything, no new displays, no new GPUs, nor new eyeballs that can actually see different colors).)
Allow me to quote what I said to you.
"Now, what you've shown is banding. The color bands you've shown represent only 26 distinct colors (and that's really pushing it). 26 distinct color values would be 3 bit color (approximately) if my math isn't off. 26/3 = 8.67, two cubed is 8. The difference between 8 bit and 10 bit above is then akin to the difference between 3 bit and 8 bit. I'm having an immensely difficult time buying that little bit of math."

Let's do the math here. The spectrum is functionally continuous and adding extra bits to color information doesn't produce stronger base colors, so let's look at what fraction of the spectrum each step represents at a given bit depth:
2^1 = 1 bit, 50%
2^2 = 2 bit, 25%
2^3 = 3 bit, 12.5%
2^4 = 4 bit, 6.25%
2^5 = 5 bit, 3.125%
2^6 = 6 bit, 1.5625%
2^7 = 7 bit, 0.78125%
2^8 = 8 bit, 0.390625%
2^9 = 9 bit, 0.1953125%
2^10 = 10 bit, 0.0976562%

That means, according to your cited differences, that a step down from 12.5% to 0.390625% (delta = 12.109) is the same as a step down from 0.39% to 0.09% (delta = 0.2929). Those numbers don't lie, and I call bullshit on them. Your infographic is attempting to convey differences that aren't in any way representative of reality. If you'd like to contest that, please point out exactly where the error in my mathematics lies. I'll gladly change my opinion if I've somehow missed the point somewhere.


In short, what I'm telling you is that your example is crap, which is what I thought I said clearly above. Everything I've observed from 10 bit is functionally useless, though you're free to claim otherwise. What I have observed is that variations in intensity and increases in frame rate make a picture objectively better. HDR (the intensity of colors being pushed out), pixel count (the standard "more monitor is good" argument), and frame rate (smoother motion) are therefore demonstrably what cards should be selling themselves on (if they want my money), not making a rainbow slightly more continuous in its color spectrum. Unless you missed the title of the thread, this is still about Polaris.


So we're clear, I have no personal malice here. I hate it when companies market new technology on BS promises, like the yellow LED and 10 bit being an enormous improvement over 8 bit. In the case of yellow LEDs, I've literally never been able to determine a difference. In the case of 10 bit color, I've yet to see a single instance where slightly "off" colors, due to not having an extra couple of shades of one color, are going to influence me more than dropping from a 30 Hz refresh to a 25 Hz refresh. If you are of the opposite opinion, I'm glad to allow you to entertain it. I just hope you understand how much of an investment you're going to have to make for that slight improvement, while the other side of it requires significantly less cost to see an appreciable improvement.
Posted on Reply
#75
FordGT90Concept
"I go fast!1!11!1!"
Here's a demo:

At 9-bit, there would be a 254.5 step between 254 and 255. At 10-bit, there would be 254.25, 254.5, and 254.75 steps between 254 and 255. Of course it doesn't work like that in binary, but in practice that's the difference.

Whether the eyes can actually perceive the difference at 8-bit is highly subjective. I can see the transition line between 254 and 253, but I can't really make out the other two. I think 9-bit would be good, but I'm leaning towards the idea that 10-bit is excessive.
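The same point, sketched as code (expressing 9- and 10-bit code values on the familiar 0-255 scale):

```python
# Sketch of the point above: 9- and 10-bit code values expressed on the familiar
# 0-255 scale, showing the extra intermediate steps between 254 and 255.
def steps_between(low, high, bits):
    """Code values of a `bits`-deep channel that fall between two 8-bit levels."""
    scale = 2 ** (bits - 8)                  # 2 for 9-bit, 4 for 10-bit
    return [code / scale for code in range(low * scale, high * scale + 1)]

print(" 9-bit:", steps_between(254, 255, 9))    # [254.0, 254.5, 255.0]
print("10-bit:", steps_between(254, 255, 10))   # [254.0, 254.25, 254.5, 254.75, 255.0]
```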
Posted on Reply