Monday, January 18th 2016

AMD Working on Two "Polaris" GPUs: Raja Koduri

AMD Radeon Technologies Group (RTG) head Raja Koduri, in an interview with Venture Beat, confirmed that the company is currently working on two 14 nm FinFET GPUs based on the "Polaris" (4th generation Graphics Core Next) architecture. He was quoted as referring to the two chips as "Polaris 10" and "Polaris 11," and remarked that the two chips are "extremely power efficient."

Koduri walked Venture Beat through what's new with these chips besides the 14 nm process and GCN 4.0 stream processors: a redesigned front-end, new geometry processors, a new multimedia engine, and new display controllers. GCN 4.0 gives the chips up-to-date API support alongside significantly higher performance, the new multimedia engine features native H.265 hardware acceleration, and the display controllers support the latest DisplayPort 1.3 and HDMI 2.0a connectors.
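To put those display outputs in perspective, here is a rough back-of-the-envelope check (not from the article) of whether a given mode fits a link's effective bandwidth; the CEA-861 4K60 timing and the 8b/10b-derived effective link rates are standard published figures, and the helper itself is only an illustrative sketch.

```python
# Rough sketch: does a display mode fit within a link's effective bandwidth?
# Effective rates assume 8b/10b encoding: HDMI 2.0 = 18.0 * 0.8 = 14.4 Gbit/s,
# DisplayPort 1.3 (HBR3, 4 lanes) = 32.4 * 0.8 = 25.92 Gbit/s.

EFFECTIVE_GBPS = {
    "HDMI 2.0": 14.4,
    "DisplayPort 1.3": 25.92,
}

def required_gbps(h_total: int, v_total: int, refresh_hz: float, bits_per_pixel: int) -> float:
    """Bandwidth a mode needs, with blanking intervals included in the totals."""
    pixel_clock_hz = h_total * v_total * refresh_hz
    return pixel_clock_hz * bits_per_pixel / 1e9

# Standard CEA-861 timing for 3840x2160 @ 60 Hz: 4400 x 2250 total pixels, 24 bpp (8-bit RGB).
need = required_gbps(4400, 2250, 60, 24)  # ~14.26 Gbit/s
for link, cap in EFFECTIVE_GBPS.items():
    print(f"4K60 8-bit needs {need:.2f} Gbit/s -> fits {link} ({cap} Gbit/s): {need <= cap}")
```

That thin margin is why HDMI 2.0 tops out around 4K60 with 8-bit color, while DisplayPort 1.3's larger budget leaves headroom for higher refresh rates or deeper color.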
Source: Venture Beat

37 Comments on AMD Working on Two "Polaris" GPUs: Raja Koduri

#1
john_
Two GPUs can't cover the whole market, so to begin with AMD is giving us a small GPU for those who want something power-efficient and good enough for medium-quality 1080p gaming, plus a successor to the Fury X to regain the top spot. The old GPUs will probably remain as rebrands for the sub-$100 market and the price points between $200 and $600.
Posted on Reply
#2
HumanSmoke
john_ said: Two GPUs can't cover the whole market, so to begin with AMD is giving us a small GPU for those who want something power-efficient and good enough for medium-quality 1080p gaming, plus a successor to the Fury X to regain the top spot. The old GPUs will probably remain as rebrands for the sub-$100 market and the price points between $200 and $600.
Probably. You would think a high-end GPU would be a priority, if only to maintain marketing momentum, although I suspect it is most definitely needed for the VR push. The other GPU would almost certainly, IMO, be a Pitcairn/Curacao/Trinidad replacement, since that part is getting pretty long in the tooth and doesn't have great support for many of the newer GCN features.
Posted on Reply
#3
Orijin16
Two GPUs cover the high end and upper mid-range if they just disable a few cores, or use GDDR5 on one and HBM on the other, leaving rebrands for the low end / lower mid-range.

When I first heard about Polaris going all-out on efficiency I was a bit concerned, because efficiency and processing grunt don't often go hand in hand, but they did a pretty good job coming close to Maxwell's efficiency with Fiji despite still being on a much larger process node, so shrinking down to 14 nm should give them some pretty good performance per watt even without all the other design tweaks.
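To put rough numbers on that perf-per-watt point (the figures below are made-up placeholders, not real benchmark or power numbers), even a modest performance bump combined with a big cut in board power multiplies efficiency:

```python
# Toy perf/W comparison. All figures are hypothetical placeholders for illustration only.

def perf_per_watt(avg_fps: float, board_power_w: float) -> float:
    return avg_fps / board_power_w

old_card = perf_per_watt(avg_fps=60.0, board_power_w=275.0)  # hypothetical 28 nm part
new_card = perf_per_watt(avg_fps=66.0, board_power_w=150.0)  # hypothetical 14 nm part

print(f"old: {old_card:.3f} fps/W, new: {new_card:.3f} fps/W, "
      f"gain: {new_card / old_card:.2f}x")
```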

I'm pretty excited to see what they release, although having said that, I was excited about Fiji too, which turned out to be slightly disappointing and could/should have been much better than it is.
Posted on Reply
#4
julizs
Two new GPUs will be enough.

1. R9 490X
R9 490 (disable some cores)

2. R9 480X
R9 480 (disable some cores)

Rebrand the rest if you need to, not interested in those anyway...
Posted on Reply
#5
NC37
julizs said: Two new GPUs will be enough.

1. R9 490X
R9 490 (disable some cores)

2. R9 480X
R9 480 (disable some cores)

Rebrand the rest if you need to, not interested in those anyway...
Sadly this will likely be so.

Of course, they only really need two if they get it right. Heck, in the old days ATI got by on about that many at times.
Posted on Reply
#6
julizs
NC37 said: Sadly this will likely be so.

Of course, they only really need two if they get it right. Heck, in the old days ATI got by on about that many at times.
Well, it can only get better.


The R9 300 series was initially a big disappointment: too many rebrands, and the new cards came way too late and were too expensive.
IMO the 900 series was also a disappointment: the 960 is a turd, the 970 was good performance per dollar but turned out to have only 3.5 GB of full-speed RAM and also had a buzzing (coil whine) problem, and the 980 was 200€ more than the 970 for almost identical performance. The 980 Ti later on was cool.


My expectations are at an all-time low, but I think both AMD and NVIDIA are gonna deliver this time.
Posted on Reply
#7
micropage7
Personally, I want a lower-power-consumption card with mid-range performance.
Just wait...
Posted on Reply
#8
RejZoR
I think I'll be switching to AMD again when this comes out. These Polaris GPUs sound amazing! A proper evolution of what the R9 Fury X should have been when it was released.
Posted on Reply
#9
Aquinus
Resident Wat-man
GCN 4.0? Wasn't the last version 1.3, not 3.0?
Posted on Reply
#10
medi01
Did they ever have more than two?

Wasn't it even just one back in the 4xxx days (with an X2 for the flagship)?
Posted on Reply
#11
okidna
Aquinus said: GCN 4.0? Wasn't the last version 1.3, not 3.0?
There's a difference between AMD's internal naming scheme and the press/tech-site naming scheme for GCN. AMD uses a "generation" naming scheme, while the press and tech sites prefer an "x.x" version scheme.

Each has a reason for its preference: the press and tech sites consider GCN's progression an evolution or incremental update, hence the small increments in the version number (1.0, 1.1, 1.2), while AMD considers every update a milestone worthy of a "new generation" name. The mapping is listed below, and restated as a quick lookup right after the list.

GCN 1.0 = GCN 1st generation (Southern Islands)
GCN 1.1 = GCN 2nd generation (Sea Islands)
GCN 1.2 and 1.3 (if you consider Fiji) = GCN 3rd generation (Volcanic Islands)
GCN ?.? (probably 1.3/1.4 or 2.0) = GCN 4th generation (Arctic Islands)
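Here is that same cross-reference as a minimal Python lookup-table sketch; the codenames are exactly those listed above, nothing new is added:

```python
# GCN naming cross-reference, restating the mapping above.
# Press "x.x" label -> (AMD generation, family codename)
GCN_NAMES = {
    "GCN 1.0": ("1st generation", "Southern Islands"),
    "GCN 1.1": ("2nd generation", "Sea Islands"),
    "GCN 1.2": ("3rd generation", "Volcanic Islands"),  # some sites call Fiji 1.3
    "GCN ?.?": ("4th generation", "Arctic Islands"),
}

for press_name, (generation, family) in GCN_NAMES.items():
    print(f"{press_name:8} = GCN {generation} ({family})")
```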
Posted on Reply
#12
Deeveo
Don't they usually start with smaller chips on a new process node, especially when it's combined with a new (improved?) architecture? Either way, I really hope both succeed and we see some fierce competition on performance to keep those prices in check.
Posted on Reply
#13
jabbadap
HumanSmoke said: Probably. You would think a high-end GPU would be a priority, if only to maintain marketing momentum, although I suspect it is most definitely needed for the VR push. The other GPU would almost certainly, IMO, be a Pitcairn/Curacao/Trinidad replacement, since that part is getting pretty long in the tooth and doesn't have great support for many of the newer GCN features.
Yep, AMD needs a new low/mid-range GPU. Think about notebooks: most notebook discrete GPUs right now are Maxwell-based. Pitcairn, Bonaire and Tonga (and their respective rebrand names) can't really compete with Maxwell on perf/W.
Posted on Reply
#14
Moofachuka
That's the same as Fury... there was a regular Fury and a Fury X. They don't need to work on the low end, since the current generation will move down to the lower range... (which, in other words, is rebranding).
Posted on Reply
#15
deemon
Notebooks will use APUs, not GPUs.
Posted on Reply
#16
xfia
deemon said: Notebooks will use APUs, not GPUs.
The high end of the mid-range is definitely Carrizo. The top-tier R7 model is just as good as the game consoles at 1080p.
Posted on Reply
#17
jabbadap
deemon said: Notebooks will use APUs, not GPUs.
I said discrete. APUs have an iGPU, but even many AMD APU notebooks ship with a discrete Radeon alongside it (Hybrid CrossFire or not). And do you suggest it's wise for AMD to leave every Intel-inside laptop to NVIDIA?

And the idea that iGPUs will rule the world is just a myth. It's a moving target: game graphics will evolve to require better and better GPUs, which a tiny iGPU can't handle.
Posted on Reply
#18
GhostRyder
I guess by that standard we will get four total cards on the new architecture. Well, that is all fine and dandy as long as we get some serious updates this round with the new lineup. We need cards that will have 8 GB of HBM, newly supported tech like DP 1.3 and such, along with (hopefully) a little better overclocking. I will be purchasing this year based on what's released, so I'll pick my cards based on who has the top dog.
Posted on Reply
#19
xfia
jabbadap said: I said discrete. APUs have an iGPU, but even many AMD APU notebooks ship with a discrete Radeon alongside it (Hybrid CrossFire or not). And do you suggest it's wise for AMD to leave every Intel-inside laptop to NVIDIA?

And the idea that iGPUs will rule the world is just a myth. It's a moving target: game graphics will evolve to require better and better GPUs, which a tiny iGPU can't handle.
AMD throws down 3x the graphics performance of Intel when it comes to APUs... all that Intel-inside stuff with no NVIDIA GPU to back it up is kinda garbage. On top of that, HSA is practically made to be best friends with DX12. If they build both sides of the next APUs on 14 nm FinFET, the performance increase will certainly bring them up another tier... if not two.
Posted on Reply
#20
jabbadap
xfia said: AMD throws down 3x the graphics performance of Intel when it comes to APUs... all that Intel-inside stuff with no NVIDIA GPU to back it up is kinda garbage.
Iris Pro in notebooks is quite similar, performance-wise, to the most powerful Carrizo iGPU, so I don't think your graphics-performance figure is up to date. AMD APUs need more memory bandwidth, while Intel uses dedicated eDRAM to address its memory-bandwidth problems. Next-gen APUs will hopefully address this with HBM.
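To put rough numbers on that bandwidth gap, here is a minimal sketch using peak theoretical figures; the single HBM1 stack is purely an illustrative assumption about what an HBM-fed APU might get, not an announced configuration:

```python
# Peak theoretical memory bandwidth in GB/s. DDR formula: MT/s * bus_bytes * channels.
def ddr_bandwidth_gbs(mt_per_s: float, bus_bytes: int = 8, channels: int = 2) -> float:
    return mt_per_s * bus_bytes * channels / 1000

ddr3_2133_dual = ddr_bandwidth_gbs(2133)  # ~34.1 GB/s, what a dual-channel DDR3-2133 APU gets
hbm1_per_stack = 128.0                    # GB/s per first-gen HBM stack (Fiji uses four of them)

print(f"dual-channel DDR3-2133: {ddr3_2133_dual:.1f} GB/s")
print(f"one HBM1 stack:         {hbm1_per_stack:.0f} GB/s "
      f"({hbm1_per_stack / ddr3_2133_dual:.1f}x the DDR3 figure)")
```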
Posted on Reply
#21
xfia
jabbadap said: Iris Pro in notebooks is quite similar, performance-wise, to the most powerful Carrizo iGPU, so I don't think your graphics-performance figure is up to date. AMD APUs need more memory bandwidth, while Intel uses dedicated eDRAM to address its memory-bandwidth problems. Next-gen APUs will hopefully address this with HBM.
Yeah, I read a leak, but with the 35 W power target considered, eh... and best-in-class online gaming:
www.notebookcheck.net/AMD-Radeon-R7-Carrizo-Benchmarks.144288.0.html
Still, though, I expect FinFET to supercharge AMD APUs:
www.forbes.com/sites/patrickmoorhead/2015/06/11/for-advanced-micro-devices-it-was-all-about-carrizo-at-computex-2015/#2715e4857a0b16c05f4b3d7a
Posted on Reply
#22
jabbadap
xfia said: Yeah, I read a leak, but with the 35 W power target considered, eh...
www.notebookcheck.net/AMD-Radeon-R7-Carrizo-Benchmarks.144288.0.html
Still, though, I expect FinFET to supercharge AMD APUs.
Well yeah, 14 nm APUs will be the thing. They can put more shaders in the iGPU, use HBM memory, and possibly Zen CPU cores (hopefully they don't just wait for Zen to be ready if it takes too much time; a die-shrunk Excavator could be a good filler product).

Heh, I know Intel and AMD watts are different, but a 15 W CPU (the Intel Core i7-6650U from the MS Surface Pro 4):
www.notebookcheck.net/Intel-Iris-Graphics-540.149939.0.html
Posted on Reply
#23
xfia
I keep seeing the commercials for the Surface Pro 4. I love the new pen and how it folds up. I would get one for some video editing... Netflix on the go and stuff, with a Lumia to match. Off topic, but I'm tired of Android...
Posted on Reply
#24
Deeveo
An APU with 8 GB of HBM could be a thing of wonder; you could make a soldered version on a motherboard with no memory slots at all. El cheapo system, eh?
Posted on Reply
#25
xfia
Deeveo said: An APU with 8 GB of HBM could be a thing of wonder; you could make a soldered version on a motherboard with no memory slots at all. El cheapo system, eh?
I think that's exactly what the next Xbox and PlayStation will have... they need HBM to drive 4K the way they want to. I also think the XB1 and PS4 will be thought of as the 1080p systems for the same games for a while, until devs start taking advantage of the new features the new consoles will be packed with.
Posted on Reply