Thursday, April 21st 2016

AMD's GPU Roadmap for 2016-18 Detailed

AMD has finalized its GPU architecture roadmap for 2016 through 2018, which the company first detailed at its Capsaicin event in mid-March 2016. Under it, the upcoming "Polaris" architecture makes major leaps over the current generation, such as a 2.5-times performance/Watt uplift and the company's first 14 nanometer GPUs, yet will have only a limited presence in the high-end graphics space. Polaris is rumored to drive graphics for Sony's upcoming 4K Ultra HD PlayStation, and as discrete GPUs it will feature in only two chips - Polaris 10 "Ellesmere" and Polaris 11 "Baffin."

"Polaris" introduces several new features, such as HEVC (H.265) decode and encode hardware-acceleration, and new display output standards such as DisplayPort 1.3 and HDMI 2.0. However, since neither Polaris 10 nor Polaris 11 is a really "big" enthusiast chip succeeding the current "Fiji" silicon, the two will likely make do with current GDDR5/GDDR5X memory standards; first-generation HBM, after all, limits the total memory amount to 4 GB over a 4096-bit path. That's not to say that Polaris 10 won't disrupt current performance-thru-enthusiast lineups, or even have the chops to take on NVIDIA's GP104. Enthusiasts will have to wait until early-2017 for the introduction of the big chip that succeeds "Fiji," which will not only leverage HBM2 to serve up vast amounts of super-fast memory, but also feature a slight architectural uplift. 2018 will see the introduction of its successor, codenamed "Navi," which features an even faster memory interface.
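The trade-off between a wide HBM interface and conventional GDDR5 comes down to simple arithmetic: peak bandwidth is the bus width (in bytes) multiplied by the per-pin data rate. A minimal sketch, assuming first-generation HBM's 4096-bit, 1 Gb/s-per-pin configuration and a typical 384-bit, 7 Gb/s GDDR5 setup for comparison (the function name and the GDDR5 figures are illustrative, not from the article):

```python
# Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin data rate in Gb/s.
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps_per_pin):
    """Theoretical peak bandwidth in GB/s (ignores real-world overheads)."""
    return bus_width_bits / 8 * data_rate_gbps_per_pin

# First-generation HBM as on "Fiji": 4096-bit interface at 1 Gb/s per pin (500 MHz DDR)
hbm1 = peak_bandwidth_gbs(4096, 1.0)   # 512.0 GB/s
# A typical high-end GDDR5 configuration: 384-bit interface at 7 Gb/s per pin
gddr5 = peak_bandwidth_gbs(384, 7.0)   # 336.0 GB/s
print(f"HBM1: {hbm1} GB/s, GDDR5: {gddr5} GB/s")
```

Despite the bandwidth lead, the four-stack HBM1 configuration caps capacity at 4 GB, which is the limitation the HBM2-based "Fiji" successor is expected to lift.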
Source: VideoCardz

43 Comments on AMD's GPU Roadmap for 2016-18 Detailed

#1
RejZoR
Next gen memory? I know memory bandwidth plays a role in all this, but I don't think it's that big compared to GPUs actually being able to process all the data.
#2
FordGT90Concept
"I go fast!1!11!1!"
Yeah, HBM2 already being replaced by 2018 seems ridiculous.

I'm surprised they aren't calling it HDMI 2.0a (supports FreeSync) too. I guess they want to get the message out that Polaris will have HDMI 2 because Fiji not having it was pretty disappointing.
#3
Xzibit
FordGT90Concept said:
Yeah, HBM2 already being replaced by 2018 seems ridiculous.

I'm surprised they aren't calling it HDMI 2.0a (supports FreeSync) too. I guess they want to get the message out that Polaris will have HDMI 2 because Fiji not having it was pretty disappointing.
HDMI 1.4a+ can use FreeSync, although it's not official.

HDMI plans on officially supporting Dynamic-Sync in HDMI 2.0b.
#5
Ferrum Master
Well... Year 2016 is still a miss for the enthusiast segment...

I feel sorry for those peeps driving 4K screens and wanting to game on them at some reasonable FPS...
#6
RCoon
Gaming Moderator
Neither AMD nor NVIDIA is providing me options to replace my 295x2 for at least 8 months :(
#7
RejZoR
R9 295X2 is a beast. It pisses on GTX 980Ti in pretty much all cases. By a lot.
#8
RCoon
Gaming Moderator
RejZoR said:
R9 295X2 is a beast. It pisses on GTX 980Ti in pretty much all cases. By a lot.
*If you have crossfire profiles
#9
Folterknecht
RejZoR said:
R9 295X2 is a beast. It pisses on GTX 980Ti in pretty much all cases. By a lot.
Except the little inconvenience called CF-profiles or their absence.
#10
AsRock
TPU addict
Ferrum Master said:
Well... Year 2016 is still a miss for the enthusiast segment...

I feel sorry for those peeps driving 4K screens and wanting to game on them at some reasonable FPS...
But that's what happens when someone says jump and they jump instead of asking how high.
#11
Sir Alex Ice
RejZoR said:
R9 295X2 is a beast. It pisses on GTX 980Ti in pretty much all cases. By a lot.
... and how much more watts it burns through?
#12
RejZoR
Sir Alex Ice said:
... and how much more watts it burns through?
Who cares? Are you one of those who would buy a Ferrari and then constantly worry and bitch about the mileage? It's a top-of-the-line liquid-cooled card. The only thing people who buy such cards care about is the framerate they get out of it. Nothing else.
#13
jigar2speed
Sir Alex Ice said:
... and how much more watts it burns through?
An ultra high-end solution and watts?? Do you know the mileage of a Bugatti Veyron?
#14
Caring1
jigar2speed said:
Ultra high-end solution and Watts ?? Do you know the mileage of Bugatti Veyron?
About 5 minutes on a full tank.
#15
Octavean
FordGT90Concept said:


I'm surprised they aren't calling it HDMI 2.0a (supports FreeSync) too. I guess they want to get the message out that Polaris will have HDMI 2 because Fiji not having it was pretty disappointing.
I agree, it was disappointing, and so too IMO was the lack of HEVC decode. It just made the so-called new silicon feel like hardware that had been recycled one too many times without sufficient updates.

I'm glad AMD is going to address this. When I was buying my last video card, I was looking for one to drive a 4K UHD Smart TV which only had HDMI 2.0 inputs. That took AMD out of the running automatically, and I would have preferred a viable choice between AMD and nVidia rather than just nVidia.
#16
the54thvoid
RejZoR said:
R9 295X2 is a beast. It pisses on GTX 980Ti in pretty much all cases. By a lot.
Performance summary shows the terrible weakness of dual card support.

And as for the terrible car analogy bollocks:

Essentially, the 295X2 is the fastest card when all conditions are met.

But those conditions aren't met enough of the time (same for SLI).

Single card FTW.
#17
PP Mguire
Ferrum Master said:
Well... Year 2016 is still a miss for the enthusiast segment...

I feel sorry for those peeps driving 4K screens and wanting to game on them at some reasonable FPS...
It's not that bad. When SLI works, which it does on most AAA titles, 4k is achievable. 3 Titan X's piss all over The Division because it has excellent support even at 4k. Scaling is superb with no microstutter. Other games not so much, but really it all depends. For games that don't support SLI properly we need Pascal and above.
#18
Ferrum Master
PP Mguire said:
It's not that bad. When SLI works, which it does on most AAA titles, 4k is achievable. 3 Titan X's piss all over The Division because it has excellent support even at 4k. Scaling is superb with no microstutter. Other games not so much, but really it all depends. For games that don't support SLI properly we need Pascal and above.
Agree... it is hit or miss as always. What a wonderful world we live in... drop a few K's of dough... and still have to struggle :D
#19
GhostRyder
Well, at least we know some cards are coming out this year. It's just a matter of when, because I want to see the next R9 490X (or, if they stick with the naming scheme, R9 Fury X) to compare to the next-gen Nvidia GTX 1080ti (or whatever it's to be called). Either way, I will be replacing my current cards with two of whatever top end card from either side piques my interest.

the54thvoid said:
Performance summary shows the terrible weakness of dual card support.

And as for the terrible car analogy bollocks:

Essentially, the 295X2 is the fastest card when all conditions are met.

But those conditions aren't met enough of the time (same for SLI).

Single card FTW.
Yea, that's a downside of SLI/CFX, which is why when I do it these days I always grab the top end cards, so at least I have some performance to fall back on when a profile isn't available. Though to be fair, with only a few exceptions, the profiles for the games that really need them come out in a reasonable amount of time.

Sir Alex Ice said:
... and how much more watts it burns through?
You don't buy a high end card and worry about power consumption. Like the examples used above: you buy high performance, you're going to use some watts. Dual GPU aside, you can run any modern GPU off a $50 PSU in this day and age, so it really does not matter unless you need (or want) to run multiple cards.
#20
Ferrum Master
the54thvoid said:
Performance summary shows the terrible weakness of dual card support.
I have to agree too... I often chuckle at those indie game reviews... made by RCoon himself :D. Almost always 50% load :D

Truly, power isn't the issue... space and cold air are... the hotter it gets, the more throttling it can cause, with the cards getting out of sync and stuttering... yet another struggle.

I really hope that DX12 will do something about that... ditch the old AFR as such... the next hope is NVLink... if NVIDIA leaves one link for SLI over that connection, making it possible to access the neighboring card's RAM pool without a latency tax, that would also be something.

All this needs is good coders... the hardware is there as usual... we lack time.
#21
PP Mguire
Some people like me don't mind power consumption from the wall, but more power consumption equates to more heat output. That's bad for me as I hate a hot room.
#22
Ferrum Master
PP Mguire said:
I hate a hot room.
Yet you live in Texas? :eek:
#23
RejZoR
Those graphs are meaningless when you bring in the price. The R9 295X2 was like 3 times cheaper than the GTX 980Ti. If I hadn't had a miniATX case back then, I'd most likely have the R9 295X2. Plus, I generally have very little trust in multi-GPU designs, because they have never really worked well. Maybe DX12 will change that, but I doubt it.
#24
PP Mguire
Ferrum Master said:
Yet you live in Texas? :eek:
Trust me, I'd rather not live here. Taking a trip to Colorado to seek out potential areas to move to. In the meantime, I have a 12,000 BTU window unit coming from Amazon to help cope with the heat this summer. It's gonna be toasty.

RejZoR said:
Those graphs are meaningless when you bring in the price. R9 295X2 was like 3 times cheaper than GTX 980Ti. If I didn't have miniATX case back then, I'd most likely have the R9 295X2. And the fact I generally have very little trust in multi-GPU design, because it has never really worked well. Maybe DX12 will change that, but I doubt it.
You wot m8? The cheapest I've ever seen a 295x2 was 400 bucks used. A 980ti can be had new for 559, all USD. Wouldn't say that's a third the cost of a 980ti.
#25
RejZoR
If the GTX 980Ti was 559, I'd have that and not a vanilla GTX 980 for 650€...