Tuesday, October 30th 2018

AMD Radeon RX 590 Built on 12nm FinFET Process, Benchmarked in Final Fantasy XV

Thanks to some photographs by Andreas Schilling of HardwareLuxx, it is now confirmed that AMD's Radeon RX 590 will make use of the 12 nm FinFET process. The change from 14 nm to 12 nm FinFET for the RX 590 brings with it the possibility of both higher clock speeds and better power efficiency. That said, considering it is based on the same Polaris architecture used in the Radeon RX 580 and 570, it remains to be seen how it will impact AMD's pricing with regard to the product stack. Will there be a price drop to compensate, or will the RX 590 be more expensive? Since AMD has already made things confusing enough with its cut-down 2048 SP version of the RX 580 in China, anything goes at this point.

Thanks to the Final Fantasy XV Benchmark being updated with results for the RX 590, we at least have an idea as to the performance on offer. In the 2560x1440 resolution tests, performance has improved by roughly 10% compared to the RX 580. Looking at the 3840x2160 resolution results shows a performance improvement of roughly 10 to 15%. This uplift in performance allows the RX 590 to put some serious pressure on NVIDIA's GeForce GTX 1060 6 GB. Keep in mind that while the RX 580 was always close, AMD's latest Polaris offering eliminates the gap and is far more competitive in a title that has always favored its competition, at least when it comes to comparing reference cards.
When manual overclocking is taken into account, AMD's RX 580 can typically see a performance increase of around 5 to 10%. While this is speculation, in theory that would put it on par with or very near the RX 590. However, no overclock is guaranteed; with that in mind, the move to the 12 nm FinFET process delivers a guaranteed performance boost with the possibility of further overclocking headroom. This should allow the RX 590 to further increase its performance advantage over the RX 580 and 570. Only time will tell how things shake out, but at the very least users can expect the AMD Radeon RX 590 to deliver on average 10% more performance than the current crop of RX 580 graphics cards.
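As a quick back-of-the-envelope illustration of how those percentages interact, here is a minimal sketch; the RX 580 baseline FPS is a made-up placeholder, not a measured result:

```python
# Rough illustration of the relative-performance figures quoted above.
# The RX 580 baseline FPS is a hypothetical placeholder, not a benchmark result.
rx580_fps = 40.0                    # assumed RX 580 baseline at 2560x1440

rx590_fps = rx580_fps * 1.10        # ~10% uplift reported for the RX 590
rx580_oc_fps = rx580_fps * 1.075    # midpoint of the typical 5-10% manual OC gain

print(f"RX 590 (stock): {rx590_fps:.1f} FPS")          # 44.0
print(f"RX 580 (manual OC): {rx580_oc_fps:.1f} FPS")   # 43.0
print(f"Remaining gap: {(rx590_fps / rx580_oc_fps - 1) * 100:.1f}%")  # ~2.3%
```

In other words, a well-overclocked RX 580 would land within a few percent of a stock RX 590, which is why any extra overclocking headroom from the 12 nm process matters.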
Sources: Twitter, FFXV Benchmark, via Videocardz

70 Comments on AMD Radeon RX 590 Built on 12nm FinFET Process, Benchmarked in Final Fantasy XV

#51
Casecutter
FordGT90Concept: The question is: what does AMD do with cut-down chips? Is there going to be a 575 coming?
Yeah, that's the other thing we're not considering, and there has to be a place to use those chips.

An RX 570 4 GB is already the 1080p "darling", as we're finally seeing good deals (I've seen them as low as $140). An RX 575 4 GB would be a great option, especially if they give the memory a bump to 2000 MHz (8000 MHz effective; see the quick bandwidth math below) and keep the MSRP at $170.

I think GloFo had to have this happen to streamline onto one process (I might even say they made sure AMD had no reason not to), while AMD can differentiate the product from the multitude of mining cards now being dumped on the used market.
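On the memory bump mentioned above: GDDR5 transfers four bits per pin per memory-clock cycle, which is why 2000 MHz is "8000 MHz effective". A quick sketch of the resulting peak bandwidth using the standard formula, assuming the RX 570's stock 256-bit bus:

```python
# Peak GDDR5 bandwidth = effective data rate (GT/s) x bus width (bits) / 8 bits per byte.
def gddr5_bandwidth_gbs(memory_clock_mhz: float, bus_width_bits: int) -> float:
    effective_gtps = memory_clock_mhz * 4 / 1000   # GDDR5 is quad-pumped
    return effective_gtps * bus_width_bits / 8

print(gddr5_bandwidth_gbs(1750, 256))  # stock RX 570: 7 Gbps -> 224 GB/s
print(gddr5_bandwidth_gbs(2000, 256))  # bumped:       8 Gbps -> 256 GB/s
```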
#52
efikkan
FordGT90Concept: I don't think performance is really the point here. AMD converting Polaris to 12 nm means they intend to produce it in volume for quite some time yet. My guess is that the price isn't going to be much different between the 580 and 590, so in time the 590 will replace the 580 in the market. The question is: what does AMD do with cut-down chips? Is there going to be a 575 coming?
Excellent observations!
If this truly is a refined Polaris 30 on "12 nm", and not just cherry-picked golden samples of Polaris 20, then it means AMD has decided to extend the lifespan of Polaris very late in the 500-series lifecycle. This could be a strong indicator that AMD is preparing for either delays or supply issues on Navi. I would not be surprised if we hear something about a delay in ~5 months.

And yes, if this is Polaris 30 on "12 nm", there should be potential for a refresh of the RX 570 too, but who knows if it will have a distinct name?
FordGT90Concept: It also means we're probably not going to be seeing a Vega 48 or Vega 32 as discrete cards. They're choosing to upgrade Polaris over cutting down Vega.
Yes. I believe there were plans for this in the roadmaps, but they never materialized. I don't know if they would have used HBM or GDDR, but there have obviously been some problems with cost or production.
#53
Valantar
efikkan: Excellent observations!
If this truly is a refined Polaris 30 on "12 nm", and not just cherry-picked golden samples of Polaris 20, then it means AMD has decided to extend the lifespan of Polaris very late in the 500-series lifecycle. This could be a strong indicator that AMD is preparing for either delays or supply issues on Navi. I would not be surprised if we hear something about a delay in ~5 months.

And yes, if this is Polaris 30 on "12 nm", there should be potential for a refresh of the RX 570 too, but who knows if it will have a distinct name?


Yes. I believe there were plans for this in the roadmaps, but they never materialized. I don't know if they would have used HBM or GDDR, but there have obviously been some problems with cost or production.
I agree with this partially, but at the same time think you're a bit pessimistic.

First off, there's no Navi delay necessary to provide a market opportunity for a refreshed Polaris - Navi isn't due out until H2 2019 (or is that Q3/Q4? Can't remember.), which means it's still likely a year away - and that's the initial launch, which might be a couple of models for all we know. It makes sense to have a somewhat updated portfolio to begin with. While Nvidia hasn't launched anything remotely affordable yet from Turing, it's coming at some point, and definitely within a year. Waiting it out would be a bit silly, seeing how a refresh on a slightly improved node like this is likely rather cheap (at least compared to making a brand-new cut-down Vega) and would still give some benefits. If they're on par with the 1060 now, why not make a minor investment and make sure they're in a slightly better position when the 2060 comes along (or, with Nvidia pricing, the 2050 Ti, more likely)? Sounds like a no-brainer. What doesn't make sense is the naming, not the refresh itself.

As for this confirming there's no mid-range Vega coming, it might only be my opinion, but that's been pretty clear ever since its launch. AMD has never spoken about any such product (except for the Vega Mobile that Apple just implemented), and looking at the architecture (which is amazing for compute, but struggles in gaming performance (particularly compared to die size and power draw - and die size becomes increasingly important as margins shrink outside of high-end products)), making a cut-down gaming Vega doesn't make much sense. HBM2 is too expensive for the mid-range, and while it'd be nice to implement some of Vega's architectural improvements there too, they're not that dramatic. Polaris does essentially the same job for cheaper. I'm pretty certain AMD saw that Vega had the most potential as a compute card early on, and from then on deliberately focused on that while maintaining Polaris until they could bring out something more suited to gaming - which is Navi.
#54
king of swag187
So it's barely faster than even the 580/1060? This seems to be more of a stopgap/testing ground for the 12 nm process.
I'd still rather buy an R9 Fury over all of these; it's cheaper and faster.
#55
Casecutter
king of swag187: So it's barely faster than even the 580/1060? This seems to be more of a stopgap/testing ground for the 12 nm process.
I'd still rather buy an R9 Fury over all of these; it's cheaper and faster.
Well, if they can get it to be a decent/acceptable 1440p card, the kind of experience cards like the 980 Ti/Fury X are still able to provide today, and give it an MSRP more aligned with where the RX 580 8 GB sits, I don't see a problem. Market it for 1440p FreeSync in conjunction with Chill, permitting smooth frame rates and exceptionally immersive gameplay, with both GPU and monitor costing around $500. Heck, they should release it in conjunction with monitor partners who provide a discount code of around $20 on 1440p monitors, plus a good new game for free.
#56
rtwjunkie
PC Gaming Enthusiast
Casecutter: Well, if they can get it to be a decent/acceptable 1440p card, the kind of experience cards like the 980 Ti/Fury X are still able to provide today
If you are counting on this card's 1440p performance to match those two cited cards, then it is doomed.

Just check W1zzard's latest review, of the MSI RTX 2070, and look at where those two cards sit currently. They have become firmly 1080p-only cards in just three short years.
#57
king of swag187
Casecutter: Well, if they can get it to be a decent/acceptable 1440p card, the kind of experience cards like the 980 Ti/Fury X are still able to provide today, and give it an MSRP more aligned with where the RX 580 8 GB sits, I don't see a problem. Market it for 1440p FreeSync in conjunction with Chill, permitting smooth frame rates and exceptionally immersive gameplay, with both GPU and monitor costing around $500. Heck, they should release it in conjunction with monitor partners who provide a discount code of around $20 on 1440p monitors, plus a good new game for free.
Get an R9 Fury then: all the same features, more performance, lower price, but higher TDP. The RX 580 and 1060 6 GB are not 1440p cards in the slightest; they can run it, but they're better suited to 1080p.
#58
Casecutter
rtwjunkie: Just check W1zzard's latest review, of the MSI RTX 2070, and look at where those two cards sit currently. They have become firmly 1080p-only cards in just three short years.
I think today's pinnacle of high-setting 1440p FPS graphs is not what should be considered the baseline/starting point, especially given the hefty $500 price to play what is fast becoming "mainstream" gaming. Today 1080p is "entry"... kind of where 1600x900 was in 2012, when the HD 7770 (Cape Verde) came along, provided decent 1080p for $150, and brought 1080p into the mainstream.
king of swag187: Get an R9 Fury then: all the same features, more performance, lower price, but higher TDP. The RX 580 and 1060 6 GB are not 1440p cards in the slightest; they can run it, but they're better suited to 1080p.
Where could one find an R9 Fury, except on eBay as some miner's well-used shovel? Correct, an RX 580/1060 can only provide "entry" 1440p at low/medium settings. That said, if you're within 15% of what a GTX 1070 provides and juggle some settings, I think you can have decent/acceptable and immersive gameplay. The best part would be if it took 30% less money from your wallet to achieve it.
#59
king of swag187
CasecutterWhere could one find a R9 Fury except on Ebay that was some miner's well used shovel? Correct a RX580/1060 are only able to provide "entry" 1440p at low/medium settings. That said if you where 15% off what a GTX 1070 provides and juggle some settings I think you can have decent/acceptable and immersive gameplay. The best part would be if it took 30% less funds from your wallet to provide/achieve it.
No one would use an R9 Fury for mining; it would be terrible at it.
Even if the RX 590 costs $250, that's still a little more than a 1070/980 Ti off eBay, so your statement would be incorrect.
Coming from a Fury X, around 10-15% slower than a 1070, I can easily get 1080p ultra/high at around 80-100 fps in 90% of my games. I'd rather have quality/FPS over increased resolution, as would a majority of people here.
#60
Casecutter
king of swag187: No one would use an R9 Fury for mining; it would be terrible at it.
What are you talking about... people used whatever they had or could get! With the Fury they'd roll back power in Wattman 20-30% and see something like 25 MH/s. That's not that bad, although yes, the return after electricity isn't good, but those folks are the ones not paying for power.
www.tweaktown.com/articles/8331/definitive-ethereum-mining-performance-article/index7.html
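To put rough numbers on that return-after-electricity point, here's a minimal profitability sketch; the payout rate per MH/s, the board power after the rollback, and the electricity price are all made-up placeholders (real values swung wildly with the market), and only the ~25 MH/s figure comes from the discussion above:

```python
# Back-of-the-envelope mining return: revenue scales with hash rate,
# cost with board power. All market numbers below are made-up placeholders.
def daily_profit_usd(hashrate_mhs, usd_per_mhs_day, board_watts, usd_per_kwh):
    revenue = hashrate_mhs * usd_per_mhs_day
    power_cost = board_watts / 1000 * 24 * usd_per_kwh
    return revenue - power_cost

# Fury at ~25 MH/s with a 20-30% power rollback (placeholder: ~200 W at the wall)
print(daily_profit_usd(25, 0.05, 200, 0.12))  # paying for power: ~$0.67/day
print(daily_profit_usd(25, 0.05, 200, 0.00))  # "free" power:      $1.25/day
```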
king of swag187: as would a majority of people here
I don't think the majority here want to gamble on eBay, either.
#61
king of swag187
Casecutter: What are you talking about... people used whatever they had or could get! With the Fury they'd roll back power in Wattman 20-30% and see something like 25 MH/s. That's not that bad, although yes, the return after electricity isn't good, but those folks are the ones not paying for power.
www.tweaktown.com/articles/8331/definitive-ethereum-mining-performance-article/index7.html


I don't think the majority here want to gamble on eBay, either.
Gamble? What are you talking about? If the item doesn't work, you go to eBay, then PayPal. And that's an "if".
#62
Valantar
king of swag187: Gamble? What are you talking about? If the item doesn't work, you go to eBay, then PayPal. And that's an "if".
And if it fails after a year? Buying hardware that's been (ab)used for mining is a gamble. Period. Used hardware otherwise is generally not, as stock-clocked electronics don't generally wear much over time. Mining is an exception, though.

I'm still very happy with my Fury X at 1440p though, for those of you claiming it's a 1080p card now. Can't always max out the details, but who cares?
#63
king of swag187
Valantar: And if it fails after a year? Buying hardware that's been (ab)used for mining is a gamble. Period. Used hardware otherwise is generally not, as stock-clocked electronics don't generally wear much over time. Mining is an exception, though.

I'm still very happy with my Fury X at 1440p though, for those of you claiming it's a 1080p card now. Can't always max out the details, but who cares?
You, or any sane human being, shouldn't buy mining hardware; I would never advocate for it. After a year, you'll likely have saved enough to buy something more expensive than a $150-160 GPU. If not, you shouldn't be buying it in the first place. I do agree that everything fails eventually, no matter the care or usage; that's just a fact.
I too am very happy with my Fury X, it's still a wonderful GPU, and I'm glad I paid the $240 for it back in the mining craze.
#64
rtwjunkie
PC Gaming Enthusiast
Casecutter: I think today's pinnacle of high-setting 1440p FPS graphs is not what should be considered the baseline/starting point, especially given the hefty $500 price to play what is fast becoming "mainstream" gaming. Today 1080p is "entry"... kind of where 1600x900 was in 2012, when the HD 7770 (Cape Verde) came along, provided decent 1080p for $150, and brought 1080p into the mainstream.
Regardless of whether you think 1080p is entry level (it's still the most popular monitor resolution), that doesn't make a 980 Ti able to play anything worthwhile at 1440p. If you have to turn down all the settings just to play it, then you might as well just throw a toaster in there; it would be just as effective.

When I got my 1080 Ti and a 2560x1440 monitor two months ago, just for giggles I tried it out for one last run on the 980 Ti. It wasn't pretty. I had been lulled to sleep by the fact that the 980 Ti handled almost everything maxed out at 60 fps on the 1920x1080 monitor, although by the end there were already games where it had to work really hard and still couldn't reach 60 fps.

1440p was a joke. It made me glad I got the 1080 Ti.
#65
CandymanGR
10% over a GPU that is 2 years old. I am in awe.
#66
Valantar
rtwjunkie: Regardless of whether you think 1080p is entry level (it's still the most popular monitor resolution), that doesn't make a 980 Ti able to play anything worthwhile at 1440p. If you have to turn down all the settings just to play it, then you might as well just throw a toaster in there; it would be just as effective.

When I got my 1080 Ti and a 2560x1440 monitor two months ago, just for giggles I tried it out for one last run on the 980 Ti. It wasn't pretty. I had been lulled to sleep by the fact that the 980 Ti handled almost everything maxed out at 60 fps on the 1920x1080 monitor, although by the end there were already games where it had to work really hard and still couldn't reach 60 fps.

1440p was a joke. It made me glad I got the 1080 Ti.
I wonder what games you're playing. From looking at recent GPU reviews here on TPU, the 980Ti and the Fury X have aged very similarly, with each beating the other in about 50% of titles. And mostly, they do ~40fps @1440p Ultra in brand-new AAA titles, while doing a lot more in older titles. I don't mind stepping down to High or even lowering a couple of settings beyond that, which is what you'll need to reach 60fps if Ultra runs around 40. It's not a problem, and while the difference in detail level is visible when you look for it, it's not really noticeable in gameplay IMO. Your standards are likely higher than mine, but on the other hand, it's a good idea to train oneself to look through the placebo-inducing talk of graphical fidelity. Modern AAA titles look great at "high".

Still, I don't count either the RX 580 or any 590 that aligns with these rumors as a 1440p card. They're not that far off, but not quite there.
#67
Casecutter
CandymanGR: 10% over a GPU that is 2 years old. I am in awe.
Valantar: They're not that far off, but not quite there.
I suppose either of those scenarios could be the outcome, but I think we should wait to see what this actually is and how it ends up looking against, say, W1zzard's review of the MSI Radeon RX 580 Mech 2 8 GB from Aug 7th, 2018. I would consider that the baseline for discerning advancement.
www.techpowerup.com/reviews/MSI/RX_580_Mech_2/31.html
#68
Adam Krazispeed
cucker tarlson: I told you it's gonna be 590. Misleading naming again. But it's good they're releasing it, though just a 7.5% increase is pretty measly; it technically makes it faster than the 1060, but it's really just a placeholder for Navi, and still very much nothing special as far as competing in the mid-range goes. Tell me when a $300 card can outperform the 1070.
It's technically not a rebrand, but it very much feels that way; this is a 480/580 with just a very minor die shrink. As for power savings, do you really think they can bump clocks and save a noticeable amount of power at the same time going from 14 nm to 12 nm?
No, of course not... but going from an R9 280 to an R9 290 was at least a decent jump; RX 480/RX 580 to RX 590 won't be the same.
The R9 290 was a massive die compared to other cards of its time, and had 64 ROPs and a 512-bit bus.
Now, imagine if the RX 590 had 64 ROPs instead of the RX 480/RX 580's 32 ROPs, and also doubled the RX 580's unified shaders from 2304 to 4608, and its texture units from 144 TMUs to 288+ on an RX 590/590X, or even 2-4x those 288 TMUs or higher!

So the RX 590X should be, on the render config:

Render config (590 / 590X*):

Shading units: 4608 / 6912*
TMUs: 288 / 432*
ROPs: 64 / 96*
Compute units: 72 / 108*

(* indicates a hypothetical AMD Radeon RX 590X model, if it ever existed)

Per my spec list, the RX 590X would resemble the R9 290X of several years ago:

R9 290X: 512-bit memory bus width
RX 590: 384-bit memory bus width
RX 590X: 512-bit memory bus width

An RX 590X with the specs I stated, at 12 nm, would be a decent card, maybe even a monster if done right!!
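For what it's worth, those shader and compute-unit figures are internally consistent with GCN's 64 shaders per compute unit, and peak FP32 throughput follows from shaders x 2 FLOPs x clock. A quick consistency check; the 1.4 GHz clock is a made-up placeholder, since no clock was stated:

```python
# GCN sanity check: 64 shaders per CU, peak FP32 = shaders x 2 ops x clock (GHz) / 1000 = TFLOPS.
def gcn_summary(shaders: int, clock_ghz: float):
    compute_units = shaders // 64
    tflops = shaders * 2 * clock_ghz / 1000
    return compute_units, tflops

print(gcn_summary(4608, 1.4))  # hypothetical RX 590:  72 CUs, ~12.9 TFLOPS
print(gcn_summary(6912, 1.4))  # hypothetical RX 590X: 108 CUs, ~19.4 TFLOPS
```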
#69
RealNeil
EntropyZ: I got burned with the R9 300 series before
I feel your pain. I had a pair of R9-290X cards in Crossfire and when the R9-390Xs came out, I sold the 290X cards and bought a pair of 390Xs.
The improvement was dismal and they were louder than the 290Xs were.
I felt ripped off.
#70
Adam Krazispeed
RealNeil: I feel your pain. I had a pair of R9-290X cards in Crossfire and when the R9-390Xs came out, I sold the 290X cards and bought a pair of 390Xs.
The improvement was dismal and they were louder than the 290Xs were.
I felt ripped off.
That's because all an R9 390X was... was a 290/290X GPU relabeled to the 300 series. The only extra thing you got (depending on the model) might have been 4 GB more GDDR5; even if you got the 8 GB R9 390X, it was still a rip-off. I went for the Fury X, which was better... but not by much, as all the Fury X had was 4 GB of HBM (low VRAM), and its pixel rate was identical to an R9 290 at the same 1050 MHz core clock, as below:

Fury X:

64 ROPs (rumored to be, and should have been, 128 ROPs; basically two R9 290 dies as one large monolithic die! They never listen to me @AMD)

4096 shaders (should have been 5120, as it would be with the two-290-dies-as-one idea above)

256 TMUs (decent, but needed more; should have been at minimum 320 TMUs, as with the two 290 dies above, or 384 to 512)

4096-bit memory interface (needed more than 4 GB of VRAM)

R9 290 (non-X model):

64 ROPs, providing exactly the same pixel fillrate as a Fury X at the same 1050 MHz core clock

2560 shaders

160 TMUs

512-bit memory interface

So the Fury X only had higher texture fillrates, which was NOT enough for it to be the high-end card it was portrayed to me as :( so yeah, I went to a GTX 1080 after that.
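That identical-pixel-rate observation falls straight out of the standard theoretical peak fillrate formulas (ROPs x core clock for pixels, TMUs x core clock for texels); here's a quick sketch using the figures above, with the R9 290 hypothetically run at the Fury X's 1050 MHz rather than its stock clock:

```python
# Theoretical peak fillrates: GPixel/s = ROPs x clock, GTexel/s = TMUs x clock.
def fillrates(rops: int, tmus: int, clock_mhz: float):
    gpixels = rops * clock_mhz / 1000   # pixel fillrate, GPixel/s
    gtexels = tmus * clock_mhz / 1000   # texture fillrate, GTexel/s
    return gpixels, gtexels

print(fillrates(64, 256, 1050))  # Fury X: (67.2, 268.8)
print(fillrates(64, 160, 1050))  # R9 290: (67.2, 168.0)
```

Same ROP count and clock means the same theoretical pixel fillrate; the Fury X only pulls ahead on texture rate, which is exactly the point above.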