
AMD Radeon RX 480 Clock Speeds Revealed, Clocked Above 1.2 GHz

The question is, which 480 it is.

My guess: the $229 8GB 480 is C4. It will just barely beat the 970 and 390.

Why? Because AMD ain't getting in a major price war with Nvidia. They'd be shooting themselves in the foot. So it will be a damn good FPS/$ increase over the last generation, but not insane.
 
I was thinking about Crossfiring my R9 290.
Going to wait and see how these cards are first..
 
Ashes uses procedural generation for some textures, like snow and units; that's why it looks different.
Both used the same settings:
http://www.ashesofthesingularity.co...-details/a957db0f-59b3-4394-84cc-2ba0170ab699
http://www.ashesofthesingularity.co...-details/ac88258f-4541-408e-8234-f9e96febe303
Thing is, the numbers on those pages don't match up with what AMD claimed live for either card. Also, the game version is different. As much as you say it's the same, there are some things that aren't the same as AMD claimed they were, which is enough to cast doubt on them.
IMO, people have been looking for a reason to bash AMD here, just because they are doing well. I'd wait for third party benches anyway, but to each their own.
Don't need to find a reason, AMD has been very nice to provide them over the last few years.
 
This seems really low compared to Pascal, unless there is a really big boost clock, or lots of overclocking potential.

Still, this might be a good card at $200-250, beating the 970/980 and 390(X).

It's called having a more efficient architecture which doesn't need to be clocked high for good performance. Clock-per-clock, AMD's been ahead for a while. With this, they widened the gap.
 
G-Sync is done in the driver, so devs don't have to do anything to implement it.

I think he may have been referring to SLI and not G-Sync, given that SLI is NVIDIA's competing technology to AMD's CrossFire.
 
Clock-per-clock, AMD's been ahead for a while. With this, they widened the gap.
There is no "gap".
It's a different architecture, and that's it.
You can go high frequency or lower frequency; only the resulting performance and power consumption matter.
 
People seem to be forgetting crossfire and g-sync are dying

Well, DX12 split-frame rendering is supposed to be a boon for dual-card performance; at least that was how it was being touted a year ago. Though we won't get that figured out until we start seeing a good number of DX12 titles, so it's not something that weighs (at least right now) on a purchase decision. In six months, when games hit and a second card is <$200, it might.
http://www.overclock3d.net/articles/gpu_displays/amd_explains_dx12_multi_gpu_benefits/1
http://www.extremetech.com/extreme/...able-amd-and-nvidia-gpus-to-work-side-by-side
http://www.anandtech.com/show/9740/directx-12-geforce-plus-radeon-mgpu-preview/2


What this card could do, *if* panel manufacturers figure out what this provides: it could incite a huge migration from 1080p over to 1440p. Companies would be smart to start offering a standard-spec 27" monitor that provides VESA Adaptive-Sync and sticks to an all-encompassing 60Hz. Gamers that have been at 1080p and are looking to get into immersive, smooth gameplay could do it for something around $400, well south of $500, and that would drive an exodus from 1080p. A WIN for panel manufacturers, a WIN for RTG, and a WIN for gamers. But something tells me that's not going to happen; it feels like the panel manufacturers would rather move to 4K as mainstream.
 
The question is, which 480 it is.
They mentioned a price range of $100-$300.
If it is a $300 480X, then meh.
If it is a $229 480 8GB, then shutupandtakemymoney.

It's the C7 version of the Polaris 10 chip. This version has 36 CUs enabled and a clock speed of 1266MHz. The RX 480 is the only card they announced at Computex. I do think the Polaris 10 chip has 40 CUs, which raises the possibility of a future 480X $299 card.
 
It's in the photo.

36 CU? And how do you know it is the $229 GPU announced and not some other variant? The 470 is a much lower spec, so it can't be the C4, though the C4 could be a further reduced version of the 480.

If the $229 card performs better than the GTX 980, that would be shockingly good news. But I doubt it.
 
I really hope there is a 480X out there, a full 2560-SP chip. AMD may be waiting for GDDR5X production to catch up before releasing it. Although, if they do call it the RX 480X, then I can't wait for the XFX RX X480XTX DD edition.
I'm waiting for the xXx version.
 
A careful guess, based on the info from the RX 480, would look like this, but there would not be much room for an RX 480X then.
[attached image: RX490.JPG, speculative spec table]
 
The 470 is a much lower spec, so it can't be the C4
Why not?

A careful guess would look like this based on the info from RX480, but there will not be much room for A RX480X then.
Could you elaborate? Because I don't see why a 40 CU 480X with a higher clock and an 8-pin power connector is unlikely.
Oh, and we have that nice C0 thing in linux drivers:

https://lists.freedesktop.org/archives/dri-devel/2016-May/107756.html
https://lists.freedesktop.org/archives/dri-devel/2016-May/107758.html
 
Could you elaborate? Because I don't see why a 40 CU 480X with a higher clock and an 8-pin power connector is unlikely.
Oh, and we have that nice C0 thing in linux drivers:

https://lists.freedesktop.org/archives/dri-devel/2016-May/107756.html
https://lists.freedesktop.org/archives/dri-devel/2016-May/107758.html


Yes, that is possible, but it would be performing like the "490", so there's no point in making two cards that have the same performance. With the info out on the RX 480, it could be a redesigned Hawaii/Grenada with the same max 2816 SP and 44 CU.
But yes, there is a possibility that the 490X in my picture will be the RX 480X and they will use a new chip for the 490 series.
 
but it would be performing like the "490"
Would it?
I mean, 36 CU → 40 CU = +11%.
1266 MHz => ... dunno, let's be generous: +100 MHz, so a ~7.9% increase here.

In case it scales perfectly (there is memory and other things), +11% × +7.9% ≈ +20% increase.

So if the base is C7 with an 18090 score, we get to 21687, which is between the 980 Ti and the Titan X, still rather short of the 1070.

However, the 380 and 380X difference in 3DMark was about 9%.
So if we drop the clock boost, it's 18090 => 20099, roughly at 980 Ti levels (and likely doable with the same 6-pin power connector).


I'd expect 490 to take on 1070 and 490x to take on 1080, the way 390 and 390x did. (Vega 10)
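The back-of-the-envelope scaling above can be sanity-checked in a couple of lines. This is a sketch that takes the post's own assumptions at face value: the 18090 base score, the 40 CU full chip, and the +100 MHz bump are all speculation from the thread, and real cards scale sub-linearly with CUs and clocks.

```python
# Speculative scaling estimate, using the thread's numbers (unconfirmed).
base_score = 18090          # claimed C7 score (36 CU @ 1266 MHz)

cu_gain = 40 / 36           # ~ +11% from enabling 4 more CUs
clock_gain = 1366 / 1266    # ~ +7.9% from a hypothetical +100 MHz bump

optimistic = base_score * cu_gain * clock_gain  # perfect scaling, both gains
cu_only = base_score * cu_gain                  # same clock, more CUs

print(round(optimistic))  # -> 21688
print(round(cu_only))     # -> 20100
```

The perfect-scaling number lands between the claimed 980 Ti and Titan X scores, matching the post's estimate to within rounding.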
 
I'd expect 490 to take on 1070 and 490x to take on 1080, the way 390 and 390x did. (Vega 10)

Yes, and 480 and 480x will be well below.

According to AMD, Polaris 11 will be ~1/2 the capability of the Polaris 10 (shaders and memory bandwidth) and will consume ~50W. It has been assumed that 470 will be Polaris 11, but it could be a reduced Polaris 10 chip, with Polaris 11 being designated 460.

Either way, I'm doubtful that AMD would price a $199 card that beats a GTX 980. That would be *extremely* aggressive pricing, resulting in lower than necessary margins. Obviously Nvidia would need to match them (nearly) with the 1060, meaning that neither company would make good profits. AMD may calculate however that the gain in market share would be worth it, and they might be right. If so, this is very good news for us.
 
Yes, and 480 and 480x will be well below.

According to AMD, Polaris 11 will be ~1/2 the capability of the Polaris 10 (shaders and memory bandwidth) and will consume ~50W. It has been assumed that 470 will be Polaris 11, but it could be a reduced Polaris 10 chip, with Polaris 11 being designated 460.

Either way, I'm doubtful that AMD would price a $199 card that beats a GTX 980. That would be *extremely* aggressive pricing, resulting in lower than necessary margins. Obviously Nvidia would need to match them (nearly) with the 1060, meaning that neither company would make good profits. AMD may calculate however that the gain in market share would be worth it, and they might be right. If so, this is very good news for us.

That's what normally happened before they got greedy: a $199 card >= a $500 previous-gen card.
 
I'd expect 490 to take on 1070 and 490x to take on 1080, the way 390 and 390x did. (Vega 10)
If I had to guess, there's a full Polaris 10 still out there as an RX 490 with 8GB GDDR5X that's as good as a 1070, ~$330. Vega, again, three cards: the middle ground (RZ Fury) well above the GTX 1080 for 4K/VR at ~$500; an (RX Fury) Vega "LE" with 4GB of HBM2 at ~$430 that's a true 4K/VR card where a 1070 falls short; and still an RZ Fury X to counter the GP102.

It has been assumed that 470 will be Polaris 11, but it could be a reduced Polaris 10 chip, with Polaris 11 being designated 460.
Again, I think all these designs are meant to produce at least three SKUs, so yes, a Polaris 10 "LE" would be an RX 470 4GB at $150 and be the "end all" for 1080p; figure a 380X or a slight bit better. Meanwhile, Polaris 11 as the RX 460 will be the lowest discrete part and/or the CrossFire-with-APU part, while all the rest goes into Apple pads, laptops, and consoles (IDK).
 
What this card could do, *if* panel manufacturers figure out what this provides: it could incite a huge migration from 1080p over to 1440p. Companies would be smart to start offering a standard-spec 27" monitor that provides VESA Adaptive-Sync and sticks to an all-encompassing 60Hz. Gamers that have been at 1080p and are looking to get into immersive, smooth gameplay could do it for something around $400, well south of $500, and that would drive an exodus from 1080p. A WIN for panel manufacturers, a WIN for RTG, and a WIN for gamers. But something tells me that's not going to happen; it feels like the panel manufacturers would rather move to 4K as mainstream.
I don't think 1440p will become mainstream, simply because it's irrelevant for the masses (I could even say it's generally irrelevant, because it simply is). Mass-market players don't care about it; they're even pleased with 720p and less (see PS4/Xbone/PS3/Xbox 360), which is why 1080p is easily enough for them, more so at high settings. 4K is a dead end too. The next big thing will most likely be VR and HDR 720/1080p (on monitors); that's a lot more benefit than simply increasing the resolution of the monitor.

On the speculation here: Polaris 10 is the 480 and 480X; Polaris 11 is most likely the 470 and 470X. The 480X will most likely be released later, with 4 more CUs activated (2304 + 4 x 64 = 2560 shaders, 40 CUs on the full chip). Vega will be a bigger chip with a lot more shaders, so the 490(X) is not comparable to the 480(X); there's no conflict. A new Fury would essentially be a maximum-size chip like Fiji, but I don't see them making another one, because it's simply way too expensive on 14nm FinFET (too many transistors), which is why Nvidia's big chip will "only" be about 450mm² this time (talking GP102; GP100 is over 600mm² and extremely expensive, as already known). This means Vega 10/Vega 11 (490 and 490X) will most likely be the maximum of GCN 4.
 
I don't think 1440p will become mainstream, simply because it's irrelevant for the masses (I could even say it's generally irrelevant, because it simply is). Mass-market players don't care about it; they're even pleased with 720p and less (see PS4/Xbone/PS3/Xbox 360), which is why 1080p is easily enough for them, more so at high settings. 4K is a dead end too.

It is now, for obvious reasons: the hardware requirements are too high. But when 1440p monitors and cards capable of high settings are <$200, it will become mainstream.
 
I was thinking about Crossfiring my R9 290.
Going to wait and see how these cards are first..
Maybe I'm thinking about this a little differently, because I see an opportunity to get a second 390 for maybe 100 USD less than the first, for a serious jump in performance, if you can deal with waiting for driver updates (which I have, can, and would). If a 200-220 USD part runs as fast as or faster than a 390, it will drag the 390's price down. This is the same way I ended up going with 2x 6870s. There is nothing like getting a nice solid boost in performance for a fraction of the price.
 
Maybe I'm thinking about this a little differently, because I see an opportunity to get a second 390 for maybe 100 USD less than the first, for a serious jump in performance, if you can deal with waiting for driver updates (which I have, can, and would). If a 200-220 USD part runs as fast as or faster than a 390, it will drag the 390's price down. This is the same way I ended up going with 2x 6870s. There is nothing like getting a nice solid boost in performance for a fraction of the price.
Just that the 390 in CF is like double the power consumption. I'd advise downclocking (the sweet spot of GCN is under 1000 MHz, as seen on the Nano GPUs) and the performance benefit would still be good. I'm not sure you want another heater in your room.
 
Just that the 390 in CF is like double the power consumption. I'd advise downclocking (the sweet spot of GCN is under 1000 MHz, as seen on the Nano GPUs) and the performance benefit would still be good. I'm not sure you want another heater in your room.

If he has fixed-refresh monitors, he could just use FRTC instead of downclocking to save power.
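For context, a frame-rate cap like FRTC boils down to the idea sketched below: render a frame, then idle until the next frame slot, so the GPU isn't burning power on frames a fixed-refresh monitor can't show anyway. This is only a user-space illustration of the concept; FRTC itself lives in AMD's driver, and the names here are made up.

```python
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds per frame slot

def run_capped(render_frame, num_frames):
    """Call render_frame at most TARGET_FPS times per second."""
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        next_deadline += FRAME_BUDGET
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # hardware idles here -> lower power draw
        else:
            # Rendering fell behind the cap; reset instead of racing to catch up.
            next_deadline = time.perf_counter()

# Tiny demo: "render" 10 frames and confirm the cap held them to ~1/6 s total.
frames = []
start = time.perf_counter()
run_capped(lambda: frames.append(1), 10)
elapsed = time.perf_counter() - start
print(len(frames), elapsed >= 9 * FRAME_BUDGET)  # -> 10 True
```

The same shape (sleep to a deadline, reset when behind) is what most in-game limiters do; the driver version just applies it without the game's cooperation.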
 
If he has fixed-refresh monitors, he could just use FRTC instead of downclocking to save power.
...and that's what I typically do with just the one 390 if vsync isn't smooth enough.
Just that the 390 in CF is like double the power consumption. I'd advise downclocking (the sweet spot of GCN is under 1000 MHz, as seen on the Nano GPUs) and the performance benefit would still be good. I'm not sure you want another heater in your room.
That's what the air conditioner in the summer is for. I don't game as often as I used to, and the 390 idles like a champ. I'm not too worried about consumption under load, because it doesn't matter as much to me and because there are utilities that let me keep my GPU(s) from running at full tilt.

Another note, if I cared about power consumption, I wouldn't have invested in SB-E. :p
 