Wednesday, June 1st 2016

AMD Radeon RX 480 Clock Speeds Revealed, Clocked Above 1.2 GHz

Here are the clock speeds of the Radeon RX 480. Like the GeForce "Pascal," AMD "Polaris" GPUs love to run at speeds way above the 1 GHz mark. The Radeon RX 480 features an engine clock of 1266 MHz, while its memory is clocked at 2000 MHz (actual), or 8 GHz (GDDR5-effective).
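For reference, here is how the 8 GHz effective figure and peak bandwidth fall out of those numbers; a minimal sketch, where the 256-bit bus width is the RX 480's published spec rather than something stated in this post:

```python
# Back-of-the-envelope check of the quoted memory clocks.
# GDDR5 transfers four bits per pin per command-clock cycle,
# so 2000 MHz "actual" becomes 8 Gbps effective per pin.
# ASSUMPTION: 256-bit bus width (the RX 480's published spec, not stated above).
actual_clock_mhz = 2000
effective_gbps = actual_clock_mhz * 4 / 1000       # 8.0 Gbps per pin
bus_width_bits = 256
bandwidth_gb_s = effective_gbps * bus_width_bits / 8

print(f"{effective_gbps:.0f} Gbps effective")       # 8 Gbps
print(f"{bandwidth_gb_s:.0f} GB/s peak bandwidth")  # 256 GB/s
```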

88 Comments on AMD Radeon RX 480 Clock Speeds Revealed, Clocked Above 1.2 GHz

#51
_larry
I was thinking about Crossfiring my R9 290.
Going to wait and see how these cards perform first...
Posted on Reply
#52
arbiter
iO: Ashes uses procedural generation for some textures, like snow and units; that's why it looks different.
Both used the same settings:
www.ashesofthesingularity.com/metaverse#/personas/b0db0294-8cab-4399-8815-f956a670b68f/match-details/a957db0f-59b3-4394-84cc-2ba0170ab699
www.ashesofthesingularity.com/metaverse#/personas/b0db0294-8cab-4399-8815-f956a670b68f/match-details/ac88258f-4541-408e-8234-f9e96febe303
Thing is, the numbers on those pages don't match what AMD claimed live for either card. The game version is different, too. As much as you say it's the same, some things don't match what AMD claimed they were, which is enough to doubt them.
TheinsanegamerN: IMO, people have been looking for a reason to bash AMD here, just because they are doing well. I'd wait for third-party benches anyway, but to each their own.
Don't need to find a reason; AMD has been kind enough to provide them over the last few years.
Posted on Reply
#53
Smorgesborg
Harry Lloyd: This seems really low compared to Pascal, unless there is a really big boost clock or lots of overclocking potential.

Still, this might be a good card at $200-250, beating the 970/980 and 390(X).
It's called having a more efficient architecture that doesn't need to be clocked high for good performance. Clock-for-clock, AMD's been ahead for a while. With this, they widened the gap.
Posted on Reply
#54
GrootSquad
arbiter: G-Sync is done in the driver, so devs don't have to do anything to implement it.
I think he may have been referring to SLI and not G-Sync, since SLI is NVIDIA's competing technology to AMD's CrossFire.
Posted on Reply
#55
medi01
Smorgesborg: Clock-for-clock, AMD's been ahead for a while. With this, they widened the gap.
There is no "gap".
It's a different architecture, and that's it.
You can go high frequency or low frequency; only the resulting performance and power consumption matter.
Posted on Reply
#56
Casecutter
Primey_: People seem to be forgetting crossfire and g-sync are dying
Well, DX12 split-frame rendering is supposed to be a boon for dual-card performance; at least that was how it was being touted a year ago. We won't have that figured out until we start seeing a good number of DX12 titles, so it's not something that weighs on a purchase decision right now. But in six months, when those games hit and a second card is <$200, it might be (see the toy sketch after the links below).
www.overclock3d.net/articles/gpu_displays/amd_explains_dx12_multi_gpu_benefits/1
www.extremetech.com/extreme/199905-report-claims-directx-12-will-enable-amd-and-nvidia-gpus-to-work-side-by-side
www.anandtech.com/show/9740/directx-12-geforce-plus-radeon-mgpu-preview/2
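To picture the idea, here is a toy sketch of split-frame rendering, not the actual D3D12 explicit multi-adapter API; `render_slice` is a hypothetical placeholder for real per-GPU work:

```python
# Toy illustration of split-frame rendering (SFR): each GPU renders a
# horizontal slice of the SAME frame, unlike AFR where GPUs alternate
# whole frames. render_slice() is a hypothetical stand-in, not D3D12 code.
FRAME_W, FRAME_H = 1920, 1080

def render_slice(gpu_id: int, y0: int, y1: int) -> str:
    # Real code would record and submit a command list per adapter here.
    return f"GPU{gpu_id}: rows {y0}-{y1 - 1} of a {FRAME_W}x{FRAME_H} frame"

def render_frame_sfr(num_gpus: int) -> list:
    rows = FRAME_H // num_gpus
    slices = []
    for gpu in range(num_gpus):
        y0 = gpu * rows
        y1 = FRAME_H if gpu == num_gpus - 1 else y0 + rows
        slices.append(render_slice(gpu, y0, y1))
    return slices  # the slices are composited into one output frame

for s in render_frame_sfr(2):
    print(s)
```

Because both GPUs work on the same frame, SFR is pitched as cutting per-frame latency rather than just raising average FPS the way AFR does.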


What this card could do, if panel manufacturers figure out what it provides, is incite a huge migration from 1080p to 1440p. Companies would be smart to start offering a standard-spec 27" monitor with VESA Adaptive-Sync and stick to an all-encompassing 60 Hz. If gamers who have been at 1080p and are looking to get into immersive, smooth gameplay could do it for something over $400, though well south of $500, it would drive an exodus from 1080p. A win for panel manufacturers, a win for RTG, and a win for gamers. But something tells me that's not going to happen; it feels like the panel manufacturers would rather move to 4K as mainstream.
Posted on Reply
#57
G33k2Fr34k
medi01: The question is which 480 it is.
They mentioned a price range of $100-$300.
If it is a $300 480X, then meh.
If it is a $229 480 8 GB, then shutupandtakemymoney.
It's the C7 version of the Polaris 10 chip. This version has 36 CUs enabled and a clock speed of 1266 MHz. The RX 480 is the only card they announced at Computex. I do think the full Polaris 10 chip has 40 CUs, which raises the possibility of a future $299 480X.
Posted on Reply
#58
rruff
G33k2Fr34k: It's the C7 version of the Polaris 10 chip. This version has 36 CUs enabled and a clock speed of 1266 MHz.
How do you know this?
Posted on Reply
#59
G33k2Fr34k
rruff: How do you know this?
It's in the photo.
Posted on Reply
#60
rruff
G33k2Fr34k: It's in the photo.
36 CUs? And how do you know it is the $229 GPU that was announced and not some other variant? The 470 is a much lower spec, so it can't be the C4, though the C4 could be a further reduced version of the 480.

If the $229 card performs better than the GTX 980, that would be shockingly good news. But I doubt it.
Posted on Reply
#61
Caring1
TheinsanegamerN: I really hope there is a 480X out there, a full 2560-SP chip. AMD may be waiting for GDDR5X production to catch up before releasing it. Although, if they do call it the RX 480X, then I can't wait for the XFX RX X480XTX DD edition.
I'm waiting for the xXx version.
Posted on Reply
#62
JalleR
A careful guess would look like this, based on the info from the RX 480, but there will not be much room for an RX 480X then.
Posted on Reply
#63
medi01
rruff: The 470 is a much lower spec, so it can't be the C4
Why not?
JalleR: A careful guess would look like this, based on the info from the RX 480, but there will not be much room for an RX 480X then.
Could you elaborate? Because I don't see why a 40-CU 480X with a higher clock and an 8-pin power connector is unlikely.
Oh, and we have that nice C0 thing in the Linux drivers:

lists.freedesktop.org/archives/dri-devel/2016-May/107756.html
lists.freedesktop.org/archives/dri-devel/2016-May/107758.html
Posted on Reply
#64
JalleR
medi01: Could you elaborate? Because I don't see why a 40-CU 480X with a higher clock and an 8-pin power connector is unlikely.
Oh, and we have that nice C0 thing in the Linux drivers:

lists.freedesktop.org/archives/dri-devel/2016-May/107756.html
lists.freedesktop.org/archives/dri-devel/2016-May/107758.html
Yes, that is possible, but it would be performing like the "490", so there's no point in making two cards with the same performance. With the info out on the RX 480, it could be a redesigned Hawaii/Grenada with the same max of 2816 SPs and 44 CUs.
But yes, there is a possibility that the 490X in my picture will be the RX 480X and they will use a new chip for the 490 series.
Posted on Reply
#65
medi01
JalleR: but it would be performing like the "490"
Would it?
I mean, 36 CU → 40 CU = +11%.
1266 MHz => ... dunno, let's be generous, +100 MHz, so a ~7.9% increase there.

In case it scales perfectly (there's memory and other things), +11% combined with +7.9% ≈ +20%.

So if the base is the C7 with an 18090 score, we get to 21687, which is between the 980 Ti and the Titan X, still rather short of the 1070.

However, the 380 and 380X difference in 3DMark was about 9%.
So if we drop the clock boost, it's 18090 => 20099, roughly at 980 Ti levels (and likely doable with the same 6-pin power connector).


I'd expect the 490 to take on the 1070 and the 490X to take on the 1080, the way the 390 and 390X did. (Vega 10)
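A minimal sketch of that back-of-the-envelope math, assuming performance scales linearly with CU count and clock speed (the 18090 base score is the figure quoted above):

```python
# Reproducing the speculative 480X estimate above, assuming
# performance scales linearly with CU count and clock speed.
base_score = 18090            # C7 (36 CU @ 1266 MHz) score quoted above
cu_gain = 40 / 36             # +11% from four extra CUs
clk_gain = 1366 / 1266        # ~+7.9% from a generous +100 MHz bump

print(round(base_score * cu_gain * clk_gain))  # ~21690: between 980 Ti and Titan X
print(round(base_score * cu_gain))             # ~20100: CU bump alone, ~980 Ti level
```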
Posted on Reply
#66
rruff
medi01: I'd expect the 490 to take on the 1070 and the 490X to take on the 1080, the way the 390 and 390X did. (Vega 10)
Yes, and the 480 and 480X will be well below.

According to AMD, Polaris 11 will be ~1/2 the capability of Polaris 10 (shaders and memory bandwidth) and will consume ~50 W. It has been assumed that the 470 will be Polaris 11, but it could be a cut-down Polaris 10 chip, with Polaris 11 being designated the 460.

Either way, I'm doubtful that AMD would price a card that beats a GTX 980 at $199. That would be *extremely* aggressive pricing, resulting in lower-than-necessary margins. Obviously NVIDIA would need to (nearly) match them with the 1060, meaning that neither company would make good profits. AMD may calculate, however, that the gain in market share would be worth it, and they might be right. If so, this is very good news for us.
Posted on Reply
#67
N3M3515
rruff: Yes, and the 480 and 480X will be well below.

According to AMD, Polaris 11 will be ~1/2 the capability of Polaris 10 (shaders and memory bandwidth) and will consume ~50 W. It has been assumed that the 470 will be Polaris 11, but it could be a cut-down Polaris 10 chip, with Polaris 11 being designated the 460.

Either way, I'm doubtful that AMD would price a card that beats a GTX 980 at $199. That would be *extremely* aggressive pricing, resulting in lower-than-necessary margins. Obviously NVIDIA would need to (nearly) match them with the 1060, meaning that neither company would make good profits. AMD may calculate, however, that the gain in market share would be worth it, and they might be right. If so, this is very good news for us.
That's what normally happened before they got greedy --> a $199 card >= the $500 previous-gen card.
Posted on Reply
#68
Casecutter
medi01: I'd expect the 490 to take on the 1070 and the 490X to take on the 1080, the way the 390 and 390X did. (Vega 10)
If I had to guess, there's a full Polaris 10 still out there as an RX 490 with 8 GB of GDDR5X that's as good as a 1070 for ~$330. Vega is again three cards: the middle ground (RZ Fury) well above the GTX 1080 for 4K/VR at ~$500; an (RX Fury) Vega "LE" with 4 GB of HBM2 at ~$430 that's true 4K/VR-capable, which a 1070 falls short of; and still an RZ Fury X for the GP102.
rruff: It has been assumed that the 470 will be Polaris 11, but it could be a cut-down Polaris 10 chip, with Polaris 11 being designated the 460.
Again, I think all these designs are meant to produce at least three SKUs, so yes, a Polaris 10 "LE" would be an RX 470 4 GB at $150 and be the "end all" for 1080p; figure a 380X or slightly better. Meanwhile Polaris 11, as the RX 460, will be the lowest discrete card and/or CrossFire with an APU, while all the rest go into Apple tablets, laptops, and consoles (IDK).
Posted on Reply
#69
Kanan
Tech Enthusiast & Gamer
Casecutter: What this card could do, if panel manufacturers figure out what it provides, is incite a huge migration from 1080p to 1440p. Companies would be smart to start offering a standard-spec 27" monitor with VESA Adaptive-Sync and stick to an all-encompassing 60 Hz. If gamers who have been at 1080p and are looking to get into immersive, smooth gameplay could do it for something over $400, though well south of $500, it would drive an exodus from 1080p. A win for panel manufacturers, a win for RTG, and a win for gamers. But something tells me that's not going to happen; it feels like the panel manufacturers would rather move to 4K as mainstream.
I don't think 1440p will be mainstream, simply because it's irrelevant to the masses (I could even say it's generally irrelevant). Mass-market players don't care about it; they are pleased even with 720p and less (see PS4/Xbone/PS3/Xbox 360), which is why 1080p is easily enough for them, more so at high settings. 4K is a dead end too. The next big thing will most likely be VR and HDR at 720/1080p (on monitors); that's a lot more benefit than simply increasing the resolution of the monitor.

On the speculation here: Polaris 10 is the 480 and 480X; Polaris 11 is most likely the 470 and 470X. The 480X will most likely be released later, with 4 more CUs activated (2304 + 4 x 64 = 2560 shaders = 40 CUs on the full chip). Vega will be a bigger chip with a lot more shaders, so the 490(X) is not comparable to the 480(X); there's no conflict. A new Fury would essentially be a maximum-size chip like Fiji, but I don't see them making another one, because it's simply way too expensive on 14 nm FinFET - too many transistors, which is why NVIDIA's big chip will "only" be about 450 mm² this time (talking GP102; GP100 is over 600 mm² and extremely expensive, as already known). This means Vega 10/Vega 11 (490 and 490X) will most likely be the maximum of GCN 4.
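That CU arithmetic, spelled out in a quick sketch (64 shaders per GCN CU is the known architecture figure):

```python
# The CU arithmetic above: GCN packs 64 shaders into each CU.
SHADERS_PER_CU = 64
rx480_cus = 2304 // SHADERS_PER_CU       # 36 CUs enabled on the RX 480
full_chip_cus = rx480_cus + 4            # 40 CUs speculated for a 480X
print(full_chip_cus * SHADERS_PER_CU)    # 2560 shaders on the full chip
```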
Posted on Reply
#70
rruff
Kanan: I don't think 1440p will be mainstream, simply because it's irrelevant to the masses (I could even say it's generally irrelevant). Mass-market players don't care about it; they are pleased even with 720p and less (see PS4/Xbone/PS3/Xbox 360), which is why 1080p is easily enough for them, more so at high settings. 4K is a dead end too.
It is now, for obvious reasons: the hardware requirements are too high. But when 1440p monitors and cards capable of high settings are <$200, it will become mainstream.
Posted on Reply
#71
Aquinus
Resident Wat-man
_larry: I was thinking about Crossfiring my R9 290.
Going to wait and see how these cards perform first...
Maybe I'm thinking about this a little differently, because I see an opportunity to get a second 390 for maybe 100 USD less than the first, for a serious jump in performance, if you can deal with waiting for driver updates, which I have, can, and would. If a 200-220 USD part runs as fast as or faster than a 390, it will drag the 390's price down. This is the same way I ended up with 2x 6870s. There is nothing like getting a nice solid boost in performance for a fraction of the price.
Posted on Reply
#72
Kanan
Tech Enthusiast & Gamer
Aquinus: Maybe I'm thinking about this a little differently, because I see an opportunity to get a second 390 for maybe 100 USD less than the first, for a serious jump in performance, if you can deal with waiting for driver updates, which I have, can, and would. If a 200-220 USD part runs as fast as or faster than a 390, it will drag the 390's price down. This is the same way I ended up with 2x 6870s. There is nothing like getting a nice solid boost in performance for a fraction of the price.
Just that a 390 in CF means roughly double the power consumption. I'd advise downclocking - the sweet spot of GCN is under 1000 MHz, as seen on the Nano - and the performance benefit would still be good. I'm not sure you want another heater in your room.
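For intuition on why a modest downclock saves disproportionate power, here is a sketch using the usual dynamic-power relation (P ~ f * V²); the clock/voltage pairs are illustrative assumptions, not measured R9 390 values:

```python
# Dynamic power scales roughly with frequency times voltage squared,
# and lower clocks permit lower voltage, so downclocking pays off twice.
# ASSUMPTION: the clock/voltage pairs below are illustrative only.
def relative_power(f_mhz: float, volts: float,
                   f0_mhz: float = 1000.0, v0: float = 1.20) -> float:
    return (f_mhz / f0_mhz) * (volts / v0) ** 2

# Dropping from 1000 MHz @ 1.20 V to 900 MHz @ 1.10 V:
print(f"{relative_power(900, 1.10):.0%} of baseline dynamic power")  # ~76%
```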
Posted on Reply
#73
Xzibit
Kanan: Just that a 390 in CF means roughly double the power consumption. I'd advise downclocking - the sweet spot of GCN is under 1000 MHz, as seen on the Nano - and the performance benefit would still be good. I'm not sure you want another heater in your room.
If he has fixed-refresh monitors, he could just use FRTC (Frame Rate Target Control) instead of downclocking to save power.
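FRTC's effect can be pictured as a simple frame-rate cap; a toy sketch, where render() is a hypothetical placeholder rather than AMD's driver logic:

```python
# Toy frame-rate cap in the spirit of FRTC: stop rendering frames the
# display never shows, letting the GPU idle between frames.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS

def render():
    pass  # real frame work would happen here

for _ in range(3 * TARGET_FPS):             # simulate three seconds
    start = time.perf_counter()
    render()
    spare = FRAME_BUDGET - (time.perf_counter() - start)
    if spare > 0:
        time.sleep(spare)                   # idle time is where the power saving comes from
```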
Posted on Reply
#74
Aquinus
Resident Wat-man
Xzibit: If he has fixed-refresh monitors, he could just use FRTC (Frame Rate Target Control) instead of downclocking to save power.
...and that's what I typically do with just the one 390 if vsync isn't smooth enough.
Kanan: Just that a 390 in CF means roughly double the power consumption. I'd advise downclocking - the sweet spot of GCN is under 1000 MHz, as seen on the Nano - and the performance benefit would still be good. I'm not sure you want another heater in your room.
That's what the air conditioner in the summer is for. I don't game as often as I used to, and the 390 idles like a champ. I'm not too worried about consumption under load, because it doesn't matter as much to me and because there are utilities that let me keep my GPU(s) from running at full tilt.

Another note, if I cared about power consumption, I wouldn't have invested in SB-E. :p
Posted on Reply
#75
AsRock
TPU addict
Xzibit: If he has fixed-refresh monitors, he could just use FRTC (Frame Rate Target Control) instead of downclocking to save power.
It helps, but it's not a complete solution; if you're going to limit the frame rate, you're better off limiting the power output too, through something like Afterburner.

Downclocking alone can leave it using about the same power; this does depend on the game, too.
Posted on Reply