Monday, July 18th 2016

XFX Radeon RX 480 Double Dissipation Pictured

Here are some of the first detailed pictures of the XFX Radeon RX 480 Double Dissipation graphics card. The card combines a custom-design PCB with a meaty custom cooling solution by the company, supporting factory-overclocked speeds with overclocking headroom to spare. The cooling solution, from which the card derives its name, features two aluminium fin-stacks, to which heat drawn from a copper base is conveyed by four 6 mm-thick copper heat pipes, and which are ventilated by a pair of 90 mm spinners.

These fans can be detached from the cooler without any tools, and without even having to remove the cooler shroud. A back-plate finishes off the cooling solution. The PCB draws power from a single 8-pin PCIe power connector, and uses a 6-phase VRM with high-end International Rectifier DirectFETs to condition it for the GPU. The card ships with factory-overclocked speeds in excess of 1300 MHz, and 8 GB of memory. Display outputs include one DVI connector, three DisplayPort 1.4, and one HDMI 2.0b. XFX could launch the card later this week.
Sources: VideoCardz, QuasarZone

69 Comments on XFX Radeon RX 480 Double Dissipation Pictured

#1
P4-630
Well that looks somewhat better-ish than the HIS :p

#2
ZeppMan217
Is the board longer than reference? It looks kinda longer.
#3
PowerPC
ZeppMan217: Is the board longer than reference? It looks kinda longer.
Yes, I read it's longer than the reference card, which itself was much longer than the reference PCB.
#4
arbiter
PowerPC: Yes, I read it's longer than the reference card, which itself was much longer than the reference PCB.
You can see in the picture that it looks to be around 10.5 inches, which is standard for a lot of cards. Not saying it's exactly 10.5 inches long, but it's in that ballpark. It also looks about half an inch wider as well.
#5
rtwjunkie
PC Gaming Enthusiast
P4-630: Well that looks somewhat better-ish than the HIS :p

LOL! It looks like HIS is staying very conservative in their design. Other than the big cat on there, it looks to be the same heatsink and shroud they've had for most of the last few years.
#6
ZoneDymo
That fan removal feature is pretty cool, now let's hope it performs well.
#7
woopadidoo
Does the DVI carry an analog signal? It would be pretty dumb not to (some other AIBs don't), as some people still use DVI to D-SUB/VGA adapters.
#8
ZeppMan217
rtwjunkie: LOL! It looks like HIS is staying very conservative in their design. Other than the big cat on there, it looks to be the same heatsink and shroud they've had for most of the last few years.
No need to fix it if it ain't broken.
#9
Cybrnook2002
I'm curious: why would one want to remove the fans from a GPU's heatsink? Unless XFX is going to start offering fan upgrades? Or is it for easier fan replacement, to cut down on RMA costs? I'd like to know the reason they did this.
#10
Shou Miko
Can't wait to see how the custom PCBs for the RX 480 do with OC and performance, competing with the GTX 1060/1070 at 1080p (maybe 1440p).
#11
P4-630
puma99dk|: Can't wait to see how the custom PCBs for the RX 480 do with OC and performance, competing with the GTX 1060/1070 at 1080p (maybe 1440p).
I don't think it will "compete" with the GTX 1070...
#12
ZoneDymo
Cybrnook2002: I'm curious: why would one want to remove the fans from a GPU's heatsink? Unless XFX is going to start offering fan upgrades? Or is it for easier fan replacement, to cut down on RMA costs? I'd like to know the reason they did this.
I see it purely as a way for easier cleaning: remove the fans, blow the dust out of the fins, put the fans back.
#13
okidna
Cybrnook2002: I'm curious: why would one want to remove the fans from a GPU's heatsink? Unless XFX is going to start offering fan upgrades? Or is it for easier fan replacement, to cut down on RMA costs? I'd like to know the reason they did this.
Easier fan and heatsink cleaning?
Also, Sapphire mentioned that it would make RMAs easier if your fan(s) break (they just need to send the replacement fan(s), no need to send the card back).
#14
Shou Miko
P4-630: I don't think it will "compete" with the GTX 1070...
At 1080p the reference RX 480 was a "good" card; let's see if the customs can do around 1400 MHz on the GPU without much trouble. If they can go higher, they'll be really good value.
#15
PowerPC
Cybrnook2002: I'm curious: why would one want to remove the fans from a GPU's heatsink? Unless XFX is going to start offering fan upgrades? Or is it for easier fan replacement, to cut down on RMA costs? I'd like to know the reason they did this.
It's a very good idea, as long as it doesn't introduce any extra wobble or noise. Right now, if a fan fails and you're out of warranty (or even within it), it's a huge pain to swap them, and sometimes you even have to take off the whole heatsink!
#16
fullinfusion
Vanguard Beta Tester
Man, the thickness of the cooler's base should really draw a hell of a lot of heat from the chip. It's going to be really interesting to see how well it clocks.
#17
droopyRO
That blue strip on the VRM thermal pad, shouldn't it be removed before installing it?
#18
ZeppMan217
puma99dk|: At 1080p the reference RX 480 was a "good" card; let's see if the customs can do around 1400 MHz on the GPU without much trouble. If they can go higher, they'll be really good value.
The RX 480 eats more juice than the GTX 1070 while delivering 30% less performance. I don't see how it's gonna compete with the GTX 1060 in any way other than the price.
#19
rtwjunkie
PC Gaming Enthusiast
PowerPC: It's a very good idea, as long as it doesn't introduce any extra wobble or noise. Right now, if a fan fails and you're out of warranty (or even within it), it's a huge pain to swap them, and sometimes you even have to take off the whole heatsink!
If they are quality fans, it should be fine. Enermax Magma fans can be popped off repeatedly for super easy cleaning of the blades, without affecting their operation at all when reattached. I'm sure XFX will have similar results, especially since they are going on a GPU.
#20
newtekie1
Semi-Retired Folder
woopadidoo: Does the DVI carry an analog signal? It would be pretty dumb not to (some other AIBs don't), as some people still use DVI to D-SUB/VGA adapters.
AMD and NVIDIA have both eliminated analog signal output from the GPU itself. So none of the modern cards will output analog over the DVI connector unless the AIB builds a digital-to-analog converter onto the PCB of the card itself, which isn't likely considering the extra cost involved.
#21
GhostRyder
Well, it will be interesting to see how this performs in comparison to the reference designs, and if it overclocks well.

The fan design, though, is pretty awesome. Being able to remove them for easy cleaning (or heck, RMA) is a pretty useful feature. Well, at least useful in my book.
#22
rtwjunkie
PC Gaming Enthusiast
This does look interesting. I've preliminarily narrowed my interest down to the XFX and the Sapphire models. Now we just need them to release and be reviewed.
#23
ShurikN
Has anything else been announced and given a date, aside from... like what, the Nitro and this?
#24
AsRock
TPU addict
Cybrnook2002: I'm curious: why would one want to remove the fans from a GPU's heatsink? Unless XFX is going to start offering fan upgrades? Or is it for easier fan replacement, to cut down on RMA costs? I'd like to know the reason they did this.
Easier to clean the cooler without requiring a screwdriver? What I personally like about it is that, unlike when the fans are fixed to the cooler, all that metal that allows them to be screwed to it isn't there.

I'm not as fond of the shroud, but being able to take that off and use my MK-13 fan bracket would be pretty cool. It also looks like there'd be less crap stuck to the heatsink, due to the fans being fixed to the shroud.

Other than that, I do hope they put a lifetime warranty on it. I'm not in the market for a 480 for obvious reasons, but I'm looking forward to next year, when I'll be more in the market for something.
#25
Assimilator
woopadidoo: Does the DVI carry an analog signal? It would be pretty dumb not to (some other AIBs don't), as some people still use DVI to D-SUB/VGA adapters.
None of the current-gen AMD or NVIDIA cards support DVI-A, and in fact AMD cards haven't for a few generations. If you're still using VGA, f'in upgrade; it's 2016 already and VGA is long past its death date. Hell, even DVI-D is dead, but at least it provides a signal that doesn't make your eyes bleed.