AMD has been working on a dual-GPU HD 7990 card for a long time, but we are still waiting for a release. PowerColor has taken on the challenge and engineered a working HD 7990 by themselves, without any help from AMD.
The PowerColor Devil 13 is the company's flagship limited-edition card for hardcore enthusiasts. It uses two Tahiti XT GPUs running at HD 7970 clock speeds of 925 MHz core and 1375 MHz memory. To enable an extra clock boost, a turbo button has been added which takes the GPU clock to 1000 MHz.
Cooling for this immensely powerful card is provided by a custom triple slot, triple fan cooler which covers the whole card. Pricing of the PowerColor Devil 13 is $999, with limited volume available.
All data in this review was obtained after I repaired a major design error of the card. When I received it, the card ran at extremely high temperatures, reaching beyond 100°C, causing instability and black-screen hangs. The fan would also spin up to 100% almost instantly. This was caused by screws with an integrated stop that resulted in too little mounting pressure between GPU and heatsink, clearly an engineering oversight. I removed the screws, added three metal washers to each, and put the screws back. The cooler was removed for this procedure, and the thermal paste was replaced with a thin layer. Only this procedure enabled the card to run stable; the card was unusable out of the box.
HD 7990 Market Segment Analysis
[Market segment comparison table; only partial column data survives, one column being the HD 7970:]
|Memory Size ||2x 2048 MB ||2x 1536 MB ||2x 3072 MB ||2x 2048 MB
|Memory Bus Width ||2x 256 bit ||2x 384 bit ||2x 384 bit ||2x 256 bit
|GPU Clock (Devil 13) ||925 / 1000 MHz
You will receive:
- Graphics card
- Driver CD + documentation
- Active mini-DP to DVI adapter
- Passive mini-DP to DVI adapter
- Wiha screwdriver kit
- 3x PCI-Express power cable
- CrossFire bridge
- DVI adapter
- PowerColor powerjack
PowerColor's card looks mighty even at a quick glance. The large triple-fan cooler speaks for itself.
The card requires three slots in your system.
Display connectivity options include one dual-link DVI port, one single-link DVI port, one full-size HDMI port, and two mini-DisplayPorts. You may use all the outputs at the same time.
An HDMI sound device is included in the GPU as well. It is HDMI 1.4a compatible and includes support for HD audio and Blu-ray 3D movies. The DisplayPort outputs are version 1.2, which enables the use of hubs and Multi-Stream Transport.
The card has a single CrossFire connector, which allows a quad-CrossFire configuration with another Devil 13.
Pictured above are the front and back, showing the disassembled board. High-res versions are also available. If you choose to use these images for voltmods, etc., please include a link back to this site or let us post your article.
A Closer Look
PowerColor's thermal solution uses a large copper base and five heatpipes for each GPU to keep the card cool.
With the main heatsink removed, we can see many smaller heatsinks on the card that cool the voltage-regulation circuitry and the PCI-Express bridge chip.
The backplate helps stabilize the card with the heavy cooler. It also cools the memory chips on that side of the card.
The card requires three 8-pin PCI-Express connectors. This power configuration is good for up to 525 W of power draw.
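The 525 W figure follows directly from the PCI-Express specification limits: 75 W from the x16 slot plus 150 W per 8-pin connector. A quick sanity check:

```python
# Power budget for the connector configuration described above:
# three 8-pin PCIe connectors plus the x16 slot, using the
# per-source limits from the PCI-Express specification.

PCIE_SLOT_W = 75    # x16 slot limit
EIGHT_PIN_W = 150   # per 8-pin (6+2) connector limit

budget_w = 3 * EIGHT_PIN_W + PCIE_SLOT_W
print(budget_w)  # → 525
```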
The dual BIOS feature of the HD 7900 series is also present, but instead of placing the switch near the CrossFire connector, PowerColor has placed it in the monitor output area. The second BIOS is a "Turbo" BIOS that uses clock frequencies of 1000 MHz GPU and 1375 MHz memory.
The GDDR5 memory chips are made by Hynix and carry the model number H5GQ2H24AFR-R0C. They are specified to run at 1500 MHz (6000 MHz GDDR5 effective).
We find two CHiL 8228 voltage controllers on the card, one for each GPU. The chip supports voltage control via I2C, provides comprehensive monitoring features, and is also well supported in most overclocking software today.
The PCI-Express bridge chip sits under a glued-on heatsink which I couldn't safely remove. I am quite certain that a PLX PEX8747 PCI-Express Gen 3 bridge chip is underneath. The second image is from our Computex coverage, showing that chip on the Devil 13.
AMD's Tahiti graphics processor introduced the GCN shader architecture. It is also the first GPU to be produced on a 28 nm process at TSMC. The transistor count is 4.31 billion.
Benchmark scores in other reviews are only comparable when this exact same configuration is used.
|Test System - VGA Rev. 21
|CPU ||Intel Core i7-3770K @ 4.7 GHz (Ivy Bridge, 8192 KB Cache)
|Motherboard ||ASUS Maximus V Gene
|Memory ||2x 4096 MB Corsair Vengeance PC3-12800 DDR3 @ 1600 MHz 9-9-9-24
|Harddisk ||WD Caviar Blue WD5000AAKS 500 GB
|Power Supply ||Antec HCP-1200 1200W
|Software ||Windows 7 64-bit Service Pack 1
|Drivers ||NVIDIA: 304.79 Beta; GTX 660 Ti: 305.37 Beta; ATI: Catalyst 12.7 Beta; HD 7990 Devil 13: 12.8
|Display ||LG Flatron W3000H 30" 2560x1600; 3x Hanns.G HL225DBB 21.5" 1920x1080
- All video card results were obtained on this exact system with exactly the same configuration.
- All games were set to their highest quality setting unless indicated otherwise.
- AA and AF are applied via in-game settings - not via the driver's control panel.
Each benchmark was tested at the following settings and resolution:
- 1280 x 800, 2x Anti-aliasing. Common resolution for most small flatscreens today (17" - 19"). A bit of eye candy turned on in the drivers.
- 1680 x 1050, 4x Anti-aliasing. Most common widescreen resolution on larger displays (19" - 22"). Very good looking driver graphics settings.
- 1920 x 1200, 4x Anti-aliasing. Typical widescreen resolution for large displays (22" - 26"). Very good looking driver graphics settings.
- 2560 x 1600, 4x Anti-aliasing. Highest possible resolution for commonly available displays (30"). Very good looking driver graphics settings.
- 5760 x 1080, 4x Anti-aliasing. Typical high-end gaming multi-monitor resolution. Very good looking driver graphics settings.
Alan Wake, released in 2012 for the PC, is a highly successful third-person horror shooter that revolves around the adventures of novelist Alan Wake, who has to battle the "darkness" that takes over living and dead things. Alan's signature flashlight is used to strip the forces of darkness of their protection, making them vulnerable to conventional weapons.
The engine of Alan Wake uses DirectX 9, but features complex lighting effects, making it a quite demanding title. We benchmarked with the highest settings possible.
Batman: Arkham City
Batman is back on the LCD screen with Batman: Arkham City, a sequel to Batman: Arkham Asylum, by Rocksteady Games and WB. It was released on the PC platform in November. Batman is imprisoned in Arkham City, an infamous district of the DC Universe that contains the scum of Gotham, most of whom Batman helped put in there. In order to get out, he must go through scores of baddies and encounter many of the iconic supervillains along the way - he's not entirely alone.
Batman: Arkham City uses the same Unreal Engine by Epic as Batman: Arkham Asylum does, but, thanks to the engine's modularity, it has been overhauled and outfitted with the latest technologies, including a graphics engine that takes advantage of DirectX 11.
Arguably one of the most anticipated online shooters in recent times, Battlefield 3 is the latest addition to one of the most engaging online multiplayer-shooter franchises. It combines infantry combat with mechanized warfare including transport vehicles, armored personnel carriers, main battle tanks, attack helicopters, and combat aircraft - pretty much everything that goes into today's battlefields. Infantry combat is coupled with role-playing elements which make the experience all the more engaging. It also has a single-player campaign that adds a few gigabytes to its installer.
Behind all this is a spanking new game engine by EA-DICE, Frostbite 2. It makes use of every possible feature DirectX 11 has to offer, including hardware tessellation and new lighting effects, to deliver some of the most captivating visuals gamers have ever had access to. Not playing this game on the PC is a grave injustice to what's in store. Faster PCs are rewarded with better visuals.
BattleForge, a card-based RTS, was developed by the German studio EA Phenomic. A few months after launch, the game was transformed into a Play 4 Free branded title. That move, and the fact that it was bundled with a large number of ATI cards, made it one of the more well-known RTS games of 2009. You, as a player, assemble your deck before the game to select the units that will be available. Elemental forces can be chosen from Fire, Frost, Nature, and Shadow to complement each other.
The BattleForge engine has full support for DX9, DX10, and DX10.1. We used the internal benchmark tool in DirectX 11 mode with the highest settings possible to obtain our results.
Sid Meier's Civilization V (or Civ 5 in common jargon) is the latest addition to the franchise of masterfully crafted turn-based strategy games that let you play God to a nascent civilization of your choice all the way up to the space age. Civilization V uses large, procedurally generated 3D worlds and takes advantage of the hardware tessellation features offered by DirectX 11 to greatly step up the complexity of cities, models, terrains, and objects. This generation of GPUs can be expected to handle the large texture loads that come with such eye candy.
After the tremendous success of Far Cry, the German game studio Crytek released their shooter Crysis in 2007. The game was by far the most hyped and anticipated game of 2007, and forums were full of "Can my system run Crysis?" threads because of its high hardware requirements. Just like in Far Cry, the plot evolves on a small island with a thick and richly detailed jungle world. A lot of attention has been given to small details like accurate physics: fire on a tree trunk, for example, and it will splinter, fall over, and leave a stump behind. Enemies in a car can be stopped by shooting out the car's tires. The game's graphics are, even today, top notch, yet the game still runs well on most computers.
Crysis 2 takes the player into an alien-infested New York City. The game adds a tactical-options mode that allows several ways to attack a heavily infested enemy location. The new Nanosuit 2.0 that the player uses offers more freedom in ability use; multiple abilities can, for example, be used at the same time. To better accommodate a given play style, weapons can be customized with silencers, laser sights, or even a sniper scope.
For rendering, Crytek's CryEngine 3 is used, which comes with reduced system requirements compared to the first Crysis game. Since Crysis 2 is a multi-platform game with major development focus on the consoles, the graphics on launch day were DirectX 9 only; DirectX 11 functionality was added later in a patch. We used the DX11 version and the high-res texture pack for our benchmarking.
Blizzard's Diablo 3 is the latest release in one of the most popular action-RPG series of all time. You, the hero, will experience epic adventures on your journey to defeat Diablo, the master of Hell. Diablo 3 set the record for the fastest-selling PC game, selling over 3.5 million copies on the first day of its release. It was also the most pre-ordered game on Amazon.
Blizzard's DirectX 9 engine provides the player with an isometric view of the action. The game has been tuned to run well on most computer systems to let as many players as possible experience the game. We tested Diablo 3 running at the highest image-quality settings.
Dragon Age II
Dragon Age II is the second game in BioWare's Dragon Age franchise and was released in March 2011. You pick your hero, named Hawke, from several classes and grow him over the course of the adventure. Gameplay takes you through a linearly narrated story of Hawke's rise to become the legendary "Champion of Kirkwall".
BioWare's Lycium Engine has support for DirectX 11, using tessellation, advanced dynamic lighting, and camera effects like depth of field. We benchmarked the DX11 version with details set to the highest possible.
Developed by Flying Wild Hog, a studio that prides itself on the fact that its creation is PC exclusive (bless them), Hard Reset is a first-person shooter set in the cyberpunk future of a dystopian world. It reintroduces many of the gameplay mechanics that made classics such as Quake wickedly fun to play and that are sorely lacking in today's tactical military shooters, thus creating a 'void' for Flying Wild Hog to fill.
The game uses the studio's in-house Road Hog Engine, which isn't particularly heavy on new-generation DirectX features, but can still be taxing for some GPUs.
Max Payne 3
Max is back! The long-anticipated third release in the Max Payne series is the first game in the franchise developed by Rockstar, which took over the title from Remedy Entertainment. In this third-person shooter with an over-the-shoulder camera view, you battle the bad guys using game-changing features like Bullet Time or Last Stand. The maps feature scenic locations, taking the player to places like New York, Sao Paulo, and Panama.
The Max Payne 3 game engine uses DirectX 11 with tessellation and very detailed textures. We tested the game with details set to the maximum possible.
Metro 2033 is a first-person shooter set in a post-apocalyptic Moscow - inside the metro system, as the name suggests. You will fight mutants or other humans who want to take away your shelter. The game has many gameplay elements similar to STALKER; the engine also has similar features. This is because two STALKER engine programmers left GSC Game World and started their own company, which made Metro 2033.
The engine supports all the latest eye candy like DirectX 11 and tessellation. Unfortunately, performance leaves a less than satisfactory impression, making the game a candidate to surpass Crysis for the highest hardware requirements. We tested the game in DirectX 11 mode with details set to "Very High".
Sniper Elite V2
Sniper Elite V2 is a tactical shooter letting you play through the Battle of Berlin during early May 1945. You are an American elite sharpshooter placed behind enemy lines to stop the German V-2 rocket program. Gameplay does not only focus on full-frontal assault, but also requires elements of stealth and patience to gain the upper hand. Sniper Elite V2 features a complex ballistics simulation, forcing players to account for factors including gravity, wind, velocity, bullet penetration, and aim stability.
Sniper Elite V2 uses DirectX 11, including tessellation, contact-hardening shadows, and DirectCompute-based effects, including anti-aliasing.
For our testing, we used the Sniper Elite V2 benchmark tool in DX11 mode with the highest settings and supersampling disabled.
STALKER: Call of Pripyat
STALKER: Call of Pripyat continues shortly after the events of its prequel, STALKER: Shadow of Chernobyl. The player is one of many stalkers attracted to the Zone in hopes of finding fame, wealth, and artifacts. Over the course of the game, you meet Strelok, the protagonist of the first STALKER game, and team up with him to progress through the Zone.
An updated X-Ray Engine 1.6 powers the game with support for DirectX 11 using DirectCompute Shaders to improve shadow rendering and tessellation to improve model quality.
StarCraft II, released in July 2010, is a sequel to Blizzard's award-winning strategy game StarCraft. In the 26th century, three species, namely, the Terrans, Protoss, and Zerg, are at war. The campaign takes you through many missions on different planets where you have to face various enemy factions or, sometimes, several of them at once. StarCraft II features a similar number of units - some of them new - as the original game. Due to the massive success of the first game, Blizzard chose to focus a large aspect of the game on multiplayer combat through Battle.net. The campaign serves as a good introduction to units and concepts - the real action is in competitive multiplayer combat.
The StarCraft II engine supports only DirectX 9, but several patches have improved rendering quality and available options considerably. We tested the game using a recorded 1 vs. 1 multiplayer replay in the late-game phase. Please note that StarCraft II is very CPU limited on high-end cards, especially at lower resolutions, so you may not see much scaling between some cards. StarCraft II does not support multi-monitor gaming because it would provide an unfair advantage in competitive multiplayer, as a larger portion of the map would be visible.
Total War: Shogun 2
Set in 16th-century feudal Japan, Total War: Shogun 2 takes the player on a quest for domination to conquer and unite the warlords of Japan. Moving away from the European setting of previous Total War games, the game is now designed around the principles of the brilliant Chinese general Sun Tzu and his book The Art of War. Gameplay switches between real-time battles, during which units on the battlefield are controlled, and turn-based strategy, which focuses on diplomacy, economy, and production management. Taking control of a castle involves several different stages, which adds more complexity to the warfare in Shogun 2.
We benchmarked using the highest settings possible in DirectX 11 mode, which was added via a patch after release.
The Elder Scrolls: Skyrim
This isn't just a game - it's a masterpiece; a very large sandbox game that rejects the usual trade-off between quality and quantity. By genre, TES: Skyrim is a role-playing game; it combines some of the best elements of older titles in the franchise with some new sandbox elements to churn out an extremely engaging and addictive game. It makes use of Bethesda's Creation Engine, which isn't visually intensive in the sense that it doesn't use taxing graphics features. Instead, the game's presentation itself, with large open worlds, ends up taxing your hardware. Faster GPUs result in smoother gameplay with most eye candy turned on.
3DMark 11 is the very latest benchmark from the house of Futuremark, which has produced some of the most comprehensive benchmark applications for PC enthusiasts and gamers. 3DMark 11, as the name suggests, makes use of the Microsoft DirectX 11 API and exploits every feature it offers to create astonishingly realistic visuals. In the process, it evaluates DirectX 11-compliant GPUs and lets gamers know what to expect, in terms of visual realism, from upcoming games that make use of the API. The tessellation and depth-of-field tests are of particular interest here. 3DMark 11 has no proper support for multi-monitor configurations.
Unigine Heaven 2.0
Unigine Heaven was one of the first demos that supported DirectX 11. Heaven is a technology demonstration for the Unigine engine which supports DirectX 9 through 11 and OpenGL. Version 2.0 adds more scenes and, optionally, more complex tessellation features. Although there is some controversy surrounding the benchmark and as to whether it is an accurate representation of what to expect from future games in regard to DirectX 11, we still decided to use this test to get an insight into the potential of future gaming.
Cooling modern video cards is becoming more and more difficult, especially with users asking for quiet cooling solutions. That's why engineers are now paying much more attention to the power consumption of new video card designs. An optimized fan profile is also one of the few things that board vendors can do to impress with reference designs where they are prohibited to make changes to the thermal solution or components on the card.
For this test, we measured the power consumption of the graphics card only, via the PCI-Express power connector(s) and PCI-Express bus slot. A Keithley Integra 2700 digital multimeter with 6.5-digit resolution was used for all measurements. Again, the values here reflect only the power consumption of the card measured at DC VGA card inputs, not of the whole system.
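As a sketch of the arithmetic behind such DC-side measurements, card-only draw is simply the sum of voltage times current over every input rail; all readings below are invented for illustration, not actual review data.

```python
# Card-only power draw derived from per-rail DC measurements: each
# input's voltage and current are multiplied and the products summed.
# Rail names and readings are hypothetical illustration values.

readings = [
    # (volts, amps)
    (12.0, 4.1),    # PCIe slot 12 V
    (3.3, 0.9),     # PCIe slot 3.3 V
    (12.1, 10.5),   # 8-pin connector #1
    (12.1, 10.2),   # 8-pin connector #2
    (12.1, 9.8),    # 8-pin connector #3
]

total_w = sum(v * a for v, a in readings)
print(f"{total_w:.1f} W")
```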
We chose Crysis 2 as a standard test representing typical 3D gaming usage because it offers the following: very high power draw; high repeatability; it is a current game that is supported on all cards because of its DirectX 9 roots; drivers are actively tested and optimized for it; it supports all multi-GPU configurations; the test runs in a relatively short time and renders a non-static scene with variable complexity.
Our results were based on the following tests:
- Idle: Windows 7 Aero sitting at the desktop (1280x1024, 32-bit) with all windows closed and drivers installed. Card left to warm up in idle mode until power draw was stable.
- Multi-monitor: Two monitors connected to the tested card, both using different display timings. Windows 7 Aero sitting at the desktop (1280x1024 32-bit) with all windows closed and drivers installed. Card left to warm up in idle mode until power draw was stable.
- Average: Crysis 2 at 1920x1200, Extreme profile, representing a typical gaming power draw. Average of all readings (12 per second) while the benchmark was rendering (no title/loading screen).
- Peak: Crysis 2 at 1920x1200, Extreme profile, representing a typical gaming power draw. Highest single reading during the test.
- Maximum: Furmark Stability Test at 1280x1024, 0xAA. This results in a very high non-game power consumption that can typically be reached only with stress-testing applications. The card was left running the stress test until power draw converged to a stable value. On cards with power-limiting systems, we disabled the power limiter or configured it, if possible, to the highest available setting. We also used the highest single reading from a Furmark run, obtained by taking measurements faster than the power limit could kick in.
- Blu-ray Playback: Power DVD 9 Ultra was used at a resolution of 1920x1200 to play back the Batman: The Dark Knight disc with GPU acceleration turned on. Playback started around timecode 1:19 which has the highest data rates on the BD with up to 40 Mb/s. Playback was left running until power draw converged to a stable value.
We knew that building a dual-Tahiti board would be difficult, but just how difficult becomes clear on this page. In Furmark, PowerColor's Devil 13 consumes 551 W of power, a new record. During typical gaming, power consumption is very high as well, hovering around the 300 W mark. NVIDIA's GTX 690 does much better here.
We see some higher power consumption in non-gaming states as well, which is surprising because ULPS should put the second GPU to sleep outside of gaming states.
In past years, users would accept almost anything to get more performance. Nowadays, this has changed, and users have become more aware of the fan noise and power consumption of their graphics cards.
In order to properly test the fan noise that a card emits, we use the Bruel & Kjaer 2236 sound level meter (~$4,000) which has the measurement range and accuracy we are looking for.
The tested graphics card was installed in a system that was completely passively cooled. That is, passive PSU, passive CPU cooler, and passive cooling on the motherboard and on a solid state drive.
This setup allows us to eliminate secondary noise sources and test only the video card. To be more compliant with standards like DIN 45635 (we are not claiming to be fully DIN 45635 certified), the measurement was conducted at a distance of 100 cm and at 160 cm above the floor. The ambient background noise level in the room was well below 20 dBA for all measurements. Please note that the dBA scale is not linear but logarithmic: 40 dBA is not twice as loud as 20 dBA, and a 3 dBA increase results in double the sound pressure. Human hearing perceives this a bit differently; it is generally accepted that a 10 dBA increase doubles the perceived sound level. The 3D load noise levels were tested with a stressful game, not with Furmark.
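The logarithmic relationships above can be sketched numerically; `pressure_ratio` and `perceived_ratio` are hypothetical helper names, not part of any measurement tool.

```python
# dBA arithmetic as described above: a +3 dB step roughly doubles
# sound power, while roughly +10 dB is needed to double perceived
# loudness. Helper names are illustrative only.

def pressure_ratio(delta_db: float) -> float:
    """Ratio of sound power for a given dB difference."""
    return 10 ** (delta_db / 10)

def perceived_ratio(delta_db: float) -> float:
    """Approximate ratio of perceived loudness (doubles every 10 dB)."""
    return 2 ** (delta_db / 10)

# +3 dB: sound power roughly doubles
print(round(pressure_ratio(3), 2))              # → 2.0
# 20 dBA -> 40 dBA: 100x the sound power, but only ~4x perceived loudness
print(pressure_ratio(20), perceived_ratio(20))  # → 100.0 4.0
```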
All data on this page was collected after we fixed the cooler's mounting pressure by adding extra washers, as detailed on page 1. Without that change, the card would run at 100% fan speed immediately after starting a game and then crash. The numbers on this page reflect what would have been possible had PowerColor's engineers designed a proper mounting system.
In idle, fan noise is clearly too high. It shouldn't be as noisy since the card emits very little heat in idle; better fan settings would have been no problem.
Under load, the card has to deal with the huge heat output of the two Tahiti GPUs; 49 dBA is quite a decent result compared to the HD 7970 and the HD 7970 GHz Edition. NVIDIA's GTX 690 is much quieter, though.
The graphs on this page show a combined performance summary of all tests and resolutions from previous pages. Each graph shows the tested card as 100% and all other cards' performance as relative to it. A sixth graph summarizes all tests in all resolutions to calculate the total relative performance of the review sample.
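As a rough sketch of that normalization (the review's exact aggregation across tests isn't specified here, and all numbers below are made up):

```python
# Relative-performance normalization: the reviewed card is pinned to
# 100% and every other card is expressed relative to it. The per-card
# aggregate scores below are hypothetical, not actual review data.

scores = {
    "Reviewed card": 812.0,   # made-up aggregate benchmark score
    "Card A": 901.0,
    "Card B": 406.0,
}

baseline = scores["Reviewed card"]
relative = {name: 100.0 * s / baseline for name, s in scores.items()}

for name, pct in relative.items():
    print(f"{name}: {pct:.1f}%")
# The reviewed card always comes out at exactly 100.0%
```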
Performance per Watt
Using the relative performance scores from the previous page, and the typical gaming power consumption result, the following graphs show the efficiency of the cards in our test group.
Performance per Dollar
You will love this graph if you are looking for the best bang for the buck. We looked up the current USD price of each card on the popular online shop Newegg and used that value, together with the relative performance numbers, to calculate the Performance per Dollar index.
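A minimal sketch of that index, using the relative-performance convention from the previous page; the price and performance numbers below are hypothetical examples, not review data.

```python
# Performance per Dollar index: relative performance divided by street
# price. Inputs here are hypothetical illustration values.

def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Performance points per dollar spent (higher is better)."""
    return relative_perf / price_usd

# e.g. a card at 100% relative performance and a $999 street price
index = perf_per_dollar(100.0, 999.0)
print(f"{index:.4f} %/$")
```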
The overclocking results listed in this section were achieved with the default fan and voltage settings as defined in the VGA BIOS. Please note that every single sample overclocks differently which is why our results here can only serve as a guideline for what you can expect from your card.
Overclocking with the normal 925 MHz BIOS did not work. The second GPU was stuck at 925 MHz all the time, even with ULPS disabled. Switching to the 1000 MHz turbo BIOS fixed that, and all our OC results are with that BIOS.
The maximum stable clocks of our card are 1085 MHz core (9% overclock) and 1740 MHz memory (27% overclock).
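The quoted percentages follow from the Turbo BIOS baseline of 1000 MHz core and 1375 MHz memory; a quick sanity check, not part of the review's tooling:

```python
# Overclock percentages for the clocks quoted above, measured against
# the 1000 MHz / 1375 MHz Turbo BIOS baseline.

def oc_percent(new_clock: float, stock_clock: float) -> float:
    """Overclock expressed as a percentage over stock."""
    return (new_clock / stock_clock - 1.0) * 100.0

core_oc = oc_percent(1085, 1000)   # 8.5%, matching the ~9% quoted
mem_oc = oc_percent(1740, 1375)    # ~26.5%, matching the ~27% quoted
print(f"core +{core_oc:.1f}%, memory +{mem_oc:.1f}%")
```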
Its overclocking potential is quite nice, but wouldn't have been realized had we not fixed the cooler's insufficient mounting pressure.
Important: Each GPU (including each GPU of the same make and model) will overclock slightly differently based on random production variances. This table just serves to provide a list of typical overclocks for similar cards, determined during TPU review.
|Maximum Overclock Comparison ||Max. GPU Clock ||Max. Memory Clock ||Max. OC Perf.
|PowerColor HD 7990 Devil 13
|HD 7970 GHz
Using these clock frequencies, we ran a quick test of Battlefield 3
to evaluate the gains from overclocking.
Actual 3D performance gained from overclocking is 16.7%.
After we fixed the mounting pressure of the cooler, temperatures are quite decent. PowerColor told us that the typical load temperatures they've seen are 80°C, so our little modification improves temperatures by roughly 10°C!
Important: GPU temperature will vary depending on clock speed, voltage settings, cooler design, and production variances. This table just serves to provide a list of typical temperatures for similar cards, determined during TPU review.
|GPU Temperature Comparison ||Idle ||Load
|PowerColor HD 7990 Devil 13 ||38°C / 37°C ||71°C / 65°C
|HD 7970 GHz ||34°C / 35°C ||80°C / 81°C
Modern graphics cards have several clock profiles that are selected to balance power draw and performance requirements.
The following table lists the clock settings for important performance scenarios and the GPU voltage that we measured. We performed the measurement on the pins of a coil or a capacitor near the GPU voltage regulator.
|CCC Overdrive Limits
Value and Conclusion
- According to PowerColor, the HD 7990 Devil 13 will retail for $999.
- High performance
- Awesome screwdriver kit included
- Dual BIOS
- Good accessory kit
- Good OC potential
- Low temperatures (after fixing cooler mounting)
- Native full-size HDMI output with included audio
- Support for PCI-Express 3.0 and DirectX 11.1
- AMD ZeroCore power for reduced power consumption
- Major engineering fail with cooler mounting system
- High price
- High power draw
- Based on CrossFire technology, needs driver support to perform best
- Noisy in idle
- Triple slot design
||This whole review was conducted under the assumption that PowerColor would fix the bad contact between cooler and GPU that was present on my sample (check page 1). Let's hope that the changes by PowerColor will provide similar cooling performance, otherwise large portions of this review will be obsolete.|
PowerColor's HD 7990 Devil 13 is the first dual-GPU HD 7990 card that we have reviewed. I have to praise PowerColor for making the bold move of engineering their card without any help from AMD. AMD's own HD 7990 is still MIA and might never be released at all. The HD 7990 Devil 13 provides awesome performance in games that properly support CrossFire, but many games don't show ideal scaling, or any scaling at all. AMD's CrossFire technology requires profiles for each game to provide maximum performance, which is often a problem with newly released games, as AMD is slower than NVIDIA at updating their profiles. Compared to NVIDIA's dual-GPU GeForce GTX 690, the PowerColor Devil 13 is 11% behind in performance when averaged over all our testing.
PowerColor's card also needs a ton of power. The card alone needs up to 550W (in Furmark). During typical gaming, we see power draw in the 260-300 W range, which is similar to what you can expect from two separate HD 7970s in CrossFire. NVIDIA's GTX 690 does better here as well, offering much better performance per Watt.
All this power is converted into heat, which means that the cooler has to work extra hard. PowerColor does use a powerful triple-slot, triple-fan cooler with ten heatpipes total, and it does, after fixing the mounting pressure, a good job keeping the card cool. Temperatures under load are 70°C; PowerColor mentions 80°C from their own testing. Fan noise in idle is pretty high, but load noise is pretty good, given the card's performance class, when compared to AMD's cards. NVIDIA's GTX 690 is much quieter though, despite using a dual-slot cooler.
What really stands out is the epic package that comes with tons of goodies. The included DisplayPort to DVI adapters will make setting up a triple-monitor Eyefinity configuration a breeze. My personal favorite is the included Wiha screwdriver kit. I've bought many of these for my review and computer work; they are indestructible and will get any screw out easily.
Taming the card for overclocking was not easy as you have to disable AMD's Ultra Low Power State (ULPS) for it to work properly. We also had to switch to PowerColor's Turbo BIOS as the normal BIOS would not let us change the clocks of the second GPU. In the end we reached a pretty impressive 1085 MHz GPU clock, which is higher than the original HD 7970 reference design. After manual overclocking the Devil 13 gained almost 17% in performance compared to the normal BIOS. Pretty impressive!
PowerColor is asking $999 for their card, which is the same as NVIDIA's GTX 690. I find it hard to recommend the PowerColor card at that price point. NVIDIA's card provides the better overall experience: lower power draw, noise, and heat, better maximum OC performance, a dual-slot form factor, and better drivers. On the other hand, PowerColor's card will let you tweak voltages and overclock as much as you want, unlike NVIDIA's latest generation. This makes the Devil 13 very interesting for hardcore overclockers on their quest to break records.
Update: PowerColor just told us that they are working on the mounting issue and halted all shipments of production boards to investigate. No retail cards should be affected.