NVIDIA's GeForce GTX 680 was introduced earlier this year and has since taken the enthusiast market by storm. It offers great performance at significantly reduced power consumption. NVIDIA's dynamic overclocking algorithm works well and provides a nice speed boost for the card.
The KFA² GTX 680 LTD OC is a unique GTX 680 design. It uses a sexy white PCB and a large triple-fan thermal solution. Its output configuration is unique, too, with three mini-HDMI outputs and one full-size DisplayPort connector. The GPU clock has been increased, while memory remains at NVIDIA's reference frequency.
GeForce GTX 680 Market Segment Analysis
(Only fragments of the market segment comparison table survive. Among the compared cards is the HD 7970; the remaining columns, apparently dual-GPU cards, list memory sizes of 2x 2048 MB, 2x 1536 MB, and 2x 2048 MB with memory bus widths of 2x 256 bit, 2x 384 bit, and 2x 256 bit.)
You will receive:
- Graphics card
- Driver CD + Documentation
- 2x PCI-Express power cable
- HDMI to DVI adapter (not pictured)
- Mini-HDMI to HDMI adapter (not pictured)
The GTX 680 LTD OC uses a white PCB, which looks kinda sexy. Unfortunately, the black cooler, which covers the whole front, doesn't let you see much of the PCB. On the other hand, the black cooler does provide a color contrast; it is up to you to decide whether you like it.
The card requires two slots in your system.
Display connectivity options include one full-size DisplayPort and three mini-HDMI ports. You may use all the outputs at the same time. Please note that using a dual-link DVI display (1920x1200 and up) requires that you buy an active dual-link DisplayPort-to-DVI adapter. A passive adapter or any HDMI adapter will only work up to 1920x1080.
An HDMI sound device is included in the GPU, too. It is HDMI 1.4a compatible, which includes HD audio and support for Blu-ray 3D movies. The DisplayPort output is version 1.2, which enables the use of hubs and Multi-Stream Transport.
You may combine up to four GTX 680 cards from any vendor in a multi-GPU SLI configuration for higher frame rates or better image quality settings.
Pictured above are the front and back of the disassembled board; high-res versions are also available. If you choose to use these images for voltmods etc., please include a link back to this site or let us post your article.
A Closer Look
KFA²'s cooler uses a large copper base with five heatpipes to transport heat away from the GPU surface.
With the main heatsink removed, you can see two metal plates on the card: one covers the memory chips, the other the voltage regulation circuitry.
The card requires two 8-pin PCI-Express power cables for operation. This power configuration is good for up to 375 W of power draw.
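The 375 W figure follows from simple addition: the PCI-Express slot supplies up to 75 W and each 8-pin connector up to 150 W. A minimal sketch of that arithmetic (the function name is illustrative):

```python
# Rated power limits per the PCI-Express specification.
PCIE_SLOT_W = 75    # power delivered through the slot itself
EIGHT_PIN_W = 150   # per 8-pin auxiliary connector
SIX_PIN_W = 75      # per 6-pin auxiliary connector

def power_budget(num_8pin: int, num_6pin: int = 0) -> int:
    """Return the rated power budget in watts for a given connector setup."""
    return PCIE_SLOT_W + num_8pin * EIGHT_PIN_W + num_6pin * SIX_PIN_W

print(power_budget(num_8pin=2))  # slot + 2x 8-pin = 375 W
```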
For voltage control, the card uses a CHiL CHL8318 which is a brand-new model, providing software voltage control and extensive monitoring features. It is also used on cards like the MSI GTX 680 Lightning and ZOTAC GTX 680 Extreme Edition. Unfortunately, NVIDIA does not allow board vendors to expose software voltage control on Kepler graphics cards.
The GDDR5 memory chips are made by Hynix and carry the model number H5GQ2H24MFR-R0C. They are specified to run at 1500 MHz (6000 MHz GDDR5 effective).
NVIDIA's GK104 graphics processor introduced the company's brand-new Kepler architecture. It is NVIDIA's first chip to be produced on a 28 nm process, at TSMC in Taiwan. The transistor count is 3.54 billion.
Benchmark scores in other reviews are only comparable when this exact same configuration is used.
|Test System - VGA Rev. 17
|CPU:|Intel Core i7-3770K @ 4.7 GHz
(Ivy Bridge, 8192 KB Cache)
|Motherboard:|ASUS Maximus V Gene
|Memory:|2x 4096 MB Corsair Vengeance PC3-12800 DDR3
@ 1600 MHz 9-9-9-24
|Harddisk:|WD Caviar Blue WD5000AAKS 500 GB
|Power Supply:|Antec HCP-1200 1200W
|Software:|Windows 7 64-bit Service Pack 1
|Drivers:|NVIDIA: 304.79 Beta
ATI: Catalyst 12.7 Beta
|Display:|LG Flatron W3000H 30" 2560x1600
3x Hanns.G HL225DBB 21.5" 1920x1080
- All video card results were obtained on this exact system with exactly the same configuration.
- All games were set to their highest quality setting unless indicated otherwise.
- AA and AF are applied via in-game settings, not via the driver's control panel.
Each benchmark was tested at the following settings and resolutions:
- 1280 x 800, 2x Anti-aliasing. Common resolution for most smaller flatscreens today (17" - 19"). A bit of eye candy turned on in the drivers.
- 1680 x 1050, 4x Anti-aliasing. Most common widescreen resolution on larger displays (19" - 22"). Very good looking driver graphics settings.
- 1920 x 1200, 4x Anti-aliasing. Typical widescreen resolution for large displays (22" - 26"). Very good looking driver graphics settings.
- 2560 x 1600, 4x Anti-aliasing. Highest possible resolution for commonly available displays (30"). Very good looking driver graphics settings.
- 5760 x 1080, 4x Anti-aliasing. Typical high-end gaming multi-monitor resolution. Very good looking driver graphics settings.
Alan Wake, released in 2012 for the PC, is a highly successful third-person horror shooter that revolves around the adventures of novelist Alan Wake, who has to battle the "darkness" that takes over living and dead things. Alan's signature flashlight is used to strip the forces of darkness of their protection, making them vulnerable to conventional weapons.
The engine of Alan Wake uses DirectX 9, but features complex lighting effects, which makes it quite a demanding title. We benchmarked with the highest settings possible.
Batman: Arkham City
Batman is back on the LCD screen with Batman: Arkham City, a sequel to Batman: Arkham Asylum, by Rocksteady Games and WB. It was released on the PC platform in November. Batman is imprisoned in Arkham City, an infamous district of the DC Universe that contains the scum of Gotham, most of whom Batman helped put there. In order to get out, he must fight through scores of baddies and encounter many of the iconic supervillains along the way. He's not entirely alone.
Batman: Arkham City uses the same Unreal Engine by Epic as Batman: Arkham Asylum, but thanks to the engine's modularity, it has been overhauled and outfitted with the latest technologies, including a graphics engine that takes advantage of DirectX 11.
Arguably one of the most anticipated online shooters in recent times, Battlefield 3 is the latest addition to one of the most engaging online multiplayer shooter franchises. It combines infantry combat with mechanized warfare, including transport vehicles, armored personnel carriers, main battle tanks, attack helicopters, and combat aircraft: pretty much everything that goes into today's battlefields. The infantry combat is coupled with role-playing elements, which makes the experience all the more engaging. It also has a single-player campaign, which added a few gigabytes to the installer.
Behind all this is a spanking new game engine by EA-DICE, Frostbite 2. It makes use of every possible feature DirectX 11 has to offer, including hardware tessellation, and new lighting effects, to deliver some of the most captivating visuals gamers have ever had access to. Not playing this game on the PC is a grave injustice to what's in store. Faster PCs are rewarded with better visuals.
BattleForge, a card-based RTS, was developed by the German studio EA Phenomic. A few months after launch, the game was transformed into a Play 4 Free branded game. That move, and the fact that it was bundled with a large number of ATI cards, made it one of the better-known RTS games of 2009. As a player, you assemble your deck before the game to select the units that will be available; elemental forces can be chosen from Fire, Frost, Nature, and Shadow to complement each other.
The BattleForge engine has full support for DX9, DX10, DX10.1, and DX11. We used the internal benchmark tool in DirectX 11 mode with the highest settings possible to obtain our results.
Sid Meier's Civilization V (or Civ 5 in common jargon) is the latest addition to the franchise of masterfully crafted turn-based strategy games that let you play God to a nascent civilization of your choice, all the way up to the space age. Civilization V uses large, procedurally generated 3D worlds and takes advantage of the hardware tessellation features offered by DirectX 11 to considerably step up the complexity of cities, models, terrain, and objects. This generation of GPUs is also expected to handle the larger texture loads that come with the eye candy.
After the tremendous success of Far Cry, the German game studio Crytek released their latest shooter, Crysis, in 2007. The game was by far the most hyped and anticipated game of 2007, and forums were full of "Can my system run Crysis?" threads because of its high hardware requirements. Just like in Far Cry, the plot unfolds on a small island with a thick, richly detailed jungle world. A lot of attention has been given to small details like accurate physics: fire at a tree trunk and it will shatter, the tree falling over and leaving a stump behind; enemies in a car can be stopped by shooting out the tires. The game's graphics are top notch even today, yet the game still runs well on most computers.
Crysis 2 takes the player into an alien-infested New York City. The game adds a tactical options mode that allows several ways to attack a heavily infested enemy location. The new Nanosuit 2.0 that the player uses offers more freedom in ability use; for example, multiple abilities can be used at the same time. To better accommodate a given play style, weapons can be customized with silencers, laser sights, or even a sniper scope.
For rendering, Crytek's CryEngine 3 is used, which comes with reduced system requirements compared to the first Crysis game. Since Crysis 2 is a multi-platform game with a major development focus on consoles, the graphics on launch day were DirectX 9 only; DirectX 11 functionality was added later in a patch. We used the DX11 version and the high-res texture pack for our benchmarking.
Blizzard's Diablo 3 is the latest release in one of the most popular action RPG series of all time. You, the hero, will experience epic adventures on your journey to defeat Diablo, the master of Hell. Diablo 3 set the record for the fastest-selling PC game, selling over 3.5 million copies on the first day of its release; it was also the most pre-ordered game on Amazon.
Blizzard's DirectX 9 engine provides the player with an isometric view of the action. The game has been tuned to run well on most computer systems, to let as many players as possible experience it. We tested Diablo 3 running at the highest image quality settings.
Dragon Age II
Dragon Age II is the second game in BioWare's Dragon Age franchise and was released in March 2011. As the player, named Hawke, you can pick your hero from several classes and grow him over the course of the adventure. Gameplay takes you through a linearly narrated story of Hawke's rise to become the legendary "Champion of Kirkwall".
BioWare's Lycium Engine supports DirectX 11, using tessellation, advanced dynamic lighting, and camera effects like depth of field. We benchmarked the DX11 version with details set to the highest possible.
Developed by Flying Wild Hog, a studio that prides itself on the fact that its creation is PC exclusive (bless them), Hard Reset is a first-person shooter set in the cyberpunk future of a dystopian world. It reintroduces many of the gameplay mechanics that made classics such as Quake wickedly fun to play and that are sorely lacking in today's tactical military shooters, creating a 'void' for Flying Wild Hog to fill.
The game uses the studio's in-house Road Hog Engine, which isn't particularly heavy on new-generation DirectX features, but can still be taxing for some GPUs.
Max Payne 3
Max is back! The long-anticipated third release in the Max Payne series is the first game developed by Rockstar, which took over the title from Remedy Entertainment. In this third-person shooter with an over-the-shoulder camera view, you battle the bad guys using game-changing features like Bullet Time or Last Stand. The maps have scenic locations, taking the player to places like New York, Sao Paulo, and Panama.
The Max Payne 3 game engine uses DirectX 11 with tessellation and very detailed textures. We tested the game with details set to the maximum possible.
Metro 2033 is a first-person shooter set in a post-apocalyptic Moscow - inside the metro system, as the name suggests. You will fight mutants or other humans who want to take away your shelter. The game has many gameplay elements similar to STALKER, and the engine has similar features as well; this is because two STALKER engine programmers left GSC Game World and started their own company, which is now making Metro 2033.
The engine supports all the latest eye candy like DirectX 11 and tessellation. Performance, unfortunately, leaves a less than satisfactory impression, making the game a candidate to surpass Crysis for the highest hardware requirements. We tested the game in DirectX 11 mode with details set to "Very High".
Sniper Elite V2
Sniper Elite V2 is a tactical shooter that lets you play the Battle of Berlin during early May 1945. You are an American elite sharpshooter placed behind enemy lines to stop the German V-2 rocket program. Gameplay does not only focus on full frontal assault, but also requires elements of stealth and patience to gain the upper hand. Sniper Elite V2 features a complex ballistics simulation, forcing players to account for factors including gravity, wind, velocity, bullet penetration, and aim stability.
Sniper Elite V2 uses DirectX 11, including tessellation, contact-hardening shadows, and DirectCompute-based effects, including anti-aliasing.
For our testing, we used the Sniper Elite V2 benchmark tool in DX11 mode, with the highest settings and supersampling disabled.
STALKER: Call of Pripyat
STALKER: Call of Pripyat continues shortly after the events of the previous game, STALKER: Shadow of Chernobyl. The player is one of many stalkers attracted to the Zone in the hope of finding fame, wealth, and artifacts. Over the course of the game, you meet Strelok, the protagonist of the first STALKER game, and team up with him to progress through the Zone.
An updated X-Ray Engine 1.6 powers the game with support for DirectX 11 using compute shaders to improve shadow rendering and tessellation to improve model quality.
StarCraft II, released in July 2010, is the sequel to Blizzard's award-winning strategy game StarCraft. In the 26th century, three species, namely the Terrans, Protoss, and Zerg, are at war. The campaign takes you through many missions on different planets where you have to face the various enemy factions, sometimes several of them at once. StarCraft II features a similar number of units as the original game, some of them new. Due to the massive success of the first game, Blizzard chose to focus a large aspect of the game on multiplayer combat through Battle.net; the campaign serves as a good introduction to units and concepts, but competitive multiplayer is where the action is.
The StarCraft II engine supports only DirectX 9, but several patches have improved rendering quality and the available options considerably. We tested the game using a recorded 1 vs. 1 multiplayer replay in the late game phase. Please note that StarCraft II is very CPU limited on high-end cards, especially at lower resolutions, so you may not see much scaling between some cards. StarCraft II does not support multi-monitor gaming because it would provide an unfair advantage in competitive multiplayer, as a larger portion of the map would be visible.
Total War: Shogun 2
Set in 16th-century feudal Japan, Total War: Shogun 2 takes the player on a quest for domination, to conquer and unite the warlords of Japan. Moving away from the European setting of previous Total War games, the game is now designed around the principles of the brilliant Chinese general Sun Tzu and his book The Art of War. Gameplay switches between real-time battles, during which units on the battlefield are controlled, and turn-based strategy, which focuses on diplomacy, economy, and production management. Taking control of a castle involves several different stages, which adds more complexity to warfare.
We benchmarked using the highest settings possible in DirectX 11 mode, which was added via patch after release.
The Elder Scrolls: Skyrim
This isn't just a game; it's a masterpiece: a very large sandbox game that rejects the usual inverse proportionality between quality and quantity. By genre, TES: Skyrim is a role-playing game; it combines some of the best elements of older titles in the franchise with new sandbox elements to churn out an extremely engaging and addictive game. It makes use of Bethesda's Creation Engine, which isn't visually intensive in the sense of using taxing graphics features, but the game's presentation itself, with its large open worlds, ends up taxing your hardware. Faster GPUs result in smoother gameplay with most eye candy turned on.
World of Warcraft: Cataclysm
World of Warcraft is the most successful massively multiplayer online game in the world, with well over 12 million monthly subscribers. The game centers around the epic battle between the Horde and Alliance factions, with many other races getting involved in a long and complex storyline. Although World of Warcraft was released in 2004, Blizzard has continually added incremental improvements to the graphics, especially with new expansions. One key success factor of World of Warcraft is that it runs on a large number of slower systems while, at the same time, delivering a decent graphics experience on high-end systems. We tested the game in DirectX 11 mode with details set to "Ultra".
3DMark 11 is the latest benchmark from the house of Futuremark, which has produced some of the most comprehensive benchmark applications for PC enthusiasts and gamers. 3DMark 11, as the name suggests, makes use of the Microsoft DirectX 11 API and puts every one of its features to use, creating astonishingly realistic visuals. In the process, it evaluates DirectX 11-compliant GPUs and lets gamers know what to expect, in terms of visual realism, from upcoming games that use the API. The tessellation and depth-of-field tests are of particular interest here. 3DMark 11 has no proper support for multi-monitor configurations.
Unigine Heaven 2.0
Unigine Heaven was one of the first demos that supported DirectX 11. Heaven is a technology demonstration for the Unigine engine which supports DirectX 9 through 11 and OpenGL, too. Version 2.0 adds more scenes and, optionally, more complex tessellation features. Although there is some controversy surrounding the benchmark as to whether it is an accurate representation of what to expect from future games in regard to DirectX 11, we still decided to use this test to get an insight into the potential of future gaming.
Cooling modern video cards is becoming more and more difficult, especially when users are asking for quiet cooling solutions. That's why engineers are now paying much more attention to the power consumption of new video card designs. An optimized fan profile is also one of the few things that board vendors can do to impress with reference designs where they are prohibited to make changes to the thermal solution or components on the card.
For this test, we measured the power consumption of the graphics card only, via the PCI-Express power connector(s) and PCI-Express bus slot. A Keithley Integra 2700 digital multimeter with 6.5-digit resolution was used for all measurements. Again, the values here reflect only the power consumption of the card measured at DC VGA card inputs, not of the whole system.
We chose Crysis 2 as a standard test representing typical 3D gaming usage because it offers the following: very high power draw; high repeatability; it is a current game supported on all cards thanks to its DirectX 9 roots; drivers are actively tested and optimized for it; it supports all multi-GPU configurations; and the test runs in a relatively short time while rendering a non-static scene with variable complexity.
Our results were based on the following tests:
- Idle: Windows 7 Aero sitting at the desktop (1280x1024, 32-bit) with all windows closed and drivers installed. Card left to warm up in idle mode until power draw was stable.
- Multi-monitor: Two monitors connected to the tested card, both using different display timings. Windows 7 Aero sitting at the desktop (1280x1024 32-bit) with all windows closed and drivers installed. Card left to warm up in idle mode until power draw was stable.
- Average: Crysis 2 at 1920x1200, Extreme profile, representing a typical gaming power draw. Average of all readings (12 per second) while the benchmark was rendering (no title/loading screen).
- Peak: Crysis 2 at 1920x1200, Extreme profile, representing a typical gaming power draw. Highest single reading during the test.
- Maximum: Furmark Stability Test at 1280x1024, 0xAA. This results in a very high non-game power consumption that can typically be reached only with stress-testing applications. The card was left running the stress test until power draw converged to a stable value. On cards with power-limiting systems, we disabled the power limiter or configured it to the highest available setting, if possible. We also used the highest single reading from a Furmark run, obtained by sampling faster than the power limiter can react.
- Blu-ray Playback: PowerDVD 9 Ultra was used at a resolution of 1920x1200 to play back The Dark Knight Blu-ray disc with GPU acceleration turned on. Playback started at around timecode 1:19, which has the highest data rates on the BD with up to 40 Mb/s. Playback was left running until power draw converged to a stable value.
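The "Average" and "Peak" figures above reduce the multimeter's reading stream (12 samples per second) to two numbers. A hedged sketch of that reduction, with made-up readings and an illustrative function name:

```python
def summarize_power(readings_w):
    """Reduce a stream of DC wattage samples, taken while the benchmark
    renders, to (average, peak). Average mirrors the 'Average' result;
    peak is simply the highest single reading ('Peak')."""
    average = sum(readings_w) / len(readings_w)
    peak = max(readings_w)
    return round(average, 1), peak

# Illustrative samples; a real run at 12 samples/second collects thousands.
avg, peak = summarize_power([162.0, 171.5, 168.3, 189.9, 175.1])
print(avg, peak)  # 173.4 189.9
```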
Power consumption of the KFA² GTX 680 LTD OC is quite a bit higher than that of the NVIDIA reference design in all power states. While it wouldn't be surprising to see increased power draw for gaming (where higher clock speeds are active), the non-gaming states are increased, too. Both the GTX 680 reference design and the GTX 680 LTD OC run at 324/162 MHz and 0.99 V in these power states, at similar temperatures, so it is most probably the radically different voltage regulation circuitry that makes the difference.
In past years, users would accept almost anything just to get more performance. Nowadays, this has changed, and users have become more aware of the fan noise and power consumption of their graphics cards.
In order to properly test the fan noise that a card emits, we use the Bruel & Kjaer 2236 sound level meter (~$4,000) which has the measurement range and accuracy we are looking for.
The tested graphics card was installed in a system that was completely passively cooled. That is, passive PSU, passive CPU cooler, and passive cooling on the motherboard and on a solid state drive.
This setup allows us to eliminate secondary noise sources and test only the video card. To be more compliant with standards like DIN 45635 (we are not claiming to be fully DIN 45635 certified), the measurement was conducted at a distance of 100 cm and 160 cm above the floor. The ambient background noise level in the room was well below 20 dBA for all measurements. Please note that the dBA scale is logarithmic, not linear: 40 dBA is not twice as loud as 20 dBA. A 3 dBA increase corresponds to a doubling of sound power. Human hearing perceives this a bit differently; it is generally accepted that a 10 dBA increase doubles the perceived loudness. The 3D load noise levels were tested with a stressful game, not with Furmark.
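The dB arithmetic above can be sanity-checked numerically; a small sketch using the rule-of-thumb formulas (not a full psychoacoustic model):

```python
def power_ratio(delta_db: float) -> float:
    """Acoustic power ratio for a level difference in dB:
    every +3 dB roughly doubles the sound power."""
    return 10 ** (delta_db / 10)

def perceived_loudness_ratio(delta_db: float) -> float:
    """Rule-of-thumb perceived loudness ratio: doubles every +10 dB."""
    return 2 ** (delta_db / 10)

print(round(power_ratio(3.0), 2))        # ~2x the sound power
print(perceived_loudness_ratio(10.0))    # 2x as loud to the ear
```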
KFA²'s cooler works well in terms of noise. In both idle and load, the card is quieter than the GTX 680 reference design. While it is certainly not the quietest GTX 680, most custom-designed GTX 680s from other vendors emit similar noise levels.
The graphs on this page show a combined performance summary of all tests and resolutions from previous pages. Each graph shows the tested card as 100% and all other cards' performance is relative to it. A sixth graph summarizes all tests in all resolutions to calculate the total relative performance of the review sample.
Performance per Watt
Using the relative performance scores from the previous page and the typical gaming power consumption result, the following graphs show the efficiency of the cards in our test group.
Performance per Dollar
If you are looking for the best bang for the buck, then you will love this graph. We looked up the current USD price of each card on the popular online shop Newegg and used that value and the relative performance numbers to calculate the Performance per Dollar Index.
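Both efficiency indices boil down to dividing relative performance by power draw or by price. A hedged sketch of that calculation; all numbers except the $560 price from this review are made up for illustration:

```python
cards = {
    # name: (relative_performance_pct, price_usd, gaming_power_w)
    # The reviewed card is the 100% baseline; other values are illustrative.
    "Reviewed card": (100.0, 560.0, 170.0),
    "Reference card": (92.0, 500.0, 160.0),
}

def perf_per_dollar(perf_pct: float, price_usd: float) -> float:
    """Relative performance per dollar spent."""
    return perf_pct / price_usd

def perf_per_watt(perf_pct: float, watts: float) -> float:
    """Relative performance per watt of typical gaming power draw."""
    return perf_pct / watts

for name, (perf, price, watts) in cards.items():
    print(name,
          round(perf_per_dollar(perf, price), 3),
          round(perf_per_watt(perf, watts), 3))
```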
The overclocking results listed in this section were achieved with the default fan and voltage settings as defined in the VGA BIOS. Please note that every single sample overclocks differently; that is why our results can only serve as a guideline for what you can expect from your card.
The maximum stable clocks of our card are 1240 MHz core (3% overclock) and 1740 MHz memory (16% overclock).
GPU overclocking potential is outstanding, better than that of any other GTX 680 card we have tested so far. Memory overclocks well, too, but other cards easily reach higher clocks.
Important: Each GPU (including each GPU of the same make and model) will overclock slightly differently based on random production variances. This table just serves to provide a list of typical overclocks for similar cards, determined during TPU review.
|Maximum Overclock Comparison
||Max. GPU Clock
||Max. Memory Clock
|KFA² GTX 680 OC
|MSI GTX 680 Lightning
|ASUS GTX 680 DC II
|ZOTAC GTX 680 AMP
|Palit GTX 680 JetStream
|NVIDIA GTX 680
Using these clock frequencies, we ran a quick test of Battlefield 3 to evaluate the gains from overclocking.
Actual 3D performance gained from overclocking is 10.2%.
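The percentages in this section are simple ratios. A sketch of the arithmetic; the 1502 MHz memory baseline matches the reference memory clock, while the 1202 MHz GPU baseline and the FPS figures are assumptions for illustration only:

```python
def oc_percent(new_clock_mhz: float, base_clock_mhz: float) -> float:
    """Overclock headroom as a percentage over the baseline clock."""
    return (new_clock_mhz / base_clock_mhz - 1) * 100

def fps_gain_percent(fps_oc: float, fps_stock: float) -> float:
    """Real-world performance gain from overclocking, in percent."""
    return (fps_oc / fps_stock - 1) * 100

print(round(oc_percent(1740, 1502), 1))  # memory: ~16% headroom
print(round(oc_percent(1240, 1202), 1))  # GPU: ~3% over assumed baseline
print(round(fps_gain_percent(66.2, 60.1), 1))  # illustrative FPS pair
```

Note that the FPS gain (about 10%) lands well below the 16% memory overclock: frame rates scale with the slowest bottleneck, not with any single clock domain.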
Temperatures are good, in the typical range of other GTX 680 cards we have tested.
Important: GPU temperature will vary depending on clock speed, voltage settings, cooler design, and production variances. This table just serves to provide a list of typical temperatures for similar cards, determined during TPU review.
|GPU Temperature Comparison
|KFA² GTX 680 OC
|MSI GTX 680 Lightning
|ASUS GTX 680 DC II
|ZOTAC GTX 680 AMP
|Palit GTX 680 JetStream
|NVIDIA GTX 680
Modern graphics cards have several clock profiles that are selected to balance power draw and performance requirements.
The following table lists the clock settings for important performance scenarios and the GPU voltage that we measured. We performed the measurement on the pins of a coil or a capacitor near the GPU voltage regulator.
||GPU Clock: 1111 - 1293 MHz
||GPU Voltage: 1.012 - 1.175 V
The card uses NVIDIA's dynamic overclocking mechanism, which means it will dynamically adjust clock and voltage based on render load, temperature, and other factors.
For the graph below, we recorded all GPU clock and voltage combinations of our benchmarking suite at the 1920x1200 resolution. The plotted points have transparency, so overlapping points add up to indicate frequently used values: a light color means a clock/voltage combination is rarely used, while a dark color means it is active a lot.
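The light/dark density idea amounts to tallying how often each recorded (clock, voltage) pair occurs; frequent pairs would be drawn darker. A minimal sketch with illustrative samples:

```python
from collections import Counter

# Illustrative (clock MHz, voltage V) log entries; a real log would hold
# one sample per polling interval across the whole benchmark suite.
samples = [
    (1111, 1.012), (1293, 1.175), (1293, 1.175),
    (1202, 1.087), (1293, 1.175), (1111, 1.012),
]

occupancy = Counter(samples)  # pair -> number of times it was recorded
for (clock, volt), count in occupancy.most_common():
    print(f"{clock} MHz @ {volt:.3f} V: {count} samples")
```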
Unlike other GTX 680 cards, where we see a staircase pattern covering a wide range of clock and voltage combinations, the KFA² GTX 680 OC uses only a few clock states.
Value and Conclusion
- The KFA² GTX 680 LTD OC is only available in Europe at this time, at a price of €570, which we converted to $560 without taxes.
- Overclocked out of the box
- Excellent GPU overclocking potential
- Voltage measuring points
- Up to four active displays, which makes Surround gaming possible with a single card
- Native full-size DisplayPort output
- Support for PCI-Express 3.0 and DirectX 11.1
- Support for CUDA and PhysX
- High price
- No dual-link DVI - expensive adapter needed
- Memory not overclocked
- High power consumption in non-gaming states
- Dynamic OC can't be turned off
- Manual overclocking more complicated than before
- Voltage measuring points hard to reach
- No technology similar to AMD's ZeroCore power
||The KFA² GeForce GTX 680 LTD OC is one of the fastest GTX 680 cards available on the market. Thanks to its large out-of-the-box GPU overclock, the card is 8% faster than the GTX 680 reference design and 5% faster than the recently released AMD HD 7970 GHz Edition. Compared to the last-generation dual-GPU GTX 590, the card provides similar performance - what an improvement with just a single graphics processor! It would have been nice to see an overclock on the memory, too; the card can certainly take it, as shown by our manual overclocking tests.|
Manual overclocking yielded an excellent 1240 MHz maximum GPU clock, higher than that of any other GTX 680 card we have tested so far and almost 100 MHz above the GTX 680 reference design. Memory overclocking, on the other hand, did not do so well and reached a bit below what we have seen on other GTX 680 cards. Overall, the real-world performance improvement from manual overclocking is 10%, which is pretty good.
Power consumption of the card is increased in all power states. For gaming, the increase is relatively small and can be explained by the higher clock speeds. I have no explanation for the extra power draw in idle mode and multi-monitor use; my best guess is that the revamped voltage regulation circuitry plays a role in it. The overall increase is not big enough though to make it a dealbreaker.
KFA² has opted for a large triple-fan cooler which covers the whole front of the card and blocks the view of the pretty white PCB design; white plastic or frosted acrylic would have worked wonders here. On the back side of the card, however, the white PCB adds a unique, stylish touch to the GTX 680 OC. Cooling performance is decent; the card has no problems staying cool, and it is also significantly quieter than the NVIDIA reference design. Overall noise levels are similar to those of other custom-designed GTX 680 cards, none of which, in my opinion, are as quiet as they should be.
KFA² has changed the monitor output configuration of its card. Instead of two DVI, one HDMI and one DP port, the card comes with no DVI port, three mini-HDMI ports, and one full-size DisplayPort. So if you want to use a DVI monitor at any resolution higher than 1920x1080, like 2560x1600 or 2560x1440, for example, then you will have to buy an active dual-link DisplayPort-to-DVI adapter, which costs $100-$150. Also, if you plan to build a multi-monitor gaming rig, don't forget to buy some extra mini-HDMI adapters as only one is provided with the card - I had to wait 3 days for delivery before I could do our 5760x1080 testing. I'm not sure why KFA² went this route with the display outputs, but, in my opinion, every card on the market should have a native dual-link DVI output, period.
KFA²'s GTX 680 LTD OC is only available in Europe at this time, so we converted the €570 retail price to USD without taxes, which comes out to around $560. This puts the card in the price category where we usually see cards with 4 GB of memory, while the KFA² card has only 2 GB. Nevertheless, the card is still an excellent choice for the ultra-high-end gamer, if you are willing to pay a premium and can live without a dual-link DVI output.