ATI's new Radeon HD 2900 or "R600" is probably the most hyped (and delayed) product of 2006 and 2007. The card uses the brand-new R600 GPU, which is the first ATI product specifically designed for DirectX 10 and Windows Vista.
Here are all the specs at a glance, for further reference see our upcoming Radeon HD 2000 Series article.
- 700 million transistors on 80nm HS fabrication process
- 512-bit 8-channel GDDR3/4 memory interface
Ring Bus Memory Controller
- Fully distributed design with 1024-bit internal ring bus for memory reads and writes
- Optimized for high performance HDR (High Dynamic Range) rendering at high display resolutions
Unified Superscalar Shader Architecture
- 320 stream processing units
- Dynamic load balancing and resource allocation for vertex, geometry, and pixel shaders
- Common instruction set and texture unit access supported for all types of shaders
- Dedicated branch execution units and texture address processors
- 128-bit floating point precision for all operations
- Command processor for reduced CPU overhead
- Shader instruction and constant caches
- Up to 80 texture fetches per clock cycle
- Up to 128 textures per pixel
- Fully associative multi-level texture cache design
- DXTC and 3Dc+ texture compression
- High resolution texture support (up to 8192 x 8192)
- Fully associative texture Z/stencil cache designs
- Double-sided hierarchical Z/stencil buffer
- Early Z test, Re-Z, Z Range optimization, and Fast Z Clear
- Lossless Z & stencil compression (up to 128:1)
- Lossless color compression (up to 8:1)
- 8 render targets (MRTs) with anti-aliasing support
- Physics processing support
Full support for Microsoft DirectX 10.0
- Shader Model 4.0
- Geometry Shaders
- Stream Output
- Integer and Bitwise Operations
- Alpha to Coverage
- Constant Buffers
- State Objects
- Texture Arrays
Dynamic Geometry Acceleration
- High performance vertex cache
- Programmable tessellation unit
- Accelerated geometry shader path for geometry amplification
- Memory read/write cache for improved stream output performance
Anti-aliasing features
- Multi-sample anti-aliasing (up to 8 samples per pixel)
- Up to 24x Custom Filter Anti-Aliasing (CFAA) for improved quality
- Adaptive super-sampling and multi-sampling
- Temporal anti-aliasing
- Gamma correct anti-aliasing
- Super AA (CrossFire configurations only)
- All anti-aliasing features compatible with HDR rendering
Texture filtering features
- 2x/4x/8x/16x high quality adaptive anisotropic filtering modes (up to 128 taps per pixel)
- 128-bit floating point HDR texture filtering
- Bicubic filtering
- sRGB filtering (gamma/degamma)
- Percentage Closer Filtering (PCF)
- Depth & stencil texture (DST) format support
- Shared exponent HDR (RGBE 9:9:9:5) texture format support
CrossFire™ Multi-GPU Technology
- Scale up rendering performance and image quality with 2 or more GPUs
- Integrated compositing engine
- High performance dual channel interconnect
ATI Avivo™ HD Video and Display Platform
- Two independent display controllers
- Drive two displays simultaneously with independent resolutions, refresh rates, color controls and video overlays for each display
- Full 30-bit display processing
- Programmable piecewise linear gamma correction, color correction, and color space conversion
- Spatial/temporal dithering provides 30-bit color quality on 24-bit and 18-bit displays
- High quality pre- and post-scaling engines, with underscan support for all display outputs
- Content-adaptive de-flicker filtering for interlaced displays
- Fast, glitch-free mode switching
- Hardware cursor
- Two integrated dual-link DVI display outputs
- Each supports 18-, 24-, and 30-bit digital displays at all resolutions up to 1920x1200 (single-link DVI) or 2560x1600 (dual-link DVI)
- Each includes a dual-link HDCP encoder with on-chip key storage for high resolution playback of protected content
- Two integrated 400 MHz 30-bit RAMDACs
- Each supports analog displays connected by VGA at all resolutions up to 2048x1536
- HDMI output support
- Supports all display resolutions up to 1920x1080
- Integrated HD audio controller with multi-channel (5.1) AC3 support, enabling a plug-and-play cable-less audio solution
- Integrated Xilleon™ HDTV encoder
- Provides high quality analog TV output (component/S-Video/composite)
- Supports SDTV and HDTV resolutions
- Underscan and overscan compensation
- HD decode acceleration for H.264/AVC, VC-1, DivX and MPEG-2 video formats
- Flawless DVD, HD DVD, and Blu-ray playback
- Motion compensation and IDCT (Inverse Discrete Cosine Transformation)
- HD video processing
- Advanced vector adaptive per-pixel de-interlacing
- De-blocking and noise reduction filtering
- Edge enhancement
- Inverse telecine (2:2 and 3:2 pull-down correction)
- Bad edit correction
- High fidelity gamma correction, color correction, color space conversion, and scaling
- MPEG-2, MPEG-4, DivX, WMV9, VC-1, and H.264/AVC encoding and transcoding
- Seamless integration of pixel shaders with video in real time
- VGA mode support on all display outputs
- PCI Express x16 bus interface
- OpenGL 2.0 support
When we received our sample, a retail package was not available yet.
Our shipment contained only the Radeon HD 2900 XT card and a DVI to HDMI (+audio) adapter.
The HDMI adapter is an extremely convenient solution for connecting your HDTV to the Radeon HD 2900 XT; unlike most previous solutions, the HDMI cable now carries both audio and video. The audio is played back directly from the Radeon HD video card in a fully digital process, similar to S/PDIF.
If we look at the size of the Radeon HD 2900 XT, we see that it sits nicely between the GeForce 8800 GTX (left) and the Radeon X1900 XTX (right). First leaks showed a behemoth PCB which was obviously too big for the retail market; that big board may appear in OEM designs though.
The card's theme is still completely red: even though AMD's corporate color is green, ATI remains red. On top of the cooler you can see a simple but pretty silvery flame paint job. The red transparent cooler shroud is reminiscent of the Radeon X1950 Series, though the cooler underneath has of course been changed.
On the back we see a big black metal heatsink which cools the memory chips on the back of the card. Yes, you read that right: the card goes back to the "memory chips on both sides" approach. This is because the memory bus has been widened to 512 bit, which requires 16 memory chips. Fitting all of them on one side isn't possible, so each side carries eight.
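To put those numbers together, here is a quick back-of-the-envelope sketch in Python. The 32-bit interface per chip is an assumption based on standard GDDR3 parts, and the 819 MHz figure is the 3D memory clock quoted later in this review:

```python
# Back-of-the-envelope check of the memory layout, assuming the standard
# 32-bit data interface per GDDR3 chip (an assumption, not from the review).

BUS_WIDTH_BITS = 512   # R600 external memory bus
BITS_PER_CHIP = 32     # data width of one GDDR3 chip (assumed x32 parts)
MEM_CLOCK_MHZ = 819    # 3D memory clock quoted later in this review

chips = BUS_WIDTH_BITS // BITS_PER_CHIP
print(f"chips needed: {chips}")  # 16 -> 8 per PCB side

# GDDR3 transfers data on both clock edges (DDR), hence the factor of 2
bandwidth_gb_s = BUS_WIDTH_BITS / 8 * MEM_CLOCK_MHZ * 2 / 1000
print(f"theoretical bandwidth: {bandwidth_gb_s:.1f} GB/s")  # ~104.8 GB/s
```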
As you would expect, the card has two DVI outputs, both dual-link capable, so high-res displays up to 2560x1600 can be used. You can use both DVI outputs with HDMI adapters at the same time while still retaining HDCP encryption on both outputs. You can also use the good old DVI to VGA adapters in case you still have a CRT or TFT with analog input.
The internal CrossFire connector is the same as on the Radeon X1950 Pro Series. All CrossFire handling is now done inside the GPU, so an external CrossFire chip is no longer required, and all boards can run as CrossFire master or slave. Every retail board should include one CrossFire cable, so if you purchase two cards you will have the two bridges you need.
A Closer Look
In case you want to take a closer look at a small detail on the card, we took some ultra high-res photos for you. Each image has a resolution of 3264x2448 and weighs in at about 3 MB.
Several parts make up the cooler. The baseplate that makes contact with the core is made from copper, as are the heatpipes and the heatsink fins. The fan is a radial design, which is better for airflow than the good old axial design; unfortunately the card is a bit noisy when running at high load. Thermal tape on both the front and the back plate ensures that the memory is kept at an acceptable temperature. As mentioned before, the black metal piece sits on the back of the card and cools the eight memory chips there.
The mounting hole distance is exactly the same as on the X1950: the diagonal distance from hole to hole (like an X) is 3" or 7.62 cm.
The board layout is quite complex; all available space is used. You can clearly see how the memory chips are centered around the GPU.
Two fan connectors are placed on the board, which allows Add-In-Board partners to make special designs that use two bigger slower fans.
ATI's Rage Theater chip has also received an overhaul and is now smaller, yet offers better image quality and more video processing features.
The Radeon HD 2900 XT is the first video card to use an 8-pin power connector in addition to the established 6-pin one. This allows for better power delivery, which will be handy when overclocking. To use the card you need to connect at least two 6-pin power connectors; a 6-pin plug will fit into the 8-pin socket, leaving two pins open. However, if you plan on doing serious overclocking you may be better off using a power supply that has the new 8-pin connector. PSU manufacturers are already updating their products, with companies like Corsair and be quiet! offering free upgrades for recently purchased power supplies.
The voltage regulator area is very crowded. It uses Volterra's VT1165MF voltage controller, which allows software changes of the board voltage up to 2.0 V (the default is 1.0 V). Actually there are two of them: a second VT1165MF is used to generate the memory voltage MVDDC.
The board uses sixteen GDDR3 memory chips from Hynix (HY5RS573225A FP-1) which are rated at 1.0 ns, so they should be good for 1000 MHz.
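The math behind that rating is simply f = 1/t; a one-liner to illustrate:

```python
# A 1.0 ns rated chip can complete one cycle per nanosecond, i.e. f = 1 / t.
rating_ns = 1.0
max_clock_mhz = 1 / (rating_ns * 1e-9) / 1e6
print(max_clock_mhz)  # 1000.0 -> the 1000 MHz quoted above
```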
The GPU carries no markings; its color is actually silverish, but with the reflection of the lens it looks black. The 45° rotated die approach will make it hard to use existing coolers; especially optimized watercoolers may not fit. It also seems that on most cards the shim around the core sits higher than the core in the middle - R350 anyone? This will add extra cost to aftermarket coolers and waterblocks as well, because a more complex design with a raised baseplate is required. Of course you can file down the shim to be lower than the core, so your old coolers may fit.
Test System:
- CPU: AMD Athlon64 FX-60 @ 2900 MHz (Toledo, 2x 1024 KB Cache)
- Motherboard: ATI Radeon XPRESS 3200 chipset
- Memory: 2x 1024 MB G.Skill F1-4000BIU2-2GBHV CL3
- Harddisk: WD Raptor 360GD 36 GB
- Power Supply: OCZ GameXStream 700W
- Software: Windows XP SP2
- Drivers: NVIDIA 91.47 (8500/8600/8800: 158.22), ATI Catalyst 7.1 (HD 2900 XT: 8.37)
- All video card results were obtained on this exact system with the exact same configuration.
- All games were set to their highest quality settings
The following resolutions were tested per benchmark:
- 1024 x 768, No Anti-aliasing, No anisotropic filtering. This is a standard resolution without demanding display settings.
- 1280 x 1024, 2x Anti-aliasing, 8x anisotropic filtering. Common resolution for most gamer flatscreens today. A bit of eye candy turned on in the drivers.
- 1600 x 1200, 4x Anti-aliasing, 16x anisotropic filtering. Highest non-widescreen resolution available to a wide range of users. Very good looking driver graphics settings.
- 2048 x 1536, 4x Anti-aliasing, 16x anisotropic filtering. Highest non-widescreen resolution available to any consumer video card. Very good looking driver graphics settings.
We tested the Radeon HD 2900 XT with ATI's 8.37 driver, which is the official benchmarking driver for all R600 reviews. We tried 8.38 and 8.39 too, but they offer only a +1% to -1% performance difference, which is within the margin of error of the benchmarks. The only gains show up at extreme HD resolutions with the highest AA and AF settings.
We did not test any DX10 titles because there are none available that can be used for competitive benchmarking. The Call Of Juarez DX10 demo works on ATI, but has issues on NVIDIA hardware. Other than that there is not a single shipping DX10 title, so coming to a DX10 performance conclusion is not possible.
Far Cry was released in early 2004 by the new development studio Crytek. It quickly became a massive success because it was one of the first titles to take you into a beautiful 3D outdoor world. Far Cry was one of the most demanding games of its time. Even with today's video cards you can still see big differences in frame rates, especially at the higher resolutions.
The first person shooter F.E.A.R., developed by Monolith Game Studios, was released in Fall 2005 and has a great 3D engine that uses a large number of shading and shadow effects to accurately model the game world. In addition it features a realistic physics engine that lets you interact with many objects in the game world. The game was voted game of the year by several publications.
Prey is based on a highly modified 3D engine made by id Software. This first person shooter brought a completely new way of gaming to the genre: in many levels you find yourself walking upside down or on the walls, which adds a whole new aspect to the gaming experience.
The Quake titles are among the most successful first person games. Developed by id Software, the famous game studio that brought you DOOM, you find yourself in a sci-fi world that is full of aliens and shocking effects. The main focus of the game is the single player story line. Quake 4 puts you on the home planet of the Strogg. In a number of missions you and your fellow marines will encounter all sorts of enemies, including some really huge aliens.
X3: Reunion is a space combat/trading simulation game with beautiful graphics. The game world is gigantic and there is always something new to see. Even though the user interface is not that great, the title has found many fans who love to explore the rich content. When you are flying in your spaceship you are sometimes tempted to just stop the action and take a look at the highly detailed ships and planets.
Futuremark is the number one player in the world of synthetic benchmarking. The 3DMark series is the most popular test suite for video card testing and is used by gamers, overclockers and manufacturers alike to determine how fast their hardware is. Even though it is a few years old, 3DMark03 can easily stress today's video cards.
Another benchmark from Futuremark is 3DMark05, which comes with four completely new game tests that make massive use of shaders and lighting effects. 3DMark05 is a great test for modern video card architectures - in some tests you are often close to the 30 fps mark, below which your games will feel sluggish.
Cooling modern video cards is becoming more and more difficult, especially when users are asking for quiet cooling solutions. That's why engineers are now paying much more attention to the power consumption of new video card designs.
In order to characterize a video card's power consumption, the whole system's mains power draw was measured. This means that these numbers include CPU, memory, HDD, video card and PSU inefficiency.
The three result values are as follows (a short sketch after this list shows how Average and Peak are derived):
- Idle: Windows sitting at the desktop (1024x768 32-bit), all windows closed, drivers installed.
- Average: 3DMark03 Nature at 1280x1024, 6xAA, 16xAF. This results in the highest power consumption. Average of all readings (two per second) while the test was rendering (no title screen).
- Peak: 3DMark03 Nature at 1280x1024, 6xAA, 16xAF. This results in the highest power consumption. Highest single reading.
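Here is a minimal sketch of how the Average and Peak values can be derived; the readings list is hypothetical example data standing in for the whole-system watt values captured twice per second:

```python
# Minimal sketch: derive the Average and Peak results from meter readings.
# Assumption: readings is a list of whole-system watt values captured twice
# per second while 3DMark03 Nature was rendering (no title screen).

def summarize(readings):
    average = sum(readings) / len(readings)  # "Average" result
    peak = max(readings)                     # "Peak" result: highest single reading
    return average, peak

# hypothetical example data, not actual measurements from this review
avg, peak = summarize([285.0, 291.5, 288.0, 302.5, 296.0])
print(f"Average: {avg:.1f} W, Peak: {peak:.1f} W")
```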
As we can see the card needs quite some power, especially in idle, where it consumes more power than an X1900 XTX CrossFire setup with two cards! Under load the power consumption is much better, which can be attributed to ATI's new power saving technologies inside the GPU. However, compared against the GeForce 8800 GTX, ATI still has some way to go to reach performance/watt leadership.
The Radeon HD 2900 XT still uses the dreaded 2D/3D clocks to save power when the GPU is not needed for 3D operations. In 2D mode the core runs at 509 MHz and the memory at 511 MHz, with the GPU voltage at 1.00 V. Once you run a full-screen application, the driver raises the clocks to 739 MHz core and 819 MHz memory, and the GPU voltage is increased to 1.15 V.
Since ATITool does not work for changing the clocks, we had to use several different applications (which unfortunately we can't show you) to do some overclocking. Overclocking should be possible using the CCC Overdrive module, but the maximum clocks will be limited. ATITool's Scan for Artifacts works, but you can't change clocks; we have an early version now that supports monitoring and voltage changes.
The overclocks are quite OK for a brand new product. In the end the card ran totally stable at 850 MHz core (+14.5%) and 970 MHz memory (+17.5%).
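For reference, a quick helper for computing such overclocking gains. Note that plugging in the 739/819 MHz 3D clocks quoted earlier yields slightly different percentages than the rounded figures above, so ATI's exact reference clocks may differ by a few MHz:

```python
# Overclocking gain relative to a stock clock, as a percentage.
def oc_gain(stock_mhz, oc_mhz):
    return (oc_mhz / stock_mhz - 1) * 100

# Using the 3D clocks quoted earlier in this review as the baseline:
print(f"core:   +{oc_gain(739, 850):.1f}%")   # ~15.0%
print(f"memory: +{oc_gain(819, 970):.1f}%")   # ~18.4%
```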
As I am told, the BIOS checks for the presence of an 8-pin power connector on bootup. If it is not connected, Catalyst Control Center will not allow any clock changes. This is done to ensure that enough power is delivered to the GPU. Personally, I think this will drive many users away from CCC to other tools that can change clocks without such a limitation. Actually, I wasn't able to verify this CCC behavior myself because I don't have any 8-pin PSUs yet.
The stock cooler tends to be a bit noisy when running 3D applications at full load, especially at higher clocks. Even with the cooler running fairly fast we saw temperatures of 58°C in idle, 72°C under load and 80°C under load while overclocking. With ATITool you can raise the fan speed, but at 100% the fan sounds like a leaf blower.
Software can change the voltages, and all the extreme overclockers will be happy to hear that you can go up to 2.035 V on VDDC (1.00 V default), up to 3.400 V on MVDDC (2.20 V default) and up to 2.482 V on MVDDQ (1.897 V default).
While playing with voltage changes and trying to find the maximum overclocks, I got to a point where even the fan running at 100% could barely keep the card cool; the system was pulling a measured 470 W from the AC line. This was at 1.36 V VDDC (GPU core voltage); in my opinion anything above 1.30 V is way too much for the stock cooler to handle. Even at such an insane voltage I could "only" get 911 MHz on the GPU. There definitely seems to be a use for more extreme cooling solutions. Our friend Kinc has his card running at above 1100 MHz using extreme cooling and crazy GPU voltage (set with ATITool of course).
Value and Conclusion
- ATI's Radeon HD 2900 XT is priced at a mere $399, which is an extremely competitive price for a top-of-the-line performance product.
Pros:
- Breathtaking $399 price point
- Support for DirectX 10, Shader Model 4.0
- HDMI + Audio output included
- New video acceleration features
- New AA Modes
Cons:
- Not the fastest GPU in the world, as speculated by many
- Fan can be noisy
- High power consumption
I have to admit I was a bit disappointed after seeing the benchmark results of the Radeon HD 2900 XT. Even though the card is fast, it cannot beat the NVIDIA GeForce 8800 GTX; it should have no problems beating the GeForce 8800 GTS 320 MB though. With a product like this, AMD did the only right thing: make it affordable. At a price point of $399 this card takes the price/performance crown without breaking a sweat.
Even though ATI has employed a number of power-saving techniques and heat-reducing features, the heat output is still fairly high, especially when you are overclocking at a higher voltage. This also means that the fan has to run faster and louder than on competing products. On the other hand, watercooling will provide seriously improved overclocking potential; companies like Danger Den and Thermaltake already have the first coolers ready.
New features like the integrated tessellator and CFAA offer new ways to improve image quality without hurting performance too much. Full HDCP support with audio and H.264 decode acceleration make this card an excellent choice for media PC systems where image quality matters. I really hope that AMD can improve their drivers to offer increased performance, because at this price point the card has the potential to become a card that everybody can afford.