Together with the Radeon X1950 XTX, ATI introduced additional new cards at the same time: the X1300 XT and the X1650 XT. Today we will take a look at the X1300 XT.
This refresh introduces a new GPU into the X1300 product lineup. Previously, only the RV515 GPU was used; the X1300 XT instead uses the RV530, which until now was limited to the X1600 series. This means that the performance of the X1300 XT should be a lot closer to X1600 levels. This performance upgrade makes sense because NVIDIA has several products competing directly with the X1300, and the RV515 is slower than many of these cards.
<table border="1" class="resulttable" cellspacing="0" cellpadding="3">
<tr>
<th> </th>
<td>X1300 </td>
<td>X1300 Pro </td>
<td>X1300 XT </td>
<td><strong>X1300 XT OC </strong></td>
<td>X1600 Pro </td>
</tr>
<tr>
<th>Core Clock </th>
<td align="right">450 MHz</td>
<td align="right">600 MHz </td>
<td align="right">500 MHz </td>
<td align="right"><strong>575 MHz </strong></td>
<td align="right">500 MHz </td>
</tr>
<tr>
<th>Memory Clock </th>
<td align="right">250 MHz</td>
<td align="right">400 MHz </td>
<td align="right">400 MHz </td>
<td align="right"><strong>700 MHz </strong></td>
<td align="right">400 MHz </td>
</tr>
<tr>
<th>Memory Type </th>
<td></td>
<td></td>
<td></td>
<td></td>
<td></td>
</tr>
</table>
The card tested here is Sapphire's Overclocked Edition X1300 XT. This means that the clock speeds have been increased on both the core and the memory to bring them closer to the X1600. Also the memory type has been bumped to GDDR3.
- 105 million transistors on 90nm fabrication process
- Four pixel shader processors
- Two vertex shader processors
- 128-bit 4-channel DDR/DDR2/GDDR3 memory interface
- 32-bit/1-channel, 64-bit/2-channel, and 128-bit/4-channel configurations
- Native PCI Express x16 bus interface
- AGP 8x configurations also supported with external bridge chip
- Dynamic Voltage Control
High Performance Memory Controller
- Fully associative texture, color, and Z/stencil cache designs
- Hierarchical Z-buffer with Early Z test
- Lossless Z Compression (up to 48:1)
- Fast Z-Buffer Clear
- Z/stencil cache optimized for real-time shadow rendering
Ultra-Threaded Shader Engine
- Support for Microsoft® DirectX® 9.0 Shader Model 3.0 programmable vertex and pixel shaders in hardware
- Full speed 128-bit floating point processing for all shader operations
- Up to 128 simultaneous pixel threads
- Dedicated branch execution units for high performance dynamic branching and flow control
- Dedicated texture address units for improved efficiency
- 3Dc+ texture compression
- High quality 4:1 compression for normal maps and two-channel data formats
- High quality 2:1 compression for luminance maps and single-channel data formats
- Multiple Render Target (MRT) support
- Render to vertex buffer support
- Complete feature set also supported in OpenGL® 2.0
Advanced Image Quality Features
- 64-bit floating point HDR rendering supported throughout the pipeline
- Includes support for blending and multi-sample anti-aliasing
- 32-bit integer HDR (10:10:10:2) format supported throughout the pipeline
- Includes support for blending and multi-sample anti-aliasing
- 2x/4x/6x Anti-Aliasing modes
- Multi-sample algorithm with gamma correction, programmable sparse sample patterns, and centroid sampling
- New Adaptive Anti-Aliasing feature with Performance and Quality modes
- Temporal Anti-Aliasing mode
- Lossless Color Compression (up to 6:1) at all resolutions, including widescreen HDTV resolutions
- 2x/4x/8x/16x Anisotropic Filtering modes
- Up to 128-tap texture filtering
- Adaptive algorithm with Performance and Quality options
- High resolution texture support (up to 4k x 4k)
Avivo™ Video and Display Platform
- High performance programmable video processor
- Accelerated MPEG-2, MPEG-4, DivX, WMV9, VC-1, and H.264 decoding and transcoding
- DXVA support
- De-blocking and noise reduction filtering
- Motion compensation, IDCT, DCT and color space conversion
- Vector adaptive per-pixel de-interlacing
- 3:2 pulldown (frame rate conversion)
- Seamless integration of pixel shaders with video in real time
- HDR tone mapping acceleration
- Maps any input format to 10 bit per channel output
- Flexible display support
- Dual integrated DVI transmitters (one dual-link + one single-link)
- DVI 1.0 compliant / HDMI interoperable and HDCP ready*
- Dual integrated 10 bit per channel 400 MHz DACs
- 16 bit per channel floating point HDR and 10 bit per channel DVI output
- Programmable piecewise linear gamma correction, color correction, and color space conversion (10 bits per color)
- Complete, independent color controls and video overlays for each display
- High quality pre- and post-scaling engines, with underscan support for all outputs
- Content-adaptive de-flicker filtering for interlaced displays
- Xilleon™ TV encoder for high quality analog output
- YPrPb component output for direct drive of HDTV displays**
- Spatial/temporal dithering enables 10-bit color quality on 8-bit and 6-bit displays
- Fast, glitch-free mode switching
- VGA mode support on all outputs
- Drive two displays simultaneously with independent resolutions and refresh rates
- Compatible with ATI TV/Video encoder products, including Theater 550
- Multi-GPU technology
- Inter-GPU communication over PCI Express (no interlink hardware required)
- Four modes of operation:
- Alternate Frame Rendering (maximum performance)
- Supertiling (optimal load-balancing)
- Scissor (compatibility)
- Super AA 8x/10x/12x/14x (maximum image quality)
On the front of the box you see one of Sapphire's lovely aliens trying to get you interested in the product. The most important features are noted on the front, so that potential shoppers can find out what they want to know.
On the backside are more specs and an alien in a rather interesting pose I must say.
Even though the box is big, there is a lot of unused space - small boxes do not sell as well as big ones.
Inside the box you will find:
- X1300 XT Video Card
- Instruction Manual
- Power DVD 6
- Component TV adapter
- S-Video cable
- DVI Adapter
We tested the Lite Retail version, which is cheaper than full retail and does not include any games that nobody plays anyway.
The card comes with Sapphire's blue PCB. Some users may like it, some may not. At least it shows that Sapphire tries to offer something innovative over all the other board partners who use the red ATI PCBs. I wonder, will the new AMD-ATI PCBs be green now?
The backside is pretty clean - all memory chips have been placed on the front side of cards for a while now. While faster cards need an extra power connector to get the juice they need, this card gets all its power via the PCI-Express bus.
This card has one analog and one DVI connector. This makes sense because most budget card users either do not have a TFT or have a TFT without DVI input. If you need to connect two analog displays you can use the included DVI adapter.
Sapphire's video card cooler is an all-aluminum construction, which is cheaper and lighter. Even though the fan design is remotely similar to that of the high-end video cards, it is definitely less powerful. For example, hot air does not get exhausted out of the case like on the X1900 series, but stays inside it. The heat output of the card is probably so small that there is no point in adding the extra engineering and cost required.
The fan sucks in air from above and exhausts the warm air at the backside of the card. Sapphire did not put a temperature sensor on the card, so temperature-based fan control is not possible. This means that the fan will always run at the same speed, whether idle or under load.
A Closer Look
Sapphire's cooler does not cool the memory. There is no contact between the memory chips and the cooler base.
After removing the cooler I saw this mess of thermal paste on the GPU. I find it surprising that the card even worked at all, considering several capacitors were fully covered in thermal paste.
This is the X1300 XT and the X1650 Pro side by side. As you can see it's the same PCB. Both cards share the same GPU and memory type so the performance differences should be almost non-existent.
The PCB number is exactly the same as on the X1650 Pro: 109-A67131-00A.
Sapphire uses 1.3 ns GDDR3 memory from Infineon with the model number HYB18H512321AF.
After a lot of cleaning up, this is the GPU core. Unfortunately it does not have a product name marking, but the GPU is either the RV530 or the RV535, which is an 80 nm die shrink of the RV530.
<table border="1" cellpadding="3" cellspacing="0" class="ramtable" width="450">
<tr>
<th colspan="2" scope="row" style="font-size:larger;text-align:center">Test System</th>
</tr>
<tr>
<th width="150" scope="row">CPU:</th>
<td scope="row">AMD Athlon64 3000+ @ 2225 MHz<br />(Venice, 512 KB Cache)</td>
</tr>
<tr>
<th valign="top" scope="row">Motherboard:</th>
<td scope="row">ABIT AT8, BIOS 1.1<br />ATI Radeon XPRESS 200</td>
</tr>
<tr>
<th valign="top" scope="row">Memory:</th>
<td scope="row">2x 1024 MB G.Skill F1-4000BIU2-2GBHV CL3</td>
</tr>
<tr>
<th valign="top" scope="row">Harddisk:</th>
<td valign="top" scope="row">WD Raptor WD360GD 36 GB</td>
</tr>
<tr>
<th valign="top" scope="row">Power Supply:</th>
<td valign="top" scope="row">OCZ GameXStream 700W</td>
</tr>
<tr>
<th valign="top" scope="row">Software:</th>
<td valign="top" scope="row">Windows XP SP2</td>
</tr>
<tr>
<th valign="top" scope="row">Drivers:</th>
<td valign="top" scope="row">NVIDIA: 91.47<br />ATI: Catalyst 6.9</td>
</tr>
</table>
- All video card results were obtained on this exact system with the exact same configuration.
- All games were set to their highest quality setting
Three resolutions were tested per benchmark:
- 1024 x 768, No Anti-aliasing, No anisotropic filtering. This is a standard resolution without demanding display settings.
- 1280 x 1024, 2x Anti-aliasing, 8x anisotropic filtering. Common resolution for most gamer flatscreens today. A bit of eye candy turned on in the drivers.
- 1600 x 1200, 4x Anti-aliasing, 16x anisotropic filter. Highest non-widescreen resolution available to a wide range of users. Very good looking driver graphics settings.
Far Cry was released in early 2004 by the new development studio Crytek. It quickly became a massive success because it was one of the first titles to put you in a beautiful 3D outdoor world. Far Cry was one of the most demanding games of its time. Even with today's video cards you can still see big differences in frame rates, especially at the higher resolutions.
The first person shooter F.E.A.R., developed by Monolith Game Studios, was released in Fall 2005 and has a great 3D engine that uses a large number of shading and shadow effects to accurately model the game world. In addition it features a realistic physics engine that lets you interact with many objects in the game world. The game was voted game of the year by several publications.
This game is based on a highly modified 3D engine from id Software. This first person shooter brought a completely new way of gaming to the genre: in many levels you find yourself walking upside down or on the walls, which adds a whole new aspect to the gaming experience.
The Quake titles are among the most successful first person games. Developed by id Software, the famous game studio that brought you DOOM, you find yourself in a sci-fi world full of aliens and shocking effects. The main focus of the game is the single player story line. Quake 4 puts you on the home planet of the Strogg. In a number of missions you and your fellow marines will encounter all sorts of enemies, including some really huge aliens.
This title is a space combat/trading simulation game with beautiful graphics. The game world is gigantic and there is always something new to see. Even though the user interface is not that great, the title has found many fans that love to explore the rich content. When you are flying in your spaceship you are sometimes tempted to just stop the action to take a look at the highly detailed ships and planets.
Futuremark is the number one player in the world of synthetic benchmarking. The 3DMark series is the most popular test suite for video card testing and is used by gamers, overclockers and manufacturers alike to determine how fast their hardware is. Even though it is a few years old, 3DMark03 can still easily stress today's video cards.
Another benchmark from Futuremark is 3DMark05, which comes with four completely new game tests that make massive use of shaders and lighting effects. 3DMark05 is a great test for modern video card architectures - in some tests you are often close to the 30 fps mark, below which your games will feel sluggish.
Cooling modern video cards is becoming more and more difficult, especially when users are asking for quiet cooling solutions. That's why the engineers are now paying much more attention to power consumption of new video card designs.
To measure power consumption the whole system's mains power draw was measured. This means that these numbers include CPU, Memory, HDD, Video card and PSU inefficiency.
The load value was obtained by running 3DMark03 Nature at 1280x1024, 6xAA, 16xAF. This results in the highest power consumption. While the test was running, power consumption was recorded. The highest reading is listed in the following graph.
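Since these are whole-system wall readings, the card's own consumption can only be estimated. A minimal sketch of how such an estimate could be derived is shown below; the wattage figures and the assumed PSU efficiency are placeholders for illustration, not measured values from this review:

```python
# Rough card-only power estimate from two whole-system wall (AC) readings.
# All numbers here are hypothetical placeholders, not review measurements.
PSU_EFFICIENCY = 0.80  # assumed PSU efficiency at this load level

def card_power_estimate(system_with_load_w, system_baseline_w,
                        psu_efficiency=PSU_EFFICIENCY):
    """Estimate DC power drawn by the card from the AC delta at the wall."""
    delta_ac = system_with_load_w - system_baseline_w
    # The PSU wastes some of the AC input as heat, so the DC power
    # actually delivered to the card is smaller than the AC delta.
    return delta_ac * psu_efficiency

# Hypothetical readings: 148 W under 3D load, 112 W with the GPU idle
print(f"Estimated card power: {card_power_estimate(148.0, 112.0):.1f} W")
```

Note that this only isolates the card's *additional* draw over idle; CPU load rises during 3D tests too, so the real card-only figure would need dedicated riser instrumentation.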
Given the small power consumption you do not need to worry about having a powerful PSU when using this card. Even the cheapest low quality unit should be able to deliver the required power. However, compared to NVIDIA's offerings, ATI's cards still draw more power. For example, the 7600 GT is considerably faster than the X1300 XT, yet it needs even less power.
This card does not implement separate 2D/3D clocks or voltages - it always runs at 575 MHz core and 700 MHz memory. You do not need to install any Sapphire software to run at the increased clocks.
Even though ATI's spec list states "Dynamic Voltage Control", there is no way to change the voltages via software. This means that if you need more juice you have to do a soldering voltmod.
We used ATITool to automatically find the maximum core and memory clocks (yes it works!) of our card.
In the end the card runs completely stable at 621 MHz core (8 % overclock) and 806 MHz memory (15 % overclock). Compared to ATI's default specifications of 500 MHz core and 400 MHz memory, this is an overclock of 24 % / 101 %.
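As a quick sanity check, the two sets of percentages can be reproduced from the clocks quoted in this review (575/700 MHz Sapphire factory clocks, 500/400 MHz ATI reference clocks):

```python
def overclock_percent(achieved_mhz, baseline_mhz):
    """Overclock expressed as percent gain over a baseline clock."""
    return (achieved_mhz / baseline_mhz - 1.0) * 100.0

# Maximum stable clocks found with ATITool
CORE_MHZ, MEM_MHZ = 621, 806

# Relative to Sapphire's factory clocks (575 MHz core, 700 MHz memory)
print(f"vs. factory:   core +{overclock_percent(CORE_MHZ, 575):.0f} %, "
      f"mem +{overclock_percent(MEM_MHZ, 700):.0f} %")

# Relative to ATI's reference clocks (500 MHz core, 400 MHz memory)
print(f"vs. reference: core +{overclock_percent(CORE_MHZ, 500):.0f} %, "
      f"mem +{overclock_percent(MEM_MHZ, 400):.0f} %")
```

The memory figure roughly doubles against the reference baseline because Sapphire's GDDR3 already ships 300 MHz above ATI's 400 MHz spec.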
For an already overclocked card this result is pretty amazing I must say. When I think of "pre-overclocked" I think of "card running at its maximum". Sapphire did a great job here - having some clock headroom means that the card will stay stable even in hot cases or hot environments.
Value and Conclusion
<table width="100%" cellpadding="5" cellspacing="0" id="result">
- Expect the price of this card to be a bit above $100. For this price it's a great deal I must say.
- Good performance
- Good price
- Stock overclocked
- GDDR3 Memory
- Crossfire capable
- No temperature monitoring
- No automatic/manual fan control
- Memory is not cooled
<td>When you think "X1300 XT" you would expect just a small performance increase compared to the X1300 Pro since the marketing names are so close. What ATI did however, is give you X1600 performance for X1300 price. The performance difference to the X1300 is pretty amazing, and the performance difference to the X1600 is so small that there is not really a reason to spend more money for the X1600.<br />
Compared to NVIDIA's slightly less expensive 7300 GT, the card is definitely faster; the difference is big enough to be noticed not only in benchmark numbers but also in actual gameplay.<br />
Sapphire made this product even more attractive by increasing the clocks a good amount, bringing the card even closer to X1600 performance. The card is Crossfire capable without any connector cable or dongle which may be a good upgrade path. Even with the factory overclock there is still some potential left for your own overclocking adventures. Overall the Sapphire X1300 XT Overclock Edition is a very solid product that I would buy any day when looking for a budget video card. Having Shader Model 3.0 and AVIVO support does make a difference compared to the X800 series, especially if you are looking into running Windows Vista [maybe not so] soon.<br />
The only weak point I see is that the card's cooler does not perform so well. I really miss a temperature controlled fan here. This can be easily overcome by getting a cheap aftermarket cooler. Remember, the heat output is nowhere near that of the high-end cards, so almost any cooler should work great.