NVIDIA GeForce GTX 560 Ti 1 GB Review


GTX 560 Ti Review Introduction



Here it is: the NVIDIA GeForce GTX 560 Ti launch-day review. Built on NVIDIA's proven second-generation Fermi architecture, this GPU targets a key price point that has been NVIDIA's and ATI's hunting ground since 2006. The sub-$300 segment is where customers have learned to expect high-end-like performance and features at compelling prices; the price-performance ratio is king here. The performance-price sweet spot is a virtual G-spot for GPU manufacturers: whoever hits it right gets loads of... sales. Veterans of this segment include the GeForce 7900 GT, Radeon X1950 Pro, GeForce 8800 GT, Radeon HD 4850, GeForce GTX 260-216, GeForce GTX 460, and Radeon HD 6870, and in comes the latest contender, the GeForce GTX 560 Ti. The model name invokes some nostalgia, as SKUs carrying the "Ti" marker were among NVIDIA's first with programmable shaders. While the GeForce GTX 560 Ti isn't a "first" in anything as far as its feature set goes, I think "Ti" has more to do with shaping the brand: telling buyers that the product has a little more to offer for its price, and that it sits a step above the price point the GTX 460 set for itself, while remaining a performance-segment model.

Getting into the fine print of NVIDIA's offering, the GeForce GTX 560 Ti is based on NVIDIA's new GF114 chip. As far as specifications and transistor count go, the GF114 is identical to the GF104 on which the GTX 460 was based, except that all 384 of the physically present CUDA cores are enabled, and that it uses the same secret sauce (read: electrical enhancements) that let GF110, an evolved clone of the GF100, post far better power-consumption figures. The 384 CUDA cores apart, there is a 256-bit-wide GDDR5 memory interface, 32 ROPs, branched geometry processing, and the most immediate fruit of those electrical enhancements, clock speeds: 822 MHz core, 1644 MHz CUDA cores, and 1002 MHz (4.00 GHz effective) memory. As far as features go, the GeForce GTX 560 Ti doesn't bring anything we haven't already seen with the GTX 460; this round is all about performance per watt and per dollar.

Card                    Shader units  ROPs   GPU         Transistors  Memory Size  Bus Width   Core Clock  Memory Clock  Price
GeForce GTX 460 768 MB  336           24     GF104       1950M        768 MB       192 bit     675 MHz     900 MHz       $160
GeForce GTX 460 1 GB    336           32     GF104       1950M        1024 MB      256 bit     675 MHz     900 MHz       $200
Radeon HD 6850          960           32     Barts       1700M        1024 MB      256 bit     775 MHz     1000 MHz      $180
Radeon HD 5850          1440          32     Cypress     2154M        1024 MB      256 bit     725 MHz     1000 MHz      $260
GeForce GTX 470         448           40     GF100       3200M        1280 MB      320 bit     607 MHz     837 MHz       $260
Radeon HD 6870          1120          32     Barts       1700M        1024 MB      256 bit     900 MHz     1050 MHz      $240
GeForce GTX 560 Ti      384           32     GF114       1950M        1024 MB      256 bit     823 MHz     1002 MHz      $250
Radeon HD 5870          1600          32     Cypress     2154M        1024 MB      256 bit     850 MHz     1200 MHz      $360
GeForce GTX 570         480           40     GF110       3000M        1280 MB      320 bit     732 MHz     950 MHz       $330
GeForce GTX 480         480           48     GF100       3200M        1536 MB      384 bit     700 MHz     924 MHz       $450
GeForce GTX 580         512           48     GF110       3000M        1536 MB      384 bit     772 MHz     1002 MHz      $500
Radeon HD 5970          2x 1600       2x 32  2x Cypress  2x 2154M     2x 1024 MB   2x 256 bit  725 MHz     1000 MHz      $580
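As a quick sanity check on those specifications, theoretical memory bandwidth and peak shader throughput can be derived directly from the table. A minimal sketch, assuming the GDDR5 quad-data-rate factor and Fermi's shader clock running at twice the core clock:

```python
# Back-of-the-envelope figures from the specification table above.

def mem_bandwidth_gbs(bus_width_bits, mem_clock_mhz):
    """Theoretical memory bandwidth in GB/s.
    GDDR5 transfers 4 bits per pin per command-clock cycle."""
    return bus_width_bits / 8 * mem_clock_mhz * 4 / 1000

def fermi_gflops(cuda_cores, core_clock_mhz):
    """Peak single-precision GFLOPS. On Fermi the shader clock is 2x the
    core clock, and each CUDA core retires one FMA (2 FLOPs) per cycle."""
    shader_clock_mhz = 2 * core_clock_mhz
    return cuda_cores * 2 * shader_clock_mhz / 1000

# GeForce GTX 560 Ti: 256-bit bus, 1002 MHz GDDR5, 384 cores at 823 MHz
print(mem_bandwidth_gbs(256, 1002))  # ~128.3 GB/s
print(fermi_gflops(384, 823))        # ~1264 GFLOPS
```

Running the same formulas against the other cards in the table reproduces their published bandwidth figures as well, e.g. ~134.4 GB/s for the Radeon HD 6870.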

The Card

Graphics Card Front
Graphics Card Back

NVIDIA's GTX 560 Ti reference design comes with a black cooler and black PCB, but board partners are free to customize all aspects of the card to suit their positioning and product strategy.

Graphics Card Height

The GeForce GTX 560 Ti occupies two slots in your system.

Monitor Outputs, Display Connectors

The card has two DVI ports and one mini-HDMI port. The output logic is not as flexible as on AMD's latest GPUs: AMD board vendors are free to combine six TMDS links into any output configuration they want (with dual-link DVI consuming two links), while on NVIDIA you are fixed to two DVI outputs plus one HDMI/DisplayPort. NVIDIA confirmed that only two displays can be driven at the same time, so a three-monitor setup requires two cards.

NVIDIA has included an HDMI audio device inside the GPU, which does away with the requirement of connecting an external audio source to the card for HDMI audio. The HDMI interface is HDMI 1.3a compatible, which includes Dolby TrueHD, DTS-HD, AC-3, DTS, and up to 7.1-channel audio at 192 kHz / 24-bit. NVIDIA also claims full support for the 3D portion of the HDMI 1.4 specification, which will become important later this year when the first Blu-ray titles ship with support for 3D output.


You may combine up to two GeForce GTX 560 Ti cards in SLI for increased performance or improved image-quality settings.

Graphics Card Teardown PCB Front
Graphics Card Teardown PCB Back

Here are the front and the back of the card; high-res versions are also available (front, back). If you choose to use these images for voltmods etc., please include a link back to this site or let us post your article.

A Closer Look

Graphics Card Cooler Front
Graphics Card Cooler Back

NVIDIA's cooler uses a vapor-chamber technology heatplate to maximize heat transfer between the GPU and the rest of the heatsink. You can also see above that the heatsink cools secondary components like voltage regulation circuitry and memory chips.

Graphics Card Power Plugs

The card has two 6-pin PCI-Express power connectors.

Graphics Card Memory Chips

The GDDR5 memory chips are made by Samsung, and carry the model number K4G10325FE-HC04. They are specified to run at 1250 MHz (5000 MHz GDDR5 effective).
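Those chips are rated well above the card's reference memory clock of 1002 MHz, which hints at overclocking headroom. A quick estimate, with the caveat that what the chips are specified for is only an upper bound; real-world results depend on the memory controller and board design:

```python
# Headroom implied by the Samsung chip rating vs. the reference clock.
# The chip rating is only an upper bound; actual overclocking results
# depend on the memory controller and board design.
rated_mhz = 1250      # K4G10325FE-HC04 rated command clock (5.0 Gbps)
reference_mhz = 1002  # GTX 560 Ti reference memory clock
headroom_pct = (rated_mhz / reference_mhz - 1) * 100
print(f"{headroom_pct:.1f}% above reference")  # ~24.8%
```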


NVIDIA has chosen an ON Semiconductor NCP5388 voltage controller for this board. While it does not support voltage control via I2C, NVIDIA's driver exposes a voltage-control interface that works via VID.

Graphics Chip GPU

NVIDIA's GF114 graphics processor is made on a 40 nm process at TSMC in Taiwan. It uses the same architecture as NVIDIA's GF104, but with transistor-level improvements to reduce power consumption.

Test System

Test System - VGA Rev. 12
CPU: Intel Core i7 920 @ 3.8 GHz (Bloomfield, 8192 KB Cache)
Motherboard: Gigabyte X58 Extreme (Intel X58 & ICH10R)
Memory: 3x 2048 MB Mushkin Redline XP3-12800 DDR3 @ 1520 MHz 8-7-7-16
Harddisk: WD Caviar Black 6401AALS 640 GB
Power Supply: Akasa 1200 W
Software: Windows 7 64-bit
Drivers: GTX 560 Ti: 266.56; GTX 570 & 580: 263.09; other NVIDIA cards: 260.99; ATI: Catalyst 10.11
Display: LG Flatron W3000H 30" 2560x1600
Benchmark scores in other reviews are only comparable when this exact same configuration is used.
  • All video card results were obtained on this exact system with the exact same configuration.
  • All games were set to their highest quality setting.
Each benchmark was tested at the following settings and resolution:
  • 1024 x 768, No Anti-aliasing. This is a standard resolution without demanding display settings.
  • 1280 x 1024, 2x Anti-aliasing. Common resolution for most smaller flatscreens today (17" - 19"). A bit of eye candy turned on in the drivers.
  • 1680 x 1050, 4x Anti-aliasing. Most common widescreen resolution on larger displays (19" - 22"). Very good looking driver graphics settings.
  • 1920 x 1200, 4x Anti-aliasing. Typical widescreen resolution for large displays (22" - 26"). Very good looking driver graphics settings.
  • 2560 x 1600, 4x Anti-aliasing. Highest possible resolution for commonly available displays (30"). Very good looking driver graphics settings.
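To put those test resolutions in perspective, the raw pixel count (and hence, roughly, the fill-rate and shading load before anti-aliasing) scales like this:

```python
# Pixel counts of the benchmark resolutions, relative to the 1024x768 baseline.
# Anti-aliasing multiplies the per-pixel cost further with the sample count.
resolutions = [(1024, 768), (1280, 1024), (1680, 1050), (1920, 1200), (2560, 1600)]
base = 1024 * 768
for w, h in resolutions:
    print(f"{w}x{h}: {w * h / 1e6:.2f} MPix ({w * h / base:.2f}x baseline)")
```

The 2560x1600 run pushes more than five times the pixels of the 1024x768 run, which is why the highest resolution separates the cards most clearly.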
