NVIDIA GeForce GTX TITAN 6 GB Review





Had someone told us a month ago that NVIDIA was going to launch the GeForce GTX Titan around this time, we'd have politely asked them to go jump off a skyscraper. Why? Because NVIDIA simply doesn't need it in its product stack right now. The performance lead AMD's HD 7970 GHz Edition holds over the GeForce GTX 680 is disputable at best, and the latter ships with better energy efficiency and acoustics. The GeForce GTX 670 continues to beat the HD 7950 Boost Edition, and the GTX 660 Ti trades blows with the HD 7950 while posting much better power and noise numbers. It's only with the "Pitcairn"-based HD 7800 series that AMD appears to have a solid footing. Even in the ultra-high-end segment, the dual-GPU GeForce GTX 690 scales near-perfectly over the GTX 680. So what prompted NVIDIA to rush out the GTX Titan? Is it even being rushed out to begin with? We'll have to look back at 2012 for some answers.

When AMD launched its Radeon HD 7970 in December 2011, it appeared for a brief moment as though AMD was set for 2012. Brief, because there was more than just arrogance behind NVIDIA's dismissal of AMD's new flagship GPU and the architecture driving it. NVIDIA had designed its "Kepler" GPU architecture under the assumption that the HD 7970 would be much faster than it ended up being, so the company realized that its second-best chip, the GK104, had a fair shot against the HD 7900 series.

The GK104 was really just a successor to the GF114, the chip that drives the performance-segment GeForce GTX 560 Ti. What followed was a frantic effort by NVIDIA to re-package the GK104 as a high-end product, the GeForce GTX 680, while shelving its best but most expensive chip, the GK110 (which drives the GTX Titan we're reviewing today). The gambit paid off when the GTX 680 snatched the performance crown from the HD 7970 in March. AMD may have responded with the faster HD 7970 GHz Edition in June, but that card flunked energy-efficiency and fan-noise tests big time. The GK110, meanwhile, wore a business suit before a t-shirt: NVIDIA first built the Tesla K20 GPU compute accelerator out of it, the part that powers the Titan supercomputing array. Normally you'd want your ASIC to prove itself in consumer applications before enterprise ones; Intel, for example, usually ships Core processors on new silicon before Xeon. The Titan supercomputer, by the way, is where the GTX Titan gets its name.

2013 is more than just another year for PC GPU makers. It's the year console giants Microsoft, Sony, and Nintendo each plan to ship their next-generation game consoles. The Wii U is already out, the PlayStation 4 was unveiled yesterday, and the Xbox "Durango" could follow closely. A rare commonality between the three is that all are rumored to be driven by AMD graphics. Although we don't expect any of these consoles to match the graphics processing prowess of even PCs from 2010, their mere introduction could draw an entire generation of gamers to adopt them, which could be bad for the PC platform. Now more than ever, the PC hardware scene needs some action, even if that means launching hardware such as the GTX Titan, which the product stack doesn't really need.

There's yet another factor at play, one more local to the PC platform. AMD recently announced its Never Settle Reloaded offer, in which most of its performance-through-extreme graphics cards across AIB partners ship with games that are extremely relevant to the season: Crysis 3, the new Tomb Raider, BioShock Infinite, and DmC: Devil May Cry. Buyers of CrossFire HD 7900 series setups are rewarded with up to six games (add Far Cry 3, Hitman: Absolution, and Sleeping Dogs to that mix). In comparison, NVIDIA's "Free to Play" bundle, which gives you about $25–$50 worth of in-game currency per game with free-to-play titles such as Hawken, World of Tanks, and PlanetSide 2, only makes NVIDIA look bad. In a recent teleconference, AMD called the GeForce GTX Titan a reaction to "Never Settle Reloaded" and to AMD getting cozy with leading game studios in general.

The introduction of the GeForce GTX Titan at this time could hence be a product of unorthodox but effective market foresight on NVIDIA's part.

In this review, we will take a single GeForce GTX Titan for a spin by taking it apart and exploring some of its new features. We have also posted a GeForce GTX Titan SLI and Tri-SLI review.

GeForce GTX Titan Market Segment Analysis

|                    | GTX 580   | HD 7970   | HD 7970 GHz Ed. | GTX 680   | GTX 590    | GTX Titan | GTX 690    |
|--------------------|-----------|-----------|-----------------|-----------|------------|-----------|------------|
| Shader Units       | 512       | 2048      | 2048            | 1536      | 2x 512     | 2688      | 2x 1536    |
| ROPs               | 48        | 32        | 32              | 32        | 2x 48      | 48        | 2x 32      |
| Graphics Processor | GF110     | Tahiti    | Tahiti          | GK104     | 2x GF110   | GK110     | 2x GK104   |
| Transistors        | 3000M     | 4310M     | 4310M           | 3500M     | 2x 3000M   | 7100M     | 2x 3500M   |
| Memory Size        | 1536 MB   | 3072 MB   | 3072 MB         | 2048 MB   | 2x 1536 MB | 6144 MB   | 2x 2048 MB |
| Memory Bus Width   | 384 bit   | 384 bit   | 384 bit         | 256 bit   | 2x 384 bit | 384 bit   | 2x 256 bit |
| Core Clock         | 772 MHz   | 925 MHz   | 1050 MHz        | 1006 MHz+ | 607 MHz    | 837 MHz+  | 915 MHz+   |
| Memory Clock       | 1002 MHz  | 1375 MHz  | 1500 MHz        | 1502 MHz  | 855 MHz    | 1502 MHz  | 1502 MHz   |