
ASUS GTX 760 DirectCU II OC 2 GB Review


Introduction



Summer is upon us, and for a lot of people, particularly those taking a break from studies, it is time to head back home, dust off the old gaming PC, and check out some of the new games on offer. With game studios finally putting their money on DirectX 11 and cutting-edge visual technologies, many of those PCs are due for a graphics upgrade, and $200 to $300 is what an ideal one should cost. It's either that or wait until the winter holidays for the next-gen consoles. NVIDIA sensed this potential rush for summertime graphics upgrades and pulled an ace from up its sleeve: the GeForce GTX 760. We take a look at its hand.



The GeForce GTX 760 is a bit of a strangelet when it comes to market positioning. It's designed to succeed the GeForce GTX 660, but in reality it displaces the GeForce GTX 660 Ti from the product stack. At $249.99, it's priced bang in the middle of the price/performance sweet-spot segment. The product-stack roadmap given to us by NVIDIA confirms just that.



Unlike the GeForce GTX 770, which has a curious lot in common with the GeForce GTX 680 apart from higher clock speeds and GPU Boost 2.0, the GeForce GTX 760 features a core configuration never before implemented on a retail SKU. It's based on the same 28 nm GK104 silicon, but features just six of the chip's eight streaming multiprocessors, which translates into a configuration with 1,152 CUDA cores and 96 texture mapping units (TMUs). Unlike on the GTX 660 Ti, NVIDIA left the memory and raster operations subsystems untouched, giving the chip a 256-bit wide GDDR5 memory interface and 32 ROPs.
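Those shader and texture-unit counts follow directly from the SMX count. The quick sketch below works them out from the per-SMX figures of the Kepler architecture (192 CUDA cores and 16 texture units per SMX); it is just arithmetic, not anything sourced from NVIDIA tooling.

```python
# Sketch: derive shader and TMU counts from the number of enabled SMX units
# on a GK104-based card. Kepler packs 192 CUDA cores and 16 texture units
# into each streaming multiprocessor (SMX).
CORES_PER_SMX = 192
TMUS_PER_SMX = 16

def kepler_config(enabled_smx: int) -> dict:
    """Return the CUDA core and TMU counts for a given SMX count."""
    return {
        "cuda_cores": enabled_smx * CORES_PER_SMX,
        "tmus": enabled_smx * TMUS_PER_SMX,
    }

print(kepler_config(8))  # full GK104 (GTX 680/770): 1536 cores, 128 TMUs
print(kepler_config(6))  # GeForce GTX 760:          1152 cores,  96 TMUs
```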

The chip features the new GPU Boost 2.0 technology, which takes temperature into account alongside power and load. If the GPU is cool enough (under the 80°C mark), it has greater headroom to run at boost frequencies under load, and therein lies the incentive to opt for custom-design graphics cards with competent cooling solutions. The memory is clocked at 6 Gbps, yielding a decent 192 GB/s of memory bandwidth, a 33 percent increase over the GTX 660 Ti and GTX 660.
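That bandwidth figure is straightforward to verify from the data rate and bus width; the short calculation below is a sketch that also reproduces the 33 percent increase over the 192-bit GTX 660 Ti and GTX 660.

```python
# Sketch: memory bandwidth = effective data rate (Gbps per pin) x bus width (bits) / 8.
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

gtx_760 = bandwidth_gbs(6.0, 256)     # 192.0 GB/s (256-bit bus)
gtx_660_ti = bandwidth_gbs(6.0, 192)  # 144.0 GB/s (192-bit bus, also GTX 660)

print(f"GTX 760:    {gtx_760:.0f} GB/s")
print(f"GTX 660 Ti: {gtx_660_ti:.0f} GB/s")
print(f"Increase:   {100 * (gtx_760 / gtx_660_ti - 1):.0f}%")  # ~33%
```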



In this review, we take a look at ASUS' GTX 760 DirectCU II OC graphics card. Built on a custom-designed PCB and topped with a new-generation DirectCU II cooler, the card offers overclocked speeds of 1006 MHz core, 1072 MHz GPU Boost, and 1502 MHz memory. The new DirectCU series cooler makes use of four heat pipes and two fans to cool the chip.

GTX 760 Market Segment Analysis

| Card | Shader Units | ROPs | Graphics Processor | Transistors | Memory Size | Memory Bus Width | Core Clock | Memory Clock | Price |
|---|---|---|---|---|---|---|---|---|---|
| GeForce GTX 660 | 960 | 24 | GK106 | 2540M | 2048 MB | 192 bit | 980 MHz+ | 1502 MHz | $195 |
| Radeon HD 7870 | 1280 | 32 | Pitcairn | 2800M | 2048 MB | 256 bit | 1000 MHz | 1200 MHz | $215 |
| GeForce GTX 580 | 512 | 48 | GF110 | 3000M | 1536 MB | 384 bit | 772 MHz | 1002 MHz | $310 |
| GeForce GTX 660 Ti | 1344 | 24 | GK104 | 3500M | 2048 MB | 192 bit | 915 MHz+ | 1502 MHz | $280 |
| GeForce GTX 760 | 1152 | 32 | GK104 | 3500M | 2048 MB | 256 bit | 980 MHz+ | 1502 MHz | $250 |
| ASUS GTX 760 DC II | 1152 | 32 | GK104 | 3500M | 2048 MB | 256 bit | 1006 MHz+ | 1502 MHz | $260 |
| Radeon HD 7950 | 1792 | 32 | Tahiti | 4310M | 3072 MB | 384 bit | 800 MHz | 1250 MHz | $270 |
| GeForce GTX 670 | 1344 | 32 | GK104 | 3500M | 2048 MB | 256 bit | 915 MHz+ | 1502 MHz | $345 |
| Radeon HD 7970 | 2048 | 32 | Tahiti | 4310M | 3072 MB | 384 bit | 925 MHz | 1375 MHz | $370 |
| GeForce GTX 770 | 1536 | 32 | GK104 | 3500M | 2048 MB | 256 bit | 1046 MHz+ | 1753 MHz | $400 |
| Radeon HD 7970 GHz Ed. | 2048 | 32 | Tahiti | 4310M | 3072 MB | 384 bit | 1050 MHz | 1500 MHz | $410 |

Architecture

The GeForce GTX 760 is based on NVIDIA's winning GK104 silicon, which was first used in the previous generation and still drives a few SKUs in the new GeForce 700 series. The component hierarchy inside the GeForce GTX 760 is identical to that of every other chip based on the "Kepler" architecture. A memory interface, 512 KB of L2 cache, and display I/O are shared by four graphics processing clusters (GPCs), each of which pairs a raster engine (a combination of edge setup, rasterizer, and Z-cull) with two streaming multiprocessors (SMXs). The SMXs are the building blocks of Kepler GPUs, and each holds 192 CUDA cores along with its specialized components.



The GTX 760 is cut down differently than the GeForce GTX 660 Ti was. The GTX 660 Ti had seven of the chip's eight SMXs enabled, but parted with a quarter of its raster operations circuitry and memory bus width. The GeForce GTX 760, on the other hand, has just six of the eight SMXs enabled, yet retains the full complement of 32 ROPs and the full 256-bit wide memory bus.
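The comparison below makes that trade-off concrete; it is a minimal sketch using the figures above, and it assumes the same 6 Gbps memory both cards ship with.

```python
# Sketch: the two GK104 cut-downs compared on paper.
CORES_PER_SMX = 192

def gk104_part(smx: int, rops: int, bus_bits: int, mem_gbps: float = 6.0) -> dict:
    return {
        "cuda_cores": smx * CORES_PER_SMX,
        "rops": rops,
        "bus_width_bits": bus_bits,
        "bandwidth_gbs": mem_gbps * bus_bits / 8,
    }

gtx_660_ti = gk104_part(smx=7, rops=24, bus_bits=192)  # 1344 cores, 24 ROPs, 144 GB/s
gtx_760    = gk104_part(smx=6, rops=32, bus_bits=256)  # 1152 cores, 32 ROPs, 192 GB/s

for name, part in (("GTX 660 Ti", gtx_660_ti), ("GTX 760", gtx_760)):
    print(name, part)
```

Fewer shaders, but more ROPs and a third more memory bandwidth is the trade NVIDIA made with this SKU.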

GeForce Experience

With the GeForce 320.18 WHQL drivers, NVIDIA released the first stable version of GeForce Experience. The application simplifies game configuration for PC gamers who aren't well-versed in all the technobabble required to get a game running at the best possible settings on the hardware available to them. GeForce Experience is aptly named, as it completes the experience of owning a GeForce graphics card; the PC, being the best possible way to play video games, should not be any harder to use than a gaming console.



With your permission, the software scans your system for installed games and recommends optimal settings that give you the highest visual detail at consistent, playable frame rates. It also knows to dial back settings that carry a big performance cost for little visual gain. You could easily make these changes yourself in-game, probably through trial and error, but you can trust GeForce Experience to pick reasonably good settings if you are too lazy to do so yourself. I imagine the software is particularly useful for gamers who aren't familiar with the intricacies of game configuration yet want the best possible levels of detail.
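The underlying idea can be boiled down to a simple heuristic: pick the most demanding preset that the detected hardware can still run at a playable frame rate. The toy sketch below illustrates that concept only; it is not how GeForce Experience is actually implemented, and the preset names and frame-rate estimates are hypothetical.

```python
# Toy illustration of a settings-recommendation heuristic: choose the highest
# detail preset whose estimated frame rate stays at or above a playable target.
# The presets and fps estimates are hypothetical, not GeForce Experience data.
PLAYABLE_FPS = 40

estimated_fps = {            # ordered from highest to lowest detail
    "Ultra + 4x MSAA": 34,   # hypothetical estimate for a given GPU/CPU combo
    "Ultra": 48,
    "High": 62,
    "Medium": 85,
}

def recommend(estimates: dict, target: int = PLAYABLE_FPS) -> str:
    # Dicts keep insertion order, so the first playable hit is the most detailed one.
    for preset, fps in estimates.items():
        if fps >= target:
            return preset
    return "Low"

print(recommend(estimated_fps))  # -> "Ultra"
```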

The simplicity of inserting a disc or cartridge and turning on the device is what attracts gamers to consoles. Gamers who pick the PC platform should hence never be faulted for their lack of knowledge of graphics settings, and that's what GeForce Experience addresses. Price is a non-argument: $300 gets you a console, but the same $300 can also get you a graphics card that turns your parents' Dell desktop into a gaming machine that eats consoles for breakfast. GeForce Experience fetches fresh settings data from NVIDIA each time you run it and also keeps your GeForce drivers up to date.

I gave GeForce Experience a quick try with Battlefield 3, and it picked a higher AA mode that was still playable, so it does value image quality. It also takes the rest of the system into account, not just the GPU.

Packaging

[Images: package front and back]

