
NVIDIA GeForce GTX 770 2 GB Review


Introduction



After NVIDIA scored a big technological victory over AMD in performance-per-watt and outright performance with its "Kepler" GPU architecture, AMD responded to the GeForce GTX 600 series rather lukewarmly and doesn't intend to launch its next GPU generation until much later this year (think Christmas). It was only natural, then, for NVIDIA to milk its existing GK104 silicon for another generation of GeForce GTX products with a few superficial additions. The GeForce GTX 770 we have with us today is the first of many such products in NVIDIA's pipeline over the next few months.

The 2880-core GK110 was always going to be NVIDIA's most flexible chip. With GK104 beating AMD's "Tahiti" in single-GPU performance and efficiency, the GK110 never had to feature in the GeForce GTX 600 series. It made its consumer debut with only 2688 enabled cores on the GeForce GTX TITAN, which warranted a $1,000 price point and, in turn, made the $650 commanded by the 2304-core GeForce GTX 780 look good. The rest of NVIDIA's GeForce GTX 700 series product stack is in for a pseudo-upgrade. The GeForce GTX 770 looks a lot like the GeForce GTX 680 on paper, and it is rumored that the GeForce GTX 760 Ti could bear a similar resemblance to the GeForce GTX 670, the GTX 760 to the GTX 660 Ti, and so on. I call this card a pseudo-upgrade because its higher specifications don't come with a lower price: the GTX 770 is priced roughly on par with the GTX 680, and other models in the series could follow similar pricing trends.

To be fair to NVIDIA, the GeForce GTX 770 isn't a complete and utter rebranding of the GeForce GTX 680 (à la GeForce 8800 GT to 9800 GT). Sure, it is driven by the same GK104 silicon with the same exact core configuration of 1536 cores, 128 TMUs, 32 ROPs, and a 256-bit wide memory interface; but it features a different reference-design PCB that comes with a stronger VRM to support higher clock speeds, and the new GPU Boost 2.0 technology. The similarities the GTX 770 bears to the GTX 680 are in that sense more along the lines of those between the GeForce 8800 GTS-512 and GeForce 9800 GTX.



The GeForce GTX 770 ships with the highest reference clock speeds of any NVIDIA GPU to date. Its core is clocked at 1046 MHz, with a GPU Boost frequency of 1085 MHz, and its blisteringly fast 7.00 GHz (effective) GDDR5 memory churns out 224 GB/s of bandwidth. The card features 2 GB of memory, but 4 GB variants could come out pretty soon.

The best part about the GeForce GTX 770, which could go a long way in making people forget its commonalities with the GTX 680, is its product design. The GTX 770 features the same sexy cooling solution as the GeForce GTX 780 and GTX TITAN, combining a space-age design with exotic materials, such as magnesium alloys. Installed into a system, the card would be tough to distinguish from a GTX 780 or GTX TITAN.

GTX 770 Market Segment Analysis

| | GeForce GTX 570 | GeForce GTX 580 | GeForce GTX 660 Ti | GeForce GTX 670 | Radeon HD 7970 | GeForce GTX 770 | HD 7970 GHz Ed. | GeForce GTX 680 | GeForce GTX 780 | GeForce GTX Titan |
|---|---|---|---|---|---|---|---|---|---|---|
| Shader Units | 480 | 512 | 1344 | 1344 | 2048 | 1536 | 2048 | 1536 | 2304 | 2688 |
| ROPs | 40 | 48 | 24 | 32 | 32 | 32 | 32 | 32 | 48 | 48 |
| Graphics Processor | GF110 | GF110 | GK104 | GK104 | Tahiti | GK104 | Tahiti | GK104 | GK110 | GK110 |
| Transistors | 3000M | 3000M | 3500M | 3500M | 4310M | 3500M | 4310M | 3500M | 7100M | 7100M |
| Memory Size | 1280 MB | 1536 MB | 2048 MB | 2048 MB | 3072 MB | 2048 MB | 3072 MB | 2048 MB | 3072 MB | 6144 MB |
| Memory Bus Width | 320 bit | 384 bit | 192 bit | 256 bit | 384 bit | 256 bit | 384 bit | 256 bit | 384 bit | 384 bit |
| Core Clock | 732 MHz | 772 MHz | 915 MHz+ | 915 MHz+ | 925 MHz | 1046 MHz+ | 1050 MHz | 1006 MHz+ | 863 MHz+ | 837 MHz+ |
| Memory Clock | 950 MHz | 1002 MHz | 1502 MHz | 1502 MHz | 1375 MHz | 1753 MHz | 1500 MHz | 1502 MHz | 1502 MHz | 1502 MHz |
| Price | $250 | $310 | $280 | $370 | $380 | $400 | $450 | $430 | $650 | $1020 |

Architecture

As mentioned in the introduction, the GeForce GTX 770 has a lot in common with the GeForce GTX 680. Both are based on the same 28 nm GK104 silicon, circa March 2012. The chip implements the "Kepler" microarchitecture, with four independent graphics processing clusters (GPCs), each holding two streaming multiprocessors (SMXs) with 192 CUDA cores apiece, for a total of 1536 cores. Each SMX also packs 16 texture mapping units (TMUs), which add up to the chip's 128 TMUs. The GK104 features a 256-bit wide GDDR5 memory interface, which raised the bar for memory clock speeds when the GTX 680 became the first card to ship with memory clocked at 6 GHz (effective). A slightly beefier VRM allows the GTX 770 to raise that bar yet again, with a 7 GHz out-of-the-box memory clock offering 224 GB/s of bandwidth.
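The 224 GB/s figure follows directly from the effective memory clock and the bus width; a quick back-of-the-envelope sketch (the function name is ours, used purely for illustration):

```python
def gddr5_bandwidth_gbps(effective_clock_ghz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: effective clock (GHz) x bus width (bytes)."""
    return effective_clock_ghz * (bus_width_bits / 8)

# GTX 770: 7.00 GHz effective GDDR5 on a 256-bit bus
print(gddr5_bandwidth_gbps(7.00, 256))  # 224.0 GB/s
# GTX 680: 6.00 GHz effective on the same 256-bit bus
print(gddr5_bandwidth_gbps(6.00, 256))  # 192.0 GB/s
```

The same arithmetic explains why the GTX 770's bandwidth advantage over the GTX 680 is exactly proportional to its memory clock bump: the bus width is unchanged.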



Aside from giving the GK104 a VRM strong enough to maintain those high clock speeds, the new reference-design PCB allowed NVIDIA to deploy its brand-new GPU Boost 2.0 technology, which makes higher GPU core clock speeds available to demanding applications by taking into account not only power draw but also GPU temperature. Lower operating temperatures are rewarded with more boosting headroom, which creates a real incentive to buy cards with cooling solutions that perform better than NVIDIA's reference design.

GeForce Experience

With last week's GeForce 320.18 WHQL drivers, NVIDIA released the first stable version of GeForce Experience. The application simplifies game configuration for PC gamers who aren't well-versed in all the technobabble required to get a game running at the best possible settings on the hardware available to them. GeForce Experience is aptly named because it completes the experience of owning a GeForce graphics card; the PC, being the best possible way to play video games, should not be any harder to use than a gaming console.



With your permission, the software scans your system for installed games and recommends optimal settings that give you the highest visual detail at consistent, playable frame rates. It is also tuned to dial back settings that carry a big performance cost for little visual gain. You could easily make these changes yourself in-game, probably through trial and error, but you can trust GeForce Experience to pick reasonably good settings if you are too lazy to do so yourself. I imagine the software will be particularly useful for gamers who aren't familiar with the intricacies of game configuration yet want the best possible levels of detail.

The simplicity of inserting a disc or cartridge and turning the device on is what attracts gamers to consoles. Gamers who pick the PC platform should hence never be faulted for their lack of knowledge of graphics settings, and that's what GeForce Experience addresses. Price is a non-argument: $300 gets you a console, but the same $300 can also get you a graphics card that turns your parents' Dell desktop into a gaming machine that eats consoles for breakfast. GeForce Experience keeps itself current by fetching settings data from NVIDIA each time you run it, and it will also keep your GeForce drivers up to date.

I gave GeForce Experience a quick try with Battlefield 3, and it picked a higher anti-aliasing mode that was still playable, so it does value image quality. It also takes the rest of the system into account, not just the GPU.

Packaging

Package Front


We received our card in a black, shiny NVIDIA package that seems to act as a placeholder for the final packaging design. I have to admit that this looks much better than a card wrapped in bubble wrap.

Final retail cards will include the usual goodies, like power cables, DVI adapter, and game coupons.