
MSI GTX 960 Gaming OC 2 GB Review


Introduction



NVIDIA recently launched the GTX 970 and GTX 980 based on its new "Maxwell" architecture, torching the market with an unreal combination of performance, low power draw, and low fan noise that makes them overkill for Full HD. The two can handle any game at QHD (1440p) and provide playable frame rates at 4K with some eye-candy lowered.

That creates the need for NVIDIA to come up with a GPU that's just right for Full HD, with power draw and pricing that benefit from the new architecture. "Maxwell" also presents NVIDIA with an opportunity to cut costs: its current sweet-spot graphics card, the GeForce GTX 760, is based on a 3.5-billion-transistor GPU with a die area of 294 mm², yet draws just 24 percent less power than the GTX 970 while delivering 51 percent lower performance. Since NVIDIA is still on the 28 nm process, it might as well build a smaller "Maxwell" GPU to cut costs, while hopefully passing those savings on to the consumer.
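Those relative figures imply a stark efficiency gap. A quick back-of-the-envelope calculation (our own arithmetic on the percentages above, not an NVIDIA figure) illustrates it:

```python
# Both values normalized to the GTX 970 (= 1.0).
gtx760_power = 1.00 - 0.24  # 24 percent less power draw than the GTX 970
gtx760_perf = 1.00 - 0.51   # 51 percent lower performance than the GTX 970

# Relative performance per watt of the GTX 760 vs. the GTX 970.
rel_perf_per_watt = gtx760_perf / gtx760_power
print(round(rel_perf_per_watt, 2))  # 0.64: roughly 36 percent worse perf/W
```

In other words, "Kepler" at 28 nm delivers only about two-thirds of "Maxwell's" performance per watt, which is exactly the gap a smaller Maxwell chip can exploit.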

Full HD is still the most popular gaming-PC resolution, and the advent of 4K and affordable 1440p hasn't managed to shake its dominance yet. People still seem to buy monitors based on panel size rather than resolution and are happy to hold onto a monitor for several years. There are, hence, three kinds of consumers: first, those on a tight budget with the ability to buy a reasonably big (24-inch) Full HD monitor for cheap; second, those who want to convert their vanilla desktops into gaming PCs to game on their Full HD TVs; and third, those holding onto an old Full HD monitor who upgrade to a new mid-range graphics card every other year. NVIDIA and AMD can hence ill-afford to leave this resolution unaddressed with each new architecture.



That brings us to NVIDIA's most important launch ahead of Spring-Summer 2015, the GeForce GTX 960. This card is based on a brand-new silicon codenamed GM206. NVIDIA's third chip based on "Maxwell", the GM206 is supposed to be a successor to the GK106 on which NVIDIA built the GeForce GTX 660. The new GTX 960, however, is meant to replace the GTX 660 and GTX 760 in the product stack, offering slightly higher performance at much lower power draw and noise, with greater room for price-cuts. It also brings some of the new features introduced with "Maxwell" to the masses, such as real-time voxel illumination, MFAA (multi-frame sampled anti-aliasing), Dynamic Super-Resolution, VR Direct, Turf Effects, and PhysX Flex, along with community favorites like G-Sync and ShadowPlay.

NVIDIA's GeForce GTX 960 is priced at just $200, which is lower than the $250 the GTX 760 launched at. NVIDIA has clearly passed some of the cost savings of a smaller chip with lower power draw (a cheaper VRM, fewer memory chips, and a lighter cooler) on to consumers. We are also convinced that there is room for more cost-cutting.



In this review, we have with us the MSI GTX 960 Gaming OC, featuring the company's signature Twin Frozr V cooling solution, MSI Gaming Series branding, and an interesting factory overclock. Just as on the company's GTX 970 and GTX 980, the cooler turns off completely at idle or during light gaming, which results in a perfectly noise-free experience.

In terms of pricing, MSI is sticking with NVIDIA's reference-design pricing for its GTX 960 Gaming, which has the card retail at $200.

GTX 960 Market Segment Analysis

Card                 Shaders  ROPs  GPU     Transistors  Memory   Bus      Core Clock  Memory Clock  Price
GeForce GTX 660      960      24    GK106   2540M        2048 MB  192 bit  980 MHz+    1502 MHz      $140
GeForce GTX 660 Ti   1344     24    GK104   3500M        2048 MB  192 bit  915 MHz+    1502 MHz      $260
GeForce GTX 760      1152     32    GK104   3500M        2048 MB  256 bit  980 MHz+    1502 MHz      $210
GeForce GTX 670      1344     32    GK104   3500M        2048 MB  256 bit  915 MHz+    1502 MHz      $270
GeForce GTX 960      1024     32    GM206   2940M        2048 MB  128 bit  1127 MHz+   1753 MHz      $200
MSI GTX 960 Gaming   1024     32    GM206   2940M        2048 MB  128 bit  1190 MHz+   1753 MHz      $200
Radeon HD 7970       2048     32    Tahiti  4310M        3072 MB  384 bit  925 MHz     1375 MHz      $350
Radeon R9 285        1792     32    Tonga   unknown      2048 MB  256 bit  918 MHz     1375 MHz      $210
GeForce GTX 770      1536     32    GK104   3500M        2048 MB  256 bit  1046 MHz+   1753 MHz      $300
GeForce GTX 680      1536     32    GK104   3500M        2048 MB  256 bit  1006 MHz+   1502 MHz      $340
Radeon R9 280X       2048     32    Tahiti  4310M        3072 MB  384 bit  1000 MHz    1500 MHz      $220
GeForce GTX 780      2304     48    GK110   7100M        3072 MB  384 bit  863 MHz+    1502 MHz      $300
Radeon R9 290        2560     64    Hawaii  6200M        4096 MB  512 bit  947 MHz     1250 MHz      $270
GeForce GTX 970      1664     64    GM204   5200M        4096 MB  256 bit  1051 MHz+   1750 MHz      $330

Architecture

The GM206 silicon on which the GeForce GTX 960 is based looks like a GM204 that's been sawed off sideways. At 1,024, it has exactly half the CUDA cores of the GM204, spread across two graphics processing clusters, along with half the memory bus width, half the TMUs at 64, and half the ROPs at 32. At 2 GB, the GTX 960 even has half the memory of the GTX 980. The transistor count of the GM206 is roughly 2.94 billion, about 17 percent less than that of the GK104 silicon on which the GTX 760 was based.



Don't be quick to write off the 128-bit memory bus just yet, as NVIDIA is backing it with a new lossless texture-compression technology that reduces memory bandwidth usage, effectively making the bus act wider than it physically is. The performance figures put out by the GTX 970, whose 256-bit memory bus lets it outperform the Radeon R9 290X and its 512-bit bus, give us no reason to doubt the memory choices made for the GTX 960. The narrow bus also lets NVIDIA lower costs by using just four memory chips on the PCB. Since the GTX 960 is meant for Full HD gaming, 2 GB shouldn't let you down in any of today's games. NVIDIA tells us that the GTX 960 has just enough muscle for DOTA 2 at 4K. That should draw scores of budding gamers who can spend just about $200 on their graphics card and $500 on a monitor, instead of messing things up with a $300 graphics card and a $400 QHD monitor.

The GM206 is structured just like a GM204 with half the number of components. At the heart of the Maxwell architecture is a redesigned streaming multiprocessor (SMM), the tertiary subunit of the GPU; variants of NVIDIA's GeForce GTX products are carved out by setting the number of SMM units at the chip's disposal. The chip features a PCI-Express 3.0 x16 bus interface, a 128-bit GDDR5 memory interface, and a display controller that supports as many as three Ultra HD displays, or five physical displays in total. The display controller introduces support for HDMI 2.0, which has enough bandwidth to drive Ultra HD displays at 60 Hz, and is even ready for 5K (5120x2880, four times the pixels of Quad HD). The 128-bit memory interface holds a standardized 2 GB of memory clocked at 7.00 GHz (GDDR5-effective), the same memory clock as on the GTX 980 and GTX 970, which works out to a bandwidth of 112 GB/s. With NVIDIA's memory-bandwidth-management sauce added to the mix, the company is talking about an unscientific "effective bandwidth" figure of around 144 GB/s. The core of the GTX 960 is clocked at 1127 MHz, with a maximum GPU Boost frequency of 1178 MHz.
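As a sanity check on that 112 GB/s figure, raw GDDR5 bandwidth is simply the bus width in bytes multiplied by the effective data rate (a back-of-the-envelope sketch, not NVIDIA's "effective bandwidth" math):

```python
def memory_bandwidth_gb_s(bus_width_bits: int, effective_clock_ghz: float) -> float:
    """Raw memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_width_bits / 8 * effective_clock_ghz

print(memory_bandwidth_gb_s(128, 7.00))  # GTX 960: 112.0 GB/s
print(memory_bandwidth_gb_s(256, 7.00))  # GTX 980/970: 224.0 GB/s
```

The compression scheme doesn't change this physical number; it reduces how many bytes need to cross the bus in the first place, which is where the higher "effective" figure comes from.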



GeForce Features

With each new architecture, NVIDIA introduces innovations in the consumer graphics space that go beyond simple feature-level compatibility with new DirectX versions. NVIDIA says its "Maxwell" based cards are DirectX 12 cards, although exact feature levels and requirements have not been finalized by Microsoft; support for OpenGL 4.4 has also been added. Beyond the APIs themselves, NVIDIA's GameWorks SDK gives game developers a set of easy-to-implement visual features that work through existing APIs.



According to NVIDIA, the first and most important is VXGI, or real-time voxel global illumination. VXGI adds realism to the way light behaves with different surfaces in a 3D scene by introducing volume pixels, or voxels, a new 3D graphics component. These are pixels with built-in three-dimensional data, so the way 3D objects interact with light looks more photo-realistic.



No new NVIDIA GPU architecture launch is complete without advancements in post-processing, particularly anti-aliasing. NVIDIA introduced an interesting feature called Dynamic Super Resolution (DSR), which it claims offers "4K-like clarity on a 1080p display". To us, it comes across as a really nice super-sampling AA algorithm with a filter.



Using GeForce Experience, you can enable DSR arbitrarily for 3D apps. The other new algorithm is MFAA (multi-frame sampled AA), which offers MSAA-like image quality at a roughly 30 percent lower performance cost. Using GeForce Experience, MFAA can hence be substituted for MSAA, perhaps even arbitrarily.



Moving on, NVIDIA introduced VR Direct, a technology designed for the reemerging VR headset market driven by the growing interest in Facebook's Oculus Rift. VR Direct is an API designed to reduce latency between the headset's input and the change on the display, governed by the principle that head movements are more rapid and unpredictable than pointing and clicking with a mouse.



To meet the need for realistic hair and grass rendering at a low performance cost, NVIDIA came up with Turf Effects. NVIDIA PhysX also got a much-needed feature-set update that introduces new gas-dynamics and fluid-adhesion effects; Epic's Unreal Engine 4 will implement the technology.

GeForce Experience

With its GeForce 320.18 WHQL drivers, NVIDIA released the first stable version of GeForce Experience. The application simplifies the process of configuring a game and is meant for PC gamers who aren't well-versed in the technobabble required to get a game running at the best-possible settings on the hardware available to them. GeForce Experience is aptly named, as it completes the experience of owning a GeForce graphics card; the PC, being the best way to play video games, should not be any harder to use than a gaming console.



NVIDIA ShadowPlay

GeForce Experience ShadowPlay is another feature NVIDIA recently debuted. ShadowPlay lets you record gaming footage or stream content in real time with a minimal performance drop in the game you're playing. The feature is handled by GeForce Experience, which lets you set hot-keys to toggle recording on the fly and configure output, format, quality, and more.


