
GPU Boost 2.0

NVIDIA updated its GPU Boost technology with the GTX Titan. First introduced with the GeForce GTX 680 (detailed here, read it if you are not familiar with GPU Boost), GPU Boost replaces fixed clock speeds with a range between a nominal base clock and a boost frequency. If the GPU senses that an application could use more performance, it automatically overclocks the graphics card as far as the power limit allows. With GPU Boost 2.0, temperature is also taken into account when determining the GPU's dynamic overclocking headroom.

The temperature-based boost limits could help enthusiasts who use extreme cooling methods, such as liquid nitrogen: with temperature kept low, the power-based boost limit is relaxed, which allows higher clock speeds as long as the GPU stays within its temperature limit. Also introduced is overvoltage, which lets you manually raise the GPU core voltage. Adjustments can be made on the fly, to stabilize your overclock offsets.
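Conceptually, the clock the card ends up at is governed by whichever of the two limits is the tighter constraint. The following is a hypothetical sketch of that idea; the constants, function names, and simple linear mapping are ours for illustration and do not reflect NVIDIA's actual firmware logic.

```python
# Illustrative sketch: Boost 2.0 picks a clock permitted by BOTH the power
# target and the temperature target. All values and the linear back-off
# below are assumptions, not NVIDIA's real algorithm.

BASE_CLOCK = 836      # MHz, GTX Titan base clock
MAX_BOOST = 993       # MHz, maximum boost clock we observed
POWER_TARGET = 250.0  # W, board power target
TEMP_TARGET = 80.0    # °C, default temperature target

def boost_clock(power_draw_w: float, gpu_temp_c: float) -> int:
    """Return the clock (MHz) allowed by the tighter of the two limits."""
    # Fraction of headroom left under each limit (0 = at or over the limit).
    power_headroom = max(0.0, 1.0 - power_draw_w / POWER_TARGET)
    temp_headroom = max(0.0, 1.0 - gpu_temp_c / TEMP_TARGET)
    # The binding constraint is whichever headroom is smaller.
    headroom = min(power_headroom, temp_headroom)
    # Map the remaining headroom onto the base-to-boost clock range.
    return round(BASE_CLOCK + (MAX_BOOST - BASE_CLOCK) * min(1.0, headroom * 5))
```

This is why extreme cooling helps: keeping `gpu_temp_c` low means the temperature term never becomes the binding constraint, leaving only the power target in play.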

The following graph shows how changes in GPU temperature drive the selected clock. We tested this with a static scene that renders identical frames, which produces a constant GPU and memory load that a normal game scene could not provide.

GPU clock is plotted on the left vertical axis using the blue MHz scale. Temperature is plotted on the right vertical axis using the red °C scale. Time runs along the horizontal axis.

As you can see, clock behavior is fundamentally different from Boost 1.0. The card immediately jumps to its maximum boost clock (993 MHz) and stays there as long as temperature allows. Once the card reaches the temperature target of 80°C, Boost 2.0 quickly dials frequencies down to slow the temperature increase. There is then a brief period in which Boost 2.0 tries to raise clocks again, in the hope that a less demanding game scene will allow higher clocks. Once that proves futile (we used a static scene), clocks drop to the base clock of 836 MHz to hold temperature at around 80°C, and they stay there as the temperature keeps rising. Only once the GPU reaches 95°C do clocks fall to 418 MHz to avoid damaging the card. At 100°C, the card shuts off entirely to prevent any damage (we had to block the fan intake for the card to actually run that hot).
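The steady-state behavior we measured can be summarized as a simple clock ladder. The thresholds and clocks below are the values from our test run on our sample; the actual firmware transitions more gradually than this step function.

```python
# Clock ladder we observed on our GTX Titan sample under a sustained static
# load. This is a summary of our measurements, not NVIDIA's firmware logic.

def titan_clock_for_temp(temp_c: float) -> int:
    """Clock (MHz) our card settled at for a given sustained GPU temperature."""
    if temp_c >= 100:
        return 0      # card shuts off to prevent damage
    if temp_c >= 95:
        return 418    # emergency throttle, roughly half the base clock
    if temp_c >= 80:
        return 836    # temperature target reached: hold base clock
    return 993        # below target: full maximum boost clock
```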

Voltage increase

With Titan, NVIDIA introduces a voltage-control option called "overvoltaging," which lets enthusiasts unlock extra voltage in software to facilitate additional overclocking.

Using EVGA Precision, we increased the GPU voltage by the maximum level available (+0.038 V up to 1.20 V). We did not increase clock speeds, the power target, temperature target, or any other setting.

In all the following graphs, the blue line shows the performance improvement (or reduction) of the GTX Titan in comparison to its baseline performance at 80°C (black line). We used our test suite at 1920x1200 for all these tests. The dotted green line shows the average of the blue line.

As the benchmark results show, we obtained a small performance gain just by making the increased voltage available to the boost clock algorithm. Normally, overvoltage is used to stabilize a manual overclock, but NVIDIA's Boost 2.0 appears smart enough to exploit that extra headroom on its own.

Temperature Target

Using software tools provided by the board partners, users can adjust the GPU temperature target to their liking. If you want the card to boost to higher clocks, for example, you can adjust the temperature target up (for example, from the default of 80°C to 85°C). The GPU will then boost to higher clock speeds until it reaches the new temperature target.
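The effect of raising the target can be illustrated with a minimal sketch: at the same GPU temperature, a higher target still permits the full boost clock. The function name and the all-or-nothing clock choice are our simplification, not NVIDIA's implementation.

```python
# Illustrative sketch of the temperature target: with a higher target, the
# same GPU temperature leaves headroom, so the card keeps boosting.
# Values are the GTX Titan's published clocks; the logic is our assumption.

BASE_CLOCK = 836  # MHz
MAX_BOOST = 993   # MHz

def permitted_clock(gpu_temp_c: float, temp_target_c: float) -> int:
    """Boost to the maximum clock while under target, base clock at or above it."""
    return MAX_BOOST if gpu_temp_c < temp_target_c else BASE_CLOCK

# At 82°C, the stock 80°C target forces base clock, while a user-raised
# 85°C target still permits the full boost clock.
```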

With GPU Boost 2.0 being temperature-dependent, NVIDIA suggests that adding a waterblock onto the Titan could result in additional performance benefits because the card can boost higher for longer, since it does not have to worry about GPU temperature getting too high.

We set the fan speed of our card to maximum (which is limited to 85% by the vBIOS), raised the temperature target to its 94°C maximum, and ran our test suite. The results show that real-life performance increases by an average of 2.5%. This is representative of what to expect from the card under watercooling, without any additional overclocking: in our test, the GPU temperature never exceeded 65°C, so temperature limitations were effectively removed and the card could boost as high as the power limit would allow. It also shows that the card ships with very decent fan settings out of the box, since cranking up fan speed just to gain this little performance is certainly not worth it.