NVIDIA's shady trick to boost the GeForce 9600GT Review


How NVIDIA made the 9600 GT gain extra performance... secretly

When we first reviewed NVIDIA's new GeForce 9600 GT (review here), we noticed a discrepancy between the core clock speed reported by the driver and the core clock speed read from the clock generator directly.



RivaTuner's overclocking module and GPU-Z read the clocks from the NVIDIA driver and display whatever the driver returns as the core clock. RivaTuner's monitoring module, however, accesses the clock generator inside the GPU directly and gets its information from there. A PLL (phase-locked loop) generates clocks as follows: it is fed a base frequency from a crystal oscillator, typically in the 13 to 27 MHz range, then multiplies and divides that frequency by integer values to reach the final clock speed. For example: 630 MHz = 27 MHz * 70 / 3.
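To make the arithmetic concrete, here is a minimal sketch of the PLL calculation in Python. The function name and structure are ours for illustration only; real hardware encodes the multiplier and divider in PLL registers rather than exposing them like this:

```python
def pll_output_mhz(crystal_mhz: float, multiplier: int, divider: int) -> float:
    # A simple PLL multiplies its reference (crystal) frequency by an
    # integer N and divides by an integer M: f_out = f_ref * N / M.
    return crystal_mhz * multiplier / divider

# The example from the text: a 27 MHz crystal with N = 70, M = 3
print(pll_output_mhz(27.0, 70, 3))  # 630.0 MHz
```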

The information about which crystal is used is stored in the GPU's strap registers, which are initialized from a resistor configuration on the PCB and from the VGA BIOS. In the case of the GeForce 9600 GT, the strap says "27 MHz" crystal frequency, and RivaTuner's monitoring code applies that to its clock readings, resulting in: 783 MHz = 27 MHz * 29 / 1. The NVIDIA driver, however, uses 25 MHz for its calculation: 725 MHz = 25 MHz * 29 / 1.
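Plugging both crystal assumptions into the same PLL settings reproduces the discrepancy (again a sketch; N = 29 and M = 1 are the values quoted above):

```python
# Same PLL settings read back from the hardware: N = 29, M = 1.
n, m = 29, 1

print(27 * n / m)  # 783.0 MHz: RivaTuner monitoring, using the 27 MHz strap value
print(25 * n / m)  # 725.0 MHz: NVIDIA driver, assuming a 25 MHz crystal
```

If the strap value reflects the crystal actually fitted to the board, the card is running roughly 8% faster than the frequency the driver reports.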

This explains the clock difference. The effect can only be seen on the core frequency; the memory PLL always runs from the 27 MHz reference.

We verified this personally on three 9600 GT cards from various manufacturers, and other users have confirmed it as well.

Now the big question is: what is going on here? When I asked NVIDIA about this phenomenon, they replied:
The crystal frequency is 25MHz on 9600GT. Clock is 650MHz.
No further details were given. A later follow-up email asking for an update on this went unanswered.