I have just noticed, while OCing my graphics card and using GPU-Z to monitor all my GTX 260's temps, that on both the main tab and the sensors tab, GPU-Z does not show the true clock speeds — it shows whatever the overclock has been set to. We all know graphics cards have clock steps. For example, I have set my card to run at 730 MHz core, 1156 MHz memory and 1517 MHz on the shaders, but per the clock steps, and as correctly reported by EVGA Precision, the true clocks are in fact 729 MHz core, 1152 MHz memory and 1512 MHz shaders. I had never noticed this before, so I wondered if it is supposed to be like that. If not, it would be good for at least the sensors page to report the true clocks like EVGA Precision does.
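To illustrate what I mean by clock steps: the driver can only set the PLL to discrete frequencies, so a requested clock gets snapped down to the nearest achievable step. The step sizes below (13.5 MHz for the core, 54 MHz for the shaders) are my guesses based on the numbers above, not confirmed values, but they reproduce what EVGA Precision reports:

```python
def snap_clock(requested_mhz, step_mhz):
    """Round a requested clock down to the nearest achievable step.

    step_mhz values here are illustrative guesses for a GTX 260,
    not documented hardware constants.
    """
    return int(requested_mhz // step_mhz) * step_mhz

# Requested 730 MHz core with an assumed 13.5 MHz step -> 729 MHz
print(snap_clock(730, 13.5))
# Requested 1517 MHz shader with an assumed 54 MHz step -> 1512 MHz
print(snap_clock(1517, 54))
```

So GPU-Z appears to display the requested values on the left of that calculation, while EVGA Precision displays the snapped results.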