
NVIDIA's shady trick to boost the GeForce 9600GT

The crystal affects only the memory clock... better to leave it alone and do normal overclocking.
 
very nice investigation man!
 
Very interesting - thanks W1z. I too was puzzled by your earlier findings so it's pleasing to see that you found the reason behind it. I certainly am impressed by your deduction skills!
 
Can someone test whether LinkBoost is enabled when using a 9600GT? Just plug it into a 680i chipset board and check if the PCI-E clock is above 100 MHz. If it is above 100 MHz, it would mean that pretty much every 9600GT SLI benchmark on the net was actually run with the cards 0-25% overclocked.
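To put rough numbers on that 0-25% figure, here is a back-of-the-envelope sketch (purely illustrative; the 650 MHz starting clock and the simple linear scaling are my assumptions based on the article, not an official NVIDIA formula):

Code:
# Rough sketch: if the core clock is derived from the PCI-E reference clock
# instead of a fixed 27 MHz crystal, it scales linearly with the bus frequency.
NOMINAL_CORE_MHZ = 650     # assumed factory-OC core clock
NOMINAL_PCIE_MHZ = 100     # standard PCI-E reference clock

def effective_core_mhz(pcie_mhz):
    """Core clock if it simply tracks the PCI-E reference clock."""
    return NOMINAL_CORE_MHZ * pcie_mhz / NOMINAL_PCIE_MHZ

for pcie in (100, 105, 115, 125):   # 125 MHz would be the 25% case
    print(f"PCI-E {pcie} MHz -> core ~{effective_core_mhz(pcie):.0f} MHz (+{pcie - 100}%)")

So a board that quietly raises the bus to 125 MHz would be running the card at roughly 25% over its rated core clock.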
 
So that's how they did it!
I was going crazy on another thread wondering why the 9600GT is so close to the 8800GT even though it has half the shaders. It just didn't add up.
Well, now it does!

Thanks W1zzard for clearing that up :)

Yeah, I read your posts and wondered about it myself too; I wasn't expecting it to be so close to the 8800GT. It being an OC card that gets overclocked further explains the performance nicely :)

BTW, does the 9600GT have more core voltage than the 8800GT, if it can run so high on stock voltage?
 
I wonder why NVIDIA hid the 27MHz crystal instead of marketing it as a.. "feature" like some other companies would do? Seems a bit shady.
 
Hello,

Nvidia's Ntune and every other software program I've seen shows the correct value. Only RivaTuner, which doesn't support the 9600 GT even though it works, shows two different values for the core clock speed. Isn't this actually a bug, in a sense, if only one program is giving false readings?

I would think that nTune would show the same thing that RivaTuner does, but no other software does. It's just RivaTuner, so I don't think it's accurate; otherwise the core clock changes would show up in my 3DMark05/06 scores, GPU-Z, etc.

I just think RivaTuner is wrong. Maybe RT 2.07, with support for the 9600, will give correct readings.

I could be wrong but I don't think RT is accurate.
Chris
 
Nice read Wiz

Can the crystal be changed on the G92 (8800GTS), and would it give a big boost in performance?

I think it's not about the 9600 performing well without the crystal, it's about how you OC the card. With a 9600GT and an NVIDIA chipset you just enable LinkBoost and your card is overclocked. On the G92 and other cards you have to dig up OC programs to OC the card. Just two different methods.
 
Hello,

Nvidia's Ntune and every other software program I've seen shows the correct value.

You are telling me that if you raise the PCIe clock, the new core clock is reflected by other programs?
 
Hello,

Well, if the crystal thing is really legit, I'd like to know where the crystal for the memory clock is as well.

http://i29.tinypic.com/1zzjl3a.jpg

I get an inaccurate reading on both the core clock AND the memory clock. Look at my OC settings in the pic; neither is correct. The drivers have changed for the 9600, so I still don't see how the plugins are reading the drivers correctly since Riva isn't updated.

I still could be wrong but the memory clock isn't right either.
Chris
 
You are telling me that if you raise the PCIe clock, the new core clock is reflected by other programs?

No, I was saying that all other software shows my factory default clocks. The non-updated RivaTuner is the only software that shows clock speeds different from what I have set.

If I set my card to a 650 MHz core clock, nTune, ExperTool, 3DMark06, and Everest all show a 650 MHz core clock. RivaTuner is the only one showing a different value, and it doesn't support my drivers, so I just don't see how it's an accurate reading, is all.

Chris
 
Yes, you are correct. RivaTuner's sensor readings only pointed out that something strange is going on here that required more investigation, and as I mentioned in the article, RT's sensor readings are wrong. It's just something that makes you ask "what's going on here?" and then you dig and find your answer.
 
Hello,

I'm digging, but so far I see that almost all reviewers use RivaTuner. RT does not have support for G94 cards, and it doesn't support the 174.20 drivers either. So I think the readings are wrong because Riva doesn't know how to read the drivers.

The weird RivaTuner output:
http://i30.tinypic.com/2dqjdd4.jpg

GPU-Z output for the stock factory OC:
http://i26.tinypic.com/2e4btbo.jpg

Re-applied OC settings with ExperTool, shown alongside GPU-Z:
http://i30.tinypic.com/33capgz.jpg

This is why I honestly think RivaTuner is inaccurate more so than anything.
Chris
 
He's not disputing that RT is inaccurate, only that its different readings led him to investigate further.
 
I just think in the end it's odd, because I found this post by searching Google for "9600 rivatuner" to see what line needed to be added to the RivaTuner config, and that search now lists this topic being posted everywhere. A simple comparison with ExperTool, nTune, or similar alongside GPU-Z would yield the correct results. But now we have some magical 9600 GT cards, all due to a version of RivaTuner that doesn't even support the G94 chipset or its drivers.

It's a shame RivaTuner 2.07 isn't out with updated plugins, I guess.
Chris
 
Whether it's reading correctly or not isn't the issue. W1zzard proved that upping the PCIe bus overclocked the card by using benchmarks, not by using anything that reads the card's clocks.
 
Dugg! :D

Ah, NVidia can give Desperate Housewives a run for their money.....in the shady business that is.
 
Whether it's reading correctly or not isn't the issue. W1zzard proved that upping the PCIe bus overclocked the card by using benchmarks, not by using anything that reads the card's clocks.

How do you get benchmarks without reading the card's clocks?

I'm aware of this article, http://www.nbsgaming.com/PCIEBus.html

If overclocking the PCIe bus has really increased the core clock, with the stats not coming from RivaTuner, then the 9600 cards are among the first able to show this, from what I've seen.

Chris
 
That's his point exactly. Do you have a 9600? If so, run 3DMark06 at 100 MHz PCIe, then set your PCIe to something like 105 MHz, and run it again.
 
Hello,

I still don't see where the info is accurate. Looking back at some of the older cards which already had the 27 MHz chips on them...

Almost all GT/GTX cards used to set core frequencies in 27 MHz steps, and the frequency of the geometry unit was always higher than that of the other units by 40 MHz. But this 27 MHz step was removed in the G71: in the 7900 series, all three frequencies can be changed in 1-2 MHz steps. The difference between the geometry unit frequency and the frequencies of the other units in the 7900 GTX has grown to 50 MHz. In the 7900 GT this difference is 20 MHz, and there is no 27 MHz step either.
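To make those 27 MHz steps concrete, here is a quick toy example (my own illustration, not taken from that review): if a clock can only be an integer multiple of the 27 MHz reference, a requested frequency gets snapped to the nearest step.

Code:
# Illustrative only: clocks quantized to multiples of a 27 MHz reference.
CRYSTAL_MHZ = 27

def nearest_27mhz_step(requested_mhz):
    """Snap a requested clock to the nearest 27 MHz multiple."""
    return round(requested_mhz / CRYSTAL_MHZ) * CRYSTAL_MHZ

for req in (400, 425, 450, 486, 500):
    print(f"requested {req} MHz -> actual {nearest_27mhz_step(req)} MHz")
# prints 405, 432, 459, 486 and 513 respectively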

So it would seem, from opening up my 9600, that NVIDIA has brought something back for some reason. But I don't see how that enables a change in core settings through the PCIe bus, unless something is different in how these 2.0 cards are backwards compatible with x16. Since I am using x16, an OC of my PCIe bus may show some differences, but I would assume it would change the data rate, not overclock the card. I just can't see how it would be possible, although it very well could be.

Chris
 
Just run 3DMark06 at the settings I said above. You should see a roughly 5% increase in performance. 5 MHz on the PCIe bus makes no such difference on any other card.
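To put numbers on that expectation (assuming, as the article argues, that the core clock simply tracks the PCI-E reference clock; the 650 MHz figure is just the factory-OC clock mentioned earlier in this thread):

Code:
# Back-of-the-envelope check of the ~5% claim.
core_at_100 = 650                      # factory-OC core clock quoted above
core_at_105 = core_at_100 * 105 / 100  # core if it tracks the PCI-E clock
gain = (core_at_105 / core_at_100 - 1) * 100
print(f"expected core clock: ~{core_at_105:.0f} MHz ({gain:.0f}% higher)")
# A GPU-limited 3DMark06 run should gain at most about that 5%.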
 
Hello,

I think I've found the details.

http://www.digit-life.com/articles2/video/g70-2.html

Look halfway down that review until you see the NVIDIA control panel with the OC configuration. The text below that talks about how the reviewer and the author of RivaTuner figured out exactly why this was happening.

Chris

Edit:

Just search that page at the link above for "A story about the triple core frequencies" and you'll find the start of the info.
 
You're missing the point entirely. The readings from programs don't matter here. The fact of the matter is, the card gets faster when you increase the PCIe frequency. Instead of trying to explain the readings, just do as I suggested: run benchmarks at various PCIe frequencies and see for yourself.
 
The weird RivaTuner output:
http://i30.tinypic.com/2dqjdd4.jpg

GPU-Z output for the stock factory OC:
http://i26.tinypic.com/2e4btbo.jpg

Re-applied OC settings with ExperTool, shown alongside GPU-Z:
http://i30.tinypic.com/33capgz.jpg

This is why I honestly think RivaTuner is inaccurate more so than anything.

You're mistaken. All the tools you've mentioned, including nTune, GPU-Z and ExperTool, show just the target clocks, which you "ask" to set. So you'll always see "correct" clocks there regardless of the real PLL state, thermal throttling conditions, etc.
The real clocks generated by the hardware can be, and normally are, different from the target ones. And there are only two tools that can monitor the real PLL clocks: RivaTuner and Everest. The rest will give you target clocks only.
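As a purely illustrative sketch of that target-versus-real distinction (the multiplier/divider ranges below are made up, not NVIDIA's actual PLL coefficients): a PLL can only generate frequencies of the form reference × N / M, so the hardware ends up at the achievable frequency closest to the target you asked for.

Code:
# Illustrative only: the "real" clock is the closest frequency the PLL can
# actually synthesize from its reference; the ranges here are invented.
REF_MHZ = 27.0   # reference crystal

def real_pll_clock(target_mhz):
    """Closest achievable frequency for made-up N/M ranges."""
    best = None
    for m in range(1, 9):          # hypothetical divider range
        for n in range(1, 256):    # hypothetical multiplier range
            f = REF_MHZ * n / m
            if best is None or abs(f - target_mhz) < abs(best - target_mhz):
                best = f
    return best

target = 650.0
print(f"target {target} MHz -> real PLL clock ~{real_pll_clock(target):.1f} MHz")
# Tools that report target clocks would still show 650; a tool reading the
# PLL itself would show the slightly different real value.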
 