
NVIDIA's shady trick to boost the GeForce 9600GT

W1zzard

Administrator
Staff member
Joined
May 14, 2004
Messages
27,049 (3.71/day)
Processor Ryzen 7 5700X
Memory 48 GB
Video Card(s) RTX 4080
Storage 2x HDD RAID 1, 3x M.2 NVMe
Display(s) 30" 2560x1600 + 19" 1280x1024
Software Windows 10 64-bit
the crystal affects only memory clock .. better to leave it alone and do normal overclocking
 

panchoman

Sold my stars!
Joined
Jul 16, 2007
Messages
9,595 (1.57/day)
Processor Amd Athlon X2 4600+ Windsor(90nm) EE(65W) @2.9-3.0 @1.45
Motherboard Biostar Tforce [Nvidia] 550
Cooling Thermaltake Blue Orb-- bunch of other fans here and there....
Memory 2 gigs (2x1gb) of patriot ddr2 800 @ 4-4-4-12-2t
Video Card(s) Sapphire X1950pro Pci-E x16 @stock@stock on stock
Storage Seagate 7200.11 250gb Drive, WD raptors (30/40) in Raid 0
Display(s) ANCIENT 15" sony lcd, bought it when it was like 500 bucks
Case Apevia X-plorer blue/black
Audio Device(s) Onboard- Why get an sound card when you can hum??
Power Supply Antec NeoHe 550-manufactured by seasonic -replacement to the discontinued smart power series
Software Windows XP pro SP2 -- vista is still crap
very nice investigation man!
 
Joined
Nov 12, 2006
Messages
2,996 (0.47/day)
System Name COLOSSUS-MK4
Processor E8400 @4.4 GHz - FSB @550 MHZ
Motherboard Asus P5K Premium (Black Pearl)
Cooling Xigmatek HDT-S1283
Memory 2x1GB Geil BlckDrgn 800 @1158 5-5-5-18
Video Card(s) 8800GT 512MB @740/1782/2080
Storage Hitachi T7K250 250GB & 7200.10 Seagate 250GB
Display(s) Gateway FPD1975W 19" Widescreen
Case Antec 1200
Audio Device(s) Xi-FI Xtreme Audio
Power Supply CoolerMaster IGreen 500W
Software XP Home SP3
Benchmark Scores SuperPi: 10.563 Sciencemark: 2563.14
Very interesting - thanks W1z. I too was puzzled by your earlier findings so it's pleasing to see that you found the reason behind it. I certainly am impressed by your deduction skills!
 

iop3u2

New Member
Joined
Feb 29, 2008
Messages
1 (0.00/day)
Can someone test whether LinkBoost is enabled when using a 9600GT? Just plug it into a 680i chipset and check if the PCI-E clock is above 100MHz. If it is, it would mean that pretty much every 9600GT SLI benchmark on the net was actually run with the cards 0-25% overclocked.
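The concern above can be put into numbers with a small sketch. This is a hypothetical illustration assuming, per the article, that the 9600GT's core clock scales linearly with the PCI-E reference clock; the function name and the 25% LinkBoost ceiling are this post's assumptions, not NVIDIA specifications.

```python
def effective_core_clock(nominal_core_mhz: float, pcie_clock_mhz: float,
                         pcie_nominal_mhz: float = 100.0) -> float:
    """Core clock if it tracks the PCI-E reference clock linearly (assumed)."""
    return nominal_core_mhz * (pcie_clock_mhz / pcie_nominal_mhz)

# A stock 650 MHz 9600GT on a board running the bus at 125 MHz (LinkBoost's
# upper end, per this post) would silently run the core 25% faster:
print(effective_core_clock(650, 100))  # 650.0
print(effective_core_clock(650, 125))  # 812.5
```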
 

OnBoard

New Member
Joined
Sep 16, 2006
Messages
3,033 (0.47/day)
Location
Finland
Processor Core i5-750 @ 3.6GHz 1.136V 24/7
Motherboard Gigabyte P55A-UD3, SATA 6Gbit/s & USB3.0 baby!
Cooling Alpenföhn Brocken HeatpipeDirectTouch
Memory Geil Ultra Series 4GB 2133MHz DDR3 @ 1440MHz 7-7-7-24
Video Card(s) Gigabyte GTX 460 1GB OC (mostly stock speeds)
Storage OS: Samsung F3 500GB Games: Samsung F1 640GB
Display(s) new! Samsung P2350 23" FullHD 2ms / Mirai DTL-632E500 32" LCD
Case new! Xigmatek Midgard/Utgard side window with red cathodes, 1x140mm & 3x120mm fans
Audio Device(s) new! ASUS Xonar DG & JVC HA-RX700 headphones
Power Supply Cougar CM 700W Modular
Software Windows 7 Home Premium x64
Benchmark Scores Logitech UltraX Premium & G5 laser v2 + Ulti-mat Breathe X2 for fragging
So that's how they did it!
I was going crazy on another thread about why the 9600GT is so close to the 8800GT even though it has half the shaders. It just didn't add up.
Well, now it does!

Thanks W1zzard for clearing that up :)

Yeah, I read your posts and wondered about it myself too; I wasn't expecting it to be so close to the 8800GT. It being an OC card that is further overclocked adds up nicely to the performance :)

btw, does the 9600GT have more core voltage than the 8800GT, if it can run so high on stock voltage?
 

JacKz5o

New Member
Joined
Jan 22, 2007
Messages
477 (0.08/day)
Location
New Jersey
Processor Intel Q6600 @ 3.2GHz
Motherboard XFX nForce 780i
Cooling Ultra 120 eXtreme (lapped)
Memory 4GB G.Skill PC2-8000 PQ
Video Card(s) eVGA GeForce 9800GTX
Storage 640GB + 250GB + 2x500GB (external)
Display(s) Samsung 216BW 21.6"
Case CoolerMaster Cosmos S
Audio Device(s) Creative X-Fi XtremeMusic
Power Supply PC Power & Cooling 750W Silencer
Software Windows Vista Ultimate 64-bit
I wonder why NVIDIA hid the 27MHz crystal instead of marketing it as a.. "feature" like some other companies would do? Seems a bit shady.
 

cbunting

New Member
Joined
Mar 1, 2008
Messages
26 (0.00/day)
Hello,

Nvidia's nTune and every other software program I've seen shows the correct value. Only Riva Tuner, which doesn't support the 9600 GT even though it works, shows two different values for the core clock speed. Isn't this actually a bug, in a sense, if only one program is giving false readings?

I would think that nTune would show the same thing Riva Tuner does, but no other software does, just Riva Tuner, so I don't think it's accurate; otherwise the core clock changes would be listed in my 3DMark05/06 scores, GPU-Z, etc.

I just think Riva Tuner is wrong. Maybe RT 2.07, with support for the 9600, will give correct readings.

I could be wrong, but I don't think RT is accurate.
Chris
 

PuMA

New Member
Joined
Jul 15, 2006
Messages
724 (0.11/day)
Location
Finland
Processor lapped C2D e6850, 3600mhz
Motherboard ASUS P5N32-E SLI (680i SLI)
Cooling NOCTUA NH-U9, 3 2x120mm 1x180mm case fans
Memory 6GB A-DATA vitesta DDR2 900mhz (2x1gb + 2x2gb)
Video Card(s) LEASTEK GTX 260 (701/1408/1105)
Storage SAMSUNG F1 640GB SATAII/SEAGATE 250gb SATAII
Display(s) SAMSUNG 226BW 1680x1050 2ms
Case ANTEC three hundred
Audio Device(s) ASUS SupremeFX, SONY surround AMP
Power Supply XION 630W
Software VISTA ULTIMATE x64
Benchmark Scores 3dmark06:15126 (701/1408/1105) vantage: 11698
Nice read Wiz

Can the crystal be changed on the G92 (8800GTS), and would it give a big boost in performance?

I think it's not about the 9600 performing well without the crystal, it's about how you OC the card. With a 9600GT and an NVIDIA chipset you just enable LinkBoost and your card is overclocked. On the G92 and other cards you have to dig up OC programs to OC the card. Just two different methods.
 

W1zzard

Hello,

Nvidia's nTune and every other software program I've seen shows the correct value.

you are telling me that if you raise pcie clock the new core clock is reflected by other programs?
 

cbunting

New Member
Joined
Mar 1, 2008
Messages
26 (0.00/day)
Hello,

Well, if the crystal thing is really legit, I'd like to know where the memory clock crystal is as well.

http://i29.tinypic.com/1zzjl3a.jpg

I get an inaccurate reading on both the core clock AND the memory clock. Look at my OC settings in the pic; both are incorrect. The drivers have changed for the 9600, so I still don't see how the plugins are reading the drivers correctly, since Riva isn't updated.

I still could be wrong, but the memory clock isn't right either.
Chris
 

cbunting

you are telling me that if you raise pcie clock the new core clock is reflected by other programs?

No, I was saying that all other software shows my factory default clocks. The non-updated Riva Tuner is the only software that shows clock speeds different from what I have set.

If I set my card to a 650MHz core clock, nTune, ExpertTool, 3DMark06 and Everest all show a 650MHz core clock. Riva Tuner is the only one showing a different value, and it doesn't support my drivers, so I just don't see how it's an accurate reading, is all.

Chris
 

W1zzard

No, I was saying that all other software shows my factory default clocks. The non-updated Riva Tuner is the only software that shows clock speeds different from what I have set.

If I set my card to a 650MHz core clock, nTune, ExpertTool, 3DMark06 and Everest all show a 650MHz core clock. Riva Tuner is the only one showing a different value, and it doesn't support my drivers, so I just don't see how it's an accurate reading, is all.

Chris

yes you are correct. rivatuner sensor readings only pointed out that something strange is going on here that required more investigation. and as i mentioned in the article rt's sensor readings are wrong. its just something that makes you ask "whats going on here?" and then you dig and find your answer.
 

cbunting

Hello,

I'm digging, but so far I see that almost all reviewers use Riva Tuner. RT does not have support for G94 cards, and it doesn't support the 174.20 drivers either. So I think the readings are wrong because Riva doesn't know how to read the drivers.

The weird Riva Tuner output:
http://i30.tinypic.com/2dqjdd4.jpg

GPU-Z output for the stock factory OC:
http://i26.tinypic.com/2e4btbo.jpg

Re-applied OC settings with ExpertTool, shown with GPU-Z:
http://i30.tinypic.com/33capgz.jpg

This is why I honestly think Riva Tuner is inaccurate, more so than anything.
Chris
 

Wile E

Power User
Joined
Oct 1, 2006
Messages
24,318 (3.79/day)
System Name The ClusterF**k
Processor 980X @ 4Ghz
Motherboard Gigabyte GA-EX58-UD5 BIOS F12
Cooling MCR-320, DDC-1 pump w/Bitspower res top (1/2" fittings), Koolance CPU-360
Memory 3x2GB Mushkin Redlines 1600Mhz 6-8-6-24 1T
Video Card(s) Evga GTX 580
Storage Corsair Neutron GTX 240GB, 2xSeagate 320GB RAID0; 2xSeagate 3TB; 2xSamsung 2TB; Samsung 1.5TB
Display(s) HP LP2475w 24" 1920x1200 IPS
Case Technofront Bench Station
Audio Device(s) Auzentech X-Fi Forte into Onkyo SR606 and Polk TSi200's + RM6750
Power Supply ENERMAX Galaxy EVO EGX1250EWT 1250W
Software Win7 Ultimate N x64, OSX 10.8.4
Hello,

I'm digging, but so far I see that almost all reviewers use Riva Tuner. RT does not have support for G94 cards, and it doesn't support the 174.20 drivers either. So I think the readings are wrong because Riva doesn't know how to read the drivers.

The weird Riva Tuner output:
http://i30.tinypic.com/2dqjdd4.jpg

GPU-Z output for the stock factory OC:
http://i26.tinypic.com/2e4btbo.jpg

Re-applied OC settings with ExpertTool, shown with GPU-Z:
http://i30.tinypic.com/33capgz.jpg

This is why I honestly think Riva Tuner is inaccurate, more so than anything.
Chris
He's not disputing that RT is inaccurate, only that its differing readings led him to investigate further.
 

cbunting

I just think in the end it's odd, because I found this post by searching Google for "9600 rivatuner" to see what line needed to be added to the RivaTuner config, and that search now lists this topic being posted everywhere. A simple comparison with ExpertTool, nTune or similar alongside GPU-Z would yield the correct results. But now we have some magical 9600GT cards, all due to a version of Riva Tuner that doesn't even support the G94 chipset or its drivers.

It's a shame Riva Tuner 2.07 isn't out with updated plugins, I guess.
Chris
 

Wile E

I just think in the end it's odd, because I found this post by searching Google for "9600 rivatuner" to see what line needed to be added to the RivaTuner config, and that search now lists this topic being posted everywhere. A simple comparison with ExpertTool, nTune or similar alongside GPU-Z would yield the correct results. But now we have some magical 9600GT cards, all due to a version of Riva Tuner that doesn't even support the G94 chipset or its drivers.

It's a shame Riva Tuner 2.07 isn't out with updated plugins, I guess.
Chris
Whether it's reading correctly or not isn't the issue. W1zzard proved that upping the PCIe bus overclocked the card by using benchmarks, not by using anything that reads the card's clocks.
 

btarunr

Editor & Senior Moderator
Staff member
Joined
Oct 9, 2007
Messages
46,390 (7.68/day)
Location
Hyderabad, India
System Name RBMK-1000
Processor AMD Ryzen 7 5700G
Motherboard ASUS ROG Strix B450-E Gaming
Cooling DeepCool Gammax L240 V2
Memory 2x 8GB G.Skill Sniper X
Video Card(s) Palit GeForce RTX 2080 SUPER GameRock
Storage Western Digital Black NVMe 512GB
Display(s) BenQ 1440p 60 Hz 27-inch
Case Corsair Carbide 100R
Audio Device(s) ASUS SupremeFX S1220A
Power Supply Cooler Master MWE Gold 650W
Mouse ASUS ROG Strix Impact
Keyboard Gamdias Hermes E2
Software Windows 11 Pro
Dugg! :D

Ah, NVIDIA can give Desperate Housewives a run for their money... in the shady business, that is.
 

cbunting

Whether it's reading correctly or not isn't the issue. W1zzard proved that upping the PCIe bus overclocked the card by using benchmarks, not by using anything that reads the card's clocks.

How do you get benchmarks "without" reading the card's clocks?

I'm aware of this article: http://www.nbsgaming.com/PCIEBus.html

If overclocking the PCIe bus has really increased the core clock, with the stats not coming from Riva Tuner, then the 9600 cards are among the first able to show this, from what I've seen.

Chris
 

Wile E

How do you get benchmarks "without" reading the card's clocks?

I'm aware of this article: http://www.nbsgaming.com/PCIEBus.html

If overclocking the PCIe bus has really increased the core clock, with the stats not coming from Riva Tuner, then the 9600 cards are among the first able to show this, from what I've seen.

Chris
That's his point exactly. Do you have a 9600? If so, run 3DMark06 at 100MHz PCIe, then set your PCIe to something like 105MHz and run it again.
 

cbunting

Hello,

I still don't see where the info is accurate. Looking back at some of the older cards, which already had the 27MHz chips on them...

Almost all GT/GTX cards used to set core frequencies in 27 MHz steps; the frequency of the geometry unit was always higher than those of the other units by 40 MHz. But this 27 MHz step was removed in the G71. In the 7900 series, all three frequencies can be changed in 1-2 MHz steps. The difference between the geometry unit frequency and the frequencies of the other units in the 7900 GTX has grown to 50 MHz. This difference in the 7900 GT is 20 MHz; there is no 27 MHz step either.

So it would seem, from opening up my 9600, that Nvidia has brought something back for some reason. But I don't see how that affects a change in core settings through the PCIe bus, unless something is different in how these 2.0 cards are backwards compatible with x16. Since I am using x16, an OC of my PCIe bus may show some differences, but I would assume it would change the data rate, not overclock the card. I just can't see how it would be possible, although it very well could be.

Chris
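The 27 MHz stepping described in the quoted passage can be illustrated with a short sketch: a requested clock snaps to the nearest multiple of the crystal frequency. This is only an illustration of the quantization idea, not the actual PLL logic of any card.

```python
CRYSTAL_MHZ = 27.0

def quantize_clock(requested_mhz: float, step_mhz: float = CRYSTAL_MHZ) -> float:
    """Snap a requested clock to the nearest multiple of the step (illustrative)."""
    return round(requested_mhz / step_mhz) * step_mhz

print(quantize_clock(650))  # 648.0 -- 24 * 27 MHz is the nearest achievable step
print(quantize_clock(430))  # 432.0 -- 16 * 27 MHz
```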
 

Wile E

Hello,

I still don't see where the info is accurate. Looking back at some of the older cards, which already had the 27MHz chips on them...



So it would seem, from opening up my 9600, that Nvidia has brought something back for some reason. But I don't see how that affects a change in core settings through the PCIe bus, unless something is different in how these 2.0 cards are backwards compatible with x16. Since I am using x16, an OC of my PCIe bus may show some differences, but I would assume it would change the data rate, not overclock the card. I just can't see how it would be possible, although it very well could be.

Chris
Just run 3DMark06 at the settings I said above. You should see a roughly 5% increase in performance. 5MHz on the PCIe bus makes no such difference on any other card.
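The arithmetic behind this test is simple: if the core clock tracks the PCIe bus, the score of a GPU-bound benchmark run should rise roughly in proportion. A back-of-the-envelope sketch, assuming linear scaling with core clock (a simplification):

```python
def expected_speedup(pcie_before_mhz: float, pcie_after_mhz: float) -> float:
    """Fractional performance change if the core clock scales with the bus."""
    return pcie_after_mhz / pcie_before_mhz - 1.0

print(f"{expected_speedup(100, 105):.1%}")  # 5.0%
```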
 

cbunting

Hello,

I think I've found the details.

http://www.digit-life.com/articles2/video/g70-2.html

Look halfway down that review until you see the NVIDIA control panel with the OC configuration. The text below it talks about how the reviewer and the author of Riva Tuner figured out exactly why this was happening.

Chris

Edit:

Just search that page for "A story about the triple core frequencies" and you'll find the start of the info.
 

Wile E

You're missing the point entirely. The readings from programs don't matter here. The fact of the matter is, the card gets faster when you increase the PCIe Frequency. Instead of trying to explain the readings, just do as I suggested, and run benchmarks at various PCIe frequencies, and see for yourself.
 

AlexUnwinder

RivaTuner Creator
Joined
Mar 1, 2008
Messages
26 (0.00/day)
The weird Riva Tuner output:
http://i30.tinypic.com/2dqjdd4.jpg

GPU-Z output for the stock factory OC:
http://i26.tinypic.com/2e4btbo.jpg

Re-applied OC settings with ExpertTool, shown with GPU-Z:
http://i30.tinypic.com/33capgz.jpg

This is why I honestly think Riva Tuner is inaccurate, more so than anything.

You're mistaken. All the tools you've mentioned, including nTune, GPU-Z and ExpertTool, show just the target clocks, which you "ask" to be set. So you'll always see "correct" clocks there regardless of the real PLL state, thermal throttling conditions, etc.
The real clocks generated by the hardware must be, and normally are, different from the target ones. And there are only two tools that allow monitoring the real PLL clocks: RivaTuner and Everest. The rest will give you target clocks only.
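AlexUnwinder's distinction between target and real clocks can be sketched as follows: an integer-divider PLL can only generate discrete frequencies, so the clock actually produced is the closest achievable one, not the exact value you asked for. The 27 MHz reference and the divider ranges here are made up for illustration and do not describe any real GPU PLL.

```python
def real_pll_clock(target_mhz: float, ref_mhz: float = 27.0,
                   n_range=range(1, 256), m_range=range(1, 16)) -> float:
    """Closest frequency an integer-divider PLL (ref * N / M) can generate."""
    return min((ref_mhz * n / m for n in n_range for m in m_range),
               key=lambda f: abs(f - target_mhz))

# Ask for 650 MHz; the PLL lands on the nearest frequency it can actually hit.
print(real_pll_clock(650.0))  # 650.7 (= 27 * 241 / 10), not the 650.0 "target"
```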
 