
NVIDIA's shady trick to boost the GeForce 9600GT

Joined
Nov 21, 2004
Messages
64 (0.01/day)
System Name Dothan
Processor Intel Pentium M 770 @ 160*16
Motherboard MSI Speedster FA-4
Cooling Zalman CNPS9500 LED
Memory Adata DDR2 @ 240
Video Card(s) Sapphire HD3870
Storage Maxtor DiamondMax 10 300GB | Hitachi 160GB | Seagate 250GB
Display(s) Samsung 920T + LG Flatron 22"
Case TT Tsunami Xaser Black
Audio Device(s) Creative Audigy 2 ZS
Power Supply Hiper type-R 480W
Software ATITool & SysTool
Hello,

I've talked to a friend who stopped by. He said the main reason people chasing high 3DMark06 scores overclock the PCIe bus is that it does raise the core clock on the cards.

What I find confusing about the article, as well as all of the replies to my own, is that there are tons of articles dating back to 2006 stating that nForce boards have LinkBoost and/or allow changes to the PCIe frequency. If these features have been available for over two years, and the G80+ chipsets have also supported this for the past year or more, why would it only now be noticed with a 9600 GT when reviews and tech sites say it's been available for a year, maybe two?

I don't know when NVIDIA first offered the nForce 590 board or the G80 chipset, but there seems to be enough info stating that this shady feature has been around for quite some time.

Other sites were first to mention it in 2006, but some people are only now finding out while testing all of these new cards, since there is so much hype around them.

Chris

Do take a look at the chart on page 2, because it seems that you haven't: http://www.techpowerup.com/reviews/NVIDIA/Shady_9600_GT/2.html

There you'll find the theoretical benchmark comparing PCIe bus speeds on the 9600 and the older 8800; you will see that the shady feature isn't present on the latter.
 

cbunting

New Member
Joined
Mar 1, 2008
Messages
26 (0.00/day)
How many people reading this thread overclocked their PCIe bus speed and noticed changes in their 3DMark06 scores using a 9600 card?

I didn't, so let's see if anyone else did.

Chris
 

TmkGod

New Member
Joined
Feb 24, 2008
Messages
1 (0.00/day)
Location
Israel
"Too good to be true" is the right phrase

It seems like they (not accusing) were trying to sell a low/mid-priced card that acts like a higher-end card. That's probably the reason for the initial delay, when they were having "trouble" with the vcore voltage.

Higher-than-normal voltage + higher frequency than specified = lots of people buying OC'd products without knowing they're OC'd.

P.S.: I'm an owner of a G92 8800GT (65mm fan), and I now have a clue why the cooling is so faulty and lame: to cut OC potential and encourage those who didn't buy the 8800GT to buy a lower-priced "equally good" 9600GT.


Don't flame me because I'm sarcastic, I'm nice most of the time :)
 
Joined
Dec 18, 2005
Messages
8,253 (1.23/day)
System Name money pit..
Processor Intel 9900K 4.8 at 1.152 core voltage minus 0.120 offset
Motherboard Asus rog Strix Z370-F Gaming
Cooling Dark Rock TF air cooler.. Stock vga air coolers with case side fans to help cooling..
Memory 32 gb corsair vengeance 3200
Video Card(s) Palit Gaming Pro OC 2080TI
Storage 150 nvme boot drive partition.. 1T Sandisk sata.. 1T Transend sata.. 1T 970 evo nvme m 2..
Display(s) 27" Asus PG279Q ROG Swift 165Hrz Nvidia G-Sync, IPS.. 2560x1440..
Case Gigabyte mid-tower.. cheap and nothing special..
Audio Device(s) onboard sounds with stereo amp..
Power Supply EVGA 850 watt..
Mouse Logitech G700s
Keyboard Logitech K270
Software Win 10 pro..
Benchmark Scores Firestike 29500.. timepsy 14000..
your 8800gt is lame cos it was a rushed-out (extra cheap) torpedo job to beat ati..

in other words not a proper product release.. the 9600 is the real one..

another nvidia shady trick

trog
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,801 (3.87/day)
Location
Worcestershire, UK
Processor Rocket Lake Core i5 11600K @ 5 Ghz with PL tweaks
Motherboard MSI MAG Z490 TOMAHAWK
Cooling Thermalright Peerless Assassin 120SE + 4 Phanteks 140mm case fans
Memory 32GB (4 x 8GB SR) Patriot Viper Steel 4133Mhz DDR4 @ 3600Mhz CL14@1.45v Gear 1
Video Card(s) Asus Dual RTX 4070 OC
Storage WD Blue SN550 1TB M.2 NVME//Crucial MX500 500GB SSD (OS)
Display(s) AOC Q2781PQ 27 inch Ultra Slim 2560 x 1440 IPS
Case Phanteks Enthoo Pro M Windowed - Gunmetal
Audio Device(s) Onboard Realtek ALC1200/SPDIF to Sony AVR @ 5.1
Power Supply Seasonic CORE GM650w Gold Semi modular
Mouse Coolermaster Storm Octane wired
Keyboard Element Gaming Carbon Mk2 Tournament Mech
Software Win 10 Home x64
your 8800gt is lame cos it was a rushed-out (extra cheap) torpedo job to beat ati..

in other words not a proper product release.. the 9600 is the real one..

another nvidia shady trick

trog

I'll take the 8800....it's faster :D if it takes 'em longer to bring out something slower....I'll take the rushed job anytime :toast:
 

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
27,680 (4.27/day)
Location
Houston
System Name All the cores
Processor 2990WX
Motherboard Asrock X399M
Cooling CPU-XSPC RayStorm Neo, 2x240mm+360mm, D5PWM+140mL, GPU-2x360mm, 2xbyski, D4+D5+100mL
Memory 4x16GB G.Skill 3600
Video Card(s) (2) EVGA SC BLACK 1080Ti's
Storage 2x Samsung SM951 512GB, Samsung PM961 512GB
Display(s) Dell UP2414Q 3840X2160@60hz
Case Caselabs Mercury S5+pedestal
Audio Device(s) Fischer HA-02->Fischer FA-002W High edition/FA-003/Jubilate/FA-011 depending on my mood
Power Supply Seasonic Prime 1200w
Mouse Thermaltake Theron, Steam controller
Keyboard Keychron K8
Software W10P
I'll take the 8800....it's faster :D if it takes 'em longer to bring out something slower....I'll take the rushed job anytime :toast:

what is it faster in?
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,801 (3.87/day)
Location
Worcestershire, UK

cdawall

where the hell are my stars
well i guess that excludes benchmarks?

http://hwbot.org/hardware.compare.do?type=gpu&id=1278_1&id=1233_1&id=1291_1

cause other than 3dm06 the 9600GT takes the cake in everything

Obviously we are reading different reviews, I have just read two and they certainly didn't say that!

Take out the Palit and EVGA because they are overclocked and the 8800GT is reference, so just look at the Asus figures for the 9600GT, then go through the gaming benches. It does well, I will give it that, but in this review.....not quite well enough :)

http://www.hothardware.com/printarticle.aspx?articleid=1112
 

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
27,680 (4.27/day)
Location
Houston
Obviously we are reading different reviews, I have just read two and they certainly didn't say that!

:roll: I wasn't reading reviews, I was looking at the average OCs on the cards and the scores achieved with them. Apparently the reviews didn't cover that, those crazy reviewers.
 

Tatty_Two

Gone Fishing
Joined
Jan 18, 2006
Messages
25,801 (3.87/day)
Location
Worcestershire, UK
:roll: I wasn't reading reviews, I was looking at the average OCs on the cards and the scores achieved with them. Apparently the reviews didn't cover that, those crazy reviewers.

Ahhhhh right, I was talking gaming performance at stock.....I made that about 11-3 in the 8800's favour with one tie, but TBH I like the 9600GT very much; there is so little in it in most things gaming, and with the 9600 being cheaper.....well.

Ohhhhh and synthetic benching?? Pfffttt :)
 

cdawall

where the hell are my stars
Joined
Jul 23, 2006
Messages
27,680 (4.27/day)
Location
Houston
Ahhhhh right, I was talking gaming performance at stock.....I made that about 11-3 in the 8800's favour with one tie, but TBH I like the 9600GT very much; there is so little in it in most things gaming, and with the 9600 being cheaper.....well.

Ohhhhh and synthetic benching?? Pfffttt :)

i have the same view on synths :D

but we both know our card is going to end up OC'd, so why not look at the results of that and go from there ;)
 

Richteralan

New Member
Joined
Jul 7, 2006
Messages
5 (0.00/day)
1. I don't think this is "cheating". It couldn't get more confusing for Joe Six-Pack: "We are no longer using a 27-MHz oscillator to generate our clock rate; we now use 1/4 of the PCIe frequency as our base frequency. So any change to the PCIe frequency changes our GPU frequency, too!"
2. The same thing happens on my Geforce 8800M GTS.
3. I guess techpowerup needs more clicks for their website.:rolleyes:
 

cbunting

New Member
Joined
Mar 1, 2008
Messages
26 (0.00/day)
Hello,

I think what has been seen with the PCIe bus frequency changes is not the same as what is being referred to in the article. I've also seen the mention where Alex, AKA Unwinder, will not change anything to support the 27Mhz chip.

However, one of the 9600 GT's features is PureVideo.

A Powerful Entertainment Hub
Experience the GPU’s power while enjoying HD movies or having a premium 3D user experience with Windows Vista and Windows Media Center. PureVideo® HD technology provides lifelike pictures and vibrant color while CPU-offload capabilities enable you to manage your photos and videos with ease.

It would seem to me that it was the reviewer who first attributed the PCIe frequency changes to the 27Mhz chip. However, consider this:

A method for managing an asynchronous data buffer to provide an output data stream comprises the steps of receiving asynchronous data at a nominal data rate and writing at least a portion of the received asynchronous data into the buffer. A fullness level of the buffer is monitored to determine whether the fullness falls within a first, nominal range, or into second or third higher ranges. For example, the buffer may have a capacity of 1024 bytes, the first range may be from 0 to 648 bytes, the second range may be from 648 to 836 bytes, and the third range may be from 836 to 1024 bytes.

A target data output rate is then determined, which may be 19,200 bits per second, or 19,200/2^n bits per second, where n is a non-negative integer (e.g., n=0, 1, 2, . . . ). A fixed reference clock signal having an associated rate, for example, 27 MHz, is also provided. 27 MHz is selected as an example since it is used in the MPEG system. However, virtually any system clock frequency may be used. A clocking signal is provided for outputting the asynchronous data from the buffer at a rate which corresponds to a ratio of the associated rate and the divisor. A divisor is selected to provide the clocking signal at a first rate to minimize a difference between the target data output rate and the first rate when the buffer fullness falls within the first range. Optionally, a direct digital synthesis (DDS) circuit may be used to provide the clocking signal at the desired level by providing a fractional divisor.

Full page where text came from. http://www.patentstorm.us/patents/5949795-description.html
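For what it's worth, the divisor selection the patent describes can be sketched roughly like this. The 27 MHz reference, 19,200 bps target, and fullness ranges are the patent's own example figures; the doubled drain rates in the higher ranges and the function name are our guesses at the intent, not something the patent text above spells out:

```python
# Rough sketch of the patent's buffer-driven divisor selection.
# Numbers (27 MHz reference, 19,200 bps target, fullness ranges for a
# 1024-byte buffer) come from the patent's example; the higher-range
# drain rates are our assumption about the intent.

REF_HZ = 27_000_000      # fixed reference clock, as in MPEG systems
TARGET_BPS = 19_200      # nominal target output rate

def output_rate(fullness: int) -> float:
    """Pick an integer divisor of the reference clock so the buffer
    drains near the target rate, draining faster as it fills."""
    if fullness < 648:          # first (nominal) range: 0..648 bytes
        target = TARGET_BPS
    elif fullness < 836:        # second range: drain faster
        target = TARGET_BPS * 2
    else:                       # third range: drain fastest
        target = TARGET_BPS * 4
    divisor = round(REF_HZ / target)   # minimize |target - achieved|
    return REF_HZ / divisor            # achievable output rate
```

A DDS circuit with a fractional divisor, as the patent mentions, would land closer to the target than the integer divisor used here.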

The above example is not related to NVIDIA cards, but 27Mhz seems to be pretty standard in applications supporting video encoding/decoding for data output. And as I mentioned above, the 9600 GT offers PureVideo, HD and so on, so I think that chip has more to do with encoding/decoding than with the PCIe changes.

Again, I am not saying that I am right, or that the info is. I am just trying to see exactly what would cause this to happen and what exactly is causing it.
Chris

BTW:

Here is a full list of Nvidia patents. I'm sure all of the chips, designs, or whatever are listed here somewhere.
http://www.patentstorm.us/patents/search-results.html?search=nvidia&imageField2.x=14&imageField2.y=8
 
Joined
Mar 3, 2008
Messages
28 (0.00/day)
System Name Monster
Processor Intel i5 3570K
Motherboard MSI Z77-GD65
Cooling Corsair H100i
Memory G.Skill Ares LP 8GB DDR3
Video Card(s) MSI Radeon HD 7950 O/C
Storage Samsung 840 EVO 250GB, OCZ Vertex 4 128GB, OCZ Vertex 2E 90GB etc.
Display(s) Samsung T220
Case Fractal Design R3
Audio Device(s) Realtek 898 AC
Power Supply Corsair RM 750
Mouse Logitech G9x
Keyboard QPAD MK-50
Software Windows 7 Ultimate, Windows 10 Professional
OMG, cbunting is hilarious.

cbunting: you STILL haven't understood what's going on? seriously? :banghead::p

Let me give it a shot:

Most other monitoring/overclocking tools use the driver's settings to calculate the frequency. Rivatuner does not read from the driver but instead reads directly off the hardware. The fact that there's a discrepancy here is the very key to understanding the article and what's happening.

Rivatuner is using the standard clock of 27Mhz (a 27Mhz crystal does indeed sit on the 9600GT), which doesn't correspond to the new way the 9600GT derives its clock frequency (PCIe bus / 4). This is very interesting for understanding what's going on.

What happens is that the 9600GT partly uses the PCIe bus to set its frequency. This is new. With modern nForce-based boards (LinkBoost), the PCIe bus may be automatically overclocked, overclocking the 9600GT at the same time without you or any other 9600GT owner knowing it.

If you don't have LinkBoost but have overclocked your PCIe bus even a little (5Mhz), your 9600GT is overclocked quite a bit. This goes for all PCIe buses, whether PCIe 1.1 or 2.0, AMD, Intel or nForce; it doesn't matter.

The REALLY shady thing here is that the drivers don't seem to reflect the increase in speed. They can still say the core is running a default "650Mhz" when it's actually running @ 687.5Mhz. This inflates reviewers' benchmarks, creates possible instability and confuses people like you.

So in the end it's very simple.

From www.nordichardware.com:

In other cases, people have had to downclock the card to make it work. Some partners have launched quite heavily overclocked cards, and for natural reasons these were unstable to a higher degree than normal. Downclocking the cards has been the only solution, which is a pity when you've paid for an overclocked card.

This called for further investigation, and the people over at techPowerUp! noticed a discrepancy between the frequency reported by the driver and the clock generator. Most tools will read the frequency from the driver, but the RivaTuner monitor reads directly from the clock generator. However, the monitoring reads the information incorrectly: it multiplies by 27 instead of 25.

The problem is that the only physical crystal on the card is 27MHz, but the driver uses a crystal frequency of 25MHz to calculate the GPU frequency. This doesn't quite add up. To test this in practice they turned to the multitexture fillrate test in 3DMark06 and compared how much better the 9600GT performed when raising the PCIe frequency. Increments of 5MHz clearly showed that performance scaled linearly with it. Similar tests with the 8800GT show no performance improvement whatsoever.

It seems like NVIDIA has designed the GPU so that it takes the PCIe frequency and divides it by four to get the crystal frequency. This means that if you overclock the PCIe bus you also overclock the GPU. But the thing is, the drivers will still report the stock frequency when you overclock the PCIe bus. E.g. if you raise the PCIe bus from 100 to 104MHz the GPU will jump from 650MHz up to 676MHz, but the driver, and all tools reading from the driver (GPU-Z, RivaTuner and so forth), will still report 650MHz.
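The arithmetic in the quote above can be sketched in a few lines. This is a minimal illustration using the figures from the quote (25 MHz assumed driver reference, 650 MHz stock core, PCIe/4 as the real reference), not NVIDIA's actual driver code:

```python
# Sketch of the reported-vs-real 9600 GT core clock, per the quote above.
# The driver assumes a fixed 25 MHz reference; the GPU actually derives
# its reference from the PCIe bus frequency divided by four.

DRIVER_REF_MHZ = 25.0        # reference clock the driver assumes
STOCK_CORE_MHZ = 650.0       # advertised stock core clock
PLL_MULT = STOCK_CORE_MHZ / DRIVER_REF_MHZ   # 26x PLL multiplier

def real_core_clock(pcie_mhz: float) -> float:
    """Actual core clock: reference is the PCIe frequency / 4."""
    return (pcie_mhz / 4.0) * PLL_MULT

def reported_core_clock(pcie_mhz: float) -> float:
    """Driver-reported clock: fixed 25 MHz reference, so it never moves."""
    return DRIVER_REF_MHZ * PLL_MULT

print(real_core_clock(100.0))      # 650.0 - matches the report at stock
print(real_core_clock(104.0))      # 676.0 - the jump the quote describes
print(reported_core_clock(104.0))  # 650.0 - what the driver still claims
```

At a stock 100 MHz bus both functions agree; raise the bus to 104 MHz and only the real clock moves, which is exactly the 650 → 676 MHz example in the quote.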

:pimp:
 
Joined
Nov 4, 2005
Messages
11,689 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs and over 10TB spinning
Display(s) 56" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.
Reality-------------------------------------------------------------------------Most of you.


While the idea of the patent is all right, I believe it deals more with the memory timings, as stated in the article, than with the core clock.




We have established the following.

1) Nvidia manipulated a situation to show a supposed superior product when it fails in most cases to reach said potential.

http://www.pcstats.com/articleview.cfm?articleid=2253&page=6
http://www.techpowerup.com/reviews/Zotac/GeForce_9600_GT_Amp_Edition/20.html

When you place the card in a standard board you get substandard performance. My system kicks the card to the curb if you compare 3DMark06 setups, and I have an old system.



However, if the performance boost came with the previously unmentioned twist that you have to overclock your PCIe bus, many people would turn away. Instead, NVIDIA's clever marketing department has twisted it to seem that this is a killer card for a great price, instead of the potential turd it is.

2) On the majority of setups the card will not perform to the expected level due to the lower core clock.

3) PCI-e bus is not saturated when using a 1.1 with a 2.0 card.

http://www.tomshardware.com/2004/11/22/sli_is_coming/


After reading this, sit and smoke a cigar and have a drink. Let the full text sink in, then post something intelligent, like how easy it is to be a backseat driver, or a professional reviewer without all the hassle of doing it for a living, instead of injecting your presupposed ideas onto the internet for people who know better to laugh at. Tell us about your plants and your dog Scruffie. But please refrain from showing your stupidity on the internet; unlike at your mom's house, we will laugh at you.
 

Unwinder

RivaTuner Creator
Joined
Jun 7, 2007
Messages
8 (0.00/day)
The REALLY shady thing here is that the drivers don't seem to reflect the increase in speed. They can still say the core is running a default "650Mhz" when it's actually running @ 687.5Mhz. This inflates reviewers benchmarks, creates possible instability and confuses people like you.

Most likely the NVIDIA driver will never show the real clock speed in case of PCI-E bus overclocking; I've explained why here:

http://forums.guru3d.com/showpost.php?p=2618787&postcount=3

The same will apply to new versions of diagnostic tools reading the PLL clock directly (RivaTuner and Everest). Both will also use a fixed 25MHz reference clock.
 
Joined
Nov 21, 2004
Messages
64 (0.01/day)
...
CRAP
...

You, sir, seem to have no clue what thread you are answering in. I doubt more and more that you get what the four pages by W1zzard try to tell you.

Another explanation is that you are an employee of NVIDIA trying to make W1zzard's article look bad by derailing it from its purpose.

No flame or anything, but I can't figure out why you would otherwise do this.
 
Joined
Mar 1, 2008
Messages
282 (0.05/day)
Location
Antwerp, Belgium
You, sir, seem to have no clue what thread you are answering in. I doubt more and more that you get what the four pages by W1zzard try to tell you.

Another explanation is that you are an employee of NVIDIA trying to make W1zzard's article look bad by derailing it from its purpose.

No flame or anything, but I can't figure out why you would otherwise do this.

Well, it's obvious to me that he doesn't have a clue. Since he's a noob in computer hardware, he doesn't understand everything in the article, and that makes him confused. I don't think he means any harm, but he really should stop spamming this thread.
 

cbunting

New Member
Joined
Mar 1, 2008
Messages
26 (0.00/day)
I apologize for not understanding the article then.

So RivaTuner reads the real clock value; in my case my stock core clock is 650Mhz, but it shows as 729Mhz in RivaTuner, which is supposedly the real clock speed of the card. Then this makes no sense, because ANY clock speed over 650Mhz for the 9600 GT OC VOIDS THE WARRANTY..

So what now? We have a card that sets its core clock speed higher than the factory default. If the 729Mhz core clock shown by RivaTuner causes my card to burn up or malfunction, am I out $239.00 because Nvidia added some new feature or clock?

I'm very sorry but we do not have any fan controller or overclocking utility
as any overclocking of the card beyond the factory settings voids your warranty.

Thank you,
Tim S
BFG Support

So a stock OC card installed in a PC with the PCIe bus OC'd to 125Mhz automatically voids my warranty, correct?

Chris
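As an aside, the 729Mhz reading is at least numerically consistent with the misread described earlier in the thread (RivaTuner scaling by the 27MHz crystal instead of the 25MHz reference the driver uses) applied to a factory-overclocked 675MHz card. The 675MHz stock figure is our assumption about this particular OC model, not something stated above:

```python
# Hypothetical check: does a 675 MHz factory-OC card misread as 729 MHz?
# The 27-vs-25 scaling is the misread described in the thread; 675 MHz
# as the card's true stock clock is our assumption.

DRIVER_REF_MHZ = 25.0   # reference clock the driver divides by
CRYSTAL_MHZ = 27.0      # physical crystal the monitor multiplied by

def rivatuner_misread(actual_mhz: float) -> float:
    """Displayed clock if the tool scales by 27 instead of 25."""
    return actual_mhz / DRIVER_REF_MHZ * CRYSTAL_MHZ

print(rivatuner_misread(675.0))  # 729.0
print(rivatuner_misread(650.0))  # 702.0
```

On this reading the card itself is at its factory clock, and only the displayed number is inflated.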
 

cbunting

New Member
Joined
Mar 1, 2008
Messages
26 (0.00/day)
OMG, cbunting is hilarious.
Rivatuner does not read from the driver but instead reads directly off the hardware. The fact that there's a discrepancy here is the very key to understanding the article and what's happening.

I just caught this part of the reply. And yes, this is where I get confused.

#1. Alex, AKA Unwinder, has said himself that RivaTuner does not support the G94 chipset. Therefore, it is not able to correctly read the frequencies. At least that was my understanding.

I'd like to add that v2.06 "knows" nothing about G94 core has no internal G94 specific codepath. So adding G94 support to 2.06 this way may cause unpredictable results and several thing may function improperly or not work at all. For example, RAM type will be detected improperly, bus width will not be detected at all, core clock can be monitored improperly in hardware monitoring module etc.
The only case when it is safe to add new card support by means of editing GPU database in .cfg file is when you're adding support for new display adapter model based on supported GPU family (e.g. you can safely add 8800GTS 512 support this way, because 2.06 fully supports G92 core). G94 is a bit different story. So please use it at your own risk.

See Alex/Unwinder's reply directly under mine in the full thread.
http://www.evga.com/forums/tm.asp?m=266477

What I have been trying to get at all along is that the software mentioned is NOT an accurate basis for this theory. Are there any specifics that state the 9600 really changes its clock speeds based on the PCIe bus speed?

The article was written based on the core clock changes found using RivaTuner and the benchmark software. But how does that prove anything?

If no software currently supports reading the G94 chipset correctly, then how can anyone know that what the article is based on is actually true?

Chris

BTW:

Based on all of the info that I have found and/or been given, what was the exact basis for the article?

I've been told by various overclockers that overclocking the PCIe bus to a different frequency changes the core clock speed on ANY graphics card. That is why people started OC'ing the PCIe bus to begin with. But again, the article is about the 9600 GT and NVIDIA's shady trick because of what? Possibly because the 9600 is the only card that will actually show the core clock changes based on the OC'd PCIe bus?
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
25,887 (3.79/day)
Location
Alabama
System Name Rocinante
Processor I9 14900KS
Motherboard EVGA z690 Dark KINGPIN (modded BIOS)
Cooling EK-AIO Elite 360 D-RGB
Memory 64GB Gskill Trident Z5 DDR5 6000 @6400
Video Card(s) MSI SUPRIM Liquid X 4090
Storage 1x 500GB 980 Pro | 1x 1TB 980 Pro | 1x 8TB Corsair MP400
Display(s) Odyssey OLED G9 G95SC
Case Lian Li o11 Evo Dynamic White
Audio Device(s) Moondrop S8's on Schiit Hel 2e
Power Supply Bequiet! Power Pro 12 1500w
Mouse Lamzu Atlantis mini (White)
Keyboard Monsgeek M3 Lavender, Akko Crystal Blues
VR HMD Quest 3
Software Windows 11
Benchmark Scores I dont have time for that.
Unwinder, in the new revision of RT did you take out hardware reading from the PLL altogether and just stick to measuring multipliers? Now the clock slider and monitor display the same clocks....as opposed to the 27+ MHz difference...
 

Saakki

New Member
Joined
Mar 3, 2008
Messages
301 (0.05/day)
Location
Finland
System Name JGG
Processor Q6600 @ 3,4ghz
Motherboard Rampage Formula
Cooling Xigmatek, Nexus
Memory 1100 mhz Pi
Video Card(s) 6950 + Accelero Xtreme Plus
Storage 1tb+1tb
Display(s) Acer 19" LCD 5ms + 17 " Hitachi
Case Fractal Design Define R2
Audio Device(s) X-FI
Power Supply XFX Core 650
Software Win 7 ultimate x64
Benchmark Scores Q6600 @ 3.5 ghz + and 6950 @ 6970 +
i think cbunting is a shady nVidia mindbender.. and hello all, i'm a new face on your nice forum.. after all this, should i go 9600 GT or 8800 GT for my upcoming rig..?
 

Solaris17

Super Dainty Moderator
Staff member
Joined
Aug 16, 2005
Messages
25,887 (3.79/day)
Location
Alabama
WELCOME TO THE FORUMS! Go 8800GT, it performs better; the 9600 is on par with the 8800GS.
 

imperialreign

New Member
Joined
Jul 19, 2007
Messages
7,043 (1.15/day)
Location
Sector ZZ₉ Plural Z Alpha
System Name УльтраФиолет
Processor Intel Kentsfield Q9650 @ 3.8GHz (4.2GHz highest achieved)
Motherboard ASUS P5E3 Deluxe/WiFi; X38 NSB, ICH9R SSB
Cooling Delta V3 block, XPSC res, 120x3 rad, ST 1/2" pump - 10 fans, SYSTRIN HDD cooler, Antec HDD cooler
Memory Dual channel 8GB OCZ Platinum DDR3 @ 1800MHz @ 7-7-7-20 1T
Video Card(s) Quadfire: (2) Sapphire HD5970
Storage (2) WD VelociRaptor 300GB SATA-300; WD 320GB SATA-300; WD 200GB UATA + WD 160GB UATA
Display(s) Samsung Syncmaster T240 24" (16:10)
Case Cooler Master Stacker 830
Audio Device(s) Creative X-Fi Titanium Fatal1ty Pro PCI-E x1
Power Supply Kingwin Mach1 1200W modular
Software Windows XP Home SP3; Vista Ultimate x64 SP2
Benchmark Scores 3m06: 20270 here: http://hwbot.org/user.do?userId=12313
Based on all of the info that I have found and/or been given, what was the exact basis for the article?

I've been told by various overclockers that overclocking the PCIe bus to a different frequency changes the core clock speed on ANY graphics card. That is why people started OC'ing the PCIe bus to begin with. But again, the article is about the 9600 GT and NVIDIA's shady trick because of what? Possibly because the 9600 is the only card that will actually show the core clock changes based on the OC'd PCIe bus?

The basis of the article is that the majority of people who would potentially buy this card don't really mess around with OCing their systems. Seeing as it's a 9600 card, and I'd expect those to slide into the mid-range bracket ere long - the mid-range market is "typically" the highest price average consumers are willing to pay for a new card.

Even still, there aren't too many users that really start digging into a system BIOS for graphics OCing, and remember, there are only a few brands of motherboards whose BIOS allows adjusting the PCIe frequency. If you have an OEM system from Dell, HP, eMachines, etc., you wouldn't have access to that setting in the BIOS at all, either. And as such, if the setting is left on [AUTO] or there isn't a means to adjust it, the PCIe frequency can change while running applications (someone correct me on that if I'm wrong).

What it all boils down to is that for the average consumer purchasing a 9600 for use in a Dell, Gateway, etc. setup - or users who don't OC and just run everything at stock speeds - your card will effectively OC itself without your knowing about it.

I'm personally not too keen on the idea, as it makes the card appear better than it truly is, and effectively throws off scoring in reviews because the card is OCing itself. IMO, this is like steroid use amongst athletes - until there's hard evidence that something is amiss, no one's the wiser about it, but that doesn't make it right.
 