
Overclocking of GTX 9xx series

Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
Can someone explain to me why everything always stops at clock step CLK 63? The range goes up to 74 I think, but CLK 63 is where the card has its actual limit. I thought it was related to ASIC quality or something, but seeing other users talk about CLK 63 specifically as well, that can't be it. So, what's the purpose of the other steps if the card never ever uses them? The clock I want to set for the boost has to go into the CLK 63 field. Always. I've been asking this since like the first page and no one has given me a clear answer.
 
Joined
Feb 8, 2012
Messages
3,013 (0.68/day)
Location
Zagreb, Croatia
System Name Windows 10 64-bit Core i7 6700
Processor Intel Core i7 6700
Motherboard Asus Z170M-PLUS
Cooling Corsair AIO
Memory 2 x 8 GB Kingston DDR4 2666
Video Card(s) Gigabyte NVIDIA GeForce GTX 1060 6GB
Storage Western Digital Caviar Blue 1 TB, Seagate Barracuda 1 TB
Display(s) Dell P2414H
Case Corsair Carbide Air 540
Audio Device(s) Realtek HD Audio
Power Supply Corsair TX v2 650W
Mouse Steelseries Sensei
Keyboard CM Storm Quickfire Pro, Cherry MX Reds
Software MS Windows 10 Pro 64-bit
Can someone explain to me why everything always stops at clock step CLK 63?
The two opening posts in the thread http://www.overclock.net/t/1553510/...for-most-cards-if-you-want-a-stable-overclock explain this in some detail.
In short:
By specifying your max stable clock as the "default boost clock" you force your card to the max boost bin, all the way at the end of the voltage table (and not, as would be default, somewhere in the middle, like CLK 63 or CLK 64).
...so it's a one-value edit in the vBIOS, but the voltage curve is different at the end of the voltage table, so:
The instability becomes most apparent when the card CLOCKS DOWN.
Posts #6 and #8 in that thread are also very informative.
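To picture the mechanism, here's a toy sketch in Python. The CLK bins, clocks and voltages below are invented, not real Maxwell vBIOS data; the point is only to show where a boost target lands in the table and which clock/voltage pairs the card walks down onto when it throttles.

```python
# Toy illustration only -- these CLK bins, clocks and voltages are made up.
TABLE = [(clk, 586 + clk * 13, 832 + clk * 6) for clk in range(60, 75)]  # (CLK, MHz, mV)

def boost_bin(target_mhz):
    """Return the first (CLK, MHz, mV) entry that satisfies the boost target."""
    for entry in TABLE:
        if entry[1] >= target_mhz:
            return entry
    return TABLE[-1]  # clamp to the last bin at the end of the table

def downclock(entry, steps):
    """Entry the card falls back to when it drops a few bins under load/limits."""
    idx = TABLE.index(entry)
    return TABLE[max(idx - steps, 0)]

stock = boost_bin(1405)   # default-style BIOS: the target lands mid-table (CLK 63 here)
edited = boost_bin(1548)  # "max clock as default boost": the target sits in the last bin

for name, entry in (("stock", stock), ("edited", edited)):
    clk, mhz, mv = entry
    d_clk, d_mhz, d_mv = downclock(entry, 3)
    print(f"{name}: boost bin CLK{clk} = {mhz} MHz @ {mv} mV, "
          f"3 bins down -> CLK{d_clk} = {d_mhz} MHz @ {d_mv} mV")
```

With the edited target at the end of the table, the bins directly below it still carry high clocks at lower voltages, which is where the downclocking instability shows up.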
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
So, it is connected to ASIC quality. But it's so weird that so many cards max out at CLK 63 exactly. Not CLK 62 or CLK 64. CLK 63 exactly...

I did see instability when the card hit the TDP limits (when I was experimenting with those at just 200 W and 150 W). But with further tweaking, I'm achieving a very high boost that isn't triggering any PWR or voltage safeguards because I've drastically decreased the voltage. Before, for a 1405 MHz core the card was maxing out at 1.210 V. Now it's using 1.150 V. It generates a lot less heat and noise and isn't tripping PWR limits despite using the same clock as before. It still needs more stability testing, but it's looking good. It even gets 98.8% frame consistency in the 3DMark stress test, which is actually higher than with the higher voltage.
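As a rough sanity check (not a measurement): dynamic switching power scales roughly with f·V², so the drop from 1.210 V to 1.150 V at the same 1405 MHz already works out to about 10% less switching power, which lines up with the lower heat and fewer PWR trips.

```python
# Back-of-the-envelope only: dynamic power ~ f * V^2 (ignores leakage and the
# fact that the real limit is total board power, not core switching power alone).
f_mhz = 1405
v_old, v_new = 1.210, 1.150

ratio = (f_mhz * v_new**2) / (f_mhz * v_old**2)
print(f"estimated dynamic power at 1.150 V: {ratio:.1%} of the 1.210 V figure")
# -> roughly 90%, i.e. about 10% less switching power at the same clock
```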

People say "but people who mod BIOSes don't know what they are doing". How can anyone know everything when the tool itself has no explanations in the menus, not even basic ones, and every guide is different from the next? The only thing left is to ask people, do trial and error, and learn the thing. At first I also had a 300 W TDP limit and crazy voltages, but after weeks of fiddling with it I've refined it so much that I'm now at a lower TDP than it had from the factory while using way higher clocks. I'm going to try to push voltages even lower in the hope of loosening up the fan curve a bit. It's not exactly super noisy, but I'm a silence freak and if I can do something about it, I most likely will.
 
Joined
Nov 18, 2010
Messages
7,125 (1.45/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
Ah RejZor... you are poking at the obvious... it is reverse engineering. What do you expect? Daddy Jen-Hsun Huang holding your hand and helping you? It is trial and error and a lot of wood to chop.

Put your card on water and you will see completely different things. For fun, make a graph with those two overlapping at each CLK step; you will see the data sheet for your ASIC combined with your Strix cooler (+ RPM ramp) and the expected performance and noise. Adjust, rinse and repeat. It eats too much time, not really worth it. You want the card to be silent, but then you have to adjust the whole curve, not only the max CLK it can handle.

For example, my card clocks like a turd no matter what I do to it... I must supply a high voltage to hit 1500 MHz and be stable. The boost thing is unwelcome in my books. Every mechanism that calculates load and frequency, i.e. a governor, introduces latency, and there are scenarios where it introduces stutter. Yes, it works fine mostly... that's it, mostly. And I don't care about the TDP; look at your phone and you'll see exactly the same thing happening there. As long as the card is under 65°C no limiters will usually kick in, except on some water BIOSes that have a lower limit.

My Gigabyte's WF3/G1 is a hell of a cooler actually and temperatures aren't an issue... the silicon and your luck are... My case dampens the sound really well too. I am also touchy about noise, but while gaming I don't care... just turn the volume up to 11.
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
No, but the Maxwell 2 Tweaker's maker could at least throw three freaking lines of text into the damn interface. Just look at the freaking Power Table. It's just a list of values. How hard would it be to put some text next to the important ones, like the PCIe 6-pin and 8-pin entries, or the TDP and PWR limits? It's not like I'm asking for the impossible...
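For what it's worth, this is roughly the kind of labelling being asked for. The field names, grouping and wattages below are my own placeholder guesses, not the editor's actual table layout or real GTX 9xx values.

```python
# Hypothetical labelling of power-table entries -- names and wattages are
# placeholders, not taken from any real Maxwell II BIOS Tweaker dump.
power_table = {
    "TDP target (board power)": {"min_w": 150, "def_w": 180, "max_w": 280},
    "Power limit (max slider)": {"min_w": 180, "def_w": 200, "max_w": 300},
    "PCIe slot rail":           {"min_w": 35,  "def_w": 66,  "max_w": 75},
    "PCIe 6-pin rail":          {"min_w": 60,  "def_w": 75,  "max_w": 100},
    "PCIe 8-pin rail":          {"min_w": 120, "def_w": 150, "max_w": 175},
}

for name, limits in power_table.items():
    print(f"{name:28s} min {limits['min_w']:3d} W  def {limits['def_w']:3d} W  max {limits['max_w']:3d} W")
```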

Just hit 99% frame consistency at 1405 MHz using 1.125 V. At 1.1 V it won't pass the 3DMark stress test. Dropped the mid fan point from 1400 RPM to 1200 and moved the 3000 RPM point from 85°C up to 95°C, meaning 100% fan will only kick in at that point, which loosens the curve a bit. I did see PWR kick in on two occasions on the GPU-Z graph, but I wouldn't worry about it.
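For the curious, here's roughly how that change plays out if the fan curve is simple linear interpolation between points. The interpolation and the low-temperature anchor point are my assumptions; the 1200 RPM mid point and the 95°C / 3000 RPM point are the ones above.

```python
# Crude fan-curve sketch with linear interpolation between (temp C, RPM) points.
# The 40 C / 800 RPM anchor and the mid-point temperature are placeholders.
old_curve = [(40, 800), (70, 1400), (85, 3000)]
new_curve = [(40, 800), (70, 1200), (95, 3000)]

def fan_rpm(curve, temp_c):
    """Linearly interpolate fan RPM from a sorted list of (temp C, RPM) points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, r0), (t1, r1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return r0 + (r1 - r0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

for t in (60, 75, 85, 95):
    print(f"{t} C: {round(fan_rpm(old_curve, t))} -> {round(fan_rpm(new_curve, t))} RPM")
```

The new curve stays noticeably slower through the 75-90°C range, which is where the noise reduction comes from.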

The reasons why I prefer overclocking via boost are a) performance, b) stability, c) reliability. Sure, you can crank the clocks up by setting the base 3D clock, but then as soon as you step outside of the ideal conditions, shit goes bad. With boost, the clocks will just drop a bit and that will be that. In most cases I won't even notice it, performance won't really be affected, and I can game on until conditions change. Sure, it takes longer to configure, but once you do, I think it's the better option.

I tried overclocking with MSI Afterburner and I didn't get results anywhere near this good. And I hate the goddamn offset control. I want to see the absolute values I'm using, not have +250 MHz and then try to figure out 250 MHz added to what!? I hate that. And it wasn't as stable anyway.
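For anyone following along, the arithmetic behind the complaint: an Afterburner offset is relative to the card's stock clocks, so the absolute target depends on which exact card you have. The stock boost figure below is just an example value.

```python
# Offset vs. absolute clocks -- the stock value is an example, not any specific card.
stock_boost_mhz = 1216          # e.g. a reference-style GTX 980 boost clock
afterburner_offset_mhz = 250

absolute_target_mhz = stock_boost_mhz + afterburner_offset_mhz
print(f"+{afterburner_offset_mhz} MHz offset -> {absolute_target_mhz} MHz target "
      f"(before GPU Boost bins, power and temperature limits adjust it)")
```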
 
Joined
Nov 18, 2010
Messages
7,125 (1.45/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
Yes, but you haven't paid a dime to the creator, nor is he interested in making such changes as long as everything works. The power table looks fine. There are far more horrid programs in the wild for controlling and setting up firmware... be thankful that at least it isn't Chinese-only, or heck... hex-editor-only!

The MSI OC offset is added to the base clock, obviously. You see it in the tool, so do your math. Boost has to be disabled to ensure your clocks are safe and error-free. I use OCCT for short tests, like 1-2 minutes, for error generation... it is a bulletproof concept as it gathers errors really fast (crank the fan up to max during the test to protect the VRMs). As long as you get calculation errors at a given speed, turn it down 5-10 MHz and retest until you are safe. With boost enabled you need to artificially raise the clocks to ensure the throttled values are still high enough, and test then.
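A sketch of that step-down routine, for reference. The run_occt() function is a stand-in for running a 1-2 minute OCCT GPU error test by hand and noting whether it reported errors; no real OCCT API or scripting hook is being used here.

```python
# Sketch of the "drop 5-10 MHz until the short error test passes" routine.

def run_occt(clock_mhz: int) -> bool:
    """Placeholder for a manual 1-2 minute OCCT GPU error test.
    Here it just pretends anything above 1410 MHz produces calculation errors."""
    return clock_mhz > 1410

def find_stable_clock(start_mhz: int, step_mhz: int = 10, floor_mhz: int = 1300) -> int:
    """Drop the clock in small steps until a short run comes back error-free."""
    clock = start_mhz
    while clock > floor_mhz:
        if not run_occt(clock):
            return clock      # first clock with a clean short test
        clock -= step_mhz     # errors seen: back off 5-10 MHz and retest
    return floor_mhz

print(find_stable_clock(1450))  # -> 1410 with the pretend test above
```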

Your arguments for why boost is better than a solid clock are funny... if your OC isn't stable it isn't stable in whatever mode you prefer, and if it is reliable and stable it works fine everywhere. My card always runs at its maximum boost clock while gaming - 1470 MHz, always... I put +5 MHz in AB and it rises by 5 MHz... If it throttles due to temperature/current, then look at which CLK state it dropped to and add +5 MHz to that clock state... simple as that... what's so tough about that? Each power state corresponds to its own current and voltage settings according to the table and the temperature target of your BIOS...

And don't overdo the fan lowering... watch your VRMs... Ah... your ambitions just scream that you need to put your card under water to achieve your goals.
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
So, just because he made the tool I'm not allowed to give legitimate criticism. Oh boy, it's almost as if I'm talking about racism/sexism with someone again... Call me an ungrateful bastard all you want, that still doesn't change the fact that it's badly designed, especially considering these are things that could be fully explained with five strings of plain text. No fancy popups or GUI elements, simply text next to a setting. But oh well, criticism = harassment + being ungrateful. Now, where have I seen that...

Offsets are stupid. If all GTX 980s had the same round clock, no one would care. But constantly having to work out what the starting point was is just idiotic. Same criticism as above. Criticize me for it all you want, that doesn't change the fact that it's dumb and clumsy. Especially since the same tool shows absolute values when you're using a Radeon graphics card...

As for the clocks, what makes you think your setup is prepared for every scenario just because you've tested it for 30 hours straight? I've seen too many problems with that. The boost way is just more elegant, even if it takes longer to test.
 
Joined
Nov 18, 2010
Messages
7,125 (1.45/day)
Location
Rīga, Latvia
System Name HELLSTAR
Processor AMD RYZEN 9 5950X
Motherboard ASUS Strix X570-E
Cooling 2x 360 + 280 rads. 3x Gentle Typhoons, 3x Phanteks T30, 2x TT T140 . EK-Quantum Momentum Monoblock.
Memory 4x8GB G.SKILL Trident Z RGB F4-4133C19D-16GTZR 14-16-12-30-44
Video Card(s) Sapphire Pulse RX 7900XTX + under waterblock.
Storage Optane 900P[W11] + WD BLACK SN850X 4TB + 750 EVO 500GB + 1TB 980PRO[FEDORA]
Display(s) Philips PHL BDM3270 + Acer XV242Y
Case Lian Li O11 Dynamic EVO
Audio Device(s) Sound Blaster ZxR
Power Supply Fractal Design Newton R3 1000W
Mouse Razer Basilisk
Keyboard Razer BlackWidow V3 - Yellow Switch
Software FEDORA 39 / Windows 11 insider
But oh well, criticism = harassment+being ungrateful

It ain't really some sort of product where you can do that. Contribute and make yourself eligible to criticize - make better software yourself. That's the way free software works and those are the ethics around it; every coder will tell you the same and bring up the Linus Torvalds middle finger to your face. Second, the explanations aren't there for obvious reasons. It is NVIDIA property using patented technologies and instructions that are actually confidential material. Be grateful that the tool even exists; officially it should actually be taken down.

Offsets are stupid.

What?

you've tested it for 30 hours straight?

What?

I understand what you are trying to do, but just don't burn the card - that's what I am telling you... calm down and crack open some beers.
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
Just because "i don't make the software myself" doesn't mean I'm not allowed to criticize. That's the ages old bullshit I'm 100% immune to. I make certain tools myself and there is this thing some of us call "feedback". For some it's "the evil ungrateful users criticizing muh work". That shit ain't gonna fly with me. With no feedback or criticism, we'll continue to have garbage software because no one will bother to improve it. But hey, I guess my view on this is weird...
 
Joined
Sep 12, 2015
Messages
413 (0.13/day)
Location
Corn field in Iowa
System Name Vellinious
Processor i7 6950X
Motherboard ASUS X99-A II
Cooling Custom Liquid
Memory 32GB GSkill TridentZ 3200 14
Video Card(s) 2 x EVGA GTX 1080 FTW
Storage 512 GB Samsung 950 Pro, 120GB Kingston Hyper X SSD, 2 x 1TB WD Caviar Black
Case Thermaltake Core X9, stacked
Power Supply EVGA SuperNova 1000P2, EVGA SuperNova 750G2
Mouse Razer Naga Molten Edition
Keyboard TT eSports Challenger Ultimate
Benchmark Scores Timespy-1080 SLI-15972
I think you just like to bitch about stuff.....that's pretty much this whole thread. You bitching about something.....
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
Sure mate, if you say so. You know, because "bitching" never ever improved anything. Like the same "bitching" that made AMD fix the RX 480's power distribution. I mean, why were people "bitching" about it? Those ungrateful bastards. They got the product, what more do they want, eh!? Or the bitching over driver distribution through GeForce Experience only. Those ungrateful whining NVIDIA users. They'd still get the drivers, so what were they "bitching" about? This very same "bitching" and "whining" turned EVERYTHING you use today into more user-friendly products than they were 10 years ago. But hey, if you want to just praise everything even when it's all broken and crappy, then be my guest. I'm not one of those idiots. Something being free doesn't grant any immunity in that regard. I make free stuff as well and people comment and complain about this and that, and I don't go "omg you bitching assholes" on them just because they dare to comment on a FREE app.
 
Joined
Sep 12, 2015
Messages
413 (0.13/day)
Location
Corn field in Iowa
System Name Vellinious
Processor i7 6950X
Motherboard ASUS X99-A II
Cooling Custom Liquid
Memory 32GB GSkill TridentZ 3200 14
Video Card(s) 2 x EVGA GTX 1080 FTW
Storage 512 GB Samsung 950 Pro, 120GB Kingston Hyper X SSD, 2 x 1TB WD Caviar Black
Case Thermaltake Core X9, stacked
Power Supply EVGA SuperNova 1000P2, EVGA SuperNova 750G2
Mouse Razer Naga Molten Edition
Keyboard TT eSports Challenger Ultimate
Benchmark Scores Timespy-1080 SLI-15972
The maxwell bios editor is not hard to figure out, man.....not even a little bit.
 
Joined
Oct 2, 2004
Messages
13,791 (1.93/day)
Yeah, the Power Table is a brilliant example of how self-explanatory it is... All it needs is five plain-text strings, none of which are there. I mean, if you go to the length of hacking the BIOS and making a tool for it, drop some basic descriptions in it. And I'm now the bad guy for merely pointing this out.
 