
Does GPU overclocking make a noticeable difference to you?

Of course the benchmark scores are higher, but when you're actually playing a game, can you notice the difference between stock and, say, a 5% overclock?

What about 10%? 15%? At what point does it make a noticeable difference to you?

50 FPS + 5% = 52.5
50 FPS + 10% = 55
50 FPS + 15% = 57.5
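To put the arithmetic in one place, here's a quick, purely illustrative sketch (it assumes performance scales linearly with the overclock, which real games rarely do exactly):

```python
# Scale a baseline frame rate by a hypothetical overclock gain.
# Assumes perfect linear scaling, which is an optimistic upper bound.
def scaled_fps(base_fps: float, oc_percent: float) -> float:
    return base_fps * (1 + oc_percent / 100)

for pct in (5, 10, 15):
    print(f"50 FPS + {pct}% = {scaled_fps(50, pct):.1f}")
# 50 FPS + 5% = 52.5
# 50 FPS + 10% = 55.0
# 50 FPS + 15% = 57.5
```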
 
Now this is actually a really good question. I have asked myself this many times, and my honest answer is: no.

I run my 570 at 1400 (up from 1284) and memory from 7000 to 8000 simply because it feels nice to have it giving a slightly higher FPS lol. But yeah, I can run it stock no issue and not feel it. But then I am playing games where I am getting over 80 FPS pretty much all the time.

I think if I was sub-60 it would make more of a difference. Anyway, the last GPU I owned that really gained a huge amount from overclocking was the 980 Ti. Maxwell just OC'd like crazy. Still don't know if it was noticeable in gaming though XD
 
With my old 980 Ti, the OC gave me ~20% on top of the ~20% (from reference to custom).

40% better perf is quite noticeable, yep :)
 
With my old 980 Ti, the OC gave me ~20% on top of the ~20% (from reference to custom).

40% better perf is quite noticeable, yep :)
Oh man, I miss the 980 Ti. I had two reference blowers in SLI at one point; they would do 1150 MHz each in SLI benching, but I think I managed to get them to around 1550 MHz (hair dryer mode though). That's roughly a 35% increase in core clock alone, not even a VRAM OC, so yeah, the perf gain was enormous. The 980 Ti was a champ *nostalgia*
 
If fps is around 50, then yeah, every single fps is nice. After all, it's free performance.
If you're already at 180 fps, going to 190-200 probably won't change much. Unless you're a 240 Hz gamer maybe :p You'll typically be more CPU and memory bound at that point tho.

I always overclock my GPUs. I don't like the fact that I'm not getting full performance from the card :p

Oh man, I miss the 980 Ti. I had two reference blowers in SLI at one point; they would do 1150 MHz each in SLI benching, but I think I managed to get them to around 1550 MHz (hair dryer mode though). That's roughly a 35% increase in core clock alone, not even a VRAM OC, so yeah, the perf gain was enormous. The 980 Ti was a champ *nostalgia*

Yeah it's one of the best overclockers ever I think.
I still find it weird that Nvidia shipped the reference cards with those clocks.

Funny that it launched alongside the Fury X, which Lisa Su claimed was an overclocker's dream :laugh:

My 980 Ti ran 1500/8000.

With custom firmware, it did 1580 solid and benched at 1633, but it was not worth it for me on air. At 1500/2000, performance was good and noise levels were nice.
 
I would also answer with a big: NO.

In the age of FreeSync/G-Sync and power-limited GPUs (like Pascal and Turing) or clock-limited ones (all the new AMD cards), that extra 5-10% is irrelevant: a sync-enabled display can simply take the 5% slower framerate and be fine with it, and you won't notice any difference.

Overclocking mattered more in the past, when being unable to maintain 60 fps caused stutters (especially with VSync) and 20% gains were possible; now, with "turbo" or "precision boost", the cards overclock themselves to near the limit anyway.

I only overclock my GPU for benchmarks, (personal) record breaking and that stuff.
All "factory" during gaming.
 
Yeah it's one of the best overclockers ever I think.
I still find it weird that Nvidia shipped the reference cards with those clocks.

Funny that it launched alongside the Fury X, which Lisa Su claimed was an overclocker's dream :laugh:

My 980 Ti ran 1500/8000.

With custom firmware, it did 1580 solid and benched at 1633, but it was not worth it for me on air. At 1500/2000, performance was good and noise levels were nice.
Damn, that's mental, 1633. And yeah, I guess they wanted to keep power in check, and tbh the competition wasn't, you know, all that great. I had Fury X CrossFire too and it was a mess compared to the 980 Tis. Fiji could barely add 100 MHz :/ and the 4 GB vs 6 GB really made the difference. This was all back when I had inherited a large sum of money from a family member and then blew it all on PC parts. -_- Now it's all gone and I'm skint xD
Worth it. :rockout:
 
Of course the benchmark scores are higher, but when you're actually playing a game, can you notice the difference between stock and, say, a 5% overclock?

What about 10%? 15%? At what point does it make a noticeable difference to you?

50 FPS + 5% = 52.5
50 FPS + 10% = 55
50 FPS + 15% = 57.5
No noticeable difference to me, as your numbers above show. Either you are struggling at 20 fps and go up to 22 fps, or you are humming along at 60 fps and go up to 66. You really need a 25%+ increase to see a noticeable difference, and that pretty much means a different card.
 
Instead of chasing pure perf, a Pascal or Turing OC should focus on maintaining the highest GPU clocks over long periods of time.
Less power/temp throttling = fewer stutters caused by the GPU clock bouncing around.
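If you want to see whether the card is actually holding its clocks over a long session, a rough way to log it (assuming an NVIDIA card with the nvidia-smi tool on the PATH; the query fields used are standard nvidia-smi ones) is something like:

```python
# Minimal throttling logger sketch: poll nvidia-smi once per second and
# print graphics clock (MHz), GPU temperature (C) and power draw (W).
# Assumes an NVIDIA GPU and that nvidia-smi is available on the PATH.
import subprocess
import time

QUERY = [
    "nvidia-smi",
    "--query-gpu=clocks.gr,temperature.gpu,power.draw",
    "--format=csv,noheader,nounits",
]

while True:
    out = subprocess.run(QUERY, capture_output=True, text=True).stdout.strip()
    print(time.strftime("%H:%M:%S"), out)  # e.g. "21:14:05 1911, 67, 212.4"
    time.sleep(1)
```

If the clock column sags while temperature or power sits at its limit, that bouncing is exactly the throttling being described.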
 
Damn, that's mental, 1633. And yeah, I guess they wanted to keep power in check, and tbh the competition wasn't, you know, all that great. I had Fury X CrossFire too and it was a mess compared to the 980 Tis. Fiji could barely add 100 MHz :/ and the 4 GB vs 6 GB really made the difference. This was all back when I had inherited a large sum of money from a family member and then blew it all on PC parts. -_- Now it's all gone and I'm skint xD
Worth it. :rockout:

That was with full power/voltages, and it was not 100% stable in games (artifacts), but all benchmarks would run (and score better). 1580 MHz was 100% stable tho. Those extra 80 MHz probably added 10-15 dB, so I kept it at 1500/8000 :laugh:

Yeah, I agree. That extra 2 GB of VRAM came in handy over time.

I would also answer with a big: NO.

In the age of FreeSync/G-Sync and power-limited GPUs (like Pascal and Turing) or clock-limited ones (all the new AMD cards), that extra 5-10% is irrelevant: a sync-enabled display can simply take the 5% slower framerate and be fine with it, and you won't notice any difference.

Overclocking mattered more in the past, when being unable to maintain 60 fps caused stutters (especially with VSync) and 20% gains were possible; now, with "turbo" or "precision boost", the cards overclock themselves to near the limit anyway.

I only overclock my GPU for benchmarks, (personal) record breaking and that stuff.
All "factory" during gaming.

Well, not everyone aims for 55-65 fps, and most FreeSync/G-Sync monitors are 100+ Hz anyway, so you'll want all the power you can get.

A 2060 with an OC will pretty much match a 2070.
A 2070 with an OC will pretty much match a 2080.

Free performance is always nice. So I'll keep OC'ing my cards.

Btw, cards won't OC near the limit out of the box just because of turbo boost. They will be held back by power limits.

With my 1080 Ti, performance jumps 10-12% just from raising the power limit to max and keeping everything else at stock. It will boost much higher as a result.
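For what it's worth, on NVIDIA cards the power limit can be read (and, with admin rights, raised toward the board maximum) through nvidia-smi. A rough sketch of that, using standard nvidia-smi query fields; the actual limits depend entirely on your card's vBIOS:

```python
# Sketch: read the current/default/max board power limits and print the
# command that would raise the limit to the maximum the vBIOS allows.
# Actually applying it (nvidia-smi -pl) requires administrator rights.
import subprocess

out = subprocess.run(
    [
        "nvidia-smi",
        "--query-gpu=power.limit,power.default_limit,power.max_limit",
        "--format=csv,noheader,nounits",
    ],
    capture_output=True, text=True,
).stdout.strip()

current, default, maximum = (float(x) for x in out.split(","))
print(f"current {current} W, default {default} W, max {maximum} W")
print(f"to max it out: sudo nvidia-smi -pl {maximum:.0f}")
```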
 
Free performance is always nice.
But it isn't really "free", is it?
Power draw aside (which increases your electricity bill), it also stresses the components more and increases the chance of a failure.

Like you said "not everyone aims for", I could say "not everyone can afford buying a new GPU every year or two".
Overclocking is and always will be a risk that one takes.

But of course, if you have a golden goose (or pigeon) or a river of money flowing through your bedroom, knock yourself out, POWER TO THE MAX !
 
Some cards, depending on their cooling setup, will "work" at the OC level only for a short time, a benchmark run if you want. But sustaining that OC during a gaming session of one, two or more hours is something else. They will drop the OC frequency to cool off.
 
But it isn't really "free", is it?
Power draw aside (which increases your electricity bill), it also stresses the components more and increases the chance of a failure.

Like you said "not everyone aims for", I could say "not everyone can afford buying a new GPU every year or two".
Overclocking is and always will be a risk that one takes.

But of course, if you have a golden goose (or pigeon) or a river of money flowing through your bedroom, knock yourself out, POWER TO THE MAX !

A few more watts cost a few cents a month. I have been overclocking GPUs for over 20 years now and have never seen a dead chip from a standard OC.

Overclocking can sometimes postpone a GPU upgrade, so I don't see your point.

Some cards, depending on their cooling setup, will "work" at the OC level only for a short time, a benchmark run if you want. But sustaining that OC during a gaming session of one, two or more hours is something else. They will drop the OC frequency to cool off.

No, they won't. Unless you gimp the fan speed.

My 980 Ti did 1500 all day long, at auto fan.
My 1080 Ti does 2000+ forever.
 
Playing Far Cry New Dawn at 1440p, I am running my 2080 Ti underclocked: 75% max power with a 75 FPS frame rate cap.

trog

PS: the real max power could be 125%.
 
Playing Far Cry New Dawn at 1440p, I am running my 2080 Ti underclocked: 75% max power with a 75 FPS frame rate cap.

trog

ITX Build? :laugh:
 
Depends on the card, your target fps, and what you get at stock.
Like with my old 780 or my 970, getting the OC just right means the difference between a constant 75 fps and dips to the mid 60s in some games, and if that's your target, then yes.
If you have a 2080 Ti and your target is 60 fps, I doubt you will notice the OC.

The 780 I had was water cooled, and I had to modify the BIOS to remove the TDP limit so it would sustain the speeds I had it doing. The 970 does the same with a standard OC, no BIOS mods and stock cooling.
Mostly, for me, overclocking is just a numbers game. I'll happily edit in-game settings to achieve the performance I want, but if an OC can let me play at slightly higher settings and hit my target frame rate, then I'm gonna be overclocking.
 
10% is usually where it makes a noticeable difference; anything below that, no. Still, if you overclock your entire system the benefits add up... you can get +10% from GPU overclocking, then +10% from the CPU and tweaking RAM, and the end result, a ~20% improvement over your baseline, is kind of huge.

For example: here is a Far Cry 5 1080p run on a 2080 Ti (I game at 1440p but they're very close). On the left is a 9900K (the best gaming CPU at stock); on the right is my 7820X at stock, which sucks at gaming...

[Attachment: 9900k stock.JPG]

Here is my overclocked 7820X system with an OC'd 1080 Ti:

[Attachment: fc5.JPG]
So in my favorite title, at 1440p and 1080p, my 2-year-old rig is able to run as if it were a brand new, stock, top-of-the-line rig (I've compared benchmarks in this title and New Dawn). So overclocking does help quite a bit - and for me, if I reverted all my clocks, including the GPU, I would be 30% slower overall, which is extremely noticeable.
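As a rough illustration of the "benefits add up" point (purely illustrative arithmetic, not the benchmark results above): if no single component becomes a hard bottleneck, percentage gains compound roughly multiplicatively:

```python
# Best-case compounding of per-component gains. Real combined gains are
# usually lower, since one component ends up being the bottleneck.
def combined_gain(*gains_percent: float) -> float:
    total = 1.0
    for g in gains_percent:
        total *= 1 + g / 100
    return (total - 1) * 100

print(f"{combined_gain(10, 10):.1f}%")  # 21.0% from +10% GPU and +10% CPU/RAM
```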
 
It depends... 5 fps (10% at 50 fps) can be the difference of a settings notch and better IQ...


Lisa Su claimed was an overclocker's dream :laugh:
You've repeated that falsehood more than once recently here...

I was at that press event in LA, and for at least the umpteenth time on TPU: they were referring to the cooling. Look at a transcript. Most media didn't report it right, and forum lemmings followed. :(

Also, it was Joe Macri who said it (first), not Lisa. He also clarified in our 1:1 interview that the cooling was a beast and allowed for high overclocking.
 
It used to make a huge difference back when I had a GeForce 210, but now it's not noticeable with my GTX 750 Ti.
 
A fully overclocked rig will put your system in "new card territory", so I would vote it's always worth it. Even if the GPU OC is only around 10%: if you're an overclocker, you usually overclock everything, including household appliances.

A fully OC'd RTX 2060 rig could easily match/beat the same rig at stock with a 2070 at base clocks in most titles - the same applies to a ton of cards out there, and $150 of free performance is a pretty sweet deal.
 
I definitely notice it, on some cards more than others... I specifically remember volt modding an 8600 GT to nearly 8800 GT levels back in the day... now, on my current 290X, it's not quite as dramatic with a ~17% overclock, but still very noticeable.
 
It depends on circumstances, but an OC can not only be noticeable, it can be make or break. When I had an R9 290, it made some games hit a constant 60 fps instead of dropping to 52-53 frequently.
That said, on my current hardware it's hardly make or break anymore, though there are many cases where I do notice it.
I'd put +5% as hardly noticeable.
+10% is definitely noticeable unless you're at 130+ fps. 50 to 55 or 90 to 100 are both an improvement to me. 130 vs 145 I usually can't tell apart, though I can definitely tell 130 vs 160 apart.
At +20% I usually consider upgrading if the cost can be kept reasonable, but it's not enough if the cost is gonna be substantial.
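Those thresholds line up with frame times, which is closer to what you actually perceive; a quick, purely illustrative sketch converting the fps pairs mentioned above into milliseconds saved per frame:

```python
# Convert fps comparisons into frame-time savings (ms per frame):
# the absolute saving shrinks quickly at high frame rates, which is why
# 130 vs 145 fps is much harder to tell apart than 50 vs 55 fps.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for low, high in [(50, 55), (90, 100), (130, 145), (130, 160)]:
    saved = frame_time_ms(low) - frame_time_ms(high)
    print(f"{low} -> {high} fps saves {saved:.2f} ms per frame")
# 50 -> 55 fps saves 1.82 ms per frame
# 90 -> 100 fps saves 1.11 ms per frame
# 130 -> 145 fps saves 0.80 ms per frame
# 130 -> 160 fps saves 1.44 ms per frame
```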
 
Depends on the GPU: some bring 20-30%, some 5-15%, but in the end we overclock for performance and every bit helps. As for increased power usage and a shorter lifespan, who cares? Nothing lasts forever, not even us, eh.
 
As some noted...

It is not about the OC. The card does that itself... it's about temps. I have been using water cooling for the past few gens... and it gives me the best boost I can have, as the card holds its maximum boost state all the time.
 