
A comparison of stock vs efficient gaming

Does capping FPS to your refresh rate cause micro stutter? Do you have a link where I can read about this? I'm genuinely interested.
Hitting your Vsync limit (60 FPS at 60 Hz, 144 at 144, etc.) causes latency to skyrocket, just by the nature of how Vsync works.

G-Sync and FreeSync both need Vsync active; they work with it and don't replace it.

AMD's Enhanced Sync and Nvidia's Fast Sync are alternatives (and covered in one of those videos), but they aren't quite as good. I use Fast Sync and a 120 FPS cap on my 4K 72 Hz monitor, so I get 72 Hz worth of display latency, with input latency being half of that.



Two displays, one 60 Hz and one 120 Hz.
You run Fast Sync, yay.

Both would have the same render latency and input latency (around 5 ms at that level, iirc), but the 60 Hz display would skip every second frame, so it wouldn't look as smooth visually, despite "feeling" as fast thanks to the lower input latency.

Any games with slower inputs - especially anyone gaming on controllers or joysticks - would not benefit at all from that lowered input latency. If it's a multiplayer game on a server with a 50 ms ping half a country away, you won't notice a damn thing.

If you personally can't feel that speed difference, or your PC can't keep up, you'd notice zero difference from locking it to 58 FPS - and with components no longer building up heat and running into power limits, you instantly have headroom to spare when shit gets hectic in games, so your 99% lows can be a lot smoother.
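If it helps to see why the ceiling hurts, here's a rough toy model of the queue effect - not a measurement, and the real queue depth depends on the game, the driver and the low-latency-mode setting, so treat the numbers as illustrative only:

```python
# Toy model of why sitting at the Vsync ceiling hurts latency.
# Assumption: the driver keeps a small queue of pre-rendered frames
# (depth varies per game/driver), and each queued frame waits a full
# refresh interval for its scanout slot.
def vsync_latency_ms(refresh_hz, fps, queued_frames=2):
    scanout = 1000.0 / refresh_hz        # one refresh interval in ms
    if fps >= refresh_hz:
        # GPU outruns the display: the queue fills up and every frame
        # you see is several refresh intervals old.
        return (queued_frames + 1) * scanout
    # Capped just below refresh: the queue stays empty, so latency is
    # roughly one frame time plus an average half-refresh wait.
    return 1000.0 / fps + scanout / 2

print(vsync_latency_ms(60, 60))   # ~50 ms - pegged at the ceiling
print(vsync_latency_ms(60, 58))   # ~25 ms - capped just below it
```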


I'm a nutjob and slide my mouse and keyboard left and right, toggling which display I'm using based on mood.
Starcraft II? Might as well go 4K, since it's single threaded. Killing Floor 2? Oh god, gimme that 165 Hz, because the game engine LOVES high FPS.

My default settings (I tweak per game, but these are my Nvidia global defaults):
120 FPS cap
Fast Sync on
Background FPS 45 (stops alt-tabbed or idle games from chewing power, without giving me a seizure if I open a chat window)

Games I know I'll play on monitor 2, I unlock to 160 FPS of the 165.


***************Posts got merged, pay attention below this line**************
Unigine heaven examples:


Vsync on, 60 Hz, no FPS cap:
14.3 ms



Vsync on, 58 FPS limit (shockingly, this is Nvidia's default FPS limit suggestion, like they already knew):
6.7 ms



120 FPS, Fast Sync (same *render* and input latency whether it's on the 60 Hz or 165 Hz display):
Since I've got a whole lotta crap like browser tabs open, the 99% FPS did vary a bit.
4.8 ms render latency


165 Hz, 165 FPS, Vsync on:
6.9 ms
Oh, it's LAGGIER than at 58 FPS.




Simply put, the gain from limiting to 58 FPS is something anyone can achieve for way better render latency, while the diminishing returns from chasing higher and higher FPS are not worth it in many titles (other than the few high-FPS-optimised esports titles, if they aren't crippled by lower server tick rates anyway...).
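To put numbers on the diminishing returns, here's the ideal per-frame time at a few caps - pure 1000/FPS maths, so it won't match the overlay readings above exactly (those include driver and queue overhead):

```python
# Ideal per-frame time for a handful of FPS caps - the gap shrinks fast.
caps = [58, 60, 120, 144, 165, 240]
for prev, cap in zip([None] + caps, caps):
    frame_ms = 1000.0 / cap
    note = "" if prev is None else f"  (only {1000.0 / prev - frame_ms:.1f} ms less than {prev} FPS)"
    print(f"{cap:>3} FPS -> {frame_ms:5.1f} ms{note}")
```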


Now I'll deviate into the power consumption differences on my 3090, alongside the render times above (58 FPS: 6.7 ms, 120 FPS: 4.8 ms).

Uncapped undervolt: 210 FPS, 243 W (GPU alone)

Uncapped stock: yeah, it went up in latency, and 369 W


And from here on we gain some perspective on how f*cked up Ampere's stock settings are, as well as why the hell it's not worth it.

120 FPS, stock clocks: 328 W


120 FPS, but undervolted: 208 W


58 FPS, undervolted: 147 W




TL;DR: 369 W vs 147 W, to reduce render latency from ~6 ms to ~5 ms.

Oh yeah. Totally worth it.

JUST CAP YOUR DAMN FPS.
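If you want the TL;DR arithmetic spelled out, here's a quick sketch using the numbers above (the ~5 ms uncapped figure is read off the TL;DR line, so treat it as approximate):

```python
# Power vs render latency for the two extreme configs measured above (3090, Unigine Heaven).
uncapped_stock   = {"watts": 369, "render_ms": 5.0}   # approximate, per the TL;DR
capped_undervolt = {"watts": 147, "render_ms": 6.7}   # 58 FPS cap + undervolt

extra_w = uncapped_stock["watts"] - capped_undervolt["watts"]
saved_ms = capped_undervolt["render_ms"] - uncapped_stock["render_ms"]
print(f"Uncapped stock burns {extra_w} W more "
      f"({extra_w / capped_undervolt['watts']:.0%} extra) to shave ~{saved_ms:.1f} ms of render latency")
```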
 
That's interesting. I think I'm gonna test it myself, too, though I don't think I'll feel much, as I'm used to 30-60 FPS gaming in general (I'm not a very fast paced person). :ohwell:

I'm a nutjob and slide my mouse and keyboard left and right, toggling which display I'm using based on mood.
Starcraft II? Might as well go 4K, since it's single threaded. Killing Floor 2? Oh god, gimme that 165 Hz, because the game engine LOVES high FPS.
I'm a nutjob too and change my PC configuration based on my mood.
I had a 7700 (non-K) that I changed for a 5950X. Then I realized that keeping a £900 CPU below 30% usage at all times isn't a very good investment, so I sold it and swapped for an 11700 (non-K). I'm currently running it with a 6500 XT because I love silence, but I also have a 2070 on the shelf just in case I need more performance later. Or maybe I'll sell one of them, and buy an Intel Arc just because I'm curious. :D
 
@AusWolf, edits were done with the power consumption reported by HWiNFO added in.
I can literally throw 222 W at my GPU to reduce latency by 1 ms.

You got the summary; I had a lot of MS Paint work and settings to change in the NVCP for that, so I hope people understand WHY we undervolt modern hardware and why EVERYONE should run a GD FPS limiter.


60 Hz is ~16.7 ms per frame - if your input latency goes above that, it'll feel laggy, Wolf. I'm not sure if AMD can measure it with any of their software, or if RTSS can display it through Afterburner's OSD.
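For reference, the frame budget (one refresh interval) at common refresh rates - a quick sanity check for whether a given latency number will be noticeable:

```python
# One refresh interval per rate; total input-to-photon delay beyond this starts to feel laggy.
for hz in (60, 72, 75, 120, 144, 165):
    print(f"{hz:>3} Hz -> {1000 / hz:5.1f} ms per refresh")
```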
 
To be honest, I didn't even know that Nvidia had tools to measure it. It just shows what kind of a gamer I am, I guess. :laugh:

If I do any testing, it'll only be based on how the game feels, as that's the only thing I can test besides FPS. Like I said, I doubt I'll feel much of a difference. You might, but I'm pretty sure I won't. :ohwell:

I still don't think FPS capping is the way. I think not spending money on a graphics card that you don't need is the way. If your 3090 can do 200 FPS, but you're capping it to 120, then the rest of it is wasted money. A 3070 could have done a 120 FPS cap as well for a lot cheaper, and you wouldn't have had to bother with undervolting to achieve the same 209 W power consumption - just move the power slider all the way down and call it a day.
 
You can disable Vsync or use Enhanced Sync; your 6500 XT is no 375 W monster.

With the top-tier cards, it's just madness to throw 200 W at no actual improvement. Everyone's used to how things behave when your GPU isn't fast enough; there's a lack of knowledge about when it's TOO fast.
It's entirely possible for a user on a 60 Hz display to turn all the RTX bells and whistles on, lower their FPS below the refresh rate, and have everything not just feel faster, but actually BE faster. Vsync is messy.

Edit: RTSS settings/stats don't match the Nvidia ones; the YouTube videos above cover it in far more detail and do specifically include AMD testing and results.
 
I see what you mean. I don't have much experience on what it's like when something is too fast, but I'll take your word for it.

On the other hand, it's pointless to throw a top tier card at a 60 Hz display. There was a time when I did that because 1080p 60 Hz was the gold standard, and you needed top tier hardware to play at Ultra graphics at decent framerates, but that's not the case anymore. People complain a lot about GPU power consumption, but they forget that 3080-90 level cards aren't meant for mainstream gaming. They're 4K and/or high refresh rate territory. My 6500 XT is a strange bird, but a 6600 (XT) or 3060 (Ti) should be enough for anything anyone with a 60 Hz (edit: 1080p) monitor could want (maybe except RT Ultra without DLSS/FSR).

This is why I don't recommend FPS capping - but heavily recommend adjusting one's shopping list to their needs, and not vice versa.

Edit: RTSS does in fact have render time stats, so if you use Afterburner you can throw that in your OSD and do some quick comparisons. RTSS allows you to adjust the FPS cap in real time, as well.
I don't use Afterburner, but I could install it for a few tests, I guess. :ohwell: Or use my 2070 instead of the 6500 XT. :D
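For anyone curious what an FPS limiter actually does under the hood, it boils down to a loop like this - a very simplified sketch; RTSS and the driver cap pace frames far more precisely (busy-waits, frame-time smoothing), and render_frame here is just a placeholder:

```python
import time

def run_capped(render_frame, fps_cap=120):
    """Naive FPS limiter: render, then sleep away the rest of the frame slot."""
    frame_budget = 1.0 / fps_cap
    next_deadline = time.perf_counter()
    while True:
        render_frame()                            # stand-in for the game's actual frame
        next_deadline += frame_budget
        spare = next_deadline - time.perf_counter()
        if spare > 0:
            time.sleep(spare)                     # GPU/CPU sit idle here - that's the watts saved
        else:
            next_deadline = time.perf_counter()   # fell behind; don't try to catch up in a burst
```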
 
About this FPS cap and whatnot debate.
When I had my RX 570 and used it with my 75 Hz FreeSync monitor, capping it in the AMD driver to 74 and enabling FreeSync gave me the best experience with Vsync off, except for 1-2 games where FreeSync refused to work properly, so I just Vsynced those.

Now with the GTX 1070 I just cap everything to 75 in the Nvidia driver and call it a day. It works 99% of the time, but in specific games I use Fast Sync on top of that.
Though it needs to be said that I'm not sensitive to input delay at all - heck, I can't even notice it, 'cause I don't play multiplayer titles - but I am very sensitive to any form of screen tearing, and that has to be fixed one way or another.
 
I hope you won't do this. Undervolting helps, but not that much, and you absolutely shouldn't base your buying decisions on undervolting. Realistic undervolting gains are about 5-20% in lost wattage, but that's an average. Your graphics card can have amperage spikes and trip the PSU. And you don't really want your PSU pushed to 100% anyway. One thing is that they wear out over time and lose wattage; another is just plain fan noise at 100% load.
I have a quality 750 W PSU, so that's not the reason I do it personally.
 
Anything is possible, but the doomsday scenario you paint is if the CPU hits max turbo, maxes all or most of its cores, and the GPU does something similar at the same time from just booting the OS. If that can happen then a slightly higher wattage PSU is probably vulnerable as well.
I deleted the post you replied to because you edited yours. Never mind.

You don't need max turbo for a power surge. The sudden rush when you press the power button can be enough. I was dealing with a faulty RTX 2060 a couple weeks ago that did exactly that.
 
I see what you mean. I don't have much experience on what it's like when something is too fast, but I'll take your word for it.

On the other hand, it's pointless to throw a top tier card at a 60 Hz display. There was a time when I did that because 1080p 60 Hz was the gold standard, and you needed top tier hardware to play at Ultra graphics at decent framerates, but that's not the case anymore. People complain a lot about GPU power consumption, but they forget that 3080-90 level cards aren't meant for mainstream gaming. They're 4K and/or high refresh rate territory. My 6500 XT is a strange bird, but a 6600 (XT) or 3060 (Ti) should be enough for anything anyone with a 60 Hz (edit: 1080p) monitor could want (maybe except RT Ultra without DLSS/FSR).

This is why I don't recommend FPS capping - but heavily recommend adjusting one's shopping list to their needs, and not vice versa.


I don't use Afterburner, but I could install it for a few tests, I guess. :ohwell: Or use my 2070 instead of the 6500 XT. :D
4k users say "hello, high refresh rates are super expensive"


Also, 1440p 165 Hz doesn't help if you play less demanding or older games. Among Us shouldn't suffer from 30 ms render times, OR pull 400 W from a GPU.
But it totally can, if you just run without an FPS cap.
 
Well I can edit my post as well, and we forget the convo happened :).
 
No need. The purpose of an online forum is to learn - not only for us, but also for people who come around a couple months or years down the line looking for info. :)
 
About this FPS cap and whatnot debate.
When I had my RX 570 and used it with my 75 Hz FreeSync monitor, capping it in the AMD driver to 74 and enabling FreeSync gave me the best experience with Vsync off, except for 1-2 games where FreeSync refused to work properly, so I just Vsynced those.

Now with the GTX 1070 I just cap everything to 75 in the Nvidia driver and call it a day. It works 99% of the time, but in specific games I use Fast Sync on top of that.
Though it needs to be said that I'm not sensitive to input delay at all - heck, I can't even notice it, 'cause I don't play multiplayer titles - but I am very sensitive to any form of screen tearing, and that has to be fixed one way or another.
Fast Sync fixes the issue, but for it to work perfectly your FPS needs to be double your refresh rate, or it adds a small amount of input latency. The closer you are to double the refresh rate, the smaller that delay is.

With 75 Hz, you'll get the best render latency at 73 FPS, OR, if you use Fast Sync, you should aim for 150 FPS. Anything in between just adds some input latency for no gain.
(In my case, I've got hybrid settings because of two displays with different refresh rates - Fast Sync works well at 144 FPS when the slowest screen is 72 Hz.)

In some of the examples I gave earlier, that's the kind of thing you see as a 5 ms to 6 ms difference, so it's still worlds better than sitting at the Vsync limit, but it's only really worth using if you want to run 1.5x-2x your refresh rate.
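If you want those rules of thumb in one place (and they are just this thread's rules of thumb, not anything official from Nvidia or AMD):

```python
# Cap a couple of FPS under refresh for plain Vsync/Freesync/Gsync,
# or aim for roughly double the refresh rate if you run Fast Sync.
def suggested_cap(refresh_hz, fast_sync=False):
    return 2 * refresh_hz if fast_sync else refresh_hz - 2

print(suggested_cap(75))                   # 73, as above
print(suggested_cap(75, fast_sync=True))   # 150
print(suggested_cap(60))                   # 58
```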


I edited out my RTSS stuff because the content I found was outdated, and the method they used... well, it sucked. Render times went to 30 ms.
 
4k users say "hello, high refresh rates are super expensive"


Also, 1440p 165 Hz doesn't help if you play less demanding or older games. Among Us shouldn't suffer from 30 ms render times, OR pull 400 W from a GPU.
But it totally can, if you just run without an FPS cap.
That's odd. I've never experienced such strange behaviour from an old game (or at least I haven't felt it).
 

I don't mind a little bit of extra latency; like I said, I simply cannot notice it in single-player games, but I really can't stand tearing.
In most games the FPS cap alone gets the job done, but in certain games I see tearing, and then I have to make a separate profile for those games with Fast Sync - then it works.

I kinda miss having FreeSync, tbh. I tried adaptive sync with Nvidia, but that caused some serious issues with my setup/monitor, so I ditched the idea and went back to capping FPS the basic way.

I will give that 73 FPS cap a try, though. :)
 
It's easy to reproduce; it's just easier to see on Nvidia, thanks to GFE and the whole Nvidia Reflex push giving us fancy tools.
Turning the 'advanced' performance overlay on gives you a world of info (this was taken on the Windows desktop, so some readings are not active).

Remember that this is biggest in situations where the GPU is total and utter overkill, with Vsync enabled (that includes G-Sync). If your FPS is fluctuating at all and not locked at 60/your refresh rate, you won't get the huge delay added.

Both my displays are FreeSync but not officially G-Sync compatible, and they don't do great when it's enabled - fun flickering.
I found all this stuff out gaming with my brother and JC316 here from TPU, changing games around over the last... 10? 15? years, as we'd often have one of the three of us complaining about input lag in various games - and we slowly narrowed in on the issue, since JC was on 60 Hz. Vsync off fixed it but added tearing... then the opposite happened with Dead Island Riptide. The original Dead Island was peachy the whole game, but in the sequel I was suddenly getting microstutter and massive frametime spikes - and I was the only 165 Hz user.
My brother had capped himself to 90 FPS since he was in a heatwave, while it was colder where I was.
That game engine had serious issues above 120 FPS, and ever since then I've paid attention to various engines and their bloody weirdness if you go above 60 FPS (yay for console ports!).
Dying Light 1 was amazing graphically, and I could run 1440p at 165 Hz/FPS easily, but I got those same stutters in the easier-to-render areas. I tried 120 FPS: problem free. Worked it up to 163 FPS problem free, and then found that the YouTuber above and Blur Busters had already figured it out years before me.

That all ties back into the goal of this thread from then on: how do I get the most out of my hardware, when I can throw 200 W+ extra at it and get WORSE performance, with render latency/stuttering in titles that hate high FPS, or at Vsync limits? Or just Ampere burning 200 W+ for no freakin' gains.
 
I think the more impressive numbers would be your load wattages, just to see how much they actually differ :)
Well, you made me do it :laugh: I tested the FX8300 + HD7970 (both shameless juicers) in Fall Guys @ 1080p with maximum detail.

Overclocked and uncapped
CPU 4m/8t @ 4500 - GPU @ 1135/1650, +20% power limit: 113 FPS avg, 418 W peak power
Underclocked and 60 FPS cap
CPU 1m/2t @ 4100 - GPU @ 500/700, -20% power limit: 51 FPS avg, 211 W peak power

For a casual game where all you do is run and jump, 51 FPS feels OK.
 
Both my displays are FreeSync but not officially G-Sync compatible, and they don't do great when it's enabled - fun flickering.

Yep, I had some really weird flickering and even random blinking, so I just noped out of that.

Anyway, I will chip in with some undervolt data. This is my 1070 at stock settings and with the 975 mV undervolt I have applied in MSI Afterburner via a custom curve.
All of the pics will be clickable thumbs.
Stock:

975 mV undervolt:


As you can see, that alone shaved off ~20 W and 3 °C on my GPU, and also 5% fan speed.
Mind you, it's pretty damn hot in my room atm, 'cause summer with no AC. :laugh:
Heck, I even have a stable +50 MHz boost on the core with the UV setup. Sure, it doesn't really translate to more FPS, but hey, it's free performance for less power draw and heat, so I will take it. :)

Maybe I could UV/OC a bit more, but eh, I can't be arsed, and it's rock stable this way. (It's an old second-hand GPU; I don't want to abuse it.)
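If you're on Nvidia and want to watch the draw change while testing a curve, nvidia-smi can log roughly the same numbers HWiNFO/Afterburner show (assuming it's on your PATH). Note that a driver power limit isn't the same thing as a curve undervolt, but it's the quickest way to cap a card's draw for a before/after comparison:

```python
import subprocess

# Read the current board power draw, power limit and graphics clock from nvidia-smi.
result = subprocess.run(
    ["nvidia-smi",
     "--query-gpu=name,power.draw,power.limit,clocks.gr",
     "--format=csv,noheader"],
    capture_output=True, text=True, check=True)
print(result.stdout.strip())

# To hard-cap board power instead of (or on top of) a curve undervolt,
# something like this works, but it needs admin rights:
#   nvidia-smi -pl 130
```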
 
Wow, this thread took another turn in an interesting direction with input latency etc. Maybe you can make a separate thread, Mussels, specific to frame caps, input latency etc.? I can contribute some, but you know more than me :)

Exactly - like buying AMG or M Power cars and then driving 50 km/h in the city. Clowns, really.


One would have to be a real idiot or enthusiast to buy a 900 W GPU for real.
Driving much faster than 50 km/h may make your licence go away, and then it's like having an awesome PC without a mouse and keyboard ;)

Noise and other workloads (mining, in my case) make an overpowered GPU worth it for me, but if you purely game at 1080p 60 FPS and want an efficient card, getting an RX 6600 may be a better choice.

My point is more about the potential many people have. For instance, say you have a 4K 60 Hz monitor and a 3080. Uncapped you get 70-90 FPS in a given game; you undervolt your card to 1900 MHz @ 875 mV and save 80 W there at identical performance, and noise drops from 38 to 34 dB. Put on a frame cap at 58 FPS: input latency is identical, but consumption is another 60 W lower. You could go 1650 MHz @ 750 mV and save another 60 W; FPS drops to 65-82, but you don't notice it due to the frame cap.

Both scenarios give you identical perceived smoothness, input lag etc., but one saves you 140-200 W, and noise is significantly lower.

In FPS shooters etc. I would go higher than a 60 FPS cap if my monitor had a higher refresh rate.

Comparison to cars: your MB AMG cruises steadily on the freeway at 110 km/h. You can run it in 4th gear at 4000 rpm using 1.0 l per 10 km, or in 6th gear at 2200 rpm using 0.7 l per 10 km. If the need to accelerate comes, you gear down to 3rd and put the pedal to the metal anyway. Which do you prefer?
 
I worry more about my fridge; I should really get a more efficient one.
 
Who said it's maxing out at 50%? Do some research and see for yourself: PSU reviews online clearly indicate that around 50% load is peak efficiency for the design of ATX & SFF units.
And when it idles, you drop way below 80% efficiency.
 
It's easy to reproduce; it's just easier to see on Nvidia, thanks to GFE and the whole Nvidia Reflex push giving us fancy tools.
Turning the 'advanced' performance overlay on gives you a world of info (this was taken on the Windows desktop, so some readings are not active).

Remember that this is biggest in situations where the GPU is total and utter overkill, with Vsync enabled (that includes G-Sync). If your FPS is fluctuating at all and not locked at 60/your refresh rate, you won't get the huge delay added.
Ah, I see. Well, I don't really have an overkill GPU at hand (not since they cost 1,000+ USD), but perhaps I could play Half-Life on my 2070. :D

I've never liked traditional Vsync, to be honest. If capping FPS to 60 is the same kind of experience, then I think I know exactly what you're talking about. The input lag traditional Vsync adds is huge, even for a casual gamer like myself. Fast/Enhanced Sync is brilliant, though. No screen tearing above 60 FPS and no input lag. It's just that it doesn't reduce power consumption, but this is where what I said comes into play: don't waste money on a GPU that you don't need. ;)

And when it idles, you drop way below 80% efficiency.
Yes, but at idle we're talking about maybe 50 W total system power consumption. Maybe. With 80% efficiency, that's about 62 W at the wall. That's not what's killing your wallet.
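The maths behind that, in case anyone wants to plug in their own numbers (the efficiency figures here are just assumptions - check a review of your PSU for its real curve):

```python
# What you pull from the wall for a given DC load at a given PSU efficiency.
def wall_draw(dc_watts, efficiency):
    return dc_watts / efficiency

print(f"{wall_draw(50, 0.80):.1f} W from the wall for a 50 W idle load at 80% efficiency")
print(f"{wall_draw(400, 0.90):.1f} W from the wall for a 400 W gaming load at 90% efficiency")
```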
 
I actually think it's unusual to run hardware maxed out. I see all the people with 99% GPU utilisation and I'm like, wtf.

But I can answer the question.

I have a 3080; I play at 60 FPS maximum. I undervolt.

So why have the latest tech and such a high-level GPU?

Well, in some games I also run SGSSAA or 4K downsampling, so a lower GPU like, say, a 1070 doesn't cut it.
GPUs become less power efficient the higher the clocks and the higher the utilisation; his saving was actually quite conservative.
Next, you're looking at the cost of electricity in America, which, it's fair to say, is in a completely different world to Europe.
Hardware requirements of course also vary per game. I might play a lot of games where the 3080 is jogging at worst, but then put on a game which makes it sweat. Some of us are variety gamers, and don't just spec out our PC for one game. FF7 Remake is an interesting example: for most of the game the 3080 sits at low 3D clocks and no higher than 60% utilisation, but every time I use a limit break, the particles required spike it to the 90s at max clocks. A lower-end GPU would stutter every time I use a limit break, but because I cap the frame rate, the rest of the time it's running in a much lower power/heat state.
I have access to the latest NVENC, hardware integer scaling and the latest DLSS stuff as well, so buying a GPU isn't as simple as just buying one fast enough to hit high frame rates.

Undervolting can also actually increase performance, as you're less likely to hit power limits, as well as improve hardware longevity, the level of cooling required, and so on.

For reference, I spend circa £50 ($61) a month now on electricity to use my PC. I think with no CPU/GPU undervolt and uncapped FPS it would easily be double that; one year's savings paid for my RTX 3080.
Actually, it also depends on *where* in America you live. Electricity is not cheap in California either.
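A quick ballpark of what a couple of hundred saved watts is worth over a year - the hours per day and unit price below are pure assumptions, so plug in your own tariff:

```python
# Yearly cost of an extra chunk of GPU power (all inputs are assumptions).
watts_saved   = 200     # e.g. uncapped stock vs capped + undervolted
hours_per_day = 4       # assumed gaming time
price_per_kwh = 0.30    # assumed unit price in GBP

kwh_per_year = watts_saved / 1000 * hours_per_day * 365
print(f"{kwh_per_year:.0f} kWh/year, roughly £{kwh_per_year * price_per_kwh:.0f}/year")
```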
 
Aww man, no wonder we disagreed on GPU undervolting in other threads - the 1080 Ti is one of the most power-efficient cards out there. My regular GTX 1080 is a fucking champ, but even overclocked, its total wattage was *nothing* like the madness the 3080 and 3090 suffer from.

It was seriously an epiphany moment to compare my 1080 experience and your 1080 Ti against the 3080 I had that died, and now the 3090.
You can overclock and add 30 W for 5% gains.
Me? My card's already overclocked out of the box, adding 100 W for those 5% gains, and there already isn't enough power for the entire board - I can't OC the VRAM without power-starving the GPU on a 350 W BIOS. If I overclock the GPU, the VRAM clocks go down to power the GPU. It's madness.

You've simply missed out on the joys of the new cards not being as good in that aspect - and we want the cards to cut back to be just as power efficient as what you're enjoying.

I never realised the 1080 Ti was that good power-wise. I would really like a 3070/80, but it's still a bit too much ££ for me. I don't mind the 1080 Ti though; even at 1440p it's not too bad.
 
Ah, I see. Well, I don't really have an overkill GPU at hand (not since they cost 1,000+ USD), but perhaps I could play Half-Life on my 2070. :D

I've never liked traditional Vsync, to be honest. If capping FPS to 60 is the same kind of experience, then I think I know exactly what you're talking about. The input lag traditional Vsync adds is huge, even for a casual gamer like myself. Fast/Enhanced Sync is brilliant, though. No screen tearing above 60 FPS and no input lag. It's just that it doesn't reduce power consumption, but this is where what I said comes into play: don't waste money on a GPU that you don't need. ;)


Yes, but at idle we're talking about maybe 50 W total system power consumption. Maybe. With 80% efficiency, that's about 62 W at the wall. That's not what's killing your wallet.
That's the key - the input lag of Vsync ONLY kicks in when you hit the Vsync limit. The FPS cap prevents that lag entirely.

I never realised the 1080 Ti was that good power-wise. I would really like a 3070/80, but it's still a bit too much ££ for me. I don't mind the 1080 Ti though; even at 1440p it's not too bad.
If we ran an FPS-unlimited situation, you'd use ~180 W where I'd use 375 W for the same end result. (Some BIOSes would allow 450 W on their 3090s.)

That sort of difference changes perspectives.

Actually, just had a thought while playing Starcraft II.
Again... sorry AMD folks, Nvidia makes this easier.

I'm able to peak at 165 FPS, but the game itself doesn't let that happen often (SC2 is DX9 and single threaded; hell, it only barely became 64-bit) - so the 99% FPS is a lot lower, and everyone's FPS tanks as the matches get bigger, no matter the hardware.

~80 FPS 100% of the time, or 140 FPS 99% of the time with 1% dips?
If I just capped that to 80 FPS, I'd get 80 FPS 100% of the time and have a really smooth game session without any fluctuations, and it'd save whatever GPU and CPU power was being spent rendering the extra frames.
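One way to pick that kind of cap from a log, if you export frame times from RTSS/PresentMon or the overlay (the sample values below are made up purely to show the idea):

```python
# Pick a cap near your 99th-percentile frame time so the pace never fluctuates.
frame_times_ms = [7.1, 7.4, 12.5, 7.2, 13.0, 7.3, 12.8, 7.0, 12.6, 7.2]  # made-up sample

frame_times_ms.sort()
idx = min(len(frame_times_ms) - 1, int(len(frame_times_ms) * 0.99))
p99 = frame_times_ms[idx]
print(f"99th-percentile frame time: {p99:.1f} ms -> cap around {int(1000 / p99)} FPS")
```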
 