
A comparison of stock vs efficient gaming

I am all for NOT wasting power. And power consumption/efficiency is something I do take into consideration when making my purchasing decisions. But I feel some go overboard when it comes to conserving power with their computers. This makes no sense to me - especially for a computer being used for entertainment.

What I find puzzling is how some folks will spend a lot more money for a device that is only 3-5% more efficient. Computer power supplies are a perfect example. It takes many years to make up the difference in cost between a Titanium (94% efficient at 50% load) and a Gold (90% at 50% load) certified PSU. Yet people buy the Titanium, often in part, because they believe they are helping the planet. But the reality is, in many cases, it takes a lot more energy to manufacture those more efficient products.
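Just to put rough numbers on that payback claim, here's a minimal sketch. Only the 94%/90% efficiency figures come from the paragraph above; the load, daily hours, electricity rate, and the $60 price premium are my own assumptions for illustration:

```python
# Rough payback time for a Titanium vs Gold PSU.
dc_load_w = 375                       # 50% load on a 750 W unit (assumed)
gold_wall = dc_load_w / 0.90          # ~416.7 W at the wall
titanium_wall = dc_load_w / 0.94      # ~398.9 W at the wall
saved_w = gold_wall - titanium_wall   # ~17.7 W

hours_per_day = 4                     # assumed heavy-load hours
rate_usd_per_kwh = 0.1447             # US average cited later in the thread
annual_savings = saved_w * hours_per_day * 365 / 1000 * rate_usd_per_kwh
premium = 60.0                        # assumed Titanium price premium
print(f"~${annual_savings:.2f}/year, payback in {premium / annual_savings:.0f} years")
```

With those assumptions the difference is under $4 a year, so the premium takes roughly 16 years to recoup.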

Except to prevent overheating issues that cannot be resolved with conventional solutions, I see no reason to intentionally restrict performance just to save a few pennies.


It may or may not be unusual, IDK. But what seems to be too often forgotten (or perhaps just ignored) is that, percentage-wise, the majority of the time our computers are powered on, they are running closer to idle than to maxed out. Of course there are exceptions - mining rigs, for example. But exceptions don't make the rule.

Gamers are not gaming the whole time they are using their computers and even if they are, games are not maxing out demands 100% of the time. I note too, for example, even if the GPU is running at 99% utilization, it would be very rare for the CPU to also be running at 99% at the same time. Same with drives, RAM, motherboard, etc.

If you live in a house, there most likely are much bigger power hogs than your computer. Refrigerators and freezers are huge hogs that cycle on and off 24/7/365. Space heaters are horrible. Electric ovens, clothes dryers and water heaters love energy. Dishwashers (unless you air dry), vacuum cleaners, and clothes irons are all much worse than most PCs.

And don't forget, most electronics these days don't power off when you press the power button. They just go into standby mode. And one of the worst offenders there is the cable box - especially if it's a DVR too.

IMO, if you are that concerned about saving energy, set your AC thermostat to 75°F instead of 72° and close the drapes and blinds. Set your furnace thermostat to 68°F instead of 72° and wear a sweater. Turn off the lights when you leave a room. Know what you want before you open the fridge door.
In my examples above you save over 40% power on the GPU alone when gaming while still getting 90% of stock performance. Gaming 3 hours a day is not that much; some friends of mine spend 8 hours on WOW. If your power costs $0.30 per kWh, this becomes $150-200 in power cost per year, plus maybe AC cooling cost in a warm climate, extra wear on equipment, more noise, etc.
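For anyone who wants to sanity-check figures like that, here's a quick sketch. The $0.30/kWh rate and the 8 h/day come from the post; the system wattages are assumptions for illustration:

```python
# Annual electricity cost of gaming, stock vs tuned (assumed wattages).
def annual_cost_usd(watts, hours_per_day, usd_per_kwh=0.30):
    return watts * hours_per_day * 365 / 1000 * usd_per_kwh

stock = annual_cost_usd(220, 8)   # assumed whole-system draw, stock
tuned = annual_cost_usd(130, 8)   # assumed draw with UV + fps cap
print(f"stock ~${stock:.0f}/yr, tuned ~${tuned:.0f}/yr, saved ~${stock - tuned:.0f}/yr")
```

Under those assumptions: roughly $193/year stock vs $114/year tuned, so a saving in the $80/year range.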
 
some friends of mine spend 8 hours on WOW
I have a grandson who does the same. My points remain the same. The computer is still spending more time closer to idle consumption rates than maxed out rates.

Plus I think it's important to point out that your price per kWh value is unrealistic. Even with rate increases rampant these days, the average cost per kWh in the US, as of this month, is 14.47 cents.

As for wear and tear and noise - any difference in wear and tear is going to be negligible at best. If the computer is being used 8 hours per day doing normal office tasks, or gaming, there is little difference in wear and tear - except maybe on a hard drive. And while I personally really hate fan noise, the increased power consumption from a fan spinning faster is going to be insignificant too.

Even if a fan would otherwise be off, I note a 140mm case fan spinning at 2,000RPM typically only pulls ~5 watts.

I am NOT saying your points are totally invalid. I am just saying your numbers are unrealistic simply because the power consumption of computers constantly varies widely and, with a few exceptions, rarely sits at or near capacity for extended periods of time.

This is exactly why PC power supplies are so different from the power supplies of other electronics - and exactly why 80 PLUS certifications matter. Those supplies provide a "flat" efficiency curve from 20% load all the way up to 100% load. Other supplies typically have a "bell shaped" curve, with peak efficiency at just one load point.

A big screen TV, for example, puts a relatively consistent load on its power supply. So designers can pick a much less expensive power supply and simply match the load to the supply's peak efficiency point and be good to go.
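A little sketch of that idea, purely illustrative - the curve points below are invented, not measured from any real supply; wall draw is just DC load divided by efficiency at that load:

```python
# Why a "flat" efficiency curve suits a PC: a PC's load swings constantly,
# while a TV sits near one load point. Curve points are invented.
import bisect

def efficiency(load_frac, curve):
    """Linearly interpolate efficiency from (load_fraction, efficiency) points."""
    xs = [x for x, _ in curve]
    i = min(max(bisect.bisect_left(xs, load_frac), 1), len(curve) - 1)
    (x0, y0), (x1, y1) = curve[i - 1], curve[i]
    return y0 + (y1 - y0) * (load_frac - x0) / (x1 - x0)

flat = [(0.2, 0.90), (0.5, 0.92), (1.0, 0.89)]  # 80 PLUS-style PC supply
bell = [(0.2, 0.78), (0.5, 0.92), (1.0, 0.80)]  # generic single-peak supply

for dc_watts in (100, 350, 600):        # near-idle, gaming, stress
    frac = dc_watts / 650               # assumed 650 W rated supply
    for name, curve in (("flat", flat), ("bell", bell)):
        wall = dc_watts / efficiency(frac, curve)
        print(f"{dc_watts:>3} W DC ({name}): {wall:.0f} W at the wall")
```

At the 50% point both supplies look the same; it's the near-idle hours, where PCs spend most of their time, where the bell-shaped curve quietly wastes the extra watts.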
 
I'm also one of those who undervolt and cap FPS to my monitor's refresh rate for various reasons.
Started undervolting when I had my RX 570 and it has stuck with me ever since.

With my 570 it was a must; otherwise it had really unstable/improper boost clocks, and undervolting fixed/stabilized that. Now I'm also undervolting my current GTX 1070 for less power draw/heat, even though it was running fine at stock settings; it's better this way, IMO (no performance loss either).

I'm only using a 75Hz ultrawide monitor and I don't play competitive games at all, so I always cap my FPS to 75; I don't like to stress my system for no reason.
This way, while playing a demanding game, the whole system draws around ~230W from the wall, and I'm comfortable with that.

I will surely do the same with my next GPU (possibly a 6700 XT later this year).
 
I don't underclock or undervolt anything. My GPU is OC'd, as temps are pretty low when gaming; same for the CPU. It just shows how much better a custom loop is than air cooling. Even though the temps are rising here now, my temps are still very acceptable.
As I have said before, I'm not fussed about power use; not gonna cry about it. If you want a powerful gaming rig, accept it, or cripple it to lower power use/temps. I don't have 3 rigs, nor do I mine as some do; maybe it's them that should change their behaviour if they don't like high power use.

My GPU is a 1080 Ti, my monitor is 1440p/165Hz, and I go for the highest FPS I can get in games. 60fps = bleeurgh
 
IMO power bills should be of least concern. I mean, if you can afford a modern gaming PC, then surely you can afford to pay your bills?

But that doesn't mean I don't care about energy consumption personally. To the contrary, I'm all for efficiency and reducing my carbon footprint. When it comes to computers, I pick the components according to my needs, and their cost effectiveness. I game at 1200p60 even though my rig is capable of much higher fps. Playing with an uncapped/unsynced frame rate seems wasteful to me. Even if the temps are fine, the PC will use far more power and dump all the resulting heat into your room. This may not be obvious if you're running air conditioning 24/7, but let's keep in mind that not everyone has or can afford AC.

Some parts of the world have already witnessed record temperatures this year. With high ambients and no AC, a rig consuming 500W+ is likely to raise your room temperature to 30°C+ in a short time. And no, it won't go down after you're done playing, unless the outside temperature is significantly lower.
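As a sanity check on that claim, here's a deliberately pessimistic back-of-envelope sketch. The room size is assumed, and heat loss through walls and windows is ignored, so a real room warms far more slowly - this is just an upper bound:

```python
# Upper-bound estimate: every watt the PC draws ends up as heat in the room.
room_volume_m3 = 4 * 4 * 2.5          # assumed 40 m^3 room
air_heat_capacity = 1200.0            # J/(m^3 * K), approx. for air
pc_watts = 500

deg_c_per_hour = pc_watts * 3600 / (room_volume_m3 * air_heat_capacity)
print(f"~{deg_c_per_hour:.0f} °C/hour in a perfectly sealed room")  # ~38
```

Real rooms leak heat and have furniture soaking it up, but even a small fraction of that figure adds up fast over a long session.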

Undervolting/underclocking your hardware and syncing your frame rate will reduce component and ambient temperatures, lower energy consumption, and cut down on fan noise when gaming. Still, most of us don't spend our entire day gaming. This is when idle/low-threaded power consumption matters. Luckily, modern computers do well in that respect and generally draw less than 100W in such scenarios.
 
So you buy a $500 GPU and let it run at lower performance just to save $20-100 a year?
It's not as simple as that. It's more like a 5% drop in performance at 15-20% less power for the GPU, or even more if you like old games.
You still get all of your cores and memory, you still get your min and, mostly, your avg FPS; the only thing that usually suffers is the max FPS, which on my 60Hz monitor barely matters.
And in the case of a 3080/3080 Ti, it'll help a lot with thermal throttling (VRAM overheating is a plague). Plus, you won't have to sweat your balls off after a 2-hour gaming marathon on your day off.
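Put as perf-per-watt, that trade looks like this (illustrative arithmetic from the figures above, not measurements):

```python
# Perf-per-watt gain from "5% less performance for 15-20% less power".
for power_cut in (0.15, 0.20):
    perf, power = 0.95, 1.0 - power_cut
    print(f"-{power_cut:.0%} power: {perf / power - 1:+.0%} perf/W")
# -15% power: +12% perf/W
# -20% power: +19% perf/W
```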
At one point I had a BIG wake-up call when I accidentally had both monitors and my PC connected on the battery side. When a blackout came during gaming, I had much less time than I thought I did...
For me it was an occupational necessity. I have a server rack with a few work & personal servers in it, so I had to make it as efficient as possible so I wouldn't have to spend a few grand on a big-ass rackmount UPS (just regular big-ass consumer UPSes). This saved my ass in April, when a missile hit a substation nearby - no power or water for over 24 hours, but at least I had lights, internet, and a fully charged phone. Could've been worse.
Ditto for me - noise concerns in an HTPC rather than temperature concerns but noise and temperature are both related to GPU power draw.
What's funny is that at that time, for me, it was the CPU. A teensy-weensy i3-6100 :D
I used to have a Lian Li Q11B, and it had this weird construction choice where the PSU serves as the exhaust for the CPU, because it sits right above it.
My old GTX 950 was doing just fine, even in the middle of a 40°C day (I had no AC and lived on the 2nd floor of a house, which means it got even hotter in my room). But the CPU was choking on its own heat all the time, except in winter.
Or just not fill their room with heat in the first place, only to burn more energy just to vent it.
+1. I only have AC in the workshop, but the electricity is 4-5 times more expensive (different tariffs for private/business category).
 
It's not as simple as that. It's more like a 5% drop in performance at 15-20% less power for the GPU, or even more if you like old games.
You still get all of your cores and memory, you still get your min and, mostly, your avg FPS; the only thing that usually suffers is the max FPS, which on my 60Hz monitor barely matters.
And in the case of a 3080/3080 Ti, it'll help a lot with thermal throttling (VRAM overheating is a plague). Plus, you won't have to sweat your balls off after a 2-hour gaming marathon on your day off.
I mean you're only talking about the undervolt part while the OP is literally cutting down the performance by limiting the FPS just to save power
 
I have a grandson who does the same. My points remain the same. The computer is still spending more time closer to idle consumption rates than maxed out rates.

Plus I think it's important to point out that your price per kWh value is unrealistic. Even with rate increases rampant these days, the average cost per kWh in the US, as of this month, is 14.47 cents.

As for wear and tear and noise - any difference in wear and tear is going to be negligible at best. If the computer is being used 8 hours per day doing normal office tasks, or gaming, there is little difference in wear and tear - except maybe on a hard drive. And while I personally really hate fan noise, the increased power consumption from a fan spinning faster is going to be insignificant too.

Even if a fan would otherwise be off, I note a 140mm case fan spinning at 2,000RPM typically only pulls ~5 watts.

I am NOT saying your points are totally invalid. I am just saying your numbers are unrealistic simply because the power consumption of computers constantly varies widely and, with a few exceptions, rarely sits at or near capacity for extended periods of time.

This is exactly why PC power supplies are so different from the power supplies of other electronics - and exactly why 80 PLUS certifications matter. Those supplies provide a "flat" efficiency curve from 20% load all the way up to 100% load. Other supplies typically have a "bell shaped" curve, with peak efficiency at just one load point.

A big screen TV, for example, puts a relatively consistent load on its power supply. So designers can pick a much less expensive power supply and simply match the load to the supply's peak efficiency point and be good to go.
If you look at my examples, the components are not maxed out when testing, but WOW can be quite demanding on both GPU and CPU, and if you game 8 hours a day without an fps cap, consumption will be far higher than idle most of the time. In Europe, where I live, many countries have had energy prices above $0.30/kWh since last November; some have these prices all year. The absolute max consumption of my PC is 340W with GPU+CPU stress. Cyberpunk running 1080p high is quite close to that; games like Fortnite are quite a bit below.

I mean you're only talking about the undervolt part while the OP is literally cutting down the performance by limiting the FPS just to save power
Noise is also one reason :)
 
In warmer climates like where I am, daily temps can go over 40 degrees Celsius; UV and UC make a huge difference in temps on both the CPU and GPU.

Even if you run water cooling, the radiator is already warm to the touch when the system is switched off, and the fans blowing hot ambient air onto it make things even worse.

There is no reason for my GPU to push 150 fps when my screen is 1080p 75Hz. Same for most people with 144Hz: you don't need 300fps from your GPU in "most" games; it's just wasted power.
 
So you buy a $500 GPU and let it run at lower performance just to save $20-100 a year?
Why not just buy a lower grade card with lower power draw?
As long as you're happy with the frame rate, yes; you might even get a few extra years out of the card as games get more demanding.
 
You can do whatever you want; I play Anno and cap at 60fps. I take nothing from 144Hz in that game; that was not my point. My point is that your surprise at someone using the GPU at 99% makes no sense.
There are no moving parts on a GPU besides the fans; this is nothing like a fully revved car. And my fans don't spin anywhere near 100%, even at 99% utilization. Not even with the hot 5700 XT did I let the fans spin at 100%.
Buying a bigger GPU for "headroom"? I'm not a fan of future-proofing; it doesn't make sense, it's money lost. And buying for headroom makes even less sense to me.

Buy what you need. If there's a new game you can't play, sell the card and buy another; it will even be more efficient.
For GPUs, future-proofing, as in buying something you don't need to upgrade in 2 years, may not be dumb. Those who bought a 1080 Ti in 2017 for $600-700 still have a good card, except for 4K. Little did they know then about the mining to come (except in late 2017), the horrible pricing of the slightly faster Turing, or the non-inventory of Ampere to come ;) I'd rather buy a good GPU today and keep it for 3-4 years, which I may do with my 3060 Ti (which I earned money on due to mining). Some future-proofing may be smart in the GPU world if you aim for 3070-ish max.

Example with prices where I live for a 1440p gamer:
Buy a 1070 for $400 in 2017, or a 1080 Ti for $700.
Sell the 1070 for $300 in 2019 and buy a 2070S for $600.

Sell the 2070S for $400 (or the 1080 Ti for $400) in 2022 and buy a 3080 for $900.

It nets about the same, but you don't have to buy/sell every 2-3 years.
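Totalling both paths with the prices above (a quick sketch):

```python
# Net outlay of the two 1440p upgrade paths, prices from the post.
path_a = [("buy 1070", -400), ("sell 1070", +300), ("buy 2070S", -600),
          ("sell 2070S", +400), ("buy 3080", -900)]
path_b = [("buy 1080 Ti", -700), ("sell 1080 Ti", +400), ("buy 3080", -900)]

for name, path in (("upgrade twice", path_a), ("keep the 1080 Ti", path_b)):
    print(f"{name}: ${-sum(v for _, v in path)} net")  # $1200 either way
```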
 
Meh, power consumption be damned. When I game I want it to go as fast as it can, not beat around the bush. The only thing I do is adjust EDC, TDC and PPT and run a static all-core OC of 4750 at 1.3V; otherwise it's full steam ahead. Does it use a lot of power? Nope, not really. Does it heat up my lounge? Mmmm, nope - I still have to run the heat pump 24/7 in winter to stay warm. I'm paying 0.27NZD per kWh plus a transmission fee of 0.30NZD per day; over winter my monthly power bill is $170NZD, and in summer it's $145-ish depending on whether or not I run the heat pump in cooling mode.
 
Meh, power consumption be damned. When I game I want it to go as fast as it can, not beat around the bush. The only thing I do is adjust EDC, TDC and PPT and run a static all-core OC of 4750 at 1.3V; otherwise it's full steam ahead. Does it use a lot of power? Nope, not really. Does it heat up my lounge? Mmmm, nope - I still have to run the heat pump 24/7 in winter to stay warm. I'm paying 0.27NZD per kWh plus a transmission fee of 0.30NZD per day; over winter my monthly power bill is $170NZD, and in summer it's $145-ish depending on whether or not I run the heat pump in cooling mode.
Then this post is not for you ;) Do what you like with your setup, my guide was meant for those interested in increasing efficiency, lowering noise etc :)

I mean you're only talking about the undervolt part while the OP is literally cutting down the performance by limiting the FPS just to save power
Many people run 4K 60Hz monitors, 1080p 75Hz monitors, etc.; for those, running uncapped is a real waste. Also, in games like Age 4, going above 60fps makes virtually no difference, whereas Fortnite etc. is noticeably better at high fps IF your monitor supports it. UV with stock performance at lower power is also quite easy: running Ampere at 1900-1950MHz @ 900mV gives you stock performance at 20% lower power draw :)
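A rough sketch of why that works: dynamic power in the GPU core scales roughly with frequency × voltage². The 1.05 V stock voltage below is my assumption for illustration, not a measured figure:

```python
# Switching power scales ~ f * V^2, so a voltage cut pays off twice over.
def rel_core_power(freq_mhz, volts, ref_freq=1905.0, ref_volts=1.05):
    return (freq_mhz / ref_freq) * (volts / ref_volts) ** 2

uv = rel_core_power(1905, 0.90)  # same clock, 900 mV instead of ~1.05 V
print(f"core power at 1905 MHz / 900 mV: {uv:.0%} of stock")  # ~73%
# Memory and fixed board power don't scale with core voltage, so the
# whole-card saving lands nearer the ~20% quoted above than the full ~27%.
```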
 
I mean you're only talking about the undervolt part while the OP is literally cutting down the performance by limiting the FPS just to save power
No, as I said - it's more complicated than that.
I don't even touch voltages on my GPU. Something as simple as tweaking PL and temperature limits affects quite a few things, like how far your GPU can boost and how often it does.
And something as simple as turning on VSYNC or enabling a frame cap can drastically reduce load on the GPU as well. As I said - I have a 4K 60Hz monitor, and it makes very little to no difference to me if the game is running at 60 or 666 FPS - it still looks the same. Input lag is not critical to me - I quit playing online shooters ever since Quake Champions went to shit, and on newer titles it matters even less: NV and AMD both have "Anti-lag" features, and more and more game engine devs are working hard on decoupling input from the rendering pipeline (notably Epic).
So, I'm on the OP's side here. I'd rather skip rendering frames I won't see anyway and get tangible benefits than render everything to my GPU's full potential for questionable or highly subjective benefits.
And to the point of "getting a cheaper GPU": with a cheaper GPU I have no headroom, while with a more expensive one I can always just press a button. I have the performance when I need it. Same reason why people buy R9s/i9s.
 
What I find puzzling is how some folks will spend a lot more money for a device that is only 3-5% more efficient. Computer power supplies are a perfect example. It takes many years to make up the difference in cost between a Titanium (94% efficient at 50% load) and a Gold (90% at 50% load) certified PSU. Yet people buy the Titanium, often in part, because they believe they are helping the planet. But the reality is, in many cases, it takes a lot more energy to manufacture those more efficient products.
I bought a Platinum rather than Gold Seasonic PSU. The thing is damn near cold to the touch under load and I've never seen the fan spin up even once. Zero regrets at paying £20 more here. I know people who spend 3x more than that purely on changing the colour of their PSU braided cable sleeving, so "wasted money" is entirely relative.
Plus I think it's important to point out that your price per kWh value is unrealistic. Even with rate increases rampant these days, the average cost per kWh in the US, as of this month, is 14.47 cents.
Well not everyone lives in the US.
Even if a fan would otherwise be off, I note a 140mm case fan spinning at 2,000RPM typically only pulls ~5 watts.
Maybe I'm just noise sensitive, but I find 2000rpm fans noisy as hell. All my case fans are silent ones that run 500-1200rpm max. Personally, I've been happy to pay slightly more for a much quieter system, going back to when there were no motherboard fan controls and you had to use those little Zalman Fan Mate style 3-pin fan controllers that either had a variable resistor or switched the 12V/GND fan inputs to 12V/5V (7V) or 5V/GND (5V). Back then people were suspending 3.5" HDDs in 5.25" drive bays with bungee cord to eliminate case vibrations, and silentpcreview was the only refuge away from "noisy = normal". Things have greatly changed for the better since then for silent PC enthusiasts, but at the end of the day the less heat = less noise laws of physics are still in place, and if you can reduce heat in ways you are happy with, why not? Silence really is golden.
 
I have a 4K 60Hz monitor, and it makes very little to no difference to me if the game is running at 60 or 666 FPS - it still looks the same. Input lag is not critical to me
I think you didn't notice that the OP has a 144Hz monitor. In your case, a similar situation should be capping the FPS to maybe 30 to reduce power draw
 
Before Reflex etc., running uncapped could make a noticeable difference to input lag.
I think you didn't notice that the OP has a 144Hz monitor. In your case, a similar situation should be capping the FPS to maybe 30 to reduce power draw
As I said earlier, I cap fps at 140 in Fortnite, Apex and other fast shooters; in Age 4 and Elden Ring (capped at 60 by default unless you mod it), 60 works fine :) 30fps is not perceived as smooth, whereas 60fps is, so that is a bad comparison.
 
So you buy a 500$ GPU and let it run at lower performance just to save 20~100$ an year?
Why not just buy a lower grade card with lower power draw?
All he's doing is turning on vsync, which is what a lot of people do regardless, either because their monitor only supports 60/75Hz, or because they have a 144Hz monitor but play mostly slower single-player games (thus capping it somewhere in the 60-90fps range isn't a big deal).

As to "why buy more GPU if it's going to run lower performance/FPS" the reason is it' gives you more flexibility in regards to turning on advanced settings, it gives you the extra horsepower you need for games that aren't well optimized, and arguably most importantly it gives you a buffer for 1% lows and dips (EX: someone who runs a 1080p 75hz monitor could possibly get by with a 3050/2060 level card but would 100% see dips in the 40s and 50s in some games, why not just buy a rtx 3060ti level card and then NEVER see dips below 75hz ?)
 
Some big power savings there; I imagine lots will look at doing this to save electricity $. It will also be interesting to see how next-gen cards get on with the rumoured 900W power draw in the coming economic environment.
Yeah, Jensen will have to shove RTX 4090 up his arse.
 
Looking at my numbers, obviously vsync saves a crap ton by not wasting resources on frames that serve no purpose.
In games like Tales of Berseria, Nvidia's messed-up power algorithm will idle at boost clocks instead of base 3D clocks, increasing GPU power consumption by roughly 30-40%; to fix that I have a profile that caps max clocks at 1500MHz (only for those types of games).
Games that need a lot of GPU grunt even get a custom curve, as I determined that less power = more performance via less throttling.

I would post my voltage curve profiles here, but most here seem to be overclockers, so I'm not sure anyone cares for it.

Pretty crazy how desperate the vendors are getting rolling out stock curves that grab a few % extra peak performance for 30% more power.

At UK energy prices, the cost savings over 12 months paid for my 3080. By the way, another 54% energy cost increase is coming in October as well.
 
Pretty crazy how desperate the vendors are getting rolling out stock curves that grab a few % extra peak performance for 30% more power.

They do cap GPUs if the cooler is shit. If they go out of their way to fit 3 fans and a kilo of aluminium, they will obviously want to get in front of the competition. Blame reviewers.
 
A good video on transient spikes. Doing something like a small UC or UV can solve random shutdown issues for some.
 
I always have vsync on; my monitor is 165Hz though. Apart from my GPU being OC'd (because I cannot afford any better card), my CPU is running stock with PL1 and PL2 set at 195 watts and a 0.40 undervolt. It is quiet when gaming, and at idle it's silent, even with the custom loop and 7 fans.
 
They do cap GPUs if the cooler is shit. If they go out of their way to fit 3 fans and a kilo of aluminium, they will obviously want to get in front of the competition. Blame reviewers.
I previously had the Palit 3060 Ti Dual OC; it has the same consumption/power limits as my Asus 3060 Ti Dual OC and TUF OC, but the Palit weighs 600g / 25cm / 2-slot, the Asus Dual 1050g / 27cm / 2.7-slot, and the Asus TUF 1150g / 30cm / 2.7-slot. The noise difference is huge and temps are much better on the latter two.
 
I previously had the Palit 3060 Ti Dual OC; it has the same consumption/power limits as my Asus 3060 Ti Dual OC and TUF OC, but the Palit weighs 600g / 25cm / 2-slot, the Asus Dual 1050g / 27cm / 2.7-slot, and the Asus TUF 1150g / 30cm / 2.7-slot. The noise difference is huge and temps are much better on the latter two.

You can visit any TechPowerUp GPU page and see that power limits, clocks, etc. vary a lot; it's a fact.
 