
A comparison of stock vs efficient gaming

I OC everything (except, of course, storage devices); when I pay money for this investment, I want every bit of performance I can get. But I'm not one who spends long hours every day only gaming... when I do, my system "could" be viewed as a small portable heater - especially with a Rocket Lake platform that's infamous for drinking electricity when pushed. Don't give a toss really; I got that type of criticism back in the day when I had an AM3+ system with an OC'd FX-8350 running near 5.0 GHz constantly, every day...
Besides that, it's a matter of relevance. I don't use heaters or AC in my gaming room because my house's passive thermal design takes care of climate management.
So if my system is drinking about 500 W on average, I can't see what the big deal is here anyway... :D
 
You can visit any TechPowerUp GPU page and see that power limits, clocks, etc. vary a lot; it's a fact.
Yes, I only talked about the models I own or have owned. They have power limits of 210-216 W, so very similar, even though cooling varies a lot.
 
I OC everything (except, of course, storage devices); when I pay money for this investment I want every bit of performance I can get. … So if my system is drinking about 500 W on average, I can't see what the big deal is here anyway... :D

Wow, the most honest post in the thread.
 
I can honestly say, trying to run 5 to 10 rigs a day when the sun is good, you do look a little bit at the power draw. I've found with crunching/GPU folding, much like mining, you know the end goal is never going to come, but you're in it for the long haul, so: tweak, find a happy medium and away you go.

Most of my GPUs are at 80% power on air cooling, some maybe less, as even with open-air test benches things get hot and I prefer them not to. I've recently had a bit of a change around with my crunching machines, so now finding the happy medium is a game I have to play again. Because I'm running things at 100% load for most of the time they are on, why not reduce the heat and power that they use? As has already been mentioned about the different power supplies, the difference is so small in comparison to tweaking your system, be it for under- or overclocking, that it's really not making a worthwhile difference. I mean, pay £160 for a 1,000 W Gold PSU, for example, or £230 for a 1,000 W Titanium model: £70 more expense to save a few watts being pulled, when you could save 100 W or more by tweaking the GPU.

Playing games, depending on the panel/refresh and all that jazz, you could set V-Sync and watch your temps and your watts drop like a stone. We all know 1080p is the most common res for gaming, so when you step up to 1440p or 4K, you know you are going to use more juice to run the same game, even with the same hardware.

A good example from me is F@H on my 3090: at 100% load my full system pulls a fair whack, about 560 to 580 W. If I lower the power to 80%, that drops by about 100 W or more, and naturally my temps come down with it as well. Performance? I hardly notice a difference, but thank god my 3090 is fully water-blocked (front and back), so temps are nice and cool: about 40 to 50 °C on the core, maybe up to 60 °C on the VRAM and VRMs etc. (I'd have to look to confirm), but that will just help keep the card running for that bit longer. Well, I hope so.. :)

You run your systems however you need them to run, or however you want, whether it's overclocked or underclocked. As long as they do the job you're asking them to do and are not slowing down, I don't think anything else matters :)
Has anyone in the thread done any power readings from their rigs to compare the power/performance at all? (Base unit I mean, not with all the added toys like screens/speakers and anything else you might use :))
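As a rough sanity check on that PSU-versus-tweaking trade-off, here is a minimal payback sketch. The £70 premium and the 100 W GPU saving come from the post above; the PSU saving, electricity price and daily hours are assumptions for illustration:

Code:
# Rough payback estimate: Titanium PSU premium vs. tweaking the GPU.
# Premium and GPU figures are from the post above; the PSU saving,
# price per kWh and hours per day are illustrative assumptions.
PREMIUM_GBP = 70.0          # extra cost of the Titanium unit
PSU_SAVING_W = 10.0         # assumed "few watts" saved by the better PSU
GPU_SAVING_W = 100.0        # watts saved by tweaking the GPU (per the post)
PRICE_GBP_PER_KWH = 0.30    # assumed electricity price
HOURS_PER_DAY = 8.0         # assumed daily load hours for a crunching rig

def annual_saving_gbp(watts: float) -> float:
    """Annual electricity saving for a constant reduction of `watts`."""
    kwh_per_year = watts / 1000.0 * HOURS_PER_DAY * 365
    return kwh_per_year * PRICE_GBP_PER_KWH

psu = annual_saving_gbp(PSU_SAVING_W)
gpu = annual_saving_gbp(GPU_SAVING_W)
print(f"PSU upgrade: £{psu:.2f}/yr -> {PREMIUM_GBP / psu:.0f} years to pay back £{PREMIUM_GBP:.0f}")
print(f"GPU tweak:   £{gpu:.2f}/yr, for free")

With those assumptions the Titanium premium takes about 8 years to pay back, while the free GPU tweak saves roughly ten times as much per year.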
 
Has anyone in the thread done any power readings from their rigs to compare the power/performance at all?
I have done a lot of power readings from the wall with a plug meter. I tested Cyberpunk at 1080p, DLSS Performance, yesterday again in a different scene than in the starting post; the monitor is not included, just the PC:
1560 MHz @ 700 mV, 10 GHz VRAM
60 fps lock: 142 W
No lock: 138 fps, 185 W
0.72 fps per watt, whole system

1620 MHz @ 731 mV, 15 GHz VRAM
60 fps lock: 144 W
No lock: 154 fps, 205 W
0.75 fps per watt, whole system

Stock (200 W power draw)
60 fps lock: 150 W
No lock: 194 fps, 330 W
0.59 fps per watt, whole system
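For reference, those fps-per-watt figures are just the uncapped readings divided out; a couple of lines reproduce them (numbers taken from the post above):

Code:
# Whole-system efficiency from the wall readings quoted above.
# Note: a straight division of the first profile's spot readings gives
# ~0.75, slightly above the 0.72 quoted, which was presumably averaged
# over a longer run.
readings = {
    "1560 MHz @ 700 mV, uncapped": (138, 185),   # (fps, watts at the wall)
    "1620 MHz @ 731 mV, uncapped": (154, 205),
    "Stock, uncapped":             (194, 330),
}
for label, (fps, watts) in readings.items():
    print(f"{label}: {fps / watts:.2f} fps per watt")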
 
A good video on transient spikes. Doing something like a small underclock or undervolt can solve random shutdown issues for some.
Indeed, that's another benefit: undervolting can make these cards work on lower-spec PSUs than they would otherwise need.

I have done a lot of power readings from the wall with a plug meter. … Stock, no lock: 194 fps, 330 W; 0.59 fps per watt, whole system.
Nice; it's even more pronounced in less demanding games.
I can't find my Berseria notes, but I do have my Time Spy notes.

Profile 1 - Stock card configuration (maxes out at 1905 MHz but cannot hold it due to hitting the power limit)
Profile 2 - 80% power limit, 1500 MHz @ 0.718 V curve
Profile 3 - 80% power limit, 1830 MHz @ 0.850 V curve (this outperforms stock, at around 20% less peak power)
Profile 4 - 100% power limit, 1905 MHz @ 0.882 V curve (unlike stock, this can hold 1905 MHz, so it outperforms stock as well, at around 10% less peak power)

Average power savings are higher, as seen below.

Time Spy data below, uncapped fps:

Code:
Profile 1 - Bottlenecked by power limit
power usage range 315-320 W, average 305 W
score 16155

Profile 2 - Bottlenecked by clock limit
power usage range 135-180 W, average 142 W
score 13300

Profile 3 - Bottlenecked by clock limit
power usage range 240-260 W, average 247 W
score 16866

Profile 4 - Bottlenecked by clock limit
power usage range 270-308 W, average 282 W
score 17642

Time Spy, 60 fps capped

Profile 1
power usage range 106-316 W, average 305 W
score 9793

Profile 2
power usage range 105-153 W, average 148 W
score 9760 (two very brief points dipped to 56 fps)

Profile 3
power usage range 105-230 W, average 197 W
score 9789

Profile 4
power usage range 105-240 W, average 221 W
score 9800

I use Profile 3 day to day, but Profile 4 is a good candidate; in old games like Tales of Berseria, when I switch the driver to "prefer max performance", I will use Profile 2. You can see that at 60 fps capped, the power savings are considerable.
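Out of curiosity, the score-per-watt for those four profiles works out as follows (a quick sketch using the uncapped averages quoted above):

Code:
# Time Spy score per average watt, uncapped runs (figures quoted above).
profiles = {
    "Profile 1 (stock)":             (16155, 305),
    "Profile 2 (1500 MHz, 0.718 V)": (13300, 142),
    "Profile 3 (1830 MHz, 0.850 V)": (16866, 247),
    "Profile 4 (1905 MHz, 0.882 V)": (17642, 282),
}
for name, (score, avg_w) in profiles.items():
    print(f"{name}: {score / avg_w:.1f} points per watt")

Profile 2 lands around 94 points per watt versus stock's 53, which is why it makes sense as the low-power, driver-capped profile.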
 
Has anyone in the thread done any power readings from their rigs to compare the power/performance at all?
FX8300
4m/8t @ 1400 - IMM/HTT @ 2600 - RAM @ 1200 - 88 W idle
1m/2t @ 1400 - IMM/HTT @ 1000 - RAM @ 400 - 72 W idle

This is my backup rig that I use for internet, office work and retro gaming. Heavily underclocked due to silly temperatures outside (46 °C in the sun as I type). It plays Fall Guys on that single core with an HD 7970 @ 50% clocks just right - even though it's far below the minimum requirements ;)
 
FX8300 … 88 W idle vs 72 W idle heavily underclocked …
I think the more impressive numbers would be your load wattages, just to see how much they actually differ :)

I have done a lot of power readings from the wall with a plug meter. … Stock, no lock: 194 fps, 330 W; 0.59 fps per watt, whole system.
Massive differences there! And the question for you is: do you feel any differences when gaming at the stock or uncapped frame rates compared to the locked 60 fps tests you've done? :)
 
In Cyberpunk there is little difference for me, except in a few fast-paced fights. In Age 4 and other RTS games I feel no difference. In Fortnite, Apex etc., a 140 fps cap (no point going higher due to G-Sync + 144 Hz monitor) feels better/smoother.

What is your experience?
 
In Cyberpunk there is little difference for me, except in a few fast-paced fights. … What is your experience?
It's been so long since I've managed a game on my PC, I can't remember lol. The last games I remember playing properly were Saints Row 3 and 4, and they locked at 60 FPS, as that's what my panels are (1080p sadly, but I use triple screens); I never really noticed a difference, to be honest. Things might change if I can ever get some new monitors (aiming for 3 again, I think :)), but until then it's been mostly Xbox of late for me. The PC hasn't really had much of a look-in :(
 
Indeed, that's another benefit: undervolting can make these cards work on lower-spec PSUs than they would otherwise need.
I hope you won't do this. Undervolting helps, but not that much, and you absolutely shouldn't base your buying decisions on undervolting. Realistic undervolting gains are about a 5-20% reduction in wattage, but that's an average. Your graphics card can have amperage spikes and trip the PSU. And you don't really want your PSU pushed to 100% anyway. One thing is that they wear out over time and lose capacity; another is just plain fan noise at 100% load.
 
I hope you won't do this. Undervolting helps, but not that much…
All good points here, especially with PSU efficiency - the ideal of course is hovering around the 50% level for load. Plenty of room here for energy spikes.
 
Plenty of room here for energy spikes
I started playing with power again. I would fold with high power limits on the CPU and a good clock on the GPU. Voltages looked OK, but the 5 V rail is what brings my system down. I got some hard shutdowns where I would have to flip the switch on the PSU to start the system again. If I have my CPU at stock clocks/power, then everything is OK; the PSU is still strained, but not past its limit. Not sure why EVGA said 750 is OK; clearly it is not. Their CSR said to downclock my GPU.. didn't tell them it was overclocked.. not gonna turn down my GPU.. Just won't buy another Seasonic, is all :D
 
Interesting topic. Let me drop in my 2 cents. :)

My 2070 at its stock power limit (175 W) is the baseline.
At 125 W, it performs 7% worse on average. That's a 29% reduction in power for a 7% performance loss.
At 200 W, its core clock is a bit more stable, but it doesn't come with any extra performance. Any! That's an extra 14% power consumed for absolutely nothing.

Conclusion: GPUs could be a lot more efficient than they are, but AMD and Nvidia choose to win benchmarks in reviews instead.

I still don't think one should waste money on an expensive GPU if they don't need its full potential.
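Put in perf-per-watt terms, those three limits look like this (a minimal sketch; performance is normalised to the stock result, using the percentages quoted above):

Code:
# Relative efficiency of the three 2070 power limits quoted above.
limits = {
    "125 W":         (0.93, 125),   # 7% slower than stock
    "175 W (stock)": (1.00, 175),
    "200 W":         (1.00, 200),   # no measurable gain
}
base_perf, base_w = limits["175 W (stock)"]
for label, (perf, watts) in limits.items():
    ratio = (perf / watts) / (base_perf / base_w)
    print(f"{label}: {ratio:.2f}x stock perf-per-watt")

The 125 W limit comes out about 1.3x as efficient as stock, and the 200 W limit about 0.88x.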
 
My 2070 at its stock power limit (175 W) is the baseline. At 125 W, it performs 7% worse on average… At 200 W… that's an extra 14% power consumed for absolutely nothing.
Wouldn't that depend on the game being played? Different game engines and their development have different outcomes at any given point in time.
I have noticed with my current RX 5700 XT (factory overclocked), I can clock it higher to get up to another 5-10 fps for an extra 20 W above its 225 W TDP, according to HWiNFO, in Borderlands 3.
 
Wouldn't that depend on the game being played? Different game engines and their development have different outcomes at any given point in time.
Not really. It tops out at 1800 MHz regardless. It just dips into the 1750 MHz region a bit less often at 200 W. I think most of the extra power goes into voltage to maintain clock stability, which you don't notice while gaming.
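That lines up with the usual first-order rule of thumb that dynamic power scales roughly with frequency times voltage squared, so the last few MHz are bought with disproportionate wattage. A toy sketch (the clock/voltage pairs here are illustrative assumptions, not measurements from this card):

Code:
# First-order estimate: dynamic power ~ f * V^2.
# Illustrative values only, not measurements from the 2070 discussed above.
def relative_power(freq_mhz: float, volts: float,
                   base_freq: float = 1750.0, base_volts: float = 0.85) -> float:
    """Power relative to a baseline clock/voltage pair."""
    return (freq_mhz * volts ** 2) / (base_freq * base_volts ** 2)

for f, v in [(1750, 0.850), (1800, 0.950), (1800, 1.000)]:
    print(f"{f} MHz @ {v:.3f} V -> {relative_power(f, v):.2f}x baseline power")

Under that model, holding the top clock stable by raising the voltage can cost roughly 30-40% more power for under 3% more clock.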
 
All good points here, especially with PSU efficiency - the ideal of course is hovering around the 50% level for load. Plenty of room here for energy spikes.
Maxing out at 50% means that idling is inefficient. That's just too much PSU wattage for your system and money wasted.
 
Maxing out at 50% means that idling is inefficient. That's just too much PSU wattage for your system and money wasted.
Who said it's maxing out at 50%? Do some research and see for yourself: PSU reviews online clearly indicate that around 50% load is peak efficiency for the design of ATX and SFF units.
 
What does a couple of percent difference in PSU efficiency matter, anyway? It's a couple of watts at best, even with a high-end system.
 
My 2 cents on PSU efficiency:

[Chart: average PSU efficiency at low loads, 230 V]

Typical idle loads: my setup idles at around 60 W (CPU 20 W, GPU 10 W, rest 30 W). That means the least efficient PSU on the chart will consume about 5-10 W more at idle than the most efficient one.
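In absolute terms that idle penalty is easy to estimate (a sketch; the 60 W idle load is from above, the two efficiency figures are assumed ends of that low-load chart):

Code:
# Wall draw at a 60 W DC idle load for two assumed low-load efficiencies.
DC_IDLE_W = 60.0
for label, eff in [("weaker PSU, ~80% at low load", 0.80),
                   ("stronger PSU, ~90% at low load", 0.90)]:
    wall = DC_IDLE_W / eff
    print(f"{label}: {wall:.1f} W at the wall ({wall - DC_IDLE_W:.1f} W lost as heat)")

That difference is the 5-10 W idle gap between the best and worst units on the chart.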

What I find much more interesting with PSUs than efficiency is noise. I have had very noisy Bronze units; now I have an EVGA 650 W Gold with eco mode, and the fan doesn't start until the temperature reaches around 50 °C. Since my worst daily load is Cyberpunk at below 150 W total system draw, my PSU is silent all the time :)

A GPU undervolt, on the other hand, can with minimal effort save you 40 W alone at stock performance, or 80-200 W+ if you sacrifice a few percent of performance (on Nvidia this means operating in the 700-800 mV range, which translates to 1500-1850 MHz depending on binning). On my previous 5700 XT, consumption dropped from 220 W to 120 W (the card reads 100 W, but VRAM etc. uses about 20 W) by dropping the voltage from 1200 mV @ 1800-1900 MHz stock to 850 mV @ 1600 MHz undervolted.

It's been so long since I've managed a game on my PC, I can't remember lol…
A bit of repeating, but generally for RTS/building games like Anno, Cities: Skylines and Age 4, and mostly slow-paced adventure games like SOTTR, 60 fps works very well. I have trouble noticing a difference in RTS/building games, but in certain settings in SOTTR I notice it. In fast shooters it's more noticeable, and 100+ fps feels better.
 
Power costs are quite high in many countries, so I thought I could share my experiences using a wall plug that measures power draw. This is a PC with a 5600X (76 W limit), a 3060 Ti (200 W) and 3 case fans. Motherboard, SSD, fans etc. draw 30-40 W, the CPU typically 30-60 W in games, and the GPU 200 W if uncapped, stock and GPU-limited. The tweaked setting uses an undervolt on the GPU (1620 MHz @ 731 mV) and a 60 fps limit with RTSS:

Age 4, 1080p high/highest:
Stock: 280-300 W (80-110 fps)
Tweaked: 130-140 W (60 fps locked)

Cyberpunk, 1080p high/highest, DLSS Performance:
Stock: 290-320 W (90-140 fps)
Tweaked: 135-150 W (60 fps locked)

The GPU undervolt accounts for 80 W of the savings when GPU-bound; the rest is due to the fps cap. If you are satisfied with 60 fps and game 3 hours a day, this can save you up to 200 kWh a year (a kWh typically costs $0.1-0.5 depending on where you live), if that is something you care about; maybe more if you live in a hot climate and can reduce AC usage, less if you live in a cold climate and use your PC as a heater. Further improvements could be running 1560 MHz @ 700 mV on the GPU; this drops max power draw to 100 W vs 120 W at 731 mV on my card, but VRAM downclocks to 10 GHz, so performance drops to 80% of stock vs 90% at 731 mV. A lower power limit on the CPU can also help: a 45 W limit makes all-core run at 3.7 GHz vs 4.6 GHz at 76 W; multi-core performance is around 15-20% lower, but single-core is the same.
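(As an aside, the arithmetic behind that "up to 200 kWh" is simple enough to sketch; the wattages are approximate midpoints of the Cyberpunk ranges above, and the price is an assumption picked from the quoted range:)

Code:
# Annual saving from the tweaked profile, using the Cyberpunk figures above.
STOCK_W, TWEAKED_W = 305, 142     # approximate midpoints of the measured ranges
HOURS_PER_DAY = 3
PRICE_USD_PER_KWH = 0.30          # assumed; anywhere in the quoted 0.1-0.5 range

kwh_saved = (STOCK_W - TWEAKED_W) / 1000 * HOURS_PER_DAY * 365
print(f"~{kwh_saved:.0f} kWh/year, ~${kwh_saved * PRICE_USD_PER_KWH:.0f}/year")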
I already loved you for the RAM timing help you gave me, but finding that you also think the way I do about uselessly wasted power when gaming seals the deal.
My system doesn't even break 400 W, monitor included, with a 3090. I always get accused of lying or faking the numbers, since people just max every clock speed and uncap FPS with V-Sync off.

Always cap your FPS to at least two FPS below your refresh rate to prevent micro stutter and reduce 99th-percentile latency spikes. Never cap to your refresh rate.
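A trivial helper for picking that cap (the margin is the judgment call: the post above says at least two below refresh, while Blur Busters' commonly cited figure is about three):

Code:
# Pick an FPS cap safely below the refresh rate so the game never
# collides with V-Sync. The margin of 3 follows the commonly cited
# Blur Busters guidance; the post above says at least 2.
def fps_cap(refresh_hz: int, margin: int = 3) -> int:
    return refresh_hz - margin

for hz in (60, 144, 165):
    cap = fps_cap(hz)
    print(f"{hz} Hz panel -> cap at {cap} fps ({1000 / cap:.2f} ms frame time)")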

IMO, if you are that concerned about saving energy, set your AC thermostat to 75 °F instead of 72° and close the drapes and blinds. Set your furnace thermostat to 68 °F instead of 72° and wear a sweater. Turn off the lights when you leave a room. Know what you want before you open the fridge door.
What I find puzzling is how some folks will spend a lot more money for a device that is only 3-5% more efficient. Computer power supplies are a perfect example. It takes many years to make up the difference in cost between a Titanium (94% efficient at 50% load) and a Gold (90% at 50% load) certified PSU. Yet people buy the Titanium, often in part, because they believe they are helping the planet. Yet the reality is, in many cases, it takes a lot more energy to manufacture those more efficient products.
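The break-even arithmetic is easy to run; here is a sketch with the two efficiency figures from above (everything else, the load, hours, price and price premium, is an assumption):

Code:
# Years to recoup a Titanium-over-Gold premium from efficiency alone.
# The 94% vs 90% at 50% load figures are from the post above; the load,
# hours, electricity price and premium are assumptions.
LOAD_DC_W = 400.0        # steady DC load, i.e. 50% of an 800 W unit
HOURS_PER_DAY = 4.0
PRICE_PER_KWH = 0.15     # USD, assumed
PREMIUM_USD = 70.0       # assumed price difference between the two units

saved_w = LOAD_DC_W / 0.90 - LOAD_DC_W / 0.94   # wall-draw difference
kwh_per_year = saved_w / 1000 * HOURS_PER_DAY * 365
years = PREMIUM_USD / (kwh_per_year * PRICE_PER_KWH)
print(f"Saves {saved_w:.1f} W at the wall -> break-even in {years:.0f} years")

With those assumptions the Titanium saves about 19 W at the wall and takes on the order of 17 years to pay for itself.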

Bill (whoops, the tags didn't work), that advice might seem super logical to you, but as someone from a very different climate, it just, uh... won't work here. That sort of advice is extremely regional.
The rest of what you said is fine; I just laughed at the idea of an Aussie house with a furnace, or that we even need lights for 80% of the year, as opposed to the sun being visible through solid sheets of lead.

Second part:
Because the Titanium parts often have longer warranties, and run cooler due to that efficiency. My HX750i had a 10-year warranty and the fan doesn't even turn on when I game - I paid extra for *that*.

I have a grandson who does the same. My points remain the same. The computer is still spending more time closer to idle consumption rates than maxed-out rates.
(Yay, this one worked)
Not with games - seriously, you can fire up games like WoW and minimise them, and they'll still max out a lot of systems even if you're AFK, unless you modify settings to lower the FPS when they're in the background.
With MMOs and other social games this happens a lot - some of the people I play SC2 with literally leave their systems on and in-game 24/7 unless the servers are down for updates.
The FB groups I'm in have people constantly asking about bottlenecks with modern Intel systems at 100% CPU usage in their esports games.

Shit, even Battle.net adds 20-30 W on most systems just from being open, since it fires up 3D clocks to display the stupid animations.

In my examples above you save over 40% power on the GPU alone when gaming, while still getting 90% of stock performance. Gaming 3 hours a day is not that much; some friends of mine spend 8 hours on WoW. If your power costs $0.3 per kWh, this becomes $150-200 a year in power cost, plus maybe AC cooling cost in a warm climate, extra wear on equipment, more noise, etc.
So... uhhh... we pay 23 cents per kWh.
The US has some really cheap electricity thanks to nuclear power.

Why would you buy something and not use it? That surprise of yours makes no sense. A well-optimised PC should see the GPU at 99% most of the time.

That's like buying a big SUV just to go to the coffee shop.
The only PC with its GPU at 99% all the time is a PC with too weak a GPU.
I'm always surprised by people with that view of things.
Why are you not draining your phone to 0% battery? Why the hell did you let the screen go off?
Why is your car not in first gear at 10K RPM at all times, especially stopped at traffic lights?


Because it's stupid to max something out for no reason. You gain nothing by throwing hundreds of watts at an FPS gain you can't notice, and you're more likely to run into performance issues from overheating components, degrading them years faster, than by just running things at a reasonable setting in the first place.


I don't underclock or undervolt anything. My GPU is OC'd, as temps are pretty low while gaming; same for the CPU. It just shows how much better a custom loop is than air cooling. Even though the temps are rising here now, my temps are still very acceptable.
As I have said before, I'm not fussed about power use; I'm not gonna cry about it. If you want a powerful gaming rig, accept it, or cripple it to lower power use/temps. I don't have 3 rigs, nor do I mine as some do; maybe it's them that should change their behaviour if they don't like high power use.

My GPU is a 1080 Ti, my monitor is 1440p/165 Hz, and I go for the highest FPS I can get in games. 60 fps = bleeurgh.

Aww man, no wonder we disagreed on GPU undervolting in other threads - the 1080 Ti is one of the most power-efficient cards out there. My regular GTX 1080 is a fucking champ, but even overclocked, its total wattage was *nothing* like the madness the 3080 and 3090 suffer from.

It was seriously an epiphany moment to compare my 1080 experience, and your 1080 Ti, with the 3080 I had that died and now the 3090.
You can overclock and add 30 W for 5% gains.
Me? My card's already overclocked out of the box, adding 100 W for those 5% gains, and there isn't enough power for the entire board already - I can't OC the VRAM without power-starving the GPU, on a 350 W BIOS. If I overclock the GPU, the VRAM clocks go down to power the GPU. It's madness.

You've simply missed out on the joy that the new cards aren't as good in that respect - and we want the cards to cut back, to be just as power-efficient as what you're enjoying.


Seriously - the 20 and 30 series cards have so many limits and throttles. If I overclock my VRAM, the GPU gets power-starved and tanks to 1.2 GHz. Underclocking one part leaves more wattage for the board to use elsewhere - and yes, that's totally F'ed up, but it's how the high-end cards are now.


Maxing out at 50% means that idling is inefficient. That's just too much PSU wattage for your system and money wasted.

Whaaaaa
No way. My PSU is rated for fan-off till 300 W, and the loss from sitting off the peak of the efficiency curve you're talking about would be in the single digits.
Just because a sweet spot exists on a curve doesn't mean that moving from that spot has to be a big problem.
150 W to 800 W might as well be the same, and the efficiency loss of being at 100 W is smaller because... a higher percentage of a small number is small.
 
Good points, Mussels. I have read a lot of Blur Busters, and I cap fast FPS shooters at 140, combined with G-Sync, V-Sync on in NVCP (off in-game) and Reflex in games that have it; my lack of skill is the only thing to blame for being mediocre at Fortnite etc. ;) My previous 60 Hz G-Sync notebook was capped at 57 fps.

[Chart: Blur Busters G-Sync 101 - G-Sync vs Fast Sync at 60 Hz]


Here is the reason to cap a few frames below refresh:
[Chart: Blur Busters G-Sync 101 - G-Sync ceiling vs FPS limit at 60 Hz]
 
I'm so sorry for being such an edit-whore, everyone. My multi-quote had a fit when I closed the browser, and I made a poor effort at fixing it.

My summary for this would be:

1. There is always a point of no returns, where more power consumed doesn't equate to more performance (and a point of NEGATIVE returns, such as VRM throttling).
2. There is always a point where more hardware performance provides no end-user benefit (e.g. overclocking my 3090 to game at 720p 60 Hz).
3. Moar clock speed is not always faster (e.g. I can add 1.5 GHz to my VRAM in Afterburner, but that wattage no longer reaches my GPU, and it cripples it HARD).


As Taraquin shows above, if you hit your V-Sync limit, you're going to suffer for it. And yet people will happily overclock their hardware to do exactly that.

This YouTuber is amazing, and everyone should watch these and breastfeed them to their children.
Because he is testing with an external camera, these numbers are not the same as what you'll see from Nvidia's stats etc., and can only be compared to results from his own testing.

[Video: comparing FPS limiting methods with a slow-motion camera]
[Video: shows the actual latency differences between hitting the V-Sync limit and not]


300 FPS: 17 ms average
144 FPS: 42 ms average (hitting V-Sync)
138 FPS: 19.8 ms average
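For scale, here are the raw frame times behind those caps; the measured numbers above are whole-chain input-to-photon latencies from the camera, which is why they are higher than a single frame (a quick sketch):

Code:
# Theoretical frame time at each cap vs. the camera-measured latency above.
measured_ms = {300: 17.0, 144: 42.0, 138: 19.8}   # fps -> average ms
for fps, latency in measured_ms.items():
    print(f"{fps} fps: {1000 / fps:.1f} ms per frame, {latency} ms measured")

The 144 fps case stands out: the frame time is only 6.9 ms, yet hitting the V-Sync ceiling balloons the measured latency to 42 ms.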

Can anyone tell me that getting a PC to maintain that 300 FPS for the entire gaming session is going to be even remotely worth it?



If you watch the whole thing and get bombarded with all the info for AMD and Nvidia, you'll reach the summary at the end:
Using an FPS cap below your refresh rate is the only way to get low latency with no tearing. Even G-Sync doesn't fix that.




So if I know I can shave off 200 W and not lose any *actual performance* in render times or input latency... why the hell would I throw hundreds of watts at nothing?
I might as well turn on a fan heater and blow it into my intake, because that extra heat is going to do the exact same thing for my performance.

Nvidia users can just hit Alt+R (if they have GFE installed) to view their render latency; if it's under 8 ms, they're golden. My system can do 4 ms on the 165 Hz display and 8 ms on the 4K one, but it can skyrocket to 30 ms if it actually hits that V-Sync limit in easy-to-render titles.
 
Always cap your FPS to at least two FPS below your refresh rate to prevent micro stutter and reduce 99th-percentile latency spikes. Never cap to your refresh rate.
Does capping FPS to your refresh rate cause micro stutter? Do you have a link where I can read about this? I'm genuinely interested.

I have read a lot of Blur Busters, and I cap fast FPS shooters at 140, combined with G-Sync… Here is the reason to cap a few frames below refresh: [Blur Busters G-Sync 101 charts]
That's just proof that Fast Sync (Enhanced Sync on AMD) works a lot better than frame capping, which is exactly what I believe as well.

Why are you not draining your phone to 0% battery? … Why is your car not in first gear at 10K RPM at all times…? … Because it's stupid to max something out for no reason.
I don't drain my phone to 0% because its battery is big enough to last 2 days.
I don't run my car at 10K RPM because a car's engine is not a GPU. Its cooling system is not designed to handle such sustained heat, and it can get damaged due to mechanical friction. GPUs come with coolers and thermal limits that let them sustain a 100% load.

If you're not running your GPU at 100%, then you're not using it. It's fine if you don't need 300+ FPS (I don't, either), but then you don't need a powerful GPU in the first place. Why spend 800 bucks on a graphics card and play at a capped 60 FPS when you could do the same with one that costs half as much?

Just because a sweet spot exists on a curve doesn't mean that moving from that spot has to be a big problem…
That I agree with.
 
So you buy a $500 GPU and let it run at lower performance just to save $20-100 a year?
Why not just buy a lower-grade card with lower power draw?
Exactly: it's like buying an AMG or M Power car and then driving at 50 km/h in the city. Clowns, really.

Some big power savings there; I imagine lots of people will look at doing this to save on electricity. It will also be interesting to see how next-gen cards, with their rumoured 900 W power draw, get on in the coming economic environment.
One would have to be a real idiot, or a true enthusiast, to buy a 900 W GPU for real.
 