
How do you justify a CPU upgrade?

We have already had electricity debates together ... a hotter CPU means a hotter house, which means more AC usage, etc.
And now, as then, I did not disagree with you on that. A 100W light bulb burns more energy and produces more heat than a 60W bulb. No duh!

But (1) unlike a light bulb, a CPU does NOT consume maximum power all the time, or even most of the time it is powered on, a point you keep forgetting or choose to keep ignoring. In fact, for the vast majority of users, for most 24 hour periods, most CPUs consume a small percentage of their total capacities. More facts you keep forgetting or choose to ignore.

(2) As with those CPUs, the difference between that 100W bulb and that 60W bulb does NOT equate to 40W of "wasted" energy seen in the form of heat. It is significantly less than 40W.

(3) In fact, when doing the same tasks (the same amount of "work"), the hungrier device can take it easy, consuming a smaller percentage of its capacity, which can result in slower cooling fan speeds too - and less power consumption.

(4) In cooler months it works the other way around. That heat can help reduce facility heating costs by keeping your toes toasty.

And (5), if you did the math based on the cost of a kWh (averaging $.13 in the US) you would see it would take a very long time, years even, to make up the difference in the purchase cost of the new CPU. This is exactly why paying extra for a Titanium PSU over a Gold is just not worth it.

Yes, with a more efficient CPU, you consume less power and generate less heat. No disputing that. But it takes a long time and many kWh to burn up $100, as an example, in electricity. A 100W lightbulb burning at full power constantly for 10 solid hours would consume just 1 kWh, or $.13 in cost. It would take over 2 years at that rate to use up $100. I don't care how many videos you are rendering, your CPU is not working at 100% utilization, 10 hours per day, 365 days per year.
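
For anyone who wants to sanity-check that arithmetic, here is a minimal Python sketch. The 100W load, 10 hours a day, and $0.13 per kWh are just the illustrative figures from the post above, not measurements from any particular system.

Code:
# Minimal sketch of the electricity-cost arithmetic above.
# The wattage, hours and $0.13/kWh rate are illustrative figures, not measurements.

RATE_USD_PER_KWH = 0.13  # rough US average cited above

def energy_cost(watts, hours, rate=RATE_USD_PER_KWH):
    """Cost in dollars of running a load of `watts` for `hours`."""
    return watts * hours / 1000.0 * rate

daily_cost = energy_cost(100, 10)      # 1 kWh -> about $0.13
days_to_100 = 100 / daily_cost         # days of that daily load to spend $100
print(f"${daily_cost:.2f} per 10-hour day")
print(f"{days_to_100:.0f} days (~{days_to_100 / 365:.1f} years) to burn $100")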

And for the record, the OP said the computer is used "mostly" for gaming and web browsing. Only "occasional" video "editing". And of course, video editing is greatly assisted by lots of RAM and depending on the type of rendering done, many of those rendering tasks are done by the GPU, not the CPU. And of course, in many computers, the graphics solution is the biggest power eater, not the CPU.

You think a 3930K will render a video using the same amount of heat as a new Ryzen chip? (Don't answer the question, it's rhetorical.)
LOL This is where your argument turns even more silly. (1) You are "convinced your sample size of one experience renders moot the whole point". And (2) you seem to think rendering a video takes all day long and that's what the average user does every day. :kookoo:

Rhetorical or not, I never said anything of the sort. So here you are, implying I did say that when in reality, it is just you making stuff up to stroke your own ego! That's pretty sad. And for the record, not once has the OP said anything about considering a 3930K. So even your silly sample size of one is irrelevant here.

I said what I think. Others here have said what they think. None of us need you to fabricate falsehoods about things we didn't say or think.

***********

So here we are full circle again, with me standing by what I actually said - which by the way, you actually agree with! So you arguing here is just trollish nonsense! I say again, with emphasis on the pertinent parts so you can understand,
Bill_Bright said:
I would not worry much about electricity usage unless you run your CPU with very high loads nearly 24/7.
 
Budget is obviously everything, but I feel like for AMD purchases - the 1600/1700, then the 2700, now the 3600 - it's less about the raw power of the current gen and more about getting onto the current gen, which is what made the bigger difference.

You could grab the 3900X, but unless you really need those 24 threads, I could see a $200-$250 4600 with an OC being faster than the 3900X in most daily usage and in most games in just a few months.
 
People apparently feel very strongly about how they approach justifying a CPU upgrade. :fear:
 
When I start seeing that the CPU can't keep up with the GPU with each GPU upgrade every 2-3 years, or the performance starts deteriorating over time, or the CPU can't handle whatever software I'm running, then it's time for an upgrade... or if someone close to me upgrades to something better, my inner enthusiast tells me to one-up them.
 
Bill, it's an opinion-based question, so people give opinions.
I agree. And I will defend anyone's right to express their opinion - when and where appropriate. I will not defend or condone someone intentionally misquoting another or purposely implying someone said something they clearly didn't. And that is what was happening. There's no place for such falsehoods in a technical discussion.

You said above that the "3930K is a freaking beast". That is true. If I suggested you said the 3930K would generate the same amount of heat as a new Ryzen chip - when clearly you never said anything of the sort - would you just roll over and accept that intentional misrepresentation of you? I don't think so.
People apparently feel very strongly about how they approach justifying a CPU upgrade.
It is not about feeling strongly about it. It is about the need to define the reason to upgrade.

For example, needing to upgrade the CPU because the old CPU will not support the latest version of Grand Theft Auto you want to "play" is not the same thing as needing to upgrade the CPU because the old CPU will not support the latest auditing program required for your work or school.

So justifying upgrading a CPU might be just a matter of opinion or desire. But justifying upgrading a CPU might be an absolute requirement to perform some necessary job. Justifying upgrading a CPU does not have to be about the cost at all.
 
When the cost doesn't matter you buy the best anyway.. I think cost is the biggest factor to be honest, Bill..

But that is just my opinion.. quite clearly others think differently.

trog
 
I was entertaining the idea of switching from the Ryzen 5 3600 to the Intel Core i7-9700K and overclocking the CPU. Seems like a waste of money, even though I was curious about the differences between the two. I only considered it as a way to get some hands-on experience with the 9700K.
 
I think cost is the biggest factor to be honest, Bill..
No doubt cost is a HUGE factor in selecting "which" CPU to upgrade to. But there are other variables that might be HUGE or even bigger factors too. For example, time. Do I "need" this upgrade today, or can I wait 2 months until I can build the budget a little more? Do I "need" that i7 or will this i5 actually serve me better? Do I "need" the bragging rights, or just "want" them and so am willing to spend the money?
When the cost doesn't matter you buy the best anyway..
One would hope. But even when cost does matter, you hope you are buying the best your money can buy. And of course, the most expensive does not automatically imply the best, or the best buy.
 
Justification is a feeling, which can be expressed strongly or lightly.

So a justification may be only a certain percentage of a person's actual need, with the rest being want.

Really, people's budget, lifestyle and many more things will determine the upgrade, based on which feeling is strongest.

That aside,
There's a lot of hardware out there, and different views of each piece of hardware.

With a mix of everything above.....

I feel the need to justify my upgrade, not just justify it.

So the opinion is going to vary greatly from user to user.

My opinion is: no expensive upgrades. It was outdated at release time. That's my opinion, and I try to base it on facts just like you.
 
Bill_Bright said:
And (5), if you did the math based on the cost of a kWh (averaging $.13 in the US) you would see it would take a very long time, years even, to make up the difference in the purchase cost of the new CPU.
Wow, what a tangent. I'm 100% sure that electricity costs 50+ cents a kWh here, NOT 13 as you keep saying. I'm 100% sure we all don't do the same tasks on the computer, and the thread doesn't ask how Bill "Bright" justifies it, with everyone else's OPINION being invalid. Even browsing the web uses more than 0% CPU. So the fact that it uses less power to render faster is important to me. If it doesn't matter to you that's fine, but we all justify things differently, right? I guess I'm wrong about that and we must all use the same logic as Bill "Bright". Most rendering from the GPU is performed very quickly and is mostly handled by the CPU, but you knew that, right.. (sure sounds like you don't do much video editing if you think the GPU does most of the work)

If the CPU can give the same fps or better and use less power, it gets my vote; of course, we don't all have subsidized power like you. My computer is going to save me money in the winter by heating my house compared to an oil heater? WOW, so strange that they sell heaters then and not just CPUs that mindlessly use power to heat your "toes" (who has their computer on the ground anyway?)
 

I'm curious where you live that you pay 50+ cents per kWh.

Here in Tennessee I pay 11 cents per kWh and checking nationally the rate varies from 9.4 cents in Louisiana to 33 cents in Hawaii. The average in the USA is 13 cents per kWh. The highest that I have seen a member report here is 40 cents per kWh in Denmark if I recall correctly.
 
Even browsing the web uses more than 0% CPU.
The GPU isn't idle either when web browsing. I've seen that just browsing the forums here gets the GPU up to 30% average usage in Firefox.

I'm curious where you live that you pay 50+ cents per kWh.
I just checked for Anchorage, Alaska.
[attached screenshot: Untitled.jpg]
 
I'm curious where you live that you pay 50+ cents per kWh.

Here in Tennessee I pay 11 cents per kWh and checking nationally the rate varies from 9.4 cents in Louisiana to 22.5 cents in Alaska. The highest that I have seen a member report here is 40 cents per kWh in Denmark if I recall correctly.

If I break down my electric bill:
Flat $16.22 customer charge for the billing period
$0.06144 per kWh for delivery charge
$0.00576 per kWh for system benefit charge (taxes)
$0.07714 per kWh for supplier charge for November.
$0.10330 per kWh for supplier charge for December.

First part of the bill (half of November):
(196×.06144)+(196×0.00576)+(196×0.07714) = $28.29064

Second part of the bill (half of December):
(277×.06144)+(277×0.00576)+(277×0.10330) = $47.2285

So those two plus the customer charge: $28.29064 + $47.2285 + $16.22 = $91.74

For December (the more expensive of the two), the cost is about $0.17 per kWh.

If I ran my computer at full tilt consuming 800W for 8 hours every day, that would be roughly 192 kWh; at $0.17 (plus $16.22 for the month) it would cost me about $32.64. The reality though is that not only is my machine not going full tilt all the time, there are many days where it's not even on for that long. So $32.64 in a month is a worst-case situation at last month's rate. The real number is probably 1/3 or 1/4 of that, so at worst (1/3) that's only ~$10 USD in a month. If you cut power consumption in half, that's $60 saved over the course of a year, which is peanuts.

Edit: The number is even biased towards being more expensive because I included the customer charge. The reality is that you're probably paying less than $10 USD a month for electricity to run a computer.
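
If it helps, here is a small Python sketch that just reproduces the bill arithmetic in that post; the rates, kWh figures and flat charge are copied from the post itself, so treat it as a worked example rather than anyone's actual billing formula.

Code:
# Reproduces the bill arithmetic from the post above.
# All rates and kWh figures are copied from that post.

CUSTOMER_CHARGE = 16.22                      # flat charge per billing period
DELIVERY = 0.06144                           # $/kWh
SYSTEM_BENEFIT = 0.00576                     # $/kWh (taxes)
SUPPLIER = {"november": 0.07714, "december": 0.10330}  # $/kWh

def half_period_cost(kwh, month):
    """Variable cost for one half of the billing period."""
    return kwh * (DELIVERY + SYSTEM_BENEFIT + SUPPLIER[month])

nov = half_period_cost(196, "november")      # ~$28.29
dec = half_period_cost(277, "december")      # ~$47.23
total = nov + dec + CUSTOMER_CHARGE          # ~$91.74
dec_rate = DELIVERY + SYSTEM_BENEFIT + SUPPLIER["december"]  # ~$0.17/kWh

print(f"Nov: ${nov:.2f}  Dec: ${dec:.2f}  Total: ${total:.2f}  Dec rate: ${dec_rate:.4f}/kWh")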
 
I'm 100% sure that electricity costs 50+ cents a kWh here, NOT 13 as you keep saying
Gee whiz. 100%? Ever heard of Google?

What is the average cost per kWh in the US?

If I ran my computer at full tilt consuming 800W for 8 hours every day, that would be roughly 192 kWh...
Nice breakdown! :) But I note in terms of this discussion (upgrading the CPU) your numbers are even more askew because the CPU is but one component in your computer. Your graphics solution, motherboard, RAM, drives, fans, monitor(s), speakers, USB ports and connected devices are all consuming part of your computer's total consumption too. So yeah, peanuts - but not even premium, roasted peanuts.

For the record - I am not a treehugger but I am all for going green when possible, recycling, and I certainly believe in global warming. But I also believe in looking at the big picture overall, and not just a small, cherry-picked part of that picture.
 
Sure, I was just going with a worst-case figure I got off my UPS, just to kind of show that even at its worst, it's really not all that bad.

So that'd be the entire machine overclocked and both GPU and CPU stressed at the same time, plus the 4K display - almost the entire machine under unrealistic load with some unreasonable overclocks. The reality is all of that hardware at idle gets me to about 215 watts. If I wanted to be realistic, an average is likely closer to 300 watts, because my machine spends a lot of time idling; I'm only really taxing it when I'm testing heavy workloads or playing a pretty graphically heavy game. Lately I've been playing Factorio, so it's probably not even getting past 320W. If I assume that as an average for a month, it'd be about 80 kWh, or about $11 worth of electricity, with my terribly inefficient 3930K. :p

tl;dr: So for some perspective, that'd be about 1/9th of my power bill during a season where I use the least amount of electricity. :laugh:
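
As a rough cross-check of that estimate, a few lines of Python with assumed inputs (a ~300W average draw, about 9 hours of use a day, and the ~$0.17/kWh December rate from the bill breakdown) land in the same ballpark as the ~80 kWh figure above:

Code:
# Rough monthly-cost cross-check. The average draw, daily hours and rate
# are assumptions taken from the surrounding posts, not measurements.

AVG_WATTS = 300        # assumed average draw
HOURS_PER_DAY = 9      # assumed usage per day
DAYS = 30
RATE = 0.17            # $/kWh (December rate from the bill breakdown)

monthly_kwh = AVG_WATTS * HOURS_PER_DAY * DAYS / 1000.0   # ~81 kWh
monthly_cost = monthly_kwh * RATE                          # ~$13.80

print(f"~{monthly_kwh:.0f} kWh/month -> roughly ${monthly_cost:.2f}/month")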
 
215W at idle is a lot! I don't have a 4K monitor but I do have 2 x 24" monitors. I'm streaming Pandora right now, though my speakers are not connected to my UPS. All my network gear is, though, and just typing this (which is not demanding at all), I'm only pulling 127W through my UPS.

As I said before, most users would be amazed at how little their computers pull most of the time. A friend of mine was convinced his UPS display was faulty, so we connected it through a Kill A Watt meter and verified his 650W PSU was way too big for his needs.
 
Gee whiz. 100%? Ever heard of Google?

What is the average cost per kWh in the US?
I didn't mention what you pay or what the freaking average of anything is; I know what I pay, and it's 50+ cents, not 13. So cool that I'm not allowed to have an opinion or a different power price than anyone else. That same 80 kWh would be $40 here instead of $11. Sure wish I was on low-income power or whatever you have to pay such a little amount. Sure, one day I will get a nice solar setup with battery backup. BTW, as others said, we are all allowed different answers, no need to go back and forth.

It also depends on how much you use and what tier you are on; I'm in California, BTW.
 
Maybe I'm just a simple guy but I don't put a whole lot of thought into it. The only two questions are "Do I want it?" and "Can I reasonably afford it?" It's all for fun and recreation... this isn't a work machine, even though I do work on it occasionally. Every bit of cash and time I put in is for the sake of it, so I'm not losing sleep about overbuying or getting upgrades I don't really need.

I dunno... I tend to go through phases. It starts off having ONLY what is needed. Of course, that's not the most satisfying way for things to be when you're an enthusiast who's always reading about the latest and greatest. There's that nagging thought of how it could be better. At that point, I may reach up for that perfect sweet spot of cost, performance, and efficiency. Just try to build the most optimal machine for me and stick to it. That in itself can be a fun challenge sometimes. After a while of that, I'll get bored of my perfectly configured/dialed-in machine and start going balls to the wall.

AMD has been breaking me down with their upgrade-friendly AM4 platform and constant improvements. I've bought one from every generation, working up the chain. From 3 with first gen, to 5 for the second, to 9 for the last. Now logically, what is the next step in the sequence for gen 4? :rolleyes: I say I won't but I don't really know that lol. I mean, if we were talking about needs, the 2600 would've been fine for a few good years. I DO have a line, but it's kinda touch-and-go.

It always comes back around. I've built other Ryzen machines that my old CPUs wind up in for upgrades, so it's not like they sit unappreciated. Somebody I know who could use it is gonna wind up with a little unplanned upgrade. Might as well, since the value drops off anyway.
 
Just typing this (which is not demanding at all), I'm only pulling 127W through my UPS.

According to my UPS, dual 4K, streaming Disney+, and replying on this forum pull about 200W.
 
215W at idle is a lot!
It is, but you'd be surprised at how much all of the small things add up. There are 7x 120mm fans, a pump on an AIO cooler, a 200mm fan on top, 4x 1TB WD Blacks and a 500GB Constellation ES. That could easily be anywhere between 1/3 and 1/2 of that power consumption, probably an easy 80 watts, plus 30 watts for the display, which leaves ~90 watts for the motherboard, CPU, and GPU. It's really not unrealistic.
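
To put some hypothetical numbers on that, here is a quick Python sketch of an idle power budget; every wattage below is a rough guess for illustration, not a measured value.

Code:
# Hypothetical idle power budget, just to show how small loads add up.
# Every wattage here is a rough guess for illustration, not a measurement.

idle_watts = {
    "7x 120mm fans":        7 * 2.5,   # ~2-3 W each
    "200mm top fan":        3,
    "AIO pump":             5,
    "4x 3.5in HDD (idle)":  4 * 6,
    "2.5in drive":          6,
    "motherboard + RAM":    30,
    "CPU at idle":          30,
    "GPU at idle":          20,
    "display":              30,
}

total = sum(idle_watts.values())
for part, watts in sorted(idle_watts.items(), key=lambda kv: -kv[1]):
    print(f"{part:22s} ~{watts:5.1f} W")
print(f"{'total':22s} ~{total:5.1f} W")  # ~165 W before PSU losses and misc USB gear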
 
About this time last year I wanted to build the cheapest but still enjoyable AMD PC possible, and for $400 I built an AMD R5 1600 / RX 570 / 8GB DDR4-3000 / 256GB SSD... I did it just because I wanted to see how that system would compare to my 8700K/1070 Ti/16GB 3200MHz system, and I concluded that the AMD system was almost as good at 1080p as the 8700K/1070 Ti was at 2560x1440. Then I literally had no use for it... I gave it to my brother-in-law. He loves it. It's his first ever gaming PC... I didn't build it for that though, and I have no justification other than that I was curious.

I may do it again real soon too.
 
You'd be surprised at how much all of the small things add up.

I really should get an electricity usage monitor thingy and put a hard number to my computer's power consumption...
 
Don't do it if you run retro boxes; my P4 3.4 and 6800GT eat more at idle than I care to admit.
 
A friend of mine was convinced his UPS display was faulty, so we connected it through a Kill A Watt meter and verified his 650W PSU was way too big for his needs.

It's a great thing though because then the power supply hardly ever needs replacing and can last 10 years easily.

I really should get an electricity usage monitor thingy and put a hard number to my computer's power consumption...

I got a cheapie $20 wall meter from the hardware store and it gave me accurate readings.
It helps if you can get one with a backlight because they can be hard to read in bad lighting.
 