System Name | BlueKnight |
---|---|
Processor | Intel Celeron G1610 @ 2.60GHz |
Motherboard | Gigabyte GA-H61M-S2PH (rev. 1.0) |
Memory | 1x 4GB DDR3 @ 1333MHz (Kingston KVR13N9S8/4) |
Video Card(s) | Onboard |
Storage | 1x 160GB (Western Digital WD1600AAJS-75M0A0) |
Display(s) | 1x 20" 1600x900 (PHILIPS 200VW9FBJ/78) |
Case | μATX Case (Generic) |
Power Supply | 300W (Generic) |
Software | Debian GNU/Linux 8.7 (jessie) |
I have to agree. I see someone already did the math, but yeah, it would take years of 24/7 use at 100% load for that difference to make a dent in your power bill.
My current build, monitor included, draws an average of 50 kWh/month. Let's suppose an AMD build would draw about 80 kWh/month at a realistic average of 16 hours of use per day.
The difference on my power bill would still be tolerable and payable: just $7.80 USD at the $0.26/kWh I'm paying here.
For people in the USA and other countries the difference would be even lower: just $3.60 USD at $0.12/kWh.
And since the "Intel vs AMD" power consumption figures described above were a bit exaggerated, I believe you would pay even less than that.
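For anyone who wants to plug in their own numbers, here is a minimal sketch of the arithmetic above (the 50/80 kWh figures and rates are just the assumptions from this post; `monthly_cost_difference` is a made-up helper name):

```python
def monthly_cost_difference(kwh_a: float, kwh_b: float, rate_per_kwh: float) -> float:
    """Monthly cost difference between two builds: kWh delta times electricity rate."""
    return abs(kwh_a - kwh_b) * rate_per_kwh

# 50 kWh/month vs 80 kWh/month, at $0.26/kWh and $0.12/kWh
print(monthly_cost_difference(50, 80, 0.26))  # about $7.80
print(monthly_cost_difference(50, 80, 0.12))  # about $3.60
```

Swap in your own monthly consumption and local rate to see what the gap actually costs you.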
My conclusion:
Power consumption should not be an excuse for average users (i.e. no 24/7 @ 100%), especially in the USA and other "cheap energy" countries, unless you are a hardcore green-computing advocate.
Just my opinion.