Originally Posted by jihadjoe
The new FX processors were on TR's podcast, and an interesting point raised was that while the TDP difference suggests a 48W gap between Intel and AMD, in actual practice the measured difference at the wall was about 100W.
That's HUGE, especially if you're running your computer 24/7 for one reason or another (servers, folding nuts, etc). 100W 24/7 at 15c per kWh is $11 a month in electricity. That's $130+ a year, or roughly the difference in platform cost compared to an Ivy Bridge i7; if you keep your rig for 2 years, an Ivy i7 actually ends up being cheaper to own+operate.
Where I live, the applied electricity rate is close to 30c per kWh, so that's a whopping $260 difference in electricity alone in a single year. That makes running an FX processor a rather silly option for me.
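For anyone who wants to sanity-check those numbers, here's a quick Python sketch. The 100W delta and the 15c/30c rates are the figures from the posts above; the ~730 hours per month is my own averaging assumption (8760 hours a year divided by 12):

```python
# Cost of an extra 100 W running 24/7, at the two rates quoted above.
# HOURS_PER_MONTH = 8760 / 12 is an averaging assumption, not a figure
# from the thread.

def extra_cost(delta_watts: float, rate_per_kwh: float, hours: float) -> float:
    """Electricity cost of a constant power draw over the given hours."""
    return delta_watts / 1000 * hours * rate_per_kwh

HOURS_PER_MONTH = 8760 / 12  # ~730
HOURS_PER_YEAR = 8760

for rate in (0.15, 0.30):
    monthly = extra_cost(100, rate, HOURS_PER_MONTH)
    yearly = extra_cost(100, rate, HOURS_PER_YEAR)
    print(f"At ${rate:.2f}/kWh: ${monthly:.2f}/month, ${yearly:.2f}/year")
```

That prints $10.95/month and $131.40/year at 15c, and $262.80/year at 30c, which lines up with the $11, $130+, and $260 figures above.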
Actually, if you're not running a 24/7 operation and are essentially using your computer only for defined tasks, the better way to compare is energy per computation (watt-hours or joules per completed task) rather than raw wattage. That's the most meaningful measure of energy usage.
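To make that concrete, here's a rough sketch of the energy-per-task comparison. The wattage and runtime numbers are invented for illustration only, not benchmark results; substitute your own at-the-wall measurements:

```python
# Comparing chips by energy per completed task (joules = watts * seconds)
# rather than by TDP or load wattage alone.

def joules_per_task(avg_watts: float, seconds_per_task: float) -> float:
    """Energy consumed to finish one task: average power times runtime."""
    return avg_watts * seconds_per_task

# Hypothetical numbers: a higher-wattage chip can still use less energy
# overall if it finishes the task enough faster ("race to idle").
chip_a = joules_per_task(avg_watts=180, seconds_per_task=100)
chip_b = joules_per_task(avg_watts=120, seconds_per_task=160)
print(f"Chip A: {chip_a:.0f} J/task, Chip B: {chip_b:.0f} J/task")
# Chip A: 18000 J/task, Chip B: 19200 J/task
```

The point being that for defined workloads, the faster chip can come out ahead on total energy even with a higher draw under load.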