OK, this is an issue that's bothered me for a while. I've discussed it with FordGT90Concept before, but we couldn't come to an agreement, so I'm going to look at it objectively now and try to understand it.

I have an issue with the statement "It's bad to run a PSU at 90% of its rated power." First, that seems to go against the definition of a PSU's rated power. The wattage given for a PSU is usually how much power it can continuously deliver. But that isn't fully clear, and somebody has to ask, "Well, for how long?" I know parts don't last forever, and you can't expect them to, so I think a PSU should be able to continuously deliver its rated power over its expected lifetime. Of course, that needs a definition too. Expected lifetime is just that: how long the manufacturer and the consumer expect the PSU to last. CDs don't last forever either, but consumers still expect them to last for a reasonable amount of time. Same with PSUs. Personally, I think a PSU should last anywhere from 5-8 years. So, in my opinion, a PSU should be able to deliver 100% of its rated power continuously for 5-8 years.

But that brings us back to the original statement. Apparently it's bad to run the PSU at 100% because it stresses the components too much, leading to degradation and early failure. That seems to contradict the whole purpose of rated power as I defined it above. So now we have the statement, "The PSU is rated to deliver X amount of power, but actually drawing X amount of power is bad for the PSU." Clearly that statement is contradictory. If the parts can only safely be run at 70-80% of the rating, why is it rated that high?

Discuss. Pick out the flaws in my argument. Again, I'm trying to be objective and I don't want this to get heated. Thanks.
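To make the contradiction concrete, here's a quick sketch of the math. If the 70-80% rule of thumb were true, the rating on the box wouldn't be the capacity you can actually use, and you'd have to oversize your purchase accordingly. The numbers below are just my own illustration, not from any manufacturer's spec:

```python
def required_rating(sustained_load_w, max_safe_fraction=0.8):
    """Minimum nameplate rating you'd have to buy if a PSU
    should only ever be loaded to max_safe_fraction of its rating."""
    return sustained_load_w / max_safe_fraction

# A hypothetical 450 W sustained system load:
print(required_rating(450))        # 562.5 -> you'd shop for ~600 W
print(required_rating(450, 0.7))   # even more at a 70% ceiling
```

In other words, under the 80% rule a "600 W" PSU is really a 480 W PSU with a misleading label, which is exactly the contradiction I'm pointing at.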