http://en.wikipedia.org/wiki/80_Plus
80 Plus (base) = no difference in efficiency across load levels
Bronze = 3% variance between peak efficiency (50% load) and 20%/100% load
Silver = 3% variance
Gold = 3% variance
Platinum = 2-3% variance
Titanium = up to 4% variance
So yes, best efficiency does land around 50% load; it's just not all that significant. A 30-40 W difference on a 1000 W load is nothing.
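If you want to put those variance percentages into actual watts, here's a rough Python sketch. The 90% vs 87% figures are just illustrative Gold-ish efficiencies (roughly 50% load vs 100% load); swap in the numbers for your own tier and input voltage from the 80 Plus table.

# Rough sketch: extra wall draw from a few points of efficiency swing at a fixed DC load.
# 0.90 vs 0.87 are illustrative Gold-ish figures (50% vs 100% load), not a spec lookup.
def wall_watts(dc_load_w, efficiency):
    # Power pulled from the wall to deliver dc_load_w to the components.
    return dc_load_w / efficiency

load = 1000  # watts of DC load on the PSU
delta = wall_watts(load, 0.87) - wall_watts(load, 0.90)
print(f"Extra wall draw from a 3-point efficiency drop at {load} W: {delta:.0f} W")
# prints roughly 38 W: tens of watts, a few percent of the total draw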
Now add to that the fact that your PC idles at a much lower wattage and doesn't instantly jump to 100% load on every task, and I just find that line of thinking silly. Buy a big PSU if you're expecting to double your load down the line with CrossFire or SLI; don't buy a large PSU just to hit some magic efficiency number. The extra money you spent is wasted, instantly putting you in a hole, and the efficiency simply won't pay that money back because the returns are so small.
If you buy a 1 kW PSU for a 500 W load, you are saving 15 watts at most compared with a 500 W unit (100% load), and even less compared with a 650 W unit (77% load). If you game 40 hours a week (working and playing the same amount of time), you've saved 600 Wh a week, or 31,200 Wh (31.2 kWh) a year.
The US average electricity price is about $0.12 per kWh, so 0.12 × 31.2 kWh = $3.74 per year.
As an example, a Corsair RM 550W is $110 on Newegg right now and an RM 1000W is $160, so you paid $50 more to save $3.74 per year. You would need to run that unit for over 13 years (13 years, 19 weeks, 1 day, 6 hours, and 15 minutes, give or take) just to break even. Efficiency at 50% load is a terrible reason to buy a larger PSU. Now, if you were to later upgrade your rig to need 600 W, then sure, the 1000 W was the better buy, but I'd say go for the RM 650 at $120 (currently $110 on sale) and you'd be fine.
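For anyone who wants to check the math, here's the same back-of-the-envelope calculation as a quick Python sketch. The prices, the 15 W saving, and the 40 hours a week are the assumptions above, not universal numbers.

# Break-even math from above: 15 W saved, 40 gaming hours/week, $0.12/kWh,
# and $50 extra for the RM 1000W over the RM 550W.
watts_saved = 15
hours_per_week = 40
price_per_kwh = 0.12
extra_cost = 160 - 110  # dollars

kwh_saved_per_year = watts_saved * hours_per_week * 52 / 1000   # 31.2 kWh
dollars_saved_per_year = kwh_saved_per_year * price_per_kwh     # ~$3.74
years_to_break_even = extra_cost / dollars_saved_per_year       # ~13.4 years

print(f"{kwh_saved_per_year:.1f} kWh/yr saved -> ${dollars_saved_per_year:.2f}/yr")
print(f"Break-even on the extra ${extra_cost}: {years_to_break_even:.1f} years")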
Also, speaking from experience with buying for a more power-hungry system down the road: be careful what you wish for. I bought my PSU when I only had a 500 W load so I could get bigger, badder hardware later. I eventually upgraded to a 950 W (peak, overclocked) load and was a happy benchmark junkie for a while, only to realize later that I would have been just fine on a 600 W load at 1080p (480 SLI; I took a card out and didn't even notice). So essentially, for 3.5 years I ran my rig with an extra 350 W of peak load that was completely unnecessary. For the curious, I cost myself an extra $300 for the card up front, plus about 2.5 megawatt-hours of energy use, which was another $305.76. That's $605.76 down the benchmark-junkie tubes, and the gaming experience at 1080p was identical.
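Same kind of back-of-the-envelope check on that $305.76 figure, assuming the spare card added about 350 W at peak, gaming 40 hours a week for 3.5 years at $0.12/kWh (all figures from the story above, rounded):

# Rough check on the wasted-energy cost; every input here is an assumption from the post.
extra_watts = 350
hours_per_week = 40
years = 3.5
price_per_kwh = 0.12

kwh_wasted = extra_watts * hours_per_week * 52 * years / 1000  # ~2548 kWh, i.e. ~2.5 MWh
energy_cost = kwh_wasted * price_per_kwh                       # ~$305.76

print(f"{kwh_wasted:.0f} kWh wasted -> ${energy_cost:.2f} in electricity")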
Just a cautionary tale for those wanting to go down that route.