I don't know how the average was calculated, but I'd suggest normalizing each game to a per-game baseline before you calculate the average. Otherwise the games with very high frame rates end up contributing almost all of the weight to the average. Example:
Game 1: card 1 = 200 fps, card 2 = 160 fps, so card 2 is 20% slower (card 1 is 25% faster).
Game 2: card 1 = 40 fps, card 2 = 50 fps, so card 1 is 20% slower (card 2 is 25% faster).
Average for card 1 = (200 + 40) / 2 = 120 fps
Average for card 2 = (160 + 50) / 2 = 105 fps
That's a difference of roughly 14% in card 1's favor. Clearly they should be tied, yet the end result shows a clear advantage for card 1. Adding more samples won't fix that, so the average performance figure is not what it should be, if that is how the average was calculated. And judging by the results and what you said in post #6, I'm guessing that's how you did it.
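If it helps, here's a minimal sketch of what I mean, using only the numbers from my example above (the function names are mine, not anything from your spreadsheet): averaging raw fps lets the high-fps game dominate, while normalizing each game to a baseline card and taking the geometric mean of the ratios gives every game equal weight.

```python
from math import prod

# Per-game fps for two cards (the hypothetical numbers from the example above).
games = [
    {"card1": 200, "card2": 160},  # Game 1: card 1 is 25% faster
    {"card1": 40,  "card2": 50},   # Game 2: card 2 is 25% faster
]

def raw_fps_average(results, card):
    """Plain average of raw fps: the high-fps game dominates the result."""
    return sum(g[card] for g in results) / len(results)

def normalized_score(results, card, baseline):
    """Normalize each game to a baseline card first, then take the
    geometric mean of the ratios, so every game carries equal weight."""
    ratios = [g[card] / g[baseline] for g in results]
    return prod(ratios) ** (1 / len(ratios))

print(raw_fps_average(games, "card1"))            # 120.0
print(raw_fps_average(games, "card2"))            # 105.0 -> card 1 looks ~14% ahead
print(normalized_score(games, "card1", "card2"))  # 1.0   -> actually tied
print(normalized_score(games, "card2", "card1"))  # 1.0
```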
For example, that's why the HD4870 X2 is so far ahead of the GTX280 in your charts, when in reality it is not that much faster. I don't have the time to look at every card, but I'm sure the issue affects all of them.
I will gladly vote this as VERY useful once that changes, and at heart I truly think it is useful. But as it stands now, I don't think so.