Thanks mussels! I appreciate the explanation, but I really need to see this proven in games with empirical testing. We've seen frame time tests, but those mostly mean nothing since the frame time differences were imperceptible and didn't translate into FPS.
So while the claim of lower overhead may be true, it really doesn't seem to matter much at all in practice. At least not enough to justify recommending a specific brand for a low-end GPU.
I tried to start a discussion on this a while back and got nowhere, because I couldn't really put the problem I see into words.
Basically, if we run a benchmark (FRAPS recording, etc.) on different GPU hardware in a way that gets 100% repeatable results, we're still completely ignoring the CPU side of it, because everything else that runs during normal gameplay (physics, audio, AI, etc.) is no longer present. That means things like driver overhead, driver multithreading and the like simply don't show up in the commonly used benchmarks.
Even with built-in game benchmarks, people find they report higher FPS than the actual games for the same reason: they're scripted to run identically every time, so the load isn't indicative of real gameplay.
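To make the point concrete, here's a toy Python sketch (all numbers are made up for illustration, not measured from any real driver or game). It models frame time as the slower of the GPU and the CPU each frame, where CPU time is game logic plus driver overhead. With a scripted benchmark's light CPU load, two hypothetical drivers with different overhead land on the same FPS because both are GPU-bound; once real gameplay piles physics/audio/AI onto the CPU, the higher-overhead driver tips over into being CPU-bound and loses FPS:

```python
import random

def simulate_fps(gpu_ms, driver_overhead_ms, cpu_game_ms, frames=1000):
    """Toy model: each frame takes the max of GPU work and CPU work,
    where CPU work = driver overhead + game logic (physics/audio/AI)
    plus a little jitter. Returns average FPS. Purely illustrative."""
    total_ms = 0.0
    for _ in range(frames):
        cpu_ms = driver_overhead_ms + cpu_game_ms + random.uniform(0.0, 0.5)
        total_ms += max(gpu_ms, cpu_ms)
    return 1000.0 * frames / total_ms

# Scripted benchmark: barely any game logic running on the CPU.
# Both hypothetical drivers are GPU-bound, so overhead is invisible.
bench_low  = simulate_fps(gpu_ms=16.0, driver_overhead_ms=1.0, cpu_game_ms=2.0)
bench_high = simulate_fps(gpu_ms=16.0, driver_overhead_ms=3.0, cpu_game_ms=2.0)

# Real gameplay: physics/audio/AI eat CPU time, so the extra
# driver overhead now pushes frames past the GPU's frame time.
game_low  = simulate_fps(gpu_ms=16.0, driver_overhead_ms=1.0, cpu_game_ms=13.0)
game_high = simulate_fps(gpu_ms=16.0, driver_overhead_ms=3.0, cpu_game_ms=13.0)

print(f"benchmark: {bench_low:.1f} vs {bench_high:.1f} fps")
print(f"gameplay:  {game_low:.1f} vs {game_high:.1f} fps")
```

In this sketch the benchmark run shows zero difference between the two drivers, while the gameplay run shows a gap, which is exactly the problem: the scripted workload hides a cost that only exists when the CPU is actually busy.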