I wrote up a brief summary (inspired by another thread, will find the link later) of why benchmarks can be misleading, and why first-hand experience is key to judging a video card's performance. I hope this can be stickied or made into an article here at some point. Enjoy.

*********************************************************

This is a short guide to the flaw behind benchmark comparisons of video cards. Most benchmarks (excluding the 3DMark series) report their results based on the minimum and maximum FPS (frames per second). This was OK back in the day, when no one saw anything wrong with more performance as an end goal - but it's flawed. Let me give you an example:

Video card one gives 10 FPS as a minimum and 100 FPS as a maximum - the average between these is 55.
Video card two gives 20 FPS as a minimum and 80 FPS as a maximum - the average between these is 50.

In the average review, card 1 is the winner by 5 FPS, or 10%. People would buy this card thinking it's faster. But wait a sec - what about the minimum FPS? 20 FPS is slightly laggy, but 10 FPS is half that speed, and totally unplayable. Some would correctly argue that card 1 is in fact the slower card! For the best gameplay, card 2 is the winner.

Some might find this revelation enough, but there's even more to it - what was the FPS doing between those two extremes? Was that minimum of 10 FPS a one-off, or was the score poor the whole way through the test, with one huge boost at some point?

As a basic breakdown, consider the following. Think of this as an FPS timeline, with samples spaced 10 seconds apart:

20 25 30 25 30 25 10 10 12 15 35 40 100

There is a 10 and a 100 in there, but most of the samples are below 30 FPS (the minimum framerate most gamers are happy with).

So which card is faster? Do we examine the minimum, maximum, or average frame rate when deciding what to buy? The simple answer: none of them.

Benchmarks need to be updated, and the slightly old game F.E.A.R. (First Encounter Assault Recon) had an in-game benchmark with the answer. This benchmark listed:

Minimum FPS
Average FPS
Maximum FPS

That's all three - but which one to look at? How about the other inclusion? It showed the percentage of frames below a threshold. What if benchmarks were like this?

% of frames below 30 FPS
% of frames below 50 FPS
% of frames below 75 FPS
% of frames below 100 FPS

This is far better, because we can see where the card lies in overall performance. Does it generally stay above 50 FPS for smooth gameplay? Does it lag briefly, or does it deliver smooth, consistent results? Say 95% of all frames were above 50 FPS but below 75 FPS - that's perfectly fine for all but the fussiest gamers. If a card gets 1% of its frames above 100 FPS but 80% of them below 30, that card could have won under the old system, but here it would be shown up as poor for most gaming.

This is a brief writeup, and it only deals with frames per second. Updates/extensions may come out at a later date - feel free to leave a comment if you want something expanded upon.
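To make the numbers concrete, here's a minimal Python sketch (standard library only, numbers are just the sample timeline above) you can run yourself. The "midpoint of min and max" line mirrors the simplified calculation from the card 1 vs card 2 example, next to the true average over every frame:

frames = [20, 25, 30, 25, 30, 25, 10, 10, 12, 15, 35, 40, 100]

print("minimum FPS:", min(frames))                       # 10
print("maximum FPS:", max(frames))                       # 100
print("average FPS:", round(sum(frames) / len(frames)))  # 29 - below 30!

# The naive midpoint of min and max, as in the card 1 example:
print("(min + max) / 2:", (min(frames) + max(frames)) / 2)  # 55.0 - looks great

Note how the 10-100 range alone suggests a midpoint of 55, while the card actually spends most of its time under 30 FPS.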
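And here's a rough sketch of the threshold report itself, in the same style. The thresholds match the list above; the frames_below helper is just a name I made up for illustration, not anything from F.E.A.R. itself:

def frames_below(frames, thresholds=(30, 50, 75, 100)):
    # Return the percentage of frames that fall below each threshold.
    return {t: 100 * sum(f < t for f in frames) / len(frames) for t in thresholds}

frames = [20, 25, 30, 25, 30, 25, 10, 10, 12, 15, 35, 40, 100]
for threshold, pct in frames_below(frames).items():
    print(f"% of frames below {threshold} FPS: {pct:.1f}%")

On the sample timeline this reports roughly 62% of frames below 30 FPS - exactly the kind of result the min/max figures alone would never reveal.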