Review Websites Discover AMD Driver Reduces Image Quality
PC gaming enthusiasts understand that image quality (IQ) is a critical part of the PC gaming experience. They frequently upgrade their GPUs to play the latest games at high frame rates, while also dialing up the display resolution and graphical IQ effects to make their games both look and play great. Image quality matters; if it didn't, we'd all be playing at 10x7 with no AA!
Important Benchmarking Issues and Questionable Optimizations
We are writing this blog post to bring broader attention to some very important image quality findings uncovered recently by top technology Web sites including ComputerBase, PC Games Hardware, Tweak PC, and 3DCenter.org. They all found that changes introduced in AMD's Catalyst 10.10 default driver settings caused an increase in performance and a decrease in image quality. These changes in AMD's default settings do not permit a fair apples-to-apples comparison to NVIDIA default driver settings. NVIDIA GPUs provide higher image quality at default driver settings, which means comparative AMD vs. NVIDIA testing methods need to be adjusted to compensate for the image quality differences.
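The sites' exact testing methodologies are not described here, but one common, objective way reviewers quantify image-quality differences between two drivers is to capture matched screenshots and compare them numerically, for example with PSNR (peak signal-to-noise ratio). The sketch below is a hypothetical illustration of that idea using toy pixel data, not a reproduction of any site's actual method:

```python
# Hypothetical sketch: quantifying image-quality differences between two
# renders of the same frame using PSNR. Real comparisons would use full
# screenshots captured at identical driver/game settings.
import math

def psnr(reference, test, max_value=255):
    """Peak signal-to-noise ratio between two equal-size grayscale images,
    given as flat lists of pixel values. Higher means closer to reference."""
    if len(reference) != len(test):
        raise ValueError("images must have the same number of pixels")
    mse = sum((r - t) ** 2 for r, t in zip(reference, test)) / len(reference)
    if mse == 0:
        return float("inf")  # pixel-identical images
    return 10 * math.log10(max_value ** 2 / mse)

# Toy example: a reference render vs. one with slightly degraded filtering
# (here simulated as a uniform 3-level brightness shift).
reference = [100, 120, 140, 160] * 4
degraded = [p + 3 for p in reference]
print(round(psnr(reference, degraded), 2))  # prints 38.59
```

A higher PSNR against a common high-quality reference render indicates output closer to that reference; a driver change that trades IQ for speed would show up as a measurably lower score at otherwise identical settings.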