Again with the screenshots up and down...
I clearly want to be called a fanboy again, and I know I will be, because I'm going to make another
educated argument that just happens to play in favor of Nvidia. But I know some intelligent people will be able to differentiate and appreciate the truth behind my words. I just didn't write this down before because I thought it was common knowledge; after so many posts talking about screenshots and taking them as dogma, I feel I have to explain it.
You can't compare image quality on still screenshots, especially anti-aliasing, because of its very nature. IQ features on graphics cards are designed for games and videos, which are composed of moving pictures, not still images. When gaming you are always moving, and even if you aren't, nowadays nearly all games apply camera sway to some degree, so no two consecutive frames are identical even when you are standing still. This does affect how the anti-aliased image looks: on frame 1 a given edge pixel might have a black value of 80% (after weighting all the fragments that compose that pixel), 50% on the next one (probably from almost the same fragments, but with different weights), then 25%, then 80% again, and so on. When this happens at 30+ fps, what you perceive is a much more anti-aliased image than any single frame shows. It's just a different way of doing it, and one that IMO doesn't hurt texture clarity and detail as much (because the per-frame color blending is not as pronounced*), but I guess that's my opinion.
*EDIT: I thought this could require further explanation. Look at the spotlight or chandelier 8xAA close-up pictures in the OP. You can clearly see that on ATI the color bleeds much further past the boundaries of the lamp. This makes the image appear smoother (with its pros and cons); Nvidia gets the same effect by doing the color blending as the frames go by, so the textures don't get as blurred, and the MOVING image ends up as anti-aliased as with the other technique. Of course ATI's image blends over time too, but that's redundant, since the colors are already blended within each frame. Exactly why this temporal blending keeps texture detail better than the other technique is complicated, but I think the screenshots speak for themselves, so this can be taken as fact.
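To make the per-frame blending idea concrete, here's a minimal sketch of one edge pixel's value being averaged over consecutive frames by the viewer's eye. The coverage numbers are the hypothetical ones from my example above, not measured data:

```python
# Minimal sketch: the viewer's eye integrating one edge pixel's black
# coverage over consecutive frames at 30+ fps. The per-frame values
# are the hypothetical ones from the example above, not measurements.

def perceived_value(frame_values):
    """Simple temporal average of one pixel's per-frame coverage."""
    return sum(frame_values) / len(frame_values)

# Frame 1: 80% black, frame 2: 50%, frame 3: 25%, frame 4: 80%
frames = [0.80, 0.50, 0.25, 0.80]
print(perceived_value(frames))  # ~0.5875, smoother than any single frame
```

A screenshot freezes exactly one of those four values, which is why it can't show you what the moving image actually looks like.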
Consider that many older LCDs can only display a few thousand colors natively and use this very technique over time to attain "true" color, the millions of colors. Given the purpose of anti-aliasing, comparing it in screenshots is almost as pointless as comparing DivX IQ in screenshots.
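That LCD trick is known as temporal dithering, or FRC. Here's a hypothetical sketch (a simple error-feedback version for illustration, not how real panel hardware implements it) of a 6-bit-per-channel panel alternating between two levels it can actually display to fake an in-between one:

```python
# Hypothetical sketch of temporal dithering (FRC): a 6-bit panel can
# only show levels in steps of 4 on an 8-bit scale (e.g. 196 and 200),
# so it alternates between them over frames to simulate level 198.
# This error-feedback scheme is an illustration, not real FRC hardware.

def frc_sequence(target, low, high, n_frames):
    """Pick low or high each frame so the running average tracks target."""
    out, err = [], 0.0
    for _ in range(n_frames):
        want = target + err           # target plus accumulated error
        pick = high if abs(high - want) < abs(low - want) else low
        err = want - pick             # carry the leftover error forward
        out.append(pick)
    return out

seq = frc_sequence(target=198, low=196, high=200, n_frames=8)
print(seq)                  # alternates between 196 and 200
print(sum(seq) / len(seq))  # 198.0, a level the panel can't show directly
```

Pause any single frame and you see 196 or 200, never 198; only the sequence in motion produces the intended color, exactly like the anti-aliased edges above.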
But yeah, you can all take the easy path and call me a fanboy. At this point I don't think I care. Every day I learn that most people don't care about the truth (or, as Tatty said, accuracy in the information); they just want to hear that their "something" is better, and by no means do they want to hear about its flaws. I still have this post in mind, as it pictures very well what I'm saying:
"Concerning [H], if you look at it, and I'm not going to go there now, they always have a negative for ATI, no matter what, even in praise. They hand out awards as they cut and slam them. Everything deserves criticism, as nothing is perfect, but to me, after seeing ATI fail at the high end and at AA, you'd think they'd have left any criticism out just the one time, which they didn't. Anyways, this isn't about [H], it's about ATI's superior AA. I hate seeing the edges crawl as I turn in game. Seeing the results of this guy's findings, it's clear that using ATI's current solution mostly eliminates that at 4xAA, whereas the nVidia solution doesn't, all the while taking a lesser performance hit doing so."
Sure, because a REVIEW shouldn't take everything that's true into account, good and bad, in order to describe and recommend a product. It should be based on feelings...