I secretly laugh at people who spend $400+ on a newly released video card. Even when they're my customers and I'm building them a computer with it.
Shhh....
Looking at your system specs.. do you laugh at yourself?
Dude, that's 5 G92s in there..
Anyways, my definition for buying graphics cards..
What I value is playing the game the way the developers intended it: with everything maxed out.
For my system, I buy a graphics card powerful enough to play the current most graphically demanding game with max visual settings, 2xAA and 16xAF, at my monitor's native resolution, averaging over 45fps with a 25fps minimum.
And.. it has to be lower than or equal to 400€.
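The rule above is basically a simple pass/fail check. Here's a rough sketch of it as a predicate (the function name and signature are mine, just for illustration):

```python
def meets_my_bar(avg_fps, min_fps, price_eur, budget_eur=400):
    """True if a card clears the bar: the most demanding current game,
    maxed out at native resolution with 2xAA/16xAF, averaging over
    45fps, never dipping below 25fps, and within budget."""
    return avg_fps > 45 and min_fps >= 25 and price_eur <= budget_eur
```

So a card that averages 60fps but dips to 20fps fails just as hard as one over budget.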
So I don't need 150fps, or 16xAA, or the fastest thing around, or other e-penis related stuff.
Just as I'll pay for a good seat in a good theater when I'm going to watch a good movie, on the PC I prioritize the best experience I can afford.
In my case, back in April 2009, that meant dual HD4890s for playing Crysis.
Looking back, it was a good choice. My system can still play every single game completely maxed out, and it's still a bit faster than a single HD5870.
This means I probably won't be changing my system until the end of 2010 or the beginning of 2011.
Unless DX11-only features like tessellation really kick in and become standard in newer games.
That would probably make me sell my HD4890s and buy a single HD5850, for example.
I'd trade a 70fps + 4xAA experience for 50fps + 0xAA + tessellated models any day.
I'm still hoping some games will take advantage of the good old tessellation unit that's been sitting in ATI's cards for four years, though.
If the tessellation unit in pre-DX11 cards is the same as the one in the X360, and developers are already using it on the X360, why not do it for all the pre-HD5000 cards?