It said it somewhere in there; I assume when they say it plays at 34 fps it's an average, as that seems too low to be the high and too high to be the low. Thanks given.
@Shadow, were you expecting more than a 100% gain in benchies, or more fps in Crysis? We all know Crysis is coded poorly (at least that seems to be the popular opinion), so we shouldn't be surprised if even the latest and greatest struggles somewhat with it. It's not a lack of power, it's a lack of efficiency.
Indeed.
Oblivion can dog a GX2...how old is that now? EQ2 anyone?
Yeah, that's what I was thinking too, but Nvidia ALWAYS does this. They're playing the GTX and ULTRA game all over again: $500 and $600 cards for the next two years.
So? In today's world of PC gaming and custom computers, five hundred dollars isn't a shocking amount, especially for a GPU. There was a time when a new Voodoo card could cost you nearly $600.
Besides, there are plenty of cards currently on the market from both camps that can handle 98%+ of games with ease, and they'll all come down in price soon. It's not as if you don't have any options.
There's only three things I want to see these new cards do:
A) Have enough texture RAM, texture fill rate, and texture dump cycles to load all possible high-resolution textures, at appropriate distance and in a proper amount of time. I.e. no more 'late loading' visuals.
B) Have enough shader power to render shadows and shaded textures at appropriate distance and in a proper amount of time. Again, no more 'late loading' visuals.
C) Reduce stuttering overall through better interfacing between the card's core and memory, and between the card and the rest of the computer.
At this point, that's exactly what it looks like is going to happen. Neither camp has created a new architecture or changed the way GPUs are designed. We're still looking at shaders, RAM capacity, and RAM speed as the three main variables, and as we can see, ATI and Nvidia have basically just boosted these attributes.