Originally Posted by Leon2ky
I don't necessarily see this as good. Video card power requirements are getting just absolutely fucking ridiculous. Aren't you supposed to, you know, create more speed AND consume less power? If you ask me, the video card industry needs a solid bitch slap to the face.
Ummm, let's take CPUs as an example.
Single cores got hot and fast (Pentium 4 EE). Then CPUs went dual core with only a little more power draw but much better performance in applications that supported them, almost doubling processing power for less than a 25% increase in power (A64 X2s).
Then die shrinks came into play and better architectures were developed, with a renewed emphasis on performance per watt (65W TDP Conroes and 35W/65W AM2s).
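To put rough numbers on that performance-per-watt point, here's a toy calculation. All figures are illustrative placeholders, not measured benchmarks:

```python
def perf_per_watt(perf, watts):
    """Relative performance divided by power draw."""
    return perf / watts

# Hypothetical numbers: ~1.9x the performance for ~24% more power,
# roughly matching the "double the speed, <25% more power" claim.
single = perf_per_watt(1.0, 89.0)   # single-core baseline
dual   = perf_per_watt(1.9, 110.0)  # dual-core part

print(f"perf/W gain: {dual / single:.2f}x")  # ~1.54x better perf/watt
```

So even with a noticeable bump in absolute power, efficiency still improves by half again, which is the whole argument.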
What makes you think this will just make things more power-hungry? Especially when a VERY similar thing happened VERY recently on the CPU side, with just the opposite outcome.
We are at the end of single-core graphics cards (GX2s count as single). With SLI, Crossfire, and unified shaders, programs are already equipped to render in a massively parallel way, so there won't even be the same multithreaded-application transition that CPUs went through. When they add another core, even at a slower clock, the performance gains will be above 50% right off the bat. Add some memory bandwidth and it goes higher; get into second-generation multi-core graphics cards and TDPs will drop. It's not like they can really go much higher anyway (physical limits of the silicon and the PCB).
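As another toy calculation (the clock scale and parallel fraction are assumptions, not real GPU data), a second core at a reduced clock still clears that 50% bar when the workload is almost entirely parallel, as rendering is:

```python
def throughput(cores, clock_scale, parallel_fraction):
    """Amdahl-style estimate: the serial part runs on one core,
    the parallel part splits evenly across all cores."""
    serial = 1.0 - parallel_fraction
    return clock_scale / (serial + parallel_fraction / cores)

base = throughput(1, 1.00, parallel_fraction=0.95)  # single core at full clock
dual = throughput(2, 0.85, parallel_fraction=0.95)  # two cores at 85% clock

print(f"speedup: {dual / base:.2f}x")  # ~1.62x, comfortably above +50%
```

The point being: graphics is so parallel already that even a down-clocked second core pays off immediately, unlike early dual-core CPUs waiting on threaded software.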
Yes, I agree things are far too power-hungry, but being cynical and saying it will only get worse, when the evidence points the other way, is just wrong.