Originally Posted by FordGT90Concept
Uh, what? Consoles for the past decade have been using GPU cores that are virtually unchanged (except for the memory interface/management) from what's available for computers. I can't name the last time computers borrowed graphics technology from consoles -- likely because it was back in the '80s.
New computers have so much raw power that developers have little need to optimize their code, so they don't. Even the most intensive games these days rarely use more than half of a quad-core.
Consoles can get more out of their hardware because there are only a handful of hardware configurations to target. With computers there are billions of potential hardware combinations, and it's impossible to optimize for more than a few of them.
Do you not remember DX10's introduction? Those features were pioneered on the 360 with ATI's Xenos GPU (unified shaders, most notably), which is why everyone expected ATI's first DX10 card to be so amazing: they had a head start working with the architecture. It was quite some time before they brought that tech to the PC.
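
As an aside on the "impossible to optimize for everything" point quoted above, here's a rough sketch (my own illustration, nothing from FordGT90Concept's post) of what PC developers end up doing: probe the CPU's features at runtime and dispatch to whichever code path the machine can actually handle. The render_* functions are made-up placeholders, and __builtin_cpu_supports is a GCC/Clang extension for x86.

#include <stdio.h>

/* Made-up render paths, for illustration only. */
static void render_avx(void)    { puts("hand-tuned AVX path"); }
static void render_sse2(void)   { puts("baseline SSE2 path"); }
static void render_scalar(void) { puts("plain scalar fallback"); }

int main(void)
{
    __builtin_cpu_init();  /* populate the CPU feature flags (GCC/Clang, x86) */

    /* PC code can't assume one CPU, so it probes and dispatches... */
    if (__builtin_cpu_supports("avx"))
        render_avx();
    else if (__builtin_cpu_supports("sse2"))
        render_sse2();
    else
        render_scalar();

    /* ...whereas a console title would just call render_avx() directly,
       because every unit ships with identical hardware. */
    return 0;
}

The console equivalent is compiling the one hand-tuned path unconditionally, since every box has the same chip -- which is exactly where the extra efficiency comes from.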