It's easier to make developers compile for 64-bit on consoles than it is on personal computers. I think developers would be thrilled to hear that the hardware handles 64-bit addressing, because as it is they waste weeks trying to squeeze every kilobyte out of console RAM. If they didn't have to worry about that, like those who develop for Mac, *nix, and Windows, it would be a huge weight off their shoulders. In short, the "next gen" consoles are going to be crippled right out of the gate, which makes PCs all the more appealing to develop for. In 5 years' time, I wouldn't be surprised if most games are shipping 64-bit binaries and consoles have to make the switch just to stay competitive. It's going to happen sooner or later, just as it did with the move from 16-bit to 32-bit.

When marketing calls a console "64-bit," it usually means the FPU(s) can handle 64-bit operations. It can also refer to register size. It says nothing about the processor's ability to address RAM. By that logic, most desktop CPUs would count as 128-bit, since their SIMD registers are 128 bits wide (x86 has no hardware quad-precision floats; the FPU tops out at 80-bit extended precision), yet nobody markets them that way. And just because the processor supports a feature doesn't mean developers actually use it.

That's because the GPU and CPU are sharing the same pool of RAM, with the GPU as the #1 customer, so for a fair comparison you have to look at GDDR5. They cut corners to save costs everywhere possible, and it makes life hell for developers trying to push the envelope.