Before we begin: necro thread, anyone?
I say yes, and I hope soon. The 386DX was 32-bit, the Pentium was 32-bit with a 64-bit data bus, and we are still at 64-bit... not a lot of progression. Either go RISC, or go big...
There aren't just two types of cars. There aren't just two types of planes. There aren't just two types of pets. We have different types of processors for a reason.
RISC is an instruction set design philosophy, not a reference to bus width. ARM is in fact releasing a 64-bit RISC processor.
Everything else wrong with this statement can be summed up simply: you don't have the slightest inkling of what we have today. Most programs use less than 4 GB of addressable memory, and most still run on 32-bit instructions. Mainstream hardware currently tops out around 64 GB of RAM per CPU. The benefit of a 128-bit wide bus is zero when we can't fully use a 64-bit one. Considering that traditional computing isn't likely to last another two decades, I'd be hard pressed to say 128-bit will ever come about. It'd be like someone in 1990 conjecturing that the CPU and PCH would be the only two chips a motherboard really required to run in 2010. It would have been insane at the time, yet it seems like a reasonable conclusion now.
My money says that by the time a 128-bit bus is useful, our computing paradigm will have made the notion obsolete.
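If anyone wants to sanity-check the gap, here's a rough back-of-the-envelope in Python. It's just arithmetic on address-space sizes, nothing vendor-specific:

```
# Back-of-the-envelope address-space math.
GiB = 2**30
TiB = 2**40

addr_32bit = 2**32   # bytes a 32-bit pointer can reach
addr_64bit = 2**64   # bytes a 64-bit pointer can reach

print(addr_32bit // GiB)          # 4 -> why most programs live happily under 4 GiB
print(addr_64bit // TiB)          # 16777216 -> TiB of theoretical 64-bit space
print(addr_64bit // addr_32bit)   # 4294967296 -> 64-bit space is ~4 billion times larger
```

Until that factor of four billion starts to pinch, a 128-bit bus buys us nothing.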
Actually there will be. That's a fact. And we are all going to be here when it happens. Standard processor word length will be 4 gigabytes, CPU speed will be measured in MHz, and our RAM will go up to 8 terabytes. It may sound sooo strange, but by the end of the year 2030, it WILL be like that.

Can't wait ^_^ Of course, computers will be much different from today... it's enough to say that the unit storing one bit of data will physically be smaller than an atom. Right now that's just science fiction, but on the other hand it's an absolute fact that it will be like that in 15 years.
I'm not even sure what the heck I just read.
CPU clock speeds are currently measured in GHz, not MHz (a factor of 1000 greater; 1 GHz = 1000 MHz). They have not been measured in MHz since the 90s.
A word is a fixed number of bits for a given architecture. It is measured in bits, not gigabytes.
We could theoretically address 2^64 bytes with the current generation of 64-bit hardware, which translates to 16 exabytes of memory. This is significantly more than 8 TB.
A bit being smaller than an atom? I don't know how this relates to anything.
Edit:
64-bit offers 2^64 bytes of address space. Current hardware only implements about 2^48 of it (48-bit virtual addressing). Tell me why we need to start worrying about 2^128 when we've still got a factor of 2^16 untapped within 64-bit.
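Putting those numbers in one place (Python again; the 48-bit figure is the virtual address width current x86-64 parts actually implement):

```
# Headroom left in a 64-bit address space.
TiB, EiB = 2**40, 2**60

full_64bit = 2**64         # theoretical 64-bit address space, in bytes
implemented_48bit = 2**48  # what current CPUs actually wire up

print(full_64bit / EiB)                  # 16.0 -> 16 EiB, a far cry from 8 TB
print(implemented_48bit / TiB)           # 256.0 -> even the 48-bit subset covers 256 TiB
print(full_64bit // implemented_48bit)   # 65536 -> a 65,536x growth margin before 64-bit runs out
```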
Edit:
Solaris17 brings up an excellent point. Servers do access significantly more resources. If you could link 2-4 processors, you'd either need to treat them as independent entities or carve up a chunk of the shared addressable space. I don't know of anything that can currently do this, but it'd be foolish not to look toward Google or Amazon to pioneer something like it.
Of course, the extra RAM is only one component that would need a 128-bit width. Processors themselves would still be running the same smaller operations, with most of the extra width filled with placeholder values.
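To put rough, purely hypothetical numbers on the linked-processor idea (4 sockets at the 64 GB per CPU figure from earlier in the thread):

```
# Hypothetical numbers: how much of today's implemented address space a linked
# multi-socket box would consume if all its RAM were mapped into one flat space.
TiB = 2**40

sockets = 4                   # hypothetical 4-socket box
ram_per_socket = 64 * 2**30   # 64 GiB per CPU, the figure quoted earlier

total_ram = sockets * ram_per_socket
implemented_space = 2**48     # 48-bit virtual addressing on current hardware

print(total_ram / TiB)                # 0.25 -> 256 GiB of RAM in total
print(total_ram / implemented_space)  # ~0.001 -> roughly 0.1% of the implemented space
```

Even a linked multi-socket setup barely dents what 48-bit addressing already covers, never mind the full 64 bits.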
I can't maintain this argument for long, as it leads back to the more distant future of computing. Extra RAM sounds great, but it's not something we should be looking at yet. We've still got plenty of room to grow in the 64-bit hardware iteration.