
Will there ever be a need for a 128-bit CPU in your computer?

Do you think we will we ever see a 128-bit general purpose CPU?


  • Total voters: 163
While these instructions are super-wide, isn't the result always 64-bit? Effectively this would keep the CPU as 64-bit. I'm not challenging here, I just don't know much about these instructions.

You can have a full 256 bits for a single scalar value ... as long as it is an integer.
Doubles are packed into vectors ... when used as scalars, instructions read the lowest 64 bits.
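To make that concrete, here's a minimal sketch (my own illustration, not from the posts above): even on 64-bit hardware, 128-bit integer math is routinely composed in software from 64-bit halves, which is one reason wide scalar results don't require a "128-bit CPU".

```python
# 128-bit addition built from 64-bit halves, the way compilers lower it.
# Values are represented as (hi, lo) pairs of 64-bit words.
MASK64 = (1 << 64) - 1

def add128(a, b):
    """Add two 128-bit values given as (hi, lo) 64-bit pairs."""
    lo = (a[1] + b[1]) & MASK64
    carry = (a[1] + b[1]) >> 64          # 1 if the low words overflowed
    hi = (a[0] + b[0] + carry) & MASK64
    return hi, lo

# Adding 1 to 2**64 - 1 carries into the high word.
print(add128((0, MASK64), (0, 1)))  # → (1, 0)
```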
 
Actually there will be. That's a fact. And we are all going to be here when it happens. Standard processor word length will be 4Gigabyte, CPU speed will be measured in MHz and our RAM memory will have up to 8Terabyte. It may sound sooo strange, but by the end of the year 2030, it WILL be like that :) Can't wait ^_^ Of course, computers will be much different than today.. it's enough to say that one unit for storing one bit data will physically be smaller than one atom. Now it's just a science fiction, but on the other hand it's absolute fact that it will be like that in 15 years. :)
 
I say yes, and I hope soon. 386DX was 32 bit, Pentium was 32/64 bit, we are still at 64 bit... not a lot of progression. Either go Risc, or go big.....
 
I say yes, and I hope soon. 386DX was 32 bit, Pentium was 32/64 bit, we are still at 64 bit... not a lot of progression. Either go Risc, or go big.....


when abouts do you think we're going to hit the limits of 64 bit? how many years?
 
when abouts do you think we're going to hit the limits of 64 bit? how many years?
In the server world it isn't going to take long at all.
 
Before we begin: necro thread, anyone?

I say yes, and I hope soon. 386DX was 32 bit, Pentium was 32/64 bit, we are still at 64 bit... not a lot of progression. Either go Risc, or go big.....

There aren't two types of cars. There aren't two types of planes. There aren't two types of pets. We have different types of processors for a reason.

RISC is an instruction set, not a reference to bus sizes. ARM is in fact releasing a 64 bit RISC processor.

Everything else wrong with this statement can be summed up simply; you don't have the slightest inkling of what we have today. Most programs use less than 4 GB of addressable memory. Most programs still run on 32 bit instructions. No hardware can currently support more than 64 GB of RAM per CPU. The benefits to having a 128 bit wide bus are zero, as we can't fully use a 64 bit one. Considering that traditional computing isn't likely to last another two decades, I'd be hard pressed to say 128 bit will ever come about. It'd be like someone in 1990 conjecturing that the CPU and PCH would be the only two chips a motherboard really required to run in 2010. It would have been insane at the time, yet it seems like a reasonable conclusion now.

My money is that by the time a 128 bit bus is useful our paradigm for computing will have outmoded the notion.

Actually there will be. That's a fact. And we are all going to be here when it happens. Standard processor word length will be 4Gigabyte, CPU speed will be measured in MHz and our RAM memory will have up to 8Terabyte. It may sound sooo strange, but by the end of the year 2030, it WILL be like that :) Can't wait ^_^ Of course, computers will be much different than today.. it's enough to say that one unit for storing one bit data will physically be smaller than one atom. Now it's just a science fiction, but on the other hand it's absolute fact that it will be like that in 15 years. :)

I'm not even sure what the heck I just read.

CPUs are currently measured in GHz, not MHz (that's a factor of 1000 greater, or 1 GHz = 1000 MHz). They have not been measured in MHz since the 90s.

A word is a standardized number of bits. It, by definition, cannot change length.

We could theoretically address 2^64 bits in the current generation of hardware, which translates to 2 Exabytes of memory. This is significantly more than 8 TB.

A bit being smaller than an atom? I don't know how this relates to anything.



Edit:
64 bit would offer 2^64 bits of address space. We can currently use about 2^16. Tell me why we need to start worrying about 2^128 when we've still got 2^48 untapped.
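The address-space arithmetic is easy to check directly. A quick sketch of my own, showing both the bit-addressed reading used in the post above and the more common byte-addressed convention:

```python
# Bit-addressed reading (as in the post): 2**64 bits is 2 exbibytes.
bits = 2 ** 64
print(bits / 8 / 2 ** 60)   # bytes, then EiB → 2.0

# Byte-addressed reading (the usual convention): 2**64 bytes is 16 EiB.
print(2 ** 64 / 2 ** 60)    # → 16.0
```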

Edit:
Solaris17 brings up an excellent point. Servers do access significantly more resources. If you could link 2-4 processors you'd either need to treat them as independent entities, or use a significant portion of the addressable space. I don't know of anything that can currently do this, but it'd be foolish to not look toward Google or Amazon to pioneer something like this.

Of course, the extra RAM is but one component that needs a 128-bit width. Processors themselves would still be running the same smaller operations, with most of the extra width full of placeholder values.

I can't maintain this argument for long, as it leads back to the more distant future of computing. Extra RAM sounds great, but it's something we shouldn't be looking at yet. We've still got plenty of room to grow in the 64 bit hardware iteration.
 
Yeah, I'd say one day there will be, unless there's a huge breakthrough in quantum computing which changes the game and the way a CPU works, rendering old technology and binary a thing of the past, and we'll have to emulate bits and bytes using the new tech.
 
Do you think we will we ever see a 128-bit general purpose CPU?

The answer is mostly yes, but I think it won't come any time soon.
 
We already had AltiVec, and that was 128-bit over a decade ago. But programs had to be coded to take advantage of it, and the entire CPU wasn't 128-bit.

Most professional programs were coded for it but OS and everyday stuff, nope. It was the only way Apple could get advantages over Windows back when it launched but those advantages eroded as IBM/Motorola let their processors slide and Intel just surpassed them.
 
when abouts do you think we're going to hit the limits of 64 bit? how many years?
Windows 95 was mainstream 32-bit adoption which debuted in 1995. Windows Vista was mainstream 64-bit and that debuted 2007. 12 years difference (or 10 years if you prefer to go with XP x64). Everything in computing tends to be exponential so 10^2 to 12^2 years from 2005/2007. If the trends continue, we'll be seeing mainstream 128-bit processors around 2105-2151. It won't happen in most of our lifetimes.
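The extrapolation above is easy to reproduce; a quick sketch of the post's own assumption (the adoption gap squares each generation), not an established trend:

```python
# 32-bit mainstream: 1995 (Win95). 64-bit mainstream: 2005 (XP x64) or
# 2007 (Vista), a 10-12 year gap. Square the gap for the next step.
gap_low, gap_high = 10, 12
print(2005 + gap_low ** 2, 2007 + gap_high ** 2)  # → 2105 2151
```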
 
https://en.wikipedia.org/wiki/64-bit_computing#64-bit_processor_timeline

The AMD64 instruction set we're using today debuted in 2003 with AMD Athlon 64 (Clawhammer). It took 10 years for AMD64 to become mainstream and it will take probably another 10 years before 32-bit programs become a rarity (like 16-bit was in the 2000s).

Pretty much all commercial programs are available for AMD64 except games. Games are rapidly going to catch up with PCs now that PS4 and Xbone are AMD64.

That's what I have in my backup rig, a 3200+ Clawhammer. Surprisingly, it's snappier than some modern 64-bit CPUs. I just can't figure out why.
 
What operating system? XP on late single-core processors is really fast.
 
I thought there was supposed to be a 128-bit version of Win8... hmmm...
And hopefully it will be a RISC CPU...
 
We already had AltiVec, and that was 128-bit over a decade ago. But programs had to be coded to take advantage of it, and the entire CPU wasn't 128-bit.

AltiVec could handle 128-bit mathematical operations by using its 128-bit vector store plus ALU, but like most CPUs, it does not live in a 128-bit memory space. In fact, Apple's PowerPC G5 was the first Apple PowerPC CPU that supported 64-bit memory addresses; before that they ran in a 32-bit memory space.
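One way to see that vector-width vs. address-width distinction from a running program (a minimal sketch; assumes CPython, where the `struct` format "P" is the platform's native pointer size):

```python
import struct

# "P" is the size of a C pointer in this process. It tracks the ADDRESSING
# mode (4 bytes on a 32-bit build, 8 on a 64-bit build), regardless of how
# wide the CPU's vector registers are (AltiVec/SSE: 128-bit, AVX: 256-bit).
pointer_bytes = struct.calcsize("P")
print(pointer_bytes * 8, "bit address space")
```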

Edit:
64 bit would offer 2^64 bits of address space. We can currently use about 2^16. Tell me why we need to start worrying about 2^128 when we've still got 2^48 untapped.

I'm sitting on 5.76GB used? That's definitely more than 2^16, considering 2^16 is 16-bit memory... 64KB... I think you mean 32, not 16. ;)
 
I'm sitting on 5.76GB used? That's definitely more than 2^16, considering 2^16 is 16-bit memory... 64KB... I think you mean 32, not 16. ;)

You are correct, my apologies. For some bass ackwards reason I didn't further convert from GB. Stupid mistake on my part.
 
What operating system? XP on late single-core processors is really fast.

Yes, but I can also run Seven without problems. With a nice graphics card I can use Aero mode in Windows.
 
Windows 95 was mainstream 32-bit adoption which debuted in 1995. Windows Vista was mainstream 64-bit and that debuted 2007. 12 years difference (or 10 years if you prefer to go with XP x64). Everything in computing tends to be exponential so 10^2 to 12^2 years from 2005/2007. If the trends continue, we'll be seeing mainstream 128-bit processors around 2105-2151. It won't happen in most of our lifetimes.

That may not be correct - not enough points.
With Intel's x86 timeline, where we have more points:

4 bit cpu: 1971
8 bit cpu: 1972 (1 year)
16 bit cpu: 1978 (6 years)
32 bit cpu: 1985 (7 years)
64 bit cpu: 2004 (19 years)

The years in between are what we look to approximate with an exponential function.
The guesstimated base of the exponential function would be around 2.67:

2.67^0 = 1 year
2.67^1 = 2.67 years (more than 50% error - 16 bit cpu sample is off but this is the best we can do)
2.67^2 = 7.1289 years - close enough
2.67^3 = 19.034163 years - close enough

so we can expect 128 bit architecture in 2.67^4 = 50.82121521 years from '04
That's about 40 more years.
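The fit above can be reproduced in a couple of lines; a sketch of my own, using the post's assumed form (the gap before step n is roughly b^n years with b ≈ 2.67):

```python
# Years between Intel's 4→8, 8→16, 16→32, 32→64 bit transitions.
gaps = [1, 6, 7, 19]
b = 2.67  # the post's guesstimated base

# Model: gap_n ≈ b**n. Compare model to the observed gaps.
for n, observed in enumerate(gaps):
    print(f"b^{n} = {b ** n:.2f} years (observed: {observed})")

# Next transition: b**4 years after the 2004 64-bit step.
next_gap = b ** 4
print(round(next_gap, 1), "years →", 2004 + round(next_gap))  # 50.8 years → 2055
```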
 
@BiggieShady Touché! However, I think that as transistor densities increase, that might even be a pretty liberal extrapolation, but we'll see. The point is, 64-bit is here to stay in the consumer market for quite some time, because the cost of handling 64 bits' worth of addresses and having DIMMs to match is no cheap (or easy) task.

I'll agree though, 40 years seems a reasonable guesstimate unless something groundbreaking changes the game.
 
That may not be correct - not enough points.
With Intel's x86 timeline, where we have more points:

4 bit cpu: 1971
8 bit cpu: 1972 (1 year)
16 bit cpu: 1978 (6 years)
32 bit cpu: 1985 (7 years)
64 bit cpu: 2004 (19 years)

The years in between are what we look to approximate with an exponential function.
The guesstimated base of the exponential function would be around 2.67:

2.67^0 = 1 year
2.67^1 = 2.67 years (more than 50% error - 16 bit cpu sample is off but this is the best we can do)
2.67^2 = 7.1289 years - close enough
2.67^3 = 19.034163 years - close enough

so we can expect 128 bit architecture in 2.67^4 = 50.82121521 years from '04
That's about 40 more years.
Remember, Itanium debuted in 2001 so it could be even sooner. I'm thinking 128-bit processors can show up in as little as 40 years in specialized uses but mainstream support could take up to 144 years. Even though x86-64 made its debut back in 2003, there's still a large number of 32-bit devices still being sold today. At the same time, 16-bit devices are almost impossible to find. I'd argue that, until 32-bit is almost completely gone, 64-bit still hasn't been mainstream adopted.
 
Remember, Itanium debuted in 2001 so it could be even sooner. I'm thinking 128-bit processors can show up in as little as 40 years in specialized uses but mainstream support could take up to 144 years. Even though x86-64 made its debut back in 2003, there's still a large number of 32-bit devices still being sold today. At the same time, 16-bit devices are almost impossible to find. I'd argue that, until 32-bit is almost completely gone, 64-bit still hasn't been mainstream adopted.

Yes, but you could use 16-bit apps in 32-bit Windows and you got kind of "cut off" when 64-bit variants came around that ditched 16-bit support.

Do you suspect that 32-bit support will eventually get dropped in favor of a purely 64-bit OS, or will it take a 128-bit system that only supports 64-bit and not 32-bit to make that change (much like the transition from 32-bit Windows to 64-bit Windows)? I personally would like to see 32-bit go the way of the dinosaurs and have systems be running purely 64-bit code before 128-bit ever gets introduced. That's me though.
 
Way before 128-bit. They were already considering it with Windows 8.
 
If some alien acquired a flash memory chip, the alien would have no idea that it represented binary information.

If some alien acquired a flash drive... Really? We can send a tin can with a camera to Mars. An alien is probably laughing his ass off, right now, after reading that!
:laugh: (<not an Alien):D

BTW, I voted YES. Everything else is just conjecture. The future is unknown, therefore anything is possible. Take a good look at the past ten years. Then take a look at your current computer. No way you thought it could get this good, back then. BTW, I still have a Socket A, AMD 2600+ running Win 7. It was my daily driver until about 2 years ago, long story, it's awaiting my re-emergence from homelessness, it will be Crunching again, possibly even with an upgraded MB and a 3200 Barton! OC'd of course. :laugh:
 
Yes, but you could use 16-bit apps in 32-bit Windows and you got kind of "cut off" when 64-bit variants came around that ditched 16-bit support.
x86-64 supports 16-bit. Windows 64-bit does not.

Do you suspect that 32-bit support will eventually get dropped in favor of a purely 64-bit OS, or will it take a 128-bit system that only supports 64-bit and not 32-bit to make that change (much like the transition from 32-bit Windows to 64-bit Windows)? I personally would like to see 32-bit go the way of the dinosaurs and have systems be running purely 64-bit code before 128-bit ever gets introduced. That's me though.
I doubt it. To remove 16-bit and 32-bit would require a new 64-bit only instruction set. It could maybe happen when 128-bit is introduced but I doubt before then. The only way it would happen before is if 128-bit still appears distant and they can save a lot of money/power by removing the legacy support. I suspect Intel may be tempted to do this to compete with ARM but the cost of 16-bit and 32-bit compared to 64-bit is tiny so it is very difficult to justify doing that.
 
x86-64 supports 16-bit. Windows 64-bit does not.


I doubt it. To remove 16-bit and 32-bit would require a new 64-bit only instruction set. It could maybe happen when 128-bit is introduced but I doubt before then. The only way it would happen before is if 128-bit still appears distant and they can save a lot of money/power by removing the legacy support. I suspect Intel may be tempted to do this to compete with ARM but the cost of 16-bit and 32-bit compared to 64-bit is tiny so it is very difficult to justify doing that.

Sorry, I wasn't specific enough. I meant at the OS level. Do you think that Windows will abandon 32-bit before 128-bit comes around? x86-64 might be able to execute the instructions, but that doesn't mean the OS has to support them, much like how 64-bit Windows doesn't "support" 16-bit apps.

I guess the idea is: will they disincentivize making 32-bit apps now that 64-bit has taken a large foothold? I think they should, but this is talking from a software level and not hardware.

If we were talking about x86 itself, that's a completely different can of worms to be opening. :P
 
I think Microsoft could. 20 years from now, it will be difficult to argue 32-bit is relevant anymore. I don't see anyone defending 16-bit these days and it was kicked to the curb 20 years ago.

There is more incentive to eliminate legacy support in software than in hardware. Microsoft literally has to provide binaries for both versions of everything. That's burdensome, which is why they decided to axe 16-bit back in XP x64.
 