
Will there ever be a need for a 128-bit CPU in your computer?

Do you think we will ever see a 128-bit general-purpose CPU?


It's funny, anyone remember Itanium? A CPU that could only run 32-bit when 16-bit was mostly predominant. Don't know why I thought of that... but still, either CISC or RISC, 128-bit is coming...
 
Sorry, I wasn't specific enough. I meant at the OS level. Do you think that Windows will abandon 32-bit before 128-bit comes around? x86_64 might be able to execute the instructions, but that doesn't mean the OS has to support it, much like how 64-bit Windows doesn't "support" 16-bit apps.

I guess the idea is: will they disincentivize making 32-bit apps now that 64-bit has gained a large foothold? I think they should, but this is talking from a software level and not hardware.

If we were talking about x86 itself, that's a completely different can of worms to be opening. :p


I'd conjecture that MS will kill support for 32 bit instruction sets well in advance of ever having a 128 bit processor. The math is relatively simple here.

Right now most CPUs (in dedicated PCs) out there run either the x86 or x64 instruction set. These instruction sets are shared between cheapo workstations, servers, and high-end gaming rigs. The real drag on retiring the 32-bit x86 instruction set is legacy device support and a lack of capability at the lower end of the hardware. Like it or not, Atom and previous-generation gaming systems didn't have the resources to run an x64 environment.

Knowing this, cue Intel upgrading the Atom (into the Celeron) and AMD releasing the APU. Both of these developments leverage enough resources to perform acceptably, while still offering complete x64 instruction sets. The APUs are basically the driving force of current generation video game systems, while Celeron and the APU are duking it out in the smart devices market. It wouldn't therefore be unreasonable to see the end of 32 bit instructions in the next 10 years. There'd be no devices that actually require them, and the small amount of resources they free up would basically offer a free performance boost.

The introduction of 128-bit instructions is insanely unlikely in the next ten years. I make this statement for two reasons. First, other people have shown the math related to the adoption of new instructions, so read the previous posts for their work. Second, who actually needs that much math? Right now the largest pure computing occurs on distributed networks of CPUs. The common CPU doesn't have to deal with exceptional mathematical loads, and the people who need that computational ability have a source for it. Whenever the average person actually needs 128-bit-sized computational abilities, we'll see the shift toward it. For now, it doesn't make sense for one person to drive a train to work when they could use a car to no ill effect. Having the resources to buy a train and lay track doesn't mean it's a good idea.


RISC is not included above because it barely makes sense to be using a 64-bit processor there today. I've never seen a phone with more than 4 GB of RAM, and other devices using RISC architectures have even fewer resources. Saying that it is possible, and being ready to pay $1200 for the next phone you buy, are two different things.


Side note:
4-bit processors are functionally dead. They might still be used as cheap DACs, but outside of specific applications they aren't common today.
8-bit processors are hobbyist fodder. Hello, learning platforms, specifically the likes of Arduino.
16-bit processors are in a murky place. They are used as DACs, but because of increased price, decreased features, and an obliterated code base they aren't as common as might be conjectured.
32-bit processors are ubiquitous. Most intelligent devices running ARM are lumped in here, along with newer hobbyist hardware.
64-bit processors are ubiquitous in the PC market.
128-bit processors have never been produced. There have been one-off systems with one 128-bit component, but we have never seen a true 128-bit processor.


It's funny, anyone remember Itanium? A CPU that could only run 32-bit when 16-bit was mostly predominant. Don't know why I thought of that... but still, either CISC or RISC, 128-bit is coming...

So, who runs Itanium today?

I remember Itanium as a substantial leap forward that landed on a rake and wound up with a huge bruise on its face. It ran 64-bit when 32-bit was common: http://en.wikipedia.org/wiki/Itanium

I'm betting your current system is x86-64, despite Itanium having come on the scene as a pure 64-bit design (IA-64) years ago.


It's also fallacious to say "it's coming," as if that were a justification for anything. You and I have an upcoming and inevitable death, which is coming. The death of our sun is coming. The extinction of our universe is coming. These things are inevitable, while 128-bit computers are not.
 
Intel's problem with Itanium is that it targeted a market that didn't really exist. If Intel reinvented the concept of Itanium to compete directly with, for example, ARM, it could really come into its own.
 
Intel's problem with Itanium is that it targeted a market that didn't really exist. If Intel reinvented the concept of Itanium to compete directly with, for example, ARM, it could really come into its own.

I concede this point.

Itanium was a big risk that wasn't properly supported. If Intel had thrown more weight behind it, we could have seen something great. My apprehension is that Intel doesn't come back to many ideas after getting burnt once.
 
Nehalem struck twice! XD

Intel also repurposed Larrabee from a GPU concept into an HPC product.

I think Intel knows x86 is long in the tooth. I think they also see the ominous threat from ARM. If they aren't quietly working on a new instruction set and architecture for it, I think they know they're digging their own grave. Intel can't rely on their process advantage forever. Imagine if Intel reinvented Itanium into a 128-bit processor that can handle four 32-bit, two 64-bit, or one 128-bit instruction at a time. It would be a wide processor but it could run at a low clock speed with very low voltage and heat output.
 
I've seen computers go from 4K of RAM to 4GB in less than 30 years. That's a million-fold increase. I think we could see similar increases in the next couple of decades. Maybe sooner if we get true molecular nanotechnology - the assembler.
 
I've seen computers go from 4K of RAM to 4GB in less than 30 years. That's a million-fold increase. I think we could see similar increases in the next couple of decades. Maybe sooner if we get true molecular nanotechnology - the assembler.


I've gone from 8-bit and 4MB on my first PC to 64-bit and 24GB on my current one.

Pretty sure we'll see some more increases.
 
No! Everyone knows you don't need that much power, just like how you don't need a hard drive more than 1GB
 
The Emotion Engine in the PS2 was 128-bit, wasn't it? I realise it's not the main CPU, but it was ubiquitous and general purpose.
 
But won't quantum entanglement take care of that?

The quantum entanglement effect is supposed to allow us to read a qubit without affecting it.
 
I think the question is vague, considering I'm wondering whether we're talking about memory-address widths or data-register widths. As it stands right now, I don't see us needing a 128-bit memory space for several decades. As far as higher-precision math done by CPUs, I think there might be an argument that some applications using fixed-precision math could benefit from an ALU that supports operations on integers exceeding 64 bits (a long integer) in one clock cycle. Generally speaking, those applications usually benefit more from floating-point math, but either way I don't see how 128-bit integer ALU support or a 128-bit memory space would be needed in the near future.
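To make the ALU point concrete: on a CPU whose widest integer add is 64 bits, 128-bit arithmetic has to be emulated limb by limb, carries and all, which is exactly the cost a native 128-bit ALU would remove. A minimal Python sketch of that emulation (Python's own ints are arbitrary-precision; the masking imitates fixed-width hardware registers):

```python
# Emulating 128-bit addition with two 64-bit "limbs", the way a compiler
# must on hardware whose ALU tops out at 64 bits.
MASK64 = (1 << 64) - 1

def add128(a_hi, a_lo, b_hi, b_lo):
    """Add two 128-bit values given as (hi, lo) pairs of 64-bit limbs."""
    lo = (a_lo + b_lo) & MASK64
    carry = 1 if a_lo + b_lo > MASK64 else 0  # carry out of the low limb
    hi = (a_hi + b_hi + carry) & MASK64       # wraps on 128-bit overflow
    return hi, lo

# 2^64 - 1 plus 1 carries into the high limb:
hi, lo = add128(0, MASK64, 0, 1)
print(hi, lo)  # → 1 0
```

Note the extra carry-propagation step: that is the work a one-cycle 128-bit add would fold into hardware.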

A lot of people say, "Well, people said that 640K of memory was plenty, and now look." But I would make the argument that CPUs and memory modules were a lot simpler then, with vastly fewer transistors. It's a lot harder to cram more onto a wafer than it used to be, and we need to keep in mind that, with respect to ICs, they can only get so big without yields going to crap, and they can only get so small before quantum tunneling (a real problem with Si-based semiconductors once a circuit is only ten or so atoms wide) sets in. All in all, progress has been good, but it's slowing, and we have to keep that in mind when attempting to predict the future. Even recent history has shown a slowdown, and it's hard to ignore that.
 
How is TPU not hosting online tech classes? You all are about to make my brain go ploosh.

Anywho, Intel was smacked in the face for bringing 64-bit into a 32-bit world too soon, but what if they built on the idea again? An industrial-type line of Pentiums made to take workloads differently.

Wouldn't it be easier to have mainstream motherboards support two, say, i5's or i3's than to change the bit system to 128? Do we really need another revolution going from 64 to 128? 90% of the applications I see or use still run 32-bit. I'd think it would be easier to buff up 64-bit power before bringing 128-bit into a 32/64-bit world. Where would the support be?

I have so many questions and Google has no answers for me. :(
 
I have so many questions and Google has no answers for me. :(
I would start by learning how memory addressing works. Without knowing that, you really can't say you know what you're talking about. Increasing the range of memory addresses only helps if you're already constrained by the width of the address bus, which is the case when you run out of addressable space (like the 4 GB limit on 32-bit systems, or the 64K address (not page) limit on 16-bit systems).

Considering we're only using a mere fraction of the 64-bit address space, I think we have quite a ways to go before we start running out of memory addresses to use for memory and memory-mapped I/O.

Edit: I should also note that most "64-bit CPUs" don't actually support a full 64 bits' worth of address space, but rather a sufficiently large portion of it to satisfy all current memory needs. It's also good to remember that x86(_64) systems address one 8-bit word per address, even if they support math operations on, say, longs (64-bit integers), so it's important to be clear you mean memory when saying "64-bit", because the term in and of itself is vague.
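As a quick illustration of the addressing math in the post above, here's a back-of-envelope Python sketch of how much a flat, byte-addressed bus can reach at various widths (the 48-bit row reflects the commonly implemented x86-64 virtual-address subset mentioned in the edit):

```python
# Bytes reachable by a flat address bus of a given width, assuming the
# usual 8-bit (one byte per address) resolution.
def addressable_bytes(bus_width_bits):
    return 2 ** bus_width_bits

for width in (16, 32, 48, 64):
    print(f"{width}-bit bus: {addressable_bytes(width):>26,} bytes")

# 16-bit: 65,536 bytes (the 64K limit)
# 32-bit: 4,294,967,296 bytes (the familiar 4 GB limit)
# 48-bit: 256 TiB (roughly what current x86-64 CPUs implement)
# 64-bit: 16 EiB (the full theoretical space)
```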
 
I would start by learning how memory addressing works. Without knowing that, you really can't say you know what you're talking about. Increasing the range of memory addresses only helps if you're already constrained by the width of the address bus, which is the case when you run out of addressable space (like the 4 GB limit on 32-bit systems, or the 64K address (not page) limit on 16-bit systems).

Considering we're only using a mere fraction of the 64-bit address space, I think we have quite a ways to go before we start running out of memory addresses to use for memory and memory-mapped I/O.
So what would the memory limit be on a 128-bit system?
 
It would be 2^128 bytes, ~340 trillion trillion terabytes.
Yes, on a machine that has an address resolution of 8 bits, which is practically every modern microprocessor and microcontroller. Just thought I should throw that out there.

Theoretically speaking, forcing 16-bit words and 16-bit address resolution would let any given address size double its total capacity without changing the amount of addressable space, since every address would represent 16 bits instead of 8. But that has much more widespread repercussions, as operating systems rely on 8-bit address resolution. Changing it would mean a fundamental change to how memory works if we continue to support data types like bytes and 8-bit chars. If I wrote assembly, I would have to completely change the way I thought about bytes if they were still supported in such a design, because you would have to manage storing two bytes per memory address unless you want to waste 50% of your addressable space. However, the same argument could be made of a boolean (a single bit): how do you efficiently store that in an 8-bit word without adding overhead or wasting space? That's a hard question to answer.
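A quick Python sanity check of the 2^128 figure quoted above, plus the address-resolution point: with byte (8-bit) resolution each address names one byte, while 16-bit words make each address name two bytes, doubling capacity for the same number of addresses.

```python
# 2^128 distinct addresses on a 128-bit bus.
ADDRESSES_128 = 2 ** 128
TB = 10 ** 12  # decimal terabyte

byte_resolution = ADDRESSES_128        # total bytes, one per address
word16_resolution = ADDRESSES_128 * 2  # total bytes, two per address

print(byte_resolution // TB)  # ≈ 3.4e26 TB: the "340 trillion trillion"
print(word16_resolution // byte_resolution)  # → 2: capacity doubles
```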
 
The Emotion Engine in the PS2 was 128-bit, wasn't it? I realise it's not the main CPU, but it was ubiquitous and general purpose.
http://en.wikipedia.org/wiki/Emotion_Engine
Contrary to some misconceptions, these SIMD capabilities did not amount to the processor being "128-bit", as neither the memory addresses nor the integers themselves were 128-bit, only the shared SIMD/integer registers. For comparison, 128-bit wide registers and SIMD instructions had been present in the 32-bit x86 architecture since 1999, with the introduction of SSE.
I think it is 32-bit, seeing how it takes two 32-bit operations to get a 64-bit value.
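The distinction the Wikipedia quote draws can be sketched in code: a 128-bit SIMD register does lane-wise arithmetic, so carries never cross a 32-bit lane boundary the way they would in a true 128-bit integer add. A minimal Python model of SSE-style lanes (lanes simulated with masking, not real SIMD hardware):

```python
# Four independent 32-bit lanes packed into one 128-bit value, SSE-style.
MASK32 = (1 << 32) - 1

def simd_add_4x32(a, b):
    """Add two 128-bit registers as four independent 32-bit lanes."""
    result = 0
    for lane in range(4):
        shift = lane * 32
        la = (a >> shift) & MASK32
        lb = (b >> shift) & MASK32
        result |= ((la + lb) & MASK32) << shift  # carry stays in the lane
    return result

a = MASK32  # lane 0 holds 0xFFFFFFFF, other lanes hold 0
b = 1       # lane 0 holds 1
# A true 128-bit add would carry into lane 1 (0x1_0000_0000);
# the SIMD add just wraps lane 0 back to zero.
print(hex(simd_add_4x32(a, b)))  # → 0x0
print(hex(a + b))                # → 0x100000000
```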
 
Maybe they'll pass 128 and go right to 256 ;)
 
Truth is, they did and do use 128-bit parts, but on the whole they are built to efficiently process 16/32/64-bit operations.
I have read that wiki, and I was only wrong in that it is in fact the main PS2 CPU. It did SIMD side by side with RISC-based processing, which is similar to what some here think 128-bit CPUs could evolve from, and certainly could start from.
 
For the consumer segment, I don't really see a reason to use anything more than 64-bit in my lifetime. We are talking a memory limit of 16 exabytes (exbibytes, but I'll keep it the old way to understand easier). That's 16,000,000 terabytes. We haven't reached even 1 TB of RAM, and we won't for quite some time, considering how long we needed to go from 256 MB to the current 16 GB. Other than that, for the consumer segment, more bits don't really bring anything other than more RAM support, which was a serious issue for 32-bit processors. But now that that limitation has been eliminated, 64-bit is here to stay for a while.
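The "64-bit is here to stay" argument can be made concrete with a rough extrapolation. A minimal Python sketch, assuming a two-years-per-doubling pace for consumer RAM (an assumed rate for illustration, not a law, and slower than recent history):

```python
# How many doublings from today's consumer RAM to the full 64-bit space,
# and how long that takes at an assumed doubling rate.
import math

current_bytes = 16 * 2 ** 30   # 16 GiB, the figure from the post above
limit_bytes = 2 ** 64          # 16 EiB, the full 64-bit address space
years_per_doubling = 2         # assumption

doublings = math.log2(limit_bytes / current_bytes)
print(doublings)                       # → 30.0
print(doublings * years_per_doubling)  # → 60.0 (years at that pace)
```

Even with generous assumptions, exhausting 64-bit address space is a multi-decade affair, which is the post's point.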
 
That's a large number..
Maybe they'll pass 128 and go right to 256 ;)
Check out Graham's number ;) A good punchline for "your mom" jokes.

I don't see how any of these posts are actually constructive. Don't post unless you have something to ask or contribute; otherwise, off-topic posts and posts that aren't thought through will just derail the thread. I don't know about other people, but I would like a serious, intelligent and thoughtful discussion on the topic.

I love discussing things with respect to CPU architecture, but I don't love having to deal with shenanigans while I do it. :)
 