Friday, August 30th 2019

Researchers Build a CPU Without Silicon Using Carbon Nanotubes

It is no secret that silicon manufacturing is an expensive and difficult process that requires big investments and a lot of effort to get right. Take Intel's 10 nm node, for example: it was originally planned to launch in 2015, but technical difficulties delayed it until 2019. That shows how silicon scaling is becoming more difficult than ever, while costs keep rising steeply. Developing newer nodes is expected to cost billions of dollars more for the research alone, and that is not even counting the cost of setting up a manufacturing facility. To prepare for the moment when shrinking process nodes any further becomes financially and physically unfeasible, researchers are exploring new technologies that could replace silicon and possibly offer even better electrical properties. One such material (or rather, a structure made from it) is the carbon nanotube, or CNT for short.

Researchers from MIT, in collaboration with scientists from Analog Devices, have successfully built a CPU based on the RISC-V architecture entirely out of CNTs. Called RV16X-NANO, this CPU is currently only capable of executing a classic "Hello World" program. A carbon nanotube can behave as a natural semiconductor, but manufacturing inevitably produces a share of metallic nanotubes as well, and those cannot be switched off, so they disrupt logic circuits. Production poses further challenges because CNTs tend to deposit in random positions and orientations. The researchers from MIT and Analog Devices worked around this by building over large enough areas that, statistically, enough well-positioned tubes end up where they are needed.
The CPU is based on the RISC-V architecture; specifically, it is designed to handle 32-bit-wide instructions in a design with 16-bit-wide memory addresses. Since all stages of the CPU pipeline (Instruction Fetch, Decode, Register Read, Execute and Write Back) are 16 bits wide, the CPU is officially declared a 16-bit design. It uses 14,000 logic gates, such as AND and NOT gates, to form a fully functional design. Thanks to careful manipulation of the nanotubes, the researchers managed to pull off a 100% yield, meaning that all 14,000 gates worked correctly. The published waveform of the "Hello World" program executing isn't exactly Crysis, but given where this technology stands, it is quite an impressive achievement.
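As a purely illustrative aside on how such a core consumes standard 32-bit RISC-V instructions, the short Python sketch below decodes one instruction into its fields using the publicly documented RV32 I-type layout. The sample instruction (ADDI x1, x0, 42) and the helper name are illustrative choices, not code or data from the RV16X-NANO project.

# Educational sketch only: unpack a standard 32-bit RISC-V I-type instruction.
# Field layout follows the public RV32 base ISA; not code from RV16X-NANO.

def decode_itype(instr: int) -> dict:
    """Split a 32-bit I-type RISC-V instruction into its fields."""
    opcode = instr & 0x7F           # bits 6:0
    rd     = (instr >> 7)  & 0x1F   # bits 11:7  - destination register
    funct3 = (instr >> 12) & 0x7    # bits 14:12 - operation selector
    rs1    = (instr >> 15) & 0x1F   # bits 19:15 - source register
    imm    = instr >> 20            # bits 31:20 - 12-bit immediate
    if imm & 0x800:                 # sign-extend the immediate
        imm -= 0x1000
    return {"opcode": opcode, "rd": rd, "funct3": funct3, "rs1": rs1, "imm": imm}

# ADDI x1, x0, 42 encodes as 0x02A00093
print(decode_itype(0x02A00093))
# -> {'opcode': 19, 'rd': 1, 'funct3': 0, 'rs1': 0, 'imm': 42}

Even with 32-bit instruction words, the datapath that executes them can be 16 bits wide, which is why the chip is still classed as 16-bit.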
The technology isn't perfect yet. The chip ran at a very low clock speed of only 10 kHz, which means that your average CPU, clocked in the gigahertz range, is several orders of magnitude faster. Even with all of its flaws, this demonstration is an important achievement for the technology - a proof of concept. It shows that a working CPU can be manufactured from something that doesn't require silicon and may one day even surpass it. We just haven't perfected all of the bits and pieces required to bring CNTs to the same level of performance we already enjoy. Sources: Nature, ArsTechnica
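A quick back-of-the-envelope sanity check on the clock-speed comparison above; the 4 GHz figure is an assumed value for a typical modern desktop CPU, not a number taken from the paper.

import math

# Back-of-the-envelope comparison of raw clock speed only.
# 4 GHz is an assumed figure for a typical modern desktop CPU, not from the paper.
cnt_clock_hz = 10e3        # RV16X-NANO demo clock: 10 kHz
modern_clock_hz = 4e9      # assumed modern desktop CPU clock

ratio = modern_clock_hz / cnt_clock_hz
print(f"Clock ratio: {ratio:,.0f}x")                    # 400,000x
print(f"Orders of magnitude: {math.log10(ratio):.1f}")  # ~5.6

Clock speed alone is of course a crude yardstick, but it illustrates why "an order of magnitude" undersells the gap.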

40 Comments on Researchers Build a CPU Without Silicon Using Carbon Nanotubes

#26
FinneousPJ
notb
Your Ryzen is also many orders of magnitude faster than other CPUs made today. Computers aren't just the x86 machines we think about most of the time.
If you're around 40 or more, your PC CPU is multiple orders of magnitude faster than another PC CPU you've likely owned in the past.

Anyway, this is all irrelevant.
Even if these carbon CPUs can only do very simple stuff today, they'll catch up at some point. Maybe in 5, 10 or 50 years - we don't know that. But we know their theoretical limits are far beyond what silicon offers.
Well yes, I hope new technologies will catch up to and surpass current ones. I'm pointing out an error in the text: current CPUs are not just AN order of magnitude faster.
Posted on Reply
#27
Vayra86
Athlonite
I'm picking Quantum CPUs will be out for the consumer before this ever makes it to market
The logical way forward is more specialization. I don't think silicon is out the door anytime soon, despite all the progress made in that area. It will take several decades (!) for silicon to run out of juice, at least for the consumer market. And even then it will remain relevant for a large part of that market for another decade or more. There is also a point where 'enough is enough', and it's likely that sort of CPU is still easily made on silicon. We've actually already reached that point, as we notice our old CPUs do a pretty good job even today.

Within that time I do believe quantum computing and perhaps solutions such as CNT will slowly start doing 'things', but not for us lowly consumer peasants. Think cryptography, for example, and, in CNT's case, applications that rely on extremely low power use - what about space? - and aren't computationally intensive, or are less time-critical.

You have to consider the fact that we've become pretty good at building efficient CPUs by now, so any new material is going to have to compete with a highly fine-tuned set of best practices and architectures. But our CPUs also rely on several factors that these new materials won't rely on, or will rely on in a different way - temperature, power, packaging, etc.
Posted on Reply
#28
Wavetrex
Imagine the people that made the first computer with vacuum tubes, which was an order of magnitude faster and much quieter than the previous computers using mechanical relays:
"THIS IS THE FUTURE"

It was... for about 15 years.

It's absolutely amazing how far we've come in less than the lifetime of a single person.
Posted on Reply
#29
Prima.Vera
Athlonite
I'm picking Quantum CPUs will be out for the consumer before this ever makes it to market
This. Time to put the x86 dinosaur out of its misery tbh. It has been around for too many years; time for a new architecture and design based on quantum tech.
Posted on Reply
#30
medi01
Mephis
Gotta love how it only took 6 comments for someone to bash Intel on an article having nothing to do with them.
OP asked for it, ignoring how inadequately the word "thrash" describes the cited post.
Posted on Reply
#31
notb
Prima.Vera
This. Time to put the x86 dinosaur out of its misery tbh. It has been around for too many years; time for a new architecture and design based on quantum tech.
No, quantum CPUs will not replace x86. They don't support the instruction set, and they can't run at room temperature. And it makes no sense anyway.
Posted on Reply
#33
notb
DeathtoGnomes
have you heard about the magnetic cooling ???

https://www.tomshardware.com/news/permanent-magnetic-cooling-quantum-computers-tum,39586.html
Yeah, I've heard about multiple ways of cooling an object to near 0K. :-) That's a fairly common problem in physics.
Any way you do this, you need a lot of energy (and time!) to cool the chip to near 0K - not to mention making it stay there under load.

But that's hardly relevant in this discussion. Because even if you're able to buy and run a quantum computer, it's still not a solution for general computing. Quantum processors are being developed for particular problems at which they'll be vastly superior to deterministic chips.
They will in fact work as "hardware accelerators" for particular tasks (e.g. tensor cores, hardware RNG), not standalone machines.
Posted on Reply
#34
DeathtoGnomes
notb
Yeah, I've heard about multiple ways of cooling an object to near 0K. :) That's a fairly common problem in physics.
Any way you do this, you need a lot of energy (and time!) to cool the chip to near 0K - not to mention making it stay there under load.

But that's hardly relevant in this discussion. Because even if you're able to buy and run a quantum computer, it's still not a solution for general computing. Quantum processors are being developed for particular problems at which they'll be vastly superior to deterministic chips.
They will in fact work as "hardware accelerators" for particular tasks (e.g. tensor cores, hardware RNG), not standalone machines.
Relevant, really? The size of the cooling solution shown in the article matters; quantum computers currently require a huge, elaborate system. The article doesn't say exactly how much power is required, and I'm sure you know more than those on the research team, and judging from the photo it can fit in your bedroom closet, so it must need 440 at least.

If you're buying a quantum computer, you're not buying it to use for "general computing". LOL
Posted on Reply
#35
Athlonite
DeathtoGnomes
If you're buying a quantum computer, you're not buying it to use for "general computing". LOL
No, I'd be buying it to play Crysis LOL
Posted on Reply
#36
kapone32
Athlonite
No, I'd be buying it to play Crysis LOL
:laugh:
Posted on Reply
#37
notb
DeathtoGnomes
Relevant, really? The size of the cooling solution shown in the article matters; quantum computers currently require a huge, elaborate system. The article doesn't say exactly how much power is required, and I'm sure you know more than those on the research team, and judging from the photo it can fit in your bedroom closet, so it must need 440 at least.
It takes around 20 seconds to go from the article *you suggested* to the corporate website of this team:
https://kiutra.com
The photo in that article shows their current product - a research cryostat.
It's way too weak for a quantum processor, so we should treat its requirements as a lower limit (likely by a significant margin):
Power consumption: 6.6-7.2 kW
Cooling water: 6-9 l/min, 5-25°C


The cryostat IBM actually uses isn't much larger, but surely is a lot more advanced and expensive.
If you're buying a quantum computer, you're not buying it to use for "general computing". LOL
The idea that quantum computers will replace what we have today was suggested in an earlier post - you followed it.
Posted on Reply
#38
DeathtoGnomes
notb
It takes around 20 seconds to go from the article *you suggested* to the corporate website of this team:
https://kiutra.com
The photo in that article shows their current product - a research cryostat.
It's way too weak for a quantum processor, so we should treat its requirements as a lower limit (likely by a significant margin):
Power consumption: 6.6-7.2 kW
Cooling water: 6-9 l/min, 5-25°C


The cryostat IBM actually uses isn't much larger, but surely is a lot more advanced and expensive.



The idea that quantum computers will replace what we have today was suggested in an earlier post - you followed it.
yea, maybe you should have spent that 20 seconds before you replied to me; after all, I was replying to you, not to what everyone else said. Misunderstandings happen when speaking generically.
Posted on Reply
#39
notb
DeathtoGnomes
yea, maybe you should have spent that 20 seconds before you replied to me; after all, I was replying to you, not to what everyone else said. Misunderstandings happen when speaking generically.
Replying to Prima.Vera, I said quantum computers won't replace x86 because - among other things - they need very low temperatures.
You said, quoting this particular comment, that there's something called magnetic cooling. In the following post you've also mentioned the size being somehow compact.
Is all that true? Or maybe I misunderstood something?

Let's move to the subject you started: the cooling itself. I can talk about that. Do you at least acknowledge that it's extremely hard to run these computers?
What's your opinion? What was your goal when you decided to challenge me on cryostats?
And now you're backing off in such a rough way... :o
Posted on Reply
#40
DeathtoGnomes
notb
Replying to Prima.Vera, I said quantum computers won't replace x86 because - among other things - they need very low temperatures.
You said, quoting this particular comment, that there's something called magnetic cooling. In the following post you've also mentioned the size being somehow compact.
Is all that true? Or maybe I misunderstood something?
I was referring to what's in the article, not anything more. The image you posted is misleading. The image from the article clearly shows how small it is and after
notb
Let's move to the subject you started: the cooling itself. I can talk about that. Do you at least acknowledge that it's extremely hard to run these computers?
What's your opinion? What was your goal when you decided to challenge me on cryostats?
And now you're backing off in such a rough way... :eek:
You just picked a random product to use as a power reference? One which happens to be the largest unit they have. I'm surprised more people didn't call you out.

as for backing away, yea, real life is calling....
Posted on Reply