
Graphene Processors

It's been a long, long time since I started a thread here, so first I'll just say it feels a bit strange being back on the page you use to create a thread.

Anyhoot, for a few years now we have been reading about the advancements in graphene. It will have many uses in the not-too-distant future, as it's one of, if not the, strongest materials known to mankind, with lots of other great properties. One of those properties is that it can be used to make processors (electrons flow through it extremely easily) that would be not only way faster than our current silicon chips, but also much smaller and far less power-hungry.

Raise your hand if you have an Amazon Fire for your TV. It is slightly bigger than my wallet (since I don't keep cash in there). That could be the size of your desktop many years down the road, or maybe even smaller: a thumb drive that you just plug into the back of your screen (if we even use screens by then).

Moving this along, I'm asking your opinion on how much longer we will have to wait for the first batch of graphene processors to hit the market as mainstream parts, capable of handling our everyday activities: reading this post, watching Netflix, playing games, catching up on news, etc.

Some of the older articles out there from 2014-2016 suggested that IBM might have one out by 2019, but that just does not seem realistic, as I have not seen any good articles backing that claim up now that a couple of years have passed.

I can only hope that by 2025 we can finally get our hands on this new tech. Let's face it, we aren't getting any younger.

Reported speeds for these chips are freakish, even at the low end of the spectrum. I have seen reports of them handling anywhere from 30 GHz up to 1 THz. I will take anything in that range for sure, though I would imagine most of you would be happy with just 6 GHz.
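To put those quoted numbers in perspective, here's a quick back-of-the-envelope comparison in Python. These are just the figures from the post above (30 GHz to 1 THz, against a hoped-for 6 GHz silicon chip), not measured results from any real device:

```python
# Rough speedup of the reported graphene clock targets over a
# hypothetical 6 GHz silicon part. Clock speed alone isn't performance,
# so treat this as napkin math only.
silicon_ghz = 6
graphene_low_ghz = 30
graphene_high_ghz = 1000  # 1 THz

low_speedup = graphene_low_ghz / silicon_ghz
high_speedup = graphene_high_ghz / silicon_ghz
print(f"{low_speedup:.0f}x at the low end, {high_speedup:.0f}x at the top")
# -> 5x at the low end, 167x at the top
```

Even the bottom of that range would be a bigger single-thread jump than the last decade of silicon combined, which is why the claims sound so freakish.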

It's really fascinating to think of all the possibilities. Imagine playing a game that looked 99%+ like real life, something that just isn't possible today because our GPUs can't handle the amount of processing it would take. Increase those speeds by 10,000x or more, though, and it just might be possible. But the artists making said games would have so much more work on their hands, so perhaps we would need a compromise... 98% sounds good.

Our monitors may also use graphene, as they are working on that as we speak. Graphene pixels... hmm, not exactly sure how that would work, but I would imagine 4K would look like Pong on the Atari compared to this tech.

So I say 2025. What year do you predict?
 

I'm guessing this is related to IBM's quantum CPU.
The last thing I read was that they sent out some sort of developer kits,
so that when the thing is ready there will be an adequate understanding of how to code for it.
Sending out developer kits sounds like they've made some advances, but I would not hold my breath,
especially because, from what I've read so far, they plan to build one for themselves first and just charge for it as a service.
I will be surprised if I see one on my desk in my lifetime.
 
You won't get graphene CPUs any time soon because graphene, while an excellent conductor, doesn't have a bandgap like silicon, which means it lacks the semiconductor properties needed to make a functioning transistor. Carbon nanotubes are an attempt to circumvent this by creating the necessary bandgap for a switching transistor.
 
I keep hearing about new materials, such as carbon nanotubes and graphene, being worked on by scientists for transistors. There are problems that need to be ironed out, but I do believe new materials will replace silicon, and that this, rather than piling more and more cores onto the average PC, is the way forward. I read an article a couple of years ago claiming that a carbon nanotube transistor has the potential to be 5 times faster than silicon while using 5 times less power. Here's the most recent article I have read on the subject:

https://arstechnica.com/science/201...s-push-up-against-quantum-uncertainty-limits/
 
There is one important detail to take into consideration: although these things might be perfectly feasible for the future of ICs, it's going to take an insane amount of cash and time for manufacturers such as TSMC to convert or develop their production lines in order to make these new chips viable for consumer use.
 
Quantum computing
Graphene
Optical ICs
Carbon nanotubes

the future is interesting :D
 
You won't get graphene CPUs any time soon because graphene, while an excellent conductor, doesn't have a bandgap like silicon, which means it lacks the semiconductor properties needed to make a functioning transistor. Carbon nanotubes are an attempt to circumvent this by creating the necessary bandgap for a switching transistor.
The "bandgap" problem (also called the "energy gap" or "on/off ratio" problem: the ability to shut off completely, needed for the clean high/low (1/0) differentiation of high-speed digital processing) is still a bit away from being solved. But graphene works fine right now for many analog electronics applications.
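A rough way to see why a zero bandgap kills digital switching: for a thermally-limited transistor, the best-case on/off ratio scales roughly like exp(Eg/kT), where Eg is the bandgap and kT the thermal energy. This little sketch is only that idealized thermodynamic estimate (the bandgap values are textbook figures; real devices do much worse):

```python
import math

# Best-case on/off current ratio from thermal activation alone:
# off-state leakage ~ exp(-Eg/kT), so on/off ratio ~ exp(Eg/kT).
# Ignores all real device physics; illustration only.
kT_eV = 0.0259  # thermal energy at ~300 K, in electron-volts

def max_on_off_ratio(bandgap_eV: float) -> float:
    return math.exp(bandgap_eV / kT_eV)

print(f"silicon  (Eg = 1.12 eV): ~{max_on_off_ratio(1.12):.1e}")
print(f"graphene (Eg = 0 eV):    ~{max_on_off_ratio(0.0):.1e}")
```

With Eg = 0, the ratio is exactly 1: the transistor never really turns off, which is why pristine graphene makes a great analog amplifier but a useless logic switch without an engineered gap.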

The price to produce it just needs to come down and that will happen, as all things do, once more products reach the market.

IBM shows smallest, fastest graphene processor (and note this was written in 2011).

I agree it will still be a while before we see graphene processors in our home computers and smartphones, but it's coming, unless black arsenic-phosphorus proves to be more viable.
 
It's all theory now, but I think that within 10 years we might have the first working graphene or phosphorene CPUs. As with every new technology, the first examples will be bought by the military-industrial complex, and a few years later they will become available to consumers.
 
It's all theory now

It's a bit more than theory. There are some functional tests being done. It's not like they are saying "hey, Graphene might exist and could be awesome," but I know, I'm splitting hairs. It's still a long way off for any normal consumer. ;)
 
I don't expect to see any graphene processors before 2025-2030; creating transistors out of graphene is just too difficult for mass production any time soon.
 
It's all theory now... .
...first examples will be bought by the military-industrial complex
It's a bit more than theory.
I agree. It is well beyond the theory stage; the proof is in the link I provided above, and again, that was from six years ago. I do agree, however, that it will first appear (if it hasn't already) in practical use through the military, which makes sense since the DoD's DARPA (Defense Advanced Research Projects Agency) is funding IBM's research.
 
I think it will only be used in specialized processors here and there, as I feel the industry has a different path for pushing processing power (maybe messing with time on a quantum level to make results output faster), and we will probably see it more in smaller, less complicated parts of electronics.
 
I think it will only be used in specialized processors here and there, as I feel the industry has a different path for pushing processing power (maybe messing with time on a quantum level to make results output faster), and we will probably see it more in smaller, less complicated parts of electronics.


Interesting video, lol. That stuff is so far beyond my comprehension level that I find parts of it unbelievable, mainly the "tapping into" parallel universes.
When he said we could steal their resources, I wonder if he thought of the consequences: if it is a parallel universe and we could steal from them, they could steal back what we stole from them.
 
I saw on The List TV show yesterday that hemp fibers would actually perform as well as, if not better than, graphene in semiconductors, and of course be MUCH cheaper to manufacture. Another plus is that it's completely biodegradable and sustainable.

In doing some fact checking, apparently there is something to it. https://www.google.com/#q=hemp+better+than+graphene?

On the segment they also said (and showed via a blowtorch) that hemp doesn't even burn, so I tend to think computer chips made of it would handle temperatures well too. That goes not just for compressed hemp products such as "hempcrete", an alternative to cinder blocks, but also hemp fibers used for insulation. I think we all know how valuable naturally fireproof construction materials would be, versus ones that have to be chemically treated just to be fire retardant.

I appreciate that scientists often think outside the box of what is possible, but they usually tend to pursue the most expensive methods, which are also sometimes less practical in other ways. I can't help but think it's because they are usually funded by large corporations that are more concerned with cornering a market than with really contributing to society and planet Earth.

I'm not advocating that tree huggers should be in control of scientific research, but clearly some balance is needed. This opens up all sorts of potential for humorous ads. I can just see the graphene scientists banging their heads in the lab over the obstacles they face, only to have a pot-smoking hippie walk in with a hemp-chip supercomputer laptop saying, "Why don't you just use weed, man?" It's like something straight out of the old Cheech and Chong movies.
 
Quantum computing
Graphene
Optical ICs
Carbon nanotubes

the future is interesting :D

Yup, and the best part is that it isn't just an interesting laboratory experiment anymore but a full-grown industry. You have companies tailoring these nano-materials in bulk, so basically any big brand can toy with them. Right now you find them in everything from computers to snowboards. :)
 