
MIT Graphene Chip Could Reach 1,000 Gigahertz

This could be used with GPUs too, right!?
 
1,000 Gigahertz

Do they plan to embed it in the human brain or something? Something like that is just a waste of time and money, because before we have those things in our homes we'll be dead or broke. Another useless technology for this age (as in the consumer part of it). Don't think any average Joe can afford a robotic arm, do you? My guess is that in the future we'll all have some kind of nuclear ("nuc-u-lar," as Bush said) generator in our backyard to feed our computers, and liquid nitrogen in everyday use to cool our processors.
 
I won't get excited until IBM or Intel prove it at a marketable scale. Running 1 transistor at 1 THz is good, but it isn't exactly useful until you have 2 billion of them at 1 THz. By the time that technology is implemented, 2 billion will be considered average/small.
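Just to put some numbers on why one fast transistor isn't a fast chip (my own back-of-the-envelope arithmetic, not from the MIT work): at 1 THz, a signal moving at the speed of light covers well under a millimeter per clock cycle, so coordinating billions of transistors at that rate is the hard part.

```python
# Back-of-the-envelope: how far can a signal travel in one 1 THz cycle?
C = 299_792_458            # speed of light, m/s
freq_hz = 1e12             # 1 THz = 1,000 GHz

period_s = 1 / freq_hz                 # one clock cycle, in seconds (1 ps)
distance_mm = C * period_s * 1000      # distance light covers per cycle, in mm

print(f"Cycle time: {period_s * 1e12:.1f} ps")
print(f"Light travels {distance_mm:.3f} mm per cycle")
```

Real signals in silicon propagate slower than light, so the practical budget per cycle is even tighter than this.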
 
They have until the 21st of December, 2021. Then the Mayan calendar will come to an end.
 
They have until the 21st of December, 2021. Then the Mayan calendar will come to an end.

Wasn't it 2012? :laugh:

I love bullshit.

Anyway, 1,000 gigahertz means jack when all it is is cycles through one transistor. It's just a piece of carbon with electricity running through it at that frequency.
 
Wrong and wrong. This will just be part of the cloud in 2035. :p

One day, possession of a non-cloud computer will be a crime.

I completely agree with you. PCs will become so powerful that they may become very dangerous... people will not be allowed to possess one; instead, they would rent a networked cloud PC. Let's just hope they don't control the content.
 
Can I play Crysis on this CPU?
 
Hmm, I wonder how much the first 1 THz processor will cost, and who will have the rights to it... hopefully AMD... That's right, I said it! GET BACK! GET BACK I SAY!

But it would then have crippling errata on the first few runs and take a while before it's fixed. :laugh:
 
Wasn't it 2012? :laugh:

I love bullshit.

You're absolutely correct. It's the 21st of December, 2012. I just had a typo moment. It was probably a subconscious move. I think it's bullshit that it's an apocalyptic date, but the Mayans weren't the ones who said it would be so. They just predicted that the Sun, Earth, and the centre of our galaxy would align, and that it would mark the beginning of the Golden Age of humanity or some crap like that. Unfortunately, the date draws near and we still can't prove whether the three will align or not.

Irrelevant, but have you seen the movie titled 'Knowing', starring Nicolas Cage? I won't spoil the movie for you, but I like the way things end up... *cough* BS.

----------

Anyways... if mankind really did land on the moon with shitty '086s and such, I'm sure a 1,000 GHz CPU with the right instruction sets and HP's A.I. memory-recall technology could power up a T-888. If they didn't land on the moon and it was all a hoax, they still managed to photoshop and video-edit with the same hardware and shoddily coded software.

Although I'm worried about what the human race has in store for itself, I eagerly await the technologies that will decide our world's future.

Crytek should be banned from the gaming industry for poor coding. Let's all get over whether new hardware can actually play Crysis well or not, together. :P
 
There is a laser-ranging retroreflector on the moon, placed there by the Apollo 11 mission. If you shine a laser beam at the moon, you'll get a response. Measure that length of time and you can figure out how far the moon is from the Earth (they are actually drifting apart).

We've been to the moon and back many times in the Apollo days. It is disgusting how easily people give in to conspiracy.
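The ranging arithmetic is simple enough to sketch: distance = speed of light × (round-trip time ÷ 2). The 2.56-second round-trip time below is an approximate typical figure I'm assuming, not a value from the post.

```python
# Rough lunar laser ranging illustration: halve the round-trip time,
# multiply by the speed of light, and you get the Earth-Moon distance.
C = 299_792_458          # speed of light, m/s

round_trip_s = 2.56      # approximate laser round-trip time to the Moon

distance_km = C * round_trip_s / 2 / 1000
print(f"Earth-Moon distance: about {distance_km:,.0f} km")
```

That lands near the accepted mean distance of roughly 384,400 km, which is exactly how the retroreflector measurements work.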
 
Yes, we have been to the moon, yay for us. ON TOPIC


What kind of program would be used to run these futuristic chips at full load? Guess they could be used in the supercomputer world to generate some kind of chaos theory for everything.
 
Yes, we have been to the moon, yay for us. ON TOPIC


What kind of program would be used to run these futuristic chips at full load? Guess they could be used in the supercomputer world to generate some kind of chaos theory for everything.

I'd imagine we could do, errr, things. Honestly, I don't even think there is a demand for this kind of CPU ATM, except for the US to calculate when their missiles won't work.
 
I'd imagine we could do, errr, things. Honestly, I don't even think there is a demand for this kind of CPU ATM, except for the US to calculate when their missiles won't work.

They could use it to calculate all kinds of things with already-existing workloads, now that I think about it. All the things that Roadrunner and all the IBM supercomputers do could be done in a fraction of the time; Folding@home would simply not exist because this thing would do it all.
 
They could use it to calculate all kinds of things with already-existing workloads, now that I think about it. All the things that Roadrunner and all the IBM supercomputers do could be done in a fraction of the time; Folding@home would simply not exist because this thing would do it all.

I think all our combined computing power amounts to more than one of these, and it still hasn't finished the job in FAH. Still, games etc. will only use so much of this processing power. If a game needs to complete 20,000 instructions in a second and this can do that using only a fraction of its power, then the rest is pretty much useless.
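Continuing that 20,000 instructions-per-second example with my own numbers (a notional 1 THz chip retiring one instruction per cycle; both figures are assumptions for illustration), the game would occupy a vanishingly small fraction of the chip:

```python
# How much of a hypothetical 1 THz, 1-instruction-per-cycle chip
# would a game needing 20,000 instructions per second actually use?
chip_ips = 1e12          # assumed instructions per second at 1 THz
game_needs = 20_000      # instructions per second the game needs

utilization = game_needs / chip_ips
print(f"Utilization: {utilization:.1e}")   # tiny fraction; the rest sits idle
```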
 
Yes, we have been to the moon, yay for us. ON TOPIC


What kind of program would be used to run these futuristic chips at full load? Guess they could be used in the supercomputer world to generate some kind of chaos theory for everything.

The dudes at weather forecasting could use one of these to better determine the path of hurricanes, for example, but without a doubt, the military are salivating over this, heavily...
 
The dudes at weather forecasting could use one of these to better determine the path of hurricanes, for example, but without a doubt, the military are salivating over this, heavily...

They probably funded it. Think about the NSA wanting this for brute-force password crackers.
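To put rough numbers on the brute-force idea (entirely my own illustration, not an actual NSA tool or a real cracking rate): even a machine testing one trillion guesses per second tears through short passwords but stalls on long ones, because the keyspace grows exponentially with length.

```python
# Hypothetical brute-force timings against printable-ASCII passwords
# at an assumed 1 THz-class rate of one trillion guesses per second.
rate = 1e12                              # guesses per second (assumption)

years = {}
for length in (8, 10, 12):
    keyspace = 95 ** length              # 95 printable ASCII characters
    years[length] = keyspace / rate / 86400 / 365
    print(f"{length} chars: {years[length]:.2e} years")
```

The jump from hours to millennia between 8 and 12 characters is the whole argument for longer passwords.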
 
The dudes at weather forecasting could use one of these to better determine the path of hurricanes, for example, but without a doubt, the military are salivating over this, heavily...

Indeed they are. Wouldn't be surprised if MIT got a nice fat research investment bonus from the US government in a few weeks' time.
 
Indeed they are. Wouldn't be surprised if MIT got a nice fat research investment bonus from the US government in a few weeks' time.

So that's where the stimulus package $$ are headed ... hmmm ...
 
That'd be cool if Intel/AMD both remade the 486 on a 22 nm die with all the necessary code extensions (MMX, SSE, AMD64, etc.), added a DDR3 memory controller, and released it for the heck of it, clocked at 1 THz! Could it outperform a Core i7 at 4 GHz?

But seriously, I hope to see the microprocessor industry continue the clock-speed race rather than the shrink-and-add-cores race.
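A toy way to frame the 486-vs-i7 question: throughput is roughly clock × instructions per cycle (× cores). The IPC figures below are made-up placeholders for illustration, not measured values for either chip.

```python
# Toy throughput model: billions of instructions/sec = clock (GHz) x IPC.
def throughput_gips(clock_ghz: float, ipc: float) -> float:
    """Crude peak throughput estimate, ignoring memory and I/O limits."""
    return clock_ghz * ipc

old_core = throughput_gips(1000.0, 0.5)  # assumed 1 THz 486-like core, low IPC
modern   = throughput_gips(4.0, 8.0)     # assumed 4 GHz i7, high aggregate IPC

print(old_core, modern)
```

In this crude model the 1 THz 486 wins on raw throughput, but in practice memory latency (which would not scale with the core clock) would likely eat most of that advantage.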
 