Discussion in 'General Hardware' started by JunkBear, Aug 17, 2014.
about time they killed off copper tracks..
What u mean?
The whole thing is connected using silicon photonics instead of traditional copper wires.
Read more at http://www.iflscience.com/technolog...ne-billionth-second-could#MosDyMoMEdgCKZgd.99
Cool, I've been waiting for an optical replacement for traditional copper-etched circuit boards for some time. This is similar to fiber optics, JunkBear, only applied to circuit boards.
I found it interesting the guy's name is Fink. Was musing over Jeremiah Fink of BioShock Infinite. This is going to mean ultra-low-power, extremely cool-running components that will last a long time, and it will minimize efficiency lost to latency problems.
Killing off copper electrical connections is nice, but the bigger change that I can see is memristors. They've functionally chopped up an entire computer and put it into a single chip, even more so than an SOC.
I'd say that decreased latency between transmissions is fine, but you've still got semiconductors inside each chip transmitting data. This model removes the barrier between fast memory (RAM) and permanent memory (HDDs/SSDs). If you could do that, you'd have a system that only requires power during calculation, doesn't have to wait on memory refreshes before accessing data, and is inherently less wasteful because all memory is connected via one interface.
If modern systems could functionally remove the RAM, you'd either have a memory controller that completely frees up the PCH, or a chip that doesn't need a memory controller at all. Either way you're looking at huge efficiency gains. IBM seems to have done the math on that, and it's where they're getting the six-times-more-efficient figure.
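To make the "remove the barrier between RAM and storage" argument concrete, here's a toy average-memory-access-time comparison. All hit rates and latencies below are illustrative assumptions I picked for the sketch, not IBM's figures, but they show why collapsing a two-tier hierarchy into one uniform store can cut average access time even if the single tier is slower than DRAM:

```python
# Toy average-memory-access-time (AMAT) comparison between a conventional
# DRAM + SSD hierarchy and a hypothetical single-tier memristor store.
# All numbers are illustrative assumptions, not measured or vendor figures.

def amat(levels):
    """levels: list of (hit_rate, latency_ns) ordered fastest to slowest.
    Each level is only consulted on a miss at the previous level."""
    total, miss_prob = 0.0, 1.0
    for hit_rate, latency_ns in levels:
        total += miss_prob * latency_ns   # cost paid whenever we reach this level
        miss_prob *= (1.0 - hit_rate)     # fraction of accesses that fall through
    return total

# Conventional: 99% of accesses hit DRAM (~100 ns), the rest go to an SSD (~100 us).
conventional = amat([(0.99, 100), (1.0, 100_000)])

# Hypothetical unified store: one level, slower than DRAM but no tier below it.
unified = amat([(1.0, 250)])

print(f"conventional: {conventional:.0f} ns, unified: {unified:.0f} ns")
# Even a 1% miss rate to storage dominates the average in the two-tier case.
```

With these assumed numbers the two-tier average works out to 1100 ns against 250 ns for the unified store, so the occasional trip to slow storage is what kills the conventional hierarchy, not DRAM itself.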
All of this said, why does any of this matter? They're quoting a 2018 release for some devices to the market. We haven't even effectively managed to use more than four cores in our current programs (yes, there are some that do, but they're the exceptions to the rule), let alone a heterogeneous cluster of specialized cores. The crazy in me wants to see AMD latch onto this kind of thing, as they've pushed for exactly that. MS doesn't really have anything in the pipeline that would make use of this architecture.
I hate to say this, but it isn't the future of computing unless you can get idiots to use it. Makers and the internet of things are a nice little set of buzz words, but without reasonable pricing and actual market penetration you're dead in the water. IBM hasn't really struck me as particularly good at selling their products to the average person, least of all with a completely new device into an entirely undefined market. What I'm actually hoping for is the bastard child of this and ARM architecture. A phone that can actually get some talk time, act as a connected device, and not require charging every night would be a blessing.
Now, not saying it's bull...
but, well, yeah, I'm saying it's bull, I guess.
Hope I'm wrong though.
Personally I'm not a fan of IFLScience.
They keep posting videos of free-energy generators as true when anyone can see they're fake.
They seem to be one special shade of gullible.
I'm holding out for the phone version.
Yeah, memristors are huge in themselves. It seems to me all these changes might well reduce the quirks some systems have that others don't when it comes to gaming, because the design is more simplified and unified, i.e. more consistency. It would probably even minimize user error, at least on the hardware settings end anyway.
I can't help but think, though, that between the 2018 rollout for limited devices and the high introductory cost of a revolutionary new technology made in small quantities, only the business and small-device OEM sectors will see it for some time.
In fact it may be quite some time before it's ever cost-effective enough for mainstream use, and the conundrum there is that it usually takes early-adopter consumers to get things going enough to justify even small-scale mass production. Then you have the problem of parts lasting a long time in an industry that sees new advances, and thus new components, on a regular basis.
Another plus to limiting copper use, though, is that copper mining is a very toxic process that involves arsenic. Chile, home to the biggest copper mine in the world, produces over 30% of the global copper supply. They at one point wanted to shut it down, but the world powers convinced them to keep it going so as not to devastate the global economy and industries.
I had to muse over "The Internet of Things", because it sounds a lot like the birth of what eventually becomes Skynet in Terminator. It might start out as more like the G-Force story, where household gadgets transform and terrorize everyone. LOL
The used-to-Windows-compatibility idiot in me worries that "an open source-based OS that has yet to be developed" means it has even less chance of being used for mainstream gaming.
There are bits and pieces that may well be used, like not using copper,
but I can't see anyone moving much further away from our current model for processing information.
What would be needed as a catalyst to get something like this rolling is another space race or similar. The theory could be sound, and the ability to make something could potentially be available in, say, 3 years or so,
but for a company to actually start working on it they would need a guaranteed government contract, I would imagine.
And even then it wouldn't be available to the masses until it was a viable business model.
Right now, with the processing power we have as end users, we have more than we need. With X99 boards coming up and already on pre-sale with 8-core, 16-thread i7s, there really isn't anything that most end users do, or need to do, that could really push that to the limits. Hell, most offices and small companies could easily get by on a Core 2 Duo.
So all you have left of the masses are the enthusiasts, and I know some spend a LOT more money on their systems than I do, but others spend less. So if they wanted to sell it to the masses it would need to be pretty cheap. But then what, 4-6 years later, still no need to upgrade? Are all the staff, shareholders, and executives going to live off the interest?
I hate to think it, but the only way they could really make it viable is to use the tech in consoles and mobile phones.
The disposable market of phones is huge. The teens don't care what the phone is; all they care about is that it's the "in thing".
And then consoles keep rolling over every 6-8 years just so you can play the newest games.
The only way this tech would work for a personal computer is if they built artificial life expectancies into them.
So that's basically why I think it's bull.
No company can possibly go ahead and release something that can do so much more than we could possibly need it to do, and expect to be a viable and growing business.
Even as things stand, if you have a 1st-gen i7 there's not much reason for you to get a 4th-gen i7, other than that you want it.
I like seeing such things put into practice. While this sounds cool and whatnot, I think patience will be a virtue. With that said, I don't think anything in this article is worth arguing about. It's a prediction of the not so near future. That's all. With respect to the consumer market, less power consumption has been the demand (thanks to mobile devices) so we'll see more of that. With servers there has been a demand for more cores that consume less power so you see those changes in their respective markets.
I suspect we will see more of the same. Until any of these theoretical models actually gets prototyped and put into small-scale production, I will be hesitant to believe there will be any major change to the market in the next couple of years, and only then will we be able to determine what happens a couple of years after that, because things can change.
I think it's definitely going to get interesting, and likely disastrous for some. The machine only sounds good compared to what we have now; with the dawn of either decent, usable quantum processors, or possibly synapse-derived ones, it could well be a niche idea that gets superseded.
I would think they should ideally be aiming for a quantum-based, synapse-style architecture using photonic connections myself. I mean, if you're aiming for the next paradigm of computing you might as well reach for the best you can get, eh?