
0,1 and the power of light computing

Joined
Mar 28, 2011
Messages
197 (0.04/day)
Location
Tolleson, Az
System Name Station 6
Processor I7 920
Motherboard MSI BB xpower
Cooling Custom water
Memory 12gb Gskill Pi
Video Card(s) xfx 7970
Storage 4x f60 ssds, 2x 300gb Raptor
Display(s) 24" Gateway
Case scratch built
Audio Device(s) creative fataliy pro
Power Supply 700w Toughpower
Well, to start, I'm far from being a genius, or really even having much knowledge of what I'm about to start a discussion about, but I can't stop wondering about this and I'm not too sure where to post it either. (This area looks good, I guess.)

So, to start, let's look at yes or no, as in voltage present or not. Each represents a 1 or a 0 respectively when information is processed. Originally, some computers were designed around the decimal digits 0-9, but regulating the discrete amounts of voltage needed to represent each digit was too much for the technology of the time. Thus, 1 and 0 were used, and this simplification let things grow into what we have today. (Correct me if I'm wrong, please.)
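As a quick aside on why 1 and 0 turn out to be enough: any decimal value can be rewritten as a string of on/off signals. A tiny sketch (the sample values are arbitrary):

```python
# Any decimal value can be carried by nothing more than "voltage present /
# voltage absent" if we write it in base 2 -- which is why 1 and 0 suffice.

def to_bits(n):
    """Return the binary digits of a non-negative integer as a string."""
    return bin(n)[2:]

for value in (9, 42, 197):   # arbitrary sample values
    bits = to_bits(value)
    print(f"{value:>3} in decimal = {bits} in binary ({len(bits)} on/off signals)")
```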

Now, with the advent of optical fiber and its uses (Thunderbolt is a great example), we're using the presence or absence of light to send information over a medium, and it's moving faster than electrons through a conductor. That is awesome, and I hope to see this replace the use of free-flowing electrons (electricity) in our next-generation circuits.

What's boggling me is what else we could do with light. What could we produce if, instead of just using yes or no to compute a problem, we used white, green, blue, yellow, red, or any of the 278 trillion colors available to us? Even on a simple scale of three colors (pick your own), how much further could we expand the computing environment? What could 5 colors provide? The ability to learn in real time as humans do? We could see the original digits 0-9 expressed as colors, moving faster (over distance) through a fuller plane (if you will), rather than the two-dimensional plane of 1 or 0.
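To put a rough number on what extra colors would buy: with N distinguishable symbols per pulse you get log2(N) bits per pulse, so the gain is logarithmic rather than explosive. A quick sketch of the arithmetic for the counts mentioned above:

```python
import math

# How much information one pulse can carry if we can reliably tell apart
# N different "colors": log2(N) bits per pulse. The counts below are just
# the ones mentioned in the post.

for n_colors in (2, 3, 5, 10, 278_000_000_000_000):
    bits_per_pulse = math.log2(n_colors)
    print(f"{n_colors:,} distinguishable colors -> {bits_per_pulse:.2f} bits per pulse")
```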

Granted, the ability to create the light would still be somewhat limited by the speed of flowing electrons through an LED (or whatever comes next), but the basis of the conversation I would like to start here is: where are we going, what can we imagine doing with it, and is this possibly more feasible than a bio-engineered circuit? (Something similar to our human body, which is another discussion I would like to start later.)
 
Joined
Apr 2, 2011
Messages
2,657 (0.56/day)
Let me get this out of the way first, but please continue reading: you are missing the point of why we use a binary system for computing, and have a fundamental misunderstanding of light-based computation.

The reason we use binary is that a perfectly stable voltage is nearly impossible in the computing world. Variations in wire length, thermal expansion, materials, and a handful of other factors all contribute to variation in voltages. Considering that you are asking the system to create 10 discrete states, the voltage range required would be immense. As most components could not reliably produce and distinguish such large differences, it was necessary to use fewer discrete values. To keep things simple, a binary-based math system was implemented: a part might read anything from no voltage up to a threshold (say, four volts) as a 0, and anything above that threshold as a 1. The problem of variation in voltages was thereby resolved, but computing was forced onto a binary system. This is not a loss for computers, but it does make it harder for base-10 thinkers to comprehend the information.
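To put rough numbers on that noise-margin argument, here is a minimal sketch (the 0-5 V range, noise level, and trial count are made-up illustration values, not real silicon specs): each symbol is sent as a nominal voltage level plus random noise and decoded to the nearest level. Two levels decode essentially error-free, while ten levels squeezed into the same range fail constantly.

```python
import random

# Toy model: each symbol is sent as a nominal voltage level on a 0-5 V line
# plus Gaussian noise, then decoded to the nearest nominal level. All numbers
# here are made up for illustration, not real hardware specs.

def symbol_error_rate(n_levels, noise_sigma=0.4, n_symbols=20_000, v_max=5.0):
    levels = [i * v_max / (n_levels - 1) for i in range(n_levels)]
    errors = 0
    for _ in range(n_symbols):
        sent = random.randrange(n_levels)
        received_v = levels[sent] + random.gauss(0, noise_sigma)
        decoded = min(range(n_levels), key=lambda i: abs(levels[i] - received_v))
        errors += (decoded != sent)
    return errors / n_symbols

for n in (2, 4, 10):
    print(f"{n:>2} voltage levels: symbol error rate ~ {symbol_error_rate(n):.2%}")
```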

Light computing can use multiple wavelengths, visually differentiated by their color. This is the same theory a standard electrical cable already uses: Cat5 wire utilizes four pairs of conductors (8 wires in total) to send and receive data. Light can use the same technique (wavelength-division multiplexing), but the barrier is the high cost of laser diodes and the substantially different chemical compositions required to produce each type of laser. Here again, the theory isn't new, and it is already in use.
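For what it's worth, the bookkeeping behind that "many colors on one fiber" idea is simple. A rough sketch (the wavelengths and bit patterns are arbitrary placeholders):

```python
# Toy wavelength-division multiplexing sketch: several independent bit
# streams share one "fiber" by each riding on its own wavelength (in nm).
# The wavelengths and payloads are arbitrary illustration values.

streams = {
    1310: [1, 0, 1, 1],   # channel A
    1490: [0, 0, 1, 0],   # channel B
    1550: [1, 1, 0, 1],   # channel C
}

def multiplex(streams):
    """Interleave all channels onto one list of (wavelength, bit) pulses."""
    fiber = []
    for t in range(max(len(bits) for bits in streams.values())):
        for wavelength, bits in streams.items():
            if t < len(bits):
                fiber.append((wavelength, bits[t]))
    return fiber

def demultiplex(fiber):
    """Split the shared fiber back into per-wavelength bit streams."""
    out = {}
    for wavelength, bit in fiber:
        out.setdefault(wavelength, []).append(bit)
    return out

fiber = multiplex(streams)
assert demultiplex(fiber) == streams
print(f"{len(streams)} channels carried over one fiber, {len(fiber)} pulses total")
```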


What is being forgotten is the fact that data has to be stored. Whether it be on an SSD, HDD, RAM, or ROM, data storage is structured only as a collection of 1s and 0s. Even if communication protocols were amazingly quick, the data transferred would still be limited by the storage media. SSDs may saturate a Gigabit Ethernet connection, but trusty old mechanical drives are where my home network falls down. Basically, you're asking for a car that can travel at 200 kph in an area where the speed limit is 50 kph. Extra speed in communication might be useful to computer clusters, but the average user won't be able to find a use for it.
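A quick back-of-the-envelope version of that bottleneck point (the throughput figures are rough ballpark numbers, not measurements): the end-to-end rate is set by the slowest stage in the chain.

```python
# The effective transfer rate is capped by the slowest stage in the chain,
# which is the "200 kph car on a 50 kph road" point. The figures below are
# rough, typical-ballpark numbers picked for illustration, not measurements.

stages_mb_per_s = {
    "gigabit ethernet link": 125,   # 1 Gbit/s ~= 125 MB/s
    "SATA SSD":              500,
    "5400 rpm hard drive":   100,
}

bottleneck = min(stages_mb_per_s, key=stages_mb_per_s.get)
print(f"Effective transfer rate ~{stages_mb_per_s[bottleneck]} MB/s, "
      f"limited by the {bottleneck}")
```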

Perhaps if internet providers used this kind of connection (some use fiber-optic cables, but not in this way), it might make sense, but the costs would be astronomical.
 
Joined
Mar 10, 2010
Messages
11,878 (2.30/day)
Location
Manchester uk
System Name RyzenGtEvo/ Asus strix scar II
Processor Amd R5 5900X/ Intel 8750H
Motherboard Crosshair hero8 impact/Asus
Cooling 360EK extreme rad+ 360$EK slim all push, cpu ek suprim Gpu full cover all EK
Memory Corsair Vengeance Rgb pro 3600cas14 16Gb in four sticks./16Gb/16GB
Video Card(s) Powercolour RX7900XT Reference/Rtx 2060
Storage Silicon power 2TB nvme/8Tb external/1Tb samsung Evo nvme 2Tb sata ssd/1Tb nvme
Display(s) Samsung UAE28"850R 4k freesync.dell shiter
Case Lianli 011 dynamic/strix scar2
Audio Device(s) Xfi creative 7.1 on board ,Yamaha dts av setup, corsair void pro headset
Power Supply corsair 1200Hxi/Asus stock
Mouse Roccat Kova/ Logitech G wireless
Keyboard Roccat Aimo 120
VR HMD Oculus rift
Software Win 10 Pro
Benchmark Scores 8726 vega 3dmark timespy/ laptop Timespy 6506

I just typed for hours and it disappeared (on accepting something OS-related). Tut, damn Explorer, again.

You should look up transistor design, as it is this that determined the binary thing initially; processors are massive arrays of these. Processors can use alternatives to binary now, though.

Massive arrays of voltage regulators and measuring elements would be impossible to manufacture, I'd imagine, anyway. And yes, I know a transistor IS a voltage regulator, but they are not used for that in CPUs; they're used as state switches.

You do raise some good points. IBM and Intel (I'd imagine) pretty much have optical chip interconnects figured out and nearing volume production, but optical chips themselves are still in the first stages of development AFAIK, i.e. they are still trying to work out materials for optical transistors and even how they will work. So an all-optical chip is some way off, and it might not be the One anyway, what with quantum computers and other tech.

As I'd imagine a quantum computer would run Crysis much better than an opto-chip. :)

WTF, why has my earlier post turned back up?
 
Joined
Nov 4, 2005
Messages
11,683 (1.73/day)
System Name Compy 386
Processor 7800X3D
Motherboard Asus
Cooling Air for now.....
Memory 64 GB DDR5 6400Mhz
Video Card(s) 7900XTX 310 Merc
Storage Samsung 990 2TB, 2 SP 2TB SSDs and over 10TB spinning
Display(s) 56" Samsung 4K HDR
Audio Device(s) ATI HDMI
Mouse Logitech MX518
Keyboard Razer
Software A lot.
Benchmark Scores Its fast. Enough.

That may have been the case with early transistors, but we now use multiple states of electrical energy to transfer data, the best-known schemes being QAM and PSK in their various multi-state orders.
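As a rough illustration of that "multiple states per symbol" idea, here is a toy QPSK mapper (the Gray-coded constellation and the bit pattern are just example values): four phase states mean each transmitted symbol carries two bits instead of one, and higher-order QAM pushes that further by varying amplitude as well.

```python
import cmath
from math import pi

# Toy QPSK mapper: four phase states per symbol, so each symbol carries two
# bits. Gray-coded constellation; the bit pattern is an arbitrary example.

QPSK = {
    (0, 0): cmath.exp(1j * pi / 4),
    (0, 1): cmath.exp(1j * 3 * pi / 4),
    (1, 1): cmath.exp(1j * 5 * pi / 4),
    (1, 0): cmath.exp(1j * 7 * pi / 4),
}

bits = [1, 0, 0, 1, 1, 1, 0, 0]
symbols = [QPSK[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

print(f"{len(bits)} bits sent as {len(symbols)} symbols (2 bits per symbol instead of 1)")
```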

Plus newer GDDR5 standards allow for retraining and jitter control per channel/link.

We could move to a multi-state system; however, all of your compilers, hardware, essentially everything is built to run binary, as it is the truest and purest form of computation and the simplest answer: yes or no, on or off.

The next fastest thing would be optical transistors with color-filter gates, so that one lead could carry the instruction data for a whole set of transistors and allow several of them to output at once.
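Purely as a thought experiment on that color-filter-gate idea, here is a speculative toy model (none of the class names, wavelengths, or behavior correspond to real hardware): one shared lead carries (wavelength, bit) pulses, and each gate only latches the pulses at its own wavelength, so a single lead can drive several outputs.

```python
# Speculative toy model of the "color filter gate" idea above: one shared
# optical lead carries (wavelength, bit) pulses, and each gate only reacts
# to its own wavelength, so a single lead can drive several outputs.
# Nothing here corresponds to a real device; it just illustrates the idea.

class ColorFilterGate:
    def __init__(self, wavelength_nm):
        self.wavelength_nm = wavelength_nm
        self.output = 0

    def observe(self, wavelength_nm, bit):
        # The filter passes only this gate's wavelength.
        if wavelength_nm == self.wavelength_nm:
            self.output = bit

gates = [ColorFilterGate(w) for w in (450, 530, 650)]  # blue, green, red

shared_lead = [(450, 1), (530, 0), (650, 1), (530, 1)]  # pulses on one lead
for wavelength, bit in shared_lead:
    for gate in gates:
        gate.observe(wavelength, bit)

print({g.wavelength_nm: g.output for g in gates})  # {450: 1, 530: 1, 650: 1}
```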
 