Discussion in 'News' started by btarunr, Aug 17, 2009.
How? It's not little baby blood in it, it's Deoxyribonucleic acid.
Pretty much. The computer can calculate a lot faster than us, but the only thing it can do is crunch numbers. You have to tell it exactly what numbers to crunch (of course, it's a lot easier now than it was back in the punch-card days).
So in other words, human brains are way ahead in the software department. For now.... *cue ominous music*
Yes, but they'll be able to fully connect a computer to the brain soon; it's not that far off now. They already implant chips in pets.
And, just like humans, we won't do anything about it until it's nearly too late.
Yeah, but those are just RFID chips, right (in pets)? As for connecting to a human brain, I think they have something for disabled people where they can steer a mouse by thinking about it. It's basically like voice recognition, but it recognizes brain wave patterns instead. You have to train it by associating a certain brain wave with "move cursor left", etc.
anyhow, too late for what?
And probably fail.
Cool, keep this up and I could donate my DNA the day I die. Then I will live on muahahahaha.
lmfao haha, wouldn't that be funny? All you'd be thinking is 10101110100010101010111100101001001001 (numbers) all day. Scary thoughts. Then we'd get PCs with personalities: higher premiums for the hardworking computers, then there'll be el cheapo lazy ones.
But then I will hire law enforcement machines to kill you
You guys trip me out. Do you really think any CPU on the planet can compete with a brain? Honestly? Do you have any concept of how complex the human brain is?
Technically, no one does, right?
No, there's what you said, and they're on about putting chips in US soldiers too. How far this will go remains to be seen. But if you really think about it, we already know how far it will go.
There is one for blind people too, if I remember correctly.
Yep, we fail at nearly everything except destroying everything, and ironically enough it'll be us who destroys us.
No, I think as of yet it can't, but I do think a brain can be interfaced with a computer, or at least that we're damn close to it. I bet they're closer to interfacing the brain with a computer than we actually know. And I mean on a larger scale than in pets and for some types of disability.
Sooner or later it will happen. And we know we can do it if we learn enough, as technological advancements will give us more ways to do it.
Computers are already at the starting edge of being able to "read" our thoughts through pattern recognition.
Some argue that we are all capable of solving very complex math problems; it's just that we don't know we are doing it. The most common example given is the ability to catch a ball thrown at you. You have to figure out how fast the ball is going, the path it is taking, and then where to put your hand to catch it. There are many variables in this that are also calculated.
All of the data is input and calculated in fractions of a second and updated to the very split second the ball is in our hand.
All very simple, we do it every day, but it requires complex computations to pull it off.
Computers suffer all the flaws that the binary system does. Binary sucks at storing images, it sucks at being random, and it sucks with decimal numbers. Anything that works great in binary (like addition, subtraction, and multiplication) works considerably faster on modern CPUs than the brain. For example, a Core i7 920 with HT enabled can count to over 90 million on each core in a second. Combined, it can count from 0 to 720 million in just one second.
Humans can't even get close because, figuratively speaking, 0-9 are alien concepts to the brain. The brain must calculate a quantity, assign it a value, interpret the value via language, announce it, and repeat. In this regard, the brain operates about 240,000,000 times slower than a Core i7 920.
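Just to show the shape of that claim, here's a trivial back-of-the-envelope sketch. The per-thread rate is the figure quoted above for a Core i7 920, not something measured here:

```python
# Sketch of the counting claim above. The per-thread rate is the
# poster's figure for a Core i7 920, not a measurement.
PER_THREAD_RATE = 90_000_000   # counts per second, per hardware thread (claimed)
THREADS = 8                    # 4 cores x 2 with Hyper-Threading

combined = PER_THREAD_RATE * THREADS
print(combined)  # 720000000
```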
Both have distinct advantages and disadvantages. Neural networks (brains) have the ability to store, interpret, and recall images at a rate at least three times faster than a CPU. The more complex the image, the greater the lead. Neural networks also have the ability to learn and repair damage (to some extent) which CPUs do not. Neural networks are pretty lousy at math though where CPUs kick ass.
No doubt, a merger of the two would be ideal but that means heading into territory I'm not so certain we should be (you've seen or at least heard of all the sci-fi material out there depicting the possibilities).
Oh, computers can't make a curve either--especially with digital monitors.
That is figured using one's understanding of how one expects a ball to fly. Just like how one expects that stepping off a cliff means the end of you. The brain doesn't handle the situation with a bunch of equations, mathematics, variables, algebra, etc. It handles it from a very simple perspective that is learned through repetition.
Take, for example, a puppy. You don't have to teach it all of the concepts of math to make it catch a ball. You have to toss a ball at its face until it figures out it has to open its mouth and catch it. Do that for a few days and, if the dog is coordinated enough, it can pull off some pretty remarkable moves to catch that ball in a short time. Not because it understands how gravity works--it understands how you throw it. Throw it a different way, like put a curve on it, and, just like a batter, there's a good chance it won't be able to catch it. The brain either can't register the rotation of the ball fast enough or the eyes simply can't pick up the detail to make that decision. Either way, dog and human alike are fooled. Throw a curve ball every time and voila, both hit/catch it.
Acting out expectations is a simple task for a brain to achieve (requires no "computations"). A computer, on the other hand, could use a high speed camera to tell you the trajectory of a ball just by watching the laces and position of the ball over a few frames. The only difficulty there is programming the computer to "find" the ball and then "find" the laces. The calculations are readily handled by the CPU's architecture with some simple instructions based on fluid dynamics, velocities, and accelerations.
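To make that camera idea concrete, here's a minimal sketch in plain Python (all numbers invented for illustration): a ballistic arc is a parabola, so three (time, height) samples from three frames determine the whole arc exactly, and the computer can extrapolate where the ball will be.

```python
# Hedged sketch of trajectory prediction from a few camera frames.
# A thrown ball's height follows a parabola, so three samples pin it down.

def predict_height(samples, t):
    """Lagrange interpolation through three (time, height) points;
    exact for any quadratic, hence for an ideal ballistic arc."""
    (t0, y0), (t1, y1), (t2, y2) = samples
    return (y0 * (t - t1) * (t - t2) / ((t0 - t1) * (t0 - t2))
            + y1 * (t - t0) * (t - t2) / ((t1 - t0) * (t1 - t2))
            + y2 * (t - t0) * (t - t1) / ((t2 - t0) * (t2 - t1)))

# Three frames of a ball launched straight up at 10 m/s (y = 10t - 4.9t^2):
frames = [(0.0, 0.0), (0.1, 0.951), (0.2, 1.804)]
print(round(predict_height(frames, 1.0), 3))  # 5.1
```

The real difficulty, as the post says, is the vision side (finding the ball and its laces in each frame); the math itself is the easy part.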
I did not say that the person or animal understood gravity or physics. The theory is that we are calculating these variables on a level not quite understood.
Stepping off of a cliff is confusing instinct with rational thinking. In tests done with newborns, it was shown that the babies would not cross over a perceived drop of a few feet even though the drop-off was covered by a sheet of glass. The babies would go right up to the edge and stop. They knew instinctively that it was a dangerous situation. Their instinct warned them of a drop-off, but they could not at that point know or comprehend glass. The incentive used to try to get the babies to cross was the mom on the other side with a bottle. Definitely not a learned response.
As to the math side of it, we may not be figuring out the paths in the traditional sense, but calculations are nonetheless being performed. Dug this up:
So how does the same gooey substance simultaneously acquire visual data, calculate positional information, and gauge trajectory to let a lizard’s tongue snatch a fly, a dog’s mouth catch a Frisbee, or a hand catch a falling glass? “With the thousands of muscles in the body, the motor cortex clearly isn’t ‘thinking’ in any sense about movement,” says UC San Diego neuroscientist Patricia Churchland. According to Stanford University’s Krishna Shenoy, the brain seems to create an internal model of the physical world, then, like some super-sophisticated neural joystick, traces intended movements onto this model. “But it’s all in a code that science has yet to crack,” he says. Whatever that code is, it’s not about size. “Even a cat’s brain can modify the most complicated motions while executing them.”
How is the instruction set related to the material used to make transistors?
LMAO ahh Can't disagree there
Sorry if someone already mentioned this, but did you see that OCZ Neurolizer thing they were selling on the Egg for gamers (WTF)? It's closer than you think.
Intel tried to get rid of it years ago; the alternative is still being developed and thus available. And in case you're worried about software, the Itanium can emulate x86.
How to catch a ball is also instinct. You learn that falling down a hole isn't a good thing just as you learn how a ball usually flies. Brains don't operate with numbers, they operate on repeating scenarios learned. Computers can't learn so the only way they can catch a ball is to get the physics involved.
However, a computer can be programmed to learn from repetition. For instance, if you take a robot arm and tell the computer to record the movements as you guide the arm along a path, the computer can repeat that path. If you throw the ball to the same place all the time, it could catch it every time. The only reason why they don't behave like humans is because they process everything differently from humans (binary instead of neurons).
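A toy sketch of that record-and-replay idea (class and method names are made up here; a real arm would drive motors rather than yield poses):

```python
# Toy "learning by repetition": record joint angles while a human guides
# the arm along a path, then replay the same poses. Names are invented.
class RecordReplayArm:
    def __init__(self):
        self.recorded_path = []

    def record(self, joint_angles):
        """Store one sampled pose while the arm is guided by hand."""
        self.recorded_path.append(tuple(joint_angles))

    def replay(self):
        """Repeat the guided motion exactly, pose by pose."""
        for pose in self.recorded_path:
            yield pose  # a real arm would command its motors here

arm = RecordReplayArm()
for pose in [(0, 10), (15, 40), (30, 70)]:  # (shoulder, elbow) degrees, invented
    arm.record(pose)
print(list(arm.replay()))  # [(0, 10), (15, 40), (30, 70)]
```

Note there's no generalization here at all: throw the ball anywhere other than the recorded spot and this "learner" has nothing to say, which is the gap the post is pointing at.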
That's exactly what I'm getting at. Neurons are very good at controlling muscles (they speak the same language). The only real delay is in visual cues, as it takes the brain longer to recognize something is flying at you than to position your hands to catch or deflect it.
It's the Sperminators I'm worried about...
It's not that far off, they're already copulating amongst themselves and procreating. Soon they'll turn their ways on us!
As long as I'm giving it and not getting it I'm cool.
I have to disagree with this statement. The testing with infants shows that you pretty much know from day one that falling down a hole is bad. You are, in essence, pre-programmed with this knowledge. If you had to learn everything with no help from instinct, you would be in for a very hard life.
There are of course exceptions. You learn what can and cannot burn you, etc.
I at no time stated that the brain worked like a computer. I said that the brain was doing the calculations on a level and in a way that we do not understand. It absolutely is doing these calculations; to stick with the ball example, it is able to figure out and calculate variables. If a gust of wind catches the ball at the last second, the brain will recalculate the flight path and move the hand to catch the ball.
The brain can miscalculate and not catch the ball just as a computer can miscalculate due to the input, sensory or binary, being wrong.
We become better at catching the ball through repetition not because of the repetition, but because the brain is learning that the ball will not always follow the calculated path. One thread (ha ha) is calculating the flight path, while another one is waiting for a variable and will then recalculate and supersede the first one if a variable is introduced.
I will give you that catching a ball is somewhat learned as it is instinct to get out of the way when something is thrown at you. You sometimes do not get out of the way in time due to slow reaction, fear or other reasons.
The computer, which for all intents and purposes is nothing more than a glorified abacus, is so vastly inferior to the brain, not on the 1+1=2 scale, but on the "aha, I've got it" scale, that it will be centuries before it is even close to cognitive thinking.
And I said it doesn't. Take programming a motor, for instance, to spin in a specific pattern: you give it this much voltage, and it spins x number of times in y period of time. The muscles are very similar in that the brain knows what to expect from the muscles given the proper amount of stimulation. You think "move your hand to touch that," and your brain applies the proper amount of stimulation to the proper nerves to carry out that motion. It all works on expectations (familiarity with oneself), not calculations. Just as you don't need the decimal system to make gravity work, you don't need decimals to make your body work.
The entire concept of math/algebra is used to describe what happens around us; not in any way control it. The decimal system represents an epic fail with arcs, zero, and on the quantum level. The world doesn't run on numbers--we try to stick numbers on everything to make it appear less chaotic. It also gives a sense of control over it which in turn suppresses fears. Numbers still don't define nature--they aren't the lowest common denominator.
If that were the case, then eventually you'd never miss (a computer can be made to behave that way). The reality is that even the best professional baseball players can and do miss. Expectations determine hit or miss, not in-flight calculations. As evidence of this, most players know if they are going to swing or not before the ball is even thrown. How the batter swings (bunt, slow, or hard) is also determined beforehand. Once the ball is thrown, all that is decided, based on expectations, is when to swing the bat in their predetermined way. If it is a curveball and the batter didn't expect it, the batter will most likely miss or foul. If the ball is moving slower than expected, the batter will most likely swing too early. If the ball is moving faster than expected, the batter will swing late.
The swing itself is controlled through experience. This is why a different bat than usual can really screw a batter up. On the other hand, if you give a bat to a robot, it can calculate the weight distribution of the bat and all the optics can be set up to never miss. Humans (margin of error) never operate on a degree of exactness that computers always operate on (zero errors, only operator/programmer error).
Computer's expertise is binary; the brain's expertise is recognizing food, threats, and genetic compatibility. A computer can do what the brain can do inefficiently and the brain can do what computers do inefficiently. Leave the thinking to humans and the calculating to computers.
Never miss? Please nothing is perfect. It will miss eventually. Computers know nothing but what man builds them to do.