Monday, August 17th 2009

IBM Scientists Use DNA Scaffolding To Build Tiny Circuit Board

Today, scientists at IBM Research and the California Institute of Technology announced a scientific advancement that could be a major breakthrough in enabling the semiconductor industry to pack more power and speed into tiny computer chips, while making them more energy efficient and less expensive to manufacture.

IBM researchers and collaborator Paul W.K. Rothemund of the California Institute of Technology have made an advance in combining lithographic patterning with self-assembly - a method to arrange DNA origami structures on surfaces compatible with today's semiconductor manufacturing equipment.
Today, the semiconductor industry is faced with the challenges of developing lithographic technology for feature sizes smaller than 22 nm and exploring new classes of transistors that employ carbon nanotubes or silicon nanowires. IBM's approach of using DNA molecules as scaffolding - where millions of carbon nanotubes could be deposited and self-assembled into precise patterns by sticking to the DNA molecules - may provide a way to reach sub-22 nm lithography.

The utility of this approach lies in the fact that the positioned DNA nanostructures can serve as scaffolds, or miniature circuit boards, for the precise assembly of components - such as carbon nanotubes, nanowires and nanoparticles - at dimensions significantly smaller than possible with conventional semiconductor fabrication techniques. This opens up the possibility of creating functional devices that can be integrated into larger structures, as well as enabling studies of arrays of nanostructures with known coordinates.

"The cost involved in shrinking features to improve performance is a limiting factor in keeping pace with Moore's Law and a concern across the semiconductor industry," said Spike Narayan, manager, Science & Technology, IBM Research - Almaden. "The combination of this directed self-assembly with today's fabrication technology eventually could lead to substantial savings in the most expensive and challenging part of the chip-making process."

The techniques for preparing DNA origami, developed at Caltech, cause single DNA molecules to self-assemble in solution via a reaction between a long single strand of viral DNA and a mixture of different short synthetic oligonucleotide strands. These short segments act as staples - effectively folding the viral DNA into the desired 2D shape through complementary base-pair binding. The short staples can be modified to provide attachment sites for nanoscale components at resolutions (separation between sites) as small as 6 nanometers (nm). In this way, DNA nanostructures such as squares, triangles and stars can be prepared with dimensions of 100-150 nm on an edge and a thickness of the width of the DNA double helix.
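As a rough back-of-the-envelope illustration (not from the paper itself), the figures quoted above imply how densely a single origami tile could be studded with components. The sketch below uses only the numbers given in the article - a 6 nm site spacing along edges of 100-150 nm:

```python
def sites_per_edge(edge_nm: float, spacing_nm: float = 6.0) -> int:
    """How many attachment sites fit along one origami edge,
    counting a site at both ends (fencepost counting)."""
    return int(edge_nm // spacing_nm) + 1

# Using only the dimensions quoted in the article:
print(sites_per_edge(100))  # smallest quoted edge -> 17 sites
print(sites_per_edge(150))  # largest quoted edge  -> 26 sites
```

Even this crude count shows why the approach is interesting: tens of individually addressable sites per tile, at a pitch well below what conventional lithography resolves.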

The lithographic templates were fabricated at IBM using traditional semiconductor techniques, the same used to make the chips found in today's computers, to etch out patterns. Either electron-beam or optical lithography was used to create arrays of binding sites of the proper size and shape to match those of individual origami structures. Key to the process was the discovery of a template material and deposition conditions that afford high selectivity, so that the origami binds only to the patterns of "sticky patches" and nowhere else.

The paper on this work, "Placement and orientation of DNA nanostructures on lithographically patterned surfaces," by scientists at IBM Research and the California Institute of Technology, will be published in the September issue of Nature Nanotechnology.
Source: IBM

51 Comments on IBM Scientists Use DNA Scaffolding To Build Tiny Circuit Board

#26
ArmoredCavalry
Nailezsso, like i said, in that aspect, computers are limited not in their speed, but in the programming

am i wrong?
Pretty much. The computer can calculate a lot faster than us, but the only thing it can do is crunch numbers. You have to tell it exactly what numbers to crunch (of course it's a lot easier now than it was back in the punch-card days).

So in other words, human brains are way ahead in the software department. For now.... *cue ominous music*
Posted on Reply
#27
AsRock
TPU addict
ArmoredCavalryPretty much. The computer can calculate a lot faster than us, but the only thing it can do is crunch numbers. You have to tell it exactly what numbers to crunch (of course it's a lot easier now than it was back in the punch-card days).

So in other words, human brains are way ahead in the software department. For now.... *cue ominous music*
Yes, but they will be able to fully connect a computer to the brain soon; it's not that far off now. They already implant chips in pets.

And, just like humans, we will not do anything about it until it's nearly too late.
Posted on Reply
#28
ArmoredCavalry
AsRockYes, but they will be able to fully connect a computer to the brain soon; it's not that far off now. They already implant chips in pets.

And, just like humans, we will not do anything about it until it's nearly too late.
Yeah, but that is just RFID chips, right (in pets)? As for connecting to a human brain, I think they have something for disabled people where they can steer a mouse by thinking about it. It is basically like voice recognition, but it recognizes brain-wave patterns instead. You have to train it by associating a certain brain wave with "move cursor left", etc.

anyhow, too late for what?
Posted on Reply
#29
Kantastic
AsRockAnd, just like humans, we will not do anything about it until it's nearly too late.
And probably fail.
Posted on Reply
#30
PCpraiser100
Cool, keep this up and I could donate my DNA the day I die. Then I will live on muahahahaha.
Posted on Reply
#31
a_ump
PCpraiser100Cool, keep this up and I could donate my DNA the day I die. Then I will live on muahahahaha.
lmfao haha, wouldn't that be funny, all you'd be thinking is 10101110100010101010111100101001001001(numbers) all day:laugh:. scary thoughts, then we'd get pc's with personalities. Higher premiums for the hardworking computers, then there'll be el cheapo lazy :roll:
Posted on Reply
#32
PCpraiser100
a_umplmfao haha, wouldn't that be funny, all you'd be thinking is 10101110100010101010111100101001001001(numbers) all day:laugh:. scary thoughts, then we'd get pc's with personalities. Higher premiums for the hardworking computers, then there'll be el cheapo lazy :roll:
But then I will hire law enforcement machines to kill you :pimp:
Posted on Reply
#33
TheMailMan78
Big Member
You guys trip me out. Do you really think any CPU on the planet can compete with a brain? Honestly? Do you have any concept of how complex the human brain is?
Posted on Reply
#34
a_ump
Technically no one does, right?
Posted on Reply
#35
AsRock
TPU addict
ArmoredCavalryYeah, but that is just RFID chips, right (in pets)? As for connecting to a human brain, I think they have something for disabled people where they can steer a mouse by thinking about it. It is basically like voice recognition, but it recognizes brain-wave patterns instead. You have to train it by associating a certain brain wave with "move cursor left", etc.

anyhow, too late for what?
No, there is what you said, and they're on about putting chips in US soldiers too. How far this will go remains to be seen. But if you really think about it, we already know how far it will go.

There is one for blind people too, if I remember correctly.
KantasticAnd probably fail.
Yep, we fail at nearly everything except destroying everything, and ironically enough it will be us who destroys us.
TheMailMan78You guys trip me out. Do you really think any CPU on the planet can compete with a brain? Honestly? Do you have any concept of how complex the human brain is?
No, I think as of yet it cannot, but I do think a brain can be interfaced with a computer, or we are damn close to it at least... I bet they're closer to interfacing the brain with a computer than we actually know. And I mean on a larger scale than in pets and for some types of disability.

Sooner or later it will happen... And we know we can do it if we learn enough, as technology advancements will allow us more ways to do it.
Posted on Reply
#36
Unregistered
Computers are already at the starting edge of being able to "read" our thoughts through pattern recognition.

Some argue that we are all capable of solving very complex math problems, it's just that we don't know we are doing it. The most common example given is the ability to catch a ball thrown at you. You have to figure out how fast the ball is going, the path it is taking, and then where to put your hand to catch it. There are many variables in this that are also calculated.

All of the data is input and calculated in fractions of a second and updated to the very split second the ball is in our hand.

All very simple, we do it every day, but it requires complex computations to pull it off.
#37
FordGT90Concept
"I go fast!1!11!1!"
TheMailMan78You guys trip me out. Do you really think any CPU in the planet can compete with a brain? Honestly? Do you have any concept on how complex the human brain is?
Computers suffer all the flaws that the binary system does. Binary sucks at storing images, it sucks at being random, and it sucks with decimal numbers. Anything that works great in binary (like addition, subtraction, and multiplication) works considerably faster on modern CPUs than the brain. For example, a Core i7 920 with HT enabled can count to over 90 million on each core in a second. Combined, it can count from 0 to 720 million in just one second.

Humans can't even get close because, figuratively speaking, 0-9 are alien concepts to the brain. The brain must calculate a quantity, assign it a value, interpret the value via language, announce it, and repeat. In this regard, the brain operates about 240,000,000 times slower than a Core i7 920.
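The counting claim above is easy to sanity-check with a toy benchmark. The absolute number depends entirely on the CPU and language (an interpreted Python loop will fall far short of the tight compiled loop the poster has in mind), so treat this as an illustration of the measurement rather than a reproduction of the 90-million figure:

```python
import time

def count_for(seconds: float = 1.0) -> int:
    """Increment a counter as fast as possible for the given wall-clock time."""
    n = 0
    deadline = time.perf_counter() + seconds
    while time.perf_counter() < deadline:
        n += 1
    return n

reached = count_for(1.0)
print(f"one core, one second: counted to {reached:,}")
```

Running the same loop in C with optimizations would land orders of magnitude higher, which is the comparison the original post is making.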

Both have distinct advantages and disadvantages. Neural networks (brains) have the ability to store, interpret, and recall images at a rate at least three times faster than a CPU. The more complex the image, the greater the lead. Neural networks also have the ability to learn and repair damage (to some extent) which CPUs do not. Neural networks are pretty lousy at math though where CPUs kick ass.

No doubt, a merger of the two would be ideal but that means heading into territory I'm not so certain we should be (you've seen or at least heard of all the sci-fi material out there depicting the possibilities).


Oh, computers can't make a curve either--especially with digital monitors. :laugh:
LaidLawJonesSome argue that we are all capable of solving very complex math problems, it's just that we don't know we are doing it. The most common example given is the ability to catch a ball thrown at you. You have to figure out how fast the ball is going, the path it is taking, and then where to put your hand to catch it. There are many variables in this that are also calculated.
That is figured using one's understanding of how one expects a ball to fly. Just like how one expects that stepping off a cliff means the end of you. The brain doesn't handle the situation with a bunch of equations, mathematics, variables, algebra, etc. It handles it from a very simple perspective that is learned through repetition.

Take, for example, a puppy. You don't have to teach it all of the concepts of math to make it catch a ball. You have to toss a ball at its face until it figures out it has to open its mouth and catch it. Do that for a few days and, if the dog is coordinated enough, it can pull off some pretty remarkable moves to catch that ball in short order. Not because it understands how gravity works--it understands how you throw it. Throw it a different way, like putting a curve on it, and, just like a batter, there's a good chance it won't be able to catch it. The brain either can't register the rotation of the ball fast enough or the eyes simply can't pick up the detail to make that decision. Either way, dog and human alike are fooled. Throw a curve ball every time and voila, both hit/catch it.

Acting out expectations is a simple task for a brain to achieve (requires no "computations"). A computer, on the other hand, could use a high speed camera to tell you the trajectory of a ball just by watching the laces and position of the ball over a few frames. The only difficulty there is programming the computer to "find" the ball and then "find" the laces. The calculations are readily handled by the CPU's architecture with some simple instructions based on fluid dynamics, velocities, and accelerations.
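A minimal sketch of the "simple instructions" that last step describes: once the camera frames have yielded position and velocity estimates, projecting the ball forward is plain kinematics (drag ignored; all the numbers here are purely illustrative):

```python
G = 9.81  # gravitational acceleration, m/s^2

def ball_position(t, x0, y0, vx, vy):
    """Ballistic position t seconds after the velocity estimate was taken."""
    x = x0 + vx * t
    y = y0 + vy * t - 0.5 * G * t * t
    return x, y

# Illustrative throw: released at (0 m, 2 m), 15 m/s forward, 5 m/s upward.
x, y = ball_position(0.5, 0.0, 2.0, 15.0, 5.0)
print(f"after 0.5 s: x = {x:.2f} m, y = {y:.2f} m")
```

The hard part, as the post says, is the vision pipeline that produces `x0, y0, vx, vy`; the projection itself is a couple of multiply-adds per frame.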
Posted on Reply
#38
Unregistered
I did not say that the person or animal understood gravity or physics. The theory is that we are calculating these variables on a level not quite understood.

Stepping off of a cliff is confusing instinct with rational thinking. In tests done with newborns, it was shown that the babies would not cross over a perceived drop of a few feet even though the drop-off was covered by a sheet of glass. The babies would go right up to the edge and stop. They knew instinctively that it was a dangerous situation. Their instinct warned them of a drop-off, but they could not at that point in time know or comprehend glass. The incentive used to try to get the babies to cross was the Mom on the other side with a bottle. Definitely not a learned response.

As to the math side of it, we may not be figuring out the paths in the traditional sense, but calculations are nonetheless being performed. Dug this up.

So how does the same gooey substance simultaneously acquire visual data, calculate positional information, and gauge trajectory to let a lizard’s tongue snatch a fly, a dog’s mouth catch a Frisbee, or a hand catch a falling glass? “With the thousands of muscles in the body, the motor cortex clearly isn’t ‘thinking’ in any sense about movement,” says UC San Diego neuroscientist Patricia Churchland. According to Stanford University’s Krishna Shenoy, the brain seems to create an internal model of the physical world, then, like some super-sophisticated neural joystick, traces intended movements onto this model. “But it’s all in a code that science has yet to crack,” he says. Whatever that code is, it’s not about size. “Even a cat’s brain can modify the most complicated motions while executing them.”
#39
Deleted member 3
mrhugglesi think you got the wrong idea, this is a desperate attempt to hold onto x86 [to make it smaller and faster]
How is the instruction set related to the material used to make transistors?
Posted on Reply
#40
Unregistered
TheMailMan78This is way beyond my understanding but anything to get away from the x86. Go IBM!
:roll: LMAO ahh Can't disagree there
#41
Unregistered
AsRockNo, there is what you said, and they're on about putting chips in US soldiers too. How far this will go remains to be seen. But if you really think about it, we already know how far it will go.

There is one for blind people too, if I remember correctly.



Yep, we fail at nearly everything except destroying everything, and ironically enough it will be us who destroys us.



No, I think as of yet it cannot, but I do think a brain can be interfaced with a computer, or we are damn close to it at least... I bet they're closer to interfacing the brain with a computer than we actually know. And I mean on a larger scale than in pets and for some types of disability.

Sooner or later it will happen... And we know we can do it if we learn enough, as technology advancements will allow us more ways to do it.
Sorry if someone already mentioned this, but did you see that OCZ Neurolizer thing they were selling on the EGG for gamers (WTF)? It's closer than you think:
www.ocztechnology.com/products/ocz_peripherals/nia-neural_impulse_actuator
#42
Deleted member 3
jmcslob:roll: LMAO ahh Can't disagree there
Intel tried to get rid of it years ago; the alternative is still being developed and thus available. And in case you're worried about software, the Itanium can emulate x86.
Posted on Reply
#43
FordGT90Concept
"I go fast!1!11!1!"
LaidLawJonesStepping off of a cliff is confusing instinct with rational thinking. In tests done with newborns, it was shown that the babies would not cross over a perceived drop of a few feet even though the drop-off was covered by a sheet of glass. The babies would go right up to the edge and stop. They knew instinctively that it was a dangerous situation. Their instinct warned them of a drop-off, but they could not at that point in time know or comprehend glass. The incentive used to try to get the babies to cross was the Mom on the other side with a bottle. Definitely not a learned response.
How to catch a ball is also instinct. You learn that falling down a hole isn't a good thing just as you learn how a ball usually flies. Brains don't operate with numbers; they operate on repeated learned scenarios. Computers can't learn, so the only way they can catch a ball is to get the physics involved.

However, a computer can be programmed to learn from repetition. For instance, if you take a robot arm and tell the computer to record the movements as you guide the arm along a path, the computer can repeat that path. If you throw the ball to the same place all the time, it could catch it every time. The only reason they don't behave like humans is that they process everything differently from humans (binary instead of neurons).
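The record-and-replay idea in that paragraph fits in a few lines of toy code (this is a deliberately simplified model, not any real robot-arm API): the controller memorizes whatever waypoints it was guided through and repeats them verbatim, with no physics and no generalization.

```python
class ReplayArm:
    """Toy record-and-replay controller: memorize guided waypoints, repeat them."""

    def __init__(self):
        self.path = []

    def record(self, waypoint):
        """Called once per sampled position while a human guides the arm."""
        self.path.append(waypoint)

    def replay(self):
        """Repeat exactly what was recorded -- no learning, no adaptation."""
        return list(self.path)

arm = ReplayArm()
for point in [(0, 0), (10, 5), (20, 3)]:
    arm.record(point)
print(arm.replay())  # -> [(0, 0), (10, 5), (20, 3)]
```

This is exactly why such a system catches the ball only when it is thrown to the same place every time: change the throw and the memorized path no longer applies.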
LaidLawJonesSo how does the same gooey substance simultaneously acquire visual data, calculate positional information, and gauge trajectory to let a lizard’s tongue snatch a fly, a dog’s mouth catch a Frisbee, or a hand catch a falling glass? “With the thousands of muscles in the body, the motor cortex clearly isn’t ‘thinking’ in any sense about movement,” says UC San Diego neuroscientist Patricia Churchland. According to Stanford University’s Krishna Shenoy, the brain seems to create an internal model of the physical world, then, like some super-sophisticated neural joystick, traces intended movements onto this model. “But it’s all in a code that science has yet to crack,” he says. Whatever that code is, it’s not about size. “Even a cat’s brain can modify the most complicated motions while executing them.”
That's exactly what I'm getting at. Neurons are very good at controlling muscles (they speak the same language). The only real delay is in visual cues, as it takes the brain longer to recognize that something is flying at you than to position your hands to catch or deflect it.
Posted on Reply
#44
CyberDruid
El FiendoNot if computer comes installed with a fleshlight. In which case we can start making terminators with functional perforations.
It's the Sperminators I'm worried about...
Posted on Reply
#45
El Fiendo
It's not that far off, they're already copulating amongst themselves and procreating. Soon they'll turn their ways on us!

Posted on Reply
#47
Unregistered
You learn that falling down a hole isn't a good thing
I have to disagree with this statement. The testing with infants shows that you pretty much know from day one that falling down a hole is bad. You are, in essence, pre-programmed with this knowledge. If you had to learn everything with no help from instinct, you would be in for a very hard life.

There are of course exceptions. You learn what can and can not burn you, etc.

I at no time stated that the brain worked like a computer. I said that the brain was doing the calculations on a level and in a way that we do not understand. It absolutely is doing these calculations as, we will stick with the ball, it is able to figure out/calculate variables. If a gust of wind catches the ball at the last second, the brain will recalculate the flight path and move the hand to catch the ball.

The brain can miscalculate and not catch the ball just as a computer can miscalculate due to the input, sensory or binary, being wrong.

We become better at catching the ball through repetition not because of the repetition, but because the brain is learning that the ball will not always follow the calculated path. One thread (ha ha) is calculating the flight path, while another one is waiting for a variable and will then recalculate and supersede the first one if a variable is introduced.

I will give you that catching a ball is somewhat learned as it is instinct to get out of the way when something is thrown at you. You sometimes do not get out of the way in time due to slow reaction, fear or other reasons.

The computer, which for all intents and purposes is nothing more than a glorified abacus, is so vastly inferior to the brain, not on the 1+1=2 scale, but on the "aha, I've got it" scale, that it will be centuries before it is even close to cognitive thinking.
#48
FordGT90Concept
"I go fast!1!11!1!"
LaidLawJonesI said that the brain was doing the calculations on a level and in a way that we do not understand. It absolutely is doing these calculations as, we will stick with the ball, it is able to figure out/calculate variables. If a gust of wind catches the ball at the last second, the brain will recalculate the flight path and move the hand to catch the ball.
And I said it doesn't. Take programming a motor, for instance, to spin in a specific pattern: you give it this much voltage, and it spins x number of times in y period of time. The muscles are very similar in that the brain knows what to expect from the muscles given the proper amount of stimulation. You think "move your hand to touch that," and your brain applies the proper amount of stimulation to the proper nerves to carry out that motion. It all works on expectations (familiarity with oneself), not calculations. Just as you don't need the decimal system to make gravity work, you don't need decimals to make your body work.

The entire concept of math/algebra is used to describe what happens around us; not in any way control it. The decimal system represents an epic fail with arcs, zero, and on the quantum level. The world doesn't run on numbers--we try to stick numbers on everything to make it appear less chaotic. It also gives a sense of control over it which in turn suppresses fears. Numbers still don't define nature--they aren't the lowest common denominator.
LaidLawJonesWe become better at catching the ball through repetition not because of the repetition, but because the brain is learning that the ball will not always follow the calculated path. One thread (ha ha) is calculating the flight path, while another one is waiting for a variable and will then recalculate and supersede the first one if a variable is introduced.
If that were the case, then eventually you'd never miss (a computer can be made to behave that way). The reality is that even the best professional baseball players can and do miss. Expectations determine hit or miss, not in-flight calculations. As evidence of this, most players know if they are going to swing or not before the ball is even thrown. How the batter swings (bunt, slow, or hard) is also determined beforehand. Once the ball is thrown, all that is decided, based on expectations, is when to swing the bat in their predetermined way. If it is a curve ball and the batter didn't expect it, the batter will most likely miss or foul. If the ball is moving slower than expected, the batter will most likely swing too early. If the ball is moving faster than expected, the batter will swing late.

The swing itself is controlled through experience. This is why a different bat than usual can really screw a batter up. On the other hand, if you give a bat to a robot, it can calculate the weight distribution of the bat and all the optics can be set up to never miss. Humans (margin of error) never operate on a degree of exactness that computers always operate on (zero errors, only operator/programmer error).
LaidLawJonesThe computer, which for all intents and purposes is nothing more than a glorified abacus, is so vastly inferior to the brain, not on the 1+1=2 scale, but on the "aha, I've got it" scale, that it will be centuries before it is even close to cognitive thinking.
Computer's expertise is binary; the brain's expertise is recognizing food, threats, and genetic compatibility. A computer can do what the brain can do inefficiently and the brain can do what computers do inefficiently. Leave the thinking to humans and the calculating to computers.
Posted on Reply
#49
TheMailMan78
Big Member
FordGT90ConceptAnd I said it doesn't. Take programming a motor, for instance, to spin in a specific pattern: you give it this much voltage, and it spins x number of times in y period of time. The muscles are very similar in that the brain knows what to expect from the muscles given the proper amount of stimulation. You think "move your hand to touch that," and your brain applies the proper amount of stimulation to that nerve to carry it out. It all works on expectations (familiarity with oneself), not calculations. Just as you don't need the decimal system to make gravity work, you don't need decimals to make your body work.

The entire concept of math/algebra is used to describe what happens around us; not in any way control it.



If that were the case, then eventually you'd never miss (a computer can be made to behave that way). The reality is that even the best professional baseball players can and do miss. Expectations determine hit or miss, not in-flight calculations. As evidence of this, most players know if they are going to swing or not before the ball is even thrown. How the batter swings (bunt, slow, or hard) is also determined beforehand. Once the ball is thrown, all that is decided, based on expectations, is when to swing the bat in their predetermined way. If it is a curve ball and the batter didn't expect it, the batter will most likely miss or foul. If the ball is moving slower than expected, the batter will most likely swing too early. If the ball is moving faster than expected, the batter will swing late.

The swing itself is controlled through experience. This is why a different bat than usual can really screw a batter up. On the other hand, if you give a bat to a robot, it can calculate the weight distribution of the bat and all the optics can be set up to never miss. Humans never operate on a degree of exactness (margin of error) that computers always operate on.



Computer's expertise is binary; the brain's expertise is recognizing food, threats, and genetic compatibility. A computer can do what the brain can do inefficiently and the brain can do what computers do inefficiently. Leave the thinking to humans and the calculating to computers.
Never miss? Please, nothing is perfect. It will miss eventually. Computers know nothing but what man builds them to do.
Posted on Reply
#50
Unregistered
And I said it doesn't.
Well, it appears we are split on this topic. The good news is that so are most of the people who discuss it.

It has been a very entertaining topic and I thank you for the stimulating conversation.

I will counter the last couple of points and then give you the floor.
If that were the case, then eventually you'd never miss..
You would always stand a chance of missing, as there are an infinite number of variables that could come into play. The catcher could misstep, he could see something in the stands that distracts him, etc. The catcher, however, is a pro, and therefore his abilities are naturally better than the average person's. This is why he very seldom misses compared to the average person.
...most players know if they are going to swing or not before the ball is even thrown. How the batter swings (bunt, slow, or hard) is also determined beforehand. Once the ball is thrown, all that is decided, based on expectations, is when to swing the bat in their predetermined way. If it is a curve ball and the batter didn't expect it, the batter will most likely miss or foul. If the ball is moving slower than expected, the batter will most likely swing too early. If the ball is moving faster than expected, the batter will swing late.
This particular example has nothing to do with the brain; this is a case of pure gambling.
If the pitcher has thrown a curve, two fastballs, and a drop, and the next pitch according to the films and his stats will be a curve, then all calculations, however performed, are removed, as the batter is going to swing in a particular way with no adjustments or compensation.
...the brain's expertise is recognizing food, threats, and genetic compatibility.
I would argue that these are pre-programmed instincts for survival. Food is a try-it-and-see proposition; threats such as loud noises, sudden movement, etc. are common to all animals; and as for genetic compatibility, well, there are anti-sheep laws out there for a reason.:laugh:

The brain's true gift is reasoning; no other animal can even come close to it. Reasoning encompasses the "I wonder why that happens..."

The other gift is abstract thinking. It is not always right, but it is a very powerful mechanism. It may have been wrong in the "he is sick because demons are in him..." but nonetheless, this was a very abstract and profound thought.


The floor is yours.
Posted on Reply