Researchers at the Massachusetts Institute of Technology (MIT) and Lucent Technologies' Bell Labs have made a breakthrough in chip design, blending neurobiology with electronics to create what is believed to be the first hybrid digital-and-analog electronic circuit modeled on the brain.
The scientists built the circuit from artificial neurons on a silicon chip, a step that could lead to everything from better artificial-vision systems for robots and computers to better prosthetic eyes and ears for humans, and one that has been hailed as a breakthrough in "neuromorphic" engineering.
According to a cover story in the June 22 issue of Nature, the chip, constructed out of ordinary transistors arranged into a ring-shaped network of 16 artificial neurons and the synaptic connections between them, displays behavior previously seen only in the brain.
Each neuron on the chip connects to four of its neighbors and, like its biological counterparts, tends to fire when receiving incoming impulses.
The neurons are also connected to a central inhibitory neuron that acts as a regulator, keeping the circuit from working itself into an uncontrolled chain reaction of neuronal firings, the researchers said.
"If there was only positive feedback, this thing would blow itself to death," co-creator Rahul Sarpeshkar, an assistant professor of electrical engineering and computer science at MIT, was quoted as saying by Wired Magazine.
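The stabilizing role of that inhibitory neuron can be illustrated with a simple simulation. The sketch below is a hypothetical rate model, not the chip's actual analog circuit equations: 16 units on a ring, each excited by its four nearest neighbors, with a global inhibitory signal proportional to total activity. The weights, input level, and integration step are all assumptions chosen for illustration.

```python
# Hypothetical rate-model sketch of a 16-neuron ring with global inhibition.
# Not the published circuit; weights and constants are illustrative guesses.

N = 16        # ring of 16 artificial neurons
W_EXC = 0.5   # excitatory weight to each of the four ring neighbors
W_INH = 0.1   # strength of the global inhibitory feedback
INPUT = 1.0   # constant external drive to every neuron
DT = 0.1      # integration step

def step(rates, inhibition_on=True):
    """One Euler step of leaky rate dynamics; rates are clipped at zero."""
    total = sum(rates)
    new = []
    for i in range(N):
        # input from the two neighbors on each side of the ring
        excitation = W_EXC * (rates[i - 2] + rates[i - 1] +
                              rates[(i + 1) % N] + rates[(i + 2) % N])
        inhibition = W_INH * total if inhibition_on else 0.0
        drive = INPUT + excitation - inhibition - rates[i]
        new.append(max(0.0, rates[i] + DT * drive))
    return new

rates = [1.0] * N     # with the inhibitory regulator
runaway = [1.0] * N   # positive feedback only
for _ in range(200):
    rates = step(rates)
    runaway = step(runaway, inhibition_on=False)

print(max(rates))    # settles near 1/0.6, about 1.67: bounded activity
print(max(runaway))  # grows without bound: the circuit "blows itself to death"
```

With inhibition, the network relaxes to a steady firing rate; without it, the neighbor-to-neighbor excitation compounds each step and activity explodes, which is the failure mode Sarpeshkar describes.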
As in the brain, the negative feedback also allows the artificial neurons to amplify or suppress their firing according to the behavior of their neighbors, the scientists added.
Computers, which perform calculations in the binary logic of ones and zeros -- on or off -- are purely digital. Biological computers, however, blend digital and analog processing.
In the brain, nerve cells fire in response to impulses received from their neighbors -- a digital response -- but frequently the nerve's response is modulated by factors like the delay between a pair of input signals, an analog process that cannot be readily mimicked by digital circuits. That is, until now.
"It's a hybrid of digital and analog," Sarpeshkar said. "You don't see that in electronics today."
Sarpeshkar said that in experiments, the chip closely modeled about half a dozen behaviors often seen in neuronal circuits.
For example, the artificial neurons were able to amplify the strongest of a number of competing stimuli and suppress the weaker ones.
Sarpeshkar said this "selective amplification" is akin to tuning into a conversation at a loud party while ignoring all the distractions around you.
Significantly, behaviors like selective amplification aren't programmed: They emerge from the interaction of the artificial neurons.
"It does it itself," Sarpeshkar said. "It's not like we're telling it what to do."
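That emergent selection can also be sketched in code. The following winner-take-all simulation is again a hypothetical model, not the published circuit: each unit receives an external stimulus, excites itself, and is suppressed by a shared inhibitory signal. Nothing tells the network which input to pick; the strongest stimulus wins through the dynamics alone. All constants here are illustrative assumptions.

```python
# Hypothetical winner-take-all sketch of "selective amplification".
# Units share one global inhibitory signal; the unit with the strongest
# external input is amplified while the others are driven to silence.

N = 16
SELF = 1.2    # self-excitation weight
INH = 0.3     # global inhibition weight
DT = 0.05     # integration step

inputs = [0.2] * N   # a crowd of weak, competing stimuli
inputs[5] = 1.0      # one stimulus is clearly strongest

rates = [0.0] * N
for _ in range(2000):
    total = sum(rates)
    rates = [max(0.0, r + DT * (inp + SELF * r - INH * total - r))
             for r, inp in zip(rates, inputs)]

winner = max(range(N), key=lambda i: rates[i])
print(winner)                         # index of the strongest stimulus (5)
print(sum(r > 0.01 for r in rates))   # only one unit remains active
```

No unit is programmed to win: the selection emerges because any unit that pulls ahead drives up the shared inhibition, which silences its weaker rivals first, the "conversation at a loud party" effect described above.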
Sarpeshkar said he hopes more sophisticated versions of the chip might one day help build better artificial eyes and ears.
The circuit could function as a low-power back end to an artificial cochlea or retina, helping to filter out distracting sensory stimuli for its wearer, he said. By contrast, today's prosthetics perform a lot of complicated computations that are very power-hungry.
Sarpeshkar admitted the circuit is relatively simple compared to neuronal circuits, which consist of thousands of nerve cells with tens of thousands of connections between them.
"It's not like it's a fancy complicated circuit," he said. "It's extremely simple."
Chris Diorio, an assistant professor at the University of Washington in Seattle, told Wired Magazine the work advanced the understanding of alternative computational systems.
"Biology provides examples of non-digital computing machines that are incredibly space- and energy-efficient, and that excel at finding good solutions to ill-posed problems," he wrote in an introduction to the paper.
"Scientists may eventually decipher all of nature's electrochemical circuits, but the work... demonstrates that we already know enough to begin building integrated circuits that compute like biology" -- Albawaba.com
© 2000 Al Bawaba (www.albawaba.com)