WHAT BELL LABORATORIES IS LEARNING FROM SLUGS
The slow-moving creatures are being studied by AT&T scientists to find out how to make lightning-fast computers.
By John Paul Newport Jr.

(FORTUNE Magazine) – AT AT&T's famed Bell Laboratories in New Jersey, garden-variety slugs are nudging forward the frontier of computer science. Inspired by the discoveries of physicist John Hopfield, a team of Bell Labs scientists has been using research on slugs' brains to develop a radically new type of computer. It would use decision-making techniques eerily similar to human intuition to solve certain complicated problems almost instantly.

Testing simulations of the envisioned computer, Bell scientists tried its speed at handling a classic quandary known as the traveling salesman problem: What is the shortest way, given all the possibilities, to route a salesman through many different cities? The results suggested that if the new computer were built, it could produce a good answer for 30 cities in a few millionths of a second, whereas large computers today might take a second to get an answer. The Bell computer does not always select the single best route, but--much like a human--it comes up with one of the better routes, and never picks anything obviously loony.

New techniques for recording neurological activity in rats and in three types of slugs--favored because of their large and accessible nerve cells--are providing Bell's team with reams of information about how neurons work. But the conceptual focus of the Bell project is the model of the new neural-network computer created by Hopfield, 51, who splits his time between Bell Labs and the California Institute of Technology. Neural networks operate in the analog mode--when information enters the brain, the neurons start firing and their values, or ''charges,'' rise and fall like electric voltage in analog computers. When information is digested, the network settles down into a so-called steady state, with each of its many neurons resting close to its highest or lowest value--effectively, then, either on or off.
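The settling behavior described above can be sketched in a few lines of code. This is a minimal, hypothetical illustration of the standard Hopfield formulation--the network size, weights, and update rule here are illustrative assumptions, not Bell Labs' actual design. Because the connection weights are symmetric, each update can only lower an "energy" value, so the network inevitably comes to rest in a steady state with every model neuron at its highest (+1) or lowest (-1) value.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 16                                   # number of model neurons (arbitrary)

# Store one target pattern via the Hebbian outer-product rule.
pattern = rng.choice([-1, 1], size=n)
W = np.outer(pattern, pattern).astype(float)
np.fill_diagonal(W, 0)                   # no self-connections

def energy(state):
    """Hopfield energy: E = -1/2 * s^T W s."""
    return -0.5 * state @ W @ state

# Start from a random state and update neurons one at a time.
state = rng.choice([-1, 1], size=n)
energies = [energy(state)]
for _ in range(5):                       # a few sweeps over all neurons
    for i in range(n):
        state[i] = 1 if W[i] @ state >= 0 else -1
    energies.append(energy(state))

# Energy never increases, and the network ends in a stable steady state.
assert all(e2 <= e1 for e1, e2 in zip(energies, energies[1:]))
stable = all(state[i] == (1 if W[i] @ state >= 0 else -1) for i in range(n))
print("settled:", stable)
```

The falling energy plays the role of the rising-and-falling "charges" in the article: the state wanders in analog fashion, but what the network finally reports is a clean on/off pattern, readable like digital output.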
A computer designed to mimic a neural network would solve problems speedily by manipulating data in analog fashion. But it would report its findings when each neuron is in either the on or the off state, operating like a digital computer speaking a binary language. The simulated computer designed by Hopfield and his AT&T colleagues uses microprocessors to do the work of neurons. Each microprocessor is connected to all the others--as many neurons are interconnected--which would make the machine costly and complex to build.

Another major difference between this computer and traditional ones is that memory is not localized in any one processor or set of processors. Instead, memory resides in the patterns formed by all the neurons, whether on or off, when they are in steady states. As a result, the computer can deal with fragmentary or imprecise information. When given a misspelled name, for example, it can retrieve the full name and data about the person by settling on the closest name in the network.

Though analog computation is astonishingly fast, it sacrifices precision. Neural-network computers work best on problems that have more than one reasonable solution. Examples include airline scheduling, superfast processing for robots or weapons, and, more in AT&T's line, routing long-distance telephone traffic.
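The misspelled-name retrieval described above can be sketched with the same kind of network acting as an associative memory. Everything here is an illustrative assumption--the article does not say how Bell's simulation encoded names, so each name is assigned an arbitrary random +/-1 code. Several codes are superimposed in one weight matrix; given a corrupted ("misspelled") code, the network settles on the closest stored name.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64                                   # bits per stored code (arbitrary)
names = ["HOPFIELD", "NEWPORT", "BELL"]

# Hypothetical encoding: each name gets a random +/-1 code vector.
codes = {name: rng.choice([-1, 1], size=n) for name in names}

# Hebbian storage: superimpose the outer products of all stored codes.
W = sum(np.outer(c, c) for c in codes.values()).astype(float)
np.fill_diagonal(W, 0)                   # no self-connections

def recall(probe, sweeps=5):
    """Let the network settle from a (possibly corrupted) probe."""
    s = probe.copy()
    for _ in range(sweeps):
        for i in range(n):
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# "Misspell" HOPFIELD's code by flipping 12 of its 64 bits, then settle.
noisy = codes["HOPFIELD"].copy()
flip = rng.choice(n, size=12, replace=False)
noisy[flip] *= -1

out = recall(noisy)
best = max(names, key=lambda nm: codes[nm] @ out)
print("recalled:", best)
```

Note that no processor "looks up" the name: the memory lives in the weight pattern as a whole, which is why a fragmentary or imprecise probe still lands on the right answer.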