The smartest machines on earth
Supercomputers: How do you predict the unpredictable? Test bombs without explosions? With a very large, very expensive machine in a very cold room.
(Fortune Magazine) -- Racing to solve the world's most urgent problems - and to outdo one another in global rankings - supercomputer designers have unleashed almost unimaginable power. A look inside the vast, chilled rooms where the big machines work their magic.
Forget about the America's Cup or Formula One: When it comes to high-stakes technical virtuosity with nationalistic undertones, nothing compares to the race for speed on the Top500. That's the twice-yearly ranking of the world's fastest supercomputers, compiled by a group of computer scientists in the U.S. and Germany.
When the latest results were announced in June, the top spot went to IBM's BlueGene/L, which has won every contest since it was brought online in 2004. The machine, which cost more than $100 million and occupies a 2,500-square-foot air-conditioned room at California's Lawrence Livermore National Laboratory, has a peak processing power of 367 teraflops, or 367 trillion "floating-point operations per second."
That's roughly equivalent to 75,000 personal computers hammering away at the same problem at the same time. Even the bottom dwellers on the Top500 list are impressive, clocking in at six teraflops, 1,000 times more muscle than your home PC.
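The article's two comparisons can be sanity-checked with a little arithmetic. The per-PC speed below is implied by the article's own figures, not stated in it:

```python
# Back-of-envelope check of the article's comparisons. The home-PC speeds
# are implied by the quoted ratios, not stated directly in the article.
PEAK_TFLOPS = 367        # BlueGene/L peak (Top500, June 2006)
PCS_EQUIVALENT = 75_000  # "roughly equivalent to 75,000 personal computers"
BOTTOM_TFLOPS = 6        # slowest machine on the list, "1,000 times" a home PC

# GFLOPS a single home PC would need for each comparison to hold.
pc_from_peak = PEAK_TFLOPS * 1_000 / PCS_EQUIVALENT   # ≈ 4.9 GFLOPS
pc_from_bottom = BOTTOM_TFLOPS * 1_000 / 1_000        # = 6.0 GFLOPS

print(f"implied PC speed from peak comparison:   {pc_from_peak:.1f} GFLOPS")
print(f"implied PC speed from bottom comparison: {pc_from_bottom:.1f} GFLOPS")
```

Both work out to roughly 5 gigaflops per PC, a plausible figure for a mid-2000s desktop, so the two comparisons are consistent with each other.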
"Tera" derives from the Greek for "monster." It's an apt description of BlueGene/L, a behemoth that helps manage the U.S. nuclear arsenal. It was this military application that attracted war-zone photographer Simon Norfolk, who captured the images on these pages. "I wanted to draw out the idea of a battlefield," says Norfolk, who, in more than a year of shooting, managed to talk his way, oversized mahogany field camera in tow, into the closely guarded facilities that house BlueGene/L and a number of other supercomputers on the Top500 list.
Norfolk's images provide a rare view of machines that, in a world of ever smaller and more personal computing, might seem to be relics. But thanks to extraordinary price-to-performance gains, supercomputer sales were up 25% in 2005, to $9.2 billion. That makes them the fastest-growing IT segment tracked by research firm IDC - and the demand is coming from both public and private sectors.
President Bush promised to double federal funding to America's biggest supercomputing projects in his 2006 State of the Union, and companies are finding new ways to use the smaller systems that are now flooding the market. "Supercomputers are about a trillion times faster than they were 30 years ago," says Alan Gara, chief Blue Gene architect at IBM. "And the cost hasn't changed that much. In other words, they are now a trillion times cheaper."
The essential skill of supercomputers boils down to a single word: simulation. Reality is data-rich, filled with systems that interact with one another in tangled feedback loops. So it's handy to have a massively parallel supercomputer around to break down tough simulation problems into many small bits and distribute the work to hundreds, even thousands, of wired-together processors.
Want to know how high the sea will rise given an increase in global temperature? Divide earth, sea, and air into billions of easy-to-model pieces and assign those pieces to your processors. Insert a disturbance, such as an assumption about greenhouse-gas emissions, and each processor will begin frantically calculating and recalculating to keep track of its own territories, even as neighboring territories are changing, somewhat like a giant, high-speed game of telephone.
Eventually - sometimes after days of that collective cacophony - a unified version of some future reality emerges. Variations on the technique can be applied to many of the world's other most pressing problems: the hunt for oil, planning for flu pandemics, even scanning for troublesome inbound asteroids.
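The divide-and-recalculate approach described above can be sketched in miniature: a one-dimensional heat-diffusion grid is split into chunks ("territories"), each chunk is updated on its own, and neighboring chunks swap their edge values between steps - the "game of telephone." All names and parameters here are illustrative toys, not taken from any real supercomputing code:

```python
# Toy domain decomposition: split a 1-D heat grid into chunks, update each
# chunk independently, and exchange boundary ("halo") values each step.

def step_chunk(left_halo, chunk, right_halo, alpha=0.1):
    """Advance one chunk one time step using neighbors' edge values."""
    padded = [left_halo] + chunk + [right_halo]
    return [padded[i] + alpha * (padded[i - 1] - 2 * padded[i] + padded[i + 1])
            for i in range(1, len(padded) - 1)]

def simulate(grid, n_chunks=4, steps=50):
    size = len(grid) // n_chunks
    chunks = [grid[i * size:(i + 1) * size] for i in range(n_chunks)]
    for _ in range(steps):
        # The "telephone" step: each chunk reads its neighbors' edges
        # (outer boundaries reflect, i.e., no heat escapes the grid).
        halos = [(chunks[i - 1][-1] if i > 0 else chunks[i][0],
                  chunks[i + 1][0] if i < n_chunks - 1 else chunks[i][-1])
                 for i in range(n_chunks)]
        # Then every chunk updates at once (sequential here; a real
        # machine would assign one chunk per processor).
        chunks = [step_chunk(l, c, r) for (l, r), c in zip(halos, chunks)]
    return [x for c in chunks for x in c]

# A hot spot in the middle of the grid diffuses outward over time.
grid = [0.0] * 16
grid[8] = 100.0
result = simulate(grid)
print(f"peak temperature after diffusion: {max(result):.1f}")
```

Because the boundary exchange is symmetric, the total heat is conserved while the peak spreads out - the same pattern, scaled up to billions of cells and thousands of processors, underlies the climate and weapons simulations described in the article.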
Not all supercomputing applications are so apocalyptic. Procter & Gamble once deployed a supercomputer to model the aerodynamics of a Pringles potato chip when it found that too many were flying off the assembly line. Car companies have long used supercomputers to design new models. And the financial services industry quietly burns teraflops to test investment strategies and balance portfolio risks.
Then, of course, there is the ultimate simulation.
Researchers in Switzerland are using a cousin of BlueGene/L, dubbed Blue Brain, to simulate the human neocortex. Like most in the field, Henry Markram, the project's leader, downplays the probability of his computer's "waking up" one day.
The human brain, he says, is perhaps a million times more powerful than today's most powerful machines. Still, Markram takes the hypothetical coolly in stride. "If it does happen," he says simply, "then we're going to be able to study consciousness very systematically."