The Cell of a New Machine
AN IBM-LED CONSORTIUM NAMED ITS RADICAL SPEED-DEMON-OF-A-CHIP AFTER THE BASIC BUILDING BLOCK OF LIFE. WILL IT GIVE BIRTH TO A REVOLUTIONARY ERA IN THE ELECTRONICS INDUSTRY?
By Erick Schonfeld

(Business 2.0) – To reach the lab where IBM, Sony, and Toshiba engineers have spent four years and more than $400 million toiling in secret on a computer chip that, if they are right, will usher in a dramatic new era of electronics, you must first pass through a lobby in a six-story IBM building on the outskirts of Austin. The glass walls there are plastered with bright yellow signs warning employees not to "discuss confidential information in this area." After negotiating a swipe-protected door, you reach the lab where prototypes of the advanced chip are kept. Half a dozen engineers huddle around a workstation, eyeing you warily. No outsider has ever breached this inner sanctum before.

They call this chip Cell--as in the basic building block of life--and the engineers gathered in the lab on a recent morning don't find anything presumptuous about the name. Engineer Barry Minor boots up a terrain-rendering program he wrote for Cell. "I've waited four years to show this to somebody," he says.

Using a joystick, he maneuvers above an incredibly lifelike three-dimensional representation of Mount St. Helens displayed on a 30-inch high-definition screen. The flyover is smooth and exhilarating, but the most mind-blowing thing is that this is an exact replica of the real mountain, down to the location of every ridge, dome, and lake. Minor's program, powered solely by two Cell chips, takes data from two overlaid bird's-eye maps of Mount St. Helens--a satellite map provides the landscape's color, while a laser-generated aircraft map provides the elevation. Through immensely complex mathematics, the program converts these two-dimensional images into 3-D images of what the mountain would look like from any angle, at any second, in response to any twitch of the joystick.
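The program's internals aren't described beyond that, but the core trick--pairing a color raster with an elevation raster and projecting each sample through a virtual camera--can be sketched in ordinary C++. Everything below (the grid size, the camera placement, the dome-shaped stand-in terrain) is invented for illustration; it only hints at what a production renderer, let alone Minor's, actually does.

```cpp
// A minimal heightmap-projection sketch (not Minor's code, which isn't public):
// each grid sample takes its height from the laser elevation map and its color
// from the satellite map, then is pushed through a simple pinhole camera.
#include <algorithm>
#include <cmath>
#include <cstdio>
#include <vector>

struct Vec3 { double x, y, z; };

struct Camera {
    Vec3 pos;       // camera position above the terrain
    double yaw;     // heading in radians (the joystick's job in the demo)
    double focal;   // focal length, in pixels
};

// Rotate a world-space point into camera space and project it to the screen.
bool project(const Camera& cam, Vec3 p, double& sx, double& sy) {
    double dx = p.x - cam.pos.x, dy = p.y - cam.pos.y, dz = p.z - cam.pos.z;
    double right   =  std::cos(cam.yaw) * dx + std::sin(cam.yaw) * dy;
    double forward = -std::sin(cam.yaw) * dx + std::cos(cam.yaw) * dy;
    if (forward <= 0.1) return false;      // behind the camera
    sx = cam.focal * right / forward;      // perspective divide
    sy = cam.focal * dz / forward;
    return true;
}

int main() {
    const int W = 256, H = 256;                    // toy raster size
    std::vector<double> elevation(W * H, 0.0);     // stands in for the laser map
    std::vector<unsigned> color(W * H, 0x808080);  // stands in for the satellite map

    // Fake a dome-shaped "mountain" so the sketch runs standalone.
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            double dx = x - W / 2.0, dy = y - H / 2.0;
            elevation[y * W + x] = std::max(0.0, 2500.0 - 0.15 * (dx * dx + dy * dy));
        }

    const double kPi = 3.141592653589793;
    Camera cam{{-200.0, -200.0, 3000.0}, kPi / 4.0, 400.0};

    // Project one point per grid sample; a real renderer would build triangles
    // between neighbors, texture them with the color map, and shade them.
    int visible = 0;
    unsigned shade = 0;
    double sx, sy;
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            Vec3 p{double(x), double(y), elevation[y * W + x]};
            if (project(cam, p, sx, sy)) {
                ++visible;
                shade ^= color[y * W + x];  // stand-in for writing the pixel at (sx, sy)
            }
        }
    std::printf("%d of %d samples land in front of the camera (shade %#x)\n",
                visible, W * H, shade);
}
```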

A few feet to Minor's left, a Pentium 4 laptop with an extra graphics processor groans away, trying to render just a single frame of a mountain using a more traditional technique employed by today's high-grade computer graphics systems. The side-by-side comparison with Minor's demonstration is stark. The fluid, ever-shifting alpine images powered by Cell are stunning. One of Minor's colleagues points out that before they had Cell prototypes, they ran the same software on a simulator, and "it took all night to render one frame." In other words, this chip is crazy fast.

The Cell processor is unlike any microchip ever made. It is at the forefront of an emerging class of chips known as "multicore" processors, meaning chips with more than one processing core burned onto the same piece of silicon. PC-oriented heavyweights Intel and Advanced Micro Devices are only now introducing chips with two processing cores. But the Cell chip has nine cores, each of which can run vast numbers of separate calculations simultaneously. Another mark of singularity: Cell is aimed not at the PC market but at the wider and rapidly evolving broadband world. Cell is faster than any comparable PC or videogame chip out there, especially for the heavy lifting of generating graphics or decompressing video. It can run 256 billion operations per second--256 gigaflops, in geek-speak. And Cell has already been sold: The chip goes into mass production this year and is expected to hit the streets in force in 2006 as the engine of Sony's wildly anticipated PlayStation 3 game player. Cell is 40 times faster than the chip that runs the PlayStation 2, one of the great consumer device successes of all time.
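For perspective on the 256-gigaflop figure, the arithmetic works out if eight of the chip's cores each complete a four-wide vector multiply-add--eight floating-point operations--every cycle at a 4-gigahertz clock. That per-core breakdown is an assumption made here for illustration, not a figure the consortium has published in this article.

```cpp
// Back-of-the-envelope check on the 256-gigaflop claim. The per-core
// throughput (a 4-wide SIMD multiply-add, i.e. 8 operations per cycle)
// is an assumed figure for illustration, not one from the article.
#include <cstdio>

int main() {
    const double vector_cores  = 8;    // the ninth, general-purpose core is left out
    const double clock_ghz     = 4.0;  // top of the 3-to-4 GHz design range
    const double ops_per_cycle = 8;    // 4-wide multiply-add each cycle (assumed)
    std::printf("peak ~= %.0f gigaflops\n",
                vector_cores * clock_ghz * ops_per_cycle);  // prints 256
}
```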

Cell represents more than just a radical new design with great potential for spurring new products and markets. The semiconductor industry is at the brink of a momentous shift, a point when basic design approaches that have fueled chip--and thus computing--advances for decades must yield to something new. The reason is simple: Moore's Law, the durable principle that explained how chips could double in power roughly every two years, has begun to run up against very real limitations, and that kind of performance gain is increasingly difficult to deliver. But multicore designs get around some of the major barriers facing Moore's Law by promising more muscle with less power consumption.

Or so much of the chip industry is betting. Almost all the big guns of the semiconductor world are shifting to a multicore approach; the resulting race to command the high ground of the new technology will do much to determine who dominates the chip industry in coming decades. "Everybody is doing this," says Sun Microsystems co-founder Andy Bechtolsheim, who is designing a new multicore server line for Sun based on AMD's Opteron chips. Intel has at least 15 dual-core projects in various stages of development. And the next Xbox from Microsoft, the Xbox 360, will sport a three-core chip that is also based on IBM's architecture but is otherwise completely different from Cell. "People see the current limitations, so they are going multicore," notes Jim Kahle, the IBM engineer who is the chief architect of Cell. "We just got out in front of it."

To stay there, IBM, Sony, and Toshiba have dreamed up elaborate plans for maximizing Cell's potential. Each partner will use Cell in its own products. Ultimately they want to license it to other manufacturers and even rivals. But the first place Cell will make a splash will be in a living room near you. The PlayStation 3 will be the highest-profile device to take advantage of Cell, but others will follow. Sony, which plans to spend $2 billion to produce the chip, is designing it into everything from flat-panel TVs to home media servers. Toshiba is also working on a Cell TV. Toshiba engineer Yoshio Masubuchi predicts that such a device could connect to other Cell TVs for videoconferencing, allow viewers to change channels with hand gestures or voice commands, and permit content and advertising to be customized as it rolls in.

Expanding into the living room, of course, is also one of the top priorities of Apple, Intel, Microsoft, and just about every other tech power. But whereas most of them foresee the PC acting as a hub that directs all the other digital devices in the home, the introduction of Cell suggests something entirely different. "We are going to see consumer electronic devices as we know them morph into add-ons to the network," predicts Michael Cohen, an analyst with Pacific American Securities, "and I see Cell as the brains." One of the driving ambitions behind Cell, then, is to scatter digital smarts among the videogame players, TVs, media servers, and other digital devices connected on the home network. These devices will be able to act independently or swarm on common tasks as needed. The commercial payoff, some experts say, could be immense. "This," says Kevin Krewell, editor-in-chief of the Microprocessor Report, "is a battle over who owns the living room." The winner will define the next stage of computing.

Moore's Law turned 40 years old this spring. It has had an illustrious life since it was introduced by Intel co-founder Gordon Moore, and Intel marked the latest birthday by paying $10,000 to a collector in Britain who had one of the last mint-condition copies of the obscure engineering journal where Moore first published his idea. For purists, Moore's Law isn't ready for its requiem: Technically, Moore merely predicted that engineers would be able to double the number of transistors on a chip every 12 months. (Later he amended the time frame to 24 months.) Engineers can probably figure out ways to do that for a long time to come.
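The gap between the two time frames is easy to underestimate. Under a doubling period T, the transistor count grows as N(t) = N0 * 2^(t/T), so after a decade the 12-month and 24-month versions of the law differ by a factor of 32. The starting count below (Intel's first microprocessor, the 4004) is used only to make the arithmetic concrete.

```cpp
// Moore's original 12-month doubling versus the amended 24-month pace,
// using N(t) = N0 * 2^(t / T). Illustrative numbers only.
#include <cmath>
#include <cstdio>

int main() {
    const double n0 = 2300;  // transistors on Intel's 4004, circa 1971
    const double spans[] = {5.0, 10.0, 20.0};
    for (double years : spans)
        std::printf("after %2.0f years: 12-month pace %10.3g, 24-month pace %10.3g\n",
                    years,
                    n0 * std::pow(2.0, years / 1.0),
                    n0 * std::pow(2.0, years / 2.0));
}
```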

But what's stumping engineers, and thus threatening Moore's Law in practice, is heat. As the width of transistors shrinks--the standard is now 90 nanometers, 500 times skinnier than a human hair--they run hotter. If nothing is done, chips will soon run hot enough to melt metal. Other problems crop up at the nanoscale; the bottom line is that although we haven't reached the end of Moore's Law, as a practical matter you can see it from here.

Most of these issues were already looming by late 1999, when the Cell project began to take wing. It sprang from a meeting in Tokyo between Lou Gerstner, then IBM's boss, and Nobuyuki Idei, then Sony's CEO. They were looking for some way the two companies could pair IBM's technical wizardry with Sony's market-making muscle. The PlayStation 2 was about to launch, and Sony was already thinking about its successor, so Idei proposed a partnership focused on creating a breakthrough chip for the next PlayStation. From that point, it fell to Sony Computer Entertainment chief Ken Kutaragi and a couple of IBM Microelectronics executives, including head of R&D Bijan Davari, to come up with a more specific plan.

Kutaragi, known for the bold stroke and the grand vision, swung for the fences from the get-go. "We want to do something that has never been done before," he told Davari and a group of IBMers at their first meeting. "Let's work together to change the world." The movie The Matrix had just come out, and Kutaragi relished its premise of a world that is actually a giant computer simulation. "Think about creating a crude version of that world," he said, "where millions of people can play in a realistically rendered virtual Tokyo or New York City as if they are really living there." Creating that magical realm, Kutaragi told the team, would require a chip 1,000 times as powerful as the one in the PlayStation 2. The IBMers tried not to roll their eyes. They tended to like all that Matrix stuff, but when it came to 1,000-fold chip boosts, they thought Kutaragi was out of his mind.

The person Davari tapped to lead the project was Kahle, then a 40-year-old hotshot chip designer. An unyielding perfectionist, Kahle is the kind of guy who built his own loudspeakers in high school, winding the coils by hand (and often listening to Frank Zappa over them with his cousin, Web pioneer Brewster Kahle). He had designed IBM's first dual-core chip, the Power4, and was just coming off a project that produced the IBM chip that powers Apple's G5 computers. "I don't want to do the normal stuff," he says with a shrug.

Normal obviously wasn't what Kutaragi had in mind. Still, one of Kahle's first moves was to talk Kutaragi down from that fantasy of a 1,000-fold power increase. Kahle figured that a 100-fold boost from one chip generation to the next--something rarely, if ever, achieved in the history of semiconductors--was ambitious enough. In March 2001 the design center was set up in Austin. Sony and third partner Toshiba together sent about 70 engineers to Texas. Ultimately more than 400 would work on the project, most of them from IBM.

When Kahle started recruiting, he had difficulty persuading people to move to Austin, and had to overcome skepticism about the project itself. "What if you can't do it?" asked Ted Maeurer, an IBM programmer recruited by Kahle. "I don't fail," Kahle replied. Maeurer was sold, and became Cell's chief software engineer. As Kahle assembled his team, he rallied them with the grand challenge before them--and the notion that it offered IBM a chance to regain a technical summit in semiconductors that it long ago surrendered to Intel. "The opportunity to work on a chip like this," he told his engineers, "comes only once in a lifetime."

Right away it was evident that a traditional single-core chip would not work. For decades the semiconductor industry in a sense had been on autopilot. Chip designers took advantage of Moore's Law simply to increase the clock speed of each new chip rather than using the transistors in more creative, more efficient ways. But Kahle knew that heat and other issues would make it impossible to create Cell with same-old, same-old thinking. He was envisioning a chip that would ultimately have 234 million transistors; if it were a standard processor, it would run as hot as a toaster oven. "You would not want it sitting in your living room," observes Peter Hofstee, one of Cell's lead designers.

As it happened, Kahle had already run into a similar problem in the late 1990s while working on the Power4. Rather than jamming more transistors into one processor, he decided to use the extra transistor real estate provided by Moore's Law to put two simpler processor cores on the same chip and run them in parallel. While he didn't invent the concept of a multicore chip, Kahle was one of the first to apply the idea to processors. Instead of running a single program faster on one processor, similar performance gains could be achieved by splitting up the program or running multiple programs at the same time on multiple processor cores.
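In code, the shift Kahle was describing looks something like the sketch below--not Cell code, just generic C++ threads--where a single job is cut into slices and each slice runs on its own core at the same time.

```cpp
// The multicore idea in miniature: instead of pushing one core to a higher
// clock, split the work into chunks and run the chunks on separate cores
// simultaneously. Generic C++ threads; illustrative only.
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    std::vector<double> data(1 << 24, 1.0);  // ~16 million elements to sum
    const unsigned cores = std::max(1u, std::thread::hardware_concurrency());
    std::vector<double> partial(cores, 0.0);
    std::vector<std::thread> workers;

    const size_t chunk = data.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        size_t begin = c * chunk;
        size_t end = (c + 1 == cores) ? data.size() : begin + chunk;
        // Each core sums its own slice; nothing is shared, so no locks are needed.
        workers.emplace_back([&, c, begin, end] {
            partial[c] = std::accumulate(data.begin() + begin,
                                         data.begin() + end, 0.0);
        });
    }
    for (auto& w : workers) w.join();

    double total = std::accumulate(partial.begin(), partial.end(), 0.0);
    std::printf("sum = %.0f, computed across %u cores\n", total, cores);
}
```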

Going multicore solved some of the heat and power problems facing Cell, but it didn't solve everything. Kutaragi was incredibly demanding and repeatedly sent Kahle back to the drawing board. At one point about a year into the project, Kutaragi made the team scrap the whole system structure and start over nearly from scratch. Another time Kutaragi decided he wanted two more cores. Why? "He just wanted to squeeze the engineering team," explains Masakazu Suzuoki, Sony's top Cell engineer, wringing his hands as if strangling a snake. "It hurt your head," Kahle recalls. Making the pain worse: The team still had to deliver the chip on the original schedule.

Kahle recognized that he needed to incorporate more flexibility into the design to meet Kutaragi's ever-changing requirements. He turned on the television one day and saw a show about an airport in Asia that's built on sinking ground. "Every day they relevel the terminals with these giant screw jacks," he explains with a note of wonder. He decided that his project needed to be similarly able to adjust to unforeseen ground shifts. For instance, Kahle wanted the option of allocating different tasks to whatever cores were available, so they couldn't be too specialized. But he couldn't make them all general-purpose processors either, because those need more transistors and more power. Cell ended up having one general-purpose Power processor core that acts as the main coordinator and farms out tasks to a set of leaner and faster "synergistic" cores that can act at will as any one of a number of specialized processors--for, say, graphics, network, media, or security tasks.
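The division of labor described above can be caricatured with an ordinary task-queue pattern: one coordinating thread hands work items to a pool of interchangeable workers, each of which takes on whatever role the item demands. This is only a shape-of-the-idea sketch in generic C++ threads; Cell's actual programming model is considerably more involved.

```cpp
// A rough sketch of "one coordinator, many flexible workers." Not Cell code.
#include <condition_variable>
#include <cstdio>
#include <functional>
#include <mutex>
#include <queue>
#include <thread>
#include <vector>

class Coordinator {
public:
    explicit Coordinator(unsigned workers) {
        for (unsigned i = 0; i < workers; ++i)
            pool_.emplace_back([this] { run(); });
    }
    ~Coordinator() {  // drain remaining tasks, then shut the workers down
        { std::lock_guard<std::mutex> g(m_); done_ = true; }
        cv_.notify_all();
        for (auto& t : pool_) t.join();
    }
    // The coordinating core hands any kind of task to whichever worker is free.
    void submit(std::function<void()> task) {
        { std::lock_guard<std::mutex> g(m_); tasks_.push(std::move(task)); }
        cv_.notify_one();
    }
private:
    void run() {
        for (;;) {
            std::function<void()> task;
            {
                std::unique_lock<std::mutex> lk(m_);
                cv_.wait(lk, [&] { return done_ || !tasks_.empty(); });
                if (done_ && tasks_.empty()) return;
                task = std::move(tasks_.front());
                tasks_.pop();
            }
            task();  // a worker can act as a graphics, media, or security engine
        }
    }
    std::vector<std::thread> pool_;
    std::queue<std::function<void()>> tasks_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
};

int main() {
    Coordinator cell(8);  // eight flexible workers, echoing Cell's layout
    cell.submit([] { std::puts("decode a video frame"); });
    cell.submit([] { std::puts("transform geometry"); });
    cell.submit([] { std::puts("decrypt a content stream"); });
}
```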

The synergistic processor cores do all the heavy computational lifting and are especially adept at broadband-related tasks such as calculating the physics of moving objects, compressing and decompressing video, and encrypting and decrypting copyrighted content. To run things in parallel between the cores, Kahle used technology licensed from Rambus to make the internal bandwidth of the chip really blaze. Cell can even support multiple operating systems (although there are no plans to make Windows one of them).

Kutaragi named the chip Cell because he thinks of each computing device as part of a larger networked organism, held together by Cell chips. By the spring of 2004, crunch time had come for the team: Cell prototypes arrived in Austin from IBM's plant in Fishkill, N.Y. The engineers loaded the Linux operating system on the chip. They turned it on. It worked on the first try. Cheers erupted from the relieved engineers. Champagne corks popped.

Then one senior engineer, impatient with the pace of the testing, yelled, "Crank it up!" They got the chip up to 5.6 gigahertz before its voltage regulator popped. (It was designed to operate at between 3 and 4 gigahertz.) "If you don't get smoke at some point," Maeurer says, "you're probably not pushing things hard enough."

To this day, few people even inside the allied companies know the details of Cell's development or the high hopes its backers hold. The Cell engineers are still not supposed to talk about much of their work to anyone outside the Austin facility. One day the air-conditioning broke down in the lab, and as the temperature soared, the engineers propped open the doors. Word got around. The company had to post guards to turn back rubbernecking colleagues eager for a glimpse of what was going on in there.

Competitors and industry observers are likewise curious, if not fully convinced that Cell will sweep the world. Bill Leszinske, director of marketing for Intel's digital home division, questions how easy Cell will be to program, and whether software developers will support it. "There are already millions of developers who are familiar with the [Intel-based] PC architecture," he notes, and that architecture is also going multicore. Kahle acknowledges that programming a Cell device will be trickier than programming a PC. "But this is not here to take out the PC," he says. It's aimed instead at new broadband markets that the PC is ill-equipped to conquer. There are at least two compelling strategic reasons for aiming at broadband: IBM doesn't see any percentage in taking on Intel in its PC stronghold, and broadband's future growth prospects dwarf those of the aging desktop market.

Besides, Kahle knows he can't make Cell difficult to program. From the outset, he assigned a group of engineers led by Maeurer to develop programming tools that IBM plans to release to the open-source community later this year. Sony and Toshiba are working on their own programming tools for videogames and TVs.

The true test of whether Cell will become the new broadband standard for digital devices, of course, will simply be how quickly it spreads--in both its champions' products and other markets. IBM plans to introduce a Cell workstation for videogame programmers later this year. Within Sony the chip is slated for a whole line of new products, and company executives insist it is still a top priority. But earlier this year, Kutaragi was passed over for CEO in favor of Howard Stringer; Kutaragi also no longer oversees Sony's semiconductor business. Some observers, even inside the company, believe that could ease the pressure to put Cell into every new Sony product.

Then there's the fact that so far no company beyond IBM, Sony, and Toshiba has signed up to license the chip. "Of course we are very open to sharing our dreams with other companies," Suzuoki says. That, however, is news to Microsoft, a company using chip technology from IBM. "They never showed me the Cell approach," says Xbox hardware development chief Todd Holmdahl. Apple, whose Macs are already based on IBM's Power chips, would be an obvious candidate to adopt Cell; a Cell-powered Mac media home server, for instance, could give the chip the sort of outside validation it needs. (Apple declined to comment on its plans.)

To be fair, the Cell partners are just starting to shop the chip around. IBM, in particular, says it's making headway in defense, medical imaging, Internet switching, and industrial inspection equipment. The company suggests that the chip could be especially useful in crunching the mammoth amounts of data the military will collect as it develops so-called network-centric operations, where heavy armor is replaced by more perfect information--such as torrents of broadband video--that must be processed on the fly.

Ask Kahle what new doors Cell will open and he demurs: "I'm a chip designer, not the oracle." Yet there are other glimpses of Cell's prospects in Kahle's own lab. There Minor, the software engineer, displays a modified Toshiba Pocket PC that he has lashed to a small computer board with a GPS chip, a digital compass, and two accelerometers. It can wirelessly send his location back to the Cell-powered server running his terrain-rendering program, which then returns a 3-D image of the landscape to him. A hiker lost in the fog could use this device to see a realistic image of what's beyond the mist or over the next ridge. "It tracks just like a video display on a camcorder," Minor says, "but instead of looking at this world, it is looking at a virtual world." Today this gadget is just a techie's delight. But it's not hard to imagine it one day in the hands of hikers or, for that matter, Special Forces operatives, searching for cave openings in unknown lands.
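The handheld's wire format isn't described, but the exchange it implies is simple: a small request carrying position and orientation goes up to the server, and a rendered frame comes back. The structures and field names below are guesses for illustration only, with a stub standing in for the server's Cell-powered renderer.

```cpp
// A guess at the shape of the handheld-to-server exchange described above;
// the real protocol is not documented in the article.
#include <cstdint>
#include <cstdio>
#include <vector>

struct ViewRequest {          // sent by the handheld over the wireless link
    double latitude;          // from the GPS chip
    double longitude;
    double heading_deg;       // from the digital compass
    double pitch_deg;         // from the accelerometers
    uint16_t width, height;   // size of the image the device can display
};

struct ViewResponse {         // returned by the rendering server
    uint16_t width, height;
    std::vector<uint32_t> pixels;  // the rendered landscape, one RGBA word per pixel
};

// Stand-in for the server-side terrain renderer running on Cell.
ViewResponse render_view(const ViewRequest& req) {
    ViewResponse out{req.width, req.height,
                     std::vector<uint32_t>(size_t(req.width) * req.height, 0xff888888u)};
    return out;  // a real server would rasterize the elevation and color maps here
}

int main() {
    // A fix near Mount St. Helens, looking west, tilted slightly downhill.
    ViewRequest req{46.1914, -122.1956, 270.0, -5.0, 320, 240};
    ViewResponse view = render_view(req);
    std::printf("received %ux%u frame (%zu pixels)\n",
                view.width, view.height, view.pixels.size());
}
```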