20 That Made History
Many of these epic decisions were breathtakingly smart. Some were appallingly stupid. But all of them shaped the modern world of business.
By Jerry Useem

Picture a hallway. You're walking down it, alone. Before you reach the end, you need to reach a decision. Your engineers have been hard at work on a daring new product. But now the stakes have grown so big that ... well, you wouldn't be betting just the farm at this point. You'd be betting the farm, the house, and the kids. And your rival--a far more established firm--has upped the ante, promising a product that has sent your engineers back to the drawing board. Now you're about to meet with your biggest potential customer. And you have two choices. You can make a bet that--if it doesn't bankrupt your company outright--might repay itself sometime in the next couple of decades. Or you can keep your chips safe for another day.

What do you do? If your instincts say, "Walk away," you've made a sound decision--one that would probably pass any discounted-cash-flow test with flying colors. You've also just killed the 707, the plane that vaulted Boeing past Douglas Aircraft.

The man on the spot that day in 1955, Boeing president Bill Allen, wasn't armed with hindsight. What he did have was an iron stomach. In promising American Airlines the jetliner it wanted instead of the one Boeing had geared up to make, Allen took a giant gulp of uncertainty. When the 707 took to the skies three years later, it flew America into the jet age.

But that's the language of history, where everything sounds inevitable. "The problem with television," the New York Times wrote at the dawn of the television age, "is that people must sit and keep their eyes glued to a screen; the average American family hasn't time for it." It's a reminder that things now past were once in the future; real people and their choices were the bridge. The best decision-makers were capable of seeing the present as if it were already the past. And by seeing the future "as in a vision, a dream of the whole thing," as steel magnate Charlie Schwab put it a century ago, they couldn't help but define it.

The 20 decisions below helped create the business world as we know it. But that's not the only reason they are on this list. They're here because they shed light on how powerful people make decisions--some exceedingly good, some hideously bad, and all history-shaping.

So blindfold yourself and imagine you are back in 1876, when the long-distance business involved Morse code, or in 1964, trying to invent a computer age that doesn't yet exist. You don't have a crystal ball to consult. You have a decision to make. -- Jerry Useem

1876: Western Union hangs up on Bell's new invention

In 1876 a man named Gardiner Hubbard presented the Western Union telegraph company with the chance to buy a set of patents. The asking price? $100,000. The product in question? Alexander Graham Bell's telephone.

Bell had won a technological victory earlier that year, beating another inventor, Elisha Gray, to the patent office by hours. But the all-out effort had left him broke. And his soon-to-be father-in-law, Hubbard, had struggled to convince people that the telephone would be anything more than a parlor trick. So when the newlywed Bells left for an extended honeymoon in Europe, George Sanders (the biggest investor in the enterprise) and Hubbard decided to unload the patents. Western Union was an obvious buyer--it dominated the young business of long-distance communication, controlling the network of telegraph lines criss-crossing the country.

Western Union president William Orton responded, "This 'telephone' has too many shortcomings to be seriously considered as a means of communication. The device is inherently of no value to us." He rejected the offer.

Orton had clashed with Hubbard several years before during an attempt to break Western Union's monopoly on telegraph lines. Not only was he uninterested in providing Hubbard with financial gain, but Orton had faith that should the telephone amount to anything, his giant corporation would be able to force Bell out of the market without trouble.

It didn't take long for Western Union to recognize the magnitude of its blunder. Within the year Western Union began hearing that its customers were abandoning the telegraph for phones leased from the newly founded Bell Co. So the company hastily put out its own version of the telephone, using Elisha Gray's patents and a design by Thomas Edison. A furious legal battle followed. Tiny Bell Co. prevailed--and Western Union was forced to lease telephone equipment from Bell.

Though it would take Bell Co. several years to get on solid financial footing, the baton had been passed: Western Union began to decline. Bell and its successor, AT&T, would rule the communications industry for the next century. -- Kate Bonamici

1903: King Gillette decides to throw away the blades

Today America is awash in disposable diapers, disposable cameras, even disposable clothes. But when a former bottle-cap salesman from Boston named King Camp Gillette started selling safety razors with disposable blades in 1903, people weren't disposed to throw things away. The very idea of discarding something without reusing or repairing it ran counter to American notions of thrift. But Gillette, a part-time inventor whose earlier patents included an improved beer tap, had taken the advice of his boss at Crown Cork & Seal, William Painter, inventor of the bottle cap. "Think of something which, once used, is thrown away," Painter told him, "and the customer keeps coming back for more."

Gillette was staring at his dull razor one morning when that thing came to him. Like other razors of the day, his blade required time-consuming "stropping" and professional resharpenings to remain useful. Gillette spent the next eight years figuring out how to cast a blade thin enough--and therefore cheap enough--to throw away when it got dull. In 1901 he filed the patent for the first razor with a disposable blade.

Persuading men to buy it was easier than convincing them they could dispose of it. As Russell Adams relates in King Gillette: The Man and His Wonderful Shaving Device, Sinclair Lewis's fictional salesman, Babbitt, tossed his used blades atop a medicine cabinet "with a mental note" to do something about the pile. H.L. Mencken claimed he put his in the church collection plate. Some barbers offered illicit resharpening services. King Gillette offered this proposal: Drop your used blades off to be resharpened; then never pick them up.

Contrary to myth, Gillette never did "give away the razor and sell the blade." The kit cost a hefty $5. But the U.S. Army gave 3.5 million Gillette razors and 32 million blades to soldiers during World War I, hooking a generation--and planting the beginnings of America's throwaway culture. -- Kate Bonamici

1906: Giannini opens his vaults after the quake

A.P. Giannini had one thing on his mind when he was bounced out of bed by the great San Francisco earthquake: his bank. Rushing into the shattered city, he managed to load $80,000 in gold--removed from the deposit vaults of his Bank of Italy by two quick-thinking employees--onto a horse cart, covered with vegetables, before fire consumed the building. Other banks' vaults would be too hot to open for weeks. When his fellow bankers proposed a six-month banking moratorium at a meeting the day after, Giannini broke ranks. "In November," he argued, "there will be no city or people left to serve." He was open for business the next day at a makeshift desk in North Beach, offering to lend money "on a face and a name."

The gesture made Giannini's own name. And it reflected his democratic philosophy: The money in the vaults wasn't there to serve banks. It was there to serve customers. The son of Italian immigrants, Giannini had founded the Bank of Italy in 1904 on the premise that banks should serve more than the fortunate few. Offering loans of $10 to $300 to anyone who had a job, Giannini also convinced those in the working class that they should turn their tin cans of savings over to a bank.

It was Giannini who popularized home mortgages, auto loans, and other pioneering forms of consumer credit. As he expanded his reach by opening branch offices--another new idea--throughout California, he backed unproven businesses like Hollywood (loaning Walt Disney $2 million for Snow White) and the California wine industry. By 1945 his legacy was just about everywhere. His renamed Bank of America was the largest bank in the world. Access to credit had become a cornerstone of middle-classdom. And a few years after his death in 1949, BofA would introduce the public to another new concept: the credit card. -- Kate Bonamici

1914: Ford offers $5 a day

The first of them arrived at 3 A.M. By daybreak some 4,000 were huddled in the deep January freeze. By 7:30, 10,000 men had gathered at the entrance to the Highland Park, Mich., factory, hoping for a job at a wage that sounded too good to be true.

But it was true. The previous morning, Henry Ford had lingered near a window while his treasurer read a statement to reporters: "At one stroke, [the Ford Motor Co.] will reduce the hours of labor from nine to eight" and offer its workers "five dollars per day"--more than twice the prevailing wage of $2.34.

The 1914 announcement hit America like a thunderclap. The Wall Street Journal accused Ford of "economic blunders, if not crimes." But Ford's motives were neither socialist nor utopian. Ever since his assembly lines had lurched into motion the year before, he simply could not keep workers. Turnover of 370% required hiring almost 50,000 people a year just to maintain a workforce of 14,000. Putting $5 in a worker's pocket, Ford hoped, would do more than reward him for grueling and mind-numbing work. It would turn him into a consumer. At one stroke, that is, Ford could mass-produce both a car and a market for it. He could also play social engineer: To qualify for that $5, workers would have to remain in good moral standing (no saloons) and submit to intrusive home visits from Ford's Sociology Department.
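
The arithmetic behind that churn is worth a quick check (a back-of-the-envelope reconstruction from the figures above, not Ford's own ledger):

\[
\text{annual hires} \approx \text{turnover} \times \text{workforce} = 3.7 \times 14{,}000 \approx 52{,}000
\]

which matches the almost 50,000 men Ford had to recruit each year just to keep 14,000 stations manned.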

In one sense, the $5 day changed nothing. Discontinued in 1917, it was a distant memory by the time Ford goons beat up labor organizers in 1937. But it also changed everything. For the first time, a major industrialist had suggested that the contract between employer and employee consisted of more than just a wage (and the lowest possible one at that). By the 1950s that relationship had deepened to include pensions, dental insurance, and the ultimate symbol of corporate paternalism: the gold watch. -- Jerry Useem

1925: Sears gets physical

In 1940, Sears Roebuck turned out the lights for the last time, disbanding its workforce of 25,000. Just decades earlier, its catalogs had been eagerly awaited in the nation's farm towns and prairies. But its once reliable customers had begun motoring into town--if not to live there, then to shop. Time had passed the company by.

It's a history that might have been. But Sears had a leader who saw history coming and was determined to beat it to the punch.

Gen. Robert E. Wood, business historian Richard Tedlow writes in his book New and Improved, "was one of those fortunate few to whom numbers spoke." As an Army logistics officer coordinating construction of the Panama Canal, Wood developed an "odd passion" for The Statistical Abstract of the United States. By the time he returned from World War I, the numbers were telling an important story. Farm income was dropping. Automobile registrations were rising. And the so-called chain stores of James C. Penney were multiplying.

Wood spelled it out for his new employer, Sears' archrival, Montgomery Ward, in a 1921 memorandum. "[W]e can beat the chain stores at their own game," he wrote. "We have four distribution points; we have an organized purchasing system; we have a wonderful name if we choose to take advantage of it." Management blocked the idea--then fired Wood in 1924. He went straight to Sears, was named president within four years, and plunged ahead with his plan.

Imagine today an overnight assault of Amazon.com superstores, and you might begin to grasp the magnitude of this decision. Sears operated out of Chicago, period. Where would it put its new stores? Who would manage them? How do you treat a live customer? Nothing in the company's past had prepared it for those challenges. But its leader had a different past. "Business is like war in one respect," Wood said. "If its grand strategy is correct, any number of tactical errors can be made."

The errors were numerous and costly. The blitz of store openings--more than 300 in just three years--caused Sears to report a loss for only the second time in its history. Bitter feuds erupted between catalog men and store managers. "We had a 100% record of mistakes," Wood would say later. And the early stores looked ridiculous: Placed near highways outside cities, they were surrounded by vast parking lots and little else. Eventually, though, the cities engulfed them, and suburbanites filled the parking lots. By 1931 store sales surpassed catalog revenue. And in 1934, Sears opened a peculiar-looking store. It had no windows, like a big box. -- Jerry Useem

1929: Good-Time Charlie sells his stocks

The advice was couched in the gentlest possible way, but for those who listened, it proved as valuable as any opinion ever to come from Wall Street. On April Fool's Day 1928, with the Dow arcing ever higher and the Jazz Age in full swing, 42-year-old Merrill Lynch founder Charles Merrill advised his clients "to sell enough securities to lighten your obligations ... to take advantage of present high prices and put your own financial house in order."

A society-page fixture, the thrice-married Merrill was nicknamed "Good-Time Charlie" for his flamboyant personal life. But he was prudent when it came to finances. By early 1929 he had followed his own advice and sold most of the firm's stock portfolio, perfectly timing his exit from a bubble that had sent the Dow from 120 in 1925 to 360 in 1929. This was the era of buying on margin, when speculators could snap up stocks with just 10% down (the margin requirement nowadays is 50%) and then use paper gains to buy still more shares.
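
To see what buying on 10% margin meant in practice, consider a stylized example (illustrative numbers, not any particular trade): a speculator's exposure is his cash divided by the margin requirement m, so his return on that cash is the stock's return magnified by 1/m.

\[
\text{leverage} = \frac{1}{m}, \qquad \text{return on cash} = \frac{1}{m} \times \text{return on stock}
\]

At the era's m = 10%, $1,000 down controlled $10,000 of stock: a 10% rise in the shares doubled the speculator's money, and a 10% fall wiped it out. At today's 50% requirement, the same money controls $2,000, and a 10% move gains or loses just 20%.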

"Merrill had an inside view of the bubble, and he was pretty much alone in telling people to sell," says Charles Geisst, a Wall Street historian. Merrill had an edge because he could observe the behavior of the traders and specialists on the floor of the New York Stock Exchange and see firsthand that the buying was based on herd behavior, not on underlying economic fundamentals. Hordes of other investors who got their information from ticker-tape machines couldn't see that.

When the Great Crash came in October, propelling more than a few indebted traders toward the nearest window ledge, Merrill and his firm escaped nearly unscathed. His money intact and his reputation much enhanced, Merrill would build a crucial bridge between Wall Street and Main Street. His Merrill Lynch would help convince ordinary Americans that stocks and bonds were an appropriate choice for savers rather than just vehicles for speculators. Today Merrill Lynch remains Wall Street's leading firm for individual investors, with $1.6 trillion in client assets. And Good-Time Charlie, who died in 1956, is remembered for making the smartest investment call in the history of Wall Street. -- Nelson D. Schwartz

1935: Pan Am flies the ocean blue

The record of business in the 1930s is a somber one--bankruptcy, failure, and the loss of those animal spirits that are as important to capitalism as capital. There were a few exceptions: Hollywood made great flicks, the Twinkie was invented--and aviation took off. In the wake of Charles Lindbergh's solo flight across the Atlantic in 1927, aviation grew briskly, but leaving the country still meant taking a boat. Juan Trippe, the founder of Pan American World Airways, would change that. The patrician son of a New York investment banker (belying his first name, he had no Hispanic roots), Trippe had been rudely shut out of the lucrative U.S. airmail business. So he decided to become king of the international airways instead.

He started flying to Latin America, a sensible choice since the region required only short hops over water and his planes could find landing spots all along the way. At first Pan Am was strictly a mail service. Then Trippe realized that a passenger or two would boost profits, so he installed a couple of wicker seats. "Fly with us to Havana," went one ad, "and you can bathe in Bacardi rum four hours from now."

And then, in a decision that must have seemed like madness at the time, he set his sights slightly farther away--China. This raised a few problems: big oceans, limited range, primitive navigation, and lack of runways. And it was not at all clear that there was a market for flights to Asia. But Trippe was so sure that the world would eventually catch up to his own expansive views of aviation that he did it anyway. In November 1935, Pan Am inaugurated transpacific airmail service with the China Clipper. These flying boats could take off and land on water; they hopped to Hong Kong in about a week via stops in Hawaii, Wake, Midway, Guam, and the Philippines. The first passenger service came a year later. The cost was steep, about $11,000 one way in today's dollars, but cattle class this wasn't. Late-model Clippers featured sleeping berths, a separate dining area, a VIP lounge, and dressing rooms.

In 1939, Pan Am began the first transatlantic service, the Yankee Clipper. Passengers flew in a luxuriously appointed Boeing 314 double-decker, which even had a honeymoon suite. The 29-hour flights were too expensive for the middle class, but they helped build the idea that flying was not just for daredevils and the mail anymore. Between 1935 and 1940 the number of Americans making international flights soared.

Trippe's decision to launch the China Clipper and then the Yankee Clipper provided the fuel for the takeoff of mass international aviation. He has a lot to answer for. -- Cait Murphy

1950: Deming charts Japan's remarkable course

Sometimes big decisions are made to the blare of banner headlines. And sometimes they are made quietly, with no more drama than a puff of air. At a dinner party in Tokyo in the summer of 1950, 21 of Japan's most influential corporate leaders, who accounted for some 80% of the country's industrial capacity, made the latter kind of decision. What they did was listen--specifically to W. Edwards Deming, an obscure American statistician who had never met a payroll and had been to Japan only once before. Deming was nonetheless certain that he knew how to solve postwar Japan's economic problems. "You can send quality out and get food back," he told his skeptical audience.

The pursuit of quality, Deming said, was the key to higher productivity, bigger profits, more jobs, and therefore a richer society. Quality, he lectured, did not begin by finding defects at the end of the production line. It had to be pursued along every link of the supply chain, with the active cooperation of everyone from suppliers to the humblest worker on the factory floor. If Japanese companies followed his 14 points, Deming promised his dinner companions and other managers in a series of lectures that summer, their goods would be world-class in five years. The notion seemed ridiculous. At the time, the term "Made in Japan" was such a joke that some factory owners set up operations in the village of Usa so that they could mark their products "Made in USA." But Japan's poohbahs didn't have any better ideas, so they decided to take up Deming's challenge.

In 1957, Toyota exported its first car to the U.S.--a clunker, as it happens. But by the 1980s, Japan looked ready to eat everyone's economic lunch. Deming's sardonic comment: "Don't blame the Japanese. We did it to ourselves." (And don't blame Deming for Japan's current economic problems, which have to do with banking and politics, not quality.)

Indeed, Deming was the classic prophet without honor in his own country. It wasn't until the '80s that U.S. manufacturers adopted his principles--largely to meet the competition from Japan. Deming's 14 points are now standard operating procedure around the world. -- Cait Murphy

1955: Ruth Handler bets Mattel on the Mouseketeers

Ruth Handler had a simple question for her CFO in January 1955: If her gamble failed, would the company be broke? "Not broke," he answered, "but badly bent." Handler had gotten a call from an ABC television executive seeking sponsors for a new weekly show, The Mickey Mouse Club, and asking if Mattel would buy a one-year sponsorship for $500,000. The sum was enormous--equal to the net worth of the fledgling company she had co-founded, which sold $5 million worth of novelty toys a year, like bubble hats and burp guns.

The power of TV advertising was little understood or exploited at the time, though the medium was growing fast. Between 1949 and 1962 the percentage of U.S. households with televisions shot from 2% to 90%. The toy industry, meanwhile, hardly advertised at all, content to run a few promotions in big cities just before Christmas. Nevertheless, within an hour of getting the call, Handler and her husband, Eliot, Mattel's co-founder, went back to ABC with their answer: yes.

It was a radical move. Instead of betting on a new toy, Handler had bet the company on a whole new concept: advertising toys to children instead of to their parents. Before The Mickey Mouse Club, Handler wrote in her autobiography, 80% of toy sales occurred in the six weeks before Christmas. Her TV ads changed that, spurring kids to demand toys year-round and stores to display toys more prominently. When the ads produced a run on Mattel's burp guns, Handler created another industry standard, sending employees from store to store to set up displays and observe how toys were sold. Within three years Mattel's sales had nearly tripled (to $14 million)--and that was before the birth of Barbie in 1959. So next time your kid whines for a Chicken Dance Elmo you know whom to blame: R-U-T, H-H-A, N-D-L-E-R. -- Kate Bonamici

1957: Arthur Rock funds the Traitorous Eight

In 1956, Nobel laureate William Shockley lured eight world-class engineers and scientists to Palo Alto to form the brain trust of his pioneering semiconductor company. All eight were stars in their fields: Among them were an industrial engineer with a head for numbers named Eugene Kleiner; a dapper physicist from Iowa named Robert Noyce; and a folksy Californian named Gordon Moore, who had earned doctorates in physics and chemistry. But Shockley's mercurial management style quickly wore thin. The eight, who had just as quickly grown to love working with one another, decided to try to find an employer who would hire them as a team.

Arthur Rock, a 31-year-old securities analyst in New York City who specialized in the budding electronics industry, got wind of the restless group by chance: Kleiner's father was a client of Rock's investment bank, Hayden Stone. The best way for the eight to stay together, Rock suggested, was for them to start their own company to develop rudimentary semiconductor components rather than to go to work in the bowels of a larger corporation. There was one small problem: money. "There was no way to set up a company at that time. No mechanisms, no venture capital," Rock recalled.

So he improvised, proposing a financing plan that was as simple as it was profound. Each of the eight would receive 10% of a new company whose sole asset was their combined expertise. Hayden Stone would own the remaining 20%. Then Rock would find an established tech company to loan the startup $1.5 million in seed capital to develop its first products, with the agreement that the investing company would have the option to buy out the startup later.

It was a tough sell. But in 1957 the 36th company Rock approached--Fairchild Camera & Instrument of Syosset, N.Y.--bought in, and Fairchild Semiconductor was born. "Suddenly it became apparent to people like myself, who had always assumed they would be working for a salary for the rest of their lives, that they could get some equity in a startup company," Robert Noyce would say later. (He went on to co-invent the integrated circuit before his death in 1990.) "That was a great revelation--and a great motivation." Two years later Fairchild Camera exercised its option to buy the company, netting each of the Traitorous Eight about $250,000.

Rock didn't know it at the time, but he had hit upon a completely new approach to company building--as well as an almost magical formula for accelerating the development of new technologies and creating immense personal wealth. In a single stroke, Rock had created the DNA for what would become Silicon Valley: venture capital, stock options, and a company that would itself spawn a little startup called Intel. It was all there in that one deal. -- Brent Schlender

1964: Thomas Watson Jr. does a 360

So you've just bought a new computer. Very exciting. Only thing is, the files on your old computer are useless. You need to rewrite all your programs. And your printer won't work anymore. Nothing is compatible! But then, that's how things went before IBM's System/360.

IBM's "$5,000,000,000 Gamble," as FORTUNE called it in 1966, grew out of the company's own compatibility problems. Rivalry between the company's two divisions produced a "wildly disorganized" array of offerings, CEO Thomas J. Watson Jr. recalled in his autobiography, Father, Son & Co. It was T. Vincent Learson, one of the "harsh, scratchy types" Watson valued (and who would eventually succeed him as CEO), who gathered executives from across the company at the New Englander Motor Hotel near Stamford, Conn. Two weeks before Christmas 1961, he essentially locked the doors and told them they couldn't come out until they had reached some conclusions.

On Dec. 28 the group delivered an 80-page report to Watson. It was the birth certificate of the System/360, a family of computers that would remain compatible with future generations--and would render all previous computers, including those of IBM itself, obsolete. As FORTUNE put it, "It was roughly as though General Motors had decided to scrap its existing makes and models and offer in their place one new line of cars, covering the entire spectrum of demand, with a radically redesigned engine and an exotic fuel."

As if that weren't challenge enough, Watson promised to roll out the new line--consisting of six computers and 44 peripherals--all at once rather than piecemeal, to make a "tremendous splash." It turned out to be a tremendous stretch, requiring four years, 65,000 additional employees, and five new factories. But a flood of orders, while nearly capsizing IBM, established a new standard. For the first time the world's computers had a common language. Every time you painlessly upgrade your own desktop, you can think of the project that brought IBM to the brink. -- Corey Hajim

1970: Curt Flood refuses to play ball

"I have only two choices," a seething Curt Flood told his wife, Marian, in late 1969. "I can go to Philadelphia, or I can quit baseball altogether. I will not go to Philadelphia."

No one quite believed that Flood, star center fielder on three World Series teams in St. Louis, would pass up the princely salary of $90,000 for a principle. But Flood wasn't merely unhappy over his trade to a city considered unfriendly to blacks. His main objection was baseball's "reserve clause," the century-old system that allowed owners to hoard and swap players as a kid trades playing cards but banned the players from shopping their own talents. Flood wrote in his 1971 book, The Way It Is: "A salesman reluctant to transfer from one office to another may choose to seek employment on the sales force of a different firm. A plumber can reject the dictates of his boss without relinquishing his right to plumb elsewhere.... But [if the athlete] elects not to work for the corporation that 'owns' his services, baseball forbids him to ply his trade at all. In the hierarchy of living things, he ranks with poultry."

This was more than unfair, Flood argued. It was un-American. On Christmas Eve, Flood put his stance in writing: "I believe I have the right to consider offers from other clubs," he wrote baseball commissioner Bowie Kuhn, and "request that you ... advise them of my availability for the 1970 season."

Kuhn refused. Flood sued and lost. In 1975, though, pitchers Dave McNally and Andy Messersmith challenged the reserve clause--and won. Free agency was born. It would transform the sports business, while typifying a new, larger shift from lifetime employment to worker mobility. Today's multimillionaire ballplayers made out well. But Flood's decision cost him. He sat out all of 1970, and played just 13 games with the lowly Washington Senators in 1971 before retiring, at age 31. He died in 1997. -- Jerry Useem

1972: Ford decides to let the Pinto explode

In the early 1970s, Dennis Gioia, a newly hired recall coordinator at Ford, heard scattered tales that the company's popular new compact, the Pinto, "lit up" when hit in a rear-end collision. But it was only after seeing a crumpled Pinto in Ford's "Chamber of Horrors," where damaged cars were examined for possible flaws, that he wondered whether there was a serious problem. "My revulsion on seeing this incinerated hulk was immediate and profound," he wrote in a 1992 article for the Journal of Business Ethics.

Gioia brought the problem before the recall committee but, lacking evidence of a systemic problem, joined them in voting against a recall. About a year later they got some evidence. During preproduction crash tests, they learned, eight of 11 Pintos had "suffered potentially catastrophic gas tank ruptures" on impact. The fuel tanks of the three other cars had survived only because they'd been shielded from a set of studs that did the puncturing.

For the second time the committee voted, and for a second time it decided not to act. The logic was clear: Conventional wisdom held that small cars were inherently unsafe, and as Ford president Lee Iacocca put it, safety didn't sell. Fixing the problem would probably reduce storage space, already at a premium, and ultimately, the design was legal.

The issue arose a third time in 1977, but now in the pages of Mother Jones magazine. Writer Mark Dowie had acquired a Ford cost-benefit analysis from the early '70s that compared the cost of recalling all Ford cars with rear-mounted fuel tanks (not just Pintos) against the costs of restitution for the families of those injured or killed by the Pinto's flaw. It would be cheaper, by a factor of three to one, to pay off victims and their families than to make an $11 fix in each car.
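
The memo's arithmetic, as widely reported after Dowie published it (the article gives only the ratio and the $11 figure; the totals below are the commonly cited ones, and approximate):

\[
\text{cost of the fix} \approx 12.5\text{ million vehicles} \times \$11 \approx \$137\text{ million}, \qquad \text{cost of payouts} \approx \$49.5\text{ million}
\]

a ratio of roughly 2.8 to one--the "factor of three to one" above. The payout side priced some 180 projected burn deaths at about $200,000 apiece, plus a similar number of serious burn injuries and roughly 2,100 burned vehicles.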

A public still reeling from the betrayal of Watergate now learned that one of its great corporations, Ford Motor Co., had weighed the lives of consumers against the dollar--and chosen the dollar. Ford discontinued the Pinto in 1980 after a costly recall, but the blow to trust would prove more lasting. Consumer activists would now act as safety watchdogs. And when a California jury awarded a Pinto victim a then unheard-of $125 million (later reduced to $3.5 million) for pain and suffering, it galvanized class-action lawyers everywhere. -- Kate Bonamici

1975: Walter Wriston automates the teller

John Reed was operating on faith. He knew he was right, though he didn't know why. Fortunately Reed's boss, Citibank chairman Walter Wriston, was also a believer in the possibilities that technology could bring to banking. So in 1975, Wriston agreed to sink an eye-popping $100 million into Reed's plan. Two years later the bank went public with it. Then, virtually overnight, Citi dotted the Big Apple with a network of more than 400 automatic teller machines, according to Phillip Zweig's biography, Wriston.

No one knew if bank customers would forgo dealing with a live teller. But Wriston was impatient to get Citibank's consumer business growing. "The demand deposits in the city of New York had not grown in ten years," Wriston told FORTUNE in an interview last year, before his death from cancer. "It was perfectly clear that something had to be done."

The gamble paid off--with a little help from above. In early 1978, New York City was walloped with more than a foot of snow. Within three days a commercial ran showing New Yorkers trekking through the slush to Citibank ATMs. A catchphrase was born: "The Citi Never Sleeps." Use of the machines soared. By 1981, Citi's share of New York deposits had doubled. Rivals stopped snickering about Citi's "soulless machines" and started to get with the program. Today, of course, getting money in Paris from your bank account in Portland seems as mundane as traffic lights. "But when you think about it," said Wriston, "it's just extraordinary." Machines, everywhere, that give you money. -- Ellen Florian Kratz

1980: Reg Jones picks Jack

Before his name became synonymous with General Electric, Jack Welch was the anti-GE. He railed at its bureaucracy, ran his Pittsfield, Mass., division as if it were Jack Welch Plastics, and heaped scorn on the "dinks" who ruled at headquarters. In their urbane world, Welch was odd man out.

Yet it was one of their number--the courtly Reginald Jones--who made Welch odd man in, naming him CEO in 1980. The consequences of Jones's decision are well known. By tearing up GE in the absence of any outward crisis, Welch became a pariah--and then, as global competition tore up one U.S. company after another, an icon whose street-fighting instincts inspired book titles like Business the Jack Welch Way, Jack Welch Speaks, and Jack Welch and the G.E. Way.

Welch's name had actually been excluded from an early list of successor candidates. He was too young, too impatient, too reckless. He stammered. But in 1979, Jones closed the door to his office, drew on his pipe, and said to Welch, "You and I are flying in one of the company's planes, and this plane crashes. Who should be chairman?" Like many of the six other finalists, Welch first tried to climb out of the wreckage. But over the next two hours Welch delivered a critique of GE that resonated with Jones.

Like Welch, Jones saw danger on the horizon. As a director at Bethlehem Steel, he'd seen what happens when foreign competition hits a slumbering bureaucracy, and he feared what it might do to GE. "The first thing you do when you're looking for a successor is, don't look for someone like you," Jones told a Harvard Business School class in 1982. "The other thing is, you'd better look to the environment ahead ... [and] get someone who's gonna be attuned to that environment, not the environment in which you lived." Jones's last and perhaps best decision at GE was to pick a decision-maker who would change the way GE made decisions. -- Jerry Useem

1983: Sam Walton explores the final frontier

Wal-Mart's founder was not a technophile. He was a grounded man, and a cheap one. But in 1983, when his subordinates proposed a $24 million investment involving outer space, he listened.

It was Glenn Habern, a data-processing manager, who came up with the idea of building a private satellite network. It was far-fetched, to be sure: Wal-Mart would be the retail test case for this kind of technology. But it had two selling points. The first was personal contact. Walton was adamant about visiting every store personally. The growing number of stores, though, was making that harder and harder. A satellite system would let him beam pep talks to his associates.

The second selling point was data. Walton couldn't get enough of it, and the company's jammed telephone lines couldn't handle it. Satellites would allow him to check on how inventory was piling up, track a day's sales at a particular store, see whether a new product was sitting on the shelves. "With a company, the risk you run is that you grow so rapidly that it gets out of control, that you can't get your arms around it," said David Glass, Wal-Mart's president at the time (and its CEO from 1988 to 2000). "People started asking, 'How you gonna communicate with all these people when you get larger?' We had a very strong culture, but we worried about that."

Walton "didn't like the technology part" of the pitch, Glass recalls. "He was a merchant first and foremost." But the video broadcasts sealed the deal: "He loved the idea of being able to talk to all the associates." Four years later, when Walton addressed a videocamera in an old Wal-Mart warehouse, the broadcast was beamed 22,300 miles skyward and received at roughly 1,000 Wal-Mart stores.

The world's biggest private satellite network gave Wal-Mart a huge informational advantage and the power to combine size with speed. Sales growth, already stunning, hit warp speed. In 1985, two years before the completion of the system, Wal-Mart's sales were $8.4 billion. Ten years later they were $93.6 billion. Ten years after that, they had left the atmosphere altogether: $288 billion, a number without historical precedent. -- Corey Hajim

1984: Ma Bell gives away her babies

Charlie Brown was on the hot seat. American Telephone & Telegraph, after prospering for nearly a century as the only phone company most of the country had ever known, was under attack for being too powerful. Congress and the Federal Communications Commission wanted to make it easier for other companies to compete. The Department of Justice was pursuing an antitrust suit that essentially sought to dismantle the company. It was clear to Brown, chairman of AT&T since 1979, that his employees and shareholders couldn't take more uncertainty.

So in 1982, Brown announced the previously unthinkable: AT&T would voluntarily break itself up on Jan. 1, 1984. The company wouldn't forfeit its equipment-making arm, Western Electric, nor would it lose Bell Labs (the storied research facility that invented the transistor and the laser). And it would keep its most profitable business--long distance. Instead Ma Bell would spin off her "babies," the seven local phone companies that provided dial tones to most of the nation.

Brown had been widely expected to do just the opposite, keeping most of the babies while losing Western Electric. But Brown and his lieutenants believed that the worlds of communications and computing were coming together. By owning a combination of long-distance and technology assets--weren't telephone switches just big computers anyway?--AT&T would remain as powerful as ever. And while they knew long distance would be opened to competitors, they didn't doubt AT&T's ability to fend off little gnats like MCI.

It was the decision of a man who had never faced the rude realities of cutthroat price competition. But perhaps Brown's most fundamental error was that he didn't grasp how quickly his customers could leave the trusted AT&T brand just to save a few cents a minute. They could and they did--providing a searing lesson in the fickleness of the American consumer. Long-distance lines, it turns out, are easily replicated; local phone networks are not.

Had Brown (who retired in 1986 and died in 2003) guessed that long distance would become a commodity in less than a decade, perhaps he would have fought harder to hold on to the local phone businesses with their steady cash flows and direct connections to millions of consumers. AT&T then wouldn't have embarked on its futile quest to get back into the local phone business, spending billions on wireless and cable assets and this year suffering the indignity of being taken over by one of its own offspring. -- Stephanie N. Mehta

1985: Grove fires himself

It's business folklore. Andy Grove and Gordon Moore, Intel's president and CEO, respectively, were facing the rapid decline of their core business, memory chips. Grove turned to Moore one day in 1985 and asked, "If we got kicked out and the board brought in a new CEO, what do you think he would do?" Replied Moore: "He would get us out of memories." Said Grove: "Why shouldn't you and I walk out the door, come back, and do it ourselves?" And so they symbolically fired and rehired themselves, traded memories for microprocessors, and turned Intel into a technology powerhouse.

It's an amazing story, not least because it's true. What's lost in the retelling, though, is the pain--immediate, self-inflicted, and excruciating. At first Grove could scarcely believe what he'd said. "How could we give up our identity?" he wrote later. "How could we exist as a company that was not in the memory business? It was close to being inconceivable."

The prospect of telling his associates about his conclusion filled Grove with dread. For months he tried to communicate it without actually saying it. Listening to Grove's circuitous answers at a dinner one night, an Intel manager finally pressed him: Was Grove saying he could imagine Intel not in the memory business? Pinned down, Grove spit it out: "Yes, I guess I can." At which point, he recalls, "all hell broke loose."

Grove's account of the episode (in his book Only the Paranoid Survive) is filled with apocalyptic language: "A crisis of mammoth proportions" ... "wandering in the valley of death" ... "a long, torturous struggle." These are not the words of a triumphant business icon. They are the sounds of a man describing what a decision feels like. Our minds can reach conclusions, but it's our whole being that decides--often painfully. We can marvel at Grove's intellectual leap. But if he had lacked the will to push Intel into the unknown, we never would have heard about it. -- Kate Bonamici

1985: Drexel Burnham writes a "highly confident" letter

It was late one February night, and Carl Icahn wanted to launch a hostile takeover of Phillips Petroleum. Unfortunately, he was $8 billion short, and almost nobody knew who he was. That could have put a damper on his plans. But a financier named Michael Milken had developed a new way of using junk bonds to raise capital. So Icahn asked Drexel Burnham, the second-tier investment bank where Milken worked, for a letter of commitment.

Impossible, responded the Drexel heavies. They didn't actually have the money the way a bank had money. They could raise it only once Icahn launched his takeover: a chicken-and-egg problem. But as the night wore on, Leon Black, head of the firm's mergers and acquisitions department, finally ventured, "Why don't we say we're 'highly confident' that we can raise [the financing]?"

And so the next morning Drexel drew up a letter saying just that. Icahn would end up ditching the Phillips deal, but the era of the corporate raider had begun.

What that letter meant, in practice, was that with little more than a note from Drexel (usually approved by Milken), any fool could get his hands on a huge sum of money to take over--or at least threaten to take over--a major corporation. As Wall Street firms followed Drexel's lead, executives began to face the very real threat of a Carl Icahn or a T. Boone Pickens seizing their companies and throwing them overboard. In 1986 alone, 3,973 takeovers, mergers, and buyouts were completed in the U.S. at a total cost of $236 billion.

The raiders were finally beaten back with laws and poison pills, and Milken pleaded guilty to securities-fraud charges that sent him to prison for 22 months. But the sense of security in the executive suite never returned. As boards started acting like raiders--dumping CEOs who failed to increase their companies' stock prices--executives took up the rallying cry that had once threatened them: "Shareholder value!" Two words that, for good and ill, would define the decade to come. -- Barney Gimbel

2000: Jerry Levin decides he doesn't need a collar

The chairman of Time Warner (the parent of FORTUNE's publisher) loved the deal he struck in January 2000 to merge his company with highflier America Online, then valued at about $160 billion. Jerry Levin had such faith in the combination of traditional and new media assets, in fact, that he decided not to place a collar on the transaction. A collar enables the seller--in this case Time Warner--to revisit the terms of the transaction if the buyer's stock falls below a certain price.
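
In stylized terms (a generic illustration of how collars work, not the actual terms negotiated in the AOL deal): a stock-for-stock merger pays the seller's holders a fixed ratio r of buyer shares per seller share, worth r times P at the buyer's share price P. A collar sets a floor P_min below which the exchange ratio steps up, or the seller may reopen the terms, so the value received cannot fall past the floor; without one, it falls one-for-one with P.

\[
V_{\text{no collar}} = r \cdot P, \qquad V_{\text{collar}} = r \cdot \max(P,\, P_{\min})
\]

When AOL's shares lost half their value, Time Warner's holders were on the no-collar line, with no contractual trigger to force a renegotiation.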

Why did he make that choice? "With a collar, the implication is that you are not really sure and you need this kind of protection," Levin would later tell an audience at Manhattan's 92nd Street Y. "I wanted to make a statement that I believe in it." Unfortunately, Levin's belief that AOL could deliver a glittering new-media future proved colossally wrong.

Time Warner shareholders wish Levin hadn't been such a true believer. Almost as soon as the companies announced their historic deal, the Internet bubble burst and AOL shares plunged 50%. Without a collar, Time Warner couldn't renegotiate the deal. Some Time Warner executives urged Levin to use the drastic drop in AOL's stock price as an excuse to cancel the merger altogether, or at least as leverage to rework the terms to give Time Warner shareholders a greater stake in the combined company. But Levin, who didn't return calls, never did.

His passion for hooking up with AOL, even as its stock nosedived and questions about the quality of its advertising revenues surfaced, turned a questionable deal into the crowning folly of the dot-com era. Time Warner shareholders--who once owned 100% of a company worth $75 billion--today own 45% of a company worth, well, roughly $75 billion. -- Stephanie N. Mehta