WHY JOHNNY CAN'T SAVE FOR RETIREMENT
Money decisions are hard. It's not that we're stupid--we're just not wired properly. How brain science is changing the way we think about 401(k)s, Social Security, and the whole notion of retirement planning.
By JUSTIN FOX

(FORTUNE Magazine) – The first question, flashed on a screen a few inches from my eyes, is easy. Do I want an Amazon.com gift certificate worth $16.31 today, or one worth $16.31 a month from now? Even with my head and upper body wedged into a magnetic resonance imaging machine that clatters like an unmuffled motorcycle engine as it scans my brain, I answer without hesitation. I push the left button of the small keyboard in my hand: I want it today.

The next choice takes slightly more thought: Do I want $16.33 in a month, or $24.43 in a month and two weeks? I pick the latter. And so it continues, through 58 more such dilemmas. I am in the basement of Princeton University's psychology department, where professor Jonathan Cohen and post-doctoral fellow Sam McClure, along with economists from Harvard and Carnegie Mellon, designed this little exercise. Their goal was to watch where the blood flows in the brain--and thus which parts of the brain are most active--when people decide between current and future rewards.
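
For readers who want the structure of these dilemmas spelled out, here is a minimal sketch, in Python, of how a simple discounting model would score each option. It is not code from the Princeton study; the present-bias (beta) and monthly discount (delta) parameters are illustrative assumptions, not figures the researchers reported.

    # Illustrative sketch only: a toy quasi-hyperbolic discounting model for the
    # kind of smaller-sooner vs. larger-later choices described above. The beta
    # (present-bias) and delta (monthly discount) values are assumptions for
    # demonstration, not parameters estimated in the study.

    def discounted_value(amount, delay_months, beta=0.8, delta=0.99):
        """Immediate rewards escape the extra beta penalty applied to anything delayed."""
        if delay_months == 0:
            return amount
        return beta * (delta ** delay_months) * amount

    def choose(sooner, later):
        """Each option is (amount, delay in months); pick the higher discounted value."""
        return sooner if discounted_value(*sooner) >= discounted_value(*later) else later

    print(choose((16.31, 0), (16.31, 1)))    # same amount either way -> take it today
    print(choose((16.33, 1), (24.43, 1.5)))  # both delayed -> the larger, later reward wins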

What they found when they put Princeton undergraduates through these paces a year ago, with actual gift certificates at stake, was that we humans are of two minds. The calculating, cognitive, most advanced parts of the brain like the prefrontal cortex were hard at work computing and weighing options no matter what the students chose. But when they opted for the money now, something else happened. The more primitive limbic system, common to all mammals and associated with emotion and quick reactions, lit up as if it smelled dinner.

"You can think of the limbic system as having evolved at a time when many or all important goods were perishable," says Cohen, director of Princeton's Center for the Study of Brain, Mind, and Behavior. "It was use it or lose it." Then along came the higher brain, with which humans were able to devise such clever preservative innovations as beef jerky, refrigerators, and inflation-indexed bonds. The lower, mammalian brain was clearly not designed for this environment, and it has been acting up ever since.

Choosing between an Amazon.com gift certificate today and a larger one later is an interesting exercise, but it's not a very hard one. A similar but vastly more challenging experiment in balancing current rewards against future ones is being conducted today not with undergraduates lying in MRI machines but with millions of Americans and their retirement accounts.

As you may have heard, individuals are being asked to take on ever more responsibility for their upkeep when they get old. It's not just President Bush's plan for Social Security, in which private accounts would supplant part of the government program. Traditional corporate pensions are on the way out too, and while state and local government workers have so far been immune from this trend, California Governor Arnold Schwarzenegger wants to switch his state's pension funds--the nation's biggest pool of pension money--into something akin to a 401(k). Similar developments are afoot all over the world, or at least in the more affluent parts of it. Leaving aside for a moment the matter of whether we ought to be making these monumental shifts, here's a big question: Are we up to it?

During the 1990s, the prevailing assumption was that of course we were. "Investor nation" had declared its independence. The era of corporate and governmental hand-holding was over. Giving individual Americans the freedom to save their own money, invest it as they choose, and spend it according to their own needs and desires during their retirement years seemed a matter not just of expediency but of economic good sense. Then came the stock market collapse of 2000--02. Since then, most news on the retirement front has been dire. The Employee Benefit Research Institute estimates that by 2030, at current savings rates, American retirees will be $45 billion a year short of the income they need to meet basic food, housing, and medical needs.

This is where the brain scans come in. Functional magnetic resonance imaging is just the latest and coolest tool that economists and kindred spirits in fields like psychology and neuroscience are beginning to use to examine human decision-making involving money. The evidence delivered so far by the brain images is pretty tentative on its own. But what has excited the fast-growing community of scholars who study such matters is that it so neatly fits with what has been discovered in psychological experiments, economic experiments, and even empirical research into the behavior of 401(k) plan participants.

What is emerging from this research is less a despairing view of the human inability to make the right choices than an acknowledgment that the way choices are structured matters--a lot. We're not irrational or incompetent, just conflicted. We're not even always conflicted: It appears that the higher brain isn't able to stir us to action without the lower brain's help. The key to designing good retirement policies and institutions is to make sure they don't pander to our inner rodents or paralyze our higher brains with complexity.

As a result, some of the same economists who have been gazing at MRI scans and dreaming up psychological experiments to test our thinking patterns have developed very specific ideas about how to structure 401(k) plans and any Social Security private accounts that Congress might decide to create. They've even developed a political philosophy, which we'll call "neuroconservatism." Here's the most remarkable thing of all: Decision-makers in Washington and corporate America actually seem to be listening to them.

The idea that we humans harbor conflicts within us is not new. Plato figured it out 2,400 years ago, depicting the soul as a chariot drawn by two horses, one noble and one not. Economists, though, long found it easier to depict us as rational computing machines, weighing all the options and deciding on the path of action that optimizes our well-being.

Then, one night in 1974, Dick Thaler threw a dinner party for some fellow economists at his house just off the campus of the University of Rochester. Thaler, now at the University of Chicago, had just begun teaching at Rochester's business school. The predinner snack he provided his guests was a bowl of cashews. The nuts proved so popular that Thaler finally removed them to the kitchen so that no one would fill up before dinner, a move his guests applauded. They then began analyzing the strangeness of what they had just done: They clearly wanted to eat the cashews, or else they wouldn't have been eating them, but they also clearly wanted to stop eating them, or else they wouldn't have approved of their removal. Any normal person would simply chalk that up to the munchies. But these were economists in the process of having their paradigms shifted.

Thaler and his colleague Hersh Shefrin later built a model of this two-minded behavior, using a mathematical approach originally devised to describe the conflicts between executives and shareholders. Along the way they discovered the intriguing work of George Ainslie. Ainslie was a psychiatry student at Harvard Medical School in 1967 when he first tried out his idea in the famous pigeon lab founded by B.F. Skinner. Ainslie gave pigeons a choice of pecking a red key to get food immediately or leaving the key alone and getting even more food a few seconds later, and they invariably opted for the immediate reward. But when he added a green key that, if pecked, would prevent the red key from ever appearing, a minority of pigeons learned after repeated runs of the experiment to peck the green key instead. That way they wouldn't be tempted by the red key and would end up with the larger helping of feed.

If one can devise mechanisms that enable even pea-brained birds to sensibly weigh present vs. future rewards, then surely the same can be done for humans. As Thaler and Shefrin noted in "An Economic Theory of Self-Control," a paper they published in 1981, people had already developed institutions to do just that, such as the monthly Christmas Club deductions banks offer to help customers amass a nice little wad to spend over the holidays. Then again, mankind has also developed lots of devices--the credit card springs to mind--that put the impatient brain in charge.

That raises some perplexing questions about just what is meant by choice and free will. Within us there are competing wills, and the ways in which choices and institutions are structured can determine which of those wills prevails. So which will is the free will? That's a consideration that ought to be at the heart of all economic policymaking. But it is only now really catching on. Empirical evidence coming from 401(k) plans is one major reason why.

As its awkward name suggests, the 401(k) is the inadvertent product of a change made deep in the bowels of federal law, in the form of a 1978 amendment to the Internal Revenue Code. It happened to arrive on the scene just as the standard corporate pension was starting to run into trouble. The spread of pension plans after World War II had been sparked by yet another inadvertent regulatory decision (a 1948 ruling by the National Labor Relations Board) and enabled by the seeming immortality of the large American corporation. The problem was that big American corporations weren't immortal. After several major bankruptcies in the 1960s and 1970s, Congress tightened the laws surrounding pension funding, and those who set accounting standards made it harder for companies to hide pension commitments from shareholders. As a result, CEOs and CFOs began to sour on the things.

The 401(k), a retirement plan that shifted risks, responsibilities, and rewards away from the company and onto the shoulders of its employees, seemed to be an attractive alternative. It was attractive as well to the increasing ranks of job-switching employees, who were ill-served by traditional pension setups that rewarded those who stuck around for decades at the expense of short-termers.

In one crucial sense, these 401(k)s were designed to cater to the higher brain rather than the parts we share with, for example, the flying squirrel: Once an employee committed to a certain savings rate, the money was deducted from his paycheck before he ever got his hands on it. But in almost every other aspect, 401(k)s turned out to confound the human brain more than help it. The chief problem seems to be that the decisions associated with them are just too hard for many people. They only got harder over the course of the 1990s, as many companies expanded their 401(k) offerings beyond the five or six plain-vanilla funds originally offered to a full selection of dozens of name-brand mutual funds.

The results have been discouraging: A stubborn third or so of eligible workers still don't have 401(k) accounts. And of those who do, an alarmingly large minority have kept much of their money in either low-return stable-value funds or high-risk company stock--both entirely inappropriate as retirement investments--because that's where the money went if they failed to make a choice.

But for all the financial incompetence among 401(k) account holders, the really startling thing is the way the plans encourage that ineptitude. Suppose an employer were to ask new employees if they wanted to set aside as much of their income as it took to ensure a comfortable retirement and invest it in a way that balanced risk and return over their lifetimes. The almost universal answer would be, You betcha! But the question has been phrased instead as a series of increasingly complex choices: Do you want to contribute to a 401(k)? If so, what percentage of your salary do you want to put into it? Once you've decided that, tell us how you want to allot those contributions among the 73 mutual funds offered in our plan. The areas of the brain that get fresh blood while pondering such questions must be identical to the parts of a deer's brain that activate when it stares at a pair of headlights.

Evidence that hard choices can overwhelm our better selves gushes forth wherever researchers look for it. Baba Shiv of the University of Iowa and Alexander Fedorikhin of the University of Southern California gave half of a test group a seven-digit number to memorize and the other half a two-digit number. The test subjects were then offered a choice between a slice of cake and a bowl of fruit. Of those memorizing the longer numbers, 59% chose the cake, compared with 37% of the two-digit crowd. The explanation: The prefrontal cortex was too busy memorizing the number to rein in the limbic system, which wanted the damn cake now. Another experiment, devised by Sheena Iyengar of Columbia and Mark Lepper of Stanford, involved setting up a table of fancy jams in front of a gourmet food store near the Stanford campus. When there were 30 varieties of jam on the table, only 3% of those who stopped to examine them actually bought any. When only six varieties were displayed, fully 30% bought jam. Too many choices made choice almost impossible.

Not surprisingly, when employers frame 401(k) options differently or offer fewer options, behavior can change a lot. When companies switch from offering 401(k) participation to employees as a yes/no choice to automatically enrolling new hires but allowing them to opt out, participation invariably goes up dramatically. When Thaler and UCLA's Shlomo Benartzi proposed to workers at one company that they commit to automatically increasing their 401(k) contribution percentage every time they got a pay raise, savings rates shot up. Last fall, when Financial Engines, a company founded by former Stanford finance professor and Nobel laureate William Sharpe, offered to make the 401(k) investment choices for employees of Motorola and J.C. Penney, 15,000 jumped at the offer. Makes sense: Who wouldn't want a little help keeping that frisky limbic system on a leash?
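
As a rough illustration of the Thaler-Benartzi escalation idea, the sketch below bumps the contribution rate with each annual raise so that, with these assumed numbers, take-home pay still rises every year. The salary, raise, and escalation figures are hypothetical, not data from the actual program.

    # Hypothetical illustration of automatic contribution escalation: each pay
    # raise bumps the 401(k) contribution rate, and with these assumed numbers
    # take-home pay never falls. All figures are made up for demonstration.

    def escalate(salary, start_rate, raise_pct, bump, years, cap=0.15):
        rate = start_rate
        prior_takehome = salary * (1 - rate)
        for year in range(1, years + 1):
            salary *= 1 + raise_pct            # annual raise
            rate = min(rate + bump, cap)       # automatic escalation, up to a cap
            takehome = salary * (1 - rate)
            print(f"Year {year}: saving {rate:.0%} (${salary * rate:,.0f}), "
                  f"take-home ${takehome:,.0f} vs. ${prior_takehome:,.0f} last year")
            prior_takehome = takehome

    escalate(salary=40_000, start_rate=0.03, raise_pct=0.03, bump=0.02, years=5)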

So workers clearly want to do the right, rational thing with their money, but 401(k) policies have been making it too hard. Once the path was smoothed, their decisions changed. What this means is that determining the path and the structure of individual choice is a huge responsibility for 401(k) plan administrators, legislators, regulators, and anybody else in a position of authority. They cannot simply "let the people have what they want," because what the people want is affected by how the questions are posed.

For much of the 20th century, the standard policy response to people making bad choices was to take those choices away. But that approach does not appeal to many of the scholars studying our conflicted minds. "My great fear is that legislators are going to take this and say, 'Ah, people can't exercise self-control and therefore we need to pass laws to do it for them,'" says pigeon pioneer George Ainslie, who is now chief of psychiatry at the Coatesville Veterans Affairs Medical Center in Pennsylvania. "When you take it out of somebody's hands and control it externally, then the person is apt to cut back on his own efforts."

One neuroconservative alternative offered up by Thaler and Chicago law professor Cass Sunstein is what they call "libertarian paternalism," in which government and other institutions try to steer individual choices in what they deem the right direction, but allow individuals to opt out and choose their own path. Another group of behavioral economists has offered up a kindred approach, "asymmetric paternalism," in which financial decision-making is seen as something akin to drinking or driving: subject to age limits, restricted opening hours, and competence testing. "I think liquor regulation is about right," says one of this group, Colin Camerer of the California Institute of Technology. "You tax it, you make it hard to get all the time, you prohibit use by minors. That's something that's emerged after a wild pendulum swing of cycles."

You can see the pendulum beginning to swing right now in the 401(k) racket. Last fall, a quarter of the 180 big-company HR chiefs surveyed by Hewitt Associates said they were likely to start automatically enrolling employees in 401(k) plans. About 20% said they're considering offering contribution-increase plans along the lines prescribed by Thaler and Benartzi. Until recently, corporate executives shied away from any role in employees' 401(k) choices for fear of getting sued if those choices turned out badly. But lately companies have been sued for failing to tell employees not to risk blowing their retirement funds by keeping all the money in company stock, so intervention may prove unavoidable. Meanwhile, companies like Financial Engines and Guided Choice (affiliated with another finance guru and Nobel laureate, Harry Markowitz) have arisen to take on the responsibility and potential liability of telling people what to do with their 401(k) money.

What of Social Security? The modern critique of the program originated along libertarian lines: "The citizen of the United States who is compelled by law to devote something like 10% of his income to the purchase of a particular kind of retirement contract, administered by the government, is being deprived of a corresponding part of his personal freedom," argued University of Chicago economist Milton Friedman a half-century ago.

But if that citizen would be unlikely on his own to save enough for retirement or invest the money wisely even though he really meant to, then the pure libertarian case against the program begins to wobble. William Niskanen, a former Friedman student who is now chairman of the Cato Institute, the libertarian Washington think tank that for years has been one of the most vocal advocates of Social Security privatization, admits as much. "We've accepted the argument of behavioralists like Dick Thaler that people do dumb things," he says. As a result, he doesn't have a problem with the fact that just about every privatization plan floating around Washington--including President Bush's--would automatically stick the money in something low-fee and conservative, would strictly limit investment choices beyond that, and would force retiring private-account holders to buy an annuity with what they've saved so that they don't run out of money before they die.

If privatization requires such restrictions on liberty, why bother? Niskanen still makes the argument along libertarian lines: "It gives people a great deal more freedom than they have now." Harvard's Martin Feldstein, the most influential outside economic advisor to the Bush administration, offers another reason: People will see the money they put into the accounts not as taxes being paid to the government but as their own savings, a change in attitude that he thinks will positively affect work incentives. There's also the argument that most people will get higher returns from private accounts than the current Social Security system. But some people won't, and there's no way of knowing how big a percentage that "some" will be.

The most interesting argument for privatization comes out of the neurocon playbook: It would structure the taxing and spending choices faced by Congress in a way that might keep that august body's collective prefrontal cortex in charge. "I'm very skeptical of privatization, because I see all these pitfalls of investors making bad choices," says Harvard's David Laibson, a leading student of both 401(k)s and the brain. (He and Carnegie Mellon's George Loewenstein were the economists behind the MRI experiment at the beginning of this article.) "But it does have this one interesting benefit, which is that it gets the Social Security surplus off the government books." The U.S. will take in almost $200 billion more in Social Security payroll taxes this year than it pays out in benefits. This money is supposed to be socked away to pay for the retirement needs of the baby-boom generation; instead it is simply being counted as part of the unified federal budget, where, on paper at least, it reduces the size of the deficit. Instead of investing the money in any meaningful way, Congress is by all appearances simply spending it each year.
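
A back-of-the-envelope illustration of that accounting point, using the roughly $200 billion surplus cited above and a purely hypothetical on-budget deficit: folding the payroll-tax surplus into the unified budget makes the reported deficit look that much smaller, even though nothing has actually been set aside.

    # Back-of-the-envelope arithmetic for the unified-budget point above. The
    # ~$200 billion surplus is the figure cited in the article; the on-budget
    # deficit is a hypothetical placeholder, not an actual budget number.

    social_security_surplus = 200   # $ billions, approximate
    on_budget_deficit = 500         # $ billions, hypothetical

    unified_deficit = on_budget_deficit - social_security_surplus
    print(f"Deficit as reported (unified budget): ${unified_deficit} billion")
    print(f"Deficit if the surplus were set aside: ${on_budget_deficit} billion")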

Think of our nation's lawmakers as pigeons in an extremely nice lab. In the face of budget pressures and political expediency, Senators and Representatives keep pecking the red key. Spend now; worry later. But a vote to take some of that money out of their own hands and divert it into private accounts--that would be pecking the green key. (Congress want a cracker?)

Even if the current privatization plans go nowhere--and it hasn't been looking good for President Bush lately--the question of spending now or later is one that we'll always face. But now, at least, we know that the way we pose the question matters as much as how we answer it. ■

FEEDBACK jfox@fortunemail.com