Dodging bullets, loading springs, and backdating options
Your company is about to report spectacularly good news. You get a big grant of options just before the announcement. Is that a crime?
It depends, I learned Monday at a conference put on in Washington by Stanford Law School's Rock Center for Corporate Governance. If you're the CEO, you know about the pending announcement, and you fail to inform the board members who sign off on the options grant, you're in big trouble. This is what Rock Center faculty director and Stanford Law professor Joseph Grundfest calls "asymmetric springloading," and a federal appeals court ruled 38 years ago (for those of you with access to Westlaw: SEC v. Texas Gulf Sulphur Co., 401 F.2d 833) that it amounts to fraud--even if the board, after the fact, declares that it didn't really mind.

But what of "symmetric springloading," where the board members making the grant are fully aware of the news that is about to send the stock price leaping? Or "bullet dodging," where a grant is delayed until after the announcement of some really bad news? Or timing releases of bad news to precede a regularly scheduled options grant--a practice for which there is apparently no nickname? These are all a bit icky, and certainly give insiders an advantage unavailable to outside shareholders. But are they illegal? The consensus of the legal experts on hand Monday was that they would be very difficult cases for prosecutors to win. "What I'd tell a client is, 'You have a very serious fraud problem, but I can help you,' " said David Becker, a former SEC general counsel who is now a partner at Cleary Gottlieb, when Grundfest questioned him on the legality of bullet dodging.

Forging documents to make it seem that options were granted before they really were--backdating--turns out to be just the most obviously illegal tip of an iceberg of dodgy corporate behavior regarding options grants. These practices were discovered by accounting and finance professors looking at the interesting behavior of stock prices before and after options grants. Study after study has found that the stock price of a company granting options tends to underperform the market in the days leading up to the grant, and dramatically outperform it afterwards. The first paper revealing this empirical result, by NYU's David Yermack, was published in the Journal of Finance way back in 1997 (it's not available free online, but an abstract is). Yermack speculated that companies timed their options grants to take advantage of pending news. This unleashed a torrent of similar research, Stanford accounting professor Ron Kasznik said at the conference Monday, including a 2000 paper by Kasznik and UCLA's David Aboody (summary here) that found a similar stock price pattern around regularly scheduled options grants. Aboody and Kasznik theorized that companies timed their news releases to maximize the value of their options.

It was only in 2004, though, that Erik Lie of the University of Iowa proposed that some companies might actually be rewriting history and pretending that options were granted well before they actually were. That's what set off the current frenzy of investigations into options backdating that has so far cost the companies involved hundreds of millions of dollars and claimed the jobs of 40 high-level executives. What about the less-obviously illegal practices of timing options grants and news releases to maximize gains? Such actions amount to, in more or less innocent form, insider trading.
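For the statistically curious, here is a minimal sketch of the event-study logic behind those papers: average a stock's market-adjusted return in a window around each grant date and look for the telltale dip-then-pop. The data below is synthetic, purely to show the shape of the computation; none of the numbers come from the actual studies.

```python
# Sketch of the event-study approach Yermack, Aboody/Kasznik, and Lie
# used to detect grant timing. All data here is simulated.
import numpy as np

rng = np.random.default_rng(0)

def cumulative_abnormal_returns(stock_rets, market_rets, grant_days, window=20):
    """Average cumulative (stock - market) return from -window..+window
    trading days around each grant date."""
    car = np.zeros(2 * window + 1)
    abnormal = stock_rets - market_rets
    for day in grant_days:
        car += np.cumsum(abnormal[day - window : day + window + 1])
    return car / len(grant_days)

# Synthetic stock that underperforms before its "grants" and pops after.
market = rng.normal(0.0004, 0.01, 1000)
stock = market + rng.normal(0, 0.01, 1000)
grants = [250, 500, 750]
for g in grants:
    stock[g - 10 : g] -= 0.003   # pre-grant underperformance
    stock[g : g + 10] += 0.004   # post-grant outperformance

car = cumulative_abnormal_returns(stock, market, grants)
print(f"CAR at grant: {car[20]:+.1%}, CAR 20 days after: {car[-1]:+.1%}")
```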
There is a school of thought, most vigorously represented through the years by Henry Manne, which holds that insider trading is a good thing, because it facilitates the rapid transfer of information--through the mechanism of stock prices--from insiders to the market at large. But Manne has never been able to convince Congress or the SEC or most of the securities bar of the rightness of his views. So we're left with the question: Are springloading and bullet dodging and messing with the timing of news releases--all of which appear to be far more widespread in corporate America than options backdating--things we ought to be up in arms about?

UPDATE: For those of you who can't get enough of this stuff, Jack Ciesielski has already put up a couple of posts about the conference on his Analyst's Accounting Observer Weblog, with more likely to come.

UPDATE 2: My Fortune colleague and fellow blogger Roger Parloff has posted his own take on the backdating mess.
Tax cuts and their consequences

In a comment to my piece last week on supply-side economics, Dan G. of Milwaukee cast doubt on the forecasts of the Congressional Budget Office: "The CBO can't make predictions, assumptions, or analysis any better than some far left economics professor shielded from business realities on a college campus," he wrote. Then he declared:
History has shown, every time major individual tax cuts have gone through, tax receipts go up considerably quicker than they did during the preceding period. You can call this "coincidence" if you wish, but there is more to it than that.

I didn't know if this was a "coincidence." What I really wanted to know was if it was true. I figured Dan would want me to keep the pointy-headed economists out of it, so I simply looked at the last three decades of personal income tax receipts (taken from this year's Economic Report of the President), adjusted for inflation (measured by the Bureau of Labor Statistics). It was just me and a spreadsheet, mano a Excel. Plus a copy of C. Eugene Steuerle's indispensable book, Contemporary U.S. Tax Policy, so I could see when the tax cuts happened. Here's what I found:

Several modest tax cuts were enacted in the mid-to-late 1970s. Their impact, however, was swamped by that of inflation, which bumped taxpayers into higher income brackets and made capital gains seem bigger than they really were. So, effective tax rates mostly rose during the decade. Tax receipts dropped 6% in 1975, in part because of a big tax rebate paid that year, but after that increased at a healthy clip--up 12% in 1977, 7% in 1978, and 8% in 1979 before dropping 1% in the recession year of 1980.

Then came the sweeping Reagan tax cuts, enacted in 1981 and put into effect over the next three years (the 1981 law also indexed tax brackets to inflation starting in 1984, putting an end to those bracket-creep-induced backdoor tax hikes). After that Reagan-era tax policy was marked by repeated small tax hikes and then the sweeping "revenue-neutral" reform of 1986, which reduced the top marginal rate to 28% but also raised taxes on capital gains and took away lots of exemptions. What happened to personal income tax receipts? They rose 6% in 1981, then fell 2% in 1982, 6% in 1983, and 1% in 1984 before finally bouncing back 8% in 1985, 2% in 1986, and 9% in 1987. Then they dropped 2% in 1988 and rose 6% in 1989.

In 1990, with the federal government deep in the red, Reagan's successor George Bush acquiesced to a tax bill that included effectively raising the top tax rate to 31%. Bill Clinton and Congress upped that to 39.6% in 1993. What happened to personal income tax receipts? Down 1% in 1990, 4% in 1991, and 1% in 1992--but up 4% in both 1993 and 1994, 6% in 1995, 8% in 1996, and 10% in 1997. Congress passed and President Clinton signed into law a variety of tax cuts that year, and revenues kept rising: 11% in 1998, 4% in 1999, 10% in 2000.

Then came the Bush tax cuts of 2001, which included a rebate paid out that year, followed by another round of cuts in 2003. Personal income tax receipts dropped 4% in 2001, 15% in 2002 (the sharpest one-year decline since 1949), 10% in 2003, and 1% in 2004 before finally rising 11% in 2005. Measured by decade, personal income tax receipts rose at a 2.4% annual rate in the 1970s, 1.8% in the 1980s, and 3.9% in the 1990s. So far in the 2000s they've fallen 3.3% a year.

To summarize, tax receipts rose more slowly after the Reagan tax cuts than before. They dropped after the 1990 Bush tax hike, rose after the 1993 Clinton hike, rose after the 1997 tax cut, then dropped after the 2001 Bush tax cut. You can call this "coincidence" if you wish. I just call it confusing. What are the lessons here?
(1) There are reasons why we let pointy-headed economists deal with this stuff, and (2) Dan's statement that "every time major individual tax cuts have gone through, tax receipts go up considerably quicker than they did during the preceding period" is false.
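For anyone who wants to replicate the me-and-a-spreadsheet exercise in code rather than Excel, the recipe is: deflate nominal receipts by CPI, then take year-over-year changes. The figures below are approximate, from memory of the published series--pull the real ones from the Economic Report of the President and the BLS before quoting them.

```python
# Sketch of the spreadsheet exercise: deflate nominal personal income
# tax receipts with CPI, then compute year-over-year changes.
# Approximate figures (billions of dollars; CPI-U annual averages).
receipts_nominal = {2000: 1004.5, 2001: 994.3, 2002: 858.3, 2003: 793.7}
cpi = {2000: 172.2, 2001: 177.1, 2002: 179.9, 2003: 184.0}

def real_yoy_changes(nominal, deflator):
    """Year-over-year % change in inflation-adjusted receipts."""
    years = sorted(nominal)
    real = {y: nominal[y] / deflator[y] for y in years}
    return {y: (real[y] / real[prev] - 1) * 100
            for prev, y in zip(years, years[1:])}

for year, pct in real_yoy_changes(receipts_nominal, cpi).items():
    print(f"{year}: {pct:+.1f}%")   # roughly -4%, -15%, -10%, as above
```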
What that GDP report really meant (not much)

As you've probably heard, the U.S. economy grew at a 1.6% annual, inflation-adjusted crawl in the quarter that ended Sept. 30.
Except that, well, it probably didn't. It was the "advance" estimate of gross domestic product that the Commerce Department's Bureau of Economic Analysis released today. On Nov. 29 we'll get the "preliminary" estimate and then, on Dec. 21, the "final" one. Only that won't be final, either, because in 2008 the BEA's once-every-half-decade revision of the National Income and Product Accounts will change all the numbers yet again. All this uncertainty and revision comes because GDP numbers are built upon vast edifices of educated guesswork that only get filled in with real data months and years down the road--and even then there remain big questions about what we count and how we count it.

So why exactly do we pay attention to these numbers? In part because a lot of people are grasping for any sign whatsoever as to whether the slumping housing market is going to take down the rest of the economy. For NYU economics professor Nouriel Roubini, who has been the most noteworthy, outspoken, and (so far) correct bear among economic forecasters over the past year, today's GDP release was occasion to gloat a little. Recession, he predicts, is just around the corner. For most of the forecasters on Wall Street, who as a group are of the opinion that the economy has so much strength outside of housing that we'll muddle through without an outright downturn, the report has occasioned a few downward adjustments to forecasts and a lot of statements that we will nonetheless muddle through without an outright downturn.

Who's right? I have no idea. What I'm almost certain of, though, is that the economy did not in fact grow at a 1.6% pace in the third quarter.
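One mechanical footnote worth knowing: that 1.6% is not growth over the quarter but the quarter-over-quarter change compounded to an annual pace. A quick illustration, with hypothetical real GDP levels (not the BEA's actual figures):

```python
# How a quarterly GDP change becomes the headline "annual rate":
# the quarter-over-quarter growth is compounded over four quarters.
q_over_q = (11_443 / 11_398) - 1     # hypothetical real GDP levels, $B
annual_rate = (1 + q_over_q) ** 4 - 1
print(f"{annual_rate:.1%}")          # ~1.6% annualized
```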
The path from supply-side economics to deficit spending

I got an e-mail a couple of weeks ago from Ben Etheridge, a high school senior in Marietta, Georgia, who had come across a 2003 article I wrote on the Bush tax cuts. Ben said the article was "more helpful in trying to understand supply side economics than many other sources on the Internet" but that, well, he still didn't understand supply-side economics.
This may indicate that I don't understand the subject either, but Ben asked me if I could take another stab at explaining it. With the midterm elections less than two weeks away for a Congress loaded with apparent supply-siders, now seems as good a time as any to try. (Sadly, the great popularizer of supply-side economics, former Wall Street Journal editorial writer Jude Wanniski, is no longer around to critique what I come up with--although you can read his annotation of my 2003 article here.)

At its core, supply-side economics is the economics that reigned before John Maynard Keynes came along. You could also call it traditional economics, neoclassical economics, or mainstream economics. It assumes that people respond rationally to economic incentives, and that unfettered markets arrive at something close to optimal results. Saving, in this worldview, is a good thing--because savings are always put to use in productive investments that make the economy grow.

During the Great Depression of the 1930s, with banks failing and people stuffing the money they still had in mattresses, English economist/investor Keynes became convinced that savings weren't always put to good use and government needed to intervene to stimulate economic activity with tax cuts or--better yet, since the money from the tax cuts might get stuffed in mattresses too--spending. Keynes's argument was vindicated by the American experience during World War II, when massive deficit spending brought full employment and strong economic growth. A few years later, monetary policy was added to the picture--many economists came to believe that the Federal Reserve could reliably fight unemployment by keeping interest rates low (and putting up with moderate inflation). Economic policymaking in the U.S. thus came to focus on manipulating demand through taxing, spending and tweaking interest rates. This wasn't just a Democrat thing. Declared Republican President Richard Nixon in 1971: "Now, I am a Keynesian."

Not long after Nixon said that, though, Keynesianism seemed to stop working. Despite government deficits and high inflation, the economy sputtered. The strong growth in productivity (usually measured as economic output per hour worked) that had brought vastly increased prosperity from the 1940s through the 1960s slowed to a Perimeter-at-rush-hour crawl. To explain why this was happening, economists found themselves returning to pre-Keynesian ideas about incentives and the importance of savings and investment.

I think it's fair to say that most academic economists now think that while Keynes was onto something about short-run economic fluctuations, it's more productive to focus on what drives long-run growth. That means things like the incentive effects of tax policy, the human capital created by education, and the ways in which legal and regulatory systems enable investment and entrepreneurship. It's the supply side (labor supply, capital supply, etc.) that interests them more than the demand side. Most of these economists would, however, cringe at being called "supply-siders." That's partly because the term has become identified with the Republican Party and, even though economists are perceived as the right wingers on most college campuses, they're still on college campuses, which means they're usually Democrats. But it's also because Wanniski attached the label to a wildly oversimplified version of traditional economics in which the only thing that mattered was tax policy, and tax cuts were always a good idea.
Wanniski arrived at the Journal editorial page in 1972 knowing nothing about economics. Watching how flummoxed the Keynesians were by the strange events that followed, he soon concluded that most economists didn't know much about economics either. But he was impressed by two professors who had seen at least some of the troubles of the mid-1970s coming: Robert Mundell of Columbia University (who won a Nobel in 1999 for his work in international economics) and Arthur Laffer of the University of Southern California (who now runs an economic consulting firm). Wanniski's contribution was to take what he learned from Mundell and Laffer and adapt it to political reality. He adopted the term "supply-sider" after being labeled as such by the chairman of Nixon's Council of Economic Advisers, Herb Stein (Ben Stein's dad). He converted his boss at the Journal, Robert Bartley, to the cause and wrote a 1978 book, How the World Works, that laid out his philosophy in detail. He became an adviser to presidential hopeful Ronald Reagan, and after Reagan won in 1980 he helped craft the dramatic tax cuts that Reagan pushed through Congress in 1981.

Wanniski's rallying cry was what he dubbed the "Laffer curve," a simple chart illustrating how lower tax rates can bring in higher revenue by stimulating economic activity (or at least cutting back on tax avoidance). According to Wanniski, Laffer sketched the curve on a napkin during a December 1974 dinner at the Two Continents restaurant in the Hotel Washington with him and White House aides Dick Cheney and Donald Rumsfeld, whom you might have heard of. Laffer himself later cast some minor aspersions on this account. He also disclaimed authorship of the idea, giving earlier economists (among them Keynes!) all the credit. But the name "Laffer curve" stuck.

The Laffer curve enabled Wanniski to sell his supply-side ideas as a free lunch, which is what made them so politically successful. You could cut taxes, yet not cut spending--the best of all worlds for an elected official. Economists generally don't believe in free lunches, but most agree with Laffer that when tax rates get high enough, lowering them brings in more revenue. The question is how high the rates have to be, and no one has a reliable answer to that. With personal income tax, it's probably somewhere upwards of a 50% marginal rate. (The top marginal rate was 70% when Reagan took office and 28% when he left; it's 35% now.) With taxes on capital gains, dividends, and interest, the cutoff is probably lower. That's partly because such taxes are easier to avoid, but also because they weigh more directly on the savings and investment that bring long-run growth. The reduction of the top income tax rate in the Reagan years did have a Laffer effect, but his tax cuts as a whole did not.

Are we in Laffer curve territory now? I wouldn't be entirely surprised if a reduction in the U.S. corporate income tax rate eventually brought in higher receipts, given that it's currently among the highest on the planet. Beyond that, I'm doubtful. Serious economists with supply-side leanings--like former Bush economic adviser Glenn Hubbard, now the dean of Columbia Business School--think the dividend, capital gains and income tax cuts enacted during the Bush presidency can increase economic growth by several tenths of a percentage point a year. (That may not sound like much, but compound three-tenths of a percentage point in added growth over 50 years and you get $7,000 more a year in the pocket of the average American.)
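For the visually inclined, here's a toy version of the curve in code. The shrinking-tax-base assumption is entirely made up--the whole point of the debate is that nobody knows the real shape--but it shows how rate-times-base can peak in the middle, and it checks the compounding arithmetic in that last parenthetical.

```python
# A stylized Laffer curve: pretend each point of tax rate shrinks the
# taxable base linearly (a made-up elasticity, not an estimate), so
# revenue = rate * base rises, peaks, and falls back to zero.
def revenue(rate, base_at_zero=100.0):
    base = base_at_zero * max(0.0, 1.0 - rate)   # toy shrinking base
    return rate * base

for pct in range(0, 101, 10):
    r = pct / 100
    print(f"rate {r:4.0%}  revenue {revenue(r):5.1f}")
# With these toy parameters revenue peaks at a 50% rate; in reality
# nobody reliably knows where the peak sits.

# And the compounding claim above: 0.3 extra points of annual growth,
# compounded over 50 years, on roughly $44,000 of average income.
extra = 1.003 ** 50 - 1
print(f"{extra:.0%} more income, about ${44_000 * extra:,.0f} a year")
```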
I haven't been able to find any such economists, though, claiming that the tax cuts paid for themselves, Laffer-style. That sort of talk has been the sole province of polemicists and politicians. Here's how President Bush put it in a speech in February:

What happened was we cut taxes and in 2004, revenues increased 5.5 percent. And last year those revenues increased 14.5 percent, or $274 billion. And the reason why is cutting taxes caused the economy to grow, and as the economy grows there is more revenue generated in the private sector, which yields more tax revenues.

The problem with this argument is that the economy, and with it tax receipts, would have grown in 2004 and 2005 even if there hadn't been any tax cuts. Growing happens to be something the U.S. economy does most every year (you can look it up). The tax cuts may have made it grow a little bit faster, but not enough to make up for the revenue loss caused by the lower tax rates. This isn't just my opinion; it's also the verdict of the Congressional Budget Office, the nonpartisan maker of deficit projections currently run by a former Bush administration economist. Even after making some pretty liberal assumptions about how much the tax cuts will boost long-run economic growth, the CBO estimated earlier this year that extending them past 2010 would still reduce government revenue, not increase it.

Even tax cuts that don't pay for themselves can be a good idea--I happen to be a big fan of the cut in taxes on dividend income that the President (egged on by Hubbard) pushed through Congress in 2003. But such cuts do eventually have to be paid for, either by cutting spending or raising some other tax. The current administration has so far opted to shunt this burden to future generations (or current generations, a few years down the road). As I've written before, the Bush administration's deficit spending isn't necessarily a disaster. But neither is it really supply-side economics, because the increased saving by individuals and businesses enabled by the tax cut has been largely gobbled up by increased government borrowing. That makes it either (1) a wartime necessity, (2) closet Keynesianism, or (3) buck passing.

UPDATE: I've responded to one of the comments, which claims that "every time major individual tax cuts have gone through, tax receipts go up considerably quicker than they did during the preceding period," here.
Joi Ito's take on outsiders in Japan

In my interview with Michael Zielenziger about his new book on Japan, Shutting out the Sun, we discussed the role of such business mavericks as management consultant Kenichi Ohmae, tech mogul Masayoshi Son, and blogger/investor/Gnome Mage Joi Ito. Michael's take was that they remain marginalized. Joi e-mailed me this morning to say he didn't entirely disagree. But I'll let him tell you:
In some ways I agree with the characterization and in some ways I don't. The "old guard" of Japan is not some monolithic single group, but rather a complex web of various interdependent networks. I think Ohmae, Son and myself all have various connections to various segments of the "old guard".
Why Japan isn't ready for a comeback

My first visit to Japan came in the sticky late summer of 1998. Fortune had sent me across the Pacific to figure out whether there was any end in sight for the island nation's long economic nightmare. That's not really the kind of question you can answer in a week of interviews in a country where you don't speak the language, but I had read Karel van Wolferen's The Enigma of Japanese Power, so I sort of had an idea going in.
Van Wolferen is a Dutch journalist-turned-professor, and his book--one of the most important written about Japan in the 20th century--described how difficult it is for Japanese society to change course. In Van Wolferen's telling, power in Japan was exercised not by individuals but by interlocking groups. And it took deeply traumatic events like the arrival of Commodore Perry's warships in 1853 and defeat in World War II to get Japan's groups to adopt new objectives. So in 1998, I wrote that Japan wasn't going to change anytime soon, and its economy wasn't going to turn around anytime soon. It was a pretty good call: It wasn't until 2004 that Japan's economy showed real signs of life. But since then a lot of people, among them Fortune's Asia editor Clay Chandler, have been proclaiming that the country is now truly on the comeback trail.

Not everyone, however, agrees. During my 1998 visit, I got to spend a weekend at Van Wolferen's country house outside Tokyo. I met a bunch of interesting people there, among them Michael Zielenziger--then Knight-Ridder Newspapers' man in Asia--and his wife, artist Diane Abt. Now Michael, who has since moved back to the San Francisco Bay area, has written a book, Shutting out the Sun, arguing that Japan is nowhere near a real turnaround or comeback. (Similar sentiments can be found on a near-daily basis on Michael's blog.) The book is only partly an economic tract, though. Michael also spent a lot of time among the hikikomori--young male recluses whose plight, he argues, encapsulates much of what's wrong about Japan. Because I'm too lazy to take notes, I conducted the following interview with Michael via instant message. I edited it somewhat for length and comprehension.

JF: I thought everything was turning around for Japan. The economy was growing again, the politicians were more dynamic, etc. You saying that isn't so?

MZ: The conventional wisdom is that Japan is about to "turn the corner." That has been the line for some five years now.

You say it's wrong?

It is a very l-o-n-g corner. The surge of growth in China (on the investment side) and the U.S. (on the consumption side) has masked Japan's deep structural problems. Japan is very much a HIGH COST country in a LOW COST region. Japanese firms--the 90 percent that are domestic focused--are almost uniformly unproductive (by global standards) and unprofitable. Zero interest rates hide that problem.

And consumption in Japan isn't really back?

Consumers are pessimistic, worried about the collapse of the safety net, and getting older. The only reason SPENDING grew is that so many young people are out of work...

Huh?

Their parents have to help support them. There was a "modest" uptick in consumer spending last year. This was a reflection of the fact that Japanese had to dip into their enormous nest eggs because wages were no longer growing. We have 4 million PLUS young Japanese who only have part-time work. They earn about 15K a year, and their chances of getting hired mid-career are small.

Wow, you're just the gloomy gloomster. Isn't anything positive happening over there?

Positive? Well Toyota and Nissan are kicking ass...but that's overseas.

But still, there's stuff going on--in terms of foreign investment, startups, etc.--that wasn't there 15 years ago, isn't there?

Yes there is some more foreign direct investment. The corporate rules have been eased somewhat, corporate cross-ownership of shares has been reduced.
But certainly there has been NO radical restructuring of the system, and the Japanese consistently demonstrate that they DON'T WANT any radical restructuring of their system. Those who say Japan is "turning the corner" also insist that Japan will "converge" and emerge like us. I say, it just ain't so. Take that, Tom Friedman. The world is not "flat."

Tell me about your hikikomori buddies.

I argue in the book that if you understand why about 1 million young men isolate themselves in their own homes, don't work or go to school, then you begin to understand why it is so hard for Japan to globalize, "play well with others," open their economy, etc.

Those are the hikikomori?

Yes, to be hikikomori is to become socially isolated, alone at home, not going out. I use the Japanese term because Japanese experts (and I, based on my research) believe this pathology does not exist in other societies. When you test these folks, they are not agoraphobic, schizo, psychotic, or anything like that. When you sit and talk to them, they are highly engaged, self-aware and creative young people. They COULD be creative if allowed to actually express their individuality and intelligence. But that is not what Japanese society wants, or expects. From infancy they have been taught to suppress their own true feelings and put on a mask to "get along with others" in Japanese group-dominated society.

So basically, they're the people who don't have many friends in high school, who in the U.S. grow up to be superwealthy software entrepreneurs...

That's one way of thinking about it. Steve Wozniak was probably not the most popular guy in h.s., but he had the courage and guts to go do his own thing. He wasn't "bashed" all the time and told to just be like everyone else. Bill Gates wasn't exactly Mr. Magnetism either as a kid. Entrepreneurship, risk taking and critical thinking are all in very short supply in Japan. Which is why it's been hard for Japan to move from the comfortable, gradual improvement world of the industrial society, to the chaotic, turbulent, paradigm-breaking world of the post-industrial world. Continuous improvement of door hinges or even Megaflops of a semiconductor chip only gets you so far.

So is Japan just going to shrivel up and become irrelevant?

Yes, that is clearly one possible outcome. When South Korea fell apart, the IMF had to write a check and helped force the ROK [Republic of Korea] to make some big changes. But the Bank of Japan could OWN the IMF. Who has the leverage?

And there's no significant force in Japan pushing for a change in direction?

Yes, but only at the margins. Young people in Japan have very little power. The push is thru rebellion: Young women refusing to have kids, or moving to Manhattan. Young men hiding in their rooms. Rising suicide rates. Because in Japan, as opposed to the U.S., you don't "Act Out" ... you "Act In."

So basically it's Karel van Wolferen all over again?

Well yes, and no. I am very sympathetic to Karel's book, obviously. But Karel wrote about a very different time. The Japanese Model was still successful then. Now we have had FIFTEEN years of bad economics and NO Rebellion, which demands some explanation. I actually discuss some psychological and social theory to explain this. For instance, do you know the argument about the "strength of weak ties"?

Dunno

This is the basis for conversations about "social capital," i.e., that we benefit from knowing many people, in many diverse fields, from different ethnic groups, etc.--having weak ties with many people.
Like u and me

Yup. The reverse is to be like a Mafia don, or a resident of North Boston. You ONLY hang out with the people you already know: from your own family, or your own "house." These sorts of people only have STRONG ties. They don't want weak ones. Japan is more like a Mafia family than two Western journalists who met at a cocktail party in Japan ten years ago, and discovered mutual interests. As long as you are IN a Network, you are fine. That's why being a COMPANY MAN is so important. But fall out of that network, and you are toast. No one will give you the time of day. Hikikomori want to live in a world like ours, but are trapped in a Mafia family...

And so people like Ken Ohmae and Joi Ito are just totally anomalous? Or total outsiders?

Most Japanese think Ohmae is a wack job. And Joi Ito represents a new class of global Japanese who trade on their "globalness" but have little clout or influence on the domestic scene. These are smart Japanese who have been POLLUTED by living and/or studying abroad. Once they leave the "famiglia" they can't really get back in. That's why they work for foreign firms, or just leave. A guy like Ito or Masayoshi Son can do okay for themselves in Japan. Quite well, even. But I wouldn't say they can "change" Japan. What does it say about society that after 15 years of DEFLATION...there still hasn't been a change in government???

So 15 years ago, Americans were all freaked out that Japan was going to take over the world economy. Now that they so clearly aren't doing that, should we be happy about it or sad?

We should be sad, and worried. The world needs an engine of growth in Japan. We are too dependent on U.S. consumers to stimulate global expansion, and we should worry for the Japanese, who, as a society, are deeply unhappy and pessimistic.

A question about South Korea: You mentioned already that the IMF forced change there. Are there other reasons why Korean society, which is similar in some ways to Japanese, has adapted so much better to 21st century commerce and life?

YES. Of course ROK is somewhat smaller than Japan, and therefore less insulated. So when ROK gets flooded by a big wave, Japan just rolls along like a supertanker. But the HISTORY of Korea's modernization is different. Korea FOUGHT for democracy. Japan had democracy imposed by MacArthur. Korean society was constantly in upheaval because of colonization and war. And interestingly enough, one of the most salient forces that drove Korean modernization was the Protestant Church. Protestant missionaries came to Korea in the 1880s. These missionaries taught peasants to read. They built western-style hospitals. And they told Koreans to manage their own churches. So church groups in Korea consistently fought against Japanese occupation. They also helped build those "loose ties" that connected students and workers to protest martial law when Park was strongman. I argue that it is no accident that the Koreans were able to say "we screwed up. we have to change our system. we have to open up our country..."

So it's Hooray Protestant Missionaries, Boo Shinto Priests!?

Now that sounds like Jon Stewart... But yeah, I guess.

No, it sounds like a Red Stripe beer ad.

Sorry, we don't get them in S.F.

UPDATE: For Joi Ito's reaction, either scroll up to the next post, or click here.
Just who was getting those options? (Part 2)

Corey Rosen of the National Center for Employee Ownership sent me a comment on my options backdating posts that I figure deserves a post of its own. Take it away, Corey:
Justin Fox's suspicion about backdated options being more common for top executives is backed up by broader data. In a July study of 53 companies that were being investigated for backdating at the time, Jack Ciesielski of the Analyst's Accounting Observer found that 57% of the companies involved granted considerably more options to their top executives relative to other employees than did companies in general.
The limited (but real) impact of the CEO

Do CEOs really matter? Back in 1972, an article by Stanley Lieberson and James O'Connor in the American Sociological Review contended that the identity of the chief executive mattered far less to corporate performance than which company he ran and which industry it happened to be in.
Ever since then, management scholars have been arguing about whether Lieberson and O'Connor were right. I think it's fair to say that the consensus now is that they weren't--CEOs do matter. Two interesting recent papers on the topic available online, for those of you who like footnotes and advanced statistics, are "The Good, The Bad, and the Lucky: CEO Pay and Skill," by Robert Daines, Vinay B. Nair, and Lewis Kornhauser, and "How Much Do CEOs Influence Firm Performance--Really?" by Alison Mackey.

But it's also fair to say that academic research does not reveal CEOs to be the corporate superheroes we often portray them as in the business media. Consider the recent study of "superstar CEOs" by Ulrike Malmendier of the University of California at Berkeley and Geoffrey Tate of the University of Pennsylvania's Wharton School, who found that companies run by top executives who won awards handed out by the business press between 1975 and 2002 consistently underperformed the market after being honored. Malmendier and Tate argue, in part, that CEOs who are anointed as superstars neglect their jobs.

I got into this subject while working on a story for the latest Fortune on "The CEO Stats That Matter." My initial faint hope was that maybe there was a Bill James of corporate statistics out there who had figured out what we should and shouldn't be focusing on. The closest thing is probably Jim Collins, whose statistic of choice is shareholder return, measured a decade or two after the CEO in question has retired. Using that standard, Collins identified 11 CEOs in his book Good to Great who steered their corporations to sustained leaps in stock performance. They were all self-effacing insiders who put their companies ahead of themselves and focused most of their early efforts on surrounding themselves with good people.

That seems to be where leaders really can have an impact, by making incremental changes in the functioning of an organization. At least, that's how they can have a positive impact. "Good leaders can make a small positive difference," Stanford Business School's Jeffrey Pfeffer told me. "Bad leaders can make a huge negative difference--because they drive people out. If you said to me, 'Who can fix GM?' I don't know. But there's no question that someone could make an enormous difference on the downside."

Pfeffer thinks the business media does a great disservice by portraying CEOs as "all-powerful deities" (his new book with Robert I. Sutton, Hard Facts, Dangerous Half-Truths, and Total Nonsense: Profiting from Evidence-Based Management, which includes a nice review of academic work on CEO impact, disapprovingly cites a couple of Fortune articles on this count). I think we're just trying to find ways to tell compelling stories that readers will finish, but of course he has a point. Large corporations are vast and complex entities, with customs and attitudes that are hard for any one leader to change. So why do we talk as if the CEOs are truly in charge--and more importantly, why do we pay them that way?
When backdating is perfectly legit

Let me tell you about the nefarious scheme in place at some major American corporations until this year. Employees were allowed to pick the lesser of the current stock price and the price a year before, then buy stock at a 15% discount from that lower price. The companies failed to report this clear transfer of value on their income statements, and the employees didn't pay taxes on it. What an outrage!
Except that, um, it was totally legal. Not just legal, but encouraged by government policy. The scenario outlined above was standard practice in the Employee Stock Purchase Plans (ESPPs) employed by lots of forward-thinking companies. Still is, except that now--under the accounting standard known as FAS 123R (a 295-page pdf of which is available here)--the value of the discount has to be reported as a compensation expense, unless it's less than 5% of the stock price.

So what's the difference between this practice and the options backdating that has gotten so many CEOs, CFOs, and general counsels thrown out of their jobs in recent weeks? Well, for one thing, it was explicitly allowed in accounting standards and the tax code. As Corey Rosen, executive director of the National Center for Employee Ownership, put it when I tried the ESPP/backdating comparison out on him: "If it's okay for ESPPs it must be for options? No!" For another, ESPPs are invariably broad-based plans, while some of the companies engaged in backdating--as I pointed out in my post Tuesday--concentrated their options grants pretty heavily among top executives.

But I still think there's a point here worth considering as we contemplate the backdating mess. A commenter to my Monday post on this topic wrote that "shareholders of any company that has participated in options back-dating should be allowed to back-date the sale of their shares at the highest price during the past year. What's fair for execs IS fair for shareholders too." Well, not really. Executives and other employees of a corporation do something crucial that outside shareholders don't: They work there. Motivating them, luring them from other companies, and keeping them from jumping ship are all important goals, and sometimes it may make sense to use backdated or repriced or otherwise juiced-up options or other stock-based compensation to achieve them.

The problem is when companies don't fully report such grants, and don't take seriously the costs inherent in making them. But now all options grants have to be reported almost immediately, and their estimated value subtracted from earnings. So what's the problem? Yes, there are still concerns about how different companies estimate that value. But the accounting disconnect at the core of so many past problems with options--that certain kinds of stock compensation had to be subtracted from earnings and others, for no logical reason, did not--has been taken care of. Let the (fully disclosed and expensed) backdating resume!

UPDATE: Corey Rosen had a very interesting comment on this post that somehow got garbled, so we've removed it from the comments below and posted it here. Also, my Fortune colleague Adam Lashinsky had something very different to say on this subject a couple months back.
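For the record, here's the lookback arithmetic from the top of this post in code form--a minimal sketch of the common pre-FAS 123R plan design, with made-up prices:

```python
# ESPP lookback-plus-discount mechanics: buy at 85% of the lower of
# the start-of-period and purchase-date prices. Prices are made up.
def espp_purchase_price(price_at_start, price_at_purchase, discount=0.15):
    """Lookback ESPP: 15% off the lesser of the two prices."""
    return (1 - discount) * min(price_at_start, price_at_purchase)

start, purchase = 10.00, 16.00        # stock rose over the period
paid = espp_purchase_price(start, purchase)
print(f"pay ${paid:.2f} for a ${purchase:.2f} stock")
# Pay $8.50 for a $16.00 stock: $7.50 of per-share value that, before
# FAS 123R, never showed up as a compensation expense.
```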
Just who was getting those options?

Several of the commenters to my post Monday about options backdating portrayed it as a case of top executives changing the rules to enrich themselves. But most of the companies caught up in the scandal so far are in the tech industry, which is known for lavishing options on employees far below the CEO paygrade.
Out of curiosity, I did a bit of sniffing through the proxy statements of a few of the backdaters to see how skewed their options grants were toward top management. As a control case, I looked at Cisco Systems, a legendarily generous options granter that has not been implicated in the backdating scandal. I focused on the 2000 proxy statements, detailing options granted in 1999, but in some cases I also looked at other years to make sure 1999 wasn't anomalous. What did I discover?

Recently ousted United Health CEO William McGuire received 28.8 percent of the options granted to the company's employees in 1999. The top five executives as a group got 47.3 percent. Things didn't look quite so bad the next year, though, when McGuire got 8.1 percent and the top five 15.1 percent.

At Broadcom, none of the top executives received any options in 1999. In 2000, the top nine executives got 2.2 percent of the options granted. Then-CEO Henry Nicholas, who owned 43 percent of the company, received no options either year.

Brocade Communications CEO Gregory Reyes, who resigned in January and has since been charged with securities fraud, got no options in 1999, while the top five executives as a group got 5.24 percent. In 2000 Reyes got 3.4 percent and the top five executives 8.14 percent.

At McAfee, then-CEO William Larson got 25.8 percent of the options granted in 1999 and the top five executives 73.5 percent. The next year Larson got none and the top five executives got 29.48 percent.

At Apple, the top five executives as a group got 27.22 percent of the options granted in 1999. Steve Jobs became CEO in January 2000 and got a spectacular one-time grant of 20 million options, or 43.8 percent of the total, with the top five executives getting 46.43 percent total. By the next year things had settled down a bit, with Jobs getting no options and the top five executives as a group getting 11.48 percent.

At CNET Networks, then-chairman Halsey Minor got 8 percent of the options granted in 1999 and recently departed CEO Shelby Bonnie 2 percent, with the top four executives getting 20.2 percent. At Comverse Technology, then-CEO Kobi Alexander got 6.4 percent of the options granted in 1999 and the top five executives 12.3 percent. At Mercury Interactive, since-ousted CEO Amnon Landan got 14.82 percent of the options granted in 1999 and the top four executives got 25.72 percent.

And how did it go at our control company, Cisco Systems? In the company's 2000 fiscal year (which started in mid-1999), CEO John Chambers got 1.33 percent of the options granted and the top six executives 2.69 percent. In the 2001 fiscal year it was 2.01 percent to Chambers, 2.82 percent to the top executives as a group.

This survey is nowhere near exhaustive enough to draw strong conclusions. But it is very interesting that every backdating company except Broadcom and to a certain extent Brocade was far more generous to top executives relative to the rest of its workforce than Cisco was. My readers appear to be onto something.
The real options-backdating culprits

Almost every day there's another one, a top executive thrown out of his job for backdating options. These are some otherwise perfectly respectable people we're talking about: William McGuire at United Health, Shelby Bonnie at CNET, Andrew McKelvey at Monster. Even Apple's Steve Jobs has gotten tangled in the backdating web, although there are no signs that he'll lose his job over it.
When supposed wrongdoing is this widespread, one can't help but wonder: Are there really this many willful rule-breakers in corporate America, or did somebody change the rules on these guys in midstream? I'm tempted to lean ever-so-slightly toward the second answer. What was done was clearly against the rules, but those rules were until recently treated with such disdain in the business world and even by many investors that it's perhaps understandable that so many executives saw no harm in breaking them.

First, a brief explanation of options backdating: Say your company's stock is trading for $15, and it gives you 100 options--expiring in 10 years--to buy that stock at $10 a share. So far, so good. As Holman Jenkins argued in The Wall Street Journal last week (not available online unless you have a financial relationship with Dow Jones & Co.), there's nothing intrinsically wrong with giving employees in-the-money options. It's just like giving them restricted stock, or cash. What's wrong is reporting in a company's financial statements that the $10 options were granted at some time in the past when the stock happened to be selling for $10 a share. Until this year, options priced at the money (that is, with the stock trading for $10, you get an option to buy a share for $10) were considered free for accounting purposes--while an option granted in the money (with the stock at $15, you get an option to buy it for $10) was counted as a compensation expense.

This accounting distinction was of course entirely loopy. When last I checked this afternoon, United Health stock was trading at $48 a share. Meanwhile, an option to buy a share of United Health for $50, expiring in Jan. 2009, was selling on the American Stock Exchange for $10. That is, even out-of-the-money options have value. In 1993, after long deliberation, the members of the Financial Accounting Standards Board--the people who determine what constitutes a Generally Accepted Accounting Principle--acknowledged this truth with a proposed accounting standard requiring that all employee options be valued with one of the mathematical models widely used in the options-trading world (the Black-Scholes model or the related binomial model).

Then all hell broke loose. In what should go down as one of the most shameful episodes in modern business history, corporate America bullied FASB into backing down. Silicon Valley was loudest in its opposition, but all the big business groups joined in. Joe Lieberman was enlisted as the chief hatchet man on Capitol Hill (his more vocal allies included Bill Bradley, Barbara Boxer, and Phil Gramm), sponsoring a 1994 resolution--which passed 88-9--urging FASB not to change accounting for options, and making threatening noises about effectively shutting the board down if it didn't comply.

There were and still are valid objections to the method FASB proposed for valuing options. It takes a fleeting estimate--the valuation set by the Black-Scholes or binomial model on the day the option is granted--and sets it in earnings-statement stone. But you can't make a serious accounting case for treating options as free, which is what most of FASB's opponents were after. So they couched their argument in economic terms: By motivating employees and aligning their interests with shareholders, options were promoting economic growth. Expensing options would thus hurt the economy, which made it a bad thing. The same argument can be made about expensing cash paychecks, of course, but that didn't seem to bother anybody at the time.
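To make the "out-of-the-money options have value" point concrete, here is the standard Black-Scholes valuation in a few lines of code. The stock price, strike, and expiry are the United Health numbers above; the interest rate and volatility are my guesses, so treat the output as illustrative:

```python
# Black-Scholes value of a European call, applied to an option that
# is out of the money today.
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

def black_scholes_call(S, K, T, r, sigma):
    """Standard Black-Scholes call value: spot S, strike K, years to
    expiry T, risk-free rate r, annualized volatility sigma."""
    d1 = (log(S / K) + (r + sigma**2 / 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# Stock at $48, $50 strike, ~2.2 years out; guessing 4.8% rates and
# 32% volatility, the model lands near the $10 the AMEX was asking.
print(f"${black_scholes_call(48, 50, 2.2, 0.048, 0.32):.2f}")
```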
This victory of politics over accounting logic had consequences. As Warren Buffett, a lonely voice in support of FASB back in 1994, told me in 2002: "Once CEOs demonstrated their political power to, in effect, roll the FASB and the SEC, they may have felt empowered to do a lot of other things too." Buffett was referring to the accounting shenanigans at Enron and WorldCom, but the connection to the options backdating scandal is much more direct.

After the Enron and WorldCom meltdowns, the political climate shifted. More and more companies began expensing options voluntarily, and in 2004 FASB finally pushed through its rule. Starting this year, all options granted to employees have to be expensed. But the backdating offenses coming to light now (thanks to the work of University of Iowa business school professor Erik Lie) almost all predate 2002. They were committed back in a day when virtually every significant business organization in the country was arguing that options shouldn't be expensed, a view endorsed by the Big Six accounting firms (yes, there were six back then), Congress, and even a lot of big money managers. In such an environment, it wasn't all that out of line for the people at United Health and CNET and Monster and Apple and Comverse and Broadcom and Brocade to think tweaking the grant date of an option was a mere technicality.

I am not saying don't blame them, blame society. I'm saying blame them and society--society in this case consisting of the American Electronics Association, the Business Roundtable, the big accounting firms, Joe Lieberman, you name it. The guilt is shared pretty widely here.
You can't beat the (mobile) phone company

Telekom Austria CEO Boris Nemsic paid a visit to Fortune yesterday afternoon. He's a straight-talking engineer, originally from Sarajevo, who sports the only display of Don-Johnson-in-Miami-Vice scruff that I've ever seen on the CEO of a major corporation. (For some reason he shaved it for the photo that accompanies his official bio above; here's what it looks like.)
Telekom Austria runs the fixed-line network in Austria, and is also the country's leading mobile service provider (T-Mobile is No. 2). It also has major mobile operations in Bulgaria, Croatia, and Slovenia. And Nemsic had some interesting things to say about the continuing attempts by everyone from Nokia to Yahoo to Microsoft to Google to bypass wireless operators and horn in on the mobile market. "They're missing the crucial thing that the relationship to the customers is ours," he said. That is, it's the operators that send out the bills, sell phones in their stores, and take the calls when something goes wrong.

A few years back, Nokia made a big push for users of its phones to sign up for Club Nokia, through which they'd get updates on Nokia products, chances to win swell prizes, and access to customer support. That effort mostly fizzled, but now Nokia is again looking for ways to bypass wireless operators and build its brand among customers. "Now they feel powerful again," Nemsic said, then chuckled and held up his own Nokia N73. "The phones are really good." Nokia, he continued, "will always try" to get around the operators' close relationship with customers. "So will Microsoft and Google. BlackBerry is maybe the only one that succeeded with this. But the BlackBerry business model may have peaked: It's not normal to have a monopoly on push mail, because push mail is not rocket science."
Billy Beane's successful crapshoot

It's important these days for journalists to own up to their biases, so let me just say up front that I'm a card-carrying member of Athletics Nation. I've been an A's fan since 1971, not counting a brief and shameful dalliance with the Giants in the late 1970s (when Charlie Finley let his team go to seed, and my parents had friends with a Candlestick luxury box).
The A's are of course more than just a baseball team. They are an economic phenomenon: A low-budget squad that has made the playoffs five of the last seven seasons (and come close the other two). After Michael Lewis provocatively chronicled the team's success at identifying undervalued talent in the bestseller Moneyball, they also became a controversial phenomenon, with baseball traditionalists splutteringly condemning A's general manager Billy Beane's obsession with statistics and even Freakonomist Steven Levitt wondering if maybe Beane wasn't just lucky.

An article in the summer 2006 Journal of Economic Perspectives by economists Jahn Hakes and Raymond Sauer (it's not available free online, but an earlier draft is) would seem to have put this debate to rest. Hakes and Sauer found strong empirical support for what they called the "Moneyball hypothesis"--that hitters' salaries "did not accurately reflect the contribution of various batting skills to winning games." (Essentially, batters who walked a lot were undervalued.) They also found that, after Moneyball came out in 2003, this labor-market inefficiency disappeared.

Which brings us to yesterday's playoff game between the A's and the Twins. That the A's made it to the playoffs yet again would seem to be testimony to GM Beane's skill, even though he no longer really follows the Moneyball formula (the A's payroll has ballooned all the way up to 21st among the 30 major league teams, and the team's success this year has had more to do with pitching and defense than the ability to get on base). But the A's 3-2 win over the Twins in game one was about something else. Or someone else: Frank Thomas, whose two homers provided the margin of victory.

In the past Beane has explained away the A's lack of postseason success with the excuse that short playoff series are a "crapshoot," not a real test of quality. This year he took a gamble on a gimpy former superstar whom no other team wanted. There's no way any statistical analysis could have told Beane that Thomas was likely to hit 39 homers and drive in 114 runs this season, then almost singlehandedly win the first game of the playoffs. It was a crapshoot, pure and simple, and yesterday it happened to pay off. There are surely all sorts of important lessons about investing and business to be learned from this. But forget that: There's another A's game on this afternoon.

UPDATE: The crapshoot theme continued in game two, with the best defensive centerfielder in baseball betting that he could get to a line drive by the A's Mark Kotsay and turning out to be wrong. The resulting inside-the-park homer provided the A's margin of victory. Also, the Sports Economist blog, which counts among its contributors the abovementioned Raymond Sauer, has a couple of interesting recent posts on the "Moneyball hypothesis."
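For readers who want the flavor of the Hakes-Sauer test itself, here's a stripped-down sketch: regress hitters' (log) salaries on on-base and slugging percentage and see how the market prices each skill. Everything below is synthetic data built to mimic their before-and-after finding, not their actual numbers.

```python
# Toy version of the "Moneyball hypothesis" test: OLS of log salary
# on OBP and SLG, before and after 2003. Data is simulated.
import numpy as np

rng = np.random.default_rng(1)
n = 200
obp = rng.normal(0.340, 0.030, n)
slg = rng.normal(0.430, 0.060, n)

def fit_salary_model(obp, slg, log_salary):
    """OLS of log salary on OBP and SLG; returns the two slopes."""
    X = np.column_stack([np.ones_like(obp), obp, slg])
    coef, *_ = np.linalg.lstsq(X, log_salary, rcond=None)
    return coef[1], coef[2]

# Pre-2003 market: pays for slugging, underpays walks (by construction).
pre = 10 + 1.0 * obp + 3.0 * slg + rng.normal(0, 0.3, n)
# Post-2003 market: OBP priced in line with its contribution to winning.
post = 10 + 3.0 * obp + 3.0 * slg + rng.normal(0, 0.3, n)

for label, y in [("pre-Moneyball", pre), ("post-Moneyball", post)]:
    b_obp, b_slg = fit_salary_model(obp, slg, y)
    print(f"{label}: OBP coef {b_obp:.1f}, SLG coef {b_slg:.1f}")
```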
To 300 million and beyond

The population of the United States will pass 300 million this month, says the Census Bureau. Only about 220 more years, according to my deeply unscientific extrapolations of United Nations population projections, and we'll pass China!
China will start losing population towards the middle of this century, predicts the UN's Population Division. Japan and Europe will begin shrinking well before that. The most dramatic population losses will be in Eastern Europe, which already began its decline in the 1990s and is expected to lose another quarter of its current 295 million inhabitants by 2050. The world's population should peak sometime in the latter half of the century, at around 9 billion. Meanwhile the U.S. will keep chugging along, barring of course a major crackdown on immigration or a mass loss of interest in making babies.

Some worrywarts, like the group Negative Population Growth, see this as cause for great alarm. NPG is launching a "Wake Up America" ad campaign keyed on the 300 million milestone to warn us of the dire consequences of population growth:

Productive farmland is being paved over, fragile wetlands are falling victim to population pressures, urban sprawl is suffocating our cities and suburbs, and we are fast depleting the limited water and energy supplies we will need to survive as a strong and productive nation.

Given how sparsely populated the U.S. is by the standards of much of Europe and Asia, I tend to think this kind of talk is nonsense. If we are in fact depleting our resources and letting sprawl suffocate our cities, it's because we're profligate and stupid, not because there are too many of us. Five people driving Priuses use up less gas than one in a Hummer H2. City apartments take up a lot less space and use a lot less energy than exurban McMansions. And as this article in today's New York Times (registration required) points out, we're actually getting pretty good at controlling urban water use.

Population growth isn't in and of itself an environmental disaster--and population loss certainly isn't turning Eastern Europe into an environmental paradise. Rapid population growth in poor countries usually involves a collision between tradition and medical advances that allow more children to survive childbirth, and is a temporary phenomenon. There's some of that at work in the United States among poorer immigrants. But this country's continued growth seems mainly to be a product of the optimism and economic opportunity that reign here and not in much of the rest of the developed world. Which is something to celebrate, not bemoan.
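In case anyone wants to check my deeply unscientific extrapolation, here's roughly how it goes in code. The growth rates are made-up round numbers in the spirit of the UN projections, not the projections themselves:

```python
# Back-of-the-envelope U.S.-passes-China extrapolation. Starting
# populations are rough 2006 figures; growth rates are invented.
us, china = 300e6, 1.31e9
year = 2006
while us < china:
    us *= 1.006                                # assumed ~0.6% U.S. growth
    china *= 1.003 if year < 2030 else 0.999   # slow rise, then decline
    year += 1
print(f"crossover around {year} ({year - 2006} years out)")
# With these assumptions the lines cross roughly 225 years out, in the
# neighborhood of the ~220 years claimed above.
```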
How blogging makes my lunch taste better

I have no idea what the media landscape will look like a few years down the road. But I do know that blogs--or some even better means of self-publishing that hasn't been named yet--will only grow in importance. Not because I'm writing one, but because there are so many examples of the genre that fill now-obvious needs that weren't being filled at all by those of us in the traditional media.
The one I'm thinking of at the moment, because I ate lunch within the past couple of hours and I work in midtown Manhattan, is Midtown Lunch. It's written by an anonymous "fat man" who appears to work about a block south of me. He just started it a few months ago, and has already assembled a large catalog of reviews of the sort of non-expense-account lunch spots that almost never get written about anywhere else and certainly never get such thoughtful and entertaining treatment. (Today's post, for example, is a Yom Kippur-inspired "list of Midtown Lunch pork and shrimp worth going to hell for.") Economic value is being created here (so far it's all going to the restaurants and the midtown workforce, as Mr. Midtown Lunch isn't selling ads). The world is being improved. My lunches are better than they used to be. Hooray blogs!