Only You Can Prevent Corporate Fires

Crisis avoidance--not crisis management--might be one of the most underrated functions of a business. So why do companies constantly ignore it?
By Jeffrey Pfeffer

(Business 2.0) – During the past couple of years, the trend has become annoyingly familiar: Hundreds of American technology firms have rushed to India and other countries to hire inexpensive foreign workers for hardware and software engineering--and, by exporting those jobs, found an effective defense against economic recession here at home. On the face of it, the logic was impeccable: Indian engineers earned about 80 percent less than their U.S. counterparts. Dramatic cost savings seemed guaranteed.

But a funny thing happened on the way to the savings windfall--the law of supply and demand. As more and more companies have "offshored" jobs to India, engineers' salaries there have begun to soar (as much as 30 percent annually), costly turnover is on the rise, and for many U.S. firms, that once-enticing cost/wage differential has all but disappeared. The real shocker? That so many companies didn't see this coming--or didn't build the possibility into decision-making--before hopping on the offshore bandwagon.
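The arithmetic behind that disappearing differential is easy to check. A minimal sketch, using illustrative numbers taken from the figures above (a wage about 80 percent below the U.S. baseline, growing as much as 30 percent a year) and assuming, for simplicity, that the U.S. wage stays flat:

```python
# Back-of-the-envelope sketch (illustrative numbers, not company data):
# how quickly 30% annual raises erode an 80% wage discount, assuming
# the U.S. baseline wage stays flat.
import math

us_wage = 100.0        # index the U.S. salary at 100
offshore_wage = 20.0   # "about 80 percent less"
growth = 0.30          # "as much as 30 percent annually"

years = 0
wage = offshore_wage
while wage < us_wage:
    wage *= 1 + growth
    years += 1

print(years)  # 7 -- parity is reached within seven annual raises
# Closed form: ln(100/20) / ln(1.3)
print(round(math.log(us_wage / offshore_wage) / math.log(1 + growth), 1))  # 6.1
```

In other words, under these assumptions the entire cost advantage evaporates in roughly six years, well inside the planning horizon of any offshoring decision.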

I'm not suggesting that outsourcing to India portends sure failure. I am suggesting that far too many companies plant the seeds of failure by ignoring feedback processes--the ways today's decisions reshape tomorrow's conditions--that show how seemingly sensible choices can yield bad results. Here's another example: Businesses doing product development (especially for software) routinely heap praise, recognition, and rewards on workers who rescue schedules and projects by pulling all-nighters and making extreme sacrifices. Makes sense--until you consider the consequences. In her book Finding Time, Harvard Business School professor Leslie Perlow shows that one of the biggest obstacles to reducing time wasted on unproductive activities is the reward structure that so many executives still maintain: Throw in a bonus and employee-of-the-month honors for the hero who debugged that awful software at the 11th hour, but pay no attention to his or her counterpart who launched a clean product on time without the fanfare or heroics. (Remember, forestry isn't the only business in which the people putting out the fires have an incentive to set them. Think about that the next time a manager proudly appoints himself or herself fire chief of a failing effort.)

Smart organizations, on the other hand, consider the longer-term effects of their actions. Some do this formally--Shell's forecasters create "scenarios" for world geopolitics stretching half a century forward--but as the preceding examples illustrate, all that's really required is a little thought. Some of the best ideas I've found come from Nelson Repenning, a management professor at MIT who posed a basic question: Why do so few organizational improvement efforts ever pay off, even when they're based on sound research and logic? The answer: To make things substantially better, you often have to make things worse in the short run. To upgrade manufacturing efficiency, you may need to redesign the production line and retrain employees--and while machines are being moved and people learn new skills, productivity goes down, not up. A seemingly attractive short-run alternative is simply to push the existing system harder--and while that may make things better for a while, performance eventually heads south because managers have ignored the fundamental problems. This "worse before better" (or "better before worse") conundrum not only explains why improvement efforts often fail but also lays out the conditions under which they will succeed.
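The dynamic Repenning describes can be sketched with a toy model (the numbers here are my own illustrations, not his data): one path dips while the line is rebuilt and workers retrain, then settles higher; the other gets an immediate bump from pushing harder, then erodes as neglected problems compound.

```python
# Toy illustration of "worse before better" vs. "better before worse."
# All figures are hypothetical productivity indexes, chosen only to
# make the crossover visible.

def redesign(quarter):
    # Two quarters of retraining drop output to 70; afterward the
    # redesigned line runs at 120.
    return 70 if quarter < 2 else 120

def push_harder(quarter):
    # An immediate bump to 110 that erodes 5 points per quarter as
    # ignored fundamental problems pile up.
    return max(110 - 5 * quarter, 0)

for q in range(8):
    print(q, redesign(q), push_harder(q))
```

Note that at quarter 0 the push-harder path looks far better (110 vs. 70), which is precisely when managers judge the redesign a failure; by quarter 2 the redesign has pulled ahead (120 vs. 100) and the gap only widens.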

My advice for avoiding such problems is pretty straightforward. It isn't very difficult to predict that if you, and everyone else, move operations to India, wages there are going to rise, and the advantage you enjoy today may not be around tomorrow. Nor does it take much subtlety to figure out that if you reward people for solving crises, you provide a tacit inducement to create more. Thinking more systematically about decisions may take a little extra time and effort, but it's far easier and infinitely smarter than racing toward the next fire.