Trial-and-error: the smart way ahead
4 min read · Updated: 17 Jul 2011, 09:01 PM IST
Tim Harford, a columnist for the Financial Times, is out with his latest book, Adapt—Why Success Always Starts with Failure.
Harford, who writes the agony aunt column Dear Economist, makes a strong case for an adaptive trial-and-error method as the smart approach to solving everyday economic problems. Adapt is peppered with examples from psychology, the study of evolution, anthropology, physics, business and economics.
Harford’s earlier books include The Undercover Economist, The Logic of Life and a collection of his columns titled Dear Undercover Economist. We spoke to him about Adapt and why he rates it higher than ready-made solutions. Edited excerpts from an email interview:
You suggest improvising and taking baby steps instead of great leaps forward. Economists Abhijit V. Banerjee and Esther Duflo, in their 2011 book ‘Poor Economics’, too suggest smaller interventions rather than grand solutions to help the poor. Why small steps?
However, randomized trials are not the only way to do this. Adapt describes all kinds of different approaches—and all kinds of different obstacles—to the problem of informally experimenting and adapting.
You talk about being tolerant and learning from failure. But what about sectors where the consequences of failure are far too great to allow any trial and error?
Good question. It turns out there are some important parallels between banking and other fragile systems such as nuclear power or industrial processes. I spoke to safety experts—engineers, psychologists and even sociologists—who had studied disasters such as Three Mile Island, Bhopal and Deepwater Horizon. There are various important principles that carry over from these disasters to the financial crisis of recent years. The most important one is that failures will happen, and your systems must be built with that expectation in mind: Simplify, decouple, and listen to whistle-blowers who may be able to provide early warnings.
Tell us about the importance of conflict and how organizations can learn from it.
Again, failure here is inevitable. Napoleon, perhaps the greatest general in European history, committed a catastrophic error in marching his armies to Moscow. Winston Churchill’s Gallipoli campaign in World War I was a poor omen for his leadership in World War II. President Kennedy showed great judgement during the Cuban missile crisis—and appalling misjudgement preparing for the Bay of Pigs invasion. Mao Tse-tung was a visionary insurgent general but an appalling peacetime leader. Nobody has the skills to get things right every time in a complex and ever-changing world. Therefore, success in war must involve a healthy dose of error correction. I studied the disastrous invasion of Iraq—and the painful change in strategy—and discovered that the US Army only managed to change direction due to a near-insurgency inside its own ranks. Many organizations can learn lessons here: It is often the people on the front line who understand the mistakes an organization is making, and can fix them.
What are feedback loops and what is their role in designing better systems, or in the selection process of finding out what works and what doesn’t?
What is your view on peer monitoring and the “worm’s-eye view" approach to it?
The “worm’s-eye view" is a phrase coined by Muhammad Yunus, creator of the Grameen Bank. The idea is that from the top of an organization you often feel that you have a wonderful bird’s-eye view of what is happening, but the truth is that you’re not really seeing anything at all with clarity. The worm’s-eye view is what you get when you get really close to a problem—and is often essential for fixing problems.
Related is the idea of peer monitoring. Because—for all the reasons discussed—problem solving often requires decentralization and delegation to junior decision makers, the question then becomes: Who makes sure they are being professional and making wise choices? Not the bosses, who cannot clearly see the issues involved. It’s close colleagues: peers. They’re the people who know whether a decision was right or not. And that’s why peer monitoring is so essential in companies such as Google, but also lesser-known companies such as the supermarket Whole Foods or the British shoe-repair chain Timpson. Peer monitoring is essential in high-risk systems such as a nuclear power plant, as I discovered on my own tour of a nuclear power station. And it was critical in the US turnaround in Iraq, when successful strategies, pioneered on the ground not by generals but by colonels, were ignored by senior officers but quickly copied by more junior soldiers.
You talked of creating your own safe spaces to experiment. How do safety and experimentation coexist?
Because failure is so common it’s essential that the costs of failures are contained—a central principle in the regulation of financial systems.
Success, even of great companies, seems transient. Tell us briefly how adapting brings some order to such chaos and uncertainty.
The failure rate of American businesses is 10% per year. That’s tremendously high—far higher, for instance, than the failure rate of Americans! But this is essential for the process of adapting: This failure rate is what economic growth looks like, especially for a mature economy. Bad ideas are being replaced with good ideas. Good ideas are being replaced with better ones. It’s what progress looks like.