When I Was Wrong

by Mark Chussil

This essay starts with a shocking pricing tournament and proceeds to the challenges faced by President-elect Obama and the titans of industry.

You’d have every reason to expect me to do well in that pricing tournament. With over 30 years in competitive strategy, a global roster of brand-name clients, award-winning and patent-pending (update: patented) simulation designs, a slew of publications, and an MBA from a well-known Eastern business school, I look like a good bet. And that’s even before I reveal my unfair advantage, to wit, that I wrote the simulator for the tournament. (I planned not to include my results in the official tournament tally.)

Imagine my surprise when my performance in my own tournament ranged from sort-of-good to what we will generously call below average.

(Does that performance mean you should expect me to do poorly in the future? We’ll get to that. The short answer is no.)

One hundred and fifty able strategists participated in the tournament. (Update: as of May 2014, over 550 able strategists have participated. The lessons in this essay still hold.) My sobering experience was shared by most of them. Pricing specialists, high-end consultants, senior strategists, experienced managers: all, like me, tried to do well; all, being smart and experienced and credentialed, expected to do well; most, like me, didn’t do well. We were wrong, and we were surprised.

When I analyzed the results — in effect, it was a massive business war game with millions of pricing simulations played in a computer, which ACS calls a decision tournament — I figured out what I’d done wrong. It had nothing to do with decimal points, life experience, industry knowledge, or general smartness. It had everything to do with assumptions I made. Those assumptions became clear because I got to see what other people actually did, as opposed to what I assumed they’d do. In the best tradition of the scientific method, the pricing tournament let me stress-test my strategy/hypothesis. It was a safe environment where I could learn.
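Those millions of simulated matchups can be sketched in miniature. The toy tournament below is purely illustrative, not ACS's actual simulator: the three strategies, the unit cost, and the demand rule (market share in inverse proportion to price) are all my assumptions. The point it demonstrates is the essay's point: a strategy's payoff depends on what rivals actually do, which you only discover by playing it out.

```python
COST = 4.0      # hypothetical unit cost
MARKET = 1000.0  # hypothetical total units demanded per round

def undercut(last_prices, me):
    # Price just below the cheapest rival's last-round price, floored above cost.
    rivals = [p for i, p in enumerate(last_prices) if i != me]
    return max(COST + 0.5, min(rivals) - 0.25)

def hold_high(last_prices, me):
    # Hold a premium price no matter what rivals do.
    return 10.0

def match_average(last_prices, me):
    # Match the average of rivals' last-round prices.
    rivals = [p for i, p in enumerate(last_prices) if i != me]
    return sum(rivals) / len(rivals)

def play_tournament(strategies, rounds=50):
    n = len(strategies)
    prices = [8.0] * n    # everyone starts at the same list price
    profits = [0.0] * n
    for _ in range(rounds):
        # Each strategy sees only last round's prices when choosing this round's.
        prices = [s(prices, i) for i, s in enumerate(strategies)]
        # Crude elasticity: demand splits in inverse proportion to price.
        weights = [1.0 / p for p in prices]
        total = sum(weights)
        for i, p in enumerate(prices):
            qty = MARKET * weights[i] / total
            profits[i] += (p - COST) * qty
    return profits

profits = play_tournament([undercut, hold_high, match_average])
```

Run it and the ranking is an empirical result of the interactions, not of any one player's cleverness; change a rival's strategy and your own "good" strategy can become a poor one, which is exactly what the safe environment is for.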

About whether you should expect me to do poorly in the future. Yes, if I were to implement my tournament strategies in real life, I probably would perform badly. Except that I wouldn’t implement those strategies in real life. Why not? Because I had the benefit of the tournament. I learned that my strategies wouldn’t work and I learned what would work better.

Now let’s translate that pricing-tournament experience to challenges we face in government and industry.

We have great debates about the great issues of the day. To some extent those debates are about personal values; for instance, how we value personal responsibility versus safety nets. Perhaps to a greater extent, though, those debates are about what works. The financial crisis is leading many people to shift from trust-the-free-market toward we-need-more-regulation not because of ideological soul-searching but because of the painful evidence that deep deregulation didn’t work.

No one gets up in the morning intending to make bad decisions. Those who proposed deep deregulation did not expect their proposals to fail any more than I expected my pricing-tournament strategy to fail. Yet despite good intentions bad decisions get made, and they can be hard, costly, and time-consuming to reverse. In businesses, in a country, in a world with shrinking room for error, it is imperative that we waste less time and money being wrong.

Some problems are big without being tough. Although fixing Social Security involves massive numbers, it is not a tough problem; it is well understood that the solution involves some combination of changes to benefits, contributions, and eligibility. The real issue is mustering the political will to act. By contrast, challenges such as the financial crisis (relevant for both industry and government), climate change (mostly for government), and desperate competition (mostly for industry) are tough because the problems and the proposed solutions involve sequences of actions and reactions among human beings. Statistics, trend lines, and historical analogies are the wrong tools for such problems; they no more solve them than they would solve a game of chess or the pricing tournament.

Sometimes it looks as though the wrong tools are working, in the sense that they make predictions that come true. When that happens, though, it’s more about lucky data than good predictions. Extrapolating the past into the future works just fine when the future behaves like the past. When the future is different — with, say, a financial crisis in a hyper-interconnected world, or climate change that can literally change the planet — there is no relevant past to extrapolate. That’s where simulation comes in.
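The extrapolation point can be made with one line of arithmetic. The series below is invented for illustration: a naive trend forecast is exactly right as long as the future behaves like the past, and badly wrong the moment it doesn't.

```python
# A toy series growing +2 per period; all values are invented for illustration.
past = [100, 102, 104, 106, 108]
slope = (past[-1] - past[0]) / (len(past) - 1)  # average step: 2.0 per period
forecast = past[-1] + slope                     # trend forecast for the next period

business_as_usual = 110.0  # the past repeats: the forecast is exact
regime_change = 85.0       # a crisis hits: the same forecast misses badly

error_usual = abs(forecast - business_as_usual)
error_crisis = abs(forecast - regime_change)
```

Nothing in the historical data distinguishes the two futures; the forecast's accuracy is a property of the world, not of the forecasting method.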

This essay is not a product review of the many simulation technologies that can help guide governments and industries through tough challenges. Google finds “about 14,600,000 results” (Update: by May 2014 it had grown to 79,600,000) on a search for “simulation technology,” which would make a review more than I can do today. Rather, it is about the general value of simulation as a guide and as a way to get value from being wrong.

When you began to read this essay, you might have focused on the “I was wrong” part of the title. Let’s attend to the “when” part for a moment. The issue is not whether I, or you, or President-elect Obama, or the CEO of Whatever Inc., will be wrong. All of us are human and all of us make mistakes. What’s important is when we make our mistakes.

It is far cheaper, in every sense of the word, to make our mistakes when we’re in safe environments than to make our mistakes when we’re playing for real. For instance, the person who won the pricing tournament (a safe environment) used a strategy that many people would regard as risky. If it failed, we’d press the reset button on the simulation; if it succeeded, we’d gain more confidence in the path previously considered risky. For another instance, simulations revealed design flaws in DC-10 aircraft. (Unfortunately, the simulations were run after the 1979 crash in Chicago, not before.) For yet another, I’ve seen thousands of experts use business war games and crisis simulations to test their ideas and improve their skills in ways they couldn’t and shouldn’t do with real money, careers, and lives at stake. And there are about 14,599,997 (Update: or 79,599,997) other stories to be told.

We face big problems and we need big, creative solutions. I hope our leaders in government and industry will stress-test their ideas with simulation, where it’s safe, so that when it counts we won’t be wrong.

Update, October 22, 2012. “I’m Right! (For Some Reason),” by Steven Sloman and Philip M. Fernbach in The New York Times, discusses thinking things through in political advocacy and decision-making. I think the same forces are at work in business. Their diagnosis and solution express exactly why business war games and strategy simulation work. From their article: “…most people would agree that it is not productive to have a strong opinion about an issue that one doesn’t really understand. We have a problem in American politics: an illusion of knowledge that leads to extremism. We can start to fix it by acknowledging that we know a lot less than we think.”
