I Didn’t Know You Could Do That: Disaster in simulations leads to progress in real life, by Mark Chussil
I’m a simulations kind of guy, and not just for business. In this essay I’m going to talk about both business and crisis simulations.
A large pharmaceuticals company in Europe used a business war game to help develop its strategy for a product launch. We divided their management groups into teams to role-play the competitors, and we set up three teams to play their own business. Why three teams for their own business? they asked. They thought the strategy decision for their own business was pretty clear.
As it turns out, the three home teams came up with three very different strategies. That was a major insight from the war game: they didn’t know they had (at least) three options. It was premature to fine-tune an “obvious” strategy (which wasn’t so obvious after all).
I’ve seen similar behavior in crisis simulations. Here’s a true story from an actual simulation conducted by a company I co-founded a few years ago, Crisis Simulations International.
A bomb has gone off in an American city, severely damaging a key bridge, hurling cars into the river below and derailing a train. The train was carrying poisonous chlorine gas, among other things. You are the mayor. You don’t know if the chlorine is leaking. You don’t know if another bomb has been planted, ready to kill rescue workers. You don’t know whether to believe confused, contradictory, maddeningly incomplete reports flooding in from all directions. Simulated TV broadcasts are blaring. Fleeing people may quickly gridlock the city, making it impossible for rescuers to get through. You have a variety of decisions to make. Many lives, the safety of your city, and your political career are at stake.
At key points during the simulation we asked 100 assembled experts, in real time, what they would decide for the simulated mayor, police chief, fire chief, and so on. They would have several starkly different options at each decision point, such as whether or not to pull back rescuers when a suspicious box was discovered near ground zero. The experts had electronic devices that let them vote quickly and anonymously.
The 100 experts were nowhere near unanimous on any of the questions we posed. On one question, the experts split almost exactly evenly on the four possible decisions.
Let’s leave aside the disturbing question of crisis preparedness. Instead, consider three points about the experts’ decisions:
- Each of them selected what he or she thought was right.
- Most of the other experts disagreed.
- Anyone making assumptions about what the others had decided would probably have guessed wrong.
People do make assumptions about what others will decide. The people actually going through the computer-based simulation are usually in the same room, and yet they don’t talk to each other about how to coordinate their decisions! They make their decisions based on what they think others will do, and the odds are distressingly high that they’ll be wrong.
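The arithmetic behind those distressing odds is easy to sketch. Assuming, for illustration, the near-even four-way split mentioned above (the option labels below are invented, not from the actual simulation), here is a minimal calculation of how often independent decision-makers happen to land on the same choice:

```python
# Hypothetical illustration: 100 experts split almost evenly
# across four options, as in the crisis simulation described above.
options = ["pull back", "hold position", "partial pullback", "await confirmation"]
split = {opt: 0.25 for opt in options}  # near-even split, 25% each

# Probability that two independent experts happen to choose the same option.
p_pair_agrees = sum(p * p for p in split.values())
print(f"chance a pair agrees: {p_pair_agrees:.0%}")  # 25%

# Probability that a five-person team, deciding independently
# without talking to each other, all land on one shared option.
p_team_aligned = sum(p ** 5 for p in split.values())
print(f"chance five people align unprompted: {p_team_aligned:.2%}")  # 0.39%
```

Under an even split, any one guess about a colleague’s decision is wrong three times out of four, and the chance of a whole team silently converging is well under one percent, which is why coordinating out loud matters.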
It doesn’t take much time to find out the things you can do that you didn’t know you could do. The whole business war game took just one day, and the crisis simulation took a few hours. You don’t even have to run a business war game or a crisis simulation to get benefits like those. (See, there’s something else you can do that you might not know you could do.)
- Broaden your view before you talk about specific action. Ask questions such as what you would do if your first choice were ruled completely out of bounds.
- Treat the word “obvious” not as validation but rather as a warning flag. Or perhaps as an opportunity flag, because you may uncover an option that your competitors haven’t seen.
By the way, it doesn’t have to be grim and serious. In my experience, management groups do their best work when they’re having fun with a challenge.
[…] Further reading: I Didn’t Know You Could Do That. […]