If you’ve seen the movie or the Saturday Night Live sketch, you’re familiar with special operations agent MacGruber. He is typically trapped in a control room with his two assistants, attempting to dismantle a ticking time bomb with ordinary objects (paper clips, candy wrappers, etc.). Inevitably, MacGruber gets distracted by a personal issue and the bomb explodes.
It’s easy to blame MacGruber for each failed mission, but this is the old view of understanding human error. Called the Bad Apple Theory, this outdated belief holds that people cause the problems. When we follow this path, we are saying the system is fine; the threat comes from the unreliability of the people on our team. The Bad Apple Theory may occasionally be true, but more often there’s a trail of resource limitations, inadequate preparation, and communication breakdowns leading up to the mishap.
Assistant: Can you defuse [the nuclear warhead]?
MacGruber: Are you kidding me? Look at all this crap! There’s like a million wires in here. I’m more like a three-wire guy!
Instead of focusing on the bad apples, research shows our time is better spent understanding the ‘new view’ of human error. The new view holds that human errors are symptoms of trouble within the system. Every system must balance the contradictory goals its participants are simultaneously pursuing, and these competing demands create friction that undermines the effectiveness of everyone involved.
Where the Bad Apple Theory leads to judgments and witch-hunts for the blameworthy individual(s), the new view is more concerned with explaining why decisions made sense to people at the time and determining how to prevent similar errors in the future.
The point in learning about human error is not to…say what people failed to do. It is to understand why they did what they did, by probing the systematic, lawful connections between their assessments and actions, and the tools, tasks and environment that surrounded them.
MacGruber may not be the poster boy for avoiding human error – he did once let a bomb detonate because he was arguing with MacGyver over a can of Pepsi. Thankfully, you don’t have a team of MacGrubers. You work with results-oriented people who want to do a good job. So let’s not jump to the conclusion that their recent blunder was laziness or some form of idiocy. Learn why they made the less-than-stellar decision and correct the system. It may not be as clean as finding the guilty party, but there will be far less collateral damage.