Learning from failure

Imagine you are a teacher and you decide to try an innovative teaching technique. However, it goes horribly wrong: the technique doesn’t work the way you expected, and numerous students complain to your supervisor. Luckily, your supervisor is sympathetic to your efforts and your job is secure.

What do you do next?

  1. Avoid innovative techniques: they’re too risky.
  2. Keep innovating, but be much more careful.
  3. Tell a few close colleagues so they can learn from your experience.
  4. Write an article for other teachers telling what went wrong, so they can learn from your experience.
  5. Invite some independent investigators to analyse what went wrong and to write a report for others to learn from.

The scenario of innovative teaching gone wrong has happened to me several times in my decades of teaching undergraduates. Each time, through no particular fault of my own, what I attempted ended in disaster. On one occasion, a course I designed worked brilliantly one year but failed miserably the next.

So what did I do? Mainly options 2 and 3: I kept innovating, more carefully, and told a few colleagues. I never imagined writing about these teaching disasters, even using a pseudonym, much less inviting others to investigate and publish a report. It would be humiliating, might invite additional unwanted scrutiny, and might even make innovation more difficult in the future.

Aviation: a learning culture

These thoughts came to mind as a result of reading Matthew Syed’s new book Black Box Thinking. The title refers to the flight recorders in commercial aircraft, called black boxes, that record data about the flight, including conversations among the pilots. When there is a crash or a near miss, these boxes are vital for learning from the failure. Rather than automatically blaming the pilots, an independent team of experts investigates accidents and incidents and publishes its findings so the whole industry can learn from what happened.

Some of the greatest improvements in aircraft safety have resulted from studies of disasters. The improvement might be redesigning instruments so confusion is less likely, or changing protocols for interactions between pilots. One important lesson from disasters is that the flight engineer and co-pilot need to be more assertive, to prevent the pilot from losing perspective during tense situations. Investigations using black-box information occasionally do end up blaming pilots, for example when they are drunk, but more often the cause of an error is not individual failure alone but a combination of human, procedural and technical factors.

Cover-up cultures: medicine and criminal justice

Syed contrasts this learning culture in aviation with a culture of cover-up in medicine. There is a high rate of failure in hospitals, and indeed medical error is responsible for a huge number of injuries and deaths. But, as the saying goes, surgeons bury their mistakes. Errors are seldom treated as opportunities for learning. In a blame culture, everyone seeks to protect their jobs and reputations, so the same sorts of errors recur.

Syed tells about some hospitals in which efforts are made to change the culture so that errors are routinely reported, without blame attached. This can quickly lead to fixing sources of error, for example by labelling drugs differently or by using checklists. In these hospitals, reported error rates increase greatly because cover-up is reduced, while the actual harm done to patients drops dramatically. Furthermore, costs from patient legal actions fall as well.

So why don’t more hospitals follow the same path? And why don’t more occupations follow the example of aviation? Syed addresses several factors: cultures of blame, excess power at the top of organisations, and belief systems resistant to testing.

In the criminal justice system, one of the most egregious errors is convicting an innocent person of a crime. Police and prosecutors sometimes decide that a particular suspect is the guilty party and ignore evidence to the contrary, or don’t bother to find any additional evidence. Miscarriages of justice are all too common, yet police, prosecutors and judges are reluctant to admit it.

In some cases, after a person has been convicted and spent years in jail, DNA evidence emerges showing the person’s innocence. Yet in quite a few cases, the police involved in the original investigation refuse to change their minds, going through incredible intellectual contortions to explain how the person they charged could actually be guilty. Syed comments, “DNA evidence is indeed strong, but not as strong as the desire to protect one’s self-esteem.” (p. 89)

Black boxes

When I heard about Black Box Thinking, I decided to buy it because I had read Matthew Syed’s previous book Bounce, about which I wrote a comment. Syed was the British table tennis champion for many years and became a media commentator. Bounce is a popularisation of work on expert performance, and is highly engaging. In Black Box Thinking, Syed has tackled a related and broader subject: how to achieve high performance in collective endeavours.

[photo: Matthew Syed]

The title had me confused at first, because in other disciplines a black box refers to a system whose internal mechanisms are hidden: only inputs and outputs can be observed. In contrast, flight recorders in aircraft, which are actually coloured orange rather than black, are sources of information.

Syed’s book might have been titled “Learning from failure,” because this is its theme throughout. He presents stories from medicine, aviation, business, criminal justice, sport and social policy, all to make the point that failures should be treated as opportunities for learning rather than as occasions for assigning blame. Individuals can heed Syed’s important message, but bringing about change in systems is another matter.

Another theme in the book is the importance of seeking marginal gains, namely small improvements. Syed tells about Formula One racing, in which tiny changes here and there led to superior performance. Another example comes from the company Unilever, which was manufacturing soap powder – laundry detergent – and wanted the powder to come out of the nozzle more consistently.

[image: Unilever’s initial nozzle]

Unilever hired a group of mathematicians, experts in fluid dynamics and high-pressure systems, to come up with an answer, but they failed. Unilever then hired a group of biologists – yes, biologists – who used a process modelled on evolution. They tried a variety of designs and determined which one worked best. Then they took the best-performing design and tested slight modifications of it. Applying this iterative process repeatedly led to a design that worked well but could never have been imagined in advance.
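
To make the process concrete, here is a minimal sketch in Python of the kind of generate-test-select loop Syed describes. It is only an illustration under invented assumptions: the numeric “design”, the scoring function and the mutation step are hypothetical stand-ins, not anything from the Unilever work, where performance was measured on physical prototypes.

    import random

    def performance(design):
        # Hypothetical stand-in for testing a prototype: closeness to an
        # arbitrary target scores higher. In the real nozzle work, performance
        # meant how consistently powder flowed through a physical nozzle.
        target = [0.2, 0.8, 0.5, 0.9]
        return -sum((d - t) ** 2 for d, t in zip(design, target))

    def mutate(design, step=0.05):
        # Produce a slightly modified variant of the current best design.
        return [d + random.uniform(-step, step) for d in design]

    best = [random.random() for _ in range(4)]        # start from a random design
    for _ in range(45):                               # 45 iterations, echoing the caption below
        variants = [mutate(best) for _ in range(10)]  # try a variety of modified designs
        candidate = max(variants, key=performance)    # determine which one works best
        if performance(candidate) > performance(best):
            best = candidate                          # keep it and repeat

    print(best)

Each pass keeps only the best-performing variant and perturbs it again, which is why the end result can work well without ever having been designed in advance.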

[image: Unilever’s final nozzle, after 45 trial-and-error iterations]

Learning from mistakes in science

Syed presents science as a model for learning from error, seeing the experimental method as a great advance over adherence to dogma. Science has certainly led to revolutionary changes in human understanding and, in tandem with technology, to dramatic improvements in human welfare, as well as to unprecedented threats to human life (nuclear weapons and climate change). However, Syed notes that science students mainly study the latest ideas, spending little or no time examining “failed” theories such as aether or astrology: “By looking only at the theories that have survived, we don’t notice the failures that made them possible.” (p. 52)

Even so, Syed’s overall view of science is an idealised image of how research is supposed to work: by continually trying to falsify hypotheses. The historian of science Thomas Kuhn argued in The Structure of Scientific Revolutions that most research is problem-solving within a framework of unquestioned assumptions called a paradigm. Rather than trying to falsify fundamental assumptions, scientists treat them as dogma. The sociologist Robert Merton proposed that science is governed by a set of norms, one of which is “organised scepticism.” However, the relevance of these norms has been challenged: Ian Mitroff, based on his studies, proposed that science is equally well described by a corresponding set of counter-norms, one of which is “organised dogmatism.”

Although science is incredibly dynamic due to theoretical innovation and experimental testing, it is also resistant to change in some ways, and can be shaped by various interests, including corporate funding, government imperatives and the self-interest of elite scientists.

Therefore, while there is much to learn from the power of the scientific method, there is also quite a bit that scientists can learn from aviation and other fields that learn systematically from error. It would be possible to examine occasions when scientists were resistant to new ideas that were later accepted as correct, for example continental drift, mad cow disease or the cause of ulcers, and spell out the lessons for researchers. But it is hard to find any analyses of these apparent collective failures that are well known to scientists. Similarly, there are many cases in which dissident scientists have had great difficulty in challenging views backed by commercial interests, for example the scandals involving the pharmaceutical drugs thalidomide and Vioxx. There is much to learn from these failures, but again the lessons, whatever they may be, have not led to any systematic changes in the way science is carried out. If anything, the subordination of science to powerful groups with vested interests is increasing, so there is little incentive to institutionalise learning from disasters.

Failure: still a dirty word

Although Syed is enthusiastic about the prospects of learning from failure, he is very aware of the obstacles. He lauds aviation for its safety culture, yet in one chapter he describes how the drive to attribute blame took over and a conscientious pilot was pilloried. Blaming seems to be the default mode in most walks of life. In politics, assigning blame has become an art form: opposition politicians and vulnerable groups are regularly blamed for society’s problems, and it is a brave politician indeed who would own up to mistakes as a tool for collective learning. In fact, political dynamics seem to involve a different form of learning: how to be ever more effective in blaming others for problems.

I regularly hear from whistleblowers in all sorts of occupations: teachers, police, public servants, corporate employees and others. In nearly every case, there is something going wrong in a workplace, a failure if you want to call it that, and hence a potential opportunity to learn. However, organisational learning seems to be the least likely thing going on. Instead, many whistleblowers are subject to reprisals, sending a message to their co-workers that speaking out about problems is career suicide. Opportunities for learning are regularly squandered. Of course, I’m seeing a one-sided perspective: in workplaces where failure does not automatically lead to blame or cover-up, there is little need for whistleblowing. When those who speak out about problems are encouraged or even rewarded, no one is likely to contact me for advice. Even so, it would seem that such workplaces are the exception rather than the rule.

The more controversial the issue, the more difficult it can be to escape blaming as a mode of operation. On issues such as abortion, climate change, fluoridation and vaccination, partisans on either side of the debate are reluctant to admit any weakness in their views because opponents will seize on it as an avenue for attack. Each side becomes defensive, never admitting error while continually seeking to expose the other side’s shortcomings, including pathologies in reasoning and links to groups with vested interests. These sorts of confrontations seem designed to prevent learning from failure. Therefore it is predictable that such debates will continue largely unchanged.

Although the obstacles to learning from failures might seem insurmountable, there is hope. Black Box Thinking is a powerful antidote to complacency, showing what is possible and identifying the key obstacles to change. The book deserves to be read and its lessons taken to heart. A few courageous readers may decide to take a risk, resist the stampede to blame and instead foster a learning culture.

“The basic proposition of this book is that we have an allergic attitude to failure. We try to avoid it, cover it up and airbrush it from our lives. We have looked at cognitive dissonance, the careful use of euphemisms, anything to divorce us from the pain we feel when we are confronted with the realisation that we have underperformed.” (p. 196)

Brian Martin
bmartin@uow.edu.au