As far back as I can remember, I’ve had an interest in how attitude and culture can influence lives, and often save them. I used to train airline crew on Human Factors and Crew Resource Management (CRM), and a few years ago I embarked on a PhD examining how behavioural, cultural, and ergonomic factors impact safety.

In his book Black Box Thinking: The Surprising Truth About Success, Matthew Syed offers a glimpse into two of the most important safety-critical industries in the world: healthcare and aviation. The two differ in psychology, culture, and the pace of institutional change, but the most profound difference lies in their approaches to failure.

Syed argues that success hinges, in powerful and often counter-intuitive ways, on how we react to failure. And failure is something we all have to endure from time to time.

In the airline industry, every commercial aircraft carries black box recorders that capture flight data and the conversations and sounds in the cockpit. If there is an accident, the boxes are recovered, the data analysed, and the cause of the accident identified, so that procedures can be changed and the same event does not happen again. In 2013, there were 36.4 million commercial flights worldwide carrying more than 3 billion passengers; 210 people died. For every one million flights on western-built jets there were 0.41 accidents, a rate of roughly one accident per 2.4 million flights.

Aviation still grapples with difficult safety decisions, and new challenges arise almost every week. In March 2015, the Germanwings crash in the French Alps brought pilot mental health into the spotlight. Industry experts accept that unforeseen contingencies may arise at any time and push the incident rate up, but they promise to keep learning from adverse events so that failures are not repeated.

In healthcare, however, things seem to be quite different. A landmark report by the US Institute of Medicine estimated that up to 98,000 Americans die each year as a result of preventable medical errors. But these statistics, while shocking, almost certainly underestimate the true scale of the problem. In 2013, a study published in the Journal of Patient Safety put the number of premature deaths associated with preventable harm at more than 400,000 per year. Categories of avoidable harm included misdiagnosis, dispensing the wrong drugs, injuring the patient during surgery, operating on the wrong part of the body, improper transfusions, falls, burns, and postoperative complications. That is more than 1,000 deaths a day, the equivalent of two jumbo jets falling out of the sky every 24 hours.

In the UK, the numbers are also alarming. A report by the National Audit Office estimated that up to 34,000 people are killed each year as a result of human error. A study of acute care in hospitals found that one in every ten patients is killed or injured as a consequence of medical error or institutional shortcomings. A French study put the figure even higher, at 14 per cent.

We wouldn’t tolerate this degree of preventable harm in any other field. Why, then, do so many mistakes happen? One reason is complexity. The World Health Organisation lists over ten thousand diseases and disorders, each of which requires a different protocol. This complexity provides ample scope for mistakes in everything from diagnosis to treatment. Another problem is scarcity of resources: doctors are often overworked, hospitals are stretched, and both frequently need more money than they have. A third issue is that doctors often have to make quick decisions; with serious cases there is rarely enough time to consider all the alternative treatments.

But there is something deeper and more subtle at work, something that has little to do with resources, and everything to do with culture. Syed argues that many of the errors committed in hospitals (and in other areas of life) have particular trajectories, subtle but predictable patterns: what accident investigators call ‘signatures’. With open reporting and honest evaluation, these errors could be spotted and reforms put in place to stop them from happening again, as happens in aviation. But, all too often, they aren’t.

The vast majority of doctors and nurses are honest people. They do not go into healthcare to deceive people – they go into the profession to heal people. Informal studies have shown that many clinicians would willingly trade a loss of income in order to improve outcomes for patients. And yet, deep in the culture, there is a tendency for evasion through a series of euphemisms, such as ‘technical error’, ‘complication’, ‘unanticipated outcome’ – each of which provides an element of truth but not the entire truth. This isn’t about avoiding litigation. In fact, evidence suggests that medical negligence claims actually go down when doctors are open and honest with their patients.

However, many doctors feel an unhelpful need to look good and to strive for perfection. The status and power hierarchies of medicine often shield them from critical feedback, and junior doctors and nurses may be reluctant to speak up. Pilots were no different until the crash of United Airlines Flight 173 in 1978: the breakdown in cockpit management and teamwork led to a preventable loss of life and became the catalyst for the CRM revolution in airline training. Nowadays, pilots are encouraged to be open and honest about their mistakes, and the industry has powerful, independent bodies designed to investigate crashes. Failure is not regarded as an indictment of the specific pilot who messes up, but as a precious learning opportunity for all pilots, all airlines and all regulators.

Of course, it’s entirely normal to have difficulty accepting our own failures, be it in a presentation or on the golf course. But it turns out that failure to learn from mistakes has been one of the greatest obstacles to progress. Unfortunately, many organisations have not yet established an open system in which errors and the patterns behind them can be discussed freely and their consequences prevented. In most cases, mistakes are noticed by colleagues and whispered about behind closed doors. Conventionally, errors are still stigmatised as deficits and associated with embarrassment, shame and fear.

Modern error management is different, and requires a different perspective. It accepts errors, and the reasons for them, as an unavoidable part of human behaviour. Those who make mistakes may still become annoyed at themselves, of course, but they need not fear ridicule or sanctions from others. Instead, they can analyse what led to the mistake and work to eliminate the cause so that the same problem does not recur.

We should consider redefining our relationship with failure, as individuals, as organisations, and as societies. Instead of denying failure or spinning it, we can do what aviation does: learn from it. Confronting failure in this way could transform not only healthcare but business, politics and much else besides.