In both aviation and medicine, mistakes can cost lives. Yet while the aviation industry has spent decades learning how to learn from failure, healthcare still often fears it.
I sat down with my dad, Chris Tomlinson – a veteran pilot with more than 25 years’ experience in the RAF and civilian aviation – to talk about what medicine might gain from “black box thinking” and just culture: a culture that turns error into insight, and insight into safer systems.
Our conversation has been adapted for readability.
Editor’s note: “Black Box Thinking” is the title of a book by the author and journalist Matthew Syed, in which he explores the power of learning from failure and strategies for creating a better culture in high-performance teams. If you haven’t read it already, it’s well worth a read! Click here to find out more.
First of All – Can You Tell Me a Bit About Yourself and Your Career?
Chris: I’ve worked in aerospace all my life – I joined the University Air Squadron in the late ’80s, spent 25 years in the RAF, and then flew as a civilian pilot on both helicopters and fixed-wing aircraft. Recently, I’ve been more involved in defence and security.

Charlotte: Today I thought we could talk about something that’s important in both medicine and aviation – the idea of “black box thinking.” What does that mean?
Chris: It’s a concept explored in Matthew Syed’s book. A key example in the book is that of Captain Martin Bromiley, whose wife, Elaine, died during a routine operation – a loss that could have been avoided. Black box thinking is about an organisation’s ability to learn from mistakes and improve, and the culture needed to do that.
Editor’s note: Elaine Bromiley died during a routine NHS operation in 2005 due to difficulties securing her airway. Despite warnings from theatre nurses, emergency measures weren’t taken, and prolonged low oxygen caused irreversible brain damage. She passed away 13 days later. An inquiry later attributed her death to failings in leadership, decision-making, communication, and teamwork among clinicians.
Can You Share a Moment from Your Flying Career Where Human Error Became a Powerful Learning Opportunity?
Chris: Powerful – that makes it a bit trickier! I can think of a few examples. I remember flying a helicopter once in poor weather when the blade tape got damaged, causing severe vibrations through the aircraft. I’d seen it before, but the two people I was flying with hadn’t.
Charlotte: For my understanding – what exactly is blade tape?
Chris: This was an old military helicopter, and each rotor blade had sacrificial tape on its leading edge to protect it. If the tape splits – as it did over two or three feet – the airflow is disturbed and it causes vibrations.
Charlotte: Ah, that makes more sense! And what happened next?
Chris: There was a sort of two-to-one situation: they wanted to return through bad weather, but I was confident I knew the problem. As captain, I had to make a choice, and I elected to go against the rest of the crew.
It was tricky, but we learned a lot from that experience. When we got back and examined the problem, it turned out I’d made the right decision – but that’s not always the case.
Even more recently, I had an issue flying a King Air single-pilot – keeping the aircraft straight on take-off. After discussing it with a colleague, it dawned on me that the correction I’d been making was actually making things worse. I could have said nothing, but talking it through helped me find the solution.
How Did the Concept of Learning from Failure Evolve in the Aviation Industry?
Chris: Interesting question. I think it evolved out of necessity. After the Second World War, as aeroplanes began transporting more people, we were losing aircraft – a classic example being the first jet airliner, the Comet, which eventually became the Nimrod in military service and flew with the RAF into the 2000s.
Charlotte: Why was that?
Chris: One of the key reasons was simply that the windows were square. Aircraft are pressurised at high altitude so that we can continue to breathe, and the fuselage would fail around the square windows, depressurising the aircraft. The solution was oval windows, which are much more secure.
Editor’s note: This is due to something called “stress concentration” at the corners of square windows, creating weak points that developed cracks over time from repeated pressurisation, eventually leading to catastrophic failure.
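For readers curious about the engineering, a rough, textbook-style estimate shows why a sharp corner is so much worse than a rounded one. This is the classical Inglis stress-concentration formula for an elliptical opening in a plate under tension; the numbers below are invented purely for illustration and are not taken from the Comet investigations.

```latex
% Peak stress at the edge of an elliptical opening in a wide plate under a
% remote tensile stress \sigma_\infty, where a is the half-width of the
% opening and \rho is the radius of curvature at the corner:
K_t = \frac{\sigma_{\max}}{\sigma_{\infty}} = 1 + 2\sqrt{\frac{a}{\rho}}

% Illustrative numbers only:
%   near-square corner: a = 150 mm, \rho = 3 mm   ->  K_t = 1 + 2\sqrt{50} \approx 15
%   well-rounded (oval) corner: \rho = 75 mm      ->  K_t = 1 + 2\sqrt{2}  \approx 3.8
```

Under this simplified model, the same cabin-pressure load produces several times the local stress at a near-square corner, which is why repeated pressurisation cycles could grow fatigue cracks there.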
Chris: So yes, evolution came from necessity. Early planes relied heavily on pilot skill, and the RAF in the 1950s and 60s was losing aircraft regularly. Air travel simply couldn’t have grown into what it is today if we were still losing aircraft – and people’s lives – at the rate we were back then.
Today, when modern aircraft crash, investigations are much more sophisticated – they look right back into the design of the system itself to identify the root cause of an error.

Editor’s note: The next section talks about human factors – the organisational, individual, environmental, and job characteristics that influence behaviour in ways that can impact safety. For example, stress and workload.
In Medicine and Aviation, We Talk About Human Factors. How Are They Addressed?
Chris: Through Crew Resource Management (CRM). It started in the late ’70s, and I noticed it when I began my career in the ’90s. We were focusing on individuals – their character, personality, risk attitudes, and how they respond under stress.
For example, if you get a warning light mid-flight, how you respond depends on your training and stress response. Aviation trains crews to manage that effectively.
Charlotte: I think it’s not just about responding to stress, though. It’s responding to mistakes, to feedback.
Chris: Yes, and debriefing is a key part of that. In the military, everything is debriefed – people are brutally honest with each other, you can’t hide. I trained as a fast jet navigator – I didn’t go on to fly fast jets – but I’ve worked alongside formations involving fast jets and rotary aircraft. They talk about “winning the debrief” – representing yourself and the team fairly, because you’ve got to learn.
CRM isn’t just for pilots – it extends to all staff interacting with the system, from ground crew to air traffic control – because every element can introduce errors that affect safety.
Charlotte: That ties into the Swiss cheese model, doesn’t it?
Chris: Yes – Professor James Reason is behind that model – his daughter, actually, was in the University Air Squadron back in the day. Think of Swiss cheese: every layer (representing elements like staff training, safety protocols, equipment design, and so on) is a barrier to error, but each layer has holes – its weaknesses. An accident occurs only when the holes in every layer line up. CRM spreads that thinking across the organisation so everyone understands their role in safety.
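Editor’s note: to make the “holes aligning” idea concrete, here is a back-of-the-envelope illustration. The numbers are invented purely for this example – real barriers are neither this reliable nor truly independent – but the arithmetic shows why layered defences work so well.

```latex
% Suppose four independent layers of defence each fail to catch an error
% 10% of the time. The chance an error slips through all four is the
% product of the individual failure probabilities:
P(\text{accident}) = p_1 \, p_2 \, p_3 \, p_4 = 0.1^4 = 0.0001 \quad (0.01\%)

% Remove one layer and the risk rises tenfold: 0.1^3 = 0.001 (0.1%).
% Let each layer miss 30% of errors instead:   0.3^4 \approx 0.008 (0.8%).
```

In practice, of course, the holes are rarely independent – fatigue, poor communication or a weak reporting culture can open them in several layers at once – which is exactly why the cultural side matters as much as the individual barriers.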
In Hospitals, We Have “Datix” Reporting for When Mistakes Are Made. The Associations With It Can Be Very Negative. Is Aviation Similar?
Chris: I think there’s a big difference. What you’re describing sounds a lot like what we call occurrence reporting.
In aviation, the goal is to encourage reporting, not to assign blame. We see it as a way to learn from mistakes – and more importantly, to stop them happening again.
If the culture’s right, you get what’s called a “just culture”. It’s the opposite of a blame culture, but it’s not “blame-free.” If someone deliberately does something wrong, like an act of sabotage, there are consequences.
But generally, a just culture encourages people to report errors without fear of losing their job. If I can admit a mistake and share it so the organisation can learn, I’m more likely to report it.
When that reporting culture works, the safety management system gets the data it needs, and that feeds a learning culture. Just culture, reporting culture, learning culture – they all come together, and that’s how you keep improving.
Culture is hard to build, hard to change, and easy to break. You need leadership to commit to it, to support reporting, and therefore learning. Any good aviation company will have that.
You Refer to a Safety Management System (SMS) in Aviation. Can You Tell Me More About What This Means?
Chris: It is a systematic approach to managing risk. There are four pillars to it. I’ll see if I can remember them all!
The first is Safety Policy. That’s written from the top of the organisation and sets the conditions for everything that follows. It’s about leadership demonstrating their commitment to safety.
Then there’s Safety Risk Management, which is essentially how you approach and manage risk. The idea is to move from being reactive to being proactive – looking ahead to identify things that could happen, as opposed to responding after something’s gone wrong.
As the system matures, it becomes more predictive – able to anticipate risks before they even develop.
The third pillar is Safety Assurance. That’s essentially about auditing your system – checking how you’re processing reports to make sure that real learning is happening.
And then finally, there’s communication, or Safety Promotion. That’s about making sure people are aware of what’s going on: hearing what the system is learning from errors and mistakes, but also hearing about what’s being done well.
All of that contributes to the overall safety picture.
So, What Can Health Professionals Learn From Pilots About Teamwork, Communication, and Hierarchy – Especially in High-Pressure Environments?
Chris: There used to be an old-fashioned model of medicine – hopefully not so much now – where the consultant was almost seen as “God”. Whatever they said went, and no one questioned it. The same thing used to happen in aviation.
I remember an old video in early management training showing a three-man flight deck: a captain, co-pilot, and flight engineer. The captain and engineer were joking, saying things like, “What’s the difference between a co-pilot and a duck? Ducks can fly.” It sounds funny, but it wasn’t – because that attitude meant the co-pilot was ignored when he raised a problem, and the aircraft nearly crashed.
So what can medicine learn?
There’s always a hierarchy – there has to be. There’s a captain, and sometimes you have to make decisions that aren’t popular, as I’ve had to do before. Leadership often means making choices you don’t necessarily want to make.
But the goal in aviation teamwork is to reach solutions that everyone understands. We have structured ways of making decisions, but they’re designed so that everyone’s involved and bought in. It’s not about rigid consensus – it’s about shared understanding.
Even when you’re flying solo, you’re not really on your own. Take the Red Arrows – one of the highest-performing teams in the world. Each pilot flies their own aircraft, but together they move as one. That’s incredible teamwork. Everything’s rehearsed, everything’s thought through – so when something goes wrong, they’re ready for it.
That’s what high-performing teams do – they prepare, they learn, and they adapt – the same should apply in medicine. And the best systems don’t punish genuine mistakes; they learn from them.
Charlotte: Going back to what you said about hierarchy – the “consultant-as-God” idea – I do think it’s changing, but I think it still depends on the team.
My first job was on the stroke team, and it was brilliant. Everyone’s opinion was valued. You could speak up – even as an FY1 doctor – and you were listened to.
Chris: I remember you talking about that at the time. I didn’t think it was necessarily stroke medicine that made you tick – it was being part of a high-performing team. When you’re in a team that does what it sets out to do, and does it well, that’s incredibly rewarding.
So, In All of Your Years as a Pilot, What Is the Most Important Lesson You’ve Learned About Error and Humility?
Chris: There are a couple of things I could pick out. One big one is around what we call third-party reporting – reporting “near misses”, not just actual incidents.
For example, I once missed an approach while flying solo to Ireland because I’d overlooked something. No one else would have known, but I reported it anyway – because it’s important to a strong safety culture.
If I had to sum it up, I’d say it comes down to the old adage: you’re only as good as your last trip. Even the best pilots – and I wouldn’t claim to be the best – make mistakes. That’s why we work in crew resource management teams: to support each other and avoid errors.
Charlotte: And recognising your limits, being able to say when you need help, that’s important too.
Chris: Exactly. When people stop admitting mistakes, complacency creeps in. You start thinking you know it all – and that’s when things go wrong. Even in aviation, where people are usually vigilant, things still go wrong – simply because humans are fallible.
If Medicine Were to Truly Embrace Black Box Thinking, What Kind of Transformation Could You Imagine – for Both Patients and Practitioners?
Chris: If you take the example of aviation – where safety management systems have led to fewer aircraft accidents – and apply a similar cultural change to medicine, I’d say there are two main benefits.
The first is better patient outcomes; the second is that the NHS would become a better place to work. High-performing teams would become the norm.
If people are happier, comfortable in their roles, and feel they’re part of a learning organisation committed to improvement, that alone drives better performance.
A commitment to a just culture and to continuous learning will benefit NHS staff in their careers, and as a result, improve outcomes for patients.
To Bring It All Together – Thinking Back to Black Box Thinking by Matthew Syed – What Was the Key Takeaway for You?
Chris: The key example in the book came from a tragic incident where a skilled team became so focused on fixing a problem that they lost sight of the bigger picture – the patient was dying.
The lesson is that we need the ability to step back and look at the whole situation, to understand what we need to achieve.
The individuals involved in that case didn’t go to work intending for someone to lose their life on their watch. I’m sure they would have changed things if they could. That’s why it’s so important to have a safety management system committed to learning and understanding human factors. With that approach, hopefully tragedies like that can be avoided.
In Summary
So, at the heart of it, “black box thinking” isn’t about blame or perfection. It’s about learning, humility, and the courage to keep improving. If healthcare can adopt that mindset – and I do believe we are starting to work towards it – we might not only save more lives, but also create teams that feel safe, valued, and proud of the work they do.
What would that kind of culture mean for you – as a patient, a clinician, or a leader? I’d love to hear your thoughts below.
