Did you know that half of all surgical complications are preventable? In the US alone, this means that surgeons cause between 50,000 and 200,000 preventable deaths each year.
Surgeons are, almost literally, getting away with murder.
Why do we let them? Engineers who see their designs catastrophically fail often lose their engineering license, even when they’re found not guilty in criminal proceedings. If surgeons were treated like engineers, many of them wouldn’t be operating anymore.
Indeed, the death rate in surgery is almost unique among regulated professions. One person has died in a commercial aviation accident in the US in the last nine years. Structural-engineering-related accidents killed at most 251 people in the US in 2016[1], and only approximately 4% of residential structure failures in the US occur due to deficiencies in design[2].
It’s not that interactions with buildings or planes are any less common than surgeries, or that they’re inherently that much safer. In many parts of the world, death due to accidents in aviation or due to structural failure is very, very common.
It isn’t accidental that Canada and America no longer see many plane crashes or structural collapses. Both professions have been rocked by events that made them realize they needed to improve their safety records.
The licensing of professional engineers and the Iron Ring ceremony in Canada for engineering graduates came after two successive bridge collapses killed 88 workers[3]. The aircraft industry was shaken out of its complacency after the Tenerife disaster, where a miscommunication caused two planes to collide on a runway, killing 583 people.
In both cases, the safety improvements that followed were responsive and deliberate: direct reactions to a visible catastrophe.
These aren’t the only events that caused changes. The D. B. Cooper hijacking led to the first organized airport security in the US. The Therac-25 radiation overdoses led to the first set of guidelines specifically for software that ran on medical devices. The sinking of the Titanic led to a complete overhaul of requirements for lifeboats and radios for oceangoing vessels. The crash of TAA-538 led to the first mandatory cockpit voice recorders.
All of these disasters combine two things that are rarely seen when surgeries go wrong. First, they involved many people. The more people die at once, the more shocking the event and therefore the more likely it is to become widely known. Because most operations involve one or two patients, it is much rarer for problems in them to make the news[4].
Second, they highlight a specific flaw in the participants, procedures, or systems that fail. Retrospectives could clearly point to a factor and say: “this did it”[5]. It is much harder to do this sort of retrospective on a person and get such a clear answer. It may be true that “blood loss” definitely caused a surgical death, but it’s much harder to tell if that’s the fault of any particular surgeon, or just a natural consequence of poking new holes in a human body. Both explanations feel plausible, so in most cases neither can be wholly accepted.
(I also think there is a third driver here, which is something like “cheapness of death”. I would predict that safety regulation is more common in places where people expect long lives, because death feels more avoidable there. This explains why planes and structures are safer in North America and western Europe, but doesn’t distinguish surgery from other fields in these countries.)
Not every form of engineering or transportation fulfills both of these criteria. Regulation and training have made flying on a commercial flight many, many times safer than riding in a car, while private flights lag behind and show little safety advantage over other forms of transport. When a private plane crashes, few people die. If they’re important (and many people who fly privately are), you might hear about it, but it will quickly fade from the news. These stories don’t have staying power and rarely generate outrage, so there’s never much pressure for improvement.
The best alternative to this model that I can think of is one that focuses on the “danger differential” in a field and predicts that fields with high danger differentials see more and more regulation until the danger differential is largely gone. The danger differential is the difference between how risky a field currently is vs. how risky it could be with near-optimal safety culture. A high danger differential isn’t necessarily correlated with inherent risk in a field, although riskier fields will, by their nature, have the possibility of larger ones. Here are three examples:
- Commercial air travel in developed countries currently has a very low danger differential. Before a woman was killed by engine debris earlier this year, commercial aviation in the US had gone nine years without a single fatality.
- BASE jumping is almost suicidally dangerous and probably could be made only incredibly dangerous if it had a better safety culture. Unfortunately, the illegal nature of the sport and the fact that experienced jumpers die so often make this hard to achieve and lead to a fairly large danger differential. That said, even with an optimal safety culture, BASE jumping would still see many fatalities and still probably be illegal.
- Surgery is fairly dangerous and according to surgeon Atul Gawande, could be much, much safer. Proper adherence to surgical checklists alone could cut adverse events by almost 50%. This means that surgery has a much higher danger differential than air travel.
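The danger differential is just a subtraction, but a toy calculation can make the comparison above concrete. All of the rates below are made-up placeholder numbers chosen only to echo the three examples, not real fatality statistics:

```python
# Toy sketch of the "danger differential": the gap between a field's current
# risk and its risk under a near-optimal safety culture.
# The rates are illustrative placeholders (per 100,000 exposures), NOT data.

fields = {
    # field: (current risk, risk under near-optimal safety culture)
    "commercial aviation": (0.02, 0.01),   # both tiny: small differential
    "BASE jumping": (400.0, 100.0),        # still lethal even when optimized
    "surgery": (80.0, 40.0),               # e.g. if checklists halved deaths
}

def danger_differential(current, optimal):
    """Absolute gap between current risk and achievable risk."""
    return current - optimal

for name, (current, optimal) in fields.items():
    print(f"{name}: differential = {danger_differential(current, optimal)}")
```

Under these assumed numbers, surgery’s differential dwarfs commercial aviation’s even though BASE jumping is far more dangerous in absolute terms, which is exactly the distinction the model turns on.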
I think the danger differential model doesn’t hold much water. First, if it were true, we’d expect to see something being done about surgery. Almost a decade after checklists were found to drive such large improvements, there hasn’t been any concerted government action.
Second, this doesn’t match historical accounts of how airlines were regulated into safety. At the dawn of the aviation age, pilots begged for safety standards (which could have reduced crashes a staggering sixtyfold[6]). Instead of stepping in to regulate things, the government dragged its feet. Some of the lifesaving innovations pioneered in those early days only became standard after later and larger crashes – crashes involving hundreds of members of the public, not just pilots.
While this only deals with external regulation, I strongly suspect that fear for the reputation of a profession (which could be driven by these same two factors) affects internal calls for reform as well. Canadian engineers knew that they had to do something after the Quebec bridge collapse created common knowledge that safety standards weren’t good enough. Pilots were put in a similar position with some of the better publicized mishaps. Perhaps surgeons have faced no successful internal campaign for reform so far because the public is not yet aware of the dangers of surgery to the point where it could put surgeons’ livelihoods at risk or hurt them socially.
I wonder if it’s possible to get a profession running scared about their reputation to the point that they improve their safety, even if there aren’t any of the events that seem to drive regulation. Maybe someone like Atul Gawande, who seems determined to make a very big and very public stink about safety in surgery, is the answer here. Perhaps having surgery’s terrible safety record plastered throughout the New Yorker will convince surgeons that they need to start doing better[7].
If not, they’ll continue to get away with murder.
[1] From the CDC’s truly excellent Cause of Death search function, using codes V81.7 & V82.7 (derailment with no collision), W13 (falling out of building), W23 (caught or crushed between objects), and W35 (explosion of boiler) at home, other, or unknown. I read through several hundred causes of death, some alarmingly unlikely, and these were the only ones that seemed relevant. This estimate seems higher than the one surgeon Atul Gawande gave in The Checklist Manifesto, so I’m confident it isn’t too low.

[2] Furthermore, from 1989 to 2000, none of the observed collapses were due to flaws in the engineers’ designs. Instead, they were largely caused by weather, collisions, poor maintenance, and errors during construction.

[3] Claims that the rings are made from the collapsed bridge are false, but difficult to dispel. They’re actually just boring stainless steel, except in Toronto, where they’re still made from iron (but not iron from the bridge).

[4] There may also be an inherent privateness to surgical deaths that keeps them out of the news. Someone dying in surgery, absent obvious malpractice, doesn’t feel like public information in the way that car crashes, plane crashes, and structural failures do.

[5] It is true that it was never discovered why TAA-538 crashed. But black box technology would have given answers had it been in use. That it wasn’t in use was clearly a systems failure, even though the initial failure is indeterminate. This jibes with my model, because regulation addressed the clear failure, not the indeterminate one.

[6] This is the ratio between the average miles flown before a crash for the (very safe) post office planes and the (very dangerous) privately owned planes. Many in the airline industry wanted the government to mandate the same safety standards on private planes as they mandated on their airmail planes.

[7] I should mention that I have been very lucky to have been in the hands of a number of very competent and professional surgeons over the years. That said, I’m probably going to ask any future surgeon I’m assigned if they follow safety checklists – and ask for someone else to perform the procedure if they don’t.