
A Healthcare Frame for the Boeing Crashes

The recent crashes of two Boeing 737 MAX aircraft raise important questions for patients, physicians, and policymakers. Should aviation safety remain the gold standard that has been so influential in attempts to improve patient safety? Will doctors soon face the same problems as the pilots of those doomed planes, fighting with automated safety systems that threaten their patients? What questions does the Federal Aviation Administration (FAA) certification of the Boeing safety systems raise about the evolving approaches to medical device safety promoted by the Food and Drug Administration (FDA)?

Preliminary investigations into the tragic loss of life from last October’s Lion Air Flight 610 departing Jakarta and this month’s Ethiopian Airlines Flight 302 departing Addis Ababa have led to the grounding of 737-8 and 737-9 aircraft by the FAA. More broadly, those accidents may call into question the status of aviation safety as the gold standard of industrial safety, a standard that has proved hugely influential on health care safety.

Medical errors are the third leading cause of death in the U.S., behind heart disease and cancer, and are responsible for more than 250,000 deaths annually. In the years since To Err Is Human: Building a Safer Health System, many of the strategies adopted to combat medical errors have been influenced by aircraft safety. The main reason for adopting this exemplar was the dramatic reduction in the number of accidents involving commercial jets, particularly since the creation of the Commercial Aviation Safety Team in 1995.

The predominant aircraft safety model, and the one subsequently embraced by health care, is referred to as a systems approach. Much of the work in this area was pioneered by psychologist James Reason, whose “Swiss Cheese” model describes how accidents occur when a series of individual errors slips through the aligned holes in a flawed system’s layered defenses. Reason’s conclusion was that a systems approach to safety was necessary to build defenses against these human frailties. His work and related research lie behind the adoption of surgical safety checklists, continual learning from near misses, and other practices such as team-based safety, the use of simulators during medical education, and the promotion of distraction-free environments. If the Boeing crashes seriously implicate the systems approach itself, then patient safety experts may second-guess their own reliance on it.

If the early reports are to be believed, the Boeing flight crews essentially lost a fight with an automated system when the stall-prevention feature, the Maneuvering Characteristics Augmentation System (MCAS), took over. Subsequent reporting suggested that two enhancements to the safety system, an angle-of-attack indicator and a “disagree” light warning the pilots if the sensors were in conflict, were extra-charge options. Some of those features will be included going forward as Boeing releases updated software that queries multiple sensors and disables MCAS if they meaningfully disagree. Health care has not yet reached such levels of automation but increasingly seems to be nearing a tipping point. For example, while the current crop of surgical “robots” are merely examples of robot-assisted surgery, autonomous (albeit supervised) enhancements are on the way, along with artificial intelligence-based diagnosis.
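To make the reported software change concrete, here is a minimal sketch of a sensor cross-check of the kind described: the automation only acts when redundant sensors broadly agree, and otherwise defers to the crew. The threshold, function names, and values are illustrative assumptions, not Boeing’s actual implementation.

```python
# Hypothetical illustration of a sensor-disagreement check.
# The threshold and names below are assumptions for illustration only.

DISAGREE_THRESHOLD_DEG = 5.5  # assumed disagreement limit, in degrees


def mcas_should_engage(left_aoa_deg: float, right_aoa_deg: float,
                       stall_risk_detected: bool) -> bool:
    """Engage automated nose-down trim only if both angle-of-attack
    sensors broadly agree and a stall risk is indicated."""
    sensors_disagree = abs(left_aoa_deg - right_aoa_deg) > DISAGREE_THRESHOLD_DEG
    if sensors_disagree:
        # Cross-check failed: disable the automation and leave control
        # with the flight crew.
        return False
    return stall_risk_detected


# Example: conflicting readings disable the automation even when a
# stall risk is flagged.
print(mcas_should_engage(24.0, 4.5, stall_risk_detected=True))  # False
```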

However, in a sense health care has been here before, following the broad implementation of clinical decision support (CDS) systems. These rule-based systems send an alert to a physician who, for example, prescribes a drug that may dangerously interact with another drug the patient is taking, or who recommends a course of treatment at odds with generally accepted practice. Doctors have struggled with alert fatigue and with questions about the ultimate decider: doctor or machine? CDS systems will soon be replaced by AI systems that increasingly make those decisions on their own, potentially placing physicians in situations analogous to those tragically experienced by the 737 pilots.
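For readers unfamiliar with how such rule-based alerting works, the sketch below shows the basic pattern: a lookup of the newly prescribed drug against the patient’s current medication list. The drug pairs, messages, and function names are placeholders for illustration, not clinical guidance or any vendor’s actual rule set.

```python
# Minimal sketch of a rule-based CDS drug-interaction alert.
# The rules below are illustrative placeholders, not clinical guidance.

INTERACTION_RULES = {
    frozenset({"warfarin", "aspirin"}): "Increased bleeding risk.",
    frozenset({"simvastatin", "clarithromycin"}): "Risk of myopathy.",
}


def check_new_prescription(new_drug: str, current_meds: list[str]) -> list[str]:
    """Return alert messages for known interactions between the newly
    prescribed drug and the patient's current medication list."""
    alerts = []
    for med in current_meds:
        rule = INTERACTION_RULES.get(frozenset({new_drug.lower(), med.lower()}))
        if rule:
            alerts.append(f"Alert: {new_drug} + {med}: {rule}")
    return alerts


# Example: prescribing aspirin to a patient already on warfarin fires an alert.
print(check_new_prescription("aspirin", ["warfarin", "metformin"]))
```

The physician still decides whether to heed or override the alert, which is precisely the human-versus-automation tension the paragraph above describes.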

Finally, some news reports have uncovered another wrinkle in how the crashed aircraft’s safety systems were certified. Apparently the underfunded FAA increasingly delegated much of the safety certification work to Boeing itself, creating possible internal conflicts as 737 MAX development fell behind schedule. This shift in compliance practice goes beyond regulatory capture and the “revolving door” model of employees moving in and out of agencies and the industries they regulate, although those practices have also drawn criticism from the Department of Transportation’s Office of the Inspector General. The FAA, however, has insisted that “This is not self-certification; the FAA retains strict oversight authority.”

The FDA has been under similar pressure, criticized by industry and others for taking too long to approve innovative drugs and devices while facing backlash from safety advocates for not closing approval loopholes. In partial response, outgoing FDA Commissioner Scott Gottlieb piloted a Digital Health Software Precertification (Pre-Cert) Program. Essentially, Pre-Cert exchanges a safety examination of the device itself for certification of the manufacturer, based on its commitment to principles such as patient safety and a proactive culture. The pilot program involves technology companies developing healthcare products, such as Apple and Alphabet’s Verily.

Whether the subjects are healthcare technologies, medical devices, or other products such as autonomous vehicles, the trendlines are clear: safety-related decisions will increasingly be transferred from human to machine, while regulators will come under growing pressure to modify their own safety approval systems lest the market be denied product innovations. The tragic Boeing crashes should lead healthcare safety organizations, regulators, and manufacturers to reevaluate their values and practices.

Nicolas P. Terry

Nicolas Terry is the Hall Render Professor of Law at Indiana University McKinney School of Law, where he serves as the Executive Director of the Hall Center for Law and Health and teaches various healthcare and health policy courses. His recent scholarship has dealt with health privacy, mobile health, the Internet of Things, Big Data, AI, and the opioid overdose epidemic. He serves on IU’s Grand Challenges Scientific Leadership Team, working on the addictions crisis, and is the PI on addictions law and policy Grand Challenge grants. His podcast is at TWIHL.com, and he is @nicolasterry on Twitter.
