“At the point of weapons release, I trusted my training,” Dana added. “I had to account for the wind, because that affects where the ordnance drops, and with friendlies being that close, I wanted to take responsibility for everything. This is my weapon from my jet and the effects are on me.”

—Captain William Dana, describing the danger close airstrikes he executed in defense of friendly troops in Syria, for which he was awarded the Distinguished Flying Cross [16]

Demanding Caution and Precision

It is not easy to explain to a physician how a pill she prescribes may be similar to a bomb a fighter pilot drops from a military jet. Bombs are designed for purposes that are antithetical to the oaths physicians take. Still, in my two careers, first as an Air Force pilot and flight safety officer and now as a health informatics researcher, I have been struck by the parallels. I believe the opioid crisis will continue until there are systems in place to provide prescribers with the same levels of feedback, situational awareness, and accountability in opioid prescribing that exist in the world of combat aviation.

We can build those systems. It is our duty to do so.

The business of what the Air Force calls Close Air Support (CAS) often comes down to a concept akin to “First do no harm … to friendly troops.” In CAS, pilots engage in risky operations and use potentially dangerous tools in order to minimize the threat to those they have a duty to watch over. Their work is governed by empirically derived reference points that help them manage risk when they put munitions close to friendly ground troops in order to protect them. Prescribing an opioid to a patient in pain is not all that different in spirit, though we as a nation lack much of the structure and feedback required to parallel a CAS pilot’s precision. Few if any prescribers know when they have overprescribed opioids. Few if any know where extra pills are diverted to and who might be harmed by them. In fact, a recent study shows wide variation in opioid prescribing to orthopedic surgery patients, with unused opioids left over in 61% of cases [17].

A perfect world would have no need for bombs or opioids. The world we live in is imperfect; we face its imperfections with tools that are sometimes necessary, that have the potential to cause devastating damage, and that demand the highest levels of caution and vigilance. Over more than a hundred years, combat aviation has developed procedures designed to prevent the tragedy of “friendly fire,” incidents in which a munition dropped with the intent to protect ultimately results in the death of a comrade. In facing the opioid crisis, we cannot afford to ignore lessons learned on the battlefield, or anywhere else.

Outlining the Limits of Safety

“Until clinicians stop prescribing opioids far in excess of clinical need, this crisis will continue unabated.”

—Robert M. Califf, MD, Janet Woodcock, MD, and Stephen Ostroff, MD, formerly of the US Food and Drug Administration [3]

In recent decades, clinicians tasked with managing patients’ pain have been placed in an unfair position. Opioid manufacturers spent years on “misleading” marketing that hid the dangers of opioid dependence and addiction [7], and 1 in 12 opioid prescribers received payments from those manufacturers between 2013 and 2015 [6]. In addition, the campaign advocating that pain be assessed as the “fifth vital sign” ultimately tied patients’ satisfaction with pain control to physicians’ reimbursements for care [9]. Now, perceptions have changed, opioids are understood to be dangerous, and physicians and other prescribers are being blamed in part for one of the worst public health crises in US history. Our clinicians deserve better, and we have the tools to give it to them.

Unfortunately, we are still far from agreement about how to do so. Most US health care providers do not work within systems that seek to standardize opioid prescribing for postsurgical pain management, and therefore excess opioid pills represent an onramp to the highway of illicit opioid use and opioid use disorder for many Americans [11, 12]. Patients tend not to store unused pills securely or dispose of them safely [1, 17], facilitating diversion and blunting the impact of efforts to tailor prescribing to an individual patient’s risk of substance misuse. A 2014 study found that 75% of heroin users interviewed began their opioid misuse with a prescription opioid [4]. Research involving the use of patient-reported outcomes to establish right-size opioid prescribing guidelines for postsurgical care has been undertaken only in the last few years and represents a critical step toward the kind of precision that this crisis demands [2], but efforts to scale the resulting guidelines are lagging badly. There are units in the US military whose work parallels these efforts, centering entirely on measuring the danger associated with munitions at a set distance under a given set of conditions. Those researchers and the guidelines they produce make it possible to understand the limits of safety involved in dropping a munition, facilitating remarkable success in that field.

Over the course of his 3 hours in the air over Syria in the spring of 2017, Capt. William Dana made hundreds of life-or-death decisions while focusing on protecting friendly troops facing a potentially overwhelming enemy force. He made those decisions while sitting atop a rocket-powered ejection seat and operating a 25,000-pound, single-seat aircraft designed to absorb significant amounts of ground fire. His mission was to defend friendly ground troops from insurgents who had broken through their defensive perimeter and threatened to annihilate them. In the end, Dana was forced to strike an enemy position within 30 m of the troops he needed to defend, creating a serious risk that he would unintentionally hurt or kill them. Such a strike is referred to as “danger close,” since it falls within a carefully established radius around a friendly position. Danger close strikes require special permission, careful coordination, and near-perfect execution. Capt. Dana was successful because of his courage, his discipline, and his training leading up to those moments.

There is a broad system that makes sorties like Capt. Dana’s possible. His strikes were directed by a joint terminal attack controller (JTAC) who had undergone extensive training to speak the language of both pilots in the air and troops on the ground. The JTAC calls for air strikes in a standardized nine-line format that acts as both checklist and game plan. The nine-line communicates the units involved, their locations, their target, the strike plan, the threats, and the follow-up plan. It does so in a way that is practiced and familiar to any CAS pilot. Capt. Dana relied on hundreds of hours of practice in the aircraft but even more time spent on the ground in the kind of stringent feedback debriefings that form the core of military aviation. Each flight is carefully planned, discussed beforehand, and critically debriefed after landing to extract every possible lesson and path toward improvement. While in the air, Capt. Dana sat behind an instrument panel that has been honed over the last century to give pilots the most glanceable feedback possible. Each of these elements represents a concept that health care can borrow.
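
To make the idea of a standardized brief concrete, here is a simplified sketch of the nine lines rendered as a structured record. It is illustrative only: the field names follow the standard brief, but real nine-lines carry additional remarks and restrictions, including the ground commander’s explicit acceptance of risk for a danger close strike.

```python
# Illustrative only: the nine-line CAS brief as a structured record.
# Field comments paraphrase the standard brief; real briefs add remarks,
# restrictions, and, for danger close strikes, the ground commander's
# acceptance of risk.
from dataclasses import dataclass


@dataclass
class NineLineBrief:
    ip_or_bp: str            # 1. Initial point or battle position
    heading_and_offset: str  # 2. Heading to the target (and offset, if any)
    distance: str            # 3. Distance from the initial point to the target
    target_elevation: str    # 4. Target elevation
    target_description: str  # 5. What the target is
    target_location: str     # 6. Coordinates of the target
    mark_type: str           # 7. Type of mark or terminal guidance
    friendlies: str          # 8. Location of friendly troops
    egress: str              # 9. Egress direction after the attack
```

The value lies less in the data structure than in the discipline around it: every JTAC and every CAS pilot practices the same nine fields in the same order, which is what lets the format serve as both checklist and game plan.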

American health care is full of smart, dedicated people surrounded by innovative technology, but it lacks many of the standardized communication and feedback systems that enabled Capt. Dana’s success. Opioid prescribing can and should be executed with the same level of precision and discipline with which a danger close air strike is approached. The Air Force has made significant mistakes in CAS and caused unnecessary deaths, but those incidents demand and receive extensive investigation. That’s important. When opioid prescribing goes wrong, similar effort should be expended to capture relevant data, understand root causes, and change procedures to minimize the risk of recurrence. Health care has been doing this for years with medical error more broadly [10]; many health systems have now instituted some form of the aviation crash investigation techniques I once employed.

We should reconceive the overprescribing of opioids as a crisis of vast iatrogenic harm, a medical error committed not once but repeatedly over decades, not to one patient but to hundreds of thousands. The solutions will come from detecting and focusing on individual instances of harm and coming to a deeper understanding of their causes. That entails understanding what drives variance in prescribing among surgeons and other caregivers performing comparable procedures on comparable patients, as well as determining which prescribing decisions result in the best patient outcomes.

Nudging Toward a Future of Shared Ideas

There have already been success stories in taking insights from the cockpit into health care settings. Atul Gawande, MD, MPH, has helped to pioneer and popularize the use of aviation-inspired checklists in surgical suites through a collaboration with the World Health Organization’s World Alliance for Patient Safety (WAPS) [8]. The WAPS checklist closely resembles the safety checks and approach-to-landing briefings that were integral to my work as a military pilot. Checklists have faced resistance from physicians, as they did among pilots in the years leading up to World War II, but their efficacy in medicine has been shown [19].

Such checklists in aviation arose out of a system that is deeply concerned with measuring and broadcasting errors and standardizing the means of avoiding them. For that reason, it is often said that much of a military aircraft operator’s manual is written in blood. These publications are living documents, amended with notes, warnings, and cautions after dangerous and deadly mishaps occur. I may have been among the last Air Force pilot training students to be issued a 3-ft stack of paper publications that required constant upkeep: page replacements, penciled modifications, and careful annotation of each posted change. The constant effort required to keep these publications current drove home to me how dynamic and evolving the state of our business was.

Later in my career, I was responsible for some of those rewrites, changing procedures and regulations as a result of my findings as a crash investigator. Procedures that had been part of standard operations for years or decades could suddenly be forbidden as our community responded to a crash and attempted to prevent another. These amendments are made possible by a robust system of investigation that invests a great deal of money, time, and talent in establishing the root causes of mishaps in order to develop new standards that prevent recurrence. Aviation accidents often result in attention-getting explosions and an obvious trail of wreckage to investigate. Tragedy in health care can be far more difficult to recognize and to measure, but we know that more than 350,000 Americans died of opioid overdose from 1999 to 2016 [18]. We are facing something far more deadly than anything ever seen in the realm of aviation safety. When we tally the corpses, the CDC’s most recent annual estimate of roughly 42,000 opioid overdose deaths works out to well over 100 Americans lost each day, roughly the passenger load of a Boeing 737 [18].

Avoiding tragedy, be it in the air or at the bedside, requires that decision-makers be properly informed. The instrument panel pilots fly behind is a carefully designed data visualization intended to inform life-or-death decisions. The panel facilitates one of the most critical skills that military pilots develop: the capacity to right the aircraft should the pilot fly into clouds and become disoriented. This is called “unusual attitude recovery,” and there’s a lot that American health care can learn from it. In training, student pilots are blindfolded and flown through unusual maneuvers that cause their inner ears to lie to them, making them feel that the airplane is, for example, right-side up and climbing when it is in fact upside down and headed toward the ground. The mantra drilled into new pilots is critical: “RECOGNIZE, CONFIRM, RECOVER.” The procedure demands that a pilot recognize that her senses disagree with what an instrument on her panel tells her, confirm that the instrument is correct by cross-referencing another instrument, and commit to that reality as she recovers the aircraft to a normal attitude. Airplane instrument panels make this possible for students with less than 20 hours of flight training. Prescribers making critical decisions about opioids should have a similar tool, but few do.

My team proposed such an instrument panel for opioid prescribers at the US Department of Health and Human Services’ Opioid Code-a-thon in December 2017. We found that clinicians have no means of measuring their opioid-prescribing behavior relative to that of their peers. Those we spoke to were interested in seeing how their dosages and volumes stacked up against others treating similar patients for similar problems. We prototyped an application, the Opioid Prescriber Awareness Tool (OPAT), that gives prescribers a way to compare their opioid prescribing patterns with those of their peers, with additional functionality to inform referral decisions [20]. The concept was well received. Physicians I have interviewed get excited about the prospect of a feedback loop and empirical targets for acute pain medication. Such a tool, when developed with patient-reported opioid usage, could be as simple as a default prescription order set for a given procedure. These defaults could anchor prescribing to an empirically derived volume while doing nothing to compromise the freedom and autonomy of clinicians. Such small interventions, known as “nudges” in the field of behavioral economics [14], have been shown to reduce the prescribing of statins [15], antibiotics [13], and opioids [5].
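
As a rough illustration of the computation behind that kind of peer comparison, the Python sketch below estimates where a prescriber falls among peers for a given procedure, measured in morphine milligram equivalents (MME). The record fields and function are assumptions made for illustration; they are not drawn from the OPAT prototype itself.

```python
# Hypothetical sketch of peer-relative prescribing feedback; not the OPAT
# implementation. Assumes each record carries a prescriber ID, a procedure
# label, and the total MME of the prescription.
from dataclasses import dataclass
from statistics import mean
from typing import Dict, List


@dataclass
class OpioidRx:
    prescriber_id: str
    procedure: str    # hypothetical procedure label, e.g., "knee_arthroscopy"
    total_mme: float  # total morphine milligram equivalents prescribed


def peer_percentile(prescriber_id: str, procedure: str, rxs: List[OpioidRx]) -> float:
    """Return a prescriber's mean MME for a procedure as a percentile of peer means."""
    by_prescriber: Dict[str, List[float]] = {}
    for rx in rxs:
        if rx.procedure == procedure:
            by_prescriber.setdefault(rx.prescriber_id, []).append(rx.total_mme)
    means = {p: mean(v) for p, v in by_prescriber.items()}
    mine = means[prescriber_id]
    return 100.0 * sum(1 for m in means.values() if m < mine) / len(means)
```

A surgeon who learns that she sits at the 92nd percentile for a procedure she performs routinely has exactly the kind of glanceable, peer-relative feedback an instrument panel provides.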

The time has come to borrow everything we can from every possible source in the effort to reduce the ongoing human cost of the opioid crisis. American health care has recognized the problem we are in. It is up to health system leaders to pull together the teams that will build the tools prescribers need to confirm their prescribing against empirical data and begin the process of recovering from an era defined by excessive opioid prescribing.

This is a problem that calls for cross-functional teams of clinicians, administrators, patients, and health informatics experts. We owe our prescribers the kind of intuitive, reliable feedback that pilots get with a glance at their instrument panels. We owe health systems a measure of how their aggregate prescribing contributes to the rise or fall of opioid misuse in the communities they serve. We owe patients options for managing acute pain that keep them safe. We should know what opioid dosages constitute “danger close” prescriptions, and we should set default prescribing levels in electronic health records that anchor the prescribing decision to data on actual patient opioid usage. As a nation, we need a system in which every patient’s opioid prescribing plan is “owned” by a single accountable individual. That prescriber’s health system needs a leader who is accountable for meaningful reductions in opioid overprescribing and who provides the community with transparent feedback on those efforts. Data, information, and feedback must be made to flow in a way that befits the level of danger associated with these decisions.
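
A minimal sketch of what an anchored default and a “danger close” flag might look like follows, again in Python. The 75th-percentile anchor and the twofold review threshold are invented for illustration; the point is only that both numbers would come from patient-reported usage rather than habit.

```python
# Hypothetical sketch of an EHR default anchored to patient-reported opioid
# usage, with a "danger close" style review flag. The percentile and factor
# are illustrative, not clinical guidance.
from statistics import quantiles
from typing import List


def default_pill_count(reported_use: List[int]) -> int:
    """Anchor the default order set to the 75th percentile of reported consumption."""
    _, _, q3 = quantiles(reported_use, n=4)  # quartile cut points of reported use
    return round(q3)


def needs_danger_close_review(proposed_pills: int, reported_use: List[int],
                              factor: float = 2.0) -> bool:
    """Flag orders that exceed a multiple of the anchored default for explicit justification."""
    return proposed_pills > factor * default_pill_count(reported_use)
```

If most patients report using 8 to 12 pills after a given procedure, the default lands near 12, and a 30-pill order would trigger the kind of explicit, documented acceptance of risk that a danger close strike demands.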

Military aviation is hardly perfect, but it has developed the kind of standardization, feedback loops, and training procedures that facilitate extreme precision with dangerous munitions in close proximity to troops who must be protected. It is time to borrow from the concepts that make such a thing possible in the effort to eliminate the catastrophic effects of opioid overprescribing. We have the human talent we need to solve this crisis. We have the concepts it takes to get it done. It’s up to us to pull it all together.