I recently unearthed a box long lost in the back of my closet, a box that has travelled with me across several states and too many years. I don’t think of myself as a pack rat by any standard (in fact, I hate “stuff” and have worked hard among those in my social circle to end compelled gift-giving), but I have kept copies of articles that have shaped my way of thinking, clips from a cherished if not fully realized stint as a “newspaperman,” papers from college, and personal writing: letters, sporadic journal entries, half-started plays and other narratives.

Sitting on the floor of my guest room-turned-home office, propped against the closet door frame that divides the secret and the known, I found the archaeology bittersweet. The papers seemed to multiply as I pulled one after the other out of their resting place; some were fused together, and tacky letters complained with a rat-a-tat-tat as they fought to remain with their original sentences or desert to the other side. Each revealed parts of my past as well as paths not taken and thus futures unknown. Letters from lovers, some forgotten, some too difficult to be remembered, captured moments of longing but were silent as to endings. Course term papers presented youthful thought, buoyant about topics but often naïve in argument and construction—a static warning that I likely lack proper perspective and skill with current writing, too. And then there were the random jottings of ideas, good but not acted upon, some other interest or daily duty intervening and relegating them to a kernel state.

This parade of papers documented what I once thought and who I once was, a mix of aspects I am both glad to have grown beyond and sad to have lost. I tried not to linger in the threshold between youth and whatever lies beyond—although I have found as of late a tug to look back as I look ahead—for there was a specific purpose to this exercise in nostalgia. I was on the hunt for a letter of intent I once wrote as part of a college application. It was not a successful essay by any means. Rereading it made clear that the applicant was unable to articulate her points or provide a cohesive narrative as to what she would bring to a program. There were mixed metaphors and issues in organization, and the attempts to use scholarly sources to bolster arguments ended up reminiscent of name-dropping. Based on this, I am not certain I would have ranked the applicant highly either. But there were positives as well. The piece possessed an underlying (if not always appropriate) creativity and a passion for the subject matter. It emulated, if not always identifiably, the writing and work of notable figures in the field, indicating a penchant to be mentored and molded. And it tried to be evocative while having fun in a way that suggested the author might not take herself too seriously.

It was clear I wasn’t ready for graduate school.

But with heart set on the one university and ignorantly ignoring the schools then willing to give me a chance, I waited a year and reapplied. I wrote a dry, linear letter that outlined a research agenda on a topic with which I had prior experience. Truth be told, these best-laid plans lacked some sincerity (and this essay never made it to the box!). I wasn’t uninterested in that particular object of study; I just couldn’t, and likely didn’t want to, narrowly define my route for the next seven years. Not at that age. Not with my nascent levels of knowledge and skill. But it worked. I seemed like a focused candidate, although I probably still wasn’t ready for graduate school. An inveterate late bloomer, I may only be on the cusp of ready now.

But there was something in that original essay, something I was reminded of when reading the papers for the symposium in this issue of the Journal of Bioethical Inquiry (JBI). Guest edited by Helena Hansen and Jonathan Metzl, the symposium focuses on the need for “structural competency,” particularly in relation to U.S. healthcare but at base applicable to all biomedicine. Hansen and Metzl write: “Key to our intervention is critical engagement with the ways that current medical paradigms often place responsibility for health-related choices and behaviour on individuals” and with their failure to provide healthcare practitioners and systems “the tools to address the social causes of disease” (Hansen and Metzl 2016, ¶3 and ¶2). For those inside medicine and the many scholars on the outside looking in, this inadequacy is nothing new: medical sociologist Irving K. Zola, in his memorably characteristic way, summed it up nearly half a century ago with a parable urging us to move upstream—to “see who [or what] in the hell is … pushing” people into the “swiftly flowing river” of ill health (cited in Alonzo 1993, 1019). Likewise, Hansen and Metzl call for “a paradigm shift in medical practice: to attend to institutional pathologies that lead to clinical pathologies” (Hansen and Metzl 2016, ¶4).

How did we get so far downriver, focused on jumping “back in … [a]gain and again, without end,” in an attempt to save each drowning person long after they are waterlogged (Zola cited in Alonzo 1993, 1019)? As a young college applicant in medical anthropology, it seems I was interested in a similar question, and like Zola and another of my scholarly heroes, Kent V. Flannery (1982), I offered an (unsophisticated) answer in parable form:

Long ago, but not all that far away, sometime before the age of the thirty-second, questionably all-beef patty, Man awoke one morning quintessentially puzzled about Life. (Woman would have ascended from slumber pondering such delicacies if it hadn’t been for that excruciating episode when Life, with head too large, twisted and torqued its way down the birth canal. […] In truth, Woman [had arisen long before Man, tending to chores, and] already knew the answer to this mind-boggling question. She was later quoted as saying, “Eat right, aerobicize daily, die anyway.”)

Such thoughts […] however, consumed Man as ravenously as he had consumed the previous night’s protein prized from the kill (with as much enthusiasm as Marvin Harris [see, for example, Martin 2001]). Man was in a quandary. […] In [his] self-proclaimed position as the center of the universe, which even the omnipotent sun itself encircled, Man arose from his side of the bed and resolved to settle this annoying and nagging question once and for all. […]

[B]ut [he] soon realized it could not be done alone. Exploiting his every kinship tie a future anthropologist would agonizingly diagram, Man summoned his counterparts into the shade of a looming savannah tree [… and] addressed his kin—not with actual names, however, since this practice was disrespectful and therefore strictly forbidden ([ensuring that] the aforementioned future anthropologists [would] use their grant monies thoroughly). He debriefed them on the concern of the day. For Man felt that if they couldn’t solve the Meaning of Life before the sun disappeared from the sky, it was a spurious task […] and not worth continuing. Man turned to […] his brothers and delegated their duties. Each […] was to set off on foot […] and study in depth one aspect of Life. Isaac (not his real name) was given the motions of objects; Charles (not his real name) the plants and animals of the natural world; August (not his real name) the ways groups organize; [and so on … ]. Man, of course, would stay behind and keep the women engaged.

And by nightfall, all would come together, recount their findings, and decipher the Meaning of Life.

And it worked. In theory.

What actually came to pass was polygyny. For the men did not return that night. Or the next. Or the next. Or any evening during the time when the women would have cast off their anger and taken them back.

And although Man’s kin are still specializing and subspecializing in their individual studies, hoping one day to unite their findings into a “General Theory of Everything” and figure out what all this fuss is about, Man has procreated to an extent far surpassing any of Darwin’s ideas regarding evolutionary reproductive success.

And all of Man’s children are as puzzled as he was that one not very inconsequential morning.

And on from there I rambled. About anthropology’s holistic and interdisciplinary approach versus the dangers of “purification” (see Latour 1993). About the important connections between theory and praxis. About the relevance of the discipline in a post-contact world. Little did I know I was attempting to reiterate (and at times misunderstanding) points more substantially and eloquently made by scholars such as David Kaplan and Robert A. Manners (1972) and Bruno Latour (1993). Like Kaplan and Manners, I wondered about the direction of anthropology, a discipline historically tied to both “the primitive world” and “the imperialist expansion of the West” (Kaplan and Manners 1972, 195). What was its purpose in a modern, ultimately connected but hopefully more enlightened globe? Surely understandings of culture could be used to solve (or prevent) social problems and promote general welfare, as Kaplan and Manners noted long before me: “there has been a growing demand (from inside as well as outside the profession) that anthropology become more ‘relevant’ and more activist—that it should begin to play an important role in promoting social change” (Kaplan and Manners 1972, 190, emphasis original). We see evidence for this today, with institutions, particularly medicine, emphasizing the need for “cultural competency,” whatever that may mean and whether it can even be taught as an afterthought. Rather, real understandings of culture and of how culture shapes social interactions, health, and well-being can likely come only from those paradigm shifts in the production and practice of medicine that Zola and Hansen and Metzl urge.

Although I was drawn to medical anthropology (and public health and bioethics) in part because of its more applied nature, Kaplan and Manners also add a caveat about such approaches:

For to concentrate exclusively on the practical, on the applied, on the attempt to resolve current dilemmas may very well impede the development of any scientific discipline. Concern with application alone encourages the investigator to lean most heavily on what is already known, to employ the techniques, the methods, and the data already at hand in order to cope with the special problem. It has a tendency to deflect the scientist from the free and imaginative speculation which forms the lifeblood of his discipline in its growth as a scientific enterprise (Kaplan and Manners 1972, 204–205, emphasis original).

Such focus on the immediate can thus narrow our vision, which is ultimately what Zola’s parable (and my own), Latour’s (1993) We Have Never Been Modern, and the papers in this JBI symposium on “structural competency” caution against. While we have become proficient in resuscitating the drowning one by one, this isn’t time- or cost-effective, nor does it address the fundamental (and complex) causes of human suffering. Life is messy. It’s intertwined. It often resists distillation into component parts. And when we attempt to do so within the “modern” enterprise, we actually produce more “hybridizations,” more mixes of nature and culture that we are driven yet again to disentangle (Latour 1993). Subdividing the world might have certain practical use and make for more palatable bites but, as Latour (1993) underscores, is but a fabrication—our “modern” attempt at “purification” that supposedly sets us apart from the “premodern.” And “[y]et our work remains incomprehensible because it is segmented,” and “we are always attempting to retie the Gordian knot by crisscrossing, as often as we have to, the divide that separates exact knowledge and the exercise of power” (Latour 1993, 3).

“Our intellectual life is out of kilter,” Latour warns, and “remains recognizable as long as epistemologists, sociologists and deconstructionists remain at arm’s length, the critique of each group feeding on the weaknesses of the other two” (Latour 1993, 5–6). We have seen this in medicine and need only think of Michel Foucault’s (1994) description of the modern physician placing his patient in parentheses in order to engage more directly with disease. Or a healthcare system based on fee-for-service. Or a downstream paradigm that sees only individuals, discretely responsible for both enabling and abating their maladies. Yet such a gaze lasts only as long as we keep turning a blind eye:

The tiny networks we have unfolded are torn apart like the Kurds by the Iranians, the Iraqis and the Turks; once night has fallen, they slip across borders to get married, and they dream of a common homeland that would be carved out of the three countries which have divided them up (Latour 1993, 6–7).

Perhaps there is hope. Positive changes that make visible this modern illusion are occurring in healthcare, particularly as programs and policies call for inter-professional teamwork and more interdisciplinary practitioners. The papers in this symposium of the JBI urge a more integrated, upstream view as well and, through concrete examples, offer a dry towel and a map for promoting structural competency: from training healthcare students on how to “define and analyse power dynamics within the medical hierarchy and hidden curriculum” (Angoff et al. 2016, under “Abstract”) and “integrat[ing] social environmental influences in genetic predictive models of disease risk” (Conley and Malaspina 2016, under “Abstract”) to “repair[ing] the urban ecosystem” and creating “health-giving built environment[s]” through “perspective and solidarity” (Fullilove and Cantal-Dupart 2016, under “Abstract”), developing multilevel, primary prevention strategies that “prevent torture [and other human rights abuses] from occurring in the first place” (Celermajer and Saul 2016, under “Abstract”), and encouraging clinicians to recognize their pivotal roles in bringing together science, patients, media, policy, and culture to promote harm-reduction, rather than tertiary and often punitive, approaches in addressing issues like substance abuse (Drucker et al. 2016).

In these papers, we also find a new “theoretical lens to see how social conditions influence health”:

Fundamental cause theory makes sense of the fact that an inverse association between socioeconomic status (SES) and mortality has been remarkably persistent across places and times. [… It] also predicts that the introduction of new health technologies, in the absence of interventions to address social inequalities, leads to larger health inequalities, given that social stratification drives unequal access to these technologies. […] Consequently, fundamental causes affect health even when the profiles of risk and protective factors and diseases change radically (Reich, Hansen, and Link 2016, ¶1–¶4 under “Fundamental Cause Theory”).

Similarly, Latour suggests that in order to engage in “an anthropological [i.e., more comprehensive and holistic] analysis of the modern world […] the very definition of the modern world has to be altered” (Latour 1993, 7). This is not a call for throwing out the baby with the bathwater; in fact, he recognizes that the

moderns’ greatness stems from their proliferation of hybrids, their lengthening of a certain type of network, their acceleration of the production of traces, their multiplication of delegates, their groping production of relative universals. Their daring, their research, their innovativeness, their tinkering, their youthful excesses, the ever-increasing scale of their action, the creation of stabilized objects independent of society, the freedom of a society liberated from objects (Latour 1993, 133).

That said, he wants an amalgam of the better aspects of the “premodern,” “modern,” and even “postmodern”: “to retain the production of a nature and of a society that allows change in size through the creation of an external truth and a subject of law, but without neglecting the co-production of sciences and societies” (Latour 1993, 134, emphasis removed).

This nicely sums up the papers in this issue of the JBI. We do not have to recreate healthcare from the ground up, but we cannot proceed in our siloed ways. Perhaps bioethics has already long been engaged in that anthropological analysis of medicine. But how far have we come? Have we been able to make visible the Ariadne’s thread of our interwoven healthcare stories (see Latour 1993, 3)? Or do students, practitioners, and policymakers still struggle to recognize and apply such perspectives?

As I look back on that essay I attempted to write so many years ago, I certainly do not envision myself a great thinker or scholar (although I still hold out hope for late blooming), but I would like to think some part of my brain or being was reaching towards Latour’s point about the foibles of “The Modern Constitution.” Whether that is true or self-deluded hindsight, revisiting that piece provides at least a personal bookend for me, as I write this, my last editorial as editor in chief of the JBI. Although I remain a consulting editor and member of the editorial board for as long as the journal will have me and I confidently hand over the reins to my esteemed colleagues Michael Ashby and Bronwen Morrell, this transition, too, is bittersweet. Perhaps it would have been better not to unveil my long-buried terrible parable that must have baffled admissions folks while providing a good laugh (and not in the ways intended!). But as I contemplate my own future and other possible roles in bioethics—and as I know that I can never repay the debt of kindness and trust I owe the incomparable people who make possible the community that is the JBI—I also wonder about the future of bioethics. Within the health professions, do we still teach bioethics (and other medical social sciences and humanities) as an afterthought? Are we still pursuing myopic paths that pledge depth but ultimately divide us into foreign lands, despite promises to come back together? Does a fundamental shift remain to be made?