Limits and luxuries of slow research in radical war: how should we represent perpetrators?

Abstract

How can ethnography and principles of “slow research” help make sense of fast-moving battles for truth, attention, and control in digital environments? What are the emerging crises of representation and intervention that researchers face when retelling narratives of perpetrators, trolls, or other social media “bad actors”? This essay works through these questions by drawing on 3 years of ethnographic research and policy interventions into political marketing and its affiliated “fake news” industries in the Philippines. I argue that ethnographic approaches can potentially nuance journalists’ personality-oriented name-and-shame reporting and develop more systemic critique and locally minded interventions. However, we also need a thoughtful reckoning with the limits and luxuries of slow research.

Introduction

“Slow food calls for ‘the pause’ before eating—a moment to contemplate what will be consumed, where it came from, and to be present for the tasting, the nourishing from the food ingested… The pause produces deeper understanding and experience. Similarly, in slow research, those long pauses before activities… might be necessary to fully experience, contextualize, understand, and therefore appropriately represent, the event. Like the pause before the meal, ethnographic pauses during which the researcher listens, engages, and observes patiently, enable us to understand the process, history, and genesis of the event” (Adams et al. 2014, 187).

This passage is drawn from an editorial essay in the journal Medical Anthropology and meant as an invitation for more “slow research” to better inform public health policy and practice. This call has deeply resonated with me as a communication scholar trained in ethnographic methods from traditions of media anthropology and media studies and committed to policy advocacy informed by cultural insight and community engagement. One of the writers above, Vincanne Adams, was particularly instructive to me when her ethnographic research in post-Hurricane Katrina New Orleans uncovered a “second-order disaster” inflicted by the long-term repercussions of predatory resettlement programs (Adams 2013). Her approach, along with media anthropology’s foundational focus on the unintended consequences of technology, helped shape my own research into humanitarian technologies in the post-Typhoon Haiyan Philippines, where my colleagues and I analyzed how aid agencies used the disaster zone as a test bed for digital innovations not always tailored to the needs of local communities (Ong and Cabañes 2018; Ong et al. 2015). Additionally, Adams and her colleagues’ analogy to the slow food movement is particularly resonant for me as a media researcher and teacher of a Social Media in Everyday Life class in a New England public university, where my undergraduate students achingly aspire to, but disagree on, what exactly constitutes a healthy and mindful media diet that is suitably informed by a critical literacy of “how journalism is made and with the most effective ways to consume it” [emphasis in original] (Jolly 2014).

For this inaugural issue of Digital War, I would like to work through the following central questions: In the face of exigencies of today’s radical war and battles for attention, data, and control (Ford and Hoskins 2020) that play out over media manifolds (Couldry 2012), to what extent do principles of slow research help—and hinder—us? Should we double down on some of our guiding values of care and patient attentiveness, or relax some of these in order to make way for collaborative intervention more immediately responsive to emergent threats to democratic practices? Should care and patient attentiveness, as virtues of ethnography, be extended in the retelling of perpetrators’ narratives? Should our narratives even make space for the perspectives of manipulative “bad actors” or should we marshal all energies to denunciation?

On the one hand, slow research and its long pauses between exploratory deep dives are meant to make space for iteration and revision. Some of the most insightful qualitative research into the actions and operations of media manipulators has indeed required an expansive and bottom-up approach, such as tracing clandestine connections between conspiracy theorists, the manosphere, and the alt-right in the USA (Marwick and Lewis 2017) and explaining the genesis of troll armies within historical “structures of disprivilege” of class and caste in the case of middle-class digital entrepreneurs in India (Udupa 2019). As an antidote to the demands of the attention economy and its cycles of distraction, outrage, and clicktivism, the essayist Rebecca Solnit suggests instead a deeper analysis of “root causes [and] ways we can try and change them,” perhaps better enabled by “a great beautiful slowdown where we can pay more attention to what matters” (Solnit 2019). Slow research—and particularly the ethnographic ethos of uncovering what matters most to ordinary people—is potentially well positioned to address this.

On the other hand, even within our own liberal and academic circles there is humble pie served alongside the comforts of a slow research diet. William Davies’ book Nervous States (2019) reckons with the idea that the liberal spaces of deliberation and critique we have traditionally valued now appear too slow and “out of touch.” He diagnoses the nervousness of our current times as ushered in by digital accelerations, where the “possibility for those pauses where critical reflection might once have taken place have come under great strain” (Davies 2019). In this light, the ethnographic value of “being in the present with those we study” (Fabian 2002 in Adams et al. 2014, 189) now appears intolerably indulgent and ill-equipped to mitigate the viral emergencies spreading contagiously during critical events. William Merrin (2019, 214) acknowledges that contemporary tools of warfare wielded by online trolls depend exactly on the speed of their “kinetic provocation.” Against these cycles of manipulative provocation and moral-panicky rage-baiting, we academics are justifiably cautious to resist tweeting a knee-jerk hot take we might later regret. But now more than ever, as we are invited ever more frequently—by journalists, funders, the enthusiastic college dean, even our family members—to weigh in on “fake news” debates from whichever field we may identify ourselves in, many of us grow in agreement with media philosopher Daniel Dayan’s (2007, 115) counsel that the media are simply “too important to be left to themselves.” Perhaps slow research is indeed a luxury we can no longer afford; so perhaps this time we send a sincere offer to clarify, and maybe another time we feed the hype.

In the remainder of this essay, I reflect on conceptual and methodological choices I have made in my empirical research that temporarily and imperfectly resolved tensions between principles of slow research and the responsibility to engage with the threats emerging from our radical war moment (Ford and Hoskins 2020). In my retelling of ethnographic encounters from 3 years of interviews and participant observation of the producers behind fake news campaigns in Philippine politics, my aim is not to prescribe a singular way of resolving challenges, but to invite mindful engagement with political and moral questions that arise in our everyday labor as academics. To give focus to this essay, I reflect particularly on the opportunities as well as the limits of ethnography in engaging with emerging challenges of tackling “root causes” and proposing “ways we can try and change them” (Solnit 2019) in debates around social media, elections, and disinformation.

Ethnographic encounters with disinformation producers

Unlike quantitative studies that have used social network analysis to describe online actors’ coordinative behaviors or qualitative studies that begin with a normative stance of denunciation of disinformation agents when enumerating their strategies of media manipulation, the research I have done in collaboration with Jason Cabañes (see Ong and Cabañes 2018, 2019) in Manila, Philippines, from 2016 to 2019 was inspired by ethnographic traditions in media production studies. This means analyzing disinformation as a culture of production, that is, a product not of exceptional villains or unprecedented technological innovation, but emerging from organizational structures, collaborative work arrangements, and entrepreneurial subjectivities. Our conceptual and methodological approach was guided by imperatives to identify the social conditions that have led fake news producers to their work, the cultural scripts they use to justify their work to others and themselves, and the institutional procedures that have allowed their work to flourish. We followed insightful precedents of research on “bad actors” in the media industry—from the racist/classist casting agents of Jerry Springer (Grindstaff 2008) to the misogynist porn producers (Mayer 2008)—which have produced systemic critique of economic incentives and organizational cultures without knee-jerk demonization of individual personalities.

Rather than shying away from normative judgment, the ethnographic approach was, for us, actually better suited to advancing more nuanced questions of ethics and accountability than the prevailing pearl-clutching liberal responses in the Philippines, which have mainly attributed Rodrigo Duterte’s victory in the 2016 presidential race to the work of vociferous influencers, paid troll armies, or even Cambridge Analytica. We imagined that the potential contribution of our ethnographic research to public discussion was to draw attention to, and put political pressure on, assigning responsibility and accountability to those we considered the masterminds of disinformation campaigns: the advertising and public relations (PR) strategists moonlighting as political consultants, hiding in plain sight within the established creative industries.

The central finding of our work is that, in the Philippines, political disinformation projects are typically taken on as financially lucrative side gigs to the more respectable bread-and-butter work of corporate marketing. Our deep dive revealed that it has essentially been an open industry secret that PR strategists take on black ops projects for politicians, circulating revisionist narratives, (slut-)shaming rivals, and stoking public resentments in social media channels. Instead of the heroes-and-villains reportage that Philippine journalists and fact-checkers marshaled as an attempt to hold the line against the angry populist rage of Duterte and his high-profile social media influencers, our normative analysis focused on the complicity and collusion of the whole range of creative and digital workers: from the senior PR strategists consulting for politicians, to their colleagues turning a blind eye, and to the host of external collaborators in digital influencer agencies, data analytics firms, and search engine optimization. Discovering that disinformation production is a normalized project within the promotional industries was an ethnographic surprise that took a year of fieldwork before we could publish our first public-facing open-access report (Ong and Cabañes 2018). While journalists and other academics whom we considered our allies in fighting disinformation and valuing tenets of deliberative discourse focused their efforts on creating new systems and routines that would fact-check influencers’ false and emotionally manipulative claims, our ethnographic narration took longer and assigned primary accountability to those at the top. We had to clarify to our colleagues that we were not at all being apologists when we discussed how labor exploitation and “structures of disprivilege” (Udupa 2019) were key factors that impelled influencers and precarious digital workers to seek this kind of shady side gig, which meant they should not be the main targets of journalistic pieces aiming to “unmask the trolls” (e.g., Almario-Gonzalez 2017).

We believe that another contribution of our ethnographic approach was to discuss the local, historical antecedents that allowed networked disinformation to land on well-prepared ground in the Philippines—a country dubbed by a Facebook executive as “patient zero” in the fight against digital disinformation (Harbarth 2018). We discussed how (1) the country’s image-based political system, (2) the rise of its entrepreneurial and digitally savvy yet precariously placed workforce, and (3) the growing resentment of populist publics toward the political establishment together created the perfect conditions for the country to become one of the world’s cutting-edge test beds for fake news innovations (see Ong and Cabañes 2019 for our newest report published by NATO StratCom).

This historical narrativization from our ethnography certainly challenged other journalists’ and activists’ acts of resistance against Rodrigo Duterte’s administration. Previously, journalists and activists had channeled their energies into naming and shaming fake account operators, into partisan modes of fact-checking that spotlight pro-Duterte influencers while neglecting extreme speech marshaled by the opposition’s own influencers and paid pundits, and into publishing op-eds with technologically determinist analyses that blame Facebook (or Cambridge Analytica) for the rise of Duterte and the decline of Philippine democracy. We disagree with these approaches, but also acknowledge some of the activist justifications behind them. The Philippines, after all, is in the middle of a violent drug war led by Duterte, in which political critics and journalists have been systematically attacked through formal intimidation such as legal punishments as well as through state-sponsored online harassment. In this topsy-turvy moment of radical war, we recognize how moral panic discourse can be used as a “weapon of the weak” (Scott 1985), where good-versus-evil narratives more easily mobilize disparate actors toward common goals.

In the last section of this essay, I dwell on the contributions and compromises of slow research when addressing the ethics and politics of retelling the perpetrator’s perspective in the radical war moment. Of course, this challenge is not new, as anthropologists, sociologists, and communication scholars have long grappled with the need to balance empathy with critical judgment in retelling the perspectives of gang members (Goffman 2014) and porn producers (Mayer 2008). Perhaps it is in this polarized moment of radical war, and in a contemporary media moment of lurid curiosity about supervillains, serial killers, and assorted “bad actors,” that this representational challenge becomes charged with intensely volatile new forces.

How do we hold “bad actors” accountable? Who is the bad actor anyway?

For me, slow research can help researchers work through, within specific local contexts, how to engage with two central and ever-recurring challenges we face in the radical war moment: first, the challenge of representing perpetrators, and second, the challenge of developing interventions that tackle the root causes behind democratic backsliding.

On the first challenge, I have found that slow research helped develop contextual responses to the tension between, on the one hand, the responsibility to dive deep to understand the genesis of critical events or the motivations of bad actors and, on the other hand, the caution or “strategic silence” needed so that we do not popularize dangerous ideas and encourage copycats.

Recent calls for “strategic silence” recognize the heightened vulnerabilities of today’s media ecosystem, as news outlets’ predilection to cover viral trending topics leaves them open to insidious attacks from savvy manipulators. Strategic silence refers to the editorial responsibility to quarantine extremist messages and avoid their circulation and amplification in public discourse (Donovan and boyd 2018). Joan Donovan and danah boyd specifically use the term when critiquing problematic news coverage of white supremacists (i.e., linking to white supremacist manifestos, or giving “too much” voice to white supremacist killers’ families and friends rather than, say, showing the real impact of their cruelty through recording the voices of victims’ grieving families), but “strategic silence” can also apply as a caution to news reporting and fact-checking of other kinds of media manipulators. News reports and fact checks potentially popularize or “oxygenate” (Phillips 2018) dangerous messages via the widespread attention they invite, thereby playing into the very intention of bad actors seeking to “hack public attention” (Marwick and Lewis 2017). In recent years, a transmogrified version of this strategic silence concept has even been used to challenge deep dives into the genesis of populist publics; for instance, Vanity Fair’s media analysis page “The Hive” ran an article with the title “The ‘Left Behind’ Trump Voter Has Nothing More to Tell Us” (Wolcott 2018). Of course, it was Arlie Hochschild’s (2016) subversive ethnography Strangers in Their Own Land that patiently retold the “deep story” that working-class Trump voters feel as if they have been left behind as other people (i.e., immigrants, members of the LGBTQ community, etc.) have cut ahead of them in line; Vanity Fair exhorted that “it was time to drop a cloth over this parrot cage colleagues…” now that “reporters from some of the very outlets paying such nuanced attention to Typical Trump Voters—The New York Times, CNN, and so on—are reviled as enemies of the people” (Wolcott 2018).

For me, foundational values of care and patient attentiveness can still help researchers address, in particular moments, the pressure to respond during critical events while avoiding the trap of manipulative provocations. In our Philippines ethnography, we specifically channeled our intervention toward broadening the focus beyond fact-checking a select few provocative personalities, dubbed by mainstream media as “queens” or “purveyors” of fake news, to spotlighting industry and organizational practices that normalize and incentivize disinformative political marketing without any system of checks and balances. We argue that journalists who tend to overemphasize “fake news” as merely a Duterte phenomenon or a recent case of state-sponsored propaganda run the risk of enabling the entrenchment of amoral disinformation architects in the political system. This simply allows disinformation architects to slip undetected, engineer more innovations, and ally with new political patrons that offer them power and protection for the next election cycle. By focusing on structures rather than individuals, we wanted to identify gaps in transparency and accountability mechanisms in political marketing and influencer industry practice.

On the second challenge of developing interventions tackling root causes, I argue that principles of slow research very much helped guide the ways I, as an ethnographer, negotiated my relationships with my collaborators. In the same way that ethnographers are guided by reflexivity in assessing modes of access and power between us and our “subjects,” we should be attuned to such dynamics in the multistakeholder collaborations we increasingly find ourselves in. It has become commonplace for ethnographers now to work in teams alongside data and computer scientists, political scientists, and legal experts, particularly in the areas of digital politics and disinformation studies. I agree here with anthropologist Nick Seaver (2017) that ethnographers can sometimes be the most prejudiced toward other fields, replicating in their analysis the bias that small ethnographic stories are “better” than big data without acknowledging the similar reflexivity about representation, accuracy, and bias among “numbers people.”

For me, I had to negotiate and adjust my own positions and biases when I engaged in policy advocacy work drawing from our ethnographic research. Specifically, I worked collaboratively with legal experts from a local election integrity group to lobby the election commission to adopt new transparency frameworks for campaign finance and social media marketing in the 2019 elections. This led to a new regulation requiring politicians to disclose paid consultancies and other digital transactions that fly under the radar of journalistic reportage, such as digital advertising and influencer “collaborations” (see Chapter 5 of Ong et al. 2019). I had to learn the hard way that policy advocacy work meant a lot of compromises. While I was proud of the new regulation I advised on and helped pass, I left with the feeling that it was ultimately a performative exercise for government officials to demonstrate they had done something, even though there was little intention to enforce the regulation properly. Following the principles of anonymity I mentioned earlier, it also meant that in the closed-door multistakeholder meetings I helped convene between policymakers, journalists, and representatives of the advertising and PR industry to brainstorm on policy, I always knew more than what was being said in the room. I could not act on my knowledge that certain actors in the room were actually lying and performing to each other, as that would violate the confidentiality I had promised them when they revealed certain information to me in the context of my ethnographic research. These were small yet significant choices and compromises that ethnographers-turned-policy-advocates would have to think through and tarry with another time—I wonder how many of those I made were actually grave mistakes.

For me, using insights derived from slow research and ethnography to inform policy nevertheless worked most effectively when offered as correctives to dominant approaches already out there. The value of patient attentiveness attunes ethnographers to gaps, vulnerabilities, taboos, and all that is “not said” in broader culture and therefore helps develop plausible alternatives.

This, of course, can be frustrating for the allies we work with. A question we had to field throughout our research was “You say you’re all about accountability, but why can’t you name names of the people you spoke with?” We were a bit surprised to hear this critique most from academics in history and political science, to whom we explained ethnographers’ commitment of care to informants, as well as the importance of self-protection in risky research institutionalized by our university ethics protocols. Journalists respected our conditions of anonymity because they too know how to protect their sources, but grew anxious when we emphasized the power that advertising and PR strategists wield in politics, as news agencies also greatly depend on their corporate advertising money. Meanwhile, cybersecurity officers of tech platforms were eager to get tips about the low-hanging fruit of the “most immediate threats” that could guide their platform bans, as these workers are obviously unable to address root causes and change entire systems. For me, principles of slow research helped me negotiate the terms of each of these transactions, learn about hidden motives, and relax my own preconceptions about other fields of practice.

In this radical war moment of abundant and accelerated threats and microtargeted truths, we can only be each other’s imperfect allies. We need to welcome other people’s extra sets of eyes that can help open up an issue and broaden our perspectives, and we need to accept the challenging interruptions to our dominant ways of thinking. Slow research, as with the slow food diet, invites us academics, ethnographers especially, and our allies to take time to appreciate and care for our subjects, colleagues, and collaborators. The pause we take is an acknowledgment of privilege but also a reminder of the humble endeavor of making sense of new and bewildering sensations.

References

  1. Adams, Vincanne. 2013. Markets of Sorrow, Labours of Faith: New Orleans in the Wake of Katrina. Durham: Duke University Press.

  2. Adams, Vincanne, Nancy Burke, and Ian Whitmarsh. 2014. Slow Research: Thoughts for a Movement in Global Health. Medical Anthropology 33(3): 179–197.

  3. Almario-Gonzalez, Chi. 2017. Unmasking the Trolls: Spin Masters Behind Fake Accounts, News Sites. ABS-CBN News, 7 January. http://news.abs-cbn.com/focus/01/20/17/unmasking-the-trolls-spin-masters-behind-fake-accounts-news-sites. Accessed 30 Jan 2020.

  4. Couldry, Nick. 2012. Media, Society, World: Social Theory and Digital Media Practice. Cambridge: Polity Press.

  5. Davies, William. 2019. Nervous States: Democracy and the Decline of Reason. New York: Vintage.

  6. Dayan, Daniel. 2007. On Morality, Distance and the Other: Roger Silverstone’s Media and Morality. International Journal of Communication 1: 113–122.

  7. Donovan, Joan, and danah boyd. 2018. The Case for Quarantining Extremist Ideas. Guardian, 1 June. https://www.theguardian.com/commentisfree/2018/jun/01/extremist-ideas-media-coverage-kkk?fbclid=IwAR3yWmpptBHoRPVDZNy95rcs_faLXTscL7YGfL7aM05CUTZV-Zv8qgoML0Y. Accessed 30 Jan 2020.

  8. Fabian, Johannes. 2002. Time and the Other: How Anthropology Makes its Object. New York: Columbia University Press.

  9. Ford, Matthew, and Andrew Hoskins. 2020. Radical War: A New Paradigm of War and Media. Journal of Digital War 1(1/2).

  10. Goffman, Alice. 2014. On the Run: Fugitive Life in an American City. Chicago: University of Chicago Press.

  11. Grindstaff, Laura. 2008. Self-serve Celebrity: The Production of Ordinariness and the Ordinariness of Production in Reality Television. In Production Studies, ed. Vicki Mayer, Miranda Banks, and John Caldwell, 71–86. London: Routledge.

  12. Harbarth, Katie. 2018. Protecting Election Integrity on Facebook. Presented at 360/OS, Berlin, Germany.

  13. Hochschild, Arlie. 2016. Strangers in Their Own Land. New York: The New Press.

  14. Jolly, Jihii. 2014. How to Establish a Media Diet. Columbia Journalism Review, 20 August. https://archives.cjr.org/news_literacy/slow_news_news_diet.php. Accessed 30 Jan 2020.

  15. Malinowski, Bronislaw. 2015. Anthropology is the Science of the Sense of Humour: An Introduction to Julius Lips’ The Savage Hits Back, or the White Man Through Native Eyes. HAU: Journal of Ethnographic Theory 5(3): 301–303.

  16. Marwick, Alice, and Rebecca Lewis. 2017. Media Manipulation and Disinformation Online. Data and Society. https://datasociety.net/pubs/oh/DataAndSociety_MediaManipulationAndDisinformationOnline.pdf. Accessed 30 Jan 2020.

  17. Mayer, Vicki. 2008. “Guys Gone Wild”? Soft-Core Video Professionalism and New Realities in Television Production. Cinema Journal 47(2): 97–116.

  18. Merrin, William. 2019. Digital War: A Critical Introduction. New York: Routledge.

  19. Ong, Jonathan, and Jason Cabañes. 2018. Architects of Networked Disinformation. New York: Data and Society Research Institute.

  20. Ong, Jonathan, and Jason Cabañes. 2019. Politics and Profit in the Fake News Factory: Four Work Models of Political Trolling in the Philippines. Riga: NATO STRATCOM COE.

  21. Ong, Jonathan, Jaime Manuel Flores, and Pamela Combinido. 2015. Obliged to be Grateful: How Local Communities Experienced Humanitarian Actors in the Haiyan Response. Makati: Plan International.

  22. Ong, Jonathan, Ross Tapsell, and Nicole Curato. 2019. Tracking Digital Disinformation in the 2019 Philippine Midterm Election. Canberra: New Mandala.

  23. Phillips, Whitney. 2018. The Oxygen of Amplification: Better Practices for Reporting on Extremists, Antagonists, and Manipulators. New York: Data & Society Research Institute.

  24. Scott, James. 1985. Weapons of the Weak. New Haven: Yale University Press.

  25. Seaver, Nick. 2017. Algorithms as Culture: Some Tactics for the Ethnography of Algorithmic Systems. Big Data & Society. https://doi.org/10.1177/2053951717738104.

  26. Solnit, Rebecca. 2019. Rebecca Solnit on the Power of Changing the Narrative and Writing Our Own Stories. CBC, 27 December. https://www.cbc.ca/radio/thesundayedition/the-sunday-edition-for-december-29-2019-1.5399598/rebecca-solnit-on-the-power-of-changing-the-narrative-and-writing-our-own-stories-1.5399614. Accessed 30 Jan 2020.

  27. Udupa, Sahana. 2019. India Needs a Fresh Strategy to Tackle Online Extreme Speech. Engage, 31 January. https://www.epw.in/engage/article/election-2019-india-needs-fresh-strategy-totackle-new-digital-tools. Accessed 30 Jan 2020.

  28. Wolcott, James. 2018. The “Left Behind” Trump Voter has Nothing More to Tell Us. Vanity Fair, 7 September. https://www.vanityfair.com/news/2018/09/the-left-behind-trump-voter-has-nothing-more-to-tell-us?fbclid=IwAR3tHkD25plMSsWPtr_jLRHDQIGY1Bylq7YGyUP3Vr_mwLNp0EzUavP1OEI. Accessed 30 Jan 2020.

Acknowledgements

Thoughtful and clarifying exchanges with Sahana Udupa, Nicole Curato, Shobha Avadhani, Vince Rafael, Lila Shahani, and Clare Amador helped me with this essay.

Author information

Correspondence to Jonathan Corpus Ong.

Cite this article

Ong, J.C. Limits and luxuries of slow research in radical war: how should we represent perpetrators? Digital War 1, 111–116 (2020). https://doi.org/10.1057/s42984-020-00006-x

Keywords

  • Ethnography
  • Slow research
  • Disinformation producers
  • Fake news
  • Perpetrator’s narrative
  • Research ethics