
What would a feminist open source investigation look like?


Use of publicly available information to offer radical retellings of violence has powerful democratising potential, both in terms of who contributes to open source investigations and whose stories they centre. At a time when trust in government, media institutions and non-government organisations as fact bearers has been eroded, emergent open source methods have become “an alternative set of truth practices” (Weizman in Open Verification, e-flux, 2019). Yet there are few accepted guidelines on what is legally, morally, or ethically permissible in such investigations. A growing question among practitioners using open source techniques in human rights investigations is not “Can we do this?” but “Should we be doing this?” Here, we set out why intersectional feminist thought should be considered when grappling with the radical possibilities and serious ethical challenges of open source investigations. To this end, we offer practical examples of how an investigator might better situate their findings, show their workings, design for ambiguity, practice equity in attribution, and find new ways to care for themselves and others.


What would an explicitly feminist open source investigation look like? This is not a thought experiment nor an exercise to identify what is feminist and what is not. Rather, it is an action plan by a group of human rights and tech workers seeking to conduct open source investigations differently. Our aim is to apply feminist thought to open source investigations so as to question and to reimagine what, in the last 5 years, have become dominant and default ways of working.

What follows is a response to the article ‘What would feminist data visualization look like?’ by US-based data feminist D’Ignazio (2017), and to the academic paper ‘Feminist Data Visualization’ co-authored by D’Ignazio with the digital humanities scholar, Lauren Klein (D’Ignazio and Klein 2016). Both texts look critically at the growing field of data visualisation and, in particular, at what was being designed, for whom and by whom. They offer a number of tangible ideas to help the reader imagine what a feminist data visualisation could look like. Our contribution is intended to offer the same for open source investigations.

First, what do we mean by ‘open source’? After all, its meanings are multiple. In the context of an investigation, open source can refer to both a category of information and a methodology. Open source information is information in the public domain, obtained with overt methods, as opposed to information that is classified, private or obtained via covert means. The authors of the forthcoming International Protocol on Open Source Investigations define this information source as “public information that is accessible by observation, request or purchase and that requires neither illegal means nor special status (such as law enforcement status) to acquire” (Human Rights Center at the University of California, Berkeley, forthcoming). As a methodology, open source is a grouping of techniques used to identify, collect and analyse this public information.

Open source investigations offer radical, democratising possibilities for human rights fact-finding (Heyns 2015, 6). The attention they give to social and local media sources can centre the experiences of groups whose voices are too often heavily mediated, marginalised or excluded in conventional reporting. For example, the non-governmental organisation (NGO) Syrian Archive used imagery posted on social media by “civilian witnesses” otherwise cut off by conflict to investigate human rights violations in the ongoing Syrian civil war. The group’s work informed media coverage of the conflict, aided criminal investigations and preserved over three million user-generated images and videos at risk of erasure.

From data collection to analysis, a wide spectrum of participation has opened up the investigative space to digital volunteers, geographically dispersed activists and, in theory, anyone with an internet connection, desire, and time to contribute. In 2019, Amnesty International used microtasking to involve over 6000 volunteers in its open source investigation into abuse against women on the social media platform Twitter. The volunteers marked tweets as abusive or not, to generate a labelled dataset of problematic content—a subset of which Amnesty used to train an algorithm to analyse millions of tweets, thus automating the process and building a case of negligence against the tech platform (Amnesty International 2018). In the days after Amnesty’s findings were published, Twitter’s share price dropped by 12% (Todd 2018).
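Amnesty has not published the code behind its microtasking pipeline, but the core step it describes, collapsing many volunteer judgements on the same tweet into a single training label, can be sketched in a few lines. The function name, thresholds and example data below are hypothetical, intended only to illustrate how crowd labels might be filtered for agreement before being used to train a classifier:

```python
from collections import Counter

def aggregate_labels(judgements, min_votes=3, min_agreement=0.7):
    """Collapse several volunteer judgements on one item into a single
    training label, or None when agreement is too weak to rely on."""
    if len(judgements) < min_votes:
        return None  # too few volunteers saw this item
    counts = Counter(judgements)
    label, votes = counts.most_common(1)[0]
    if votes / len(judgements) < min_agreement:
        return None  # volunteers disagreed; hold the item back
    return label

# Hypothetical volunteer judgements, keyed by tweet id
raw = {
    "t1": ["abusive", "abusive", "abusive", "not_abusive"],
    "t2": ["abusive", "not_abusive"],                            # too few votes
    "t3": ["abusive", "not_abusive", "abusive", "not_abusive"],  # no consensus
}
labelled = {tid: aggregate_labels(js) for tid, js in raw.items()}
```

Only the items that clear both thresholds (here, `t1`) would enter the labelled dataset; holding back low-agreement items is one way a project might keep ambiguous content from silently shaping the resulting model.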

Indeed, it is the leveraging of so-called ‘user-generated content’ to ‘verify’ and ultimately, to pitch the realities of marginalised groups against those touted by powerful corporations, governments or militaries, which gives open source a revolutionary feel. In a ‘post-truth world’ where trust in institutions as fact bearers has been significantly eroded, open source methods present an “alternative set of truth practices” (Weizman 2019).

However, with these opportunities come serious ethical challenges. The open source investigator’s toolbox includes methods of surveillance, such as applications developed for the online tracking and near-real-time monitoring of anyone of interest to an investigation without their knowing; tools developed to uncover barely visible and thus arguably barely public information; and techniques designed to capitalise on the ‘mosaic effect’—that is, the piecing together of data from different open sources to reveal new information never directly made public. “One person’s open source investigation could be another person’s ‘doxxing’”, note Rahman and Ivens in the book Digital Witness (Rahman and Ivens 2020, 251).

Representation is also at stake. “Open source research has the potential to profoundly affect not just whose stories get told, but also who gets to tell those stories, and who will listen to them”, write Yvonne McDermott, Daragh Murray and Alexa Koenig of the OSR4Rights research group (McDermott et al. 2019). Representation issues range from an overreliance on whose voices are already heard in a “cacophonous social media environment” to the types of violations reported and the prioritisation by NGOs of violations presumed to be suited to open source investigations (McDermott et al. 2019). In a conversation with the authors, the Deputy Director of the Engine Room, Zara Rahman, stresses the consequences of these dynamics: “Essentially, what is shown to the world by an investigation is shaped by the people with the most power, and these people are almost certainly not the people who are represented in the data, or who generated the data” (Z Rahman 2020, personal communication, 6 January). Crucially, for Rahman, representation issues in open source are rooted in power relations, or more specifically in the asymmetries that define them.

Establishing new ways of working

Despite these ethical dilemmas, open source investigations—perhaps due to their perceived newness or revolutionary promise—have been afforded a lot of space to push the boundaries of what’s possible, to make mistakes and to mature slowly. Open source tools, techniques and the communities formed around them have evolved rapidly over the last 5 years. Yet whether practised by a lone sleuth working in their bedroom or by an employee operating within a large team at a media outlet or human rights organisation, open source investigation is governed by few formalised and accepted guidelines concerning what is legally or morally permissible. In certain circles, open source investigation is seen as the Wild West—a new, disembodied digital frontier where anything goes, especially in social media research.

Our impressions are not isolated—the need for ethical codes, protocols and other standards is already being discussed. For example, the Human Rights Center at the University of California, Berkeley is spearheading a collaborative effort to develop an International Protocol on Open Source Investigations, alongside human rights-based approaches introduced by many of the larger NGOs working to integrate open source practices with traditional research methods (Human Rights Center at the University of California, Berkeley, forthcoming; Koenig 2017; Sen 2019).

Welcome as they are, current efforts largely focus on investigations working within human rights frameworks, which is to say legal frameworks. In our view, if the radical, democratising potential of open source investigations is to centre the experiences of marginalised or underrepresented populations typically underserved or excluded by judicial forms of justice and accountability, it is vital that we give attention to investigations working outside legal systems in the pursuit of non-judicial forms of social justice.

In ‘From Human Rights to Feminist Ethics: Radical Empathy in the Archives’, Michelle Caswell and Marika Cifor define social justice as:

[The] ideal vision that every human being is of equal and incalculable value, entitled to shared standards of freedom, equality, and respect. These standards also apply to broader social aggregations such as communities and cultural groups. Violations of these standards must be acknowledged and confronted. It specifically draws attention to inequalities of power and how they manifest in institutional arrangements and systemic inequities that further the interests of some groups at the expense of others in the distribution of material goods, social benefits, rights, protections, and opportunities. Social justice is always a process and can never be fully achieved (Duff et al. via Caswell and Cifor 2016, 26).

If social justice beyond legal frameworks is an inherently open-ended and “incalculable” process, we ask: what can feminist thought bring to open source investigations that a human rights-based approach might not?

Open source investigations are about power relations; so too is feminism

For decades feminist scholars and activists have worked to problematise unequal power relations and to find ways to resist and transform them. Simply put, feminism “is about power—who has it and who doesn’t” (D’Ignazio and Klein 2018).

By feminist thought, we mean specifically intersectional feminist thought. Intersectional feminism looks beyond gender to ask how race, class, sexuality, religion, ability and much more, determine our total experience of the world. The authors of this paper, for instance, are two white, native-English speaking women based in Western Europe, with higher education degrees. All of these factors and our positionality contribute to the reality that our voices are the ones being heard here, and with them our biases, understanding and outlooks (Alcoff 1988).

The term dates from the late 1980s, when it was coined by the critical race scholar and lawyer Kimberlé Crenshaw in her analysis of a US court case. Crenshaw used the image of a traffic intersection to describe how her client, a Black woman, had experienced workplace discrimination both because she was Black and a woman (Crenshaw 1989). In other words, as a Black woman, Crenshaw’s client faced a kind of compounded oppression different from that faced by a white woman or a Black man.

For Crenshaw, intersectionality is “an experience, an approach, and a problem” (Crenshaw via Norlock 2019). Contemporary hollow uses of the term have been critiqued; however, we believe that regardless of what it is called—the Black feminist Patricia Hill Collins calls it the “matrix of domination”—as a concept, intersectionality is more relevant than ever (Collins 1990). If this is the case, what can open source investigators learn from a long history of intersectional feminist scholarship and activism?

One tool in the feminist’s toolbox is equity. “Equity is different than equality”, states the manifesto of ‘The Global Open Science Hardware (GOSH)’ movement. “Equity recognises that everyone does not start from the same position and so treating everyone the same may leave them in the same uneven positions they began in” (GOSH 2017). In other words, calling attention to how power is unevenly distributed and then treating everyone involved in an investigation in the same way will not bring about sustainable change. To achieve change in the long term, we must practice equity.

How can we illustrate this approach? In the words of the Newfoundland based feminist and anti-colonial science lab ‘Civic Laboratory for Environmental Action Research (CLEAR)’, the application of feminist thought to a professional space “isn’t about getting more women, people of colour, and Indigenous people into science—that would be inviting people into a space that is already stacked against them”. Instead, the idea is to change how the work is done “so that different world views, values, and ethics are the basis for knowledge production, which also happens to make the field a better place for women, people of colour, Indigenous peoples, queers, and people with disabilities” (CLEAR 2017).

The following provocations expand upon these ideas through practical questions and examples. Of the three section headers, the first two are borrowed, with permission, from the original article by D’Ignazio. The provocations are by no means exhaustive. Rather, they are an invitation to be built upon, debated and remixed.

Invent new ways to represent uncertainty, outsides, missing data, and flawed methods

Objectivity and presenting the ‘whole truth’ matter greatly in open source investigations that, understood as an alternative set of truth practices or epistemologies, must win trust through demonstrating technical competency and impartiality. Feminist objectivity “is about limited location and situated knowledge”, writes the feminist scientist Haraway (1988, 583). In short, all knowledge comes from somewhere and is incomplete in some way. Or more specifically, all investigations contain ambiguities, outsides and “missing data” (Onuoha 2018). As such, “an argument for situated and embodied knowledges” is “an argument against various forms of unlocatable, and so irresponsible, knowledge claims. Irresponsible means unable to be called into account” (Haraway 1988, 583). The recommendations that follow question the language of trust and verification that is inherently bound up with claims of objectivity.

Situate your findings

All research is situated within a community of practice that has its own “axiology (morals, values, and ethics)” (CLEAR 2017). It helps to think of investigations as a string of decisions that enact these morals, values, and ethics. Seemingly innocuous, technical decisions such as “acts of tagging, indexing, aggregating and defining of data and data categories are inherently political”, write members of Syrian Archive (Deutch and Habal 2018, 46). These acts are political because certain values and interests are reproduced and amplified, while others are not (CLEAR 2017).

Is it possible to communicate that an investigation’s findings are not the ‘whole picture’, and for this to strengthen rather than weaken confidence in them? A large part of situating findings is documenting and publishing the decisions made along the way, along with the reasons they were made.

For example, a video report of an investigation into a United States drone strike in Mir Ali, Pakistan by the London-based group Forensic Architecture merges the presentation of the findings with documentation of how they were reached. The investigation used an all source (as opposed to open source) technique developed by Forensic Architecture called “situated testimony”, which employs “3D models of the scenes and environments in which traumatic events occurred, to aid in the process of interviewing” (Forensic Architecture, n.d.). From behind, the camera captures the investigators, the witness and a 3D-modeller, backlit by the computer graphic they are working on. A voiceover tells us that the witness “is hoping to communicate the realities of life under drones. And the experience of surviving a strike in which she also lost her brother-in-law” (Forensic Architecture 2013). We hear the witness’ dialogue with the investigators and see her hand-drawn sketch of the building compound. The video is documentation of a positive feedback loop within the investigative process—it seems that the more the witness models, the more she recalls. “Without the plan”, she says, referring to the 3D visualisation, “I could not have remembered”. The video report successfully situates Forensic Architecture’s findings by laying bare the technical (such as 3D modelling and satellite imagery analysis) and subjective (such as memory, interview and interpretation) processes that combine to generate the high resolution, 3D reconstruction of the drone strike shown at the end of the video. Without this insight, the accuracy and purpose of such a sophisticated visualisation could easily be misread (Figs. 1, 2).

Fig. 1

Digital reconstruction of a US drone strike which hit a home in Mir Ali, North Waziristan, Pakistan, killing five people. One of the surviving witnesses, pictured, worked with Forensic Architecture in Dusseldorf to build a digital model of her former home. Published on 16 April 2013. Image courtesy of Forensic Architecture © 2020 Forensic Architecture

Fig. 2

A rendered scene from within the digital reconstruction of the witness’ home. Published on 16 April 2013. Image courtesy of Forensic Architecture © 2020 Forensic Architecture

Show your workings

When we understand how facts are made, where their weaknesses lie, and what are the limits of what can be said, we can better construct and defend them (Weizman 2019).

An investigation’s methodology is the counterpart to its findings. A good methodology not only provides a history of actions taken, invites critique and makes the results potentially replicable—importantly, it also engages with the unknowns and problematises the research and its politics. In other words, it helps illuminate the where, why and how of knowledge production.

For instance, in a data-driven open source investigation the methodology could engage with: the provenance of the data; the decisions behind the data; the stakeholders represented; the data not included in the investigation and the reasons why; the rationale behind why the findings were presented in a particular way. In doing so, this methodology not only focuses on how certain conclusions were reached but also, vitally, on the fact that the data which supports them connects to real bodies, real systems and real power structures in the wider world.

Design for ambiguity

Open source investigations engage with real-world ‘muddy data’ and uncertainties. Working with unconventional sources, often at scale, means handling documentation that can vary hugely in medium, format, resolution and verifiability. Is it possible to design for different kinds of knowledge and the diverse ambiguities they present, throughout the investigative process?

For instance, it is essential to critically reflect on the legitimisation and the practices of verification—the technique of authenticating a source or information. Verification is an essential process in open source investigations but the language that surrounds it is normative, which is to say, something is either verified or not verified. What are the standards that determine if a source or a piece of documentation is verified? Who decides these standards and who implements them? Importantly, on whom—perhaps external to the investigation team—could these decisions have a negative impact? Experimenting with gradiented verification, or finding new ways to include materials that can neither be verified nor disproved by the investigator, is another way to work with, not against, ambiguities. Or, to “stay with the trouble!” as Donna Haraway might say (Haraway 2016, 1).

For example, the London-based civilian casualty monitor Airwars uses gradiented verification in its assessments of local reports of civilian casualties from US-led Coalition airstrikes in Iraq and Syria. The NGO’s verification system, which ranges from “Confirmed” (conceded by a belligerent) and “Fair” (multiple, credible sources) to “Weak” (single source) and “Discounted”, allows it to maintain a public record of all allegations. Tens of civilian deaths from incidents initially assessed by the NGO as Weak have been subsequently conceded by the military Coalition. Had Airwars excluded these incidents because they could not be fully verified, the civilian deaths might never have been investigated (Airwars 2019) (Fig. 3).

Fig. 3

Gradiented verification in Airwars’ online archive of civilian casualty allegations and related military claims. Screenshot dated January 2020, reproduced here with permission from Airwars

Such an approach recognises that a high occurrence of unverifiable sources or materials can be evidence of the conditions of reporting, methods of data collection or the position of the investigator (as in geographic, temporal, linguistic or cultural) vis-à-vis the subjects of the investigation. Or, that it could simply be indicative of the limits of the investigation (as in resources, time or expertise).
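Airwars has published its grading categories but not, to our knowledge, any implementation of them; the value of the approach is nonetheless easy to see when the scale is treated as a data structure rather than a binary verified/unverified flag. The sketch below is a deliberately simplified illustration of gradiented verification: the function name, its inputs and the decision rules are hypothetical stand-ins for Airwars' much richer assessment criteria.

```python
from enum import Enum

class Grade(Enum):
    """A graded scale instead of a binary verified/unverified flag."""
    CONFIRMED = "conceded by a belligerent"
    FAIR = "multiple, credible sources"
    WEAK = "single source"
    DISCOUNTED = "reporting judged unreliable"

def grade_incident(source_count, conceded_by_belligerent, sources_credible):
    """Assign a provisional grade to a civilian casualty allegation.

    Weakly sourced allegations keep a grade rather than being dropped,
    so they remain on the public record and can be upgraded later."""
    if conceded_by_belligerent:
        return Grade.CONFIRMED
    if source_count >= 2 and sources_credible:
        return Grade.FAIR
    if source_count == 1:
        return Grade.WEAK
    return Grade.DISCOUNTED
```

The design choice that matters here is that every allegation receives a grade and stays in the archive: an incident initially graded `WEAK` can be re-graded `CONFIRMED` if a belligerent later concedes it, which is precisely the trajectory of the incidents the article describes.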

Outwardly, the language of verification matters greatly because of its effect on the integrity and persuasiveness of an investigation’s findings. As such, it should be treated with extreme care. For example, in the process of geolocation, a landmark similar to one depicted in a user-generated video is identified in satellite imagery. Should the landmark in the video be labelled “the same as” or “consistent with” the visually similar landmark in the satellite imagery? Put differently, how can ambiguity be communicated? Would the phrase “consistent with” make the findings less persuasive or, in fact, could it improve their integrity by increasing their transparency?

The visual language of verification is no less important. Indeed, the “coloured box culture” of open source verification (in which coloured boxes are drawn around landmarks to show how they recur in different images) is both effective and, arguably, overused. Is there a clearer or more nuanced way to communicate that the images presented do not “speak for themselves” and that they are interpretations which may require extra or expert knowledge to be read? (Kurgan 2013, 25).

Invent new ways to reference and address the material economy behind the data and investigation

There are deep disparities in power, risk and reward between the actors who populate open source investigations—from lead investigators, to junior researchers, to the people represented in the research and to those responsible for communicating the findings. In short, open source investigators risk reproducing the very power asymmetries they seek to unsettle (Rahman and Ivens 2020).

Some of these power differentials are visible. They may be evident in the choice of investigative topic, the makeup of the investigative teams, or the choice of digital tools and mechanisms for participation in investigations which too often exclude those unable to digitally contribute. Are the people who generated the data or documentation absent from its final representation? And, are they thus less likely to receive attention and funding?

The responsibility to recognise and address the disparity in power lies with more than the investigators and researchers. Funders themselves can incentivise poor practice by overvaluing what is new and innovative while overlooking the less glamorous aspects of investigative work such as the maintenance of tools and databases or the routine upkeep of servers and other hardware. At the same time, NGOs who engage in more traditional human rights work can be pressured by funders to have a digital component to their investigations in order to keep up with what is perceived to be ‘state of the art’. This last point is addressed in more detail in McDermott et al. (2019). They note how “this is deeply concerning, particularly for those organisations that do not have the capacity (in terms of time, money, expertise, and technological capacity) and inclination to engage with the difficult processes of collection and verification of open source evidence” (McDermott et al. 2019).

Practice equity in attribution

In not-so-subtle ways, attribution reproduces power relations both internal and external to the investigation team. Giving credit to those who have contributed to an investigation is an important part of referencing and acknowledging the labour that made it possible, particularly since credit is currency when it comes to future employment or funding opportunities (see CLEAR 2016). As D’Ignazio and Klein note, “making labor visible” has particular pertinence “in light of the fact that women and other underrepresented groups have been notoriously excluded from sharing in credit for scientific work” (D’Ignazio and Klein 2016, 3). The same could be said for the predominantly male fields of journalism and technology in which many open source investigators work (Smith 2019).

Crediting who worked on which part of an investigation, whose data made it possible and whose tools and methodologies were used is a start. However, practicing equity in attribution means questioning not only who contributed but what counts as a contribution. In other words, what is understood and valued as labour throughout an investigation? Or, as the community technologist Rigoberto Lara Guzmán and the anthropologist Sareeta Amrute argue in ‘How to Cite Like a Badass Tech Feminist Scholar of Color’, practicing equity requires “unsettling” entrenched ideas of who during an investigation is a research subject and who is an “expert” (Guzmán and Amrute 2019).

For instance: who developed the tools, systems and workflows an investigation relies on? Who translated the research materials? On which local contacts did the investigation depend, and who sent essential information?

Since 2016, a US-based group called the Information Maintainers has worked to acknowledge and better resource the unglamorous task of caring for information and, importantly, caring for the carers (Acker et al. 2019). For the Information Maintainers, care can be understood as repair, maintenance and attention. Perhaps comparably, for CLEAR, care is “a form of political and ethical practice that ‘holds things together’ (de la Bellacasa 2011, 90; Martin et al. 2015)” (Liboiron et al. 2017). In the context of an open source investigation, care work or holding things together encompasses a diverse range of tasks from tool building, data cleaning and file archiving to updating and fixing code as well as the maintenance of servers and other hardware, perhaps long after the investigation findings have been published. Other tasks that count as care work include supporting colleagues and volunteers both informally and formally through trainings and reviews, organising meetings, sending email reminders, thanking people, listening to concerns voiced and taking them seriously, and so on.

Open source investigations into humanitarian crises or conflicts often involve sustained work with graphic or otherwise distressing content. In this context, care work is heightened and its acknowledgment can feel particularly important because of the emotional labour involved (Fig. 4).

Fig. 4

White board with traces of the Civic Laboratory for Environmental Action Research’s approach to equity in author order. Photo by CLEAR, Creative Commons Attribution-ShareAlike 4.0 International License

In this vein, CLEAR published the paper ‘Equity in Author Order Protocol’, which lays out a number of standards transferable to open source investigations. The protocol covers principles such as deciding author order through consensus-based decision making as a way to (re)distribute power, considering care work as a form of labour and, when all else seems equal, considering the “different social markers associated with oppression and privilege for different groups of people”. In the context of open source investigations, it could be helpful to ask the following questions adapted from CLEAR’s guidelines:

  • Who needs the capital of attribution more? Consider the immediate and long-term impact of attribution.

  • Which unsung or underfunded groups do you want to promote or highlight?

  • Is this a unique opportunity for particular people to be recognised?

  • Who is being paid or unpaid for this work?

(CLEAR 2016)

Find new ways of looking after ourselves and others

A critical feminist ethics of care focuses on the moral challenge of listening attentively to all of those whose needs exist, intertwined, in a given time and place. While we cannot respond to all of the needs of everyone, the task is to judge with care by considering who will be harmed or isolated by our actions or policies in a given, particular context. (Robinson 2018, 7)

There has been a clear shift in the language around open source investigations, away from the Hippocratic copy-and-paste principle of “do no harm” towards a more realistic principle of harm reduction. The shift, echoed in related fields such as digital security, acknowledges that the potentially harmful impacts of open source investigations are continually evolving in ways that are unknowable (The Engine Room 2018). As such, the obligation to mitigate harm must also evolve. Such an approach recognises that novel investigatory techniques and tools make potential harms harder to predict.

Care for yourself

In the words of African-American lesbian poet, librarian and activist Audre Lorde, “caring for myself is not self-indulgence, it is self-preservation, and that is an act of political warfare” (Lorde 2017, 130). Open source investigative techniques are often used in moments of violent conflict or crisis. Investigators will analyse events at a granular level: poring over the documentation—watching, reading, hearing the violence—indirectly experiencing it. This can lead to vicarious trauma, a response to the accumulation of exposure to the pain experienced by others. Both personal strategies and institutional shifts in approach can profoundly reduce the risk of developing vicarious trauma. First Draft News and Eyewitness Media Hub have published a series of guides on reducing vicarious trauma when working with distressing material (see Dubberley et al. 2015). Importantly, by caring for yourself you are not only looking after yourself but also creating space for others to care for themselves.

Talking about care or psychosocial resilience in open source investigations can be met with resistance. Alexa Koenig, the Executive Director of the Human Rights Center at the University of California, Berkeley, for example, counters this in resilience and professional trainings by including care within a holistic security framework. “I explain that security in open source activities is tripartite—physical, digital and psychosocial”, writes Koenig, “and that they are like overlapping Venn diagrams. When one is affected, the other two usually are as well” (A Koenig 2019, personal communication, 29 December).

Let empathy inform decisions

Radical empathy is… a learned process of direct and deep connection between the self and another that emphasises human commonality through “thinking and feeling into the minds of others” (Caswell and Cifor 2016, 30).

Those generating data may have very little control over how it is used in an investigation. In this section, we question the ethics of representation and of consent, or more often a lack of consent. Who decides how a video is used and shared? These decisions are rarely made by the person who is in the video or the person who made the video. Nevertheless, it is the person represented or the person carrying out the on-ground work whose safety is potentially affected, possibly without their awareness or ability to consent.

The questions below, adapted from Caswell and Cifor’s article, are intended as a guide to applying radical empathy to decision making in open source investigations. As Caswell and Cifor explain, it is not only the record creator but the subject of the record, the user of the record or the wider community that might have conflicting wishes. In order to address this, “in a feminist approach, each one of these parties is considered empathetically and in relation to each other and to dominant power structures before archival decisions are made” (Caswell and Cifor 2016, 34).

  • What are the desires and needs of the record creators? Would they want this material to be used in this way? Would they want this material to be preserved indefinitely?

  • When highlighting a record or data set: what are the consequences for those not being highlighted or represented? Who might be silenced by an investigation?

  • When publishing an investigation, what are the personal consequences or affective experiences of those featured in the reporting? For example, in an investigation into the destruction of houses, what will the impact be on those who might first learn that their house has been destroyed through the published investigation? Building this consideration into an investigation can be as simple as giving those affected the time and space to be heard, both in the research process and in the presentation of the findings. Indeed, at the core of the procedural justice literature is the idea that most people simply want to be heard, and heard fairly, even when decisions are made contrary to their wishes (A Koenig 2019, personal communication, 29 December).

“Prioritise those proximate!”

“Prioritise those proximate!” urges D’Ignazio over a Skype call during the research for this article (C D’Ignazio 2019, personal communication, 3 September). How can the needs, well-being and safety of those most proximate to the violence being investigated be prioritised at every stage of an open source investigation?

  • Whose privacy is a concern?

Will the investigation dramatically increase or alter the visibility of people featured in the investigation, or otherwise proximate to it, and how? Propelling into scrutiny, even positively, a tweet from an account with under 50 followers is different from doing so for an account with over 1000 followers. It is reasonable to assume that an account holder with over 1000 followers has lower expectations of privacy than one with 50 followers. If included in an investigation’s findings, how can such tweets be handled differently? One approach increasingly taken by news outlets is to embed tweets into online articles rather than saving and re-uploading the content, thus leaving the author of a tweet the option to delete it or change its privacy settings.
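As an illustration only, the follower-count heuristic above could be encoded as a simple triage rule. The thresholds mirror the example figures in this section, and the handling categories (`paraphrase-anonymised`, `embed-live` and so on) are assumptions for demonstration, not an established editorial standard:

```python
# Hypothetical sketch: triaging how a tweet is presented in published
# findings, using the illustrative 50 / 1000 follower thresholds above.
def tweet_presentation(followers: int, consent_obtained: bool = False) -> str:
    """Return a suggested handling policy for a tweet in an investigation."""
    if consent_obtained:
        return "quote-with-attribution"  # the author has agreed to be named
    if followers < 50:
        # Low-visibility account: paraphrase and withhold identifying details.
        return "paraphrase-anonymised"
    if followers <= 1000:
        # Middle ground: embed, so the author retains control over deletion
        # and privacy settings.
        return "embed-live"
    # High-visibility account: lower expectation of privacy, but embedding
    # still preserves the author's control over the content.
    return "embed-live-with-handle"
```

In practice such a rule would only be a starting point for a human decision, not a substitute for one.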

  • Who can access the published report?

Publish in the language(s) of the sources, people and places featured in an investigation. This is critical for names that require transliteration between languages such as Arabic and English, as one name can be transliterated in multiple ways, making it hard to trace. Resources are often scarce by the end of an investigation, so budget from the beginning for professional translation and the skill sets needed to produce materials in multiple languages. How accessible is the report, really? In 2019 half of the world’s population did not use the internet, and women, disabled people and Indigenous people were overrepresented in this group. Consider alternative ways to circulate findings offline, such as posting physical copies to the individuals or communities affected by the events documented in an investigation or, if appropriate, organising in-person meetings or a local event. For example, when Forensic Architecture was invited to conduct an investigation into the police killing of Mark Duggan, a young black man whose death was a catalyst for the 2011 London riots, the group chose to present their findings at a community event held in Duggan’s local borough of Tottenham before publishing details online. Investigative protocols from other specialisms are transferable. The Argentine Forensic Anthropology Team (Equipo Argentino de Antropología Forense) has developed a notification protocol designed to prioritise the needs, well-being and safety of the family members of disappeared or missing people whose remains the group has identified (see the Argentine Forensic Anthropology Team notification protocol; footnote 8) (Fig. 5).

Fig. 5

“Where there is no justice, there is just us!” Stafford Scott, event organiser and campaigner. An invitation, tweeted by Forensic Architecture, to the community event for the investigation into the police killing of Mark Duggan. Screenshot dated 24 November 2019, reproduced here with permission from Forensic Architecture
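To make the transliteration point above concrete, here is a minimal, illustrative sketch of how an investigator might group differing English transliterations of the same Arabic name. The normalisation rules are an assumption for demonstration only; they are not a complete or authoritative transliteration scheme:

```python
import re

def name_key(name: str) -> str:
    """Reduce a transliterated name to a crude 'skeleton' so that common
    spelling variants of the same name map to the same key."""
    n = re.sub(r"[^a-z]", "", name.lower())  # drop punctuation and spaces
    n = re.sub(r"[aeiou]", "", n)            # vowels vary most between schemes
    n = re.sub(r"(.)\1+", r"\1", n)          # collapse doubled consonants
    return n

# Three common transliterations of the same name collapse to one key,
# making mentions across sources easier to trace.
variants = ["Mohammed", "Muhammad", "Mohamed"]
keys = {name_key(v) for v in variants}
print(keys)  # a single shared key
```

A crude key like this will also merge some genuinely different names, so it is best used to surface candidate matches for human review rather than to assert identity.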

  • Who might need different care in an investigative team?

Trauma is an intersectional issue. More considerate, thoughtful and sustainable ways of working might be necessary for those in a team who have already experienced trauma themselves or generationally, or who have a personal connection to the subject of an investigation. For example, through the Digital Verification Corps, set up by Amnesty International in 2016, the NGO has worked with six universities and their students on open source investigations, often on the verification of graphic or distressing user-generated content. The diverse backgrounds of the student investigators have been critical to the project’s success. Mindful of these special circumstances, and of the fact that the students manage other commitments and pressures alongside the work, Amnesty has paid particular attention to the psychosocial aspects of investigations and worked to train the groups and their coordinators to prevent and identify vicarious stress and trauma in themselves and each other (see Amnesty Citizen Evidence Lab).

In lieu of a conclusion

These provocations are a sample of how intersectional feminist thought and activism can aid open source investigators in reimagining ways of working. The article is an invitation to share experiences, reflections and ideas. As D’Ignazio and Klein note, feminism is one approach that intersects with and is informed by other social justice movements—including Queer, anti-racism, anti-ableism, Indigenous and anti-colonial—and their specific bodies of knowledge.


  1.

    A 2018 investigation by Syrian Archive and Knack into the shipment from Europe to Syria of chemicals that can be used to produce chemical weapons resulted in the conviction of three Belgian companies that had violated EU sanctions law as well as an internal audit of the Belgian customs system. In 2019, Syrian Archive alongside TRIAL International and the Open Society Justice Initiative filed criminal complaints with prosecutors in Germany and Belgium to request that they open further investigations into the role of European companies in separate shipments of chemicals to Syria via Switzerland, which at the time had not been subject to EU sanctions. The requests were evidenced by open source information collected by the Syrian Archive. To date, the Syrian Archive has collected 3,578,591 digital records, of which 651,322 it has analysed and 8,249 it has verified (J Deutch 2020, personal communication, 4 March; Havlik 2019).

  2.

    Doxxing is searching for and publishing identifying information about someone, typically with malicious intent to threaten, silence or do real harm (Rahman and Ivens 2020).

  3.

    For the original source of the definition see ‘Social justice impact of archives: A preliminary investigation’ by Wendy M Duff, Andrew Flinn, Karen Emily Suurtamm and David A Wallace (Duff et al. 2013).

  4.

    For a critique of intersectional feminism see ‘Why I’m giving up on intersectional feminism’ by Gordon (2018).

  5.

    For more on missing data see the work of the Nigerian-American artist and researcher, Mimi Onuoha. (Onuoha 2018).

  6.

    In the UK in 2019, women in journalism earned on average 11.2% less than men and held 40% of jobs; in programming and software development, female professionals earned 6.8% less than male professionals and held only 12% of jobs; in information technology, female professionals earned 7.1% less than male professionals and held 18% of jobs. The same cannot be said for law, another field in which open source investigators work. In 2019, female legal professionals held over half of all jobs yet earned on average 16.8% less than their male peers (Smith 2019).

  7.

    One of the leaders of the discussion around ‘harm reduction’ has been Alexa Koenig at the Human Rights Center at the University of California, Berkeley.

  8.

    The Argentine Forensic Anthropology Team notification protocol includes “the compilation, data verification, and review of all case materials prior to scheduling a notification; the completion of an integrated, multidisciplinary identification report in the family’s language; risk assessment of the family receiving a notification, be it health related or threat from other persons; conducting the notification in person with the family; providing psychological and medical support; and explaining the repatriation process, among others” (Solís et al. 2016).


  1. Acker, A., H. Arnold, J. Castro, S. Galvan, P. Hswe, J. Meyerson, B. Nowviskie, M. Lassere, D. Olson, M.A. Parsons, A. Russell, L. Vinsel, and D.J. Wright. 2019. Information Maintenance as a Practice of Care: An Invitation to Reflect and Share. Zenodo. Accessed 8.28.19.

  2. Airwars. 2019. Civilian Casualties Archive. Airwars. Accessed 8.28.19.

  3. Alcoff, L. 1988. Cultural Feminism versus Post-structuralism: The Identity Crisis in Feminist Theory. Signs 13: 405–436.

  4. Amnesty International. 2018. Troll Patrol Findings. Amnesty DECODERS. Accessed 1.17.20.

  5. Caswell, M., and M. Cifor. 2016. From Human Rights to Feminist Ethics: Radical Empathy in the Archives. Archivaria 81: 23–43.

  6. CLEAR. 2016. Equity in Author Order. Civic Laboratory for Environmental Action Research. Accessed 11.15.19.

  7. CLEAR. 2017. Feminist and Anti-colonial Science. Civic Laboratory for Environmental Action Research. Accessed 11.15.19.

  8. Collins, P.H. 1990. Black Feminist Thought: Knowledge, Consciousness, and the Politics of Empowerment. Boston: Unwin Hyman.

  9. Crenshaw, K. 1989. Demarginalizing the Intersection of Race and Sex: A Black Feminist Critique of Antidiscrimination Doctrine, Feminist Theory and Antiracist Politics. Chicago: University of Chicago Legal Forum.

  10. D’Ignazio, C., and L. Klein. 2016. Feminist Data Visualization. Presented at the Workshop on Visualization for the Digital Humanities (VIS4DH), IEEE, Baltimore.

  11. D’Ignazio, C. 2017. What Would Feminist Data Visualization Look Like?. Medium. Accessed 1.17.20.

  12. D’Ignazio, C., and L. Klein. 2018. Data Feminism. MIT Press Open. Accessed 2.28.20.

  13. de la Bellacasa, M.P. 2011. Matters of Care in Technoscience: Assembling Neglected Things. Social Studies of Science 41 (1): 85–106.

  14. Deutch, J., and H. Habal. 2018. The Syrian Archive: A Methodological Case Study of Open-Source Investigation of State Crime Using Video Evidence from Social Media Platforms. State Crime Journal 7: 46–76.

  15. Duff, W., A. Flinn, K.E. Suurtamm, and D.A. Wallace. 2013. Social justice impact of archives: A preliminary investigation. Archival Science 13: 317–348.

  16. Dubberley, S., E. Griffin, and H.M. Bal. 2015. Making Secondary Trauma a Primary Issue: A Study of Eyewitness Media and Vicarious Trauma on the Digital Frontline. Eyewitness Media Hub. Accessed 1.17.20.

  17. Forensic Architecture. 2013. Drone Strike In Mir Ali. Forensic Architecture. Accessed 1.17.20.

  18. Forensic Architecture. n.d. Methodology: Situated Testimony. Forensic Architecture. Accessed 1.17.20.

  19. Gordon, T.J. 2018. Why I’m Giving Up on Intersectional Feminism. Quartzy. Accessed 12.10.19.

  20. GOSH. 2017. GOSH Code of Conduct. Gathering for Open Science Hardware. Accessed 12.9.19.

  21. Guzmán, R.L., and S. Amrute. 2019. How to Cite Like a Badass Tech Feminist Scholar of Color. Medium. Accessed 1.17.20.

  22. Haraway, D. 1988. Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective. Feminist Studies 14: 575–599.

  23. Haraway, D. 2016. Staying with the Trouble: Making Kin in the Chthulucene. Durham: Duke University Press.

  24. Havlik, B. 2019. Reported 2014 Chemical Shipments to Syria Raise Questions over EU Sanctions. Open Society Justice Initiative. Accessed 1.17.20.

  25. Heyns, C. 2015. Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions: Use of information and communications technologies to secure the right to life. UN Human Rights Council. Accessed 12.9.19.

  26. Human Rights Center at the University of California, Berkeley. forthcoming. International Protocol on Open Source Investigations.

  27. Koenig, A. 2017. Harnessing Social Media as Evidence of Grave International Crimes. Medium. Accessed 12.9.19.

  28. Kurgan, L. 2013. Close Up at a Distance: Mapping, Technology, and Politics. Cambridge: MIT Press.

  29. Liboiron, M., J. Ammendolia, K. Winsor, A. Zahara, H. Bradshaw, J. Melvin, C. Mather, N. Dawe, E. Wells, F. Liboiron, B. Fürst, C. Coyle, J. Saturno, M. Novacefski, S. Westcott, and L. Grandmother. 2017. Equity in Author Order: A Feminist Laboratory’s Approach. Catalyst 3: 1–17.

  30. Lorde, A. 2017. A Burst of Light: And Other Essays. Mineola: Dover Ixia Press.

  31. Martin, A., N. Myers, and A. Viseu. 2015. The Politics of Care in Technoscience. Social Studies of Science 45 (5): 625–641.

  32. McDermott, Y., D. Murray, and A. Koenig. 2019. Digital Accountability Symposium: Whose Stories Get Told, and by Whom? Representativeness in Open Source Human Rights Investigations. Opinio Juris. Accessed 1.17.20.

  33. Norlock, K. 2019. Feminist Ethics. Stanford Encyclopedia of Philosophy. Accessed 12.9.19.

  34. Onuoha, M. 2018. Missing-datasets. GitHub. Accessed 10.28.19.

  35. Rahman, Z., and G. Ivens. 2020. Ethics in Open Source Investigations. In Digital Witness: Using Open Source Information for Human Rights Investigation, Documentation, and Accountability, ed. A. Koenig, S. Dubberley, and D. Murray. Oxford: Oxford University Press.

  36. Robinson, F. 2018. A Feminist Practical Ethics of Care. In The Oxford Handbook of International Political Theory, ed. R. Eckersley. Oxford: Oxford University Press.

  37. Sen, A.K. 2019. Wanted: A code of ethics for open source researchers. Atlantic Council. Accessed 12.9.19.

  38. Smith, R. 2019. Gender Pay Gap in the UK. Office for National Statistics. Accessed 2.28.20.

  39. Solís, C.E.O., M. Doretti, and K. Hernandez. 2016. A144 Identification Notifications and Their Applicability to Families of Missing Migrants.

  40. Syrian Archive. n.d. Syrian Archive. Syrian Archive.

  41. The Engine Room. 2018. Ties That Bind: Organisational Security for Civil Society. The Engine Room. Accessed 1.17.20.

  42. Todd, S. 2018. Twitter Stock Tumbles After Analyst Calls It ‘Harvey Weinstein of Social Media’. Variety. Accessed 1.17.20.

  43. Weizman, E. 2019. Open Verification. e-flux. Accessed 8.1.19.


The article is a direct response to an earlier article by Catherine D’Ignazio titled “What would feminist data visualization look like?”, which became the book “Data Feminism”, co-authored with Lauren F. Klein. Some of the structure of D’Ignazio’s original article has been retained. The article also draws on ideas discussed in the book chapter ‘Ethics in Open Source Investigations’ by Zara Rahman and Gabriela Ivens for Digital Witness: Using Open Source Information for Human Rights Investigation, Documentation, and Accountability. Special thanks to our generous reviewers (in alphabetical order): Leenah Bassouni, Rebecca Echevarria, Alexa Koenig, Martyna Marciniak and Zara Rahman. Many thanks as well to Piper Haywood for copy editing and reviewing. The article was community reviewed by members of the nascent Feminist Open Source Investigations Group. Thanks to Matthew Battles, Jeff Deutch and Robert Trafford for their insights during the research process.

Author information



Corresponding author

Correspondence to Sophie Dyer.



A list of the digital guides, protocols and codes of conduct used in the research for this article, including those not given as examples in the main text.

  • Association for Progressive Communications

  • ‘Feminist Principles of The Internet’, 2016


  • The Argentine Forensic Anthropology Team (Equipo Argentino de Antropología Forense)

  • ‘Identification Notifications and Their Applicability to Families of Missing Migrants’, 2016


  • Amnesty International, Citizen Evidence Lab

  • ‘Well-being’, n.d.


  • The Human Rights Center at the University of California, Berkeley

  • ‘International Protocol on Open Source Investigations (Berkeley Protocol)’, forthcoming


  • Civic Laboratory for Environmental Action Research (also known as CLEAR)

  • ‘Equity in Author Order’, 2016


  • ‘Guidelines: Designing Equitable Scientific Tools’, 2017

  • ‘How to Run a Feminist Science Lab Meeting’, 2017


  • DART Centre for Journalism and Trauma

  • ‘Handling Traumatic Imagery: Developing a Standard Operating Procedure’, 2017


  • Data and Society

  • ‘How to Cite Like a Badass Tech Feminist Scholar of Color’, 2019


  • Design Justice Network

  • ‘Design Justice Network Principles’, n.d.


  • The Engine Room

  • ‘Ties That Bind: Organisational Security for Civil Society’, 2018


  • ‘Investigative Web Research’, 2017


  • ‘Technology Tools in Human Rights’, 2017


  • Eyewitness Media Hub

  • ‘Making Secondary Trauma a Primary Issue: A Study of Eyewitness Media and Vicarious Trauma on the Digital Frontline’, n.d.


  • First Draft

  • ‘Journalism and Vicarious Trauma: A Guide for Journalists, Editors and News Organisations’, 2017


  • Global Open Science Hardware

  • ‘GOSH Manifesto’, n.d.


  • ‘GOSH Code of Conduct’, 2017


  • International Organization for Migration

  • ‘Fatal Journeys Volume 3 Part 1: Improving Data on Missing Migrants’, 2017


  • OS4HR

  • ‘Digital Accountability Symposium: Whose Stories Get Told, and by Whom? Representativeness in Open Source Human Rights Investigations’, 2019


  • Privacy International

  • ‘Reclaiming Privacy: A Feminist Manifesto’, 2019

  • ‘From Oppression to Liberation: Reclaiming the Right to Privacy’, 2018

  • Public Lab

  • Public Lab Code of Conduct, 2016


  • Trans*H4CK

  • Trans*H4CK Code of Conduct, n.d.



Cite this article

Dyer, S., Ivens, G. What would a feminist open source investigation look like? Digi War 1, 5–17 (2020).



  • Feminism
  • Care
  • Human rights
  • Investigations
  • Open source