The ethnographer and the algorithm: beyond the black box

Abstract

A common theme in social science studies of algorithms is that they are profoundly opaque and function as “black boxes.” Scholars have developed several methodological approaches to address algorithmic opacity. Here I argue that we can explicitly enroll algorithms in ethnographic research, which can shed light on unexpected aspects of algorithmic systems—including their opacity. I delineate three meso-level strategies for algorithmic ethnography. The first, algorithmic refraction, examines the reconfigurations that take place when computational software, people, and institutions interact. The second strategy, algorithmic comparison, relies on a similarity-and-difference approach to identify the instruments’ unique features. The third strategy, algorithmic triangulation, enrolls algorithms to help gather rich qualitative data. I conclude by discussing the implications of this toolkit for the study of algorithms and the future of ethnographic fieldwork.


Notes

  1. While there are many variations on what people mean by ethnography, ethnographers across disciplines typically agree on several points. Epistemologically, many ethnographers rely on some version of “grounded theory” (Glaser and Strauss 1967, but see also Burawoy 1998, Timmermans and Tavory 2012 for different approaches), starting with a preliminary research question that evolves based on the data collected during fieldwork. Theoretically, ethnographic methods share multiple affinities with symbolic interactionism, which understands individual interactions as a key building block of social life (Mead 1967; Blumer 1969; Goffman 1959). In terms of methods, ethnography often involves participant observation, in which observers actively engage in the activities of the people they study.

References

  • Abebe, R., Barocas, S., Kleinberg, J., Levy, K., Raghavan, M., and Robinson, D.G. (2020). Roles for computing in social change. In Conference on Fairness, Accountability, and Transparency (FAT* ‘20), January 27–30, 2020, Barcelona, Spain. ACM, New York, NY, USA, 9 pages. https://arxiv.org/pdf/1912.04883.pdf.

  • Ananny, M., & Crawford, K. (2016). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, 20(3), 973–989.

  • Anderson, C. W. (2011). Between creative and quantified audiences: Web metrics and changing patterns of newswork in local US newsrooms. Journalism, 12(5), 550–566.

  • Anderson, C. W., & Kreiss, D. (2013). Black boxes as capacities for and constraints on action: Electoral politics, journalism, and devices of representation. Qualitative Sociology, 36, 365–382.

  • Andrejevic, M. (2003). Reality TV: The work of being watched. New York: Rowman & Littlefield Publishers.

  • Angwin, J., Larson, J., Mattu, S., and Kirchner, L. (2016). Machine bias. ProPublica, May 23, 2016. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.

  • Barley, S. R. (1986). Technology as an occasion for structuring: Evidence from observations of CT scanners and the social order of radiology departments. Administrative Science Quarterly, 31(1), 78–108.

  • Barocas, S., Rosenblat, A., boyd, d., Gangadharan, S.P., and Yu, P. (2014). Data & civil rights: Technology primer. Data & Civil Rights Conference, October 2014. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2536579.

  • Barocas, S., & Selbst, A. D. (2016). Big data’s disparate impact. California Law Review, 104, 671–732.

  • Baym, N. K. (2018). Playing to the crowd: Musicians, audiences, and the intimate work of connection. New York: New York University Press.

  • Beaulieu, A. (2010). From co-location to co-presence: Shifts in the use of ethnography for the study of knowledge. Social Studies of Science, 40(3), 453–470.

  • Bechky, B. A. (2003). Object lessons: Workplace artifacts as representations of occupational jurisdiction. American Journal of Sociology, 109(3), 720–752.

  • Beer, D. (2018). The data gaze: Capitalism, power, and perception. London: Sage.

  • Bellanova, R. (2017). Digital, politics, and algorithms: Governing digital data through the lens of data protection. European Journal of Social Theory, 20(3), 329–347.

  • Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim code. Cambridge: Polity.

  • Bishop, S. (2019). Managing visibility on YouTube through algorithmic gossip. New Media & Society, Online First.

  • Blumer, H. (1969). Society as symbolic interaction. In H. Blumer (Ed.), Symbolic interaction. Berkeley: University of California Press.

  • Boellstorff, T., Nardi, B., Pearce, C., & Taylor, T. L. (2012). Ethnography and virtual worlds: A handbook of method. Princeton: Princeton University Press.

  • Bourdieu, P. (1999). Understanding. Pp. 607–626 in The weight of the world: Social suffering in contemporary society. Stanford: Stanford University Press.

  • Brayne, S., and Christin, A. (2020). Technologies of crime prediction: The reception of algorithms in policing and criminal courts. Social Problems, Online First, 1–17.

  • Browne, S. (2015). Dark matters: On the surveillance of blackness. Durham, NC: Duke University Press.

  • Bucher, T. (2016). The algorithmic imaginary: Exploring the ordinary affects of Facebook algorithms. Information, Communication & Society, 20(1), 30–44.

  • Buolamwini, J., and Gebru, T. (2018). Gender shades: Intersectional accuracy disparities in commercial gender classification. Proceedings of Machine Learning Research 81, pp. 1–15. Conference on Fairness, Accountability, and Transparency. http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf.

  • Burawoy, M. (1998). The extended case method. Sociological Theory, 16(1), 4–33.

  • Burgess, J., & Green, J. (2018). YouTube: Online video and participatory culture. Cambridge: Polity.

  • Burrell, J. (2016). How the machine ‘thinks’: Understanding opacity in machine learning algorithms. Big Data & Society, 3(1), 1–12.

  • Burrell, J. (2009). The field site as a network: A strategy for locating ethnographic research. Field Methods, 21(2), 181–199.

  • Callon, M. (1986). Some elements of a sociology of translation: Domestication of the scallops and the fishermen of St. Brieuc Bay. Pp. 196–223 in J. Law (Ed.), Power, Action, and Belief: A New Sociology of Knowledge? Abingdon: Routledge.

  • Cetina, K. (2016). What if the screens went black? The coming of software agents. Working Conference on Information Systems and Organizations (ISO), December 2016, Dublin, Ireland, pp. 3–16. https://hal.inria.fr/hal-01619192/document.

  • Charmaz, K. (2006). Constructing grounded theory: A practical guide through qualitative analysis. New York: Sage Publications.

  • Chesterman, S. (2020). Through a glass, darkly: Artificial intelligence and the problem of opacity. Forthcoming, American Journal of Comparative Law. Retrieved from: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3575534.

  • Christin, A. (2020a). Metrics at work: Journalism and the contested meaning of algorithms. Princeton: Princeton University Press.

  • Christin, A. (2020b). What data can do: A typology of mechanisms. International Journal of Communication, 14(2020), 1115–1134.

  • Christin, A., and Lewis, R. (2020). The drama of metrics: Status and hierarchies among YouTube drama creators. Unpublished Manuscript.

  • Christin, A. (2018). Counting clicks: Quantification and variation in web journalism in the United States and France. American Journal of Sociology, 123(5), 1382–1415.

  • Christin, A. (2017). Algorithms in practice: Comparing web journalism and criminal justice. Big Data & Society, 4(2), 1–14.

  • Coleman, E.G. (2014). Hacker, hoaxer, whistleblower, spy: The many faces of Anonymous. London and New York: Verso Books.

  • Corbett-Davies, S., Pierson, E., Feller, A., and Goel, S. (2016). A computer program used for bail and sentencing decisions was labeled biased against blacks: It’s actually not that clear. The Washington Post, October 17, 2016.

  • Crawford, K., & Schultz, J. (2014). Big data and due process: Toward a framework to redress predictive privacy harms. Boston College Law Review, 55(1), 93–128.

  • Diakopoulos, N. (2013). Algorithmic accountability. Digital Journalism, 3(3), 398–415.

  • Diakopoulos, N., and Friedler, S. (2016). How to hold algorithms accountable. MIT Technology Review, Nov. 17, 2016.

  • Duffy, B. E. (2017). (Not) getting paid to do what you love: Gender, social media, and aspirational work. New Haven: Yale University Press.

  • Duffy, B.E., and Hund, E. (2015). ‘Having it all’ on social media: Entrepreneurial femininity and self-branding among fashion bloggers. Social Media + Society, 1-15.

  • Duneier, M. (2011). How not to lie with ethnography. Sociological Methodology, 41, 1–11.

  • Elish, M. C. (2019). Moral crumple zones: Cautionary tales in human-robot interaction. Engaging Science, Technology, and Society, 5(2019), 40–60.

  • Elish, M. C., & Watkins, E. A. (2020). Repairing innovation: A study of integrating AI in clinical care. Unpublished Manuscript.

  • Espeland, W. N., & Stevens, M. L. (1998). Commensuration as a social process. Annual Review of Sociology, 24(1), 313–343.

  • Eubanks, V. (2017). Automating inequality: How high-tech tools profile, police, and punish the poor. New York: St. Martin’s Press.

  • Foucault, M. (1975). Discipline and punish: The birth of the prison. New York: Vintage Books.

  • Fourcade, M., & Healy, K. (2017). Seeing like a market. Socio-Economic Review, 15(1), 9–29.

  • Gillespie, T. (2016). #Trendingistrending: When algorithms become culture. In R. Seyfert & J. Roberge (Eds.), Algorithmic Cultures: Essays on Meaning, Performance and New Technologies. Abingdon: Routledge.

  • Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Chicago: Aldine Publishing Company.

  • Goffman, E. (1959). The presentation of self in everyday life. New York: Doubleday.

  • Gray, M., & Suri, S. (2019). Ghost work: How to stop Silicon Valley from building a new global underclass. New York: Houghton Mifflin Harcourt.

  • Griesbach, K., Reich, A., Elliott-Negri, L., and Milkman, R. (2019). Algorithmic control in platform food delivery work. Socius 5.

  • Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. British Journal of Sociology, 51(4), 605–622.

  • Hannak, A., Soeller, G., Lazer, D., Mislove, A., and Wilson, C. (2014). Measuring price discrimination and steering on E-commerce web sites. IMC’14, November 5–7, 2014, Vancouver, BC, Canada. https://personalization.ccs.neu.edu/static/pdf/imc151-hannak.pdf.

  • Hannerz, U. (2003). Being there ... and there ... and there! Reflections on multi-site ethnography. Ethnography, 4, 201–216.

  • Hine, C. (2015). Ethnography for the internet. London: Bloomsbury.

  • Hjorth, L., Horst, H., Galloway, A., & Bell, G. (2017). The Routledge companion to digital ethnography. New York: Routledge.

  • Introna, L. D. (2016). Algorithms, governance, and governmentality: On governing academic writing. Science, Technology, & Human Values, 41(1), 17–49.

  • Irani, L. (2015). Difference and dependence among digital workers: The case of Amazon Mechanical Turk. South Atlantic Quarterly, 114(1), 225–234.

  • Kiviat, B. (2019). The moral limits of predictive practices: The case of credit-based insurance scores. American Sociological Review, 84(6), 1134–1158.

  • Knorr Cetina, K. (1999). Epistemic cultures: How the sciences make knowledge. Cambridge: Harvard University Press.

  • Knox, H., & Nafus, D. (Eds.). (2018). Ethnography for a data-saturated world. Manchester: Manchester University Press.

  • Kolkman, D. (2020). The (in)credibility of algorithmic models to non-experts. Information, Communication & Society, Online First, 1–17.

  • Kotliar, D. (2020). Who gets to choose? On the socio-algorithmic construction of choice. Science, Technology, & Human Values. Online First.

  • Kunda, G. (2006). Engineering culture: Control and commitment in a high-tech corporation (Revised ed.). Philadelphia: Temple University Press.

  • Lamont, M., Beljean, S., and Clair, M. (2014). What is missing? Cultural pathways to inequality. Socio-Economic Review, 1–36.

  • Lange, A.-C., Lenglet, M., & Seyfert, R. (2018). On studying algorithms ethnographically: Making sense of objects of ignorance. Organization, 26(4), 598–617.

  • Latour, B. (2010). The making of the law: An ethnography of the Conseil d’Etat. London: Polity.

  • Latour, B. (1999a). Pandora's hope: Essays on the reality of science studies. Cambridge: Harvard University Press.

  • Latour, B. (1999b). On recalling ANT. The Sociological Review, 47(1), 15–25.

  • Latour, B. (1991). Technology is society made durable. Pp. 103–131 in J. Law (Ed.), A Sociology of Monsters: Essays on Power, Technology and Domination. Abingdon, England: Routledge.

  • Latour, B. (1987). Science in action: How to follow scientists and engineers through society. Cambridge: Harvard University Press.

  • Latour, B., & Woolgar, S. (1986). Laboratory life: The construction of scientific facts. Princeton: Princeton University Press.

  • Leigh Star, S. (1999). The ethnography of infrastructure. American Behavioral Scientist, 43(3), 377–391.

  • Lewis R. (2018). Alternative influence: Broadcasting the reactionary right on YouTube. White paper, September. Data & Society Research Institute. https://datasociety.net/wp-content/uploads/2018/09/DS_Alternative_Influence.pdf.

  • Lewis, S. C., & Westlund, O. (2015). Actors, actants, audiences, and activities in cross-media news work: A matrix and a research agenda. Digital Journalism, 3(1), 19–37.

  • Lichterman, P. (2015). Interpretive reflexivity in ethnography. Ethnography, 18(1), 35–45.

  • Lyon, D. (2018). The culture of surveillance. Cambridge: Polity.

  • MacKenzie, D. (2019). How algorithms interact: Goffman’s ‘interaction order’ in automated trading. Theory, Culture & Society, 36(2), 39–59.

  • Malcolm, J. (1989). The journalist and the murderer. New York: Vintage Books.

  • Markham, A., & Baym, N. (2009). Internet inquiry: Conversations about method. Thousand Oaks: SAGE.

  • Marwick, A. (2013). Status update: Celebrity, publicity, and branding in the social media age. New Haven: Yale University Press.

  • Mau, S. (2018). The metric society: On the quantification of the social. Cambridge: Polity.

  • Mead, G.H. (1967 [1934]). Mind, self and society: From the standpoint of a social behaviorist. Chicago: University of Chicago Press.

  • Metz, C. (2015). Google is 2 billion lines of code—and it’s all in one place. Wired, September 16, 2015. Retrieved from: https://www.wired.com/2015/09/google-2-billion-lines-codeand-one-place/.

  • Mitchell, M., Wu, S., Zaldivar, A., Barnes, P., Vasserman, L., Hutchinson, B., Spitzer, E., Raji, I. D., and Gebru, T. (2019). Model cards for model reporting. ACM Proceedings of the Conference on Fairness, Accountability, and Transparency. 220-229. https://arxiv.org/abs/1810.03993.

  • Mols, B. (2017). In black box algorithms we trust (or do we?). Communications of the ACM. March 16, 2017. Retrieved from: https://cacm.acm.org/news/214618-in-black-box-algorithms-we-trust-or-do-we/fulltext.

  • Neff, G. (2012). Venture labor: Work and the burden of risk in innovative industries. Cambridge: MIT Press.

  • Neff, G., and Stark, D. (2003). Permanently beta: Responsive organization in the internet era. Pp. 173–188 in P. Howard and S. Jones (Eds.), Society Online: The Internet in Context. Thousand Oaks, CA: Sage.

  • Neyland, D. (2019). The everyday life of an algorithm. Cham: Palgrave Macmillan.

  • Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York: NYU Press.

  • O’Neil, C. (2016). Weapons of math destruction. New York: Crown.

  • Orlikowski, W. (2007). Sociomaterial practices: Exploring technology at work. Organization Studies, 28(9), 1435–1448.

  • Orlikowski, W. (2000). Using technology and constituting structures: A practice lens for studying technology in organizations. Organization Science, 11(4), 404–428.

  • Orr, J. (1996). Talking about machines: An ethnography of a modern job. Ithaca: Cornell University Press.

  • Pasquale, F. (2015). The black box society: The secret algorithms that control money and information. Cambridge: Harvard University Press.

  • Petre, C. (2015). The traffic factories: Metrics at Chartbeat, Gawker Media, and The New York Times. Tow Center for Digital Journalism. Retrieved from: https://academiccommons.columbia.edu/doi/10.7916/D80293W1.

  • Powles, J., and Nissenbaum, H. (2018). The seductive diversion of ‘solving’ bias in artificial intelligence. Medium, Dec. 7, 2018. https://onezero.medium.com/the-seductive-diversion-of-solving-bias-in-artificial-intelligence-890df5e5ef53.

  • Roberts, S. T. (2019). Behind the screen: Content moderation in the shadows of social media. New Haven: Yale University Press.

  • Rosenblat, A. (2018). Uberland: How algorithms are rewriting the rules of work. Berkeley: University of California Press.

  • Rosenblat, A., & Stark, L. (2016). Algorithmic labor and information asymmetries: A case study of Uber drivers. International Journal of Communication, 10, 3758–3784.

  • Sachs, S.E. (2019). The algorithm at work? Explanation and repair in the enactment of similarity in art data. Information, Communication & Society, 1–17.

  • Sandvig, C., Hamilton, K., Karahalios, K., and Langbort, C. (2014). Auditing algorithms: Research methods for detecting discrimination on internet platforms. Paper presented to Data and Discrimination: Converting Critical Concerns into Productive Inquiry, 64th Annual Meeting of the International Communication Association. May 22, 2014; Seattle, WA. Retrieved from: http://www-personal.umich.edu/~csandvig/research/Auditing%20Algorithms%20%2D%2D%20Sandvig%20%2D%2D%20ICA%202014%20Data%20and%20Discrimination%20Preconference.pdf.

  • Scholz, T. (2013). Digital labor: The internet as playground and factory. New York: Routledge.

  • Seaver, N. (2018). Captivating algorithms: Recommender systems as traps. Journal of Material Culture, Online First, 1–16.

  • Seaver, N. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society, 4(2), 1–12.

  • Shestakofsky, B. (2017). Working algorithms: Software automation and the future of work. Work and Occupations, 44(4), 376–423.

  • Siles, I., Segura, A., Solís, R., & Sancho, M. (2020). Folk theories of algorithmic recommendations on Spotify: Enacting data assemblages in the global south. Big Data & Society, 7(1), 1–15.

  • Silverman, J. (2020). Spies, lies, and stonewalling: What it’s like to report on Facebook. Columbia Journalism Review, July 1st, 2020. https://www.cjr.org/special_report/reporting-on-facebook.php.

  • Snow, D. A. (1980). The disengagement process: A neglected problem in participant observation research. Qualitative Sociology, 3, 100–122.

  • Stuart, F. (2020). Ballad of the bullet: Gangs, drill music, and the power of online infamy. Princeton: Princeton University Press.

  • Suchman, L., Blomberg, J., Orr, J. E., & Trigg, R. (1999). Reconstructing technologies as social practice. American Behavioral Scientist, 43(3), 392–408.

  • Sweeney, L. (2013). Discrimination in online ad delivery. ACM Queue, 11(3), 1–19.

  • Terranova, T. (2000). Free labor: Producing culture for the digital economy. Social Text, 18(2), 33–58.

  • Ticona, J., & Mateescu, A. (2018). Trusted strangers: Cultural entrepreneurship on domestic work platforms in the on-demand economy. New Media & Society, 20(11), 4384–4404.

  • Timmermans, S., & Tavory, I. (2012). Theory construction in qualitative research: From grounded theory to abductive analysis. Sociological Theory, 30(3), 167–186.

  • Turner, F. (2009). Burning man at Google: A cultural infrastructure for new media production. New Media & Society, 11(1–2), 73–94.

  • Turner, F. (2005). Actor-networking the news. Social Epistemology, 19(4), 321–324.

  • Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York: Public Affairs.

  • Zukin, S., and Papadantonakis, M. (2017). Hackathons as co-optation ritual. Pp. 157–181 in A.L. Kalleberg and S.P. Vallas (Eds.), Precarious Work (Research in the Sociology of Work, Vol. 31).

Acknowledgments

I would like to thank Sharon Zukin, John Torpey, Fred Turner, Melissa Valentine, Rebecca Hinds, and the participants of the Center on Digital Culture and Society Launch Symposium (University of Pennsylvania, Annenberg School for Communication), the Médialab (Sciences Po)/Centre Internet et Société (CNRS) seminar, and the Communication Works in Progress (CWIP) workshop (Stanford University) for their feedback on previous versions of this article. This research was supported by the Chair “Major Social Changes” (Sorbonne Université—Institut d’Etudes Avancées de Paris).

Author information

Correspondence to Angèle Christin.


Cite this article

Christin, A. The ethnographer and the algorithm: beyond the black box. Theor Soc 49, 897–918 (2020). https://doi.org/10.1007/s11186-020-09411-3

