Abstract
Research on algorithms tends to focus on American companies and on the effects their algorithms have on Western users, while such algorithms are in fact developed in various geographical locations and used in highly diverse socio-cultural contexts. That is, the spatial trajectories through which algorithms operate and the distances and differences between the people who develop such algorithms and the users their algorithms affect remain overlooked. Moreover, while the power of big data algorithms has recently been compared to colonialism (Couldry and Mejias 2019), the move from the colonial gaze (Yegenoglu 1998) to the algorithmic gaze (Graham 2010) has yet to be fully discussed. This article aims to fill these gaps by exploring attempts to algorithmically conceptualize “the Other.” Based on the case study of an Israeli user-profiling company and its attempts to sell its services to East Asian corporations, I show that the algorithmic gaze—algorithms’ ability to characterize, conceptualize, and affect users—stems from a complex combination of opposing-but-complementary perspectives: it is simultaneously a continuation of the colonial gaze and its complete opposite. The ways in which algorithms are programmed to see the Other, the ways algorithmic categories are named to depict the Other, and the ways people who design such algorithms describe and understand the Other are all different but deeply interrelated factors in how algorithms “see.” I accordingly argue that the story of algorithms is an intercultural one, and that the power of algorithms perpetually flows back and forth—between East and West, South and North.
Notes
Beer recently described the “data gaze” as the ways data analytics companies see and talk about data (Beer 2018). The “algorithmic gaze” I discuss here constitutes the ways in which such companies design, construct, and tweak their algorithms to better “see,” conceptualize, and influence people.
As Ian Bogost suggested, algorithms’ power is not a material phenomenon so much as it is a devotional one—algorithms are idolized and are increasingly imagined as transcendental ideals (Bogost 2015). As Gillespie famously wrote: “That we are now turning to algorithms to identify what we need to know is as momentous as having relied on credentialed experts, the scientific method, common sense, or the word of God” (Gillespie 2014). Cooper similarly argued that like pastoral power, algorithmic governmentality is shepherding our attention and conduct via anticipatory tactics (Cooper 2020).
This research was approved by the ethics committee of the Faculty of Social Sciences, The Hebrew University of Jerusalem.
I used pseudonyms for all individuals and companies mentioned in this article.
It is important to note that this case study is not designed to be representative of Israelis’ views on culture or race, nor of the views data analytics companies in general have on these issues. Keeping in mind that the strengths of qualitative, critical research come from understanding the “how” and “why” of socio-cultural phenomena, not their frequency of incidence (Small 2008), this article offers a description of the socio-technical mechanisms behind the expansion of algorithmic powers, not an assessment of the ethics (or racism) of such companies in general.
The last decade has seen the creation of hundreds of Israeli data analytics companies (IVC 2020), and of thousands of such companies worldwide. The algorithmic products these companies produce are ubiquitous, operating behind the scenes of almost any online service—profiling users, personalizing content, and nudging users toward different choices. Such companies often work with and rely upon much bigger global companies; specifically, as shown below, they often rely on Facebook data, accessing it through Facebook’s API (see Kotliar 2020a).
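To make this concrete, here is a minimal sketch (in Python, with a hypothetical placeholder token) of the kind of API-based access this note describes: a third-party service querying the Facebook Graph API for the pages a user has liked and their platform-assigned categories. It is an illustration under stated assumptions, not any company's actual pipeline.

```python
# A minimal, hypothetical sketch of the API-based access described in this
# note: a third-party service querying the Facebook Graph API for the pages
# a user has liked and their platform-assigned categories. The token is a
# placeholder; field availability depends on app permissions and API version.
import requests

ACCESS_TOKEN = "EAAB..."  # hypothetical user-granted token (user_likes permission)
GRAPH_URL = "https://graph.facebook.com/v12.0/me"

def fetch_liked_page_categories(token: str) -> list[str]:
    """Return the categories of pages a user has liked: a profiling signal
    that requires no access to the user's own posts."""
    resp = requests.get(
        GRAPH_URL,
        params={"fields": "likes{name,category}", "access_token": token},
        timeout=10,
    )
    resp.raise_for_status()
    likes = resp.json().get("likes", {}).get("data", [])
    return [page.get("category", "unknown") for page in likes]

# Possible output (depends on the user's likes):
# ['Sports Team', 'Musician/Band', 'News & Media Website']
```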
All Hebrew excerpts were translated by the author.
Anthropologist Lila Abu-Lughod famously proposed reconsidering the concept of culture and devised a strategy for “writing against culture”—a way of generating knowledge about people without over-generalizations and with a constant focus on the particular, the local, and the contextual (Abu-Lughod 2006). While there are many epistemological, methodological, and ethical differences between this view and the algorithmic view of people, both seem to see Culture as a problematic category that should somehow be evaded. After all, while Extractive’s algorithms offer to move beyond categories like “Russian” or “Chinese,” Abu-Lughod criticizes the characterization of people as “the Nuer” or “the Balinese” (ibid. p. 475). Moreover, much like Abu-Lughod’s methodology, algorithms offer an allegedly more fine-grained, practice-based view of people that purports to replace cruder, top-down perspectives. As demonstrated below, the algorithmic gaze is simultaneously reminiscent of the one Abu-Lughod offers and its complete opposite.
The categorization Facebook offers largely depends on users’ self-disclosure and on the fact that Facebook offers multi-lingual versions of its platform. Thus, users can create a Facebook group and categorize it as related to sports, and posts in that group will then automatically be classified as sports-related. Hence, companies like Extractive can learn about users’ actions and characteristics without reading, translating, or even accessing their posts, as the sketch below illustrates.
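The following minimal sketch (hypothetical data and function names) illustrates this metadata-based logic: per-user interest distributions are built from platform-assigned category labels alone, without touching any post content.

```python
# An illustrative sketch (all data shapes hypothetical) of profiling via
# platform metadata: because Facebook itself labels groups and pages by
# topic, per-user interest distributions can be built from those labels
# alone, without reading, translating, or accessing any posts.
from collections import Counter

# Hypothetical input: (user_id, platform-assigned category) pairs.
events = [
    ("u1", "Sports"), ("u1", "Sports"), ("u1", "Cooking"),
    ("u2", "Politics"), ("u2", "Sports"),
]

def interest_profiles(pairs):
    """Aggregate category labels into per-user interest proportions."""
    counts: dict[str, Counter] = {}
    for user_id, category in pairs:
        counts.setdefault(user_id, Counter())[category] += 1
    return {
        uid: {cat: n / sum(c.values()) for cat, n in c.items()}
        for uid, c in counts.items()
    }

print(interest_profiles(events))
# {'u1': {'Sports': 0.66..., 'Cooking': 0.33...},
#  'u2': {'Politics': 0.5, 'Sports': 0.5}}
```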
Deep Packet Inspection is a method of extracting, examining, and managing network data. It is often used to assess the functioning of a network, but it can also function as a powerful surveillance device that can access not only the metadata of internet communication but also the data itself (Fuchs 2013).
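This metadata/content distinction can be illustrated with a minimal sketch that uses the scapy packet-manipulation library; it is an illustration under stated assumptions, not a depiction of any particular company's tooling.

```python
# A minimal sketch of the metadata/content distinction, using the scapy
# packet-manipulation library (sniffing requires root/administrator
# privileges). Shallow inspection stops at the packet headers; deep packet
# inspection also reads the payload itself.
from scapy.all import IP, TCP, Raw, sniff

def inspect(pkt):
    if IP not in pkt or TCP not in pkt:
        return
    # "Shallow" inspection: metadata only (who talks to whom, on which port).
    print(f"{pkt[IP].src} -> {pkt[IP].dst} : port {pkt[TCP].dport}")
    # "Deep" inspection: the content of the communication itself.
    if Raw in pkt:
        payload = bytes(pkt[Raw].load)
        if b"password" in payload.lower():
            print("  payload contains the string 'password'")

sniff(filter="tcp", prn=inspect, count=20)  # capture and inspect 20 TCP packets
```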
While Israel has built itself a reputation as a "Start-up Nation" (Senor and Singer 2009) and Israeli companies have had some prominent successes over the last decades, Israel is still a peripheral actor on the world map of technological innovation. A relatively young country of only 8.7 million citizens, Israel is almost 480 times smaller than the United States, and it is located at the heart of the turbulent Middle East. The small user base of Hebrew speakers, the country's problematic geo-political location, and its distance from major technological and economic centers often limit companies' ability to "scale" into new markets. Accordingly, Israeli startups often choose to be bought by and consolidated into larger international companies rather than continuing to grow independently and locally.
References
Abu-Lughod, L. (2006). Writing against culture. In E. Lewin (Ed.), Feminist anthropology, a reader (pp. 466–479). New York: Wiley.
Anderson, B. (1991). Imagined communities: Reflections on the origin and spread of nationalism. London: Verso Books.
Andrejevic, M. (2013). Infoglut: How too much information is changing the way we think and know. New York: Routledge.
Andrejevic, M. (2014). The big data divide. International Journal of Communication, 8(17), 1673–1689.
Andrejevic, M., Hearn, A., & Kennedy, H. (2015). Cultural studies of data mining: Introduction. European Journal of Cultural Studies, 18(4–5), 379–394. https://doi.org/10.1177/1367549415577395.
Angwin, J., Larson, J., Mattu, S., & Kirchner, L. (2016). Machine bias. ProPublica. https://www.propublica.org/article/machine-bias-risk-assessments-in-criminal-sentencing.
Aravamudan, S. (2011). Enlightenment orientalism: Resisting the rise of the novel. Chicago: University of Chicago Press.
Arvidsson, A. (2004). On the ‘pre-history of the panoptic sort’: Mobility in market research. Surveillance & Society, 1(4), 456–474.
Asad, T. (1994). Ethnographic representation, statistics and modern power. Social Research, 61(1), 55–88.
Beer, D. (2009). Power through the algorithm? Participatory web cultures and the technological unconscious. New Media & Society, 11(6), 985–1002. https://doi.org/10.1177/1461444809336551.
Beer, D. (2018). The data gaze: Capitalism, power and perception. London: Sage.
Benjamin, R. (2003). Orientalist aesthetics: Art, colonialism, and French North Africa, 1880–1930. Berkeley: University of California Press.
Benjamin, R. (2019). Race after technology: Abolitionist tools for the new Jim code. New York: John Wiley & Sons.
Berda, Y. (2013). Managing dangerous populations: Colonial legacies of security and surveillance. Sociological Forum, 28(3), 627–630. https://doi.org/10.1111/socf.12042.
Bivens, R., & Hoque, A. S. (2018). Programming sex, gender, and sexuality: Infrastructural failures in the ‘feminist’ dating app Bumble. Canadian Journal of Communication, 43, 441–459.
Bogost, I. (2015). The cathedral of computation. In The Atlantic. https://www.theatlantic.com/technology/archive/2015/01/the-cathedral-of-computation/384300/.
Bowker, G. C., & Star, S. L. (1999). Sorting things out: Classification and its consequences. Cambridge: MIT Press.
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa.
Brubaker, R. (2004). Ethnicity without groups. Cambridge: Harvard University Press.
Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on Facebook. New Media & Society, 14(7), 1164–1180. https://doi.org/10.1177/1461444812440159.
Bucher, T. (2018). IF…THEN: Algorithmic power and politics. Oxford: Oxford University Press.
Cheney-Lippold, J. (2011). A new algorithmic identity: Soft biopolitics and the modulation of control. Theory, Culture & Society, 28(6), 164–181. https://doi.org/10.1177/0263276411424420.
Cheney-Lippold, J. (2017). We are data: Algorithms and the making of our digital selves. New York: NYU Press.
Christin, A. (2018). Counting clicks: Quantification and variation in web journalism in the United States and France. American Journal of Sociology, 123(5), 1382–1415. https://doi.org/10.1086/696137.
Christin, A., Rosenblat, A., & Boyd, D. (2015). Courts and predictive algorithms. In Data & civil rights: A new era of policing and justice. Washington, DC: Data & Civil Rights.
Cohn, B. S. (1996). Colonialism and its forms of knowledge: The British in India. Princeton: Princeton University Press.
Cohn, J. (2019). The burden of choice: Recommendations, subversion, and algorithmic culture. New Brunswick: Rutgers University Press.
Cooper, R. (2020). Pastoral power and algorithmic governmentality. Theory, Culture & Society, 37(1), 29–59. https://doi.org/10.1177/0263276419860576.
Couldry, N., & Hepp, A. (2018). The mediated construction of reality. Cambridge: Polity.
Couldry, N., & Mejias, U. A. (2019). The costs of connection: How data is colonizing human life and appropriating it for capitalism. Stanford: Stanford University Press.
Crawford, K. (2013). The hidden biases in big data. Harvard Business Review, 1.
Dencik, L., Hintz, A., & Carey, Z. (2018). Prediction, pre-emption and limits to dissent: Social media and big data uses for policing protests in the United Kingdom. New Media and Society, 20(4), 1433–1450. https://doi.org/10.1177/1461444817697722.
Eubanks, V. (2018). Automating inequality: How high-tech tools profile, police, and punish the poor. New York: St. Martin’s Press.
Fisher, E., & Mehozay, Y. (2019). How algorithms see their audience: Media epistemes and the changing conception of the individual. Media, Culture & Society, 41(8), 1176–1191. https://doi.org/10.1177/0163443719831598.
Foucault, M. (1977). Discipline and punish: The birth of the prison. New York: Vintage Books.
Frenkel, M. (2014). Can the periphery write back? Periphery-to-centre knowledge flows in multinationals based in developing and emerging economies. In R. Westwood, G. Jack, F. R. Khan, & M. Frenkel (Eds.), Core-periphery relations and organization studies (pp. 33–53). New York: Palgrave Macmillan.
Fuchs, C. (2013). Societal and ideological impacts of deep packet inspection internet surveillance. Information Communication and Society, 16(8), 1328–1359. https://doi.org/10.1080/1369118X.2013.770544.
Gillespie, T. (2014). The relevance of algorithms. In G. Tarleton, P. J. Boczkowski, & K. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–193). Cambridge: MIT Press.
Gillespie, T. (2016). Algorithm. In B. Peters (Ed.), Digital keywords: A vocabulary of information society and culture (pp. 18–30). Princeton: Princeton University Press.
González, R. J. (2017). Hacking the citizenry? Anthropology Today, 33(3), 9–12.
Graham, S. D. (2005). Software-sorted geographies. Progress in Human Geography, 29(5), 562–580.
Graham, S. D. (2010). Interrupting the algorithmic gaze? Urban warfare and US military technology. In F. MacDonald, R. Hughes, & K. Dodds (Eds.), Observant states: Geopolitics and visual culture. London: I.B. Tauris.
Graham, T., & Henman, P. (2019). Affording choice: How website designs create and constrain ‘choice’. Information, Communication & Society, 22(13), 2007–2023. https://doi.org/10.1080/1369118X.2018.1476570.
Hacking, I. (1995). Rewriting the soul: Multiple personality and the sciences of memory. Princeton: Princeton University Press.
Hagerty, A., & Rubinov, I. (2019). Global AI ethics: A review of the social impacts and ethical implications of artificial intelligence. http://arxiv.org/abs/1907.07892.
Haggerty, K. D., & Ericson, R. V. (2000). The surveillant assemblage. British Journal of Sociology, 51(4), 605–622. https://doi.org/10.1080/00071310020015280.
Hargittai, E. (2018). Potential biases in big data: Omitted voices on social media. Social Science Computer Review, 38(1), 10–24. https://doi.org/10.1177/0894439318788322.
Heikkinen, H. L. T., Huttunen, R., & Kakkori, L. (2000). ‘And this story is true...’ On the problem of narrative truth. In European Conference on Educational Research (pp. 1–15). Edinburgh.
Hirschman, C. (1986). The making of race in colonial Malaya: Political economy and racial ideology. Sociological Forum, 1(2), 330–361.
Ittmann, K., Cordell, D. D., & Maddox, G. H. (Eds.). (2010). The demographics of empire: The colonial order and the creation of knowledge. Athens: Ohio University Press.
IVC (2020). IVC Annual Israeli Tech Review. https://www.ivc-online.com/Portals/0/RC/Magazine%20&%20YB/IVC_ANNUAL_ISRAELI_TECH_REVIEW_FEB_2020/mobile/index.html. Accessed: 2 July 2020.
Just, N., & Latzer, M. (2017). Governance by algorithms: Reality construction by algorithmic selection on the internet. Media, Culture & Society, 39(2), 238–258. https://doi.org/10.1177/0163443716643157.
Karakayali, N., Kostem, B., & Galip, I. (2018). Recommendation systems as technologies of the self: Algorithmic control and the formation of music taste. Theory, Culture and Society, 35(2), 3–24. https://doi.org/10.1177/0263276417722391.
Kertzer, D. I., & Arel, D. (2002). Censuses, identity formation, and the struggle for political power. Cambridge: Cambridge University Press.
Kitchin, R. (2014). The data revolution: Big data, open data, data infrastructures and their consequences. London: Sage.
Kitchin, R. (2017). Thinking critically about and researching algorithms. Information, Communication & Society, 20(1), 14–29. https://doi.org/10.1080/1369118X.2016.1154087.
Kockelman, P. (2013). The anthropology of an equation. Sieves, spam filters, agentive algorithms, and ontologies of transformation. HAU: Journal of Ethnographic Theory, 3(3), 33–61. http://haujournal.org/index.php/hau/article/view/hau3.3.003.
Kohn, M., & Reddy, K. (2017). Colonialism. In The Stanford Encyclopedia of Philosophy. Metaphysics Research Lab, Stanford University. https://plato.stanford.edu/archives/fall2017/entries/colonialism/.
Kotliar, D. M. (2020a). Who gets to choose? on the socio-algorithmic construction of choice. Science, Technology, & Human Values. https://doi.org/10.1177/0162243920925147.
Kotliar, D. M. (2020b). The return of the social: Algorithmic identity in an age of symbolic demise. New Media & Society, 22(7), 1152–1167. https://doi.org/10.1177/1461444820912535.
Kukutai, T. H., & Broman, P. (2016). From colonial categories to local culture: Evolving state practices of ethnic enumeration in Oceania, 1965–2014. Ethnicities, 16(5), 689–711. https://doi.org/10.1177/1468796815603755.
Lake, R. W. (2017). Big data, urban governance, and the ontological politics of hyperindividualism. Big Data & Society, 4(1).
Lash, S. (2007). Power after hegemony: Cultural studies in mutation? Theory, Culture and Society, 24(3), 55–78. https://doi.org/10.1177/0263276407075956.
Leese, M. (2014). The new profiling: Algorithms, black boxes, and the failure of anti-discriminatory safeguards in the European Union. Security Dialogue, 45(5), 494–511. https://doi.org/10.1177/0967010614544204.
Lyan, I., & Frenkel, M. (2020). Industrial espionage revisited: Host country-foreign MNC legal disputes and the postcolonial imagery. Organization. https://doi.org/10.1177/1350508420928517.
Lyon, D. (2003). Introduction. In Surveillance as social sorting (pp. 1–11). New York: Routledge.
Mann, M., & Daly, A. (2019). (Big) data and the North-in-South: Australia’s informational imperialism and digital colonialism. Television and New Media, 20(4), 379–395. https://doi.org/10.1177/1527476418806091.
Marchart, O. (1998). The east, the west and the rest: Central and Eastern Europe between techno-orientalism and the new electronic frontier. Convergence, 4(2), 56–75. https://doi.org/10.1177/135485659800400208.
Marcus, G. E. (1995). Ethnography in/of the world system: The emergence of multi-sited ethnography. Annual Review of Anthropology, 24(1), 95–117.
Milan, S., & Treré, E. (2019). Big data from the South(s): Beyond data universalism. Television and New Media, 20(4), 319–335. https://doi.org/10.1177/1527476419837739.
Mohan, S., & Punathambekar, A. (2018). Localizing YouTube: Language, cultural regions, and digital platforms. International Journal of Cultural Studies, 22(3), 317–333. https://doi.org/10.1177/1367877918794681.
Morley, D., & Robins, K. (1995). Spaces of identity: Global media, electronic landscapes, and cultural boundaries. London: Routledge.
Morris, J. W. (2015). Curation by code: Infomediaries and the data mining of taste. European Journal of Cultural Studies, 18(4–5), 446–463. https://doi.org/10.1177/1367549415577387.
Napoli, P. M. (2014). Automated media: An institutional theory perspective on algorithmic media production and consumption. Communication Theory, 24, 340–360. https://doi.org/10.1111/comt.12039.
Napoli, P. M. (2013). The algorithm as institution: Toward a theoretical framework for automated media production and consumption. In Media in Transition Conference (pp. 1–36).
Neff, G., Jordan, T., McVeigh-Schultz, J., & Gillespie, T. (2012). Affordances, technical agency, and the politics of technologies of cultural production. Journal of Broadcasting and Electronic Media, 56(2), 299–313. https://doi.org/10.1080/08838151.2012.678520.
Neff, G., & Nagy, P. (2016). Automation, algorithms, and politics | Talking to bots: Symbiotic agency and the case of Tay. International Journal of Communication, 10.
Neyland, D., & Möllers, N. (2017). Algorithmic IF … THEN rules and the conditions and consequences of power. Information Communication and Society, 20(1), 45–62. https://doi.org/10.1080/1369118X.2016.1156141.
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York: NYU Press.
Noble, S. U., & Roberts, S. (2019). Technological elites, the meritocracy, and post-racial myths in Silicon Valley. https://escholarship.org/content/qt7z3629nh/qt7z3629nh.pdf.
O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. New York: Broadway Books.
Prasad, A. (2009). Postcolonial theory and organizational analysis. New York: Palgrave Macmillan.
Prey, R. (2018). Nothing personal: Algorithmic individuation on music streaming platforms. Media, Culture & Society, 40(7), 1086–1100. https://doi.org/10.1177/0163443717745147.
Ribak, R. (2019). Translating privacy: Developer cultures in the global world of practice. Information, Communication & Society, 22(6), 838–853. https://doi.org/10.1080/1369118X.2019.1577475.
Ricaurte, P. (2019). Data epistemologies, the coloniality of power, and resistance. Television and New Media, 20(4), 350–365. https://doi.org/10.1177/1527476419831640.
Rifkin, J. (2001). The age of access: The new culture of hypercapitalism. New York: Penguin.
Rogers, R. (2009). Post-demographic machines. Walled Garden, 38, 29–39.
Roh, D. S., Huang, B., & Niu, G. A. (2015). Techno-Orientalism: Imagining Asia in speculative fiction, history, and media. New Brunswick: Rutgers University Press.
Rouvroy, A. (2013). The end(s) of critique: Data-behaviourism vs. due-process. In M. Hildebrandt & K. de Vries (Eds.), Privacy, due process and the computational turn: The philosophy of law meets the philosophy of technology (pp. 143–169). New York: Routledge.
Said, E. W. (1995). Orientalism: Western conceptions of the Orient. New York: Penguin Books.
Sax, W. S. (1998). The hall of mirrors: Orientalism, anthropology, and the other. American Anthropologist, 100(2), 292–301.
Seaver, N. (2015). The nice thing about context is that everyone has it. Media, Culture and Society, 37(7), 1101–1109. https://doi.org/10.1177/0163443715594102.
Seaver, N. (2017). Algorithms as culture: Some tactics for the ethnography of algorithmic systems. Big Data & Society, 4(2). https://doi.org/10.1177/2053951717738104.
Seaver, N. (2018). What should an anthropology of algorithms do? Cultural Anthropology, 33(3), 375–385. https://doi.org/10.14506/ca33.3.04.
Senor, D., & Singer, S. (2009). Start-up nation: The story of Israel’s economic miracle. New York: Grand Central Publishing.
Seth, S. (2009). Putting knowledge in its place: Science, colonialism, and the postcolonial. Postcolonial Studies, 12(4), 373–388. https://doi.org/10.1080/13688790903350633.
Shapin, S. (1998). Placing the view from nowhere: Historical and sociological problems in the location of science. Transactions of the Institute of British Geographers, 23(1), 5–12. https://doi.org/10.1111/j.0020-2754.1998.00005.x.
Small, M. L. (2008). Lost in translation: How not to make qualitative research more scientific. In Report from Workshop on Interdisciplinary Standards for Systematic Qualitative Research (pp. 1–8). Washington, DC.
Sweeney, L. (2013). Discrimination in online ad delivery. Communications of the ACM, 56(5), 44–54. https://doi.org/10.1145/2447976.2447990.
Takhteyev, Y. (2012). Coding places: Software practice in a South American city. Cambridge: MIT Press.
Thatcher, J., O’Sullivan, D., & Mahmoudi, D. (2016). Data colonialism through accumulation by dispossession: New metaphors for daily data. Environment and Planning D: Society and Space, 34(6), 990–1006. https://doi.org/10.1177/0263775816633195.
Tufekci, Z. (2014). Engineering the public: Big data, surveillance and computational politics. First Monday, 19(7). https://doi.org/10.5210/fm.v19i7.4901.
Turow, J., & Couldry, N. (2018). Media as data extraction: Towards a new map of a transformed communications field. Journal of Communication, 68(2), 415–423. https://doi.org/10.1093/joc/jqx011.
Uvin, P. (2002). On counting, categorising, and violence in Burundi and Rwanda. In D. I. Kertzer & D. Arel (Eds.), Census and identity: The politics of race, ethnicity, and language in national censuses (pp. 148–175). Cambridge: Cambridge University Press.
Vaidhyanathan, S. (2018). Antisocial media: How Facebook disconnects us and undermines democracy. Oxford, UK: Oxford University Press.
van Dijk, N. (2009). Property, privacy and personhood in a world of ambient intelligence. Ethics and Information Technology, 12(1), 57–69. https://doi.org/10.1007/s10676-009-9211-0.
Vyncke, P. (2002). Lifestyle segmentation: From attitudes, interests and opinions, to values, aesthetic styles, life visions and media preferences. European Journal of Communication, 17(4), 445–463.
Wilf, E. (2013). Toward an anthropology of computer-mediated, algorithmic forms of sociality. Current Anthropology, 54(6), 716–739. https://doi.org/10.1086/673321.
Willson, M. (2014). The politics of social filtering. Convergence: The International Journal of Research into New Media Technologies, 20(2), 218–232. https://doi.org/10.1177/1354856513479761.
Yegenoglu, M. (1998). Colonial fantasies: Toward a feminist reading of orientalism. Cambridge: Cambridge University Press.
Yeung, K. (2017). ‘Hypernudge’: Big data as a mode of regulation by design. Information, Communication & Society, 20(1), 118–136. https://doi.org/10.1080/1369118X.2016.1186713.
Zuboff, S. (2019). The age of surveillance capitalism: The fight for a human future at the new frontier of power. New York: Public Affairs.
Cite this article
Kotliar, D.M. Data orientalism: on the algorithmic construction of the non-Western other. Theor Soc 49, 919–939 (2020). https://doi.org/10.1007/s11186-020-09404-2