We Have Been Assimilated: Some Principles for Thinking About Algorithmic Systems

  • Paul N. Edwards
Conference paper
Part of the IFIP Advances in Information and Communication Technology book series (IFIPAICT, volume 543)

Abstract

This text is an opinion piece based on an invited keynote address at the 2018 IFIP 8.2 working conference, ‘Living with Monsters?’ (San Francisco, CA, 11 December 2018). It outlines some principles for understanding algorithmic systems and considers their implications for the increasingly algorithm-driven infrastructures we now inhabit. It advances four principles exhibited by algorithmic systems: (i) radical complexity, (ii) opacity, (iii) radical otherness, and (iv) infrastructuration, or Borgian assimilation. These principles may help to guide a more critical appreciation of an emergent world marked by hybrid agency, accelerating feedback loops, and ever-expanding infrastructures to which we have been all too willingly assimilated.

Keywords

Algorithmic systems · Complexity · Opacity · Otherness · Infrastructure


Copyright information

© IFIP International Federation for Information Processing 2018

Authors and Affiliations

  1. Center for International Security and Cooperation, Stanford University, Stanford, USA
  2. School of Information, University of Michigan, Ann Arbor, USA