Abstract
Our lives are increasingly mediated, regulated and produced by algorithmically-driven software that is often invisible to the people whose lives it affects. Online, much of the content we consume is delivered through algorithmic recommender systems (“recommenders”). Although the techniques of such recommenders and the specific algorithms that underlie them differ, they share one basic assumption: that individuals are “users” whose preferences can be predicted from past actions and behaviors. While this assumption may be largely unconscious and even uncontroversial, we draw upon Andrew Feenberg’s work to demonstrate that recommenders embody a “formal bias” with social implications. We argue that this bias stems from the “technical code” of recommenders, which we identify as a form of behaviorism. Studying the assumptions and worldviews that recommenders put forth tells us something about how human beings are understood at a time when algorithmic systems are ubiquitous. Behaviorism, we argue, forms the episteme that grounds the development of recommenders. What we refer to as the “behavioral code” of recommenders promotes an impoverished view of what it means to be human. Leaving this technical code unchallenged prevents us from exploring alternative, perhaps more inclusive and expansive, pathways for understanding individuals and their desires. Furthermore, by problematizing formations that have successfully rooted themselves in technical codes, this chapter extends Feenberg’s critical theory of technology into a domain that is both ubiquitous and undertheorized.
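The assumption the abstract identifies, that preferences can be predicted from past actions alone, can be made concrete with a toy sketch of item-to-item collaborative filtering, the general family of techniques behind systems such as Amazon's (Linden et al., 2003, cited in the references). All data, names, and thresholds below are invented for illustration; this is not the code of any system discussed in the chapter.

```python
from collections import defaultdict
from math import sqrt

# Toy interaction log: all the system "knows" about each person
# is which items they have acted on in the past.
interactions = {
    "alice": {"a", "b", "c"},
    "bob": {"a", "b"},
    "carol": {"b", "c", "d"},
}

def item_similarity(interactions):
    """Cosine similarity between items, treating users as dimensions."""
    item_users = defaultdict(set)
    for user, items in interactions.items():
        for item in items:
            item_users[item].add(user)
    sims = {}
    for i in item_users:
        for j in item_users:
            if i == j:
                continue
            overlap = len(item_users[i] & item_users[j])
            if overlap:
                sims[(i, j)] = overlap / sqrt(len(item_users[i]) * len(item_users[j]))
    return sims

def recommend(user, interactions, sims):
    """Rank unseen items purely by similarity to the user's past items."""
    seen = interactions[user]
    scores = defaultdict(float)
    for item in seen:
        for (i, j), s in sims.items():
            if i == item and j not in seen:
                scores[j] += s
    return sorted(scores, key=scores.get, reverse=True)

sims = item_similarity(interactions)
recommend("bob", interactions, sims)  # → ['c', 'd']
```

Note what is absent: nothing about who "bob" is, what he believes, or why he acted; the ranking is derived entirely from observed behavior, which is precisely the behaviorist logic at issue.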
Notes
1. As such, behaviorists deny the Cartesian mind-body dualism.
2. Skinner preferred to avoid mental concepts, but the underlying idea of (analytical or logical) behaviorism is that a mental state or condition just is a behavioral disposition or family of behavioral tendencies (Graham, 2019). This means that a behaviorist can in principle continue to use mental concepts, but they would refer to behavioral dispositions rather than inner states.
3. Even though developers and behaviorists work from different motivations – developers follow a commercial incentive, while behaviorists are motivated by a certain ideal of “real” science – both ultimately aim at the prediction and control of human behavior. These goals have proven highly compatible: the founder of behaviorism, John B. Watson, joined an advertising agency after leaving academia and became highly successful in that field (Baars, 1986; Waldrop, 2001). In addition, Skinner’s analysis has been called the psychological equivalent of wage-labor capitalism (Baars, 1986), as the prediction and control of human behavior in order to increase productivity has been a central focus of managerial practices, from “scientific management” to, more recently, “nudge management” (Ebert & Freibichler, 2017).
4. Apart from the praxis critique, there are roughly three main reasons for the rejection of behaviorism within philosophy (Graham, 2019). First, many people were, and still are, sceptical of behaviorism’s commitment to the thesis that behavior can be understood without reference to mental processes. A second reason is the existence of “qualia” (e.g., Place, 2000): behaviorism cannot account for the qualitatively distinctive experiences underlying overt behavior. A third critique came from Noam Chomsky (1967 [1959]), who argued that behaviorism cannot account for the fact that language does not seem to be learned through explicit teaching: linguistic performance outstrips individual reinforcement histories.
5. Notice here that behaviorism is a presupposed framework rather than a scientific theory, meaning that it cannot be falsified by any experimental results (Baars, 1986).
6. While the purpose of this chapter is not to explore solutions, there are several interesting proposals and projects underway. For example, academics and developers have called for and experimented with more user-centric recommenders that allow users some degree of control over how they are profiled. One example of user-centric design is gobo.social, a social media news aggregator designed by the MIT Media Lab. This tool offers sliders that users control in order to filter information: the user can explore a range of political perspectives on a continuum from left to right, or “the extent of seriousness, rudeness, gender, and other parameters” (Reviglio & Agosti, 2020, p. 6). In another example, Harambam et al. (2018) propose granting users greater “voice” in our algorithmically-driven media ecosystem through the creation of algorithmic recommender personae that “allow people instead to demand from [recommenders] to behave in ways that align with their own specific... interests at each single moment” (ibid., p. 4). It is also possible to involve users in the earliest stages of the design and development of recommender algorithms. The benefits of participatory design lie not only in creating more user-friendly technologies, but also in making “explicit the critical, and inevitable, presence of values in the system design process” (Suchman, 1993, p. viii). As Feenberg convincingly argues in Questioning Technology, by widening opportunities to intervene, user participation in design limits “the operational autonomy of technical personnel” (Feenberg, 1999, p. 135) who are socialized into the technical codes of the profession (ibid., p. 142).
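The slider mechanism described in note 6 can be sketched abstractly: each slider sets a target value along some dimension, and only items near the user's settings pass the filter. Everything here (the dimension names, the scores, the tolerance parameter) is hypothetical and serves only to illustrate the kind of user control such designs afford; it does not reproduce gobo.social's actual implementation.

```python
# Hypothetical content items, pre-scored on dimensions a user
# can control via sliders (values in the range 0.0 to 1.0).
posts = [
    {"title": "op-ed", "politics": 0.9, "seriousness": 0.8},
    {"title": "meme", "politics": 0.2, "seriousness": 0.1},
    {"title": "report", "politics": 0.5, "seriousness": 0.9},
]

def slider_filter(posts, sliders, tolerance=0.3):
    """Keep posts whose scores fall within `tolerance` of every slider setting."""
    return [
        p for p in posts
        if all(abs(p[dim] - setting) <= tolerance
               for dim, setting in sliders.items())
    ]

# The user dials the sliders toward serious, centrist content.
slider_filter(posts, {"politics": 0.5, "seriousness": 0.9})
```

The design point is that the profiling criteria are exposed as parameters the user sets, rather than inferred silently from behavior; widening the tolerance, or moving a slider, visibly changes what is recommended.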
References
Adomavicius, G., Mobasher, B., Ricci, F., & Tuzhilin, A. (2011). Context-aware recommender systems. AI Magazine, 32(3), 67–80. https://doi.org/10.1609/aimag.v32i3.2364
Adomavicius, G., & Tuzhilin, A. (2005). Toward the next generation of recommender systems: A survey of the state-of-the-art and possible extensions. IEEE Transactions on Knowledge and Data Engineering, 17(6), 734–749.
Aggarwal, C. C. (2016). Recommender systems (Vol. 1). Springer International Publishing.
Baars, B. J. (1986). The cognitive revolution in psychology. New York: Guilford Press.
Beer, D. (2009). Power through the algorithm? Participatory web cultures and the technological unconscious. New Media and Society, 11(6), 985–1002.
Beer, D. (2013). Popular culture and new media: The politics of circulation. Springer.
Bobadilla, J., Ortega, F., Hernando, A., & Gutiérrez, A. (2013). Recommender systems survey. Knowledge-Based Systems, 46, 109–132.
Burke, R. (2007). Hybrid web recommender systems. In The adaptive web (pp. 377–408). Springer.
Cheney-Lippold, J. (2011). A new algorithmic identity: Soft biopolitics and the modulation of control. Theory, Culture and Society, 28(6), 164–181.
Chomsky, N. (1967 [1959]). Review of B. F. Skinner’s verbal behavior. In L. A. Jakobovits & M. S. Miron (Eds.), Readings in the psychology of language (pp. 142–143). Prentice-Hall.
Chung, M. C., & Hyland, M. (2012). Behaviourism, and the disappearance and reappearance of organism (Person) variables. In M. C. Chung & M. Hyland (Eds.), History and philosophy of psychology (pp. 144–169). Wiley-Blackwell.
Drott, E. (2018). Why the next song matters: Streaming, recommendation, scarcity. Twentieth-Century Music, 15(3), 325–357.
Ebert, P., & Freibichler, W. (2017). Nudge management: Applying behavioural science to increase knowledge worker productivity. Journal of Organization Design, 6(1), 1–6.
Ekstrand, M. D., & Willemsen, M. C. (2016, September). Behaviorism is not enough: Better recommendations through listening to users. In Proceedings of the 10th ACM conference on recommender systems (pp. 221–224).
Falk, K. (2019). Practical recommender systems. Manning Publications.
Feenberg, A. (1992). Subversive rationalization: Technology, power, and democracy. Inquiry, 35(3–4), 301–322.
Feenberg, A. (1999). Questioning technology. Routledge.
Feenberg, A. (2008). Critical theory of technology: An overview. In G. J. Leckie & J. E. Buschman (Eds.), Information technology in librarianship: New critical approaches (pp. 31–46). Libraries Unlimited.
Feenberg, A. (2017). Critical theory of technology and STS. Thesis Eleven, 138(1), 3–12.
Fisher, E., & Mehozay, Y. (2019). How algorithms see their audience: Media epistemes and the changing conception of the individual. Media, Culture and Society, 41(8), 1176–1191.
Gillespie, T. (2014). The relevance of algorithms. In T. Gillespie, P. J. Boczkowski, & K. A. Foot (Eds.), Media technologies: Essays on communication, materiality, and society (pp. 167–194). The MIT Press.
Goldberg, D., Nichols, D., Oki, B. M., & Terry, D. (1992). Using collaborative filtering to weave an information tapestry. Communications of the ACM, 35(12), 61–70.
Gomez-Uribe, C. A., & Hunt, N. (2015). The Netflix recommender system: Algorithms, business value, and innovation. ACM Transactions on Management Information Systems (TMIS), 6(4), 1–19.
Graham, G. (2019, Fall). Behaviorism. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy. https://plato.stanford.edu/archives/fall2019/entries/behaviorism/
Habermas, J. (1970). Towards a rational society. Beacon Press.
Hallinan, B., & Striphas, T. (2016). Recommended for you: The Netflix Prize and the production of algorithmic culture. New Media and Society, 18(1), 117–137.
Harambam, J., Helberger, N., & van Hoboken, J. (2018). Democratizing algorithmic news recommenders: How to materialize voice in a technologically saturated media ecosystem. Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences, 376(2133), 20180088.
Jenkins, H. W., Jr. (2010, August 14). Google and the search for the future. The Wall Street Journal. https://www.wsj.com/articles/SB10001424052748704901104575423294099527212
Kant, T. (2020). Making it personal: Algorithmic personalization, identity, and everyday life. Oxford University Press.
Kirkpatrick, G. (2020). Technical politics. In G. Kirkpatrick (Ed.), Technical politics: Andrew Feenberg’s critical theory of technology (pp. 70–95). Manchester University Press.
Linden, G., Smith, B., & York, J. (2003). Amazon.com recommendations: Item-to-item collaborative filtering. IEEE Internet Computing, 7(1), 76–80.
Lops, P., De Gemmis, M., & Semeraro, G. (2011). Content-based recommender systems: State of the art and trends. In Recommender systems handbook (pp. 73–105). Springer US.
Lu, Y., Dong, R., & Smyth, B. (2018, September). Why I like it: multi-task learning for recommendation and explanation. In Proceedings of the 12th ACM Conference on Recommender Systems (pp. 4–12).
Lynch, J. (2018, July). Netflix thrives by programming to ‘taste communities,’ not demographics. AdWeek. Retrieved 1 Nov 2020, from https://www.adweek.com/tv-video/netflix-thrives-by-programming-to-taste-communities-not-demographics/
Miller, G. A. (2003). The cognitive revolution: A historical perspective. Trends in Cognitive Sciences, 7(3), 141–144.
Mishler, E. G. (1976). Skinnerism: Materialism minus the dialectic. Journal for the Theory of Social Behaviour, 6(1), 21–47.
Moore, J. (1999). The basic principles of behaviorism. In B. Thyer (Ed.), The philosophical legacy of behaviorism (pp. 41–68). Springer.
Morris, J. W. (2015). Curation by code: Infomediaries and the data mining of taste. European Journal of Cultural Studies, 18(4–5), 446–463.
Mullaney, T. (2015, March 23). Everything is a recommendation. MIT Technology Review. https://www.technologyreview.com/s/535936/everything-is-a-recommendation/
Pagano, R., Cremonesi, P., Larson, M., Hidasi, B., Tikk, D., Karatzoglou, A., & Quadrana, M. (2016, September). The contextual turn: From context-aware to context-driven recommender systems. In Proceedings of the 10th ACM conference on recommender systems (pp. 249–252).
Pazzani, M. J., & Billsus, D. (2007). Content-based recommendation systems. In The adaptive web (pp. 325–341). Springer Berlin Heidelberg.
Perik, E., De Ruyter, B., Markopoulos, P., & Eggen, B. (2004). The sensitivities of user profile information in music recommender systems. In Proceedings of private, security, trust (pp. 137–141).
Place, U. T. (2000). The causal potency of qualia: Its nature and its source. Brain and Mind, 1(2), 183–192.
Prey, R. (2018). Nothing personal: Algorithmic individuation on music streaming platforms. Media, Culture and Society, 40(7), 1086–1100.
RecSys. (n.d.). 15th ACM Conference on Recommender Systems. https://recsys.acm.org/recsys21/
Reisberg, D. (2016). The science of mind. In D. Reisberg (Ed.), Cognition: Exploring the science of mind (6th ed., pp. 2–27). W. W. Norton & Company.
Resnick, P., Iacovou, N., Suchak, M., Bergstrom, P., & Riedl, J. (1994, October). GroupLens: An open architecture for collaborative filtering of netnews. In Proceedings of the 1994 ACM conference on Computer supported cooperative work (pp. 175–186).
Resnick, P., & Varian, H. R. (1997). Recommender systems. Communications of the ACM, 40(3), 56–58.
Reviglio, U., & Agosti, C. (2020). Thinking outside the black-box: The case for “algorithmic sovereignty” in social media. Social Media + Society, 6(2), 2056305120915613.
Ricci, F., Rokach, L., & Shapira, B. (2011). Introduction to recommender systems handbook. In Recommender systems handbook (pp. 1–35). Springer.
Rieder, B. (2020). Engines of order: A mechanology of algorithmic techniques. Amsterdam University Press.
Riedl, J., & Konstan, J. (2002). Word of mouse: The marketing power of collaborative filtering. Warner Books.
Rogers, R. (2009). Post-demographic machines. Walled Garden, 38(2009), 29–39.
Salter, J., & Antonopoulos, N. (2006). CinemaScreen recommender agent: Combining collaborative and content-based filtering. IEEE Intelligent Systems, 21(1), 35–41.
Seaver, N. (2012). Algorithmic recommendations and synaptic functions. Limn, 1(2). https://escholarship.org/uc/item/7g48p7pb
Seaver, N. (2019). Captivating algorithms: Recommender systems as traps. Journal of Material Culture, 24(4), 421–436.
Seaver, N. (2021). Seeing like an infrastructure: Avidity and difference in algorithmic recommendation. Cultural Studies, 35(4–5), 771–791.
Skinner, B. F. (1953). Science and human behavior. Macmillan.
Skinner, B. F. (1974). About behaviorism. Knopf.
Suchman, L. (1993). Foreword. In D. Schuler & A. Namioka (Eds.), Participatory design: Principles and practices. CRC/Lawrence Erlbaum Associates. vii–x.
Tkalčič, M., Burnik, U., & Košir, A. (2010). Using affective parameters in a content-based recommender system for images. User Modeling and User-Adapted Interaction, 20(4), 279–311.
Waldrop, M. M. (2001). The dream machine: J.C.R. Licklider and the revolution that made computing personal. Viking.
Wan, M., & McAuley, J. (2018, September). Item recommendation on monotonic behavior chains. In Proceedings of the 12th ACM conference on recommender systems (pp. 86–94).
Watson, J. B. (1913). Psychology as the behaviorist views it. Psychological Review, 20(2), 158–177.
Yoo, K. H., & Gretzel, U. (2011). Creating more credible and persuasive recommender systems: The influence of source characteristics on recommender system evaluations. In Recommender systems handbook (pp. 455–477). Springer.
Yu, A. (2019). How Netflix uses AI, data science, and machine learning — from a product perspective. https://becominghuman.ai/how-netflix-uses-ai-and-machine-learning-a087614630fe
Zuboff, S. (2019). The age of surveillance capitalism: The fight for the future at the new frontier of power. Profile Books.
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
de Jong, M., Prey, R. (2022). The Behavioral Code: Recommender Systems and the Technical Code of Behaviorism. In: Cressman, D. (ed) The Necessity of Critique. Philosophy of Engineering and Technology, vol 41. Springer, Cham. https://doi.org/10.1007/978-3-031-07877-4_8
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-07876-7
Online ISBN: 978-3-031-07877-4