Abstract
This chapter discusses the new modes of interaction with consumers through personalisation. Marketers use algorithmic systems to generate, test, and distribute commercial content, products, and website layouts in order to optimise business metrics. While these developments may contribute to a more pleasant customer experience and simpler choices, they fundamentally change business-to-consumer relations: by adopting a radically behaviourist approach, they subject consumers’ decision-making to an opaque, subtle, and potentially exploitative modulation of digital choice architecture.
Notes
- 1.
For example, Giddens (1984), p. 183, argues that surveillance should be neutrally defined as “the coding of information relevant to the administration of subject populations, plus their direct supervision by officials and administrators of all sorts.”
- 2.
Google (2020), https://policies.google.com/.
- 3.
Instagram (2020), https://help.instagram.com.
- 4.
Microsoft (2020), https://privacy.microsoft.com/en-GB/.
- 5.
European Commission (2018). For example, more than two-thirds (71%) of respondents in the consumer survey reported that in their experience, nearly all or most websites use online targeted advertising.
- 6.
O’Brien (2020).
- 7.
Strycharz et al. (2019).
- 8.
Arora et al. (2008). The one-to-one idea was introduced back in 1993 by Don Peppers and Martha Rogers after they observed the growing role of databases in marketing. They showed that marketing was shifting from a focus on the market to a focus on individual consumers, heading into an era in which mass marketing would no longer be an effective way to compete. In this setting, they predicted that “business will be able to communicate directly with consumers, individually, rather than shouting at them, in groups.” See, in general, Peppers and Rogers (1993).
- 9.
The inherent contradiction between personalisation and privacy is frequently referred to as the “privacy-personalisation paradox” (Aguirre et al. 2016): while personalisation requires extensive surveillance and may heighten privacy concerns among consumers, it also enhances consumer engagement with the firm.
- 10.
For a dive into how machine learning can be used to personalise all these aspects of a transaction, see Wirth and Sweet (2019). The book was written by the CEO and director of content marketing at Evergage. The company was acquired in 2020 by Salesforce, today a leading player in marketing products powered by AI. The book contains a valuable account of personalisation practices and the logic of data-driven business.
- 11.
Evergage Inc. (2019) reports survey results showing that marketers are using personalisation in email, homepages, landing pages, interior pages, online ads, product-detail pages, search results, pricing, and blog posts.
- 12.
By leveraging AI technologies and machine learning correctly, marketers “can now engage with every customer on a 1:1 basis, in-the-moment, with relevant offers across channels, at scale” (Galkin 2018).
- 13.
“Customer journey” is an expression popularly used by marketers to characterise and map the interaction between consumers and a brand. The basic idea is that marketing should no longer look at consumers as potential customers for products, but as individuals with whom to establish meaningful social relationships that foster interaction at many moments of the customer experience. For an overview of the concept, see Lemon and Verhoef (2016).
- 14.
Statista (2019).
- 15.
IAB Europe (2016), European Online Advertising surpasses TV to record annual spend of EUR 36.2 bn, https://iabeurope.eu/all-news/press-release-european-online-advertising-surpasses-tv-to-record-annual-spend-of-e36-2bn/.
- 16.
For an overview and historical evolution on the AdTech sector, see McStay (2016).
- 17.
For an informative overview of all such actors, see Sartor et al. (2021).
- 18.
White and Samuel (2019).
- 19.
On programmatic advertising and current development in the field, see Lee and Cho (2020).
- 20.
Chen et al. (2019).
- 21.
Targeted and personalised advertising are indeed often confused. While targeted advertising tailors a campaign to a specific audience, personalised advertising further adapts that campaign to a group singled out within that audience. See Babet (2020).
- 22.
Ghose et al. (2013).
- 23.
van Noort et al. (2020).
- 24.
Google (2015), EA Sports Madden GIFERATOR, https://www.thinkwithgoogle.com/consumer-insights/consumer-trends/ea-sports-madden-giferator/.
- 25.
Adobe, Adobe Advances Programmatic Advertising with New Dynamic Creative Technology, 2015, available at https://s23.q4cdn.com/979560357/files/doc_events/2015/04/1/042915AdobeAdvancesProgrammaticAdvertising.pdf.
- 26.
Bakpayev et al. (2020).
- 27.
Another cutting-edge development in programmatic creatives is automatic copywriting. As explained in Altstiel et al. (2018), automatic copywriting tools take consumers’ data as input, ranging from simply the customer’s name to specific information about their personality. By means of natural-language-generation techniques, the tool then outputs different possible keywords, design templates, and pictures to be used in the advertisements. For example, Lacy (2018) shows how Alibaba offers such a service to its business partners, who can use an AI copywriter that puts out 20,000 lines of copy per second.
- 28.
Ricci et al. (2011).
- 29.
In particular, Tuzhilin (2009), pp. 3–4.
- 30.
- 31.
In practice, pull and push methods are often used in combination. For example, Amazon’s recommendation system also pushes recommended items when a user searches for products (a pull method). Similarly, YouTube’s search results both match user queries and display recommended items based on inferred preferences.
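To make this blending concrete, here is a minimal, purely illustrative sketch of how a pull signal (an explicit query match) might be combined with a push signal (an inferred preference score). The catalogue, keywords, preference scores, and weights are all invented for illustration:

```python
# Hedged sketch: blending a pull signal (query match) with a push signal
# (inferred preference), as in the hybrid behaviour described above.
# All item data and weights below are hypothetical.

def rank_items(query, catalogue, preferences, w_pull=0.6, w_push=0.4):
    """Rank items by a weighted mix of query relevance and inferred preference."""
    results = []
    for item, keywords in catalogue.items():
        pull = 1.0 if query.lower() in keywords else 0.0  # explicit request
        push = preferences.get(item, 0.0)                 # inferred interest
        results.append((w_pull * pull + w_push * push, item))
    return [item for score, item in sorted(results, reverse=True)]

catalogue = {
    "running shoes": {"shoes", "running", "sport"},
    "trail shoes": {"shoes", "hiking", "trail"},
    "yoga mat": {"yoga", "fitness"},
}
preferences = {"trail shoes": 0.9, "yoga mat": 0.2}  # learned from past behaviour

print(rank_items("shoes", catalogue, preferences))
# → ['trail shoes', 'running shoes', 'yoga mat']
```

Even though both shoe items match the query equally, the inferred preference breaks the tie, which is exactly the push-within-pull pattern the note describes.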
- 32.
Tuzhilin (2009).
- 33.
Jannach and Jugovac (2019).
- 34.
The Server Side (2017).
- 35.
The annual RecSys conference, dedicated to state-of-the-art research and applications in recommender systems, routinely hosts workshops on applying deep-learning methods to recommender systems. For an updated review of recommender-system applications in e-commerce based on deep learning, see Zhang et al. (2019).
- 36.
Haruna et al. (2017).
- 37.
CXL (2019).
- 38.
Greenberg and Witten (1985).
- 39.
Findlater and Gajos (2009).
- 40.
Wirth and Sweet (2019), pp. 94–117.
- 41.
Wirth and Sweet (2019), p. 94.
- 42.
Criteo (2017).
- 43.
LiftIgniter (2019).
- 44.
- 45.
Sunstein (2014).
- 46.
Goldman (2006) precisely makes this point, describing the benefits of digital market research that can uncover and then fulfil “latent preferences” that the consumers would otherwise be incapable of articulating themselves.
- 47.
Strycharz et al. (2019).
- 48.
For example, Wollan et al. (2017) show that personalisation is based on meaningfulness and relevance. Zealley et al. (2018) emphasise immediacy, since to do relevant marketing means to “serve a customer’s most relevant needs in the moment.” Similarly, Zoratti and Gallagher (2012) suggest that personalised marketing is relevant when a business reaches customers with the right message or offer in the right channel at the right time.
- 49.
Merchant (2019).
- 50.
Sweet (2018).
- 51.
Lytics (2018).
- 52.
Kantar (2014).
- 53.
Boudet et al. (2018).
- 54.
Bleier et al. (2018).
- 55.
LaptrinhX (2020).
- 56.
From the YouTube blog: “If viewers are watching more YouTube, it signals to us that they’re happier with the content they’ve found.” See YouTube, “Why We Focus on Watch Time,” 2012, https://youtube-creators.googleblog.com/2012/08/youtube-now-why-we-focus-on-watch-time.html.
- 57.
Lury and Day (2019) speak of a “zooming of the self”.
- 58.
Lupton (2016), p. 338. The essay draws on the concept of “algorithmic assemblage” and expands it by arguing that, thanks to algorithms, data subjects co-evolve with others in their cohort: “The vitality of digital data has significant implications for people’s data practices. People are confronted with attempting to gain some purchase on information about themselves which is not only continually generated but is also used by other actors and agencies in ways of which they may not be fully aware”.
- 59.
A lucid attempt to lay bare the biased nature of personalisation can be found in Greene and Shmueli (2019), p. 4. Speaking of the person as a feature vector, they deny that machine-learning personalisation can be reconciled with the humanistic view of the person. Rather, “personalized scores are generated by combining information about your observed behavior with certain assumptions about you and your goals, beliefs, and preferences to predict a very constrained set of future actions”.
- 60.
Kohavi and Thomke (2017) describe how Microsoft and several other leading companies—including Amazon, Booking.com, Facebook, and Google—each conduct more than 10,000 online controlled experiments annually, with many tests involving millions of users.
- 61.
See, e.g., Vertical Leap (2018).
- 62.
A popular approach in algorithmic marketing is contextual multi-armed bandit testing based on reinforcement learning.
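As an illustration of the idea (not of any particular vendor’s system), here is a toy epsilon-greedy contextual bandit in Python. The segments, ad variants, and click-through rates are all invented, and production systems typically use richer context features and policies such as Thompson sampling or LinUCB:

```python
# Toy contextual bandit: per-segment epsilon-greedy selection of ad variants.
# All segments, variants, and click-through rates are hypothetical.
import random

random.seed(0)

SEGMENTS = ["new_visitor", "returning"]          # context (user segment)
VARIANTS = ["banner_A", "banner_B"]              # arms (ad creatives)
TRUE_CTR = {("new_visitor", "banner_A"): 0.05,   # unknown to the learner
            ("new_visitor", "banner_B"): 0.12,
            ("returning", "banner_A"): 0.10,
            ("returning", "banner_B"): 0.04}

counts = {k: 0 for k in TRUE_CTR}
clicks = {k: 0 for k in TRUE_CTR}

def choose(segment, epsilon=0.1):
    """Epsilon-greedy: mostly exploit the best-observed variant per context."""
    if random.random() < epsilon:
        return random.choice(VARIANTS)           # explore
    rates = {v: clicks[(segment, v)] / counts[(segment, v)]
             if counts[(segment, v)] else 0.0 for v in VARIANTS}
    return max(rates, key=rates.get)             # exploit

for _ in range(20000):
    seg = random.choice(SEGMENTS)
    arm = choose(seg)
    reward = 1 if random.random() < TRUE_CTR[(seg, arm)] else 0
    counts[(seg, arm)] += 1
    clicks[(seg, arm)] += reward

# After learning, each segment mostly receives its better-performing variant.
```

The point of the sketch is the "contextual" part: the system does not learn one best ad, but a different best ad for each segment, continuously rebalancing exploration and exploitation.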
- 63.
Kohavi and Thomke (2017): “some executives mistakenly believe that causality isn’t important. In their minds all they need to do is establish correlation, and causality can be inferred. Wrong!”
- 64.
Siroker and Koomen (2013), p. 92, argue that “A/B testing neutralises the ideological and replaces it with the empirical. Instead of, ‘I feel we should do X because that’s what I feel,’ a culture of A/B testing encourages both curiosity and humility, where people say, ‘I hypothesise that X is better than what we have right now. I don’t have the data today, but let’s run an experiment and test it.’ After that process is complete, A/B testing then offers the much stronger claim: ‘I know we should do X because of the evidence that it’s the right thing to do.’”
- 65.
As Lowrie (2017) argues, algorithms can only be evaluated in their functioning as components of extended computational assemblages; on their own, they are inert. Consequently, the epistemological coding proper to this evaluation does not turn on truth and falsehood but rather on efficiency.
- 66.
Susser (2019). The understanding of technology as a mediating interface between people and the world is founded on the post-phenomenological approach to technology embraced by Don Ihde and Peter-Paul Verbeek, according to which technology bears meaning only in context. In particular, the theory of technological mediation analyses the influence that technology has on human behaviour and relations. For an introduction, see Verbeek (2015).
- 67.
The experiment is explained in Luguri and Strahilevitz (2021).
- 68.
- 69.
Lessig (2000), p. 84.
- 70.
Thaler and Sunstein (2009).
- 71.
Kahneman (2011). As is well known, Kahneman’s great contribution to behavioural science (and economics) lies in a general theory of cognition that divides the human mind into a dual system consisting of two types of cognitive activity, which Kahneman calls System 1 and System 2. System 1 makes judgments quickly, in ways that are experienced as effortless and mostly automatic. System 2, in contrast, is slower, experienced as more effortful, and quite limited in how much incoming information it can process at any given time. System 2 handles the most complex and mentally taxing operations, but System 1 serves as the pervasive default mode for making decisions. Behavioural economists largely focus on how System 1 processes lead to decision-making outcomes that diverge from the optimal outcomes predicted by a rational-choice perspective. Thus, a choice architecture can be created to leverage System 2, generally leading to rational and self-reflective decisions, but it can also exploit System 1, providing a context for impulsive, biased, and incorrect decisions.
- 72.
Thaler and Sunstein (2009), pp. 81–102.
- 73.
Thaler and Sunstein (2009), p. 428.
- 74.
As has been argued in behavioural economics, the proper name for these practices is not “nudge” but “sludge” (Thaler 2018).
- 75.
Nadler and McGuigan (2018) show that marketers often represent themselves as “choice architects.” For example, Ogilvy.com, a unit specialising in behavioural economics for marketing, describes itself as a team of “choice architects” who “apply principles from cognitive psychology, social psychology and behavioural science to create measurable behaviour change in the real world.” This unit has a global presence with ten offices worldwide and works with marquee firms such as American Express, Nestlé, British Airways, and Starbucks. Another advertising giant, Foote, Cone & Belding (FCB), has established an Institute of Decision-Making that partners with businesses to deliver insights about clients by working from behavioural psychology.
- 76.
Pasquale (2006).
- 77.
Cheney-Lippold (2011).
- 78.
Yeung (2017).
- 79.
Yeung (2017), p. 121.
- 80.
See Darmody and Zwick (2020), who describe hyper-personalisation as “un-marketing”: a vision in which marketing cannot fail to be “on target,” whereas it was arguably the possibility of missing the target that once empowered consumers in the business-to-consumer game.
- 81.
See, e.g., Banker and Khetani (2019). Through several behavioural experiments, the authors show that recommendation systems make users over-dependent on algorithmic recommendations and put them at greater risk of choosing inferior products and services, with a consequent loss in welfare; this is especially the case with highly specialised medical and financial products. They draw the attention of policymakers to the fact that “without full information on product attributes and consumers’ true utility functions, both watchdogs and consumers themselves cannot accurately assess whether adopting the recommendations they are presented with may lead to a welfare loss.”
- 82.
This possibility has indeed been proven in studies on targeted advertising, such as Lienemann et al. (2019).
- 83.
- 84.
André et al. (2018) endorse the view that while algorithmic systems may relieve choice overload, their “psychological reductionism” may deprive consumers of the ability to exercise self-reflection and caution in making attentive judgments between autonomous and hedonistic choices. See also Köcher and Holzmüller (2017).
- 85.
Machines have only a limited ability to build on explicit or discursive feedback from humans and must instead rely on proxies. This is a significant limit of AI that Cristianini (2021) argues to be responsible for a major cultural transformation in which “unobservable quantities” are replaced with “cheaper proxies.”
- 86.
Cohen (2019), p. 77.
- 87.
It could even be argued that from an ad that is displayed but not clicked on, the system will infer that we are not interested in that product, even if we simply did not notice the ad, and thus regardless of whether the choice was deliberate or was not a choice at all. From an algorithmic perspective, then, even an action not taken is data that can be stored and analysed.
- 88.
- 89.
De Vries (2010).
- 90.
On the problem of compulsive buying in personalised advertising, see Mikolajczak-Degrauwe and Brengman (2014).
- 91.
Willis (2020).
- 92.
- 93.
See Grafanaki (2017), who explains that such self-reinforcing feedback loops no longer affect only the sphere of consumption but any activity in areas of life (news, media, healthcare) that are mediated by personalisation technologies.
- 94.
- 95.
On the concept of filter bubble see, more extensively, Pariser (2011), passim.
- 96.
For a different view, see Sunstein (2018), pp. 54–55, who, assuming that algorithms actually reflect users’ choices, interprets algorithmic personalisation as “consumer sovereignty in action.” This is a type of sovereignty that to a certain degree admits and welcomes exposure to fixed and repetitive consumption choices, and Sunstein distinguishes it from political sovereignty. The latter rests on the fundamental values of ideological pluralism and free access to informed debate, which presupposes a certain diversity of viewpoints. It follows that algorithmic personalisation may be justified when used to influence consumers’ choices, but not when used to shape the political identity of citizens.
- 97.
- 98.
See, e.g., Zwebner and Schrift (2020), who, on a sample of two hundred individuals, find that most consumers generally show aversion to being observed by online platforms in the process of forming preferences before making a purchase.
- 99.
Schneider et al. (2018).
- 100.
- 101.
Thaler and Tucker (2013). On “personalised nudges,” see specifically Mills (2020), who defines them as “the process of adding, adapting or augmenting a nudge using personal data to reduce the behavioural externalities produced by the nudge, while retaining the intervention’s status as a nudge.”
- 102.
Schneider et al. (2018) provide a framework for designing personalised nudges based on repeated cycles of defining a goal, understanding the user, designing the nudge, and testing its effectiveness with experimentation techniques such as A/B (or split) testing.
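The testing step of such a cycle can be sketched with a two-proportion z-test on conversion counts. The figures below are hypothetical, and a real experiment would also fix sample sizes in advance and correct for multiple comparisons:

```python
# Hedged sketch of A/B significance testing on invented conversion data.
import math

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """z-statistic for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)        # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical experiment: default layout (A) vs. a nudge variant (B).
z = two_proportion_ztest(conv_a=120, n_a=2400, conv_b=168, n_b=2400)
print(round(z, 2))  # → 2.92; |z| > 1.96 is significant at the 5% level
```

Here a 5% vs. 7% conversion rate over 2,400 users per arm yields z ≈ 2.92, so under conventional thresholds the nudge variant would be declared the winner and rolled out, closing one iteration of the design cycle.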
- 103.
Experiments on nudges are what the company Nudgify provides to its business clients: its tool A/B-tests different nudges and sees how they perform on different consumer groups. Likewise, Adobe offers AI-powered products that make it possible to A/B-test the whole experience of nudging consumers. The AI platform “helps ensure that you zero in on the experience that nudges customers to buy, read, download, or take whatever other action you want them to take to meet business goals.”
- 104.
Kaptein (2015).
- 105.
According to Cialdini (2007), persuasive communication is based on six cognitive strategies built on social norms that make it possible to modify individuals’ attitudes. These strategies include authority (people are inclined to follow suggestions originating from authority), consensus (individuals are more prone to follow the behaviour of others), commitment (people strive to maintain consistency in their behaviour if they have previously committed to a course of action), scarcity (the actual or presumed scarcity of commodities and opportunities tends to increase the probability that people will seek or pursue them), liking (people are more easily influenced by those who express liking for them or with whom they can identify), and reciprocity (individuals are more likely to comply when they are in a relationship of reciprocity with those who try to persuade them, as when they owe a favour).
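Persuasion profiling of this kind can be caricatured as a per-user tally of how each of the six strategies has performed. The users, interaction events, and scoring rule below are invented for illustration and are not drawn from Kaptein’s or any vendor’s actual system:

```python
# Hedged sketch: a per-user "persuasion profile" as success rates over
# Cialdini's six strategies. All users and events are hypothetical.
from collections import defaultdict

CIALDINI = ["authority", "consensus", "commitment",
            "scarcity", "liking", "reciprocity"]

shown = defaultdict(lambda: {s: 0 for s in CIALDINI})   # times each cue was shown
worked = defaultdict(lambda: {s: 0 for s in CIALDINI})  # times it led to conversion

def record(user, strategy, converted):
    """Update the user's profile after one exposure to a persuasive cue."""
    shown[user][strategy] += 1
    worked[user][strategy] += int(converted)

def best_strategy(user):
    """Pick the strategy with the highest observed success rate for this user."""
    rates = {s: worked[user][s] / shown[user][s] if shown[user][s] else 0.0
             for s in CIALDINI}
    return max(rates, key=rates.get)

record("alice", "scarcity", True)
record("alice", "scarcity", True)
record("alice", "authority", False)
print(best_strategy("alice"))  # → scarcity
```

The sketch shows why such profiling is individual rather than segment-based: the same shop can show one visitor “only 2 left in stock” and another “recommended by experts,” depending purely on each visitor’s recorded responsiveness.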
- 106.
Kaptein (2015).
- 107.
Zoovü, Persuasion Profiling – How You Sell Matters More Than What You Sell, 2018, https://zoovu.com/blog/persuasion-profiling/.
- 108.
Johnathan Keane, Crobox raises $1.25 million for its persuasion-as-a-service technology, TechEU, 2016, https://tech.eu/brief/crobox-1-25-million/.
References
Adomavicius G, Bockstedt JC et al (2013) Do recommender systems manipulate consumer preferences? A study of anchoring effects. Inf Syst Res 24(4):956–975
Adomavicius G, Bockstedt J et al (2019) The hidden side effects of recommendation systems. MIT Sloan Manag Rev 60(2):13–15
Aguirre E et al (2016) The personalization-privacy paradox: implications for new media. J Consum Mark 33(2):98–110
Altstiel T, Grow J, Jennings M (2018) Advertising creative: strategy, copy, and design. SAGE Publications
Anderson C (2006) The long tail: why the future of business is selling less of more. Hachette Books
André Q et al (2018) Consumer choice and autonomy in the age of artificial intelligence and big data. Cust Needs Solut 5(1-2):28–37
Arora N et al (2008) Putting one-to-one marketing to work: personalization, customization, and choice. Mark Lett 19(3-4):305–321
Babet A (2020) Utilization of personalization in marketing automation and email marketing. MA thesis, School of Business and Management, Kauppatieteet. https://lutpub.lut.fi/handle/10024/161404
Bakpayev M et al (2020) Programmatic creative: AI can think but it cannot feel. Australas Mark J AMJ
Banker S, Khetani S (2019) Algorithm overdependence: how the use of algorithmic recommendation systems can increase risks to consumer well-being. J Public Policy Mark 38(4):500–515
Bleier A, De Keyser A, Verleye K (2018) Customer engagement through personalization and customization. In: Palmatier RW, Kumar V, Harmeling CM (eds) Customer engagement marketing. Springer, pp 75–94
Boudet J et al (2018) No customer left behind: how to drive growth by putting personalization at the center of your marketing. https://www.mckinsey.com/business-functions/marketing-and-sales/our-insights/no-customer-left-behind#
Bruckner DW (2011) Second-order preferences and instrumental rationality. Acta Analytica 26(4):367–385
Brynjolfsson E, Hu Y, Smith MD (2003) Consumer surplus in the digital economy: estimating the value of increased product variety at online booksellers. Manag Sci 49(11):1580–1596
Chen G et al (2019) Understanding programmatic creative: the role of AI. J Advert 48(4):347–355
Cheney-Lippold J (2011) A new algorithmic identity: soft biopolitics and the modulation of control. Theory Cult Soc 28(6):164–181
Cialdini RB (2007) Influence: the psychology of persuasion. Collins, New York
Cohen JE (2019) Between truth and power: the legal constructions of informational capitalism. Oxford University Press
Cristianini N (2021) Shortcuts to artificial intelligence. In: Pelillo M, Scantamburlo T (eds) Machines we trust: perspectives on dependable AI. MIT Press
Criteo (2017) This start-up uses machine learning to build websites. https://www.criteo.com/blog/bookmark-machine-learning-website/ (visited on 12/29/2021)
CXL (Sept. 2019) Why Content Personalization Is Not Web Personalization (and What to Do About It). Post by Shanelle Mullin. https://cxl.com/blog/web-personalization/ (visited on 12/10/2021)
Darmody A, Zwick D (2020) Manipulate to empower: hyper-relevance and the contradictions of marketing in the age of surveillance capitalism. Big Data Soc 7(1):1–12
De Vries K (2010) Identity, profiling algorithms and a world of ambient intelligence. Ethics Inf Technol 12(1):71–85
Evergage Inc. (2019) 2019 Trends in Personalization. https://www.evergage.com/wp-content/uploads/2019/04/2019_Trends_in_Personalization_Report.pdf (visited on 12/10/2021)
Findlater L, Gajos KZ (2009) Design space and evaluation challenges of adaptive graphical user interfaces. AI Mag 30(4):68–68
Frankfurt HG (1971) Freedom of the will and the concept of a person. J Philos 68(1):5–20
Gal MS, Elkin-Koren N (2017) Algorithmic consumers. Harv J Law Technol 30(2):309–352
Galkin A (2018) Retail switch: from generalization to hyper-personalization. Forbes. https://www.forbes.com/sites/forbestechcouncil/2018/06/25/retail-switch-from-generalization-to-hyperpersonalization/?sh=3f18a2106bc0
Ghose A, Goldfarb A, Han SP (2013) How is the mobile Internet different? Search costs and local activities. Inf Syst Res 24(3):613–631
Giddens A (1984) The constitution of society: outline of the theory of structuration. University of California Press
Goldman E (2006) Data mining and attention consumption. In: Strandburg KJ, Raicu DS (eds) Privacy and technologies of identity: a cross-disciplinary conversation. Springer, pp 225–237
Grafanaki S (2017) Drowning in big data: abundance of choice, scarcity of attention and the personalization trap, a case for regulation. Richmond J Law Technol 24:1
Gray CM et al (2018) The dark (patterns) side of UX design. In: Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, pp 1–14
Greenberg S, Witten IH (1985) Adaptive personalized interfaces – a question of viability. Behav Inform Technol 4(1):31–45
Greene T, Shmueli G (2019) How personal is machine learning personalization?. arXiv preprint (arXiv:1912.07938)
Greene T, Shmueli G (2020) Beyond our behavior: the GDPR and humanistic personalization. arXiv preprint (arXiv:2008.13404)
Grimmelmann J (2004) Regulation by software. Yale Law J 114:1719–1758
Haruna K et al (2017) Context-aware recommender system: a review of recent developmental process and future research direction. Appl Sci 7(12):1211
Helberger N et al (2021) EU consumer protection 2.0. Structural asymmetries in digital consumer markets. BEUC, Joint report from EUCP2.0 project
Jannach D, Jugovac M (2019) Measuring the business value of recommender systems. ACM Transact Manage Inf Syst (TMIS) 10(4):1–23
Kahneman D (2011) Thinking, fast and slow. Macmillan
Kantar (2014) 6 Common Digital Advertising Objectives. https://www.kantarmedia.com/us/thinking-and-resources/blog/6-common-digital-advertising-objectives
Kaptein M (2015) Persuasion profiling: how the internet knows what makes you tick. Business Contact
Köcher S, Holzmüller HH (2017) New Hidden Persuaders: an investigation of anchoring effects of recommender systems on consumer choice. In: Stieler M (ed) Creating marketing magic and innovative future marketing trends. Springer, pp 51–52
Kohavi R, Thomke S (Sept. 2017) The surprising power of online experiments. Harv Bus Rev. https://hbr.org/2017/09/the-surprising-power-of-online-experiments
Lacy L (2018) Alibaba says its AI copywriting tool passed the Turing test. https://www.adweek.com/commerce/alibaba-says-its-ai-copywriting-tool-passed-the-turing-test/ (visited on 12/29/2021)
LaptrinhX (Aug. 2020) 34 Marketing Metrics to Include in Every Marketing Report. https://laptrinhx.com/34-marketing-metrics-to-include-in-every-marketing-report-2764961370/ (visited on 12/29/2021)
Lee H, Cho C-H (2020) Digital advertising: present and future prospects. Int J Advert 39(3):332–341
Lemon KN, Verhoef PC (2016) Understanding customer experience throughout the customer journey. J Mark 80(6):69–96
Lessig L (2000) Code is law. Harvard Magazine. https://harvardmagazine.com/2000/01/code-is-law-html
Lienemann BA et al (2019) Tobacco advertisement liking, vulnerability factors, and tobacco use among young adults. Nicotine Tob Res 21(3):300–308
LiftIgniter (2019) Products Recommendations for your Ecommerce Website. https://www.liftigniter.com/ecommerce (visited on 12/29/2021)
Logg JM, Minson JA, Moore DA (2019) Algorithm appreciation: people prefer algorithmic to human judgment. Organ Behav Hum Decis Process 151:90–103
Lowrie I (2017) Algorithmic rationality: epistemology and efficiency in the data sciences. Big Data Soc 4(1)
Luguri J, Strahilevitz LJ (2021) Shining a light on dark patterns. J Legal Anal 13(1):43–109
Lupton D (2016) Personal data practices in the age of lively data. In: Daniels J, Gregory K, McMillan Cottom T (eds) Digital sociologies. Bristol University Press, pp 339–354
Lury C, Day S (2019) Algorithmic personalization as a mode of individuation. Theory Cult Soc 36(2):17–37
Lytics (2018) The ultimate guide to personalized marketing. https://www.lytics.com/blog/personalization-at-scale-11-marketing-to-the-millions/ (visited on 08/11/2021)
Mathur A et al (2019) Dark patterns at scale: findings from a crawl of 11K shopping websites. Proc ACM Human-Comput Interact 3(CSCW), pp 1–32
McStay A (2016) Digital advertising. Macmillan International Higher Education
Merchant M (May 2019) Why consumers prefer personalization. Post by Tom Zawacki. https://multichannelmerchant.com/blog/why-consumers-prefer-personalization/ (visited on 08/11/2021)
Mikolajczak-Degrauwe K, Brengman M (2014) The influence of advertising on compulsive buying – the role of persuasion knowledge. J Behav Addict 3(1):65–73
Mills S (2020) Personalized nudging. Behav Public Policy:1–10
Nadler A, McGuigan L (2018) An impulse to exploit: the behavioral turn in data-driven marketing. Crit Stud Media Commun 35(2):151–165
Nguyen TT et al (2014) Exploring the filter bubble: the effect of using recommender systems on content diversity. In: Proceedings of the 23rd international conference on World wide web, pp 677–686
Novemsky N et al (2007) Preference fluency in choice. J Mark Res 44(3):347–356
O’Brien M (2020) Customers Demand Personalization — However They Define It. https://www.sailthru.com/marketing-blog/personalization-emarketer-webinar/ (visited on 12/29/2021)
Pariser E (2011) The filter bubble: what the Internet is hiding from you. Penguin Press
Park Y-J, Tuzhilin A (2008) The long tail of recommender systems and how to leverage it. In: Proceedings of the 2008 ACM conference on Recommender systems, pp 11–18
Pasquale F (2006) Rankings, reductionism, and responsibility. Clev St Law Rev 54:115
Peppers D, Rogers M (1993) The one-to-one future: Building relationships one customer at a time. Currency Doubleday
Ricci F, Rokach L, Shapira B (2011) Introduction to recommender systems handbook. In: Ricci F et al (eds) Recommender systems handbook. Springer, pp 1–35
Sartor G, Lagioia F, Galli F (2021) Regulating targeted and behavioural advertising in digital services. Tech. rep. PE 694.680 Policy Department for Citizens’ Rights and Constitutional Affairs
Schneider C, Weinmann M, Vom Brocke J (2018) Digital nudging: guiding online user choices through interface design. Commun ACM 61(7):67–73
Schwartz PM (2000) Internet privacy and the state. Conn Law Rev 32:815–860
Siroker D, Koomen P (2013) A/B testing: the most powerful way to turn clicks into customers. John Wiley & Sons
Statista (2019) Online advertising spending in Europe from 2006-2019. https://www.statista.com/statistics/307005/europe-online-ad-spend/ (visited on 12/08/2021)
Strycharz J et al (2019) Contrasting perspectives–practitioner’s viewpoint on personalised marketing communication. Eur J Mark 53(4):635–660
Sunstein CR (2014) Choosing not to choose. Duke Law J 64(1):1–52
Sunstein CR (2018) # Republic. Princeton University Press
Susser D (2019) Invisible Influence: Artificial Intelligence and the Ethics of Adaptive Choice Architectures. In: AIES ’19: Proceedings of the 2019 AAAI/ACM Conference on AI, Ethics, and Society, pp 403–408
Sweet K (2018) The Road to Successful Personalization. https://www.business2community.com/infographics/road-successful-personalization-infographic-02041891 (visited on 12/20/2021)
Thaler RH (2018) Nudge, not sludge. Science 361(6401):431
Thaler RH, Sunstein CR (2009) Nudge: improving decisions about health, wealth, and happiness. Penguin Books
Thaler RH, Tucker W (2013) Smarter information, smarter consumers. Harv Bus Rev 91(1):44–54
The Server Side (Aug. 2017) How Pandora built a better recommendation engine. Post by George Lawton. https://www.theserverside.com/feature/How-Pandora-built-a-better-recommendation-engine
Thomas R, Uminsky D (2020) The Problem with Metrics is a Fundamental Problem for AI. arXiv preprint (arXiv:2002.08512)
Tuzhilin A (2009) Personalization: the state of the art and future directions. In: Adomavicius G, Gupta A (eds) Handbooks in information systems, business computing. Emerald, pp 3–43
van Noort G et al (2020) Introducing a model of automated brand-generated content in an era of computational advertising. J Advert 49(4):411–427
Verbeek P-P (2015) Beyond interaction: a short introduction to mediation theory. Interactions 22(3):26–31
Vertical Leap (2018) What can machine learning do for me right now in marketing?. https://www.vertical-leap.uk/blog/what-can-machine-learning-do-for-me-in-marketing/
White GR, Samuel A (2019) Programmatic Advertising: Forewarning and avoiding hype-cycle failure. Technol Forecast Soc Chang 144:157–168
Willis LE (2020) Deception by Design. Harv J Law Technol 34:116–149
Wirth K, Sweet K (2019) One-to-One personalization in the age of machine learning, 2nd edn. Evergage, Inc
Wollan R et al (2017) Put your trust in hyper-relevance. https://www.accenture.com/_acnmedia/pdf-69/accenture-global_dd_gcpr-hyper-relevance.pdf
Yeung K (2017) ‘Hypernudge’: big data as a mode of regulation by design. Inf Commun Soc 20(1):118–136
Zarsky TZ (2003) “Mine Your Own Business!”: making the case for the implications of the data mining of personal information in the forum of public opinion. Yale J Law Technol 5(1):1–56
Zealley J, Wollan R, Bellin J (Mar. 2018) Marketers need to stop focusing on loyalty and start thinking about relevance. Harvard Business Review. https://hbr.org/2018/03/marketers-need-to-stop-focusing-on-loyalty-and-start-thinking-about-relevance
Zhang S et al (2019) Deep learning based recommender system: a survey and new perspectives. ACM Comput Surv (CSUR) 52(1):1–38
Zittrain J (2009) Law and technology. The end of the generative internet. Commun ACM 52(1):18–20
Zoratti S, Gallagher L (2012) Precision marketing: maximizing revenue through relevance. Kogan Page Publishers
Zwebner Y, Schrift RY (2020) On my own: the aversion to being observed during the preference construction stage. J Consum Res 47(4):475–499
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Galli, F. (2022). Predictive Personalisation. In: Algorithmic Marketing and EU Law on Unfair Commercial Practices. Law, Governance and Technology Series, vol 50. Springer, Cham. https://doi.org/10.1007/978-3-031-13603-0_4
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-13602-3
Online ISBN: 978-3-031-13603-0
eBook Packages: Law and Criminology (R0)