This chapter offers an account of how alcohol marketers have used social media platforms over the past fifteen years, and argues that understanding the engineering, operation and consequences of platforms’ data-driven, participatory and opaque advertising model is fundamental to addressing larger questions of platform regulation in the public interest. It suggests that through the case of alcohol marketing we can understand and assess many of the novel regulatory challenges posed by the advertising model of digital platform companies. Thus, in this chapter we appraise some of the existing alcohol industry and platform approaches to self-regulation and suggest some principles for regulating marketing that is data-driven, participatory and opaque, and connect these to larger debates about the future regulation of platforms. The critical assessment of the novel ways in which platform marketing integrates participatory forms of audience engagement with the prospecting, segmentation and targeting of consumers is crucial for developing an accountable regulatory regime that allows for effective governance of the commercial activities of marketers and brands on platforms.
- Alcohol marketing
- Social media
- Digital platforms
- Algorithmic culture
Digital platforms like Facebook, Google, Instagram, YouTube, Snapchat and TikTok on the US-centric internet have transformed advertising and marketing. They super-charge consumer participation, the collection and use of the data created by this interaction, and the capacity of business to respond to consumers in real-time. As the Australian Competition and Consumer Commission’s Digital Platforms Inquiry (2019) demonstrated, questions about digital platforms’ market power and their impact on public life are inseparable from questions about the nature and regulation of their advertising models.
In this chapter we argue that over the past decade these platforms have not only become among the largest advertising companies in the world; they have also transformed what advertising is and how it impacts on consumers, publics and societies. Search and social media platforms in particular are advertiser-funded media engineering projects. By this we mean that where mass media businesses typically invested revenues generated from advertising into the production of content (like news and entertainment), digital platforms invest their revenues into the transformation of the medium itself: the design of interfaces and data-processing techniques to capture and channel attention and action. This means that regulation needs to respond to a form of marketing where the innovation takes place at the level of the medium, rather than the content of advertisements and their placement.
The data-driven, participatory and opaque nature of advertising on digital platforms is fundamental to questions about how we conceive of its governance or regulation:
- The data-driven or algorithmic nature of the advertising model of platforms requires us to contend with the collection and analysis of data used to customise and target advertisements and to optimise engagement with users.
- The participatory qualities of advertising on platforms mean we need to pay attention to how ordinary people and professional intermediaries like influencers are called on to incorporate advertisements into the content they produce and share.
- The opaque nature of advertising on platforms means that the activities of advertisers are increasingly only visible to the consumer being targeted, and they often take an ephemeral form. Even if promotional communication is visible to those targeted, the data-processing operations that produce those texts and optimise engagement with them are only visible to the platform and advertiser. This means advertisements are not published or archived in ways that enable broader public consideration. As a consequence, advertising is opaque in the sense that the public does not know how data is being used to target specific groups of consumers.
Many of the larger regulatory and public interest questions we are currently grappling with when thinking about the regulation of digital platforms depend on us understanding how the engineering of this new form of advertising is fundamental to the ongoing development of platforms.
In this chapter we argue that platforms are best characterised as the creators of an algorithmic brand culture (Carah and Angus 2018). The algorithms that now play a central role in governing our culture (Striphas 2015) are developed to serve the interests of advertisers and brands (Carah and Angus 2018). To understand the influence of platforms requires us to regard them as more than just the creators of a new form of targeted advertising. Rather, they have engineered a new set of relationships between our everyday communicative culture, its rhythms of consumer behaviour and expression, and the data-processing power of digital media. In the following, we will document the emergence of this algorithmic brand culture in several phases, as a means for articulating how a particular set of regulatory questions and challenges have emerged as platforms have ‘undone’ the well-established commercial settlements of mass media and advertising.
Platforms’ dominance of audience and advertising markets makes them a central actor in governing the relationships between advertisers, creative agencies, content producers, influencers and creators, and consumers. In the case of alcohol marketing, platforms and marketers have created their own uneven governance frameworks that build on their pre-existing ‘quasi regulatory’ models and guidelines. Duguay et al. (2018) note that the regulation of social media platforms typically follows a ‘patchwork governance’ approach which combines formal policies (for instance, content moderation policies and regulations) with the selective use of technological and automated governance mechanisms (for instance, hashtag filtering, algorithmic content selection and distribution). As we will argue in detail below, in the case of alcohol marketing on digital platforms, these governance frameworks tend to focus on the symbolic content of advertisements but (conveniently) fail to address how the participatory, algorithmic and opaque qualities of the advertising model contribute to potentially harmful representations of drinking culture and—ultimately—alcohol consumption.
Throughout the chapter we draw on examples of how alcohol marketers have used digital media platforms over the past fifteen years. We do this for two reasons. Firstly, alcohol marketers have been innovative users of digital media. And, secondly, they paradigmatically demonstrate the regulatory challenges associated with marketing in the platform era because they are promoting a commodity that the public has an interest in regulating. Alcohol marketers were early adopters of digital media platforms. Before these platforms had formal advertising markets, brands developed a range of creative strategies for generating engagement on platforms like MySpace, Facebook and Instagram. These included staging and sponsoring events, partnering with influencers and creators, and making their own native content (Carah and Shaul 2016). We suggest that through the case of alcohol marketing we can understand and assess many of the regulatory challenges posed by the advertising model of digital media platforms.
We argue that two critical developments laid the groundwork for particular regulatory challenges. Firstly, in the era before platforms had developed mature, formal paid advertising models, advertisers sought to generate organic reach by making themselves part of the participatory culture of platforms. Brands presented themselves as open-ended cultural resources that consumers could incorporate into the everyday stories they created about their identities and practices. This led to questions about the responsibility of advertisers for the moderation of content on their profiles and pages and created uncertainty about how the disclosure of commercial intent ought to work within social media’s participatory cultures. As platforms created algorithmically-curated feeds of content, advertisers were incentivised to ‘game’ the system by producing content that platforms would rate favourably. This marked the beginning of more mature settlements with platforms about how advertisers could operate as native publishers. Platforms became tuned toward recommending content that captured user attention and sustained user engagement, and rewarded advertisers who did the same.
Secondly, as platforms shifted advertisers away from ‘organic’ and toward ‘paid’ reach, the advertising model also transformed from ‘targeting ads’ to offering a more integrated data-driven ‘tuning’ or ‘optimisation’ of relationships between advertisers and consumers. Advertising is now central to platforms’ efforts to generate curated feeds of content that sustain engagement and maximise revenue. And, as the data-driven architecture of optimising relationships between ads and audiences is increasingly refined, it has also become more opaque, with advertisers’ ‘public’ pages and profiles replaced by highly targeted advertising and branded content. This generates fundamental questions about what advertising is, what uses of data are harmful, and what kind of public accountability is necessary to make advertising open to public scrutiny. It also, we argue, turns our attention to how the advertising model lays the foundation for the engagement-based, metrics-oriented, algorithmic flow of information that generates many other fundamental regulatory questions that concern us. There is no way to address critical, platform-related issues like bias, discrimination, disinformation and so on without some kind of fundamental regulation of the advertising model.
Regulating Marketing on Digital Platforms
In recent years, sustained and critical attention has been given to the ‘platformization of the Internet’ (Helmond et al. 2019; van Dijck 2020; Poell et al. 2019). This work has drawn attention to the dominant role that platforms play as social, technical, institutional and infrastructural actors. Alongside this research, other streams of inquiry have investigated the ideological and ‘governing power’ of algorithms in relation to surveillance, transparency and consumer agency (Cheney-Lippold 2011; Beer 2017; Ananny and Crawford 2016; Ziewitz 2016). Furthermore, productive research has been conducted into the increasingly algorithmic nature of culture, drawing attention to the significant role algorithms and platforms play in the context of cultural production, content creation, and visibility (Striphas 2015; Nieborg and Poell 2018).
Comparatively little attention, though, has been paid to the specific case of digital advertising and to the fundamental role advertising plays in the process of platformisation and its associated forms of participatory expression, datafication and algorithmic culture. Critical communication and media studies scholars have typically approached advertising on digital platforms from theoretical and conceptual perspectives focusing on three interrelated key issues: privacy, data-driven consumer manipulation, and social discrimination. For example, McStay (2011) draws attention to the recursive loop underlying online behavioural advertising, in which information gained about individual consumer behaviour automatically feeds back into the design and development of subsequent promotional appeals. Because of this, he writes, ‘ideological examination of texts and audience positioning is far less important than awareness of delivery systems and the power, privacy and profiling relations that exist beneath hybridised behavioural advertising-machines’ (McStay 2011: 320).
Extending these concerns about behavioural targeting, Yeung (2017) introduced the notion of the ‘hypernudge’—a term describing the ways the deliberate and individualised algorithmic configuration of choice architectures can exploit systematic cognitive weaknesses in human decision-making (see also Nadler and McGuigan 2018). And Turow (2006, 2012) has advanced the argument that the increasing personalisation of promotional communication carries the risk of social discrimination as it sorts consumers according to their commercial profitability. This customisation arguably also shapes consumers’ self-identities and the social imaginary as a whole, thereby institutionalising an increasingly data-driven and discriminatory marketplace and associated media culture (Turow et al. 2015). Despite these concerns about the harmful and predatory character of digital advertising, media and communication research on regulation of digital platforms has mostly focussed on issues like speech, moderation, news and political campaigning. This is curious because each of these issues are shaped by how platforms’ algorithmic systems of classification, curation and recommendation are fundamentally designed to serve the strategic imperatives of an advertiser-funded business model.
In contrast, there is a longstanding concern among public health researchers and organisations about the digital marketing of unhealthy, addictive or harmful commodities like alcohol, gambling, tobacco and junk food. For the most part, this research has been tracking the shift of advertising spend to digital channels and attempting to assess its effects. In some ways, public health researchers and organisations have paid more sustained attention to the advertising model and questions of harm and regulation than media and communication researchers have because they were tracking unhealthy marketing in mass media and following its shift into digital channels.
Broadly, public health research has approached the marketing of unhealthy and addictive commodities on digital media through ‘exposure-centric’ and ‘engagement-centric’ frameworks (Carah and Brodmerkel 2021). The exposure-centric view continues the systematic research of the effects of exposing consumers to advertisements. This research operationalises advertising as the creation and distribution of discrete texts through mass media channels. The engagement-centric view attempts to conceptualise and describe how marketers capitalise on the participatory culture of social media platforms by involving consumers in creating, circulating and engaging with brand and promotional content (Goodwin et al. 2016; Lyons et al. 2016; Niland et al. 2017; Atkinson et al. 2017). Elsewhere we have argued that ‘an exposure-centric view aligns with the first wave of digital advertising that transported familiar forms of display advertising into online channels’ and an ‘engagement-centric view reflects the second wave of digital marketing’ instigated by the participatory culture of social media (Carah and Brodmerkel 2021). However, neither of these perspectives ‘adequately reflect how the advertising models of digital platforms have matured from early display advertising, to organic participatory engagement, to a third wave characterised by paid data-driven engagement that aims to optimise consumers’ perceptions and actions’ (Carah and Brodmerkel 2021: 20).
Digital platforms now operate as multi-sided market infrastructures that orchestrate and optimise relationships between consumers, businesses and platforms (Nieborg and Helmond 2019). Platforms combine the sale of advertising with a range of marketing services including data analytics, retail and distribution plug-ins, and tools for managing partnerships with consumers and cultural intermediaries like influencers. In the following section we endeavour to give an account of how platforms have managed this transition in their own commercial interests, using alcohol marketing as a case. By doing so we can illustrate how regulatory questions and challenges have ‘accumulated’ over the past decade. We can also demonstrate the important role that media and communication researchers must play in formulating a platform-centric understanding of how digital advertising works, as a precursor to understanding its potential for harm and the possibility of effective regulation.
The Development of Digital Platforms’ Advertising Model: The Case of Alcohol Marketing
Alcohol marketers were early adopters of digital media, experimenting with platforms like MySpace, Facebook and Instagram before there was a formal paid advertising model. As far back as 2012, Facebook and Diageo, one of the largest alcohol companies in the world, distributing global brands such as Smirnoff, Johnnie Walker and Guinness, announced a ‘collaboration’ to ‘maximise consumer participation at scale in our campaigns, particularly in emerging markets’ (Carah et al. 2014). Announcements like these were the first indication that alcohol marketing on platforms was much more elaborate than buying targeted advertising or fostering user-generated content. Instead, alcohol marketers and platforms were forming deep, ‘consultancy’-type arrangements in which they shared personnel, expertise and—crucially—data.
These arrangements were not just beneficial for the marketers. Perhaps more importantly, they helped the platforms refine their advertising models. Tools like ‘custom audiences’ or ‘lookalike audiences’ could only be developed by collaborating with major global marketers who shared large consumer datasets with Facebook. In the case of ‘custom audiences’ Facebook developed data-matching tools that would take the database of customer information that marketers held and use it to ‘find’ those customers within the Facebook platform. ‘Lookalike audiences’ are more sophisticated as they take an existing audience—provided by a marketer or built within Facebook—and then use it as a ‘query’ to generate a larger set of consumers with similar characteristics on the platform. Highly prospective tools like lookalike audiences illustrate that advertisers are not so much buying advertising space or slots in a feed of content as much as they are buying access to an infrastructure that can iteratively optimise relationships with targeted consumers.
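The general logic of audience expansion that tools like ‘lookalike audiences’ rely on can be illustrated with a deliberately minimal sketch: represent each user as a vector of behavioural signals, average the seed audience into a profile, and rank all other users by their similarity to that profile. The feature names and numbers below are invented for illustration and bear no relation to Facebook’s actual models.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def lookalike(seed, pool, size):
    """Expand a seed audience: average the seed users into a profile,
    then return the `size` pool users most similar to that profile."""
    dims = len(next(iter(seed.values())))
    profile = [sum(u[i] for u in seed.values()) / len(seed)
               for i in range(dims)]
    ranked = sorted(pool, key=lambda uid: cosine(pool[uid], profile),
                    reverse=True)
    return ranked[:size]

# Hypothetical signals: [nightlife check-ins, brand-page likes, video views]
seed = {"u1": [8, 5, 2], "u2": [7, 6, 1]}   # the marketer's customer list
pool = {"u3": [9, 5, 2], "u4": [0, 1, 9], "u5": [6, 6, 2]}
expanded = lookalike(seed, pool, 2)  # the pool users most like the seed
```

Real systems work with millions of users, thousands of learned features and far more sophisticated similarity models, but the commercial logic is the same: the marketer supplies a seed, and the platform sells access to the consumers its data places nearest to it.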
These features are arguably uniquely harmful for addictive commodities like alcohol where platforms’ algorithmic architecture could easily ‘learn’ unintended proxies for excessive or harmful consumption—classifying ‘dependent drinkers’ as ‘high value’ consumers due to patterns they share that may not be directly related to their expressed preferences for alcohol. These patterns could be derived from locative information that places them proximate to licensed venues more often than other consumers. They could be keywords used in their private messenger chats that indicate a preference for drinking, or drinking-related pastimes.
Throughout the past decade alcohol marketers have remained at the forefront of innovation about how to use digital media to track consumers and engage with them in specific times and places. In recent years, but especially since the onset of the Covid-19 pandemic, they have rapidly expanded their use of digital platforms to integrate advertising with ‘one click’ purchase and rapid home delivery services (Carah and Brodmerkel 2021; Mojica-Perez et al. 2019). In doing so, they are closing the gap between the promotion and distribution steps in their marketing strategies.
In the following, we describe and critically reflect on the transition from an organic to a paid advertising model. Where initially digital platforms separated ‘paid’ display advertising from ‘organic’ or ‘earned’ engagement, over time these elements have become integrated in a paid native model. We argue that analysing the shift from an organic to a paid advertising model is crucial to understanding how the advertising model of digital platforms has matured. It also helps us appreciate how our approach to studying and regulating digital advertising needs to focus on the ongoing development of platform infrastructure, rather than the specific activities of advertisers at any given moment in time. The lesson of the past decade is that the advertising model of digital platforms is in a continuous and generative state of transformation. This makes it fundamentally different to the advertising architecture of mass media. Strategies to reduce the harms caused by marketing, especially marketing of commodities where the public has an interest in the protection of vulnerable consumers—like alcohol, gambling, tobacco, unhealthy food, financial services, insurance, real estate, employment, and so on—need to aim at the infrastructure of platforms rather than the content of ads or the current tactics used by marketers on platforms.
The Organic Period
The ‘organic’ moment is characterised by brands creating profiles and posting content on social media platforms either as a stand-alone promotional activity or in addition to purchasing display advertising. In the case of platforms like MySpace and Facebook, advertisers were not convinced of the value of paid display advertising, where their ads were placed as interstitial pop-ups or alongside user profiles, walls or news feeds. In the case of Instagram there was no paid advertising. Advertisers sought instead to engage directly with the participatory culture of social media. They created their own accounts, pages and profiles to post content and engage with consumers. They partnered with influencers, celebrities and other cultural producers like musicians, photographers and fashion models to post content on their behalf. They encouraged consumers to post content that referenced or incorporated the brand on their own profiles.
In this period, brands invented a native advertising model before one formally existed. In the case of alcohol marketing, brands, retailers and venues acculturated themselves to the attention economy of social media platforms by trying to figure out what consumers’ everyday drinking cultures looked like, and then attempting to make themselves part of them. For example, in Australia, we saw brands making content that anticipated the Friday afternoon ‘knock off’. The Bundy Bear, the mascot of the Australian rum brand Bundaberg, would post images on a Friday afternoon about how he could hardly wait to have his first rum of the weekend (Carah et al. 2014). The posts would be timed to hit the feeds of ‘fans’ of Bundaberg Rum on Friday afternoon, when they too were getting ready to have a ‘knock off’ drink. They liked, shared and commented on the post as a way of communicating with their peers about their own drinking culture. As they did so, they circulated branded content within their own social networks online. In a sense, users were helping brands refine and target their messages by selectively engaging with them, adding their own commentary, and pushing them deeper into their peer networks. In time, this engagement with profiles, pages and content created by brands also served to generate data that indicated ‘affinity’ between brands and consumers, and enabled platforms to assign them ‘preferences’ that could be used to recommend content and target advertising.
A crucial feature of organic reach, for alcohol marketers, is that consumers can say things that the brand itself cannot say. Consumers can link brands directly to celebrations of excessive consumption, something brands cannot do under their own self-regulatory codes. For instance, when Jim Beam posted an image of a tumbler of bourbon with the caption ‘soup of the day’, fans responded with comments like ‘I’m going on the Jim Beam soup diet’ and other statements about how often, or how much, bourbon they could drink (Carah et al. 2014). Brands did not routinely moderate comments on these posts, and even disputed that they had a responsibility for what consumers said on their own pages (Brodmerkel and Carah 2013).
The organic moment is significant because it marked the beginning of brands engaging deliberately with the participatory culture of social media platforms during a formative period. Users extensively incorporated brands and other symbolic resources from commercial culture into their own vernacular practices. Advertisers approached the moderation of user-generated content, and the disclosure of commercial intent, in uneven ways. Although organic strategies have become a less important part of advertisers’ use of social media platforms as their advertising models have matured, it is in this period that norms about moderation and disclosure were established, often in ways that suited the interests of advertisers.
One sign that advertisers’ use of social media was maturing during this period was the growth of more clearly defined ‘social media manager’ roles, occupied by staff who not only undertook basic moderation of pages but also took responsibility for the front-line management of engagement and reach via social channels. This became especially important as social media platforms sought to create more formalised relationships with advertisers. The key issue for platforms was that advertisers’ organic practices were bypassing their paid advertising models, meaning that while advertisers might be investing resources in creating content and managing participation online, they weren’t paying the platforms for using their channels. The organic moment waned because platforms no longer wanted advertisers to capture attention without paying.
The ‘organic’ moment evolved into the ‘affinity’ moment as platforms transitioned toward algorithmically-curated feeds of content. Feeds like Facebook’s news feed and Instagram’s home feed were chronological when they were first launched. Users saw every post from friends, profiles or pages they followed in the order they were posted. The feeds were central to capturing and harnessing attention that could be sold to advertisers. The feeds needed to be ‘tuned’ to prioritise content that kept consumers engaged with the platforms. The basic commercial proposition is clear: ‘personalise’ feeds to maximise engagement with the platform and find the optimal level of paid content.
Multiple forces shape the way feeds are tuned in practice (van Dijck and Poell 2013; Carmi 2020). In some cases, platforms will prioritise forms of content that drive user engagement with particular platform features, or they will de-prioritise content for political reasons. For instance, Facebook has prioritised images and videos, and it has both prioritised and deprioritised news during different periods over the past decade (Herrman and Maheshwari 2016; Sloane 2019). Early versions of Facebook’s news feed were tuned to prioritise timing, type of content, and affinity. Timing was a measure of how recently an item had been posted or interacted with; content meant that some types of posts (like images) were given priority over others (like text); and affinity was a catch-all term for a range of data used to develop models of ‘attraction’ between users on the platform. The emphasis on tuning feeds for ‘affinity’ meant that brands could post all they liked, but their content would only be recommended and shown to consumers if the platform’s algorithms discerned a high degree of affinity with the brand. In addition to accumulating followers of their pages or profiles, brands thus had to create content that generated affinity with consumers.
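The ranking logic described above can be sketched in a few lines of code. The scoring rule (affinity multiplied by a content-type weight and a time decay) follows public descriptions of early ‘EdgeRank’-style news feed ranking; the weights, half-life and example posts are hypothetical.

```python
# Illustrative content-type weights: images prioritised over text,
# as noted above. The numbers are hypothetical.
TYPE_WEIGHT = {"image": 1.5, "video": 1.8, "text": 1.0}

def score(post, affinity, half_life_hours=24.0):
    """Affinity x content-type weight x exponential time decay."""
    decay = 0.5 ** (post["age_hours"] / half_life_hours)
    return affinity * TYPE_WEIGHT[post["type"]] * decay

def rank_feed(posts, affinities):
    """Order candidate posts for one user's feed, highest score first."""
    return sorted(posts,
                  key=lambda p: score(p, affinities[p["author"]]),
                  reverse=True)

posts = [
    {"author": "friend", "type": "text",  "age_hours": 1},
    {"author": "brand",  "type": "image", "age_hours": 1},
    {"author": "brand",  "type": "text",  "age_hours": 48},
]
# Even with lower affinity, the brand's fresh image post can outrank a
# close friend's plain text post; the brand's stale post sinks to the bottom.
ranking = rank_feed(posts, {"friend": 0.9, "brand": 0.7})
```

The sketch makes the commercial incentive concrete: a brand that cannot raise its affinity score is pushed toward timely, visually engaging content, which is exactly the behaviour the chapter describes.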
Advertisers looked for ways to tap into the high degree of affinity users had with their peers. This included strategies like creating real-world engagement with consumers by building themed ‘activations’ at cultural and sporting events and other sites of lifestyle consumption like malls and nightlife precincts. Brands would encourage consumers to take photos of themselves socialising in themed spaces or with branded paraphernalia (glasses, hats, props and sets, etc.). As they did so they would incorporate brands and alcohol consumption into the story they were telling about themselves in their peer networks. They were also registering data that enabled platforms to make more accurate judgments about affinity between users, cultural interests and brands. During this period, we also see the emerging importance of workers like nightlife photographers and promoters (Carah 2014; Carah and Dobson 2016). They would create images, video and posts and circulate them in their own peer networks to stimulate engagement with brands and businesses. They were acutely aware of how to ‘game’ the affinity preference of recommendation algorithms (Carah and Dobson 2016; Cotter 2019).
The Native Period
During the organic period advertisers learned to embed themselves into the participatory culture of social media and then exploit the algorithmic architecture of platforms. The strategic challenge for platforms was then how to integrate their paid display advertising model into the participatory brand culture that had evolved on the platform. They needed to create a market where advertisers were compelled to pay for engagement with consumers, not just targeted display advertising. The risk for platforms was that advertisers would invest resources into the creation of branded content or partnerships with influencers to generate ‘earned’ rather than ‘paid’ reach on their platforms. In response, the platforms developed a native advertising model that integrates the participatory brand culture with their data-driven targeted advertising infrastructure. They now offer advertisers a proliferating range of ad formats including ephemeral video stories, augmented reality filters and sponsored posts. These formats flow ‘natively’ through the feeds of platforms, in that they look like any other kind of content and are not always easily distinguishable as ads. Alongside this content, platforms provide a better integration of advertising and retail, with the introduction of shopping features like the ‘buy’ button on Instagram.
Platforms also now offer more formalised relationships with partners like influencers, enabling collaboration within platforms and the sharing and integration of data across platforms, advertisers and their partners. This shift to a native advertising model dramatically reduces accountability. Promotional communication becomes harder to distinguish from organic or editorial content and is more often visible only to consumers who have been specifically targeted. Formats like influencer partnerships are often disclosed only in oblique ways (Wojdynski et al. 2018). Furthermore, as the algorithmic tuning of audience categories and advertising content grows in importance, it also becomes more opaque. Formats like Instagram’s ephemeral video stories are a crucial example here. Advertisers post stories from their own accounts, partner with influencers to post stories, and create and publish ads as stories. Stories are only visible to the consumers who have been targeted. There is no public archive of the content created, nor of how it is targeted. Stories also incorporate an expanding array of interactive buttons and features. For instance, in a story created by the beer brand Guinness the consumer is taken to an interactive map where they can find the nearest pub serving pints. Or, in a story created by the wine delivery service Vinomofo, the consumer can buy wine by swiping up on the story and being taken to an online store. On platforms like Snapchat, TikTok and Instagram we have seen the development of ‘sponsored filters’, a form of augmented reality advertising. Users are not targeted with ads; rather, they are targeted with ad-making tasks (Hawker and Carah 2020). The filter is an invitation to take a ‘tool’ provided by the brand and make an image or video that is shared with peers. The content made is only seen by peers. There is no moment where the advertisements or other activities of marketers are published or archived and therefore available for scrutiny.
These participatory and targeted forms of advertising are unfolding within platforms where our participation generates vast databases of information about our everyday life. That information is not only used to target us in discrete ways, for instance by enabling advertisers to select people of a particular gender, age, in a designated location, or with declared preferences for particular genres, pastimes and products. More fundamentally, this data is used to train algorithmic models that progressively learn to classify and make predictions about consumers and their social lives. These predictions might not correspond with definitive criteria set by advertisers; rather, what advertisers ‘buy’ from platforms is access to an audience-building service. For instance, algorithmic processing of images users post on social media can identify brand logos, products and other patterns of consumption. This enables users to be assembled into audiences based on the content they post and interact with.
We ought also to prepare for a looming wave of algorithmic advertising that involves not only the targeting of ads, but the automated and dynamic generation of the ads themselves. Already, tools like Facebook’s ‘dynamic creative’ learn not just to target ads but to assemble different versions of them by repeatedly testing different combinations of images, text, buttons and calls to action with different consumers. In time, machine vision models will be able to fabricate images that appeal to different consumers in real time.
The ‘object’ of advertising on digital media is not an ‘ad’ that is ‘targeted’ using ‘data’. To think like this is to make the mistake of only looking to regulate the content of advertisements, their placement, and the information used to target them. More fundamentally, we should conceptualise the ‘object’ as the dynamic process of optimising relationships between advertisers and consumers. Advertisers do not so much buy ad space as rent access to a machinery that refines and tunes their audiences and their ability to act on those audiences. The longer an advertiser spends ‘in market’ tuning these categories and creative connections, the more optimized their engagement becomes (Carah and Brodmerkel 2021). This is evident in the shift away from measuring exposure (how many people saw the ad) towards measuring engagement (what people did). It indicates that the value proposition of a digital platform is not its ability to ‘place’ the ad in front of the right person at the right time so much as its capacity to gradually tune and tweak its ability to nudge, move and engage a consumer. What matters is stimulating the click, the purchase, the recommendation. This means we need to be thinking about regulating this process of ‘operating’ on the consumer.
Regulating Alcohol Advertising on Digital Platforms
As the early combination of targeted display advertising and participatory organic engagement with brands morphed into data-driven, pay-to-play, full-service marketing technologies, public accountability has become a major issue. This concerns not just our ability to see what advertisers are doing—what content they are posting, for instance—but, more fundamentally, our ability to understand how the system operates. Its power is no longer located only in the symbolic persuasiveness of advertisements, but in platforms’ data-driven models and their capacity to target consumers. Over time, advertising on platforms has become the ‘dynamic process of training predictive models that assemble audiences, configure ads, and optimize the relationships between them’ (Carah and Brodmerkel 2021).
In the organic moment, advertisers began to operate beyond the publication of what we would consider to be traditional ‘advertisements’. An emerging regulatory threshold issue was (and still remains) advertisers’ responsibility for the broader participatory culture they animate around their brands. Norms need to be set around their responsibility not only to moderate content in the various channels they now operate in, but also for the forms of expression they encourage from consumers. Furthermore, defining the actions of advertisers around the ‘advertisement’ begins to break down as ads are integrated into the open-ended participatory culture of social media. The disclosure and visibility of advertising becomes a critical issue. This involves both how advertising is distinguished from other forms of content and cultural expression and how open advertising is to public scrutiny. In the early ‘organic’ years content was relatively visible, in the sense that members of the public could follow brand pages, and to some degree even scrape posts and build archives of advertiser content (using either platform APIs or web-scraping). But, as platforms have become more organised around algorithmically curated feeds and ephemeral content, it is no longer possible to observe and archive advertisements, even in an ad hoc way. This is a threshold issue when we consider that nearly all forms of advertising regulation are based on the principle that advertising is open to public scrutiny.
In response to demands for more accountability, platforms have developed some limited ‘transparency tools’. On the user side, for example, Facebook provides dashboards that enable users to see what ‘ad preferences’ they have been assigned. These feature generic information about how Facebook understands them, and they offer some limited forms of control, such as the capacity to remove preferences (although users cannot stop Facebook from generating new ones). Users are also able to stop Facebook from targeting them with ads for specific commodities like alcohol and gambling. Facebook also created the so-called Ad Library in response to the fallout over ‘dark’ ads during the 2016 presidential election in the United States. The tool offers a rudimentary portrait of what advertisers are doing on the platform: it enables members of the public to search for any public page and see what ads that page is currently running. The term ‘library’ is disingenuous in the sense that the tool does not archive content; it only shows what is running live on the platform. It also provides very limited information about the ads. It will indicate if there are multiple versions, but it will not show all the versions or reveal whether they are being automatically generated. Nor will it provide any details about the reach of ads or the data being used to target them. While the library gives some indication of what campaigns are currently running on the platform, it does not enable systematic archiving and analysis, and it does not allow the public to understand and assess how audiences are assembled and targeted. This is of particular concern with addictive commodities like alcohol, where a highly tuned model would most likely disproportionately prey on vulnerable, high-volume consumers.
Historic approaches to advertising regulation have almost exclusively focused on specifying what advertisers can say and where they can say it. Platforms have tended to follow this logic by focussing on the provision of moderation and gating tools, and transparency tools like the Ad Library, which build in the assumption that public accountability means being able to see the content of ads. This strategy undermines public scrutiny and accountability for a number of reasons. Firstly, platform tools for reporting content reduce scrutiny to individual preference rather than community standards. Secondly, this approach mis-specifies (whether deliberately or not is open to debate) how advertising on platforms works.
Public accountability post-platformization would mean coming to terms with how platforms use data to tune the relationship between ads and consumers, and building a model of advertising regulation organised around these operations. A platform-centric view of advertising on digital media needs to attend to consumer participation in creating and circulating advertisements, the collection of data, and the training of models that enable marketers to optimize the relationships between ads and consumer actions. In other words, it is not just the symbolic message or persuasive attempt of an individual ad that has the potential to cause harm and requires regulatory scrutiny, but the data-driven optimization of consumer attention, engagement and behaviour.
Thus, a platform-centric view on digital marketing suggests that three key steps are imperative for establishing effective regulatory interventions: First, we need to agree that advertising ought to be open to public scrutiny because it affects the quality of public life and plays a determining role in shaping the infrastructure we now use to circulate information and create a shared sense of reality. Second, we need to reckon with what it means for advertising to be public in the platform era. It does not mean being able to see the ads, it means opening up the data-driven operations that characterise marketing on digital platforms. Third, we need to shift our focus from emphasising privacy and individual choice to a stronger focus on potential harms and the public interest. Ultimately, we need to see our concerns about moderation, speech, bias and discrimination as entwined with platforms that are fundamentally advertising companies. There is no way to deal with larger questions about how platforms affect our public culture that doesn’t involve a reform of the advertising model.
References
ABAC (2018). ABAC Adjudication Panel Determination No. 110/18. Available online: www.abac.org.au/wp-content/uploads/2018/10/110-18-Determination-Furphy-Beer-18-October-2018.pdf
ACCC (2019). Digital Platforms Inquiry – Final Report (June 2019). Australian Competition and Consumer Commission, Canberra. Available online: https://www.accc.gov.au/publications/digital-platforms-inquiry-final-report
Ananny, M., & Crawford, K. (2016). Seeing without knowing: Limitations of the transparency ideal and its application to algorithmic accountability. New Media & Society, 20(3), pp. 973–989.
Atkinson, A. M., Ross-Houle, K. M., Begley, E., & Sumnall, H. (2017). An exploration of alcohol advertising on social networking sites: An analysis of content, interactions and young people’s perspectives. Addiction Research & Theory, 25(2), pp. 91–102.
Beer, D. (2017). The social power of algorithms. Information, Communication & Society, 20(1), pp. 1–13.
Brodmerkel, S., & Carah, N. (2013). Alcohol brands on Facebook: The challenges of regulating brands on social media. Journal of Public Affairs, 13(3), pp. 272–281.
Carah, N. (2014). Watching nightlife: Affective labor, social media, and surveillance. Television & New Media, 15(3), pp. 250–265.
Carah, N., & Shaul, M. (2016). Brands and Instagram: Point, tap, swipe, glance. Mobile Media & Communication, 4(1), pp. 69–84.
Carah, N., & Dobson, A. (2016). Algorithmic hotness: Young women’s “promotion” and “reconnaissance” work via social media body images. Social Media + Society, 2(4), pp. 1–10.
Carah, N., & Angus, D. (2018). Algorithmic brand culture: Participatory labour, machine learning and branding on social media. Media, Culture & Society, 40(2), pp. 178–194.
Carah, N., & Brodmerkel, S. (2021). Alcohol marketing in the era of digital media platforms. Journal of Studies on Alcohol and Drugs, 82(1), pp. 18–27.
Carah, N., Brodmerkel, S., & Hernandez, L. (2014). Brands and sociality: Alcohol branding, drinking culture and Facebook. Convergence, 20(3), pp. 259–275.
Carmi, E. (2020). Rhythmedia: A study of Facebook Immune System. Theory, Culture & Society, 37(5), pp. 119–138.
Cheney-Lippold, J. (2011). A new algorithmic identity: Soft biopolitics and the modulation of control. Theory, Culture & Society, 28(6), pp. 164–181.
Cotter, K. (2019). Playing the visibility game: How digital influencers and algorithms negotiate influence on Instagram. New Media & Society, 21(4), pp. 895–913.
Draper, N., & Turow, J. (2019). The corporate cultivation of digital resignation. New Media & Society, 21(8), pp. 1824–1839.
Duguay, S., Burgess, J., & Suzor, N. (2018). Queer women’s experience of patchwork platform governance on Tinder, Instagram, and Vine. Convergence, 26(2), pp. 237–252.
Goodwin, I., Griffin, C., Lyons, A., McCreanor, T., & Moewaka Barnes, H. (2016). Precarious popularity: Facebook drinking photos, the attention economy, and the regime of the branded self. Social Media + Society, 2(1), 2056305116628889.
Hawker, K., & Carah, N. (2020). Snapchat’s augmented reality brand culture: Sponsored filters and lenses as digital piecework. Continuum, 35(1), pp. 12–29.
Helmond, A., Nieborg, D., & van der Vlist, F. (2019). Facebook’s evolution: Development of a platform-as-infrastructure. Internet Histories, 3(2), pp. 123–146.
Herrman, J., & Maheshwari, S. (2016). Facebook Apologizes for Overstating Video Metrics. Available online: https://www.nytimes.com/2016/09/24/business/media/facebook-apologizes-for-overstating-video-metrics.html
Lyons, A. C., Goodwin, I., Griffin, C., McCreanor, T., & Moewaka Barnes, H. (2016). Facebook and the fun of drinking photos: Reproducing gendered regimes of power. Social Media + Society, 2(4), 2056305116672888.
McStay, A. (2011). Profiling Phorm: An autopoietic approach to the audience-as-commodity. Surveillance & Society, 8(3), pp. 310–322.
Mojica-Perez, Y., Callinan, S., & Livingston, M. (2019). Alcohol home delivery services: An investigation of use and risk. Canberra: Centre for Alcohol Research and Education. Available online: https://fare.org.au/wp-content/uploads/Alcohol-home-delivery-services.pdf
Nadler, A., & McGuigan, L. (2018). An impulse to exploit: The behavioral turn in data-driven marketing. Critical Studies in Media Communication, 35(2), pp. 151–165.
Nieborg, D., & Poell, T. (2018). The platformization of cultural production: Theorizing the contingent cultural commodity. New Media & Society, 20(11), pp. 4275–4292.
Nieborg, D. B., & Helmond, A. (2019). The political economy of Facebook’s platformization in the mobile ecosystem: Facebook Messenger as a platform instance. Media, Culture & Society, 41(2), pp. 196–218.
Niland, P., McCreanor, T., Lyons, A. C., & Griffin, C. (2017). Alcohol marketing on social media: Young adults engage with alcohol marketing on Facebook. Addiction Research & Theory, 25(4), pp. 273–284.
Poell, T., Nieborg, D., & van Dijck, J. (2019). Platformisation. Internet Policy Review, 8(4). https://doi.org/10.14763/2019.4.1425
Sloane, G. (2019). Facebook agrees to pay advertisers $40 million over inflated video stats. Available online: https://adage.com/article/digital/facebook-agrees-pay-advertisers-40-million-over-inflated-video-stats/2205101
Striphas, T. (2015). Algorithmic culture. European Journal of Cultural Studies, 18(4–5), pp. 395–412.
Turow, J. (2006). Niche Envy: Marketing discrimination in the digital age. Cambridge: MIT Press.
Turow, J. (2012). The daily you: How the new advertising industry is defining your identity and your worth. New Haven: Yale University Press.
Turow, J., McGuigan, L., & Maris, E. (2015). Making data-mining a natural part of life: Physical retailing, customer surveillance and the 21st century social imaginary. European Journal of Cultural Studies, 18(4–5), pp. 464–478.
Van Dijck, J., & Poell, T. (2013). Understanding social media logic. Media and Communication, 1(1), pp. 2–14.
Van Dijck, J. (2020). Seeing the forest for the trees: Visualizing platformization and its governance. New Media & Society. https://doi.org/10.1177/1461444820940293
Wojdynski, B., Evans, N., & Hoy, M. (2018). Measuring sponsorship transparency in the age of native advertising. The Journal of Consumer Affairs, 52(1), pp. 115–137.
Yeung, K. (2017). ‘Hypernudge’: Big data as a mode of regulation by design. Information, Communication & Society, 20(1), pp. 118–136.
Youn, S., & Kim, S. (2019). Newsfeed native advertising on Facebook: Young millennials’ knowledge, pet peeves, reactance and ad avoidance. International Journal of Advertising, 38(5), pp. 651–683.
Ziewitz, M. (2016). Governing algorithms: Myth, mess, and methods. Science, Technology & Human Values, 41(1), pp. 3–16.
Rights and permissions
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
© 2022 The Author(s)
Cite this chapter
Carah, N., Brodmerkel, S. (2022). Regulating Platforms’ Algorithmic Brand Culture: The Instructive Case of Alcohol Marketers on Social Media. In: Flew, T., Martin, F.R. (eds) Digital Platform Regulation. Palgrave Global Media Policy and Business. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-030-95220-4_6
Publisher Name: Palgrave Macmillan, Cham
Print ISBN: 978-3-030-95219-8
Online ISBN: 978-3-030-95220-4
eBook Packages: Literature, Cultural and Media Studies (R0)