The rules pertaining to unfair contract terms in the European Union (EU), as outlined in Directive 93/13/EEC on unfair contract terms (henceforth, the UCTD),Footnote 1 have remained to date one of the most significant sets of rules governing consumer contracts worldwide. Yet, despite their importance, no attempts have been made to modernize the rules or adapt them to the digital age. Having been adopted three decades ago, the rules no longer adequately reflect the ongoing digitalization trends permeating the market, economy, and society at large.

Particularly over the last decade, there has been a noticeable surge in social media platforms with increasingly large user bases (Gardiner, 2022), which has sparked concerns over the monetization of data and whether existing consumer law, such as the UCTD, provides adequate protection for the users of such platforms (Durovic & Lech, 2021). These concerns are especially salient due to the significant global reach of most major technology companies (Gardiner, 2022), many of which operate from jurisdictions outside the European Union, such as the United States of America or the People’s Republic of China, and the increasing importance of online platforms in the aftermath of COVID-19 (Gardiner, 2022).

One particular concern arises from the fact that the UCTD, having been drafted for sales contracts and service contracts that were fully performed shortly after their conclusion (Recital 6), may be unable to extend its protection to contracts concluded online. Specifically, online contracts are often of a long-term nature and are subject to much more extensive changes than the UCTD may have originally taken into account. Whilst this concern may not be pertinent to all contract types—for instance, mortgage contracts, where the changes predominantly pertain to the primary performance of the consumer (e.g., interest rates and exchange rates) and can, at least in the abstract, easily be foreseen at the time of contract formation—it potentially presents issues for a significant portion of online contracts, such as, for the purposes of the current discussion, the standard term contracts of social media platforms.

In light of the foregoing, this paper aims to evaluate, through the lens of consumer law, the efficacy of the UCTD in safeguarding consumer data and addressing emerging digital vulnerabilities within the evolving technological marketplace. The paper will endeavor to examine the adaptability of the UCTD to the digital age by drawing on two recent consumer law cases involving Meta and TikTok. These cases have been chosen because of the emphasis that each gave to the notion of consumer vulnerability, which provides a valuable framework for analyzing the UCTD’s relevance in the digital realm.

Importantly, the UCTD’s regulatory framework converges significantly with various other provisions within consumer and competition law, notably in the domain of data privacy and the 2016 Regulation on the protection of natural persons with regard to the processing of personal data (GDPR).Footnote 2 However, this paper’s examination will exclusively focus on the unfair contract terms regime as contained in the UCTD.

The paper will begin with a succinct overview of the UCTD’s contemporary relevance to the regulation of the relationship between online platforms and their users. Particular attention will be given to the influence of the newly introduced Digital Services Act 2022 (DSA),Footnote 3 in order to contextualize the current relevance of the UCTD within the evolving regulatory landscape. Subsequently, and with reference to the aforementioned cases against Meta and TikTok, the paper will highlight several gaps in the UCTD’s protective framework which impede its potential to be an effective regulatory tool in the digital age. These gaps include deficiencies in the Directive’s substantive protections, penalties, and enforcement mechanisms.

Finally, the paper will explore the potential solutions to the deficiencies identified. Specifically, it will assess the viability of incorporating data-driven personalization of unfair terms controls into the existing regulatory framework of default rules, or potentially replacing the current framework entirely. Ultimately, the paper proposes a dual-pronged approach to reform. Firstly, a refinement of the procedural disclosure of terms is recommended to facilitate more informed user decision-making. Secondly, existing substantive default protections should be strengthened, rather than abolished, in a way that continues to strike a balance between encouraging consumer confidence and providing sufficient legal certainty to traders. Support for this approach will be derived from an examination of the regulatory strategies employed by authorities both within and outside the EU, pertaining to the control of technology companies’ handling of users’ data and the terms of use policies.

Scope of Application of the UCTD

The UCTD’s regulatory scope is unique by virtue of the fact that it concerns itself with shaping the content of consumers’ contractual obligations in their interactions with online platforms. Unlike the GDPR, which, in practical terms, is the most widely applied legislative framework currently governing the relationship between online platforms and their users, the UCTD extends its regulatory ambit beyond businesses’ collection of consumer data to cover most contractual terms arising privately between specific parties, i.e., a consumer—a person acting for non-business purposes—and a business (Article 2(b) and (c)). Furthermore, as exemplified by TikTok’s Diamond policy, examined later in this paper, the UCTD applies to traditional contracts wherein a user provides monetary remuneration to an online platform for defined goods or services. Thus, the UCTD’s targeted focus on business-to-consumer contracts addresses aspects of users’ relationship with online platforms that would otherwise be under-addressed were we to rely solely on the GDPR.

The UCTD possesses equal advantages over the DSA in terms of regulating consumer–trader interactions in the digital age. Whilst both pieces of legislation focus on preserving individuals’ rights as consumers, the UCTD places special emphasis on addressing business-to-consumer contracts. By contrast, the primary focus of the DSA is the general duty of businesses, specifically online platforms, to curb the spread of online harms like misinformation (Schmid, 2023; Stolton, 2020). Thus, the DSA and the UCTD have divergent regulatory focuses. Whereas the DSA focuses on harm prevention, the UCTD focuses on facilitating consumption whilst striking a balance between legal certainty and freedom of contract. These different regulatory focuses manifest in the divergent natures of the obligations imposed on businesses within each legislative framework, and the consequences for their infringement. The DSA requires online platforms to actively regulate user-to-user relationships by taking steps to limit the spread of misinformation posted by users. Additionally, it envisages active monitoring of compliance through a European Board for Digital Services (Buri & van Hoboken, 2021),Footnote 4 and any non-compliance will be met with penalties like fines and potential criminal sanctions like imprisonment for those in managerial positions (Department for Digital, Culture, Media, & Sport, 2022). In contrast, the UCTD, unlike later consumer directives such as Directive 2005/29/EC on unfair commercial practices,Footnote 5 contains no requirement for member states to enact regulatory penalties of any kind.Footnote 6 It equally eschews criminal penalties, reflecting the view that these would be disproportionate to the nature of contractual obligations. The UCTD applies to voluntary obligations between specific parties, rather than obligations which arise as a matter of course.
This deliberate scope facilitates a light-touch approach that remains better suited to addressing private contractual obligations between online platforms and individual users.

The UCTD’s role in the regulation of the digital consumer market warrants greater scrutiny, however, in light of the recent legal actions brought against major global platforms, in particular Meta and its subsidiary WhatsApp, and TikTok, pertaining to terms in their user agreements concerning the handling of users’ personal data. Given that these companies are leaders in the industry, Facebook being the world’s biggest social media company with almost three billion users in 2023 (McMorrow & Yang, 2021),Footnote 7 and TikTok being the most downloaded app globally in 2022 (Nakafuji, 2021), their privacy policies possess the potential to have a large impact on social media users. As will be seen from the actions brought against these companies, there seems to be a common problem in their user policies. Specifically, the terms of service often require users to surrender vast amounts of data not only to the platforms themselves but also to loosely defined categories of third-party affiliates, often without adequately stating the scope or effects of such data sharing. Conversely, the policies often limit or even exclude the company’s liability to users. Consequently, the cases brought against Meta and TikTok offer the opportunity to critically assess the adequacy of the current regulatory response to such issues, as well as the broader realm of unfair terms within digital platform contracts.

The Example of Meta

Facebook and other FAANG companiesFootnote 8 have faced significant external pressures regarding their processing of personal data on a global level, in particular regarding the potential for users’ data to be transferred to third-party apps without their consent (BBC News, 2019).

The EU’s response to these concerns has been to undertake a series of initiatives through the European Commission (‘Commission’). Such initiatives have been successful in some instances; for example, one initiative resulted in the meaning of ‘price’ in the UCTD being broadened to cover so-called ‘pay-by-data’ situations, as with Facebook Login, where users agree to permit the collection of their personal data in exchange for access to a service for which they do not pay monetary consideration (Dix, 2017, p. 3; Helberger et al., 2017, p. 1429).

Equally, following an exchange with the Commission, Facebook has made adjustments to its Terms of Service to clarify that its business model relies on targeted advertising and that, by engaging with Facebook’s services, users consent to sharing their data and to being exposed to advertising (European Commission, 2019c). Facebook has also amended terms in its T&Cs that were considered potentially unfair. These amendments include altering the limitation of liability to acknowledge potential liability for negligence, specifically regarding any mishandling of third-party data (UCTD, Annex, paras 1(a) and (b)), as well as limiting the unilateral ability to change contractual terms solely to instances where the changes meet a standard of reasonableness and account for users’ interests (UCTD, Annex, para 1(j)).

Notwithstanding these changes, as in the US, the Commission’s enforcement actions against Facebook have predominantly centered around the antitrust aspect of its business-to-business dealings within the online advertising market, rather than addressing the potential impacts of its contractual terms under consumer law (European Commission, 2021b).Footnote 9 Where users’ privacy is concerned, many EU-level enforcement actions since 2019 have focused on Facebook’s handling of data within the framework of the GDPR (BBC News, 2020).Footnote 10

At the national level, enforcement actions have drawn on both the general requirement of unfairness in Article 3(1) and the requirement that all terms in consumer contracts be drafted in “plain and intelligible” language, with any ambiguous terms being construed in favour of the consumer (Article 5). Recent legal challenges have targeted the privacy terms of Facebook’s wholly owned instant-messaging platform WhatsApp, which Facebook acquired in 2014. The most recent complaint, filed by the BEUC (the European Consumer Organisation) and eight national member agencies, alleges that the changes made to WhatsApp’s privacy policies in 2021 lack clarity and fail to adequately inform consumers about their impacts (BEUC, 2021d; BEUC, 2021c). Although WhatsApp denies that the changes modify its existing privacy policies (BEUC, 2021c), the policies now state that users who communicate with businesses on WhatsApp will have their data shared with both the business and Facebook, without mentioning how such sharing will occur, and that the data may be used for advertising (Data Protection Commissioner v Facebook Ireland, 2020).

Preceding the latest challenge, triggered by WhatsApp’s changes to its privacy policies, Facebook faced both non-UCTD actions over its handling of user data, and actions under the UCTD for failing to give users sufficiently clear information about how their data would be handled. Challenges under national laws implementing the UCTD have specifically targeted the ambiguous language used by Facebook in documents setting out its contractual relationship with users. The Paris Court of First Instance, in UFC Que Choisir v Facebook (2019), ruled that language used in Facebook’s user policies, such as “notably,” “for example,” “among others,” “such as,” “may,” or “can,” did not give users enough certainty on what Facebook could do with their data (UFC Que Choisir, 2019). The lack of clarity within user policies pertaining to Facebook’s method of data collection and usage has also formed the basis of a successful non-UCTD challenge by Italy’s Altroconsumo, which alleged a misleading commercial practice (Euroconsumers, 2020).

Additionally, two recent inquiries by the Irish Data Protection Commission (DPC) into Meta regarding its Facebook and Instagram services partly concerned the transparency of its terms of service (DPC, 2023). The 2018 complaints, stemming from separate complainants, raised similar concerns relating to the latest changes in Meta’s terms of service in light of the GDPR. Specifically, Meta sought to shift its legal basis for processing personal data under the GDPR from user consent (Article 6(1)(a)) to contractual necessity (Article 6(1)(b)). The DPC’s final decisions on the complaints were announced on 31 December 2022, and upheld the binding decision of the European Data Protection Board (EDPB). Importantly, it was held that Meta had breached its transparency obligations under the GDPR, falling foul of Articles 12, 13(1)(c), and 5(1)(a) of the Regulation. The information pertaining to the legal basis on which data processing was conducted was not outlined with sufficient clarity, meaning users were not equipped with sufficient understanding as to how, and on what grounds, their data would be used.

Furthermore, Facebook has incurred a €110 million fine under competition law for misleading regulators as to its ability to match user data between WhatsApp and Facebook (BEUC, 2021c; European Commission, 2017b). This penalty followed two rulings against Facebook by national data protection bodies on its processing of WhatsApp users’ data. In a press release in May 2021, the Hamburg Commissioner for Data Protection and Freedom of Information prohibited Facebook from processing WhatsApp users’ data for its own purposes, highlighting, among other reasons, that the provisions were unclear in their application, contradictory, and did not clearly state the effects on WhatsApp users. The Italian Data Protection Agency likewise stated, in January 2021, that the policy updates were insufficiently clear, as they did not enable users to comprehend the nature of the updates (BEUC, 2021c, p. 16). Although these national rulings were not made within the framework of the UCTD, they highlight a widespread concern that the alterations made to Facebook and WhatsApp’s privacy policies still do not meet the requirements of ‘plain and intelligible language’ prescribed under the UCTD. As a result, there remains a possibility that the new policies may also constitute unfair terms.

The Example of TikTok

TikTok, owned by the Chinese company Bytedance, has raised significant regulatory concerns in some of the largest digital consumer markets regarding its privacy policies. These concerns have been increasingly tied to issues of national security, particularly regarding the risk of the government of the People’s Republic of China (PRC) gaining access to user data. Such concerns stem from PRC government officials’ influence over the company and its PRC-based parent Bytedance, with a PRC diplomat having crafted TikTok’s content moderation policy (Morrow, 2020). In response to this perceived threat of mass data collection, several governments have opted to cut off user access to TikTok, rather than mandating a change in privacy policies.

The focus on national security in discussions surrounding TikTok arguably obscures the substantive concerns pertaining to TikTok’s handling of consumer data and the appropriate response to these concerns. One illustration of this can be observed in the PRC government’s regulatory approach to TikTok. Douyin, developed by Bytedance as an exclusive, separate, but identical app for the PRC market (McMorrow & Yang, 2021),Footnote 11 was named among 105 apps identified by the PRC’s Central Cyberspace Affairs Commission (CAC) as violating data privacy laws (Office of the Central Cyberspace Affairs Commission, 2021b; Soo, 2021). Echoing enforcement actions taken against Facebook by non-PRC regulators, Douyin was given 15 days to update its policies, with the CAC specifying that Douyin was ‘violating the principle of necessity, (and) collecting personal data for reasons unrelated to services provided’ (Office of the Central Cyberspace Affairs Commission, 2021b). This enforcement action coincided with the implementation of new regulations for social media platforms’ mobile applications, which mandated that fresh user consent be obtained for every change in apps’ data policies (Office of the Central Cyberspace Affairs Commission, 2021a). This requirement for clear terms is equally reflected in the PRC’s Personal Data Protection Law, which stipulates that the processing of data must be conducted transparently and openly, and requires that the objectives, methods, and limits of personal data collection be explicitly stated (Junzejun Law Offices, 2021). A parallel may therefore be drawn between the PRC’s requirement for transparency in users’ data agreements and the similar requirement of the UCTD, indicating a common concern for ensuring that users are adequately and reasonably informed about how their personal data will be used.

In contrast to the national security approaches taken outside the EU, the EU’s response to TikTok’s control over user data has focused on the specific terms contained in its policies. These actions primarily focus on two aspects of TikTok’s policies: the lack of clarity in the presentation of the terms, and the disproportionate control given to TikTok to decide the aspects of its relationship with users. Reflecting the broad nature of privacy concerns across online platforms, some of these enforcement actions against TikTok rely on interpretations of the general unfairness clause and the principle of intelligibility established in prior actions against WhatsApp and Facebook. The focus on intelligibility extends beyond the presentation of terms to encompass the necessity of ensuring that terms are worded in simple language that is comprehensible to particular groups of users with lower comprehension skills, such as minors.

In its action against TikTok, the Italian Data Protection Commissioner highlighted that its terms neglected to consider the circumstances of minors, suggesting that TikTok should have created a dedicated section for minors, set out in simple language, which specifically highlights the particular risks posed to them (Garante Per La Protezione Dei Dati Personali, 2021). Similarly, the limited range of European languages in which TikTok displays its policies has also been cited as a potential barrier for non-native English speakers, which could hinder their understanding of the terms’ implications (BEUC, 2021a). The importance of linguistic accessibility was emphasized in a 2016 ruling against WhatsApp by Berlin’s Superior Court, which held that “an extensive, complex set of rules with many clauses in a foreign language” would not constitute an intelligible set of terms (Verbraucherzentrale Bundesverband, 2018). Despite TikTok’s EU-wide availability, its privacy policy is offered in only 18 languages, potentially violating the requirement of intelligibility in member states where the policy is unavailable in the national language. Additionally, minors in those regions are less likely to comprehend terms written in a foreign language (BEUC, 2021a). This potential linguistic inaccessibility of TikTok’s terms highlights the fact that effective user control over privacy settings hinges upon the ability of users to read and reasonably understand the terms on which they contract.

In addition to their lack of intelligibility, TikTok’s terms of service have raised concerns regarding their potential unfairness, primarily due to the disproportionate degree of control that TikTok reserves over users’ data (BEUC, 2021b). Indeed, several of their terms have been identified as falling within the UCTD’s indicative list of potentially unfair terms and have been singled out by both the European Commission and national consumer authorities as particular areas of concern in social media companies’ policies generally (European Commission, 2017a).

Under TikTok’s terms, users give ‘an unconditional, irrevocable, non-exclusive, royalty-free, fully transferable (including sub-licensable), perpetual worldwide licence’ to TikTok, its licensees and affiliates, and other users, over the content made by them (BEUC, 2021a, p. 12; TikTok, 2023a). This very language had been held unfair in UFC Que Choisir v Facebook (2019) due to its creation of a significant imbalance to consumers’ detriment without any corresponding benefit conferred by Facebook (UFC Que Choisir, 2019). The wide-ranging discretion given to TikTok by its terms extends even to its paid Diamonds service, under which users may use monetary consideration to purchase gifts and send them to other users’ accounts, which may be converted into Diamonds that can in turn be exchanged for US dollars. TikTok’s terms stipulate that both the conversion rate of gifts to Diamonds and that of Diamonds to dollars are determinable solely by TikTok. TikTok also reserves the right to cancel the exchange scheme entirely, citing a series of ambiguously worded justifications such as “technical reasons” (UFC Que Choisir, 2019). These terms effectively give TikTok a unilateral discretion to modify the terms of the Diamond service, and even to terminate the Diamond service entirely based upon reasons that would be difficult for users to verify (UCTD, Annex, paras 1(b) and (q)), despite the European Commission’s explicit warning to social media companies regarding the vast imbalance in rights and obligations that such terms create (European Commission, 2017a).

Owing to the lack of intelligibility of its terms, and their creation of substantively imbalanced obligations, the European Consumer Organisation (BEUC) and national consumer agencies have launched a formal complaint against TikTok for potential violations of the UCTD (BEUC, 2021b).

The series of enforcement actions brought against Facebook, by authorities both within and outside the EU, regarding its pervasively unclear clauses pertaining to the effects of its data-sharing activities indicates that this lack of clarity may not be an oversight on its part. Additionally, the recurrence of these issues over the last decade in actions over both Facebook and WhatsApp’s privacy policies suggests that the current measures in place to deter non-compliance are inadequate.

Furthermore, the actions against Facebook and TikTok shed light on a common and recurring problem: privacy policy terms are often vague and do not adequately inform users about the extent and manner in which their data will be shared. Such terms effectively grant online platforms a wide discretion in handling users’ data, leaving consumers with limited control over their own information.

Shortcomings of the UCTD

Having examined several potential regulatory issues concerning the privacy policies of social media companies, this section will now delve into the UCTD’s potential limitations in addressing these issues, with a specific focus on three key areas: transparency and accessibility of terms, the core terms exemption, and penalties.

As evidenced by the number of actions involving Article 5 of the UCTD, most recently the BEUC’s action against TikTok, the persisting lack of transparency in the presentation of companies’ T&Cs remains a significant issue. Even apart from the practical difficulties of cost and time, the ability of consumers to enforce their rights through judicial means is currently hindered by the complex language in which terms are presented and the considerable length of most terms of service. This issue of presentation is recurrent and pervasive. At 11,698 words (Faye, 2020 as reported by Kleinman, 2020), the current length of TikTok’s T&Cs exceeds even the 11,195 words of Facebook’s T&Cs in 2015, which was, in turn, more than twice the length of the search engine Google’s T&Cs at 4,099 words and the e-commerce site Amazon UK’s T&Cs at 5,212 words (Elshout et al., 2016, p. 6).

Evidently, longer T&Cs do not necessarily enhance clarity and may instead deter consumers from reading T&Cs altogether. In the context of data privacy, established behavioural research suggests that the prevalent “notice and choice” model of data privacy (Busch, 2019, p. 310) could lead to “information overload” (Busch, 2019, p. 314): under this model, users are given large volumes of information on how their data might be used, not all of which may be relevant to them, and are asked to select between various types of privacy settings. This information overload can lead to decision fatigue, wherein overwhelmed users are prone to making less well-considered decisions that may, as a result, not align with their best interests (Busch, 2019; Jacoby et al., 1974, p. 1). This phenomenon arguably becomes even more pronounced in the context of exclusively mobile-based applications like TikTok, where users are less likely to read lengthy T&Cs presented on small mobile phone screens, which require extensive scrolling, than when they are presented on laptop screens (Alghamdi et al., 2014; Mehmood et al., 2021). As a result, the combination of lengthy T&Cs and the limitations of mobile device interfaces poses a significant obstacle to user comprehension of privacy policies and the terms pertaining to data handling.

While some firms may deliberately attempt to prevent consumers from reading or understanding terms, such a motive is not the sole, or even the most likely, reason for the length of terms and conditions. Several factors contribute to the extensive nature of T&Cs beyond deliberate confusion. One alternative motivation could be the desire to minimize uncertainty and avoid potential litigation by covering a wide range of scenarios in the T&Cs. Additionally, legal doctrines like the “four-corners rule,” typically found in common law systems such as that of the US, where Facebook and other FAANG companies are based, require that a contract contain all its terms in a single document. Adherence to such rules is likely to increase the length of a company’s T&Cs.Footnote 12 Furthermore, companies may update their T&Cs to comply with the rulings of past legal challenges, implementing more precise legal wording and providing more information. For example, following action by the European Commission, Facebook added words to its T&Cs to specify that users’ data would be used in advertising (European Commission, 2019c). Ironically, another, often prevalent, reason for overly lengthy T&Cs is the attempt by companies to satisfy their transparency obligations and comprehensively notify users of their rights (Elshout et al., 2016, p. 6). Traders might feel compelled to include more information in their T&Cs pertaining to the legal effects and consequences of particular provisions, thereby inadvertently exacerbating the problem of lengthy T&Cs. Nevertheless, regardless of the reasons behind lengthy T&Cs, it is clear that they reflect a failure to fulfil the UCTD’s transparency and intelligibility stipulations (UCTD, Article 5).

Unlike deficiencies in substantive protections, the problem of clarity in the presentation of terms may be addressed not by an increase in protections or notification requirements, but rather by a reformulation of existing requirements. As has been argued, merely increasing the number of substantive protections which must be brought to the attention of consumers would not increase the protection afforded to consumers but rather decrease their ability to respond rationally and take individual action (Korobkin, 2003, p. 1229). Long T&Cs exacerbate the behavioural weaknesses of consumers which underpin most careless contractual decision-making, in particular, the attitude of complacency and the tendency of consumers to underestimate the potential relevance of T&Cs to themselves (Hillman & Rachlinski, 2002, pp. 453–454). Thus, addressing the clarity issue will likely require a more nuanced approach than simply increasing the volume of information provided to consumers.

Indicative of this complacency is the startlingly low level of awareness among consumers regarding their rights. According to a survey conducted by the Commission, in 2020 only 27% of consumers across the EU reported knowing their rights very well, while 70% relied on public authorities to protect their rights. Moreover, despite the potentially onerous consequences of some T&Cs, 80% of consumers trusted retailers and service providers to respect their rights (European Commission, 2021a). These findings suggest that, irrespective of the length or content of specific businesses’ T&Cs, consumers are generally disinclined to educate themselves about their rights, let alone to pursue enforcement actions.

Equally, individuals typically employ heuristics (i.e., shortcuts) when evaluating the potential for a certain term or consequence to apply to their situation (Ramsay, 2012). As a result, they often underestimate the likelihood of a term being pertinent to them or their transaction (Hillman & Rachlinski, 2002). It is therefore difficult to assert that lengthy T&Cs are the sole cause of the prevalent practice of “signing-without-reading.” Nonetheless, it is clear that lengthy T&Cs at the very least do nothing to mitigate the problem.

In addition, the lack of availability of T&Cs in the national language of users—as exemplified by TikTok’s T&Cs—is likely to act as a further deterrent to users reading the terms. Such issues make it evident that reforms to the presentation of T&Cs are necessary.

An equally pressing concern, in addition to the length of T&Cs, is the use of limitation clauses. In respect of liability limitation provisions, Facebook and TikTok have often attempted to limit their monetary liability to users for negligence. Far from being theoretical, the potential financial ramifications of these online platforms’ negligence in handling user data can be significant, given their extensive user bases. Notably, in 2021, a data leak occurred in which over 530 million Facebook users globally had their personal data posted online (Murphy, 2021). Despite this risk, and likely with full knowledge of it, both Facebook and TikTok have attempted to place extremely low limits on their financial liability to users in the event of negligence.

Prior to an alteration made after an exchange with the European Commission, Facebook’s terms contained a term that limited aggregate liability to “the greater of one hundred dollars ($100) or the amount you have paid us in the past 12 months,” subject to “the fullest extent permitted by applicable law” (Loos & Luzack, 2016, p. 81). These limits closely resemble the wording of TikTok’s current liability limitation clause, which restricts aggregate liability to “the higher of (i) the amount paid by you to TikTok in the twelve-month period immediately preceding your claim against TikTok; or €100.00,” subject to applicable law (TikTok, 2023a, s.10). Since these services operate on a pay-by-data model for users, who may not have made any monetary payments at all to these services, the reference to monetary payments in a twelve-month period is both arbitrary and arguably disingenuous. This clause may lead consumers to believe that claim amounts are capped at €100.00, an extremely low amount in comparison to court costs and potential time spent on bringing a claim. This could have the effect of deterring users from initiating claims at all.

In practice, the inclusion of a low limit on negligence liability is unlikely to be legally enforceable and may amount to an unfair term. Given the large proportion of Facebook and TikTok’s revenues generated by advertising, which in turn heavily relies on the processing of user data, the low cap on negligence liability may represent a significant financial imbalance to the detriment of users under Article 3(1). It is worth noting that the low limit itself aligns with a term specified in the indicative list, as it purports to ‘inappropriately…limit the legal rights of the consumer…[for] inadequate performance by the supplier’ (UCTD, Annex A, para 1(c)). However, the indicative list in itself provides no clear indication as to whether particular terms are unfair (European Commission, 2019a) and, even though Facebook has now deleted the reference to a $100 cap or sums paid by the user, the fact that TikTok now uses a virtually identical clause in its limitation of liability illustrates the ineffectiveness of the indicative list as a deterrent to firms’ use of such terms.

The BEUC’s action against WhatsApp (BEUC, 2021d) highlights that the lack of clarity in T&Cs extends beyond the presentation of individual clauses. Users also face difficulties in tracking changes to different versions of clauses, as with WhatsApp’s failure to indicate clearly where the changes to its privacy policy may be found, or Meta’s failure to clearly indicate the change in grounding for its data processing under the GDPR. The UCTD’s provisions, formulated before the creation of the internet (Gardiner, 2022), appear more suited to one-off consumer transactions based on a single set of T&Cs than to ongoing business-to-consumer relationships in which T&Cs may change several times.

For example, the indicative list marks open-ended clauses which permit a trader to unilaterally modify contractual obligations as potentially unfair (Annex A, paras 1(j) and (k)). In long-term, continuous contracts these kinds of terms are likely to be prevalent, but due to the difficulty users face in tracking modifications to terms and conditions over time, such clauses will often escape sufficient scrutiny from users. This lack of transparency associated with long-term contracts, such as social media privacy policies, exacerbates the inequity of a unilateral modification clause but is not accounted for by the UCTD. The BEUC’s actions against Meta illustrate this shortcoming. As is well known, Article 5 UCTD requires that written terms be drafted in “plain, intelligible language,” which clearly extends to a positive set of written terms, but arguably overlooks lack of transparency arising from a failure to draw attention to a change in terms. This omission seems particularly pressing due to the persistence with which complaints are being brought against WhatsApp and Facebook on the grounds that changes to T&Cs are made without adequately being brought to users’ attention.

Moreover, several terms in Facebook and TikTok’s T&Cs permit sweeping unilateral modification of services provided to users. Whilst the indicative list captures terms which enable suppliers to unilaterally alter contractual terms without a contractually specified valid reason (UCTD, Annex A, para 1(j)), it exempts such terms if a supplier is required to give the consumer reasonable notice and also to state that the consumer is free to dissolve the contract (UCTD, Annex A, para 2(b)).

Similar to other contractual provisions, TikTok’s current unilateral modification clause closely resembles the clause previously present in Facebook’s T&Cs before the recent amendments were implemented. Facebook’s T&Cs now include an explicit statement that it may actively notify users of changes via email, whereas their earlier terms were vague and merely stated that it “may provide notice on the Site Governance Page” (as reported in Loos & Luzack, 2016, p. 70). The previous formulation arguably placed a considerable burden on consumers to regularly check and remain aware of any updates, prompting the need to shift to a formulation which places the burden onto the trader. However, TikTok has not made this same shift. TikTok’s current T&Cs state that unspecified “commercially reasonable efforts” will be used to notify users of changes and that users must still “look at the terms regularly to check for such changes” (TikTok, 2023a, s.10). Considering the gravity of some changes that might be made to T&Cs, such as TikTok removing the option for users to opt out of personalized advertising (Sanchez, 2017), this failure to actively notify users by explicitly stated means arguably amounts to an outright failure to notify at all.

In addition to this ambiguity regarding methods of notification, TikTok’s existing terms, much like Facebook’s previous terms, do not provide for a notice period before any changes take effect. While Facebook altered its T&Cs to stipulate a 30-day notice period for any changes, TikTok’s T&Cs state that the effective date of any changes will be listed on their webpage and, importantly, that continued access to TikTok after the effective date ‘constitutes acceptance of the new terms’ (TikTok, 2023a). In effect, this term allows TikTok to arbitrarily update its terms and bind users to any changes immediately. The fact that TikTok has been able to retain such a provision arguably suggests that the current ambiguity of “reasonable notice” in the UCTD’s indicative list is so vague as to be potentially meaningless. As a result, its effectiveness in regulating large social media contracts is arguably compromised.

Even where the indicative list includes specific requirements, like the requirement to notify users of their option to terminate their contract, such requirements are not always reflected in T&Cs. Former versions of Facebook’s T&Cs, for example, merely stated that “continued use…constitutes acceptance” of the new T&Cs (as reported in Loos & Luzack, 2016, p. 70), while TikTok’s current T&Cs state that users who disagree with any changes ‘must stop accessing or using the Services’ without specifying that this should be done by deleting their accounts, rather than merely remaining inactive, resulting in users inadvertently preserving their contractual relationship with TikTok (TikTok, 2023a). Facebook has since updated its T&Cs to explicitly mention the option to terminate one’s account (Facebook, 2023), but TikTok retains this omission at the time of writing (Loos & Luzack, 2016, p. 71). This arguably shows that, despite the stipulations of the indicative list, the failure to notify users of their options is a recurring problem.

The requirement to inform users of a right to terminate accounts, unlike the requirement of reasonable notice, is explicit and straightforward; a recurring failure to respect this requirement might therefore indicate that the indicative list’s lack of effectiveness is not solely attributable to its wording. It is plausible that the continued ambiguity concerning the indicative list’s legal effect across the EU, such as under Irish law, which governs TikTok’s T&Cs (Moncrieff, 2020), may create a favourable environment for companies to ignore the indicative list rather than make efforts to craft T&Cs in conformity with it.

One of the most glaring shortcomings of the UCTD pertains to the blanket exemption of core terms, including price, from undergoing the fairness review as provided for in Article 4(2). This exemption was advocated for by Brandner and Ulmer (1991, p. 649) on the grounds that price is determined by the collective and anonymized actions of market actors meaning that it is likely to be fairly set. However, in the new digital era, this argument may no longer be sufficient to justify the continued existence of the core exemption, particularly considering the emergence of targeted, personalized pricing practices. Additionally, although the exemption primarily concerns consumer transactions with e-commerce retailers, the classification of social media outlets as pay-by-data services (Wu et al., 2020)—notably TikTok’s Virtual Coins Policy governing its paid Diamonds service (Balita-Centeno, 2021)—would also raise questions of reviewability.

The current unreviewability of price under the UCTD in the majority of EU member statesFootnote 13 arguably offers far inferior levels of protection than those offered, for example, in the city of Shenzhen in the PRC. In Shenzhen, the practice of personalized pricing via algorithms is banned and is punishable by a fine of up to 50 million yuan (USD$ 7.73 million) (Jiayi, 2021).

In addition, the exemption of price from review appears at odds with some of the provisions on the indicative list. Given that TikTok’s Virtual Coins policy gives it a right to “manage, regulate, control, modify and/or eliminate such Coins” with “no liability to users” (TikTok, 2022), the continued unreviewability of core terms introduces uncertainty regarding the judicial assessment of such a clause as a potentially unfair unilateral modification clause that inappropriately limits consumers’ legal rights (UCTD, Annex A, paras 1(b) and (j)Footnote 14).

The Way Forward

This paper proposes, as a way to combat the UCTD’s current deficiencies, the use of personalized terms in lieu of the current system of default terms in social media platforms’ user agreements. This proposed approach is dual-pronged, consisting of (1) a refinement of procedural notification requirements and (2) a replacement of substantive prohibitions of particular unfair terms with personalized protections.

Under the procedural prong, particular attention should be given to refining the intelligibility requirement outlined in Article 5 of the UCTD in order to improve the way terms on data privacy and data collection are disclosed to users. The dominant ‘notice and choice’ model of disclosure (Busch, 2019) is clearly ineffective in facilitating informed user decisions about data sharing; thus, a shift in approach is arguably needed. It is submitted that this shift should involve adapting notice requirements to the characteristics of users through the use of personalized data collection.

Busch (2019, p. 318) proposes that businesses, particularly those which already collect user data for purposes like personalized advertising, should be obligated to use such data to customize their data privacy disclosure terms. Instead of showing all users the same T&Cs, businesses could implement a two-tier disclosure system, wherein information relevant to a specific class of users is displayed, accompanied by links leading to more comprehensive disclosure materials. For instance, a software company could inform users that particular products are compatible with the software of the device on which specific users are browsing the company’s sales website, instead of providing generic information about compatibility with various operating systems (Busch, 2019, p. 320). Similarly, drawing upon the research conducted by Porat and Strahilevitz (2014), which found that particular personality types may be more vulnerable to poor financial choices, Busch (2019, p. 323) proposes the use of data collected from users’ social media accounts to formulate ‘just-in-time’ warnings to be displayed right before users make particular purchasing choices. By tailoring disclosure to optimize the relevance of the information shown to particular users, those users are more likely to be able to meaningfully base their decisions on the information presented to them.

Busch’s (2019) proposals offer a promising solution, particularly since they directly address the problem of decision fatigue associated with the traditional “notice and choice” model. However, they nonetheless contain two problems which must be addressed.

Firstly, any detailed personalization of terms would entail the collection of large amounts of user data, which would raise the question of how to obtain users’ consent to the collection of their data in the first place (Poludniak-Gierz, 2019; GDPR Article 6(a)). Although Busch (2019) suggests that user choice could be preserved by making the reception of personalized disclosure opt-in rather than a default, the nature of this option would itself require businesses to comprehensively notify users of the type and methods of data collection (Poludniak-Gierz, 2019, p. 176), thus potentially risking the very information overload that personalization seeks to avoid. It is worth noting that this “chicken-and-egg problem” may not pose as formidable an issue for the specific companies under examination, like Meta and TikTok, since these companies already collect user data for personalized advertising and could therefore be legally required to use that data for personalizing notifications. However, the second problem persists, since monitoring compliance with personalized disclosures would be much harder for enforcement agencies than monitoring compliance with standard terms (Busch, 2019, p. 329).

Despite the difficulties with the wholesale personalization of disclosure requirements, Busch’s proposals still offer promising opportunities to guide reform. One potential application lies in utilizing these proposals to identify and fine-tune disclosure for specific classes of vulnerable users, such as minors. The BEUC’s actions against TikTok have highlighted the concern that minors may possess more limited abilities than adults to comprehend standard T&Cs (BEUC, 2021a; BEUC, 2021b). Given that Facebook and TikTok already collect data pertaining to users’ age in the sign-up process (TikTok, 2023b), this data could be used to fulfil the requirements of Article 5 UCTD by simplifying notifications for minors in a manner that aligns with plain and intelligible language.

In addition to the personalization of procedural notification requirements, a more radical proposal would be the replacement of default substantive protections with personalized, data-based terms. Porat and Strahilevitz (2014, p. 1441) propose that with the aggregation of consumer data enabling businesses to “use this data to tailor different default rules for their contracts,” consumers would receive more contractual certainty from terms tailored to their personal characteristics.

For example, increased consumer protections have often been associated with increased financial costs, as firms pass on compliance expenses to consumers, resulting in higher prices (Dunkelberg, 2017). Under the framework of default rights, for example, the right of withdrawal, consumers with higher impulsivity levels are more inclined to exercise this right after an impulsive purchase, and the cost associated with their heightened likelihood of utilizing this right is borne by other consumers who pay the same price for products but are less likely to exercise the right (Porat & Strahilevitz, 2014, p. 1453). Porat and Strahilevitz (2014, p. 1454) argue that were businesses allowed to aggregate data about consumers’ personalities and past behaviour, and so offer them a price that reflects their likelihood to exercise the right in question, we could avoid the injustice prevailing in the current system. The personalization of protections would eliminate the inefficiency of cross-subsidies by matching protections to personal characteristics, facilitating lower prices for consumers who would not have to bear the cost of rights they are unlikely to utilize.

This proposal of aligning protections and prices with consumers' personal characteristics is, in essence, an insurance model, and one wholly alien to the context of the UCTD: the UCTD’s non-binding preamble specifically excludes insurance protections and premium calculations from the unfairness assessment.Footnote 15 This exclusion suggests a potential mismatch between the assumptions that underpin the new behavioural economic view of consumer protection and those of personalized protection, in which consumers are assumed to be willing and able to bargain in the knowledge of their self-interest (Porat & Strahilevitz, 2014). If applied to unfair terms, this proposal would essentially return consumers to a situation of wholesale freedom of contract, exacerbating the fundamental problem of unequal bargaining power between users and social media platforms (Kessler, 1943; Ramsay, 2012) by stripping away existing protections.

As a result of the superior bargaining positions of multi-national businesses like Facebook and TikTok, individual consumers are often at a disadvantage when bargaining for their preferred terms. The European Commission’s latest consumer survey suggests that,Footnote 16 were default unfair-terms controls completely abolished, the capacity for users to bargain and advocate for their interests would be highly diminished owing to their inherent behavioural deficiencies. In particular, the informational asymmetry between users and social media platforms puts consumers at a disadvantage in pre-contractual negotiations, as consumers will possess less knowledge regarding which clauses might be potentially relevant to them. Indeed, even if users possessed sufficient information to make such a determination, they may fall prey to complacency, or fail to place sufficient importance on these clauses (Hillman & Rachlinski, 2002; Korobkin, 2003). Consumers might instead decide to simply trust the goodwill of businesses.

In the Commission’s survey, it was found that only 27% of consumers reported knowing their rights well and that over 80% simply trusted businesses to respect their rights (European Commission, 2021a, p. 3). By contrast, the persistent lack of clarity across Facebook and WhatsApp’s provisions on data handling suggests that some firms may not be inclined to openly negotiate with users or ensure respect for their rights, but rather to limit users' ability to scrutinize and object to the terms they are offered. As a result, it is clear that most consumers do not regard knowledge of rights as potentially relevant to future events in their contractual relationships with businesses, including disputes, and that, in the absence of statutory protections, consumers will be unlikely to find out about what protections are relevant to their needs and to negotiate terms in their best interests.

Even if consumers were able to negotiate favourable personalized terms that align with their preferences, they may not be able to enforce them in a cost-and-time-effective manner, particularly where litigation is the primary means of obtaining redress. The UCTD addresses this potential enforcement limitation by granting consumer organizations, such as the BEUC, the authority to initiate legal actions (Article 7(2)), without having to first demonstrate that they have suffered injury from the use of unfair terms. This exception to the normal rules of standing in civil litigation across many member states marks this enforcement mechanism as particularly significant. The importance of this mechanism of enforcement is further reflected in the number of EU and national-level actions brought by agencies, as well as the finding that 70% of consumers trusted national consumer agencies to stand up for their rights (European Commission, 2021a, p. 3). Were exemption from default unfair terms controls to become normative for social media T&Cs, purely because such terms are personalized through data aggregation, the enforcement of contractual obligations owed to consumers would be likely to suffer.

Rather than advocating for the replacement of present unfair terms controls with complete freedom of contract in personalized terms, this paper submits that the UCTD, which even in its present form creates minimum guarantees of access to justice (e.g., UCTD, Annex A, para 1(q)), should be reformed for greater effectiveness. The current protections should be extended to users of social media platforms on an equal footing with consumers of offline goods and services. Without overtly restricting the ability of social media platforms to innovate, the UCTD’s guidelines should enable users to know what they are paying for, how much data or money they have to pay, what their key rights are, and how to exercise them.

In order to improve transparency, it is proposed to impose a maximum limit of 5,000 words on T&Cs. As demonstrated before, the ability of consumers to read and understand T&Cs is undermined by lengthy T&Cs, which has been identified as a recurring problem. Notably, TikTok’s current terms, as of 2021, run to virtually the same length as Facebook’s T&Cs from 2016. A cap of 5,000 words is a feasible option, as other technology companies like Twitter (4,445 words) and Google (4,099 words) have already managed to craft T&Cs that are at least 10% below this limit (Elshout et al., 2016, p. 6). This proposal for a reduced word limit would comply with the Commission’s 2016 recommendation that T&C lengths be reduced to improve readability, based on a finding that more than twice the proportion of users read in full a reduced and simplified set of T&Cs compared to the standard long-form T&Cs (Elshout et al., 2016, p. 6).Footnote 17 Whilst a reduced word limit cannot guarantee that users will read all the T&Cs, it can help mitigate the problem of consumers making impulsive and uninformed contractual decisions.

To address firms’ concerns regarding the comprehensible notification of users’ rights, and the avoidance of legal uncertainty in litigation, a standard template or Annex of rights for consumers could be created for inclusion or incorporation into T&Cs. This template could be designed in a concise, “core-rights” format of no more than 3 pages and 500 words that includes graphics and is similar in design to the Commission’s Key Consumer Data factsheet, and could be required to be displayed prominently by all companies (European Commission, 2021a, p. 3). Such a document would provide consumers with legal certainty concerning notice requirements, whilst also improving accessibility and reducing any disincentives to read the T&Cs. At the same time, it would provide businesses with the legal certainty that they require, and reduce the costs involved in drafting and changing T&Cs to comply with notification requirements.

While improvements in the presentation of terms may be necessary to improve transparency, their effectiveness could be further enhanced by combining them with Busch’s (2019) proposal for simplified terms targeted at vulnerable groups. In particular, companies should be required to display their T&Cs in the national language of every EU member state and display a simplified list of T&Cs for users with lower comprehension skills, such as minors. The BEUC’s (2021a, p. 11) action against TikTok demonstrates that in the vast majority of EU countries where English is not a first language, minors are less likely than adult users to have achieved sufficient fluency in English to understand legal terminology, even when it is presented in simplified English. Providing T&Cs in the national language of every member state would make T&Cs more accessible not only for minors but also for users with lower levels of literacy.

In addition, a two-part disclosure could be implemented for minors, wherein they are shown a simplified version of T&Cs, like TikTok’s current Privacy Policy for Younger Users (TikTok, 2023c), with a link provided to an extended version of T&Cs. At present, TikTok’s Privacy Policy for Younger Users is listed in a sidebar as one of the options which users may view, and to which users must actively navigate. To alleviate decision fatigue, users should be shown, by default, the version of T&Cs appropriate to their national language and age, as detected from details like their mobile phone’s country code obtained during the ‘sign-up’ process, instead of having to actively choose a particular language or age version of T&Cs as is currently the case.

Furthermore, it is submitted that the indicative list should be converted to a blacklist of terms deemed unfair in all circumstances. The present indicative list fails to provide certainty to users on the expected legal effect of potentially unfair terms. Even if software were developed to assist users in scanning T&Cs for unfair terms, as has already been accomplished (Micklitz et al., 2017Footnote 18), users would be unlikely to seek a costly and time-consuming judicial determination of the status of terms on the indicative list. Equally, businesses are incentivized to ignore the indicative list, as the worst potential consequence would be the unenforceability of the accused term (UCTD, Article 6(1)). In contrast, a business which employs a blacklisted term would commit an unfair commercial practice and be exposed to criminal liability (Jana Pereničová and Vladislav Perenič v SOS financ spol. s r. o, 2012). By replacing the current reactive work of consumer agencies with a blacklist, businesses may be incentivized to work preventatively, in cooperation with national consumer agencies, to avoid unfair terms, thereby reducing the likelihood of unfair terms being employed in the first place.

A suitable starting point for a comprehensive blacklist would involve incorporating the seven terms that have been identified as unfair by the consensus of national consumer protection authorities. In particular, the list should include terms which absolve traders of liability or allow businesses to unilaterally modify users’ contracts (Directorate-General for Justice and Consumers, 2016). To ensure the effectiveness of the UCTD, paragraph 1(q) of the blacklist should be amended to expressly blacklist clauses that designate the law of a non-EU jurisdiction as the governing law of T&Cs. This would grant users the certainty that their transactions will enjoy the protections of the UCTD.

However, special care must be taken to ensure that the blacklist remains compatible with other legal obligations of online networks. Terms which give online platforms unilateral discretion to remove content or terminate user accounts without notice—unlike unilateral modification of terms—should not be blacklisted. These provisions enable online platforms to comply with the DSA’s obligations by promptly acting to remove illegal content posted by users. Moreover, the unilateral modification of standard terms and conditions is further regulated by Article 19 of Directive (EU) 2019/770 on the supply of digital content, which establishes requirements for the modification of digital content.

The contents of the blacklist in its entirety should be deduced from the current indicative list, after thorough negotiation and agreement among member states. However, it is suggested that two clauses from the Annex be retained. The controls regarding unilateral changes to services, particularly the requirement of reasonable notice, should be reinforced. In addition to requiring that T&Cs explicitly inform users of their right to terminate their accounts, similar to Facebook’s current terms, a mandatory 14-day notice period should be imposed for any changes to take effect. As with the aforementioned word limit, this proposed 14-day notice period has been proven feasible by current practice and would be shorter than Facebook’s current notice period of 30 days (Facebook, 2023). Given that less than a third of consumers are aware of their rights, a 14-day notice period may be easier for them to recall due to its similarity to other periods, like the right of withdrawal in distance contracts provided for in Article 9 of Directive 2011/83/EU on Consumer Rights (Consumer Rights Directive).Footnote 19 It is crucial that consumers gain greater awareness of their right to reject changes by terminating their accounts, as the effectiveness of such a right, like the right of withdrawal, is time-sensitive. As such, raising consumer awareness of their rights would empower them to make more effective choices on whether to respond to changes in T&Cs by deleting their accounts within the designated time frame.

To ensure effective transparency and notification, businesses should be required to highlight the specific clauses in their T&Cs that have been changed. This requirement, distinct from the requirement of intelligibility in written terms, finds parallels in the implementing legislation of certain member states. For instance, the UK’s Consumer Rights Act 2015 (CRA) stipulates that core terms concerning price or subject matter are reviewable where they fail to meet standards of prominence and are not “brought to the consumer’s attention in such a way that an average consumer would be aware of the term” (Section 64(4)). Facebook has faced recurring criticism that it fails to clarify, or even attempts to obscure, the nature and effect of changes to its terms, for instance by suppressing, without announcement, clauses which detail how particular user choices concerning privacy settings may influence data transfers. Adding a general requirement to Article 5 for terms, including changes of terms, to be prominent would establish a duty to present changes of terms clearly and would provide a clear legal basis for challenges concerning the presentation or alteration of terms. Achieving prominence could be accomplished by using special visual cues indicating the clauses that have been added or altered, for instance by putting them in bold writing, and including the previous version of the altered clauses viewable by mouse-over. When coupled with a 14-day notice period, this action may serve to improve the ability of users to scrutinize the wording and implications of any changes to data privacy clauses.

Corresponding to controls on social media platforms’ ability to vary the obligations imposed on users through changes to services, it is essential to prohibit platforms from including any reference to unreasonable limitations of liability for negligence in their T&Cs. This includes any blanket exclusions of liability for the actions of third parties. It is worth noting that unreasonable limits of liability were unenforceable under the pre-UCTD unfair terms legislation of certain member states, for instance the UK’s Unfair Contract Terms Act 1977, section 2(2), which reflected the principle that businesses which profited from commercial activities should rightly bear the cost of remedying any losses arising from negligent performance of those activities. The leakage of over 500 million Facebook users’ data (Holmes, 2021) highlights the potential consequences that collection and processing of data for personalized advertising, a significant revenue source for platforms like Facebook and TikTok, can have for users. TikTok’s current clause, which specifically excludes liability for negligence in respect of eight types of loss (TikTok, 2023a), appears to impose unreasonable limits on its liability. This is despite the European Commission having flagged Facebook’s limitation clauses over similar concerns (European Commission, 2019b). The amendments made by Facebook to its T&Cs following the enforcement action, which explicitly acknowledge the possibility of liability for negligence and the acts of third parties, should be incorporated as mandatory standards for limitation clauses in the UCTD.

In addition to the suggested modifications, the UCTD could mitigate abuses of bargaining power in this area by clearly establishing the reviewability of terms which stipulate the manner in which price and subject matter are determined, in particular the reviewability of personalized and black-box pricing. This could be done without directly removing the exemption for core terms. The current exemption, when combined with the indicative list’s suggestion that terms permitting the unilateral modification of price may be unfair, creates uncertainty regarding the reviewability of black-box clauses where no definite initial price is presented to users. To enhance certainty on such matters, a clarification could be inserted into Article 5, beyond its requirement of ‘plain intelligible language’ for the exemption to apply, explicitly stating that price terms which do not set out a clear price or rate of payment for particular services are not plain and intelligible. This would complement the Annex, which indicates that the indicative list’s term on unilateral price variation does not cover terms which explicitly describe the method of price variation (UCTD, Annex, para 2(d)). Equally, the clarification would establish, in no uncertain terms, the reviewability of T&Cs governing paid content in social media platforms’ services. It would open terms such as those permitting TikTok to unilaterally determine the exchange rate of dollars to diamonds to review under the general fairness clause. If agreed upon by the member states, this clarification could pave the way for black-box pricing clauses to be added to the current indicative list or even to a blacklist of unfair terms, thus offering further protection to consumers.

Overall, this paper submits that rather than replacing the current system of default unfair terms controls, we should reinforce them in the ways outlined above. The proposed reforms would not introduce legal uncertainty for social media platforms or create unnecessary compliance burdens. On the contrary, strengthening the preventative aspect of these measures is necessary for users to make informed choices about the extent of their obligations to social media platforms. All measures proposed herein are in some form grounded in existing commercial practice among non-social media online platforms, and are even reflected in Facebook’s revised T&Cs. Implementing these changes would therefore arguably be neither impractical nor unexpected, for TikTok or for any other platform which may come to dominate the social media market, and would serve to strike a balance between commercial certainty and the conferral of adequate protection on users.

Given the current limitations of the UCTD in effectively addressing the recurrent use of potentially unfair terms, these proposed measures arguably form the minimum foundation required to effectively tackle this issue.

Conclusion

Three decades after its adoption, the UCTD remains a significant component of the European Union’s distinctly nuanced regulatory approach toward social media companies. So far, the EU has avoided imposing heavy-handed, binary measures like wholesale bans, instead opting for engagement with companies like Meta and TikTok on the basis of existing consumer rules like the UCTD. In this regard, the UCTD’s provisions pertaining to the role of consumer agencies in consumer protection have been invaluable and will likely continue to form the basis of the UCTD’s distinctive contribution to unfair contract terms regulation in the context of the handling of user data.

However, this form of regulatory control has its limitations, and it is questionable how effective it is in addressing consumer vulnerability, especially when faced with the inequality of bargaining power between major online platforms such as TikTok or Meta and their users. The current framework is primarily reactive, meaning that enforcement of the UCTD’s provisions depends on constant firefighting by consumer agencies. As a result, to effectively protect consumer interests and prevent the use of unfair terms, clear and uniform standards must be introduced to support the work of consumer agencies, as is done in other consumer legislative frameworks. The UCTD arguably does not provide this support.

The UCTD’s current ineffectiveness in preventing the use of unfair terms in the digital sphere is arguably attributable to both its procedural and substantive provisions. These shortcomings are clearly illustrated by the recent actions of the BEUC against Meta and TikTok and emphasize the pressing issue of consumers’ digital vulnerability.

The BEUC’s actions against Meta and WhatsApp concerning their data-handling clauses indicate that the UCTD’s focus on static sets of terms is inadequate. The Directive needs to address not only static T&Cs but also terms which change over the course of an ongoing user relationship, as is common in social media platforms’ T&Cs. Equally, the recurring complaints that platforms like Meta and TikTok do not provide sufficient detail concerning the scope and effect of their data handling suggest that the UCTD’s protection of consumers, even with regard to positive written terms, does not go far enough.

The solution proposed in this article to combat the ineffectiveness of the current UCTD did not involve replacing the default terms in social media user agreements with personalized terms based on data collection. Rather, the proposed reform of the UCTD is dual-pronged, involving the refinement of procedural notification requirements and the addition of substantive prohibitions of certain unfair terms. This approach arguably strikes the appropriate balance between the protection of consumers and the assurance of certainty for social media companies.

Various improvements to the procedural requirements of the UCTD were explored in this article. One key suggestion was to modify Article 5 to explicitly require terms to be made prominent, and to extend its application to cases where terms are not adequately clarified or are unilaterally altered. In this regard, the UCTD’s application to social media platforms cannot be viewed in isolation but must be recognized as part of the larger corpus of European consumer legislation that has developed over the last decade. In particular, recognition should be given to the different protective needs of certain user groups, such as minors, the elderly, the illiterate, or those with different mother tongues.

The paper further proposed a number of substantive modifications to the UCTD to increase its effectiveness in regulating social media platform user agreements, which are crucial for addressing the issue of consumer vulnerability in the digital realm. One such proposal was the introduction of a blacklist of terms that would be deemed automatically unfair. Implementing effective, uniform standards of this kind is eminently achievable: because the proposed measures incorporate various existing commercial practices, they would involve fine-tuning the Directive rather than a root-and-branch overhaul.

Regrettably, the UCTD was a key piece of consumer legislation that Directive (EU) 2019/2161 on better enforcement and modernisation of consumer protection rules (the Omnibus Directive) did not substantively update, despite earlier concerns that the effectiveness of consumer protection was hindered by a lack of awareness about, and insufficient enforcement of, consumer protection rules. Updating the UCTD to establish clear and definitive standards would represent an important initial step toward raising awareness of unfair terms regulations and enhancing the preventive enforcement of these rules.