In March 2020, while her parents were at their essential services jobs during the lockdown, a 12-year-old Canadian girl was alone at home with a smartphone. Like many other young users, she spent her time on TikTok, scrolling through the endless feed of short videos. One day, she came across a video of someone using special effects and stickers in their videos, which they had purchased with “TikTok coins,” a service in principle available only to users aged 18 and over. Captivated by the different features offered by the platform, she decided to buy some coins of her own, and before she knew it, she had spent over $12,000 on in-app purchases for TikTok coins, which she also used to buy followers and likes from well-known users, increasing her popularity on the platform.Footnote 1 A similar story took place in the UK, where an 11-year-old girl spent about £3,500 in less than a month on TikTok coins to buy private video calls and singing duets with an influencer.Footnote 2 More recently, an Irish mother warned other parents after discovering the number of views and downloads of videos featuring children and the ease with which a minor’s account on TikTok can be accessed through a simple follow by an anonymous user.Footnote 3 These examples point to the pressing issue of how to protect vulnerable platform users such as children. However, the conditions for valid consent are not harmonized across the EU, as the issue is left to the Member States.

Recent examinations of the platform’s practices have identified several infringements regarding the protection of minors. In a report prepared by the European Consumer Organisation (BEUC), it was found that TikTok’s actions violated the rights of European consumers by engaging in deceptive and unfair business practices (BEUC, 2021d). BEUC identified several violations of the Unfair Contract Terms Directive (UCTD) due to the lack of clear information. TikTok’s “Terms of Service” and “Virtual Item Policy” contain ambiguous information regarding the availability and sharing of content on the platform. Moreover, it was found that these contractual frameworks are not suitable for its user base, which largely consists of children and teenagers (ibid.). In light of this, BEUC issued an external alert under the Consumer Protection Cooperation Regulation (CPC Regulation) with a view to coordinating EU-wide action against these practices. The article considers BEUC’s external alert on TikTok to assess the effectiveness of this CPC mechanism in the enforcement of consumer law. Given the implications for children’s privacy, their capacity to give consent, and their exposure to inappropriate content on social media, the article also considers the enforcement of the specific safeguards for minors provided in the General Data Protection Regulation (GDPR) and the Audiovisual Media Services Directive (AVMSD), as some of the platform’s practices also violate the rules contained therein.

This article suggests that, paradoxically, the number of applicable rules and enforcement entities has led to less effective enforcement. The abundance of sector-specific rules and corresponding enforcement authorities creates complexity, especially when trying to coordinate the activities of multiple national enforcement authorities across borders with significant procedural differences among them. Accordingly, the article describes how minors, as vulnerable consumers, are (un)protected under the current EU regulatory framework and argues that “too many cooks spoil the broth”—i.e., that the plethora of rules, organizations, and procedures involved in enforcement leads to the ineffectiveness of EU law, leaving aside the fact that the different interests involved and the inherent risks of regulatory capture are also part of the problem.

The article is structured around the identified factors that, it is argued, contribute to the underenforcement of EU law in the digital single market: first, the different applicable rules and the lack of harmonization concerning who is considered a minor under EU law; second, the suboptimal use of the mechanisms under the EU CPC Regulation; third, overlapping scopes of application and overlapping enforcement competences, which make it difficult to determine the authorities competent to act when issues emerge; and fourth, insufficient action by consumers following online harms and wrongdoings. The result is a digital single market where minors, as vulnerable consumers, remain exposed and unprotected, accentuating their digital vulnerability. The article offers a set of observations to take into consideration ahead of the application of the Digital Services Act (DSA).

TikTok and the EU Vulnerable Consumer (Law)

TikTok is a fast-growing social media app (known in China as Douyin) released in 2016 and developed by the Chinese technology company ByteDance Ltd. Despite regulatory and other challenges, the app rapidly grew in popularity. Its international release took place in September 2017, and by July 2018, it had already expanded its user base to 680 million monthly active users, reaching 2 billion downloads in 2020 (Fannin, 2019). By 2022, TikTok had 1.2 billion monthly active users,Footnote 4 reached more than 150 markets,Footnote 5 and was available in 75 languages.

Although its services are not available to children under the age of 13 (TikTok’s Community Guidelines), the platform has nonetheless attracted a significant young user base. Over 60% of users are under 30 (Ofcom, 2021), and 43% of TikTok’s global audience falls within the 18–24 age demographic.Footnote 6

The app relies on a high level of engagement based on the appealing qualities of its video feed (“For You”), powered by a sophisticated recommendation algorithm (Goldsmiths, 2021; Siles González & Meléndez Moran, 2021). The algorithm uninterruptedly shows content to users based on weighted factors such as users’ likes, shares, interactions, and previous searches.Footnote 7 The entire system, consisting of economic and reputational incentives, is run by this algorithm and concealed emotional nudges designed to increase viewing time (Zhao, 2021). Recent studies show that TikTok users spend approximately 50 min on the platform each day.Footnote 8 For children, the average daily usage time on TikTok is about 75 min,Footnote 9 and 16% of North American teenagers (ages 13 to 17) acknowledge using TikTok “almost constantly” (Vogels et al., 2022).

The popular platform, initially known primarily as a space for memes, lip-syncing, dancing, and entertainment, has evolved into a versatile tool for expressing a wide range of ideas, including politics, lifestyle, and personal advice.Footnote 10 However, with users sharing more sensitive and private information on the platform than they otherwise would, concerns are growing about the potential impact of targeted content consumption on mental health.Footnote 11 The video-centric, virality-driven nature of TikTok’s services, together with a powerful system of incentives, pushes creators to produce unconventional or extreme forms of content to increase the chances of going viral (Chu et al., 2022). As such, it is not surprising that TikTok is considered to contain all the necessary ingredients to be an important vector for election-related disinformation (Alonso-López et al., 2021), as well as for harmful and even fatal trends resulting from the so-called TikTok challenges (Elkhazeen et al., 2023).Footnote 12 This has called for more intense regulatory scrutiny.

The increasing popularity of TikTok has also sparked geopolitical concerns. TikTok has drawn users away from Silicon Valley-born platforms, disturbing US hegemony (Gray, 2021). Moreover, the company, which comes from a country with a history of disregarding fundamental rights and entered the market at a time when major tech companies were facing accusations of wrongdoing, soon encountered significant opposition. Due to different issues related to content and Internet censorship, TikTok has, for example, been intermittently prohibited in Indonesia over concerns of serving to spread pornography and is currently banned in Bangladesh and Pakistan for immoral content and for being harmful to children (Weimann & Masri, 2020). In 2020, TikTok was banned in India due to national sovereignty concerns.Footnote 13 The platform was accused of curating politically sensitive content and sympathising with the Chinese government.Footnote 14 The video-sharing platform has been under intense scrutiny in different jurisdictions around the world due to concerns over national security, privacy violations, and breaches of consumer protection laws. TikTok agreed to pay $5.7 million in the USA to settle the Federal Trade Commission’s allegations that it violated children’s privacy by illegally collecting personal information without the parental consent required by the US Children’s Online Privacy Protection Act (COPPA).Footnote 15 With regard to privacy, one of the major concerns is personal data transfers since, as a Chinese company subject to the Chinese Cybersecurity Law, TikTok can be required to submit data to the national government upon request (US Congressional Research Service, 2020).

In the EU, TikTok’s practices have raised concerns regarding the protection of personal data and compliance with consumer protection legislation. The platform’s business model allows for significant collection of personal data from minors based on unclear terms and misleading practices that contravene the protections afforded by the GDPR and pose challenges to the protection of vulnerable consumers under European consumer law.

Digital Vulnerability in EU Law

The literature and regulatory discussions have already focused on vulnerability in the data-driven economy (Helberger et al., 2017; Livingstone & O’Neill, 2014; Macenaite & Kosta, 2017). In a context of increased digitalization, EU rules are gradually putting more emphasis on protecting children in the digital environment, given their vulnerability when making choices about what they do and share online. For example, the DSA underlines the protection of minors as vulnerable recipients of online platform services (see Recitals 46, 50, 62, 71, 81, 83, 89, 95, 102, and 104) and includes provisions to monitor and act against content that is harmful to minors (trusted flaggers), a prohibition of targeted advertising, obligations to provide information on advertising systems, and, in the case of very large online platforms, participation in self- or co-regulatory agreements to mitigate systemic risks such as those arising from disinformation, manipulative and abusive activities, or any adverse effects on minors.

For those platforms primarily directed at minors, Article 14 DSA requires service providers to draft terms and conditions in a manner that minors can understand. Article 28 DSA establishes that, where services are accessible to minors, online platform providers shall put in place “appropriate and proportionate measures to ensure a high level of privacy, safety, and security of minors, on their services,” and Article 35 specifically requires very large online platforms to adopt targeted measures to protect the rights of children, including age verification and parental control tools, as well as tools designed to help minors report abuse or obtain support. Still, the DSA does not define the age limit below which a person is considered a minor.

Pending the full application of the newly adopted rules, the GDPR and EU consumer law already provide specific rights and mechanisms to protect vulnerable minors and children, but both leave open who the minors are and where childhood ends. In its Article 8, the GDPR establishes specific conditions applicable to a child’s consent in relation to information society services. Where the processing of personal data of children is based on consent, such processing shall be lawful where the child is at least 16 years old (Article 8(1) GDPR). In this regard, issues related to minors’ consent and the lawful processing of data of children under 16 have sparked criticism about the implementation and interpretation challenges of this age limit (Donovan, 2020; Macenaite & Kosta, 2017), showing that neither the GDPR nor EU law in general provides guidance on who the minors are.

Under EU consumer law, specific measures shall be taken to protect vulnerable consumers who may be more susceptible to harm due to factors such as “psychological infirmity, age or credulity” (see, e.g., Recital 34 Consumer Rights Directive; Recital 18 and Article 5(3) Unfair Commercial Practices Directive). Both instruments mention children, but neither provision defines them. A social media service provider can also act as a trader and must therefore comply with EU consumer law and market regulation (CPC Network, 2016). This means that social media platforms must be careful not to engage in unfair practices that could harm vulnerable consumers. In determining whether a commercial practice is unfair, EU law considers the impact on a vulnerable consumer, even if the same practice might be considered fair when judged against the “average consumer” or a group of average consumers (EDPB, 2022). In this regard, in addition to the obligations arising from the GDPR or the e-Commerce Directive, the additional safeguards for vulnerable consumers provided in the Unfair Contract Terms Directive (UCTD), the Unfair Commercial Practices Directive (UCPD), and the Consumer Rights Directive (CRD) apply to online platform services offered to minors.

With regard to the presence of minors on the platform, the Audiovisual Media Services Directive (AVMSD) is also relevant, as TikTok is considered a video-sharing platform. The revised AVMSD requires Member States to introduce appropriate measures to prevent and moderate content that may affect children’s physical, mental, or moral development (Article 28b AVMSD).

Terms of Service and Consumer Law

Social media users are exposed to (hidden) advertising that manipulates consumers’ preferences, often disguised as non-sponsored recommendations (Goanta & Ranchordás, 2020). Given that TikTok’s user base is mainly composed of vulnerable consumers (minors), and following several complaints from national consumer organizations, BEUC carried out a study on TikTok’s commercial and advertising practices (BEUC, 2021d). The analysis revealed several violations of EU consumer law, finding that TikTok’s “Terms of Service” (ToS) contain unclear, ambiguous terms and create an uneven power dynamic between the platform and its users (ibid.).

TikTok Virtual Items and Coins

Engagement with the app and usage time are encouraged through TikTok’s Virtual Items Policy. TikTok “Coins,” which can be purchased on the platform (only by users aged 18 or over), allow users to “activate” gifts to show appreciation and interact with live-stream content creators. Based on the number of gifts received as well as the duration and frequency of the videos uploaded, TikTok calculates the popularity of creators to award “Diamonds,” which content creators can monetize under the Reward Programme. The system is designed to incentivise the creation of “high-quality, engaging content on the Platform.”Footnote 16 Moreover, the TikTok Creator Fund rewards content creators according to the number and authenticity of views and the level of engagement with their content. The criteria require creators to be older than 18, have at least 10,000 followers, have accrued at least 100,000 video views in the last 30 days, and hold an account that complies with the TikTok Community Guidelines and Terms of Service.Footnote 17

Under Directive 2005/29/EC (the “Unfair Commercial Practices Directive” or “UCPD”), children and teenagers are considered “vulnerable consumers” who may be harmed by unfair commercial practices. The European Commission’s Guidelines on the UCPD also recognize teenagers as a group of vulnerable consumers. In particular, Article 5(3) UCPD establishes that where a commercial practice is targeted at a specific group of consumers, the effects of the practice should be evaluated from the perspective of the average member of that group. Moreover, Point 28 of Annex I UCPD explicitly bans direct exhortations to children in commercial communications. This is also emphasized in the UCPD Guidance, which considers children and young people to be the likely target audience of such practices.

The UCTD, in turn, considers unfair any contractual term that has not been individually negotiated where it causes a significant imbalance between the parties to the detriment of the consumer (Article 3(1)). Although the UCTD does not establish specific protections for vulnerable groups such as minors, unfair terms are all the more problematic where the aggrieved party is a minor. Notwithstanding this, neither the UCPD nor the UCTD defines what is to be considered a minor.

Article 5 of the UCTD requires contract terms to always be drafted in plain, intelligible, and unambiguous language. Yet, it has been found that TikTok’s Virtual Items Policy contains complex terms and uses legal and commercial language, making it difficult for consumers, especially the app’s young audience, to understand (BEUC, 2021d). Moreover, although the service is available in every EU country, the Virtual Items Policy is not available in all EU Member States’ languages (ibid.).

It was also found that TikTok’s Virtual Items Policy infringes Article 6 CRD and Article 7(4)(c) UCPD, since the Policy does not provide clear pre-contractual information about the actual value, displayed in the form of “TikTok Coins” rather than actual currency, of the “virtual gifts” that can be sent to content creators. Similarly, TikTok’s Virtual Items Policy breached the CRD by not providing clear and sufficient information about the right of withdrawal or its absence (ibid.). Following different complaints from consumer organizations, the latest TikTok Coin Policy, updated in June 2022, now includes a section detailing withdrawal rights.

User-Generated Content

Terms regarding the ownership of user-generated content are also problematic. For example, by accepting TikTok’s ToS, content creators irrevocably grant a royalty-free worldwide license that allows user-generated content to be used, distributed, and reproduced by TikTok, its partners, or affiliated parties without remuneration.

[Figure a: excerpt from TikTok’s Terms of Service]

By creating such an imbalance, TikTok’s contractual framework does not comply with Article 3(1) UCTD. Although the term appears clear with regard to the platform’s use of user-generated content, it is quite unlikely that users will delete or stop using the app upon learning about the grant of licenses (Polito et al., 2022).

Misleading Practices

TikTok infringes Article 5(3) UCPD since its contractual framework is ambiguous and provides insufficient information about data collection policies that concern a vulnerable group, children, the very public at which the platform is largely targeted.

Younger users often do not understand that most of the content they watch on the platform is commercial advertising. For example, according to a study by the Catalan Audiovisual Council, 93% of TikTok’s content contained hidden advertisements (Consell de l’Audiovisual de Catalunya, 2020). Promoting certain products or services without easily identifiable toggles (e.g., #Ad) signalling branded content involves a breach of Article 7 UCPD due to misleading omissions, since consumers, usually minors, are not sufficiently informed of TikTok’s business model and how it builds on users’ data (Luzak & Goanta, 2022).

Infringements of EU Data Protection

The ambiguity, lack of transparency, and unfairness found in TikTok’s ToS and business practices not only involve breaches of EU consumer law provisions but also have important ramifications for the fundamental right to personal data protection.

TikTok seems to fall short of fulfilling the information requirements relating to the processing and sharing of users’ personal data. Most importantly, given that a considerable share of TikTok’s users are minors, including children under 13, there are special considerations that the platform should observe when processing the personal data of vulnerable data subjects, as stated in the GDPR (Recital 38), as well as specific conditions applicable to a child’s consent in relation to information society services (Article 8).

Based on a detailed examination of the platform’s privacy policies, including a chronological analysis of their continuous updates, Ausloos and Verdoodt (2021) found several violations of users’ privacy. By not stating the purpose(s) and legal basis for data collection and data processing, TikTok’s practices do not respect data protection principles and prevent an adequate assessment of GDPR compliance and a proper enforcement of the rights of vulnerable data subjects.

Infringement of Children’s Data Protection Rights

TikTok lacks mechanisms to protect children’s data and fails to consider their needs in its privacy policy and service design. For example, TikTok’s services and privacy policy do not provide mechanisms and information regarding the special protection of children for data protection purposes, including in the context of profiling, and lack a child-centred approach in the design of the service, disregarding Recital 38 GDPR (Ausloos & Verdoodt, 2021).

Despite its Community Guidelines, TikTok allows children under 13 to create accounts since neither age verification nor parental consent requirements are in place, making it easy for children to bypass the age threshold by entering a false birth date (Pasquale et al., 2022). This issue has also been noted by the Italian Data Protection Authority.Footnote 18

The technical design of the platform amplifies the effects of the lack of specific safeguards for children. The undifferentiated treatment of minors and adults in the platform’s data processing activities contravenes the GDPR and reinforces children’s vulnerability to exposure to advertising and recommender systems based on profiling.

Recently, in addition to concerns about accounts dedicated to watching children, research has found a hyper-sexualization of children on the platform (Soriano-Ayala et al., 2022). The current business model allows potential abuses to scale easily, since the algorithm’s suggestions exponentially increase the visibility of minors’ content and accounts.

[Figure b]

This policy and technical setting can result in the unintended exposure of minors’ content. While accounts of users aged 13 to 15 are private by default, for teens aged 16 to 17 the default setting is public.

[Figure c]

The issue of consent by minors regarding their privacy and personal data remains elusive. Pursuant to Article 8 of the GDPR, the minimum age to consent to the lawful processing of personal data in relation to information society services is 16. For services offered directly to children under 16, consent shall be given or authorized by the holder of parental responsibility (Article 8(1)). Under a flexibility provision, Member States can also establish lower thresholds, provided that the age is not below 13 (ibid.). However, the minimum age for lawfully giving consent is not fully harmonized across the EU, and Member States have established different age thresholds, ranging, for example, from 13 years in Spain or Sweden to 16 in Germany (European Union Agency for Fundamental Rights, 2018).

Disregard of GDPR’s Data Processing Principles

TikTok does not sufficiently inform users about the connection between the purposes of data collection and the specific legal basis used for such processing activities. On these grounds, the lawfulness of the processing remains uncertain (Article 5(1) let a GDPR). Furthermore, in their report, Ausloos and Verdoodt also question the validity of combined legal grounds for lawful processing and the compatibility of this combination with the principle of data minimization (Article 5(1) let c GDPR).

[Figure d]

The code that enables browsing within TikTok’s content (i.e., the in-app browser) allows TikTok to monitor keyboard input and taps, a practice known as keylogging.Footnote 19 In sum, the duration and the amount of data collected cast doubt on the compliance of the processing activities with the purpose and storage limitation principles (Article 5(1) let b and e GDPR, respectively). Moreover, regarding the legal basis for data processing, TikTok’s reliance on consent for the purposes of marketing communications does not fulfil the requirements provided in the GDPR (Article 4(11) GDPR) or the ePrivacy Directive, by virtue of which consent needs to be specific and informed for processing to be lawful (Article 6(1) let a GDPR). Instead, commercial profiling is embedded in different legal bases and scattered across different processing purposes in TikTok’s Privacy Policy. For example:

[Figure e: excerpt from TikTok’s Privacy Policy]

The Article 29 Working Party, the predecessor of the European Data Protection Board (EDPB), had already noted that contractual necessity “is not a suitable legal ground for building a profile of the user’s tastes and lifestyle choices based on his clickstream on a website and the items purchased (…)” (Article 29 Working Party, 2014). Furthermore, BEUC has denounced the use of certain identifiers by the platform and by third parties that track users’ activity for profiling and advertising purposes upon installation of the mobile app. Technically, installing TikTok’s app on a mobile device enables third-party identifiers even if the user is not logged in or has deactivated ad personalization (Android), and even if the user resets and activates Limit Ad Tracking (iOS) (BEUC, 2021b). The ambiguity about the reach of TikTok’s data processing is also observable with regard to non-users.Footnote 20

Protection of Minors in the AVMSD

The revised AVMSD includes an obligation for Member States to enact legislation that protects minors from inappropriate content. Article 28b AVMSD aims to protect minors from harmful user-generated videos and audiovisual commercial communications that may impact their physical, mental, or moral development.

These rules apply to TikTok given its role as a “video-sharing platform” under the AVMSD (Article 1). TikTok does not take adequate measures against hidden advertising on the platform, nor does it adequately restrict children’s and teenagers’ exposure to such commercial content or to inappropriate content. For example, BEUC denounced that TikTok’s algorithm regularly exposes minors to potentially illegal or degrading content and criticized the company for failing to take sufficient action to prevent children and teenagers from being exposed to inappropriate content (BEUC, 2021d).

In sum, the potential (in)compatibility of some of TikTok’s practices with EU law is too relevant to be ignored. The platform’s technical and self-regulatory framework seems to fall short of EU privacy and consumer protection standards. Consequently, different data protection and consumer authorities are investigating these issues. Leaving aside the specific infringements, the focus hereafter is on what did (or did not) happen once the violations were identified.

There’s Something About the CPC Regulation: BEUC v TikTok

Under EU law, national authorities are responsible for the enforcement of EU consumer law, regardless of whether the contested practices fall under the scope of the UCTD, the UCPD, or the CRD. As a result of the increasing cross-border nature of consumer transactions in the EU, the Consumer Protection Cooperation Regulation (Regulation (EC) No 2006/2004) set up a network, the Consumer Protection Cooperation Network (“CPC Network”), to better coordinate the activities of the competent authorities designated in the Member States for the enforcement of consumer law. The purpose of establishing the CPC Network was to allow national authorities from all countries in the European Economic Area to cooperate against consumer law violations when the trader and the consumer are established in different countries. In 2017, Regulation (EU) 2017/2394 on consumer protection cooperation (the “CPC Regulation”) granted national authorities wider powers to detect consumer law violations and to act rapidly against offenders.

Under the existing framework, the European Commission can “alert” the CPC Network and coordinate EU-wide enforcement action to address practices that harm a large majority of EU consumers, with a view to ensuring the consistent application of consumer law within the internal market (Article 26(2) CPC Regulation). Upon suspicion of consumer law infringements, competent national authorities can, and shall, notify the Commission, other competent national authorities, and single liaison offices without undue delay about the potential breach(es) (Article 26(1) CPC Regulation). Member States can also confer on designated bodies, European Consumer Centres, consumer organizations and associations, including trader associations, that have the necessary expertise the power to issue an “external alert” to the relevant competent authorities and the Commission about suspected infringements (Article 27 CPC Regulation).

In February 2021, BEUC, as one of the designated bodies, together with 18 of its member organizations, issued an external alert reporting the problems identified with TikTok’s practices (BEUC, 2021c). Letters were also sent to the EDPB (BEUC, 2021b) and, given TikTok’s status as a “video-sharing platform” under the AVMSD, to the European Regulators Group for Audiovisual Media Services (ERGA) (BEUC, 2021a). The purpose of this initiative was to encourage the European Commission and national authorities to investigate TikTok’s potentially unfair and misleading practices, which not only harm consumers in general but also put minors, as a vulnerable group, at risk. Accordingly, BEUC asked the European Commission and national authorities to develop a “common position” based on their investigation and assessment of TikTok’s practices to ensure compliance with EU consumer law.

In response to the external alert, in May 2021, the European Commission and the CPC Network launched a “formal dialogue” with TikTok to review the company’s commercial practices and policies. While announcing the launch of the dialogue, Commissioner Reynders emphasized the risks for vulnerable consumers posed by the targeting of minors and disguised advertising in a context of accelerated digitalization.Footnote 21

In the months following the submission of the external alert, there were various exchanges with the different authorities. The resulting actions by the alerted authorities were seen as insufficient (BEUC, 2022a).

Attention should be paid in this case not only to the fact that the authorities did not take enforcement measures as provided in the CPC Regulation to bring about the cessation or prohibition of the widespread infringement, including the imposition of penalties where appropriate (Article 21), but especially to the procedure that followed the external alert. More specifically, what deserves attention is not so much the insufficient response by the competent authorities as the extra-legal nature of the steps taken following the call for investigation.

Under the CPC Regulation, competent authorities may invite the trader to propose commitments to cease the infringement (Article 20). However, that very same provision establishes that such an invitation is to be made on the basis of a “common position” (Article 20(1)). “Where appropriate,” a common position is agreed upon by the group of competent authorities concerned by a coordinated action (Article 19(3)). Accordingly, whereas the outcome of a coordinated action does not have to be set out in a common position, the provision can be read as requiring that the invitation to the trader to offer commitments be based on a common position. In the case at stake, no common position was agreed upon by the competent authorities. A “dialogue” between the European Commission and TikTok was initiated, presumably under the possibility, provided in Article 20(1) in fine, for the trader to propose commitments on its own initiative. However, in a press release, the European Commission specifies that, following BEUC’s external alert, “the Commission, together with the CPC, and led by the Irish and Swedish consumer authorities, launched a dialogue with TikTok.”Footnote 22 Therefore, the issue of whether a common position is required to initiate negotiations with an allegedly infringing trader requires further clarification.

According to the available information, TikTok proposed a set of commitments for major updates to its policies (dated June 2022), which included, inter alia, a bigger label signalling paid advertisements and the application of TikTok’s Branded Content Policy to EEA users, amendments to policies to provide more understandable information on personalized advertising, further transparency regarding the functioning of Virtual Items, and information about consumer rights concerning the purchase and use of Coins and Gifts.Footnote 23

In the same month, the European Commission simply announced the closure of the investigation, considering that the “series of concerns have now been addressed and TikTok committed to change its practices” (ibid). There is no formal document or decision other than the press release indicating the outcome and finalization of the investigation triggered by BEUC’s alert. Nor does the press release include a timeframe for ceasing the infringements (cf. Article 20(1) CPC Regulation). In BEUC’s view, the agreed commitments are not fully compatible with the five key principles of fair advertising to children endorsed by consumer and data protection authorities,Footnote 24 leaving important issues unsolved (BEUC, 2022b).

In sum, the procedural irregularities in the enforcement of consumer law in the TikTok case are just one example of a worrisome trend also observed in other external alerts under the CPC Regulation, concerning WhatsApp and airlines’ practices during the pandemic (BEUC, 2022a). In the case at stake, it has been argued that the procedures involved informal dialogues and that the designated entities were not regularly informed about the status of their alerts (ibid.). The following section explores how the functioning of the current CPC mechanisms and other procedural and institutional deficiencies hinder the application of the rules protecting minors online.

Too Many Cooks: Overlapping Competences and Rules

The number of updates to TikTok’s community and privacy guidelines makes it more difficult to identify and assess violations, thereby hindering effective enforcement against previous wrongdoings (Ausloos & Verdoodt, 2021). As described above, consumer and data protection standards have been incorporated into unclear, complex, and lengthy contractual and privacy policies in a manner that allows the collection and retention of massive amounts of data without users’ knowledge of the purposes of data collection (Roth, 2021).

While the aim of this paper is not to discuss the fairness of the processing, which has been done extensively elsewhere (e.g., Clifford & Ausloos, 2018; Helberger et al., 2017; Malgieri, 2020), here we take the view that the legality of the processing of personal data is contingent on the UCTD’s unfairness test. An interpretation of the GDPR’s legal bases for lawful processing of personal data links the UCTD’s requirement of plain, intelligible language to the principle of fairness enshrined in Article 5(1) let a GDPR (EDPB, 2019). This means that where the processing of personal data is based on what is considered an unfair term under the UCTD, it “will generally not be consistent with the requirement under Article 5(1)(a) GDPR that processing is lawful and fair” (ibid.).

This interpretation triggers two important considerations regarding the enforcement of the specific safeguards for minors provided by EU consumer, data protection, and audiovisual media services law. First, a siloed approach to the application of the rights of the consumer/data subject paradoxically results in an insufficient response to violations. Second, the GDPR’s one-stop shop mechanism incentivises regulatory arbitrage and exacerbates the lack of enforcement.

Horizontality v Verticality in the Digital Age

As part of the New Consumer Agenda, the European Commission is currently testing the fitness of European consumer law to ensure equal fairness online and offline (European Commission, 2022). The rules resulting from this assessment are expected to shed light on whether horizontal consumer law can address the challenges posed by the digital environment as well as ensure coherence with forthcoming legislation such as the DSA or the AI Act.

Indeed, consumer law is an important piece in the protection of digital services’ users, since it can help counterbalance the intrinsic power imbalance between online platforms and users (Helberger et al., 2017). However, in practice, the existence of different authorities, consumer protection authorities on the one hand and data protection authorities on the other, raises certain challenges.

The reconstruction of the (in)action following BEUC’s external alert on TikTok provides a telling account of overlapping scopes of application of EU laws and competing enforcement authorities (Cantero Gamito & Micklitz, 2023). Both the EU consumer acquis and the GDPR provide different enforcement options as well as different interpretations of what constitutes an infringement. TikTok is a good example of this: the very same business practices can fall under consumer protection law, data protection law, e-commerce law, and audiovisual law, all at the same time. Naturally, the more applicable rules there are, the greater the potential for friction. Yet, even within the realm of consumer law, different national enforcement authorities may choose to rely on either the UCTD or the UCPD against the same business practices. This choice is often influenced by the authority’s prior experience and familiarity with the respective laws (Cafaggi & Micklitz, 2009).

Regarding the contractual fairness required to legally process personal data (of minors), data protection authorities claim that they are not competent to decide on the validity of contracts (EDPB, 2022, at paras. 65 and 108). And, while consumer organizations can, ex officio, bring collective proceedings against practices that affect the rights of data subjects in cases of infringement of rules on consumer protection or unfair commercial practices, consumer protection authorities are not competent to monitor compliance with or to impose fines for violations of the GDPR (e.g., Case C-319/20, Meta Platforms, 2022). Accordingly, the allocation of competence to supervisory authorities under the GDPR (Article 51ff GDPR) casts doubt on the ability of national authorities other than the designated national supervisory authority to monitor compliance with GDPR requirements. For example, in Germany, the Federal Cartel Office (FCO) initiated proceedings against Meta, which resulted in the prohibition of the data processing provided for in Facebook’s ToS and the imposition of measures to stop Meta from carrying it out. The FCO found an abuse of the company’s dominant position in the way consent for data processing was obtained. The compatibility of the FCO’s competence to examine GDPR compliance is currently under examination (Case C-252/21, Meta v Bundeskartellamt, n.d.). Pending the judgment, AG Rantos suggested that a competition authority, within the framework of its powers under the competition rules, may examine, as an incidental question, the compliance of the practices investigated with the GDPR, while taking into account any decision or investigation by the competent supervisory authority under that regulation and informing and, where appropriate, consulting that authority.Footnote 25 At the moment, the competence of non-DPAs to enforce the GDPR remains limited, and non-lead supervisory authorities cannot circumvent the lead DPA by filing a judicial action before the courts unless very specific circumstances apply (C-645/19, Facebook Ireland and Others v Gegevensbeschermingsautoriteit, 2021). Therefore, we will have to wait for the final judgment to see further developments, especially given that the Irish Data Protection Commission (“DPC”), under the GDPR’s jurisdictional rules (one-stop shop), is the lead supervisory authority overseeing the cross-border data processing activities of most of the large data processing companies, which are traditionally based in Ireland.

One-Stop Shop and Principle of Origin

The deficiencies of the GDPR’s institutional design have limited its effectiveness (Gentile & Lynskey, 2022). But most irritation stems from the perceived lack of action by the Irish Data Protection Commission (DPC), the lead authority responsible for monitoring compliance with and enforcement of data protection rules under the GDPR’s “one-stop shop” mechanism.Footnote 26 This has triggered an institutional conflict between the DPC and other national authorities (e.g., public criticism by the German authorities) and a 2021 motion for a resolution by the European Parliament (LIBE Committee) calling on the European Commission to initiate an infringement procedure against Ireland over the lack of GDPR enforcement (European Parliament, 2020).Footnote 27

TikTok’s decision to move its headquarters to Ireland was a game changer in terms of enforcement jurisdiction. Before the company had a European headquarters, each national DPA was competent to enforce data protection law against the platform. However, once the company moved its operations to Ireland, enforcement responsibilities shifted and the allocation of competences under the GDPR (Article 55) came into effect. For example, when the Dutch DPA imposed a fine on TikTok in 2021 for violating children’s privacy, the case was transferred to the Irish DPA.Footnote 28 Moreover, in early 2020, the Italian DPA requested that the European Data Protection Board establish a “TikTok Taskforce”Footnote 29 to identify and coordinate investigations into the platform’s practices across the EU. However, by mid-2021, the EDPB taskforce’s work appeared to have ended, coinciding with the Irish DPA taking over responsibility for enforcing the GDPR against TikTok following the company’s move to Dublin. In addition, based on an exchange between the EDPB and TikTok in February 2021, it appears that the investigations conducted by the taskforce were supposed to be confidential.Footnote 30

In the EDPB’s response to a letter from BEUC reporting TikTok’s intention to change the legal basis for processing personal data from consent to legitimate interest, the EDPB announced that the Irish DPC had “engaged” in a conversation with TikTok and that, subsequently, the attempted modification of the legal basis was withdrawn.Footnote 31 The content and extent of such engagement remain unknown. For example, it is unclear whether the latest updates to TikTok’s Privacy Policy are part of the compromise reached with the Irish DPC. As part of the latest amendments, TikTok now informs users of the followingFootnote 32:

[Figure f: excerpt from TikTok’s updated Privacy Policy]

This acknowledgment also raised concerns about the transfer of European users’ personal data to employees located in China. To date, despite the passing of the Chinese Personal Information Protection Law (PIPL) in 2021, there is no adequacy agreement between the EU and China to enable data transfers. There is equally no information about the DPC’s ongoing investigations into the compliance of TikTok’s transfers of personal data to China with the GDPR. The resulting ambiguity suggests that the disclosure may be part of the compromise between the platform and the Irish authority.

[Figure g]

All things considered, it seems that the Irish DPC is moving towards a lenient approach in which mere transparency is considered an adequate response to the potential exploitation of vulnerable users, minors, setting aside the fact that data privacy rules around the world differ considerably. It has been shown that the information disclosure paradigm not only fails to prevent abuses (Helberger et al., 2021) but also legitimizes platforms’ practices and self-regulatory powers (Maroni, 2023).

The information paradigm should be considered in light of the validity of consent under data protection legislation. The CJEU, in line with the definition of consent provided in the GDPR (Article 4), interpreted the formerly applicable Directive 95/46 on personal data protection as requiring that processing is lawful only where the data subject has given his or her consent “unambiguously” (C-673/17, Planet49, 2019, at para. 54). This, together with the requirement for consent to be “freely given,” raises the question of whether it is reasonable to expect that children, as vulnerable users, freely give consent to processing activities for advertising purposes when accepting the terms and conditions to access the platform’s content, and not only when registering an account. Informed consent also involves knowledge about the long-term consequences after consent has been given and about the data management and control mechanisms at hand (Helberger et al., 2021). In fact, it has more recently been argued that users of dominant platforms lack choice, and that there are therefore objections to the use of contractual necessity as a legal basis for processing, since freedom of consent can be impaired where the provider holds a dominant position in the domestic market for online social networks.Footnote 33 Yet, despite this trend towards the information paradigm, we should not underestimate the intrinsic value of the GDPR and its institutional apparatus against data protection violations.

The next issue concerns the country of origin principle in the revised Audiovisual Media Services Directive (AVMSD). By allocating competence to the authorities of the Member State where the service provider is headquartered, the Directive results in a regulatory approach more focused on setting common standards than on creating an apparatus for mutual enforcement (Cavaliere, 2021). The new Irish media regulator, Coimisiún na Meán, will therefore also be responsible for overseeing compliance with the AVMSD, which includes provisions for video-sharing platforms such as TikTok. Article 28b AVMSD requires Member States to ensure that minors are protected from harmful user-generated videos and audiovisual commercial communications that may impact their physical, mental, or moral development. However, in addition to the late implementation of the Directive in the country, the previous media regulator, the Broadcasting Authority of Ireland (BAI), did not consider itself competent to enforce the AVMSD rules on video-sharing platforms pending the implementation (BEUC’s second letter to DG Connect). As a result, the situation raises the question of whether national authorities can take action in the absence of transposition and whether the Directive has any effect prior to its implementation, including potential state liability under the Francovich (1991) doctrine.

Insufficiency of Private Enforcement

Generally, platform users often face difficulties in obtaining redress (Kosian et al., 2022). The reasons why users fail to seek redress are manifold, including a lack of knowledge about the available options, distrust of in-house mechanisms, and an overall reluctance to engage in dispute resolution at all (ibid.). Arguably, these difficulties cannot be overcome by way of the regulatory mechanisms under the brand-new Digital Services Act, which puts in place out-of-court mechanisms to increase access to justice (see Articles 20 and 21 DSA). Where the aggrieved party is a vulnerable consumer, how can a right to out-of-court settlement help a child? The CJEU has already acknowledged that vulnerable consumers can be reluctant to use available remedies even where the contract terms used against them are clearly unfair (e.g., Pohotovosť, 2010; Milena Tomášová, 2016).

Ex-post private enforcement does not seem to be sufficiently effective against platforms’ wrongdoings. A more proactive approach to enforcement would be preferable to ensure that EU law is adequately applied. However, despite the substantial procedural and institutional set-up under the EU rules, important problems remain, making the online space a territory where violations occur and enforcement is scarce.

As a matter of fact, setting aside the GDPR, contemporary rules on online platforms do not grant rights. For example, the DSA is largely about procedures, not rights or remedies (Busch & Mak, 2021). Moreover, the DSA is about regulatory harmonization (it uses Article 114 TFEU as its legal basis) and, one could argue, little more than that. Besides the right to information (Article 32 DSA) and the right to an out-of-court settlement (Article 21), the DSA does not provide individual rights that platform users can rely on. Even the prohibition of profiling and dark patterns (Articles 28 and 27, respectively) relies on the GDPR. The solution is, and remains, consumer law, but the increasing institutionalization (DPAs, national authorities under the DSA, etc.) should not result in the weakening of consumer law enforcement.

Recommendations for Better Protecting Vulnerable Consumers in the Digital Single Market

This section ponders different regulatory alternatives that may contribute to enhancing the protection of minors in the digital single market. The proposals and recommendations focus on the particular lessons that can be learnt from BEUC’s external alert on TikTok’s practices and on the role that can and should be attributed to consumer organizations. Not only were the expectations about the CPC mechanisms unfulfilled (BEUC, 2022a); other existing mechanisms against consumer and data protection law infringements also seem to be underperforming.

Consolidating the Consumer Acquis as a Safety Net for Vulnerable Consumers

The EU consumer acquis, particularly the UCTD and the UCPD, serves as a crucial safety net for the protection of vulnerable consumers, including minors. These directives have a broad scope and cover various types of business practices, regardless of whether they pertain to data privacy or audiovisual media services. However, considering recent developments in the digital single market, such as the GDPR, the AVMSD, and the DSA, it is crucial to ensure that the safety net provided by the UCTD and UCPD is not undermined. One way to achieve this is to state clearly in any new legislation that the new rules do not affect the applicability of the safety net provided by the UCTD and UCPD, for example by using “without prejudice” language. This would be useful when issues arise with regard to the applicable law in cases of overlapping scopes, e.g., children as users of the platform (“recipients of the service” in the DSA’s terminology) but also as consumers when they buy TikTok Coins, including the application of financial services legislation such as Directive 2002/65/EC on distance marketing of financial services.

The UCPD is designed to have a broad scope of application, covering a wide range of business strategies, including standard contract terms. In practice, this means that standard terms are considered a specific form of business strategy that must also comply with the legal requirements outlined in the UCTD. The primary enforcement entities for consumer law are national consumer agencies and/or consumer organizations, and consumer law does not have a country of origin principle. Therefore, in practice, national enforcement authorities may impose sanctions on the same business strategy of the same company based on either the UCPD or the UCTD.

Based on the experience with BEUC’s external alert on TikTok, the enforcement of the UCTD, the UCPD, and the other EU rules comprising the consumer acquis could benefit from improvements to the procedural enforcement rules contained in the CPC Regulation.

Improvements to the CPC Regulation

Considering the TikTok experience, revisions to the CPC framework are necessary to abolish informal arrangements, to strengthen the role and function of commitments, and to adapt procedures for a faster response. A few specific improvements have recently been proposed (BEUC, 2022c).

The CPC Regulation does not provide for a formal procedure for handling alerts of widespread infringements. The same applies to the functioning of the “commitments” proposed by a trader to cease infringements (Cantero Gamito & Micklitz, 2023). In this regard, there are significant concerns as to whether the Member States and the European Commission faithfully adhere to the procedural rules outlined in the CPC Regulation when addressing such alerts. For example, the CPC Regulation allows Member States to assign specific tasks to designated bodies and to give competent authorities the power to consult consumer organizations, trader associations, designated bodies, or other relevant parties regarding the effectiveness of commitments made by a trader to stop an infringement (Recital 7). However, Member States are not required to involve designated bodies or to provide for consultations with these groups “regarding the effectiveness of the proposed commitments to cease the infringement covered by this Regulation” (ibid.).

Beyond the above, the experience with the external alert on TikTok shows that there are reasons to conclude that the current CPC framework operates below the level of “common positions” and “commitments.” The CPC Regulation outlines a specific procedure for addressing external alerts, including the roles and responsibilities of the various parties and the appropriate course of action, with a view to promoting cooperation, voluntariness, and confidentiality.

In this regard, it is worth drawing a parallel with Regulation 1/2003 on competition law enforcement, which formalized the commitment procedure and remedies in order to eliminate informal arrangements (Schweitzer, 2008). Regulation 1/2003 made commitments binding and established the basis for monitoring commitment decisions, either by the European Commission or through consulting firms acting on its behalf (Article 9). In competition law, agreements outside Regulation 1/2003 may be permissible prior to the initiation of official proceedings or in response to a complaint from a competitor. Consumer law does not operate like this. In the case of TikTok, it appears that violations of consumer law were resolved through private negotiations and insufficient commitments that did not address the underlying issues that prompted the investigation, and with no specific remedies for affected consumers. The Member States and the European Commission should have followed the rules outlined in the CPC Regulation, which would have included developing a common position under the guidance of the competent supervisory authority, ensuring compliance through commitments, and determining whether the common position and the company’s commitments should be made public. These steps were not taken.

It is suggested here that, similarly to Regulation 1/2003, binding commitment decisions should be introduced. Currently, the CPC Regulation grants consumer organizations the right to launch an external alert in order to set an investigation procedure into motion. However, Article 27 explicitly states that the competent authorities are under no obligation to open an investigation or to hear the consumer organization in a potential investigation. The TikTok case has convincingly demonstrated the shortcomings of the current mechanism. Accordingly, Article 27 ought to be revised. The competent authorities should be under an obligation to decide whether a formal investigation will be opened, and this decision should naturally be subject to judicial review.

Counterbalancing Jurisdictional Rules

The TikTok case is also explained by the deficiencies resulting from the country of origin principle in the AVMSD and the one-stop shop mechanism under the GDPR. As explained above, it is no secret that there are important disagreements between the Irish Data Protection Commission and the EDPB. Recently, the Irish DPC announced that it plans to take legal action against the EDPB and would bring an action for annulment before the CJEU “in order to seek the setting aside of the EDPB’s directions” because it believes that the EDPB has exceeded its authority and does not have a general supervisory role.Footnote 34 This would be an unparalleled move by the Irish body, as the activities of national authorities are to be based on sincere cooperation (Gentile & Lynskey, 2022).

Seen through the lens of consumer organizations, the most effective solution to counterbalance the potentially detrimental effects of the existing jurisdictional rules could be to empower consumer organizations to launch a complaint and to oblige the competent authority to take action within a specific timeframe. Should the competent authority delay the procedure, the designated entities concerned would be authorized to formally request the competent agency to act, under threat of possible sanctions.

Inspiration for this institutional setting could be found in the still-applicable ePrivacy Directive (Directive 2002/58/EC), which provides that Member States shall ensure that the competent national authority and, where relevant, other national bodies have the power to order the cessation of infringements (Article 15a(2)). Moreover, it also enables national regulatory authorities to adopt measures to ensure effective cross-border cooperation in the enforcement of national laws and to create harmonized conditions for the provision of services involving cross-border data flows (Article 15a(4)).

Institutional Embeddedness

One of the major problems in the enforcement of EU law against unlawful practices is the overlap of competences between competent national authorities and European-level bodies. There is no mechanism in place to coordinate the responsibilities divided between consumer agencies/consumer organizations, data protection authorities, and the authorities competent for the enforcement of audiovisual media services rules.

Following the entry into force of the DSA, important institutional changes in the enforcement of the regulatory architecture of the digital single market are to be expected. In this light, the EDPB is calling for institutional embeddedness with the new regulatory instruments such as the DSA, the DMA, the Data Act, and the AI Act (EDPB, 2022). This reinforces the “too many cooks” narrative: there are many institutions involved and many rules to be enforced.

The integration of various legal fields within a single governing authority is uncharted territory, not only within the European Union but also among its Member States. While some Member States have sought to combine consumer law and competition law, none has arguably yet considered merging consumer law with data protection law or audiovisual media services law. This lack of precedent makes it challenging to find an appropriate solution at the European level. However, a commonality among the enforcement structures of the aforementioned legal fields is the presence of a lead supervisory authority, determined by the headquarters of the entity in question or through joint or unilateral decisions made by the EU or the Member States. One potential solution is the establishment of a horizontal structure in which the lead authorities of the various legal fields are grouped together to seek a common solution and prevent deviation. This would be a first step towards closer harmonization among the competent agencies (Micklitz, 2022). A second step would involve implementing a procedure beginning with the exchange of information and mutual consultation. Finally, a rule should be established allowing for agreement on the substance of potential infringements, the specific laws that have been violated, and a binding agreement on potential solutions.

Conclusions: Neither in Theory Nor in Practice

Following the recent adoption of the DSA in October 2022 and mounting pressure from potential multi-million euro fines for violations of the GDPR, platform operators, including TikTok, are updating their contractual frameworks in anticipation of a conceivable “Brussels effect” on Big Tech (Bradford, 2020; Keller, 2022). Yet, despite the EU’s growing international reputation for its capacity to curb the power of very large online platforms and protect users’ fundamental rights online, an empirical observation of EU law’s institutional and procedural apparatus may show that expectations are not (yet) fully met.

The quote “[l]aws which are consistent in theory often prove chaotic in practice” has been attributed to Napoleon (Bertaut, 1916). The case at stake illustrates in a nutshell that, despite the different applicable rules (the consumer acquis, the GDPR, the AVMSD, and more recently the DSA, among others), the digital single market still lacks a system that clarifies the distribution of enforcement responsibilities given the multiplicity of laws and actors involved. TikTok is paradigmatic in that it plainly demonstrates how one and the same business strategy falls within the scope of application of many EU laws, thereby automatically creating delimitation problems at the substantive level and competing responsibilities at the enforcement level.

There is no doubt that the efforts by the European Commission to provide a comprehensive framework for regulating platforms are welcome, and they already signal a step in the right direction towards further coherence and effective regulatory action. Things are changing, and business as usual is no longer an option in the EU.Footnote 35 However, the legislator ought to consider the limitations of a siloed approach in which legislative and procedural gaps are arguably exploited by companies whose business models are in fact built around such loopholes.

This paper has claimed that too many cooks spoil the broth and has shown that there are legislative shortcomings in the institutional and procedural design through which infringing practices remain unaddressed. The paper has also offered some recommendations to improve the protection of minors in the digital single market under the currently applicable rules. The broader perspective, considering both overlapping scopes of application and overlapping enforcement competences, could be relevant in light of the recently enacted DSA, which also establishes a set of procedures and actors, such as the Digital Services Coordinator, and designates competent national authorities to monitor legal compliance. With the new regulatory framework for online platforms, there is an opportunity to clarify and coordinate enforcement actions and mechanisms for more accountable platform practices.