1 Introduction

This OpinionFootnote 1 describes and summarises the results of the interdisciplinary research carried out by the authors during the course of a three-year project on intermediaries’ practices regarding copyright content moderation. This research includes the mapping of the EU legal framework and intermediaries’ practices regarding copyright content moderation, the evaluation and measuring of the impact of moderation practices and technologies on access and diversity, and a set of policy recommendations. For a detailed explanation of our methods and the analysis, readers are encouraged to read our reports and associated publications on these topics.Footnote 2 This Opinion provides a condensed and updated description of our findings and policy recommendations. Following this introduction, Section 2 summarises the main conclusions and findings from our mapping analysis involving the content moderation of copyright-protected content on online platforms in the EU. This mapping analysis provides a basis for our subsequent normative and evaluative research. Section 3 then summarises the main conclusions and findings from our evaluation analysis. On that basis, Section 4 outlines our policy recommendations for EU and national policymakers.

2 Mapping of Copyright Content Moderation Rules and Practices

The main research question of our mapping analysis is how to map the impact of content moderation of copyright-protected content on online platforms on access to culture in the Digital Single Market (DSM). To answer this question we carry out a legal analysis of copyright content moderation rules at the EU and national levels (2.1), as well as empirical research into private regulation by platforms (2.2).Footnote 3

2.1 Copyright Content Moderation Rules at the EU and National Levels

The EU level analysis starts with an exposition of the baseline regime set down in EU copyright law before the approval of Art. 17 of the Copyright in the Digital Single Market Directive (CDSMD)Footnote 4, which we call the pre-existing acquis. EU law has been subject to a high level of harmonisation stemming from many directives on copyright and related rights, the interpretation of which is determined by the case law of the Court of Justice of the European Union (hereinafter "CJEU" or the "Court"). In particular, the legal status of copyright content moderation by online platforms under this regime is mostly set by the Court’s interpretation of Arts. 3 and 8(3) InfoSoc DirectiveFootnote 5 on direct liability for communication to the public and injunctions against intermediaries, and Arts. 14 and 15 e-Commerce DirectiveFootnote 6 on the hosting liability exemption and the prohibition on general monitoring obligations.Footnote 7 We explain this case law and its implications for platform liability and content moderation obligations up to the Court’s Grand Chamber judgment in YouTube and Cyando,Footnote 8 and how those developments contributed to the proposal and approval of Art. 17 CDSMD.

Setting aside the political nature of legislative processes, from a systematic and historical perspective, Art. 17 CDSMD and subsequently the Digital Services Act (DSA)Footnote 9 can be seen as the result of efforts over the last 20 years, in EU law and in its interpretation by the Court, to adapt to technological developments and the changing role and impact of platforms on society. The result has been a push towards “enhanced” responsibility for platforms, characterised by additional liability and obligations regarding the content they host and the services they provide, as well as an increased role for fundamental rights – especially those of users – in the legal framework.

The heart of this part of our analysis is the complex legal regime of Art. 17 CDSMD, which we examine in light of existing scholarship, the Commission’s Guidance on that provision,Footnote 10 the Advocate General’s (AG) Opinion, and the Court’s Grand Chamber judgment in Case C-401/19. Our analysis sets out in detail the different components of this hybrid regime, including:

  • the creation of the new legal category of “online content-sharing service providers” (OCSSPs), a sub-type of hosting service providers under the e-Commerce Directive, and “online platforms” under the DSA;

  • the imposition of direct liability on OCSSPs for content they host and provide access to;

  • the merged authorisation regime for acts of OCSSPs and their uploading users, provided the users’ acts do not generate significant revenue;

  • the lex specialis nature of Art. 17 CDSMD in relation to Art. 3 InfoSoc Directive and Art. 14 e-Commerce Directive, which is endorsed explicitly by the Commission’s Guidance and the AG’s Opinion in C-401/19, and in our view implicitly by the Court in the same judgment;

  • the relationship between the prohibition on general monitoring obligations in Art. 15 e-Commerce Directive and Art. 17(8) CDSMD, where we argue that the latter may be understood as being of a merely declaratory nature;

  • the complex liability exemption mechanism comprised of “best efforts” obligations on OCSSPs (to obtain an authorisation and to impose preventive and reactive measures) in Art. 17(4); and

  • the substantive and procedural safeguards in the form of exceptions or limitations or “user rights” and in-/out-of-platform complaint and redress mechanisms in Art. 17(7) and (9).

Our analysis addresses multiple points of uncertainty in this complex regime, some of which will no doubt be subject to litigation at the national level and likely reach the CJEU. The following aspects are worth highlighting, however, as they also reflect possible points of improvement of this regime from the perspective of copyright content moderation.

First, whether an online platform is subject to the pre-existing regime (as updated by the DSA) or the new regime in Art. 17 CDSMD will depend on its qualification as an OCSSP. Our research shows that there is significant legal uncertainty on this point, despite the Commission’s Guidance. To be sure, certain large-scale platforms, especially those with video-sharing features (e.g. YouTube, Meta/Facebook, Instagram), clearly qualify as OCSSPs. Others will just as clearly be excluded from the scope of Art. 17 because they are covered by the definitional carve-outs in Art. 2(6) CDSMD. Still, a significant grey area remains, which affects both larger platforms and (especially) medium-sized and small platforms. The main reason is that the definition includes a number of open-ended concepts (“main purpose”, “large amount”, “profit-making purpose”) that ultimately require a case-by-case assessment of which providers qualify as OCSSPs. Such an assessment would partly take place in the respective Member State, which may lead to additional uncertainty. Furthermore, even if it can be established that a platform falls within the scope of the legal definition, it might remain unclear to what extent it does.

Second, a crucial part of our analysis on platform liability and copyright content moderation obligations refers to what we call the normative hierarchy of Art. 17 CDSMD, considering the Commission’s Guidance and drawing from the arguments in the AG’s Opinion and CJEU judgment in C-401/19.

The first important implication of the judgment is that the Court recognises that Art. 17(7) CDSMD includes an obligation of result. As such, Member States must ensure that the exceptions and limitations in that paragraph are respected despite the preventive measures in paragraph (4), which are qualified as “best efforts” obligations. This point, already recognised by the AG and in the Commission’s Guidance, is reinforced by the Court’s recognition that the mandatory exceptions, coupled with the safeguards in paragraph (9), are “user rights”, not mere defences.Footnote 11

The second and related main implication of the judgment is that the Court rejects the possibility of interpretations of Art. 17 that rely solely on ex post complaint and redress mechanisms as a means to ensure the application of user rights. The judgment clarifies that Member States’ laws must first and foremost limit the possibility of deployment of ex ante filtering measures; assuming this occurs, the additional application of ex post safeguards is an adequate means to address remaining over-blocking issues. This conclusion should be welcomed, especially in light of existing evidence that complaint and redress mechanisms are seldom employed by users.

The third main implication of the judgment relates to the scope of permissible ex ante filtering by platforms. On this point, the Guidance states that automated filtering and blocking measures are “in principle” only admissible for “manifestly infringing” and “earmarked” content. However, the Court states unequivocally that only filtering/blocking systems that can distinguish lawful from unlawful content without the need for its “independent assessment” by OCSSPs are admissible. Only then will these measures not lead to the imposition of a prohibited general monitoring obligation under Art. 17(8) CDSMD. Furthermore, these filters must be able to ensure the exercise of user rights to upload content that consists of quotation, criticism, review, caricature, parody, or pastiche.

On this point, the judgment endorses by reference the AG’s Opinion, which states inter alia that filters “must not have the objective or the effect of preventing such legitimate uses”, and that providers must “consider the collateral effect of the filtering measures they implement”, as well as “take into account, ex ante, respect for users’ rights”.Footnote 12 In our view, considering the Court’s statements in light of the previous case law and current market and technological reality, the logical conclusion is that only content that is “obviously” or “manifestly” infringing – or “equivalent” content – may be subject to ex ante filtering measures. Beyond those cases, for instance as regards purely “earmarked content”, the deployment of ex ante content filtering tools appears to be inconsistent with the judgment’s requirements.
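
To make this distinction concrete, the following minimal sketch illustrates, in Python, the type of decision rule that such a reading of the judgment appears to permit. The similarity score, the threshold and the “earmarked” flag are hypothetical placeholders rather than a description of any actual system: only uploads that are (nearly) identical to the relevant and necessary information provided by rightsholders would be blocked ex ante, while everything else would remain online and be left to human or ex post review.

```python
from dataclasses import dataclass

# Hypothetical threshold: only near-identical matches are treated as "manifestly infringing".
MANIFEST_MATCH_THRESHOLD = 0.95  # an assumption for illustration, not a legal or technical standard


@dataclass
class Upload:
    content_id: str
    similarity_to_reference: float   # e.g. a fingerprint match score in [0, 1]
    earmarked_by_rightsholder: bool  # content merely "earmarked" by a rightsholder


def ex_ante_decision(upload: Upload) -> str:
    """Sketch of a filtering rule consistent with the reading of C-401/19 set out above:
    block ex ante only content that is manifestly infringing or "equivalent" (near-identical
    to the reference information supplied under Art. 17(4)(b) and (c)); all other uploads
    are presumed lawful and left to human review and ex post safeguards."""
    if upload.similarity_to_reference >= MANIFEST_MATCH_THRESHOLD:
        return "block ex ante (manifestly infringing or equivalent)"
    if upload.earmarked_by_rightsholder:
        # Earmarking alone does not justify ex ante blocking under this reading.
        return "keep online, flag for rapid human review"
    return "keep online, presumed lawful (ex post mechanisms remain available)"


if __name__ == "__main__":
    print(ex_ante_decision(Upload("a", 0.99, False)))
    print(ex_ante_decision(Upload("b", 0.60, True)))
    print(ex_ante_decision(Upload("c", 0.20, False)))
```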

It also remains to be seen whether this reasoning applies more broadly to other types of illegal content beyond copyright infringement. If it does, it might help to shape the scope of prohibited general monitoring obligations versus permissible “specific” monitoring, with relevance for future discussions on the DSA. In drawing these lines, caution should be taken in the application of the “equivalent” standard in Glawischnig-Piesczek,Footnote 13 which likely requires a stricter interpretation for the filtering of audio-visual content on OCSSPs than for textual defamatory posts on a social network.

Finally, we provide a brief analysis of the interplay between Art. 17 CDSMD and the provisions of the DSA potentially applicable to OCSSPs. On this topic, we refer readers to our parallel research, which offers an in-depth analysis.Footnote 14 The upshot of our analysis is that the DSA will apply to OCSSPs insofar as it contains rules that regulate matters not covered by Art. 17 CDSMD, as well as specific rules on matters where Art. 17 leaves a margin of discretion to Member States. We consider that such rules apply even where Art. 17 CDSMD contains specific (but less precise) regulation on the matter. Although there is legal uncertainty in this regard, such rules include provisions both in the DSA’s liability framework and in its due diligence obligations (e.g. as regards the substance of notices, complaint and redress mechanisms, trusted flaggers, protection against misuse, risk assessment and mitigation, and data access and transparency).

In light of the above, one conclusion from our analysis is the emergence of a bifurcated or multilevel legal framework for online platforms engaging in copyright content moderation. On the one hand, OCSSPs are subject to the regime of Art. 17 CDSMD as regards liability and content moderation. On the other hand, non-OCSSPs are subject to the pre-existing regime under the InfoSoc and e-Commerce Directives (and now the DSA), as interpreted by the CJEU (e.g. in YouTube and Cyando). Although the regimes have similarities – and can be approximated through the Court’s interpretative activity – they are structurally different. This divergence may lead to further fragmentation on top of the fragmentation that is to be expected from the national implementations of the complex mechanisms in Art. 17 CDSMD. To this we must add the application of the horizontal rules on content moderation liability and due diligence obligations arising from the DSA. In sum, the multi-level and multi-layered EU legal landscape on copyright content moderation that emerges from our mapping analysis is extremely complex.

Relatedly, as anticipated above, certain copyright content moderation issues of relevance remain unregulated in the copyright acquis, namely rules on measures (i) affecting the visibility and monetisation of content, and (ii) affecting a user’s ability to provide information, e.g. through the termination or suspension of their account. Although both categories are relevant, the issue of monetisation is, in our perspective, the most glaring regulatory gap, since “monetisation” actions play a central and financially consequential role in platforms’ content moderation practices. This topic should therefore be subject to further research and policy action in the near future.Footnote 15

Still, as regards regulatory gaps, it is important to underscore the complexity of the legal determinations and judgments required to assess human and algorithmic copyright content moderation practices. This strongly suggests a need for better transparency and access to data from platforms. In this regard, both the pre-existing regime prior to the DSA and Art. 17 CDSMD offer very little. As such, this is an area where serious consideration must be given to the potential application to OCSSPs and other copyright platforms of the DSA’s transparency provisions, as well as to national solutions that impose transparency and data access obligations on OCSSPs and non-OCSSPs alike. As regards the DSA, the data access and scrutiny obligations vis-à-vis researchers are of particular importance. As regards national law solutions, in our view, the German transposition law provides an interesting blueprint in Sec. 19(3) of the Act on the Copyright Liability of Online Content Sharing Service Providers (UrhDaG) in relation to rights to information.

Our legal analysis also includes a comparative study of ten selected national laws based on desk research and two expert questionnaires carried out before and after the CDSMD’s implementation deadline. Because of delays in national implementations during the period of our research and the detailed nature of the study, we refer readers to our report on this topic.Footnote 16 It is, however, worth noting the diversity of national implementations (ranging from “gold-plating” to quasi-verbatim transpositions), the inconsistency of certain national implementations with the CJEU judgment in C-401/19, and the likelihood that a number of preliminary references on different aspects of Art. 17 CDSMD will find their way to the CJEU in the short to medium term.

2.2 Private Regulation by Platforms: Empirical Research

The empirical component of our mapping analysis studied the copyright content moderation structures adopted by 15 social media platforms over time, with a focus on their terms and conditions (T&Cs) and automated systems. These platforms are grouped into: (i) mainstream – Facebook, YouTube, Twitter, Instagram and SoundCloud; (ii) alternative – Diaspora, DTube, Mastodon, Pixelfed and Audius; and (iii) specialised – Twitch, Vimeo, FanFiction, Dribbble and Pornhub.

Our analysis suggests that two dual processes explain the development of these structures. The first is complexification/opacification. Our empirical work indicates that virtually all 15 platforms’ T&Cs have become more intricate in various ways and to different extents. Over time, more (kinds of) rules have been introduced or made public, and these rules have been communicated in increasingly diverse sets of documents. These documents have been changed and tweaked several times, sometimes producing a plethora of versions, often located in a dense web of URLs. We therefore conclude that the way platforms organise, articulate and present their T&Cs matters greatly. For one, under increasing public and policy pressure, platforms have felt the need to express and explain their practices and rules of operation, and they have done so with complex and greatly varying documentation. For observers, although this provides more information about platforms, it nevertheless makes understanding the trajectory of platforms and their T&Cs extremely challenging. For example, with YouTube, a major actor when it comes to copyright, our database of its highly fragmented T&Cs has not proved robust enough to allow for a precise longitudinal examination of its rules. In that way, the very organisation and presentation of T&Cs should be understood as one element of platforms’ governance of content.

Substantively, we demonstrated that complexification can be radically distinct, depending on which platforms one considers. Very large ones, such as Meta/Facebook, experienced an almost continuous and drastic transformation; smaller ones, such as Diaspora, have barely changed. Yet, when a change did occur, it made those sets of rules more difficult to comprehend. Whilst our analysis did not take a longitudinal perspective on automated copyright content moderation systems, their emergence and eventual transformation into a central governance tool for various platforms is, in itself, an important element of a broader complexification process. These systems work at a scale that is hard to comprehend, through computational operations that are technically intricate, and under largely unjustified and seemingly arbitrary protocols on, e.g., how to appeal decisions. In other words, they are remarkably opaque, as are many of the T&Cs we studied. Our analysis pointed out that, while some complexification might in some cases be impossible to avoid, opacification is by no means necessary or justifiable. From this perspective, then, the imposition by law of rules on platforms’ internal content moderation procedures and their transparency (e.g. in the DSA) is sensible. It will be critical to ensure that these reporting obligations are rolled out in robust and detailed ways, so that they are instrumental to the clarification and understanding of such procedures and related decision-making.

The second process is platformisation/concentration. By categorising rules into what we have termed “normative types”, we argued that various platforms in our sample altered their rules so as to give themselves more power over copyright content moderation, usually by increasing the number of their obligations and rights, which were, in turn, largely aligned with their own interests, logics and technologies. We suggested that this could be interpreted as a particular example of the broader phenomenon of “platformisation”.Footnote 17 Nonetheless, our analysis argued that this transformation was by no means unidirectional. Platformisation enhances not only platforms’ power but also their responsibilities over content moderation. It was curious to note, therefore, that while emboldening their normative legitimacy to control copyright, platforms did not necessarily alter their discursive focus on user-oriented rules. As with complexification, platformisation has been experienced differently by different platforms and deepened by the rise of automated copyright content moderation systems, which may severely impair ordinary users’ ability to participate in and challenge removal decisions. That platformisation centralises power in the hands of platforms might be a truism – but our research also suggests that this process might end up giving more power to large rightsholders, to the detriment essentially of smaller rightsholders, (user-)creators and other users.Footnote 18 Nowhere was this clearer than in our study of Meta/Facebook’s “Rights Manager”, which does not appear to be accessible to small creators – a non-algorithmic bottleneck that has rarely been studied from an empirical perspective.Footnote 19 We have opened up our massive dataset on platform policies, reaching back to the platforms’ founding years in the mid-2000s,Footnote 20 in order to facilitate much-needed further research with this data on copyright issues and beyond.Footnote 21

3 Evaluation and Measuring: The Impact of Moderation Practices and Technologies on Access and Diversity

Building on our mapping work, the evaluative part of the analysis centres on a normative examination of the existing public and private legal frameworks with regard to intermediaries and cultural diversity, and on the actual impact of intermediaries’ content moderation on diversity. The evaluation analysis pursues two main objectives. First, it aims to explain and evaluate the existing legal frameworks (both public and private, existing and proposed) that shape the role of intermediaries in organising the circulation of culture and creative works in Europe, including in copyright content moderation. Second, it aims to explain, critically examine and evaluate the existing practices and technologies that intermediaries deploy to organise the circulation of culture and creative works in Europe, including in copyright content moderation. These objectives correspond to the two main components of our analysis: evaluating multi-level legal frameworks (3.1) and measuring the impact of moderation practices and technologies on access and diversity (3.2).

3.1 Evaluating Multi-Level Legal Frameworks

The first main component of our evaluative analysis involves a normative assessment of how legal rules and contractual terms on the moderation of copyright content on large-scale user-generated content (UGC) platforms, such as OCSSPs, affect digital access to culture and the creation of cultural value. We assess how such rules and terms shape the design of moderation by UGC platforms, the activities of creators and users, and the role of fundamental rights and freedoms in shaping these rules and terms. We also evaluate how the state-enacted rules in the DSM shape the emergence of private models for content moderation, examining how the production of law is shaped by the intrinsic characteristics and needs of the actors operating within the legal framework. Our research shows that the existing legal framework has increasingly focused on shaping the role of intermediaries in organising the circulation of culture and creative works in Europe, including through copyright content moderation.Footnote 22

3.1.1 Emerging Multi-Level Legal Framework and Its Complexity

One initial insight from our research relates to the increasing complexity of the landscape of interacting rules in this field. For instance, the relevant substantive copyright rules are contained in national copyright legislation, partly based on harmonising instruments such as the InfoSoc Directive. The relevant rules regarding intermediary or platform regulation are contained in Art. 17 CDSMD (and its national implementations) and in the e-Commerce Directive’s framework for intermediary liability exemptions in Arts. 12–15, now replaced and amended by the DSA (see Fig. 1).

Fig. 1 Relationship of substantive copyright rules and intermediary framework

In order to understand the regulatory (i.e. the law) and self-regulatory environment surrounding the moderation of online content, it is necessary to recall that Art. 14 e-Commerce Directive set forth the horizontal basic rules for an intermediary’s mandated response to illegal content, including copyright-infringing works. These rules have now been replaced by the corresponding provision in the DSA.Footnote 23

Notably, the e-Commerce Directive refrained from further specifying the notice-and-action regime. In this void (or, put more positively, freedom of operation), industry practices have emerged. These, in turn, now appear to be at least partly codified in Art. 17 CDSMD with regard to OCSSPs, and in the DSA with regard to other online platforms (or non-copyright services of the same platforms) that fall outside the scope of Art. 17 CDSMD (see Fig. 2).Footnote 24

Fig. 2 Relationship between rules on intermediaries and industry practices

One issue related to the regulatory framework concerns its complexity and the potential overlaps and interplay between its instruments. This is specifically relevant in the context of online platforms and copyright, where both Art. 17 CDSMD and the DSA specify and adjust platforms’ room for operation regarding content moderation, a relationship we have previously explored.Footnote 25 Further complexity is added by the aforementioned diversity in national implementations of Art. 17 CDSMD.Footnote 26

Besides this overlap, there are other notable areas where rules interact. Since content moderation also often involves the processing of personal data, for example, future research should look into the interplay between the General Data Protection Regulation (GDPR)Footnote 27 and the sector-specific CDSMD framework as well as the horizontal rules in the DSA. Since content moderation is regularly performed or supported by algorithmic means, a potential intersection with the Artificial Intelligence Act (AIA),Footnote 28 which was proposed on 21 April 2021, is of interest.Footnote 29 The proposed AIA introduces “rules regulating the placing on the market and putting into service of certain AI systems”Footnote 30 and focusses on the regulation of providers as well as users of such AI systems. In the context of copyright content moderation, the AIA is of interest given the broad and generic definition of AI systems in Art. 3(1) AIA proposal: “software that is developed with one or more of the techniques and approaches listed in Annex I and can, for a given set of human-defined objectives, generate outputs such as content, predictions, recommendations, or decisions influencing the environments they interact with”.Footnote 31 In our view, content moderation technology likely falls within the scope of this definition.

Furthermore, the scope of the proposed Regulation focuses on risks, inter alia, to the protection of the fundamental rights of the natural persons concerned.Footnote 32 Copyright content moderation might come with risks for, inter alia, freedom of expression or the arts. The AIA proposal differentiates between different types of risk: (i) AI systems that come with unacceptable risks are prohibited; (ii) high-risk AI systems are permitted but subject to specific obligations; (iii) AI systems with limited risk are subject to certain transparency obligations. None of these risk categories in the AIA proposal seems to encompass copyright content moderation at this stage. However, the Parliament’s compromise version of the AIA (from May 2023) sets forth specific rules for generative AI models applicable to copyright, both as regards transparency and disclosure obligations concerning copyright-protected materials in the training data, and safeguards for the moderation of illegal content generated by AI systems, including copyright-infringing content.Footnote 33 If these changes or similar ones are adopted in the final version of the Regulation, the AIA will become yet another significant legal instrument for the regulation of copyright (AI-generated) content moderation in EU law.

3.1.2 Assessment

Building on these insights, we first expand on the assessment of the regulatory environment and revisit the starting point for access to culture and the creation of cultural value. In doing so, we introduce the concept of “Rough Justice”, which acknowledges the difficulties and differences vis-à-vis a full “fair trial” setup and proposes a conceptualisation in terms of procedural rules, substantive rules and competences.Footnote 34

A second starting point for the legal evaluation is provided through analysing and evaluating the framework for the quality of automated copyright content moderation as put forward in the CDSMD and the DSA in light of erroneous decisions.Footnote 35 Our analysis shows that with regard to access to culture and cultural diversity, decision quality should be emphasised as a factor separate from ex post mitigation mechanisms. In our view, the principal question that arises is what error rate is acceptable under the legislative framework. After examining different explicit and implicit references to content moderation error rates in the DSA and Art. 17 CDSMD – including interpretations in the Commission’s Guidance, the AG’s Opinion and the CJEU judgment in Case C-401/19 – we conclude that the assessment of error rates in all these examples can only be contextual. A first factor should relate to the volume of content moderation decisions taken: the goal cannot only be a low percentage of errors, but rather a low number of actual “wrong” content moderation decisions. A second factor should relate to the “harm” caused by a wrong decision and whether such harm can be mitigated ex post. It is therefore suggested that decision quality should be a decisive factor, to be seen as separate from ex post mitigation mechanisms.
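
A simple hypothetical calculation (the figures below are invented purely for illustration) makes clear why the volume of decisions matters as much as the error percentage: a platform with a much lower error rate can still produce vastly more wrong decisions in absolute terms.

```python
# Illustrative only: hypothetical figures, not empirical data.
scenarios = {
    "very large platform": {"decisions_per_year": 500_000_000, "error_rate": 0.01},
    "small platform": {"decisions_per_year": 200_000, "error_rate": 0.05},
}

for name, s in scenarios.items():
    wrong_decisions = s["decisions_per_year"] * s["error_rate"]
    print(f"{name}: {s['error_rate']:.0%} error rate -> "
          f"{wrong_decisions:,.0f} wrong moderation decisions per year")

# very large platform: 1% error rate -> 5,000,000 wrong moderation decisions per year
# small platform: 5% error rate -> 10,000 wrong moderation decisions per year
```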

A third perspective relates to the realisation that copyright content moderation increasingly requires an understanding of contextual use, and to the question whether the potential risk of “bias carry-over” from datasets to content moderation is sufficiently addressed in the current framework.Footnote 36 Our analysis takes the first steps in exploring the possible links between conditional data access regimes and content moderation performed through data-intensive technologies such as fingerprinting and, within the realm of AI in general, machine learning algorithms in particular. More specifically, we look at whether current EU copyright rules may have the effect of favouring the propagation of bias present in input data to the algorithmic tools employed for content moderation, and what kind of measures could be adopted to mitigate this effect. Algorithmic content moderation is a powerful tool that may contribute to a fairer use of copyright material online. However, it may also embed most of the bias, errors and inaccuracies that characterise the information on which it has been trained. Therefore, if the users’ rights contained in Art. 17(7) CDSMD are to be given effective protection, simply indicating the expected results while omitting how these should be reached may not be sufficient. The problem of over-blocking is not simply a technical or technological issue. It is also a cultural, social and economic issue. Perhaps more than anything, it is an issue of power dynamics. Recognising parody, criticism and review as “user rights”, as the CJEU does in Case C-401/19, may be a first step towards the strengthening of users’ prerogatives. But the road to a situation of power symmetry between users, platforms and rightsholders seems a long one. Ensuring that bias and errors concealed in technological opacity do not circumvent such recognition and render Art. 17(7) ineffective in practice would be a logical second step.
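
The mechanism of “bias carry-over” can be illustrated with a deliberately simplified sketch (all categories, labels and thresholds below are invented): a moderation tool that derives its blocking policy from historical takedown data in which parodies were systematically, and wrongly, flagged as infringing will reproduce that bias against future parody uploads, even though parody is a use protected under Art. 17(7) CDSMD.

```python
from collections import defaultdict

# Hypothetical historical takedown data used to derive a naive blocking policy.
# Parodies were frequently (and wrongly) flagged as infringing in the past.
training_data = [
    ("verbatim_copy", True), ("verbatim_copy", True), ("verbatim_copy", True),
    ("parody", True), ("parody", True), ("parody", False),
    ("original_work", False), ("original_work", False), ("original_work", False),
]

# "Learn" a per-category infringement rate from the (biased) historical labels.
counts = defaultdict(lambda: [0, 0])  # category -> [flagged as infringing, total]
for category, flagged_infringing in training_data:
    counts[category][0] += int(flagged_infringing)
    counts[category][1] += 1

BLOCK_IF_RATE_ABOVE = 0.5  # hypothetical policy threshold

for category, (flagged, total) in counts.items():
    rate = flagged / total
    decision = "block future uploads" if rate > BLOCK_IF_RATE_ABOVE else "allow"
    print(f"{category}: historical flag rate {rate:.0%} -> {decision}")

# The biased labels lead the tool to block parody uploads (flag rate 67%), carrying the
# bias of the historical data over into future moderation decisions.
```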

3.2 Measuring the Impact of Moderation Practices and Technologies on Access and Diversity

The second main component of our analysis is an attempt to measure the impact of copyright content moderation on access and diversity. During the course of our project and the implementation period of the CDSMD it became clear that online platforms play a crucial role in contemporary societies, whilst AI technologies are increasingly presented as solutions to core societal problems. Under increasing public and political pressure, social media platforms have expanded their efforts to moderate the content they host. To do so, they have invested both in growing numbers of human moderatorsFootnote 37 and in algorithmic moderation.

The empirical component of our research attempts to gauge the impact of increasing content moderation practices, policies and technologies, including for copyright, and of the CDSMD, on access to culture and diversity. From the legal perspective, as pointed out above, Art. 17 CDSMD poses serious concerns as regards the freedom of expression implications of preventive filtering and over-blocking. This concern is amplified by the lack of transparency surrounding private platforms’ algorithmic moderation systems. This raises the stakes for better understanding how platforms and copyright content moderation impact diversity and access to culture in the DSM.

This empirical part of the project tackles three dimensions of this problem with three sub-studies.

In the first sub-study, we investigate the historical evolution and current situation of transparency reporting with a focus on copyright content moderation.Footnote 38 We further examine the convergence and divergence in the content moderation practices of social media platforms, along with their transparency habits in a broader sense, drawing also on substantial amounts of content moderation data. Our analysis highlights that transparency reporting has a number of important limitations that potentially jeopardise the perceived accountability of platforms and the positive effects of such reporting on their legitimacy in the eyes of external stakeholders. We conclude that there is still significant room for improvement regarding platforms’ transparency practices, as well as their moderation practices. Better quality and, potentially, a standardisation of transparency practices by platforms would be crucial for a better understanding and assessment of their copyright content moderation and, as a result, for evidence-based policymaking in this area.

In the second sub-study, we analysed content-level platform data with regard to changes in, and factors of, cultural diversity on social media and streaming platforms.Footnote 39 This empirical study investigates the changes and influences in access and cultural diversity on social media and streaming platforms, specifically YouTube, in the timeframe 2019 to 2022, focusing on the period between the approval of the CDSMD and the end of 2022, when most national implementation laws in Member States had just been passed or were still in the final stages of discussion. This study reaches three main conclusions:

  • First, we found a high share of blocked and deleted content in our sample, higher than that found in previous research. Due to restricted access to data, it is difficult to pin down and isolate the exact reasons for content deletion and take-down. As such, caution is advised when interpreting these results.

  • Second, we have found a general decrease in diversity with regard to available content. Within the four countries studied (Estonia, France, Germany and Ireland), three display a noticeable decrease in the diversity index, while Ireland represents a contrary development with a slight increase (for an illustration of how such a diversity index can be computed, see the sketch after this list). The country differences do not, however, correlate with national differences in copyright law, and specifically with the variation in substance and timing of the national implementation of the CDSMD. This makes it hard to assess and isolate the actual impact of copyright content moderation and the implementation of the CDSMD on content diversity.Footnote 40 Future and continuing research is needed to assess these questions when the legislative implementations become effective and visible at full scale.

  • Third, and most important from our perspective, we have been confronted with the limitations of research in this space due to a lack of data access. In the current landscape, it is close to impossible to systematically study the central question posed in this project: what is the impact of copyright regulation and content moderation on content diversity? In fact, this research is not only highly limited, but also dependent on platforms’ internal decisions about giving access to (different types of) data. This is a common refrain also for our legal research. Hence, there is an urgent need for more robust rules on data access for researchers.
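
For readers unfamiliar with such measures, the sketch below shows one common way a content diversity index can be computed. The report does not specify the exact index used in the study, so the normalised Shannon entropy and the category counts below are merely an illustrative assumption: the index falls when removals concentrate the remaining content in a few categories.

```python
import math

def diversity_index(category_counts: dict) -> float:
    """Normalised Shannon entropy over content categories: 1.0 means content is spread
    evenly across categories, values closer to 0 mean a few categories dominate."""
    total = sum(category_counts.values())
    probs = [count / total for count in category_counts.values() if count > 0]
    entropy = -sum(p * math.log(p) for p in probs)
    return entropy / math.log(len(category_counts)) if len(category_counts) > 1 else 0.0

# Hypothetical counts of available videos per category before and after a wave of removals.
before = {"music": 400, "gaming": 350, "news": 300, "comedy": 250, "education": 200}
after = {"music": 380, "gaming": 340, "news": 120, "comedy": 60, "education": 40}

print(f"diversity before removals: {diversity_index(before):.3f}")  # ~0.98
print(f"diversity after removals:  {diversity_index(after):.3f}")   # ~0.81, i.e. less diverse
```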

Finally, in the third sub-study, we have explored creators’ understanding and experiences of copyright moderation in relation to their creative work and the labour of media production on social media platforms.Footnote 41 The main takeaway from our study is that users of social media platforms who do creative work are influenced by algorithmic content moderation. Perhaps our most important finding, which extends the understanding of how algorithmic content moderation influences creative work on platforms, is that creators engage in self-censorship. That is to say, creators avoid posting certain content or adjust it in advance in order to cater to the perceived functioning of platforms’ algorithmic content moderation. For many artists, anticipation of platform “punishments” (i.e. restrictive moderation actions) directly influenced the cultural products that they produced. In addition, because the regulative dimension of algorithmic copyright moderation is opaque to creators, they engage in “algorithmic gossip”Footnote 42 and rely on folk theoriesFootnote 43 to try to guess which practices are accepted and which are not. There are important policy implications from this research, such as the need for more transparency in platform governance, from both policymakers and platforms, so that automated content moderation does not add to the uncertainty and insecurity of creators’ media production work on social media platforms.

Overall, the results of our empirical research indicate a strong impact of copyright regulation and content moderation on diversity, and potentially an impact that leads to a decrease in the diversity of content. Yet, the research has also shown that these interpretations cannot be fully verified based on the limited data that is available to researchers and the public. A common theme we highlight and return to in our recommendations is the need for further research on issues of diversity and access on social media platforms, given their high relevance for European societies, as well as their complex nature, specifically in the context of contemporary fragmented media landscapes. Mandatory data access clauses such as those included in the German Network Enforcement Act (NetzDG), the German CDSMD implementationFootnote 44 and the DSA pave an important avenue forward in this regard. Yet, it remains to be seen how robust and effective these clauses are, since they demand the highest levels of data security and infrastructure facilities on the part of researchers and their institutions. Finding practical and fair solutions as well as best practices for data access that are not accessible only to researchers at elite and perfectly equipped institutions is a key challenge for policy and research in the next decade. Consequently, we conclude with a strong call for robust mandatory data access clauses in future regulations.

4 Recommendations for Future Policy Actions

In the following, we summarise the key recommendations for future policy actions based on our research.


Definition of OCSSPs

  • Considering the potential for legal uncertainty and fragmentation of the digital single market as regards copyright content moderation, we recommend that the Commission review its Guidance on Art. 17 CDSMD (COM/2021/288 final) in order to provide clearer guidelines on the definition of OCSSPs, especially for small and medium-sized online platforms, and coordinate its application across Member States.


User rights – recognition

  • National legislators should review their national transpositions of Art. 17 CDSMD to fully recognise the nature of the exceptions and limitations in paragraph (7) as “user rights” in accordance with CJEU jurisprudence, rather than mere defences.


User rights – operationalisation

  • We further recommend that the Commission reviews its Guidance in order to provide guidelines from the perspective of EU law as to the concrete implications of a “user rights” implementation of paragraph (7) in national laws. This should include, to the extent possible, concrete guidance on what type of actions users and their representatives (e.g. consumer organisations) may take against OCSSPs to protect their rights.


Complaint and redress safeguards – complementary nature

  • National legislators should review their national transpositions of Art. 17 CDSMD to ensure that ex post complaint and redress mechanisms under paragraph (9) are not the only means to ensure the application of users’ rights, but rather a complementary means, in line with the Court’s judgment in case C-401/19.

  • We further recommend that the Commission’s Guidance is updated to fully reflect the Court’s approach in case C-401/19, as regards the complementary role of complaint and redress mechanisms under paragraph (9).


Permissible preventive filtering

  • The Commission should review its Guidance to clearly align it with the Court’s judgment in C-401/19, namely by clarifying that: (1) OCSSPs can only deploy ex ante filtering/blocking measures if their content moderation systems can distinguish lawful from unlawful content without the need for its “independent assessment” by the providers; (2) such measures can only be deployed for a clearly defined category of “manifestly infringing” content and a strictly defined category of “equivalent” content; and (3) such measures cannot be deployed for other categories of content, such as (non-manifestly infringing) “earmarked content”. Member States should further adjust their national implementations of Art. 17 CDSMD to reflect these principles.

  • In implementing these principles, the Commission and Member States could take into consideration the approach proposed by the AG’s Opinion on how to limit the application of filters to manifestly infringing or “equivalent” content, including the consequence that all other uploads should benefit from a “presumption of lawfulness” and be subject to the ex ante and ex post safeguards embedded in Art. 17, notably judicial review. In particular, the AG emphasised the legislature’s main aim of avoiding over-blocking by securing a low rate of “false positives”. Considering the requirements of the judgment, and in order to determine acceptable error rates for content filtering tools, this approach implies that the concept of “manifestly infringing” content should only be applied to uploaded content that is identical or nearly identical to the information provided by the rightsholder that meets the requirements of Art. 17(4)(b) and (c) CDSMD.


Relationship of Art. 17 CDSMD and the DSA – clarification

  • The Commission should review its Guidance to clarify which provisions in the DSA’s liability framework and due diligence obligations Chapters apply to OCSSPs despite the lex specialis of Art. 17 CDSMD, within the limits of the Commission’s competence as outlined in Art. 17(10) CDSMD.


Relationship of Art. 17 CDSMD and the DSA – terms and conditions and fundamental rights

  • The Commission should clarify in its Guidance that the obligations of Art. 14 DSA apply to OCSSPs, in particular the obligation in paragraph (4) to apply and enforce content moderation restrictions with due regard to the fundamental rights of the recipients of the service, such as freedom of expression. The authorities and courts of the Member States should equally interpret their national law in a manner consistent with the application of Art. 14 DSA to OCSSPs.


Monetisation and restrictive content moderation actions

  • At the EU level, EU institutions and in particular the Commission should explore to what extent the copyright acquis already contains rules addressing content moderation actions relating to monetisation and related restrictive content moderation actions (e.g. shadow banning and downranking) of copyright-protected content on online platforms (e.g. in Arts. 18 to 23 CDSMD), and to what extent policy action is needed in this area. Further research is needed specifically on the imbalanced nature of the contractual relationship between online platforms and uploading users, as well as on the transparency and fairness of their remuneration.


Recommender systems and copyright content moderation

  • Although our research has focussed on issues of content moderation, we note the related but separate issue of content recommendation.Footnote 45 Whereas the two phenomena are somewhat related, they raise a different set of issues and perspectives. We note that more research is needed in the field of copyright content recommendation, as well as on the role of copyright in content recommendation, with a view to access and diversity. We therefore recommend that the EU institutions (e.g. the Commission through its Joint Research Centre) take steps to carry out such research.


Transparency and robust data access for researchers

  • At the EU level, EU institutions and in particular the Commission should explore the application of the DSA’s provisions on transparency and access to data to OCSSPs and non-OCSSPs hosting copyright-protected content (see Art. 40 DSA on data access and scrutinyFootnote 46), as well as study and, if adequate, propose EU-level action that imposes transparency and data access obligations on online platforms regarding their copyright content moderation activities. Inspiration could be drawn from the design and implementation of the German national transposition law under Sec. 19(3) Act on the Copyright Liability of Online Content Sharing Service Providers (UrhDaG) as regards rights to information. In that context, special care should be taken to: (1) ensure mandatory rules for data access for researchers; (2) carefully define the scope of beneficiary researchers, research institutions and research activities so as not to be overly restrictive; (3) design a regime that avoids the potential negative effects of requiring researchers to reimburse the platforms’ costs related to complying with such requests; and (4) fund and support academic initiatives to build up collaborations and institutional capacity to develop and coordinate the necessary expertise and infrastructure to process this data, including database creation and secure processes for data access. To the extent possible, the Commission should advance recommendations in this direction in its revised version of the Guidance on Art. 17 CDSMD.


Trade secret protection and transparency of content moderation systems

  • In order to make transparency meaningful, in our view, proper account must be taken of trade secret protection, which likely extends to different aspects of human and algorithmic copyright content moderation by platforms.Footnote 47 Consequently, achieving meaningful transparency in this area will likely require legislative intervention that exempts platforms’ algorithmic moderation systems from trade secret protection, at least for purposes of data access and scrutiny by researchers and policymakers. EU institutions and in particular the Commission should explore the limitations of the current legal framework in this respect and propose the required legislative intervention to ensure this access.


Relationship Art. 17 CDSMD and AI Act Proposal

  • We recommend that the Commission study the legal interplay between legislation on AI and platform regulation, in particular the issue of whether and to what extent algorithmic content moderation systems might be covered by the AIA proposal. Any such study should consider the future scenario and potential impact of algorithmic content moderation systems that rely on machine learning and will be deployed to assess contextual uses covered by user rights under Art. 17(7) CDSMD, and how this might affect the permissibility of preventive filtering measures.


Human competences in copyright content moderation

  • Our research indicates that the competences of human moderators directly impact the quality of the content moderation system. This much is recognised in the DSA, the CDSMD, expert recommendations and the codes we reviewed, which require human review at a minimum in the appeal process, partly as a means to mitigate the risks of automated content moderation. From our viewpoint, a certain level of human involvement should also be required to reduce biases and errors and ensure accuracy in the first stage of automated moderation. One way to achieve this would be to mandate or incentivise random accuracy tests by human intervention at this stage, as illustrated in the sketch below. We therefore recommend that the Commission explore the best practices and mechanisms to mandate or incentivise such random accuracy tests for OCSSPs.
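
As a simple illustration of what such random accuracy testing could look like in practice (a sketch under assumptions: the sample size, the share of wrong decisions and all names below are invented), a platform could draw a random sample of its first-stage automated decisions, have human moderators re-assess them, and report the estimated error rate:

```python
import random

def sample_for_human_audit(decision_ids, sample_size, seed=0):
    """Draw a random sample of automated moderation decisions for human review."""
    rng = random.Random(seed)
    return rng.sample(decision_ids, min(sample_size, len(decision_ids)))

def estimated_error_rate(human_verdicts):
    """Share of sampled automated decisions that human reviewers judged to be wrong."""
    return sum(human_verdicts) / len(human_verdicts) if human_verdicts else 0.0

# Hypothetical week of first-stage automated decisions and a 200-decision human audit.
decisions = [f"decision-{i}" for i in range(100_000)]
audited = sample_for_human_audit(decisions, sample_size=200)

audit_rng = random.Random(42)
verdicts = [audit_rng.random() < 0.04 for _ in audited]  # roughly 4% judged wrong (made up)

print(f"audited {len(audited)} of {len(decisions):,} decisions; "
      f"estimated error rate: {estimated_error_rate(verdicts):.1%}")
```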