1 The Copyright/DSA Interface

On 15 December 2020, the European Commission submitted a proposal for a Regulation on a Single Market for Digital Services (Digital Services Act, DSA) and amending Directive 2000/31/EC.Footnote 1 In November 2021, the Council of the European Union reached agreement on an amended version of this proposal,Footnote 2 and on 20 December 2021 the European Parliament’s Committee on the Internal Market and Consumer Protection (IMCO) released a draft for an EP legislative resolution.Footnote 3 The legislative project “seeks to ensure the best conditions for the provision of innovative digital services in the internal market, to contribute to online safety and the protection of fundamental rights, and to set a robust and durable governance structure for the effective supervision of providers of intermediary services”.Footnote 4 To achieve these aims, the DSA sets out numerous due diligence obligations for intermediaries concerning any type of illegal information, including copyright-infringing content.Footnote 5

Empirically, copyright law accounts for most content removal from online platforms, by an order of magnitude. While currently there are few reporting obligations that would allow a consolidated picture to emerge, voluntary reporting (such as Google’s transparency reportFootnote 6) and specific obligations under some national laws (such as Germany’s Network Enforcement ActFootnote 7) give a clear indication of scale. Copyright removals number in the billions; privacy removals (under EU law) in the millions; government-initiated removals and removals under the German Network Enforcement Act’s list of criminal offences in the tens of thousands (per annum).Footnote 8

Thus, copyright enforcement online is a major issue in the context of the DSA, and the DSA will be of utmost importance for the future of online copyright in the EU. Against this background, the European Copyright Society (ECS) takes this opportunity to share its view on the relationship between the copyright acquis and the DSA (Sect. 2) and on further selected aspects of the DSA from a copyright perspective (Sect. 3).

2 The Relationship Between the Copyright Acquis and the DSA

2.1 In General

A preliminary point on the relationship between the DSA and the EU copyright acquis concerns the potential scope of overlapping rules. In the context of this Opinion, we are only concerned with DSA rules that govern copyright-infringing content as a type of “illegal content”Footnote 9 transmitted, hosted or allowed to be searched and found by an “intermediary service” covered by the DSA, i.e. mere conduit, caching, hosting, online platform and online search engine services.Footnote 10

Current EU copyright law already covers this regulatory space. In particular, it provides for a multi-level approach to online platforms. If a platform qualifies as an “online content sharing service provider” (OCSSP) according to Art. 2(6) of the Copyright in the Digital Single Market Directive (CDSMD)Footnote 11 and supporting recitals, it is subject to the lex specialis of Art. 17 CDSMD, which sets out a special regime in relation to the right of communication to the public in Art. 3 InfoSoc DirectiveFootnote 12 and the liability exemption for hosting services in Art. 14 E-Commerce Directive.Footnote 13 Conversely, if a platform does not qualify as an OCSSP, it is subject to the pre-existing regime as interpreted by the Court of Justice of the European Union (CJEU), most recently in YouTube and Cyando,Footnote 14 and to the rules in the InfoSoc and E-Commerce Directives (the latter to be replaced by the DSA, in particular Art. 5 as regards the liability exemption for hosting service providers).

According to Art. 1(5)(c) DSA, the future regulation will be “without prejudice to the rules laid down by […] Union law on copyright and related rights”. Recital 11 of the Commission proposal adds that to the extent the copyright acquis establishes “specific rules and procedures”, those “should remain unaffected”.Footnote 15 Recital 9 provides additional guidance by setting out the general principle governing the relationship between the DSA and sector-specific legislation. According to this principle, the DSA “should complement, yet not affect the application of rules resulting from other acts of Union law regulating certain aspects of the provision of intermediary services”. This has two implications, spelled out in the same recital. On the one hand, the DSA “leaves those other acts, which are to be considered lex specialis in relation to the generally applicable framework set out in this Regulation, unaffected”. On the other hand, the rules of the DSA shall “apply in respect of issues that are not or not fully addressed by those other acts as well as issues on which those other acts leave Member States the possibility of adopting certain measures at national level”.Footnote 16 From our perspective, the changes proposed by the Council to Recitals 9 to 11 do not affect these conclusions.Footnote 17 In particular, Recital 11 is amended to the effect that the DSA “is without prejudice to the specific rules and procedures governing liability of providers of intermediary services set in” the InfoSoc and CDSM Directives. The recital aims to clarify the applicability of the sector-specific liability regimes (including procedures such as notice-and-action) to OCSSP and non-OCSSP copyright hosting platforms. But this language is not materially different from that of the Commission proposal. In other words, it does not set aside the complementary application of the DSA’s rules and procedures to those platforms.
The point is illustrated below with the rules on notices and transparency.

From the joint reading of these provisions it emerges, first, that the bifurcated or multi-level rules on online platforms hosting and disseminating copyright protected content – under the InfoSoc/E-Commerce Directives and Art. 17 CDSMD – are lex specialis to the DSA. Second, that such lex specialis does not preclude the application of the DSA in certain cases to copyright content-sharing platforms, whether or not they qualify as OCSSPs. Since most of the rules applying to non-OCSSPs are not specified in the legislative instruments of the copyright acquis (namely Arts. 3 and 8(3) InfoSoc Directive), the most challenging questions concern the application of the DSA to OCSSPs, which are subject to more detailed rules in Art. 17 CDSMD.

In light of the above, there are two categories of rules in the DSA that will apply to OCSSPs. First, the straightforward case of DSA rules regulating matters not addressed in Art. 17 CDSMD. Second, the less clear case of specific DSA rules on issues that Art. 17 CDSMD touches upon but in relation to which it is not as detailed as the DSA and leaves Member States with a margin of discretion. In view of this lack of clarity, we welcome the IMCO/EP proposal to introduce a new obligation on the Commission to publish guidelines on the relationship between the DSA and sector-specific legal acts such as the CDSMD.Footnote 18

2.2 “Sufficiently Substantiated Notices” as an Example

One example that helps illustrate this point concerns the regime on notices giving rise to liability. The best-efforts obligations in Art. 17(4)(b) and (c) CDSMD are conditional upon the provision of information by the rightholder: “relevant and necessary information” under lit (b); and “a sufficiently substantiated notice” under lit (c). Article 17(9) CDSMD adds that rightholders who request to have access to their specific works or other subject-matter disabled or removed “shall duly justify the reasons for their requests”. The Directive’s recitals do not add much in this respect. In other words, Art. 17 CDSMD merely indicates what information is needed, requires that it originate from the rightholder or his/her representative, and demands that a notice be “sufficiently substantiated”. Recognising this gap, the Commission Guidance advances concrete recommendations on the content of such a notice, namely that it follow the horizontal procedures in the 2018 Recommendation on Measures to Effectively Tackle Illegal Content Online.Footnote 19

The problems arising from the lack of harmonisation of a notice-and-action regime, including procedural rules on notices and counter-notices as regards illegal content, are detailed throughout the Commission’s DSA Impact AssessmentFootnote 20 and ultimately influenced several key provisions in the DSA, including those on notice-and-action (Art. 14), statement of reasons (Art. 15), trusted flaggers (Art. 19), and measures and protection against misuse (Art. 20). These rules add a level of specificity not found in the lex specialis of Art. 17 CDSMD, for instance as to the minimum content of a notice in Art. 14(2) DSA. The relationship between the CDSMD and the DSA, the harmonising aims of the DSA in general and of its notice-and-action procedures in particular, and the legal nature of the DSA as a regulation all lead to the conclusion that these more specific rules should apply to OCSSPs, in addition to the more general provisions in Art. 17 CDSMD. The fact that the Commission’s Guidance attempts to fill the legislative gap on notice-and-action by reference to the generic 2018 Illegal Content RecommendationFootnote 21 appears to confirm this understanding.

2.3 Transparency

The DSA introduces a number of ambitious transparency provisions. They are structured as a baseline reporting obligation (Art. 13) for all providers of intermediary services “to make publicly available in a specific section in their online interface, at least once a year, clear and easily comprehensible reports on content moderation” (Council text). Articles 29–33 then specify additional requirements for very large online platforms (VLOPs) and, according to the Council text, also for very large online search engines (VLOSEs, Art. 33a). With respect to online advertising, Art. 30 requires VLOPs and VLOSEs to publish a registry detailing advertising sold on their service. Article 31 regulates access to and scrutiny of data, including for “vetted researchers … for the sole purpose of conducting research that contributes to the detection, identification and understanding of systemic risks in the Union” (Art. 31(2)).Footnote 22 Articles 32 and 33 deal with the role of compliance officers and additional transparency obligations with respect to resources dedicated to content moderation and reporting on risk assessment and mitigation.Footnote 23

It is our view that these new transparency obligations (apart from those relating to online advertising) apply with respect to copyright content moderation, even to platforms that qualify as OCSSPs under the CDSMD (and thus fall under the lex specialis of Art. 17). While the CDSMD regime requires under Art. 17(8) that OCSSPs provide rightholders with “(a) adequate information on the functioning of tools that they apply to ensure the unavailability of unauthorised content under Article 17(4); and (b) information, where licensing agreements are concluded between service providers and rightholders, on the use of their content covered by the agreements”,Footnote 24 requirements in the other direction (safeguards for legitimate uses of content) are vague or ambiguous, i.e. “not fully addressed”, and thus within the scope of the DSA (supra, 2.2).

The Commission Guidance on Art. 17 CDSMD admits as much, devoting an entire section to the lack of certainty about the application and enforceability of safeguards for legitimate uses of content. While Art. 17(7) and (9) provide “that any action undertaken together by service providers and rightholders does not lead to the unavailability of content”, Art. 17 CDSMD specifies no mechanism for achieving this. The Commission goes on to recommend reporting requirements at the national level: “Regular reports on the content blocked as a result of the application of automated tools in fulfilment of requests by rightholders would allow the Member States to monitor whether Article 17 has been properly applied, in particular Article 17(8) and (9). This would also notably allow users’ representatives to monitor and contest the application of parameters defined by service providers in cooperation with rightholders to detect systematic abuses.”Footnote 25 Earlier in the Guidance, the Commission articulates this only weakly, as a “could” aspiration: “In order to enhance legal certainty, the Member States could encourage both online content-sharing service providers and rightholders to provide information to users on the content covered by rightholders’ authorisations, leaving it to all those concerned to decide how best to make it known that an authorisation is in place. Such transparency could contribute to avoid the risk of blocking of legitimate uses.”Footnote 26

The German implementation of Art. 17 CDSMD (which in some ways anticipated, if not shaped, the Commission Guidance) does indeed include a provision on rights to information (Act on the Copyright Liability of Online Content Sharing Service Providers [Urheberrechts-Diensteanbieter-Gesetz – UrhDaG], Sec. 19 “Rights to information”) that closely implements Art. 17(8) CDSMD, but it also adds a clause (Sec. 19(3)) that opens service providers’ content moderation processes and decisions to scrutiny by researchers, clearly with the aim of addressing the imbalance in transparency with respect to safeguards for the legitimate use of content.Footnote 27 Arguably, Art. 31 DSA will have just this effect, regardless of the routes to Art. 17 CDSMD implementation taken by Member States.

Transparency obligations are generally an important regulatory tool in cases of information asymmetries, and well understood in this context (for example, with respect to financial markets). As part of a due diligence and due process regime for online intermediaries, they acquire a new dimension. Powerful private firms are becoming proxies for the exercise of state and regulatory power via new duties and codes. Transparency safeguards against automated or discretionary moderation decisions need to be examined closely.Footnote 28

3 Critical Assessment of Selected DSA Provisions from a Copyright Perspective

This section provides a critical assessment of selected DSA provisions from the perspective of EU copyright law.

3.1 Search Engines

The first issue concerns the legal status of providers of online search engines. Search engines play a prominent role in copyright enforcement online. The harder it is to find infringing sources, the less frequently these will be accessed. Google, by far the most-used search engine in the EU, has for many years delisted URLs upon notices of alleged copyright infringement. This procedure is based upon the U.S. Digital Millennium Copyright Act (DMCA) and implemented on a worldwide scale. As of November 2021, Google reports to have received delisting requests under this scheme for more than 5.4 billion URLs in total.Footnote 29 More than 95% of these requests led to the delisting of the notified URLs.Footnote 30

Less clearly established is the legal status of search engines under current EU copyright law. According to the case law of the CJEU, the posting of a hyperlink for profit to protected content that is freely available on another website without the consent of the copyright holder constitutes a “communication to the public” under Art. 3(1) InfoSoc Directive, unless the underlying presumption of full knowledge of the infringing nature of the source on the part of the person posting the hyperlink can be rebutted.Footnote 31 On the basis of this doctrine, the German Federal Supreme Court (Bundesgerichtshof) declined to hold the provider of a picture search engine directly liable for a link to an unauthorised source.Footnote 32 The court found, however, that the search engine provider is subject to the rules on “interferer liability” (Störerhaftung), with the consequence that, upon a notice, copyright-infringing URLs have to be delisted from the search results.Footnote 33

In spite of the significance of search engines for a safe, predictable and trusted online environment, these services are not explicitly addressed in the Commission’s DSA proposal. The IMCO/EP draft resolution merely proposes to add a new recital according to which “a search engine could act solely as a ‘caching’ service as to information included in the results of an inquiry”, whereas “[e]lements displayed alongside those results, such as online advertisements, would however still qualify as a hosting service”.Footnote 34 The Council, finally, proposes to codify a definition of search engines and to subject their providers to the liability rules for “caching” services.Footnote 35 Consequently, Arts. 8 and 9, regarding orders issued by competent authorities to act against illegal content and to provide information, as well as the basic due diligence obligations under Chapter III Sec. 1 (Arts. 10–13), regarding electronic points of contact, legal representatives, terms and conditions and transparency reporting, would apply to search engine providers. Whereas the due diligence obligations of providers of hosting services, online platforms and online marketplaces (Arts. 14–24c) would not be applicable, the Council wants to extend the rules on VLOPs to VLOSEs (Art. 33a).Footnote 36

All these approaches towards search engines appear problematic. On the one hand, the aims of the DSA can hardly be achieved if search engines are not specifically regulated. On the other hand, the approach adopted by the Council also fails to establish an adequate regulatory framework for current search engine practice, in particular the massive-scale notice and delisting procedures and the handling of complaints by recipients whose content is delisted or demoted in ranking. Neither Art. 14 nor Arts. 17–20 and 23 of the Council proposal, which address these practices in the cases of hosting services and online platforms, would be applicable to search engines. Absent such medium-level due diligence obligations, the implications of the already vague high-level systemic risk provisions (Arts. 26 and 27) for search engines are completely unclear. According to these rules, VLOSEs will have to mitigate inter alia the systemic risk of the dissemination of illegal content through their services, also by “adapting” their content moderation processes.Footnote 37 These content moderation processes are, however, largely unregulated by the DSA in the first place. In order to address this problem, the DSA would have to be complemented with tailor-made, medium-level due diligence obligations for search engines.

3.2 The Concept of “Public”

Another issue relating to the scope of application of DSA rules concerns the concept of the “public”, as defined in Art. 2(i) DSA. This definition is relevant in particular for the notion of “online platform” and thus for the scope of application of the due diligence obligations in Chapter III Sec. 3 DSA. An “online platform” is a provider of a hosting service which, at the request of a recipient of the service, stores and “disseminates to the public” information.Footnote 38 “Dissemination to the public” is defined as “making information available … to a potentially unlimited number of third parties”.Footnote 39

The criterion of a “potentially unlimited number” of third parties has been criticised for excluding services like Telegram groups from the scope of the platform-related norms of the DSA simply because such channels, in spite of the fact that they involve individual and societal risks addressed by the DSA,Footnote 40 have a fixed maximum of 200,000 recipients.Footnote 41

To avoid such flaws and ensure consistency with the sector-specific rules of EU copyright law, Art. 2(i) DSA could incorporate the more flexible and functional concept of “public” as developed in the case law of the CJEU concerning the right of communication to the public under Art. 3 InfoSoc Directive. The number of recipients would then not need to be “potentially unlimited” but “indeterminate” and “(fairly) large”.Footnote 42

3.3 Preventive Measures

Another highly contentious issue in the context of both the DSA and copyright law pertains to the legality of preventive measures, in particular automated content moderation activities.Footnote 43

3.3.1 No General Monitoring or Active Fact-Finding Obligations

In this regard, it is to be noted that Art. 7 DSA confirms the prohibition of general monitoring or active fact-finding obligations.Footnote 44 The language is essentially the same as that of Art. 15(1) of the E-Commerce Directive. Hence the prohibition of general monitoring and active fact-finding obligations continues to apply to hosting and other intermediary service providers. This prohibition extends to obligations imposed through provisions of both national and EU law within the DSA’s scope. It therefore applies to orders of authorities, injunctions issued by courts, and risk mitigation measures imposed by the DSA. On the other hand, Recital 28 DSA confirms that obligations imposed on providers to monitor in specific cases do not fall foul of the prohibition in Art. 7 DSA.Footnote 45 The DSA does not specify the line between general and specific monitoring, leaving this task largely to the CJEU, as has been the case until now.

3.3.2 Preventive Measures and Art. 17 CDSMD

The legality of automated, preventive content moderation tools also lies at the heart of the debate about Art. 17 CDSMD. That provision is a complex mixture of ex ante preventive obligations and ex post notice-and-take-down measures, while still operating under the prohibition on general monitoring obligations. Given that the CDSMD is generally lex specialis to the DSA, it is plausible that the scope of permissible specific monitoring in the context of Art. 17 – regarding copyright protected works and subject-matter available on OCSSPs – is broader than under Art. 7 DSA – regarding other types of illegal content.

In his Opinion in Case C-401/19,Footnote 46 AG Saugmandsgaard Øe built upon the reasoning of the CJEU’s judgment in Glawischnig-Piesczek, which, however, concerned not copyright but defamation. The AG argued that the only acceptable ex ante blocking of content uploaded by users of a platform is that of information already established as illegal (by a court) or that is manifestly illegal (“obvious from the outset”).Footnote 47 According to this view, the filtering of identical or equivalent content ought to be considered lawful. This conclusion broadens the scope of accepted specific filtering measures but stops short of equating them with a general monitoring obligation. Whether this reading would apply only to measures imposed within the scope of Art. 17 CDSMD or more broadly to other types of illegal content within the scope of Art. 7 DSA remains open to discussion.

Importantly, the intuition behind these arguments is that the CJEU in Glawischnig-Piesczek linked the scope of “identical or equivalent” information to the absence of any need for additional human assessment. This suggests that the Court is drawing a distinction between types of infringement whose detection can be automated without significant error rates and those where it cannot. Article 7 DSA does not resolve this and leaves the issue to the courts, and ultimately the CJEU.

3.3.3 Risk Mitigation Under Art. 27 DSA

Preventive measures, including the use of automated content moderation tools, are also key to the special set of due diligence obligations the DSA sets out for very large intermediary service providers, namely VLOPs and – according to the Council position – VLOSEs. Under Art. 26 DSA, VLOPs and VLOSEs shall “identify, analyse and assess” any systemic risks stemming from the functioning and use of their services in the Union. This includes looking at risks posed by “the dissemination of illegal content”. From the copyright perspective, this means that video, picture or other content sharing services and search engines would be under an obligation to assess copyright infringements as part of their risk management obligations.

Such a risk assessment will not consider individual instances of copyright infringement but systemic misuses of services. Because providers must equally consider systemic risks to the freedom of expression of affected users, the provision incentivises platforms to mitigate the risks of both under-blocking and over-blocking within the same assessment. Once providers identify the systemic risks, they must “put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks” (Art. 27(1) DSA). Such measures, as is clear from the list of examples provided in the DSA, must comply with the prohibition on general monitoring obligations. They include adjustments of terms and conditions and recommender systems, improving internal processes, strengthening alternative dispute resolution (ADR) systems, improving the awareness of users, and cooperation with other providers.

The provision thus gives the European Commission a tool to consider copyright infringement risks and the ability to minimise the negative effects of creativity-stifling interventions. It is worth noting that this provision can have important spill-over effects on private enforcement measures, such as injunctions against intermediaries under Art. 8(3) of the InfoSoc Directive. In the IMCO/EP draft resolution, it is added that the risk mitigation measures must also be compatible with the prohibition on general monitoring obligations.Footnote 48 The existing case law on specific monitoring therefore remains highly relevant.

3.3.4 Automated Blocking and Safeguards to Fundamental Rights

Finally, the obligation to implement content recognition technologies in order to prevent the dissemination and findability of copyright-infringing and other illegal content raises important questions from the perspective of fundamental rights. Indeed, the main reason the Republic of Poland requested the annulment of Art. 17(4)(b) and (c) CDSMD was the concern that OCSSPs would rely on automated filtering mechanisms to evade liability for their end-users’ unauthorised activities, which could, however, disproportionately affect end-users’ freedom of expression. In his Opinion, AG Saugmandsgaard Øe agreed with the applicant that OCSSPs might rely on automated mechanisms to comply with Art. 17 CDSMD and that this would interfere in a particularly severe way with end-users’ freedom of expression.Footnote 49 The AG nevertheless concluded that freedom of expression can be limited to a certain degree, and that Art. 17 CDSMD contains numerous mechanisms to balance such limitation.Footnote 50

The current version of the DSA sets out an even more elaborate system concerning the use of automated means by various intermediary service providers. Such means are referred to explicitly in the definition of “content moderation” in Art. 2(p) and in the context of notice-and-action mechanisms of providers of hosting services, including providers of online platforms (Arts. 14–15), internal complaint-handling systems of providers of online platforms (Art. 17), transparency reporting obligations of providers of online platforms (Art. 23), and risk assessments by VLOPs and VLOSEs (Art. 26). The DSA remains silent, however, as regards the precise requirements and limits of any such acts.Footnote 51 It is therefore plausible that restrictions on the use of automated content moderation tools would be measured in light of the approach adopted by the AG and eventually the CJEU in the annulment action against Art. 17 CDSMD.

3.3.5 Transparency Obligations Concerning Automated Measures

Finally, the DSA focuses much more than Art. 17 CDSMD on the transparency and communication of information related to the use of automated means by the relevant service providers. Providers of hosting services and online platforms face almost identical obligations related to the use of automated means,Footnote 52 with one notable exception. Online platform service providers are obliged to specify the precise purposes and the indicators of the accuracy of the automated means used for the purposes of content moderation.Footnote 53 In contrast, “mere” hosting service providers are exempted from sharing detailed empirical data on their practices related to automated blocking.

This is also where the proper interpretation of “online platforms” will come into play.Footnote 54 The Council version of the DSA defines an online platform as a provider of a hosting service “which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation”.Footnote 55 These terms will almost certainly be tested by national courts, and ultimately the CJEU.

3.4 Implementation and Enforcement Provisions

Chapters IV and V of the DSA on the implementation and enforcement of due diligence obligations also raise concerns from a copyright perspective.

3.4.1 Symmetric Remedies for All Actors Involved

Firstly, the implementation and enforcement rules of the DSA should address all interests involved in a balanced, symmetrical way.

Actors Involved

The difficulty with this basic claim is that digital content disputes engage the rights and interests of various actors: (a) persons who post content online (content providers), (b) readers who consult the content, and (c) victims who are affected by the content posted (in our context: copyright holders).

In our understanding, the concept of “the recipient of the service” as defined in Art. 2(b) DSA includes not only providers of content on hosting services but also their readers. Thus, the DSA creates enforcement tools for victims, such as copyright holders, but also countermeasures for speakers, who can equally be copyright holders, and their readers. Our understanding is as follows:

  • Content providers are protected as “recipients of the service”, whether individually (Arts. 15, 17, 18) or collectively (Art. 68, sometimes even under Art. 72 if they do not act in the course of their trade), and, if they qualify, as “copyright holders” under applicable copyright legislation;

  • Readers are protected as “recipients of the service” collectively (Art. 68) and often through consumer associations if their interests are harmed (Art. 72 DSA and Art. 2(1) of the Representative Action DirectiveFootnote 56);

  • Victims are protected through the articulation of all of the DSA’s mechanisms addressing illegal content (Arts. 14, 19, etc.) and any respective legislation, such as copyright provisions that set out remedies against copyright infringements.

Remedies available to these actors for violations of DSA obligations should be symmetrical and avoid creating strong rights for one (set of) group(s) at the expense of other groups. The DSA proposal does not, however, establish such a level playing field for all interests involved. Whereas copyright holders can act individually (Arts. 14, 17, 18), via trusted flaggers (Art. 19) and via representative entities (Art. 68), passive users (readers) and in particular content providers are more limited in their ability to implement and enforce intermediaries’ DSA obligations.

Firstly, they do not always have individual claims which they could advance. Readers (users) can, according to Art. 72 DSA in connection with the Representative Action Directive, only act through qualified consumer organisations or public bodies that may bring a representative action seeking injunctive relief or redress in cases of copyright over-blocking in violation of the DSA.Footnote 57 This type of collective enforcement is applicable, however, only in cases of DSA violations that “harm or may harm the collective interests of consumers”.Footnote 58 This requirement is not met if content providers act for commercial purposes yet in line with copyright, for example where a professional musician uploads a parody or remix of a song. For these actors, Art. 72 DSA will therefore not provide enforcement support.

Trusted Content Creators

In order to ensure effective representation of uploaders, Art. 68 DSA should firstly be complemented by a provision similar to Art. 20(1) and (2) Representative Action Directive, which would oblige Member States to support eligible representative entities with public funding, including structural support or access to legal aid. In addition, the priority status available to trusted copyright holders under Art. 19 DSA should also be available in a defensive way to certain content providers (Trusted Content Creators, TCCs) who create and distribute content on regulated platforms. The content they upload should be recognised and, with due care, privileged in the DSA’s notice-and-action system. This could benefit creators of various types of content, ranging from journalistic and scientific to artistic expressions, all of which tend to be protected by copyright law and may be exposed to the risk of damaging instantaneous removal of legitimate content. The idea mirrors that of trusted flaggers: based on their track record of high-quality notifications, trusted flaggers enjoy higher trust and priority. Similarly, TCCs who have earned trust through their track record of non-infringing content could enjoy a higher level of protection against time-sensitive errors of the notice-and-action process.

To this end, the DSA could encourage content providers to establish TCC associations. TCCs would be continuously evaluated as to how well they police the content of their members. If they do a good job, they remain trusted for as long as they maintain such a track record. TCC content should then be privileged in the notice-and-action process: for instance, it should not be subject to takedowns without prior discussion, and the accounts of TCC members should not be terminated as easily. Such privileges should lead to the following benefits:

  • Encourage the creation of associations of TCCs who jointly represent some quality standards, even in the context of professions that normally do not have such structures (e.g. YouTube Creators);

  • Drive interest in membership, as members of TCCs have a special status of quality/trust and thus a stronger position vis-à-vis platforms;

  • Incentivise internal quality control within TCCs: if one member accumulates mistakes, the other members will act to remove that member, as its consistent mistakes can cost everyone the privileged status;

  • Ensure that bad actors who set up their own associations are judged on their track record and not on formal status (e.g. being an accredited journalist); as a consequence, if they fail to maintain aggregate quality among their members, they all lose TCC status.

Such policies strengthen the position of TCCs ex ante and reduce over-blocking prior to take-down/de-platforming. Since TCCs are collective entities, their members have to keep each other in check, but they also bargain and deal with platforms collectively. This strengthens their position and shifts the focus from individual mistakes to the aggregate characteristics of quality that such TCCs represent. As with trusted flaggers, it is left to society at large to develop its own institutions that can be relied upon to articulate trust, without concentrating such power in platforms. To retain some public oversight, the DSA could extend the accreditation mechanism used for trusted flaggers to TCCs and adjust it for these purposes.

Out-of-Court Dispute Settlements

Article 18 DSA sets out a framework for extra-judicial dispute settlement. Modelled partly on the Uniform Dispute Resolution Policy (UDRP),Footnote 59 it can contribute towards effective resolution of digital content disputes. However, the provision as currently drafted relies on the initiative of the affected content providers to take action. In practice, content providers might be disincentivised by costs or other factors. The DSA should therefore embrace solutions that make the use of such dispute settlement more likely in cases where content creators want to make sure that lawful content becomes available. The IMCO/EP draft resolution seems to be going in the right direction as it emphasises accessibility and ease of access, including by persons with disabilities.Footnote 60

To improve the provision further, Art. 18 DSA could better develop the interconnection with Art. 68 DSA by giving consumer associations a right of action on behalf of content providers. Online platforms could be obliged to offer a menu of dispute settlement bodies once a content provider has exhausted its options in the internal dispute system,Footnote 61 and to allow the provider to refer its case automatically to one of those bodies of its choice. That body could then examine on its own initiative whether it wishes to pursue the case and sponsor the fees or provide the necessary legal advice.

3.4.2 Access Restrictions According to Art. 41(3)(b) DSA

A second, practically important enforcement issue concerns the question whether copyright infringements justify a temporary restriction of access to a service according to Art. 41(3)(b) DSA.

The wholesale blocking of access to a website or online interface of an intermediary service, including VLOPs and VLOSEs (cf. Art. 65(1)), is an “extreme measure” that has to be justified on its own, separately and distinctly from measures against the illegal content accessible there.Footnote 62 Article 41(3)(b) therefore rightly establishes high hurdles for a judicial blocking order. In particular, the DSA infringement has to entail “a serious criminal offence involving a threat to the life or safety of persons” that “cannot be avoided through the exercise of other powers available under Union or national law”.

Copyright infringements, even on a commercial scale, do not meet these requirements. In contrast to trademark- or patent-infringing goods such as fake medicines, they never involve a threat to the life or safety of persons. Moreover, a “zero risk” approach to copyright infringements cannot be regarded as justified in a democratic society.Footnote 63 Finally, the effective, proportionate and dissuasive enforcement of copyright, including through criminal law and through blocking orders against access providers, already provides sufficient tools to curb online piracy.Footnote 64

Recital 82 in fine nevertheless mentions “services … used by a third party to infringe an intellectual property right” as an example in which a DSA blocking order might be viable. This statement creates legal uncertainty in that it also covers copyright infringements and thus indirectly contradicts the material requirements of Art. 41(3)(b). It should therefore be deleted or limited to examples such as fake medicines.

3.4.3 Private Enforcement of DSA Obligations

The DSA as proposed by the Commission and agreed upon by the Council is furthermore silent on the general question whether Chapter IV regulates the enforcement of intermediaries’ obligations conclusively or whether a failure of an intermediary to comply with the DSA can also trigger private claims on other legal grounds including general tort and unfair competition law.Footnote 65

A copyright perspective reveals that this is not merely a theoretical problem. For example, will an intermediary be liable to compensate copyright holders for the damage arising from a non-existent or insufficient notice-and-action mechanism? Do uploaders have such a claim if a platform fails to reverse an unjustified removal decision without undue delay according to Art. 17(3) DSA?

In the interest of legal certainty,Footnote 66 the issue of non-DSA enforcement measures should be specifically addressed. Delegating enforcement powers to Member StatesFootnote 67 or otherwise providing that non-DSA measures remain applicableFootnote 68 would contribute to the practical effectiveness of DSA obligations: enforcement would not lie exclusively in the hands of the Commission and the national Digital Services Coordinators, and thus would not ultimately depend upon the public resources allocated for that purpose. There are, however, also arguments in favour of ruling out any enforcement beyond the DSA. In particular, unspecified non-DSA remedies based on national laws might disturb the delicate overall balance between reducing the dissemination of illegal content and safeguarding freedom of expression online, which the DSA establishes on a procedural meta-level on top of copyright and other substantive laws. In the area of copyright law, where a comprehensive EU enforcement acquis remains applicable, such a restrictive approach would arguably not create a relevant enforcement lacuna.

3.4.4 Digital Services Coordinators

We have argued that copyright law in the DSA context needs to be understood as part of an EU-wide regulatory regime. Specifically, we have shown that the copyright acquis will come under DSA scrutiny in important respects: with respect to the formalities involved in copyright takedowns and search removals, with respect to intermediaries’ transparency obligations, and with respect to the assessment of automated content moderation activities (which must include creator, user and fundamental rights perspectives as the core of “systemic risk” mitigation).

This leaves one critical question: Who is the regulator? If copyright law accounts for most content moderation decisions within the future scope of the DSA, who has the authority to oversee and enforce the regime?

The DSA envisages a tiered system in which competences generally sit with the Member States, which under Art. 38 (Council version) “shall designate one or more competent authorities as responsible for the supervision and enforcement of this Regulation (‘competent authorities’)”. For VLOPs and, according to the Council text, also for VLOSEs, the Commission will exercise powers to initiate proceedings (Art. 51), make findings of non-compliance (Art. 58) and issue fines (Art. 59). In our view, the designation of national authorities with DSA regulatory competences with respect to copyright law will prove highly sensitive for the effective implementation of the new regime. Such authorities need to possess expertise in copyright law and freedom of expression, as well as a comprehensive understanding of creator and user contexts.