This section provides a critical assessment of selected DSA provisions from the perspective of EU copyright law.
Search Engines
The first issue concerns the legal status of providers of online search engines. Search engines play a prominent role in copyright enforcement online. The harder it is to find infringing sources, the less frequently these will be accessed. Google, by far the most-used search engine in the EU, has for many years delisted URLs upon notices of alleged copyright infringement. This procedure is based upon the U.S. Digital Millennium Copyright Act (DMCA) and implemented on a worldwide scale. As of November 2021, Google reports having received delisting requests under this scheme for more than 5.4 billion URLs in total.Footnote 29 More than 95% of these notices led to the removal of the websites concerned.Footnote 30
Less clearly established is the legal status of search engines under current EU copyright law. According to the case law of the CJEU, the posting, for profit, of a hyperlink to protected content that is freely available on another website without the consent of the copyright holder constitutes a “communication to the public” under Art. 3(1) InfoSoc Directive, unless the underlying presumption that the person posting the hyperlink had full knowledge of the infringing nature of the source can be rebutted.Footnote 31 On the basis of this doctrine, the German Federal Supreme Court (Bundesgerichtshof) declined to hold the provider of a picture search engine directly liable for a link to an unauthorised source.Footnote 32 The court found, however, that the search engine provider is subject to the rules on “interferer liability” (Störerhaftung), with the consequence that, upon a notice, copyright-infringing URLs have to be delisted from the search results.Footnote 33
In spite of the significance of search engines for a safe, predictable and trusted online environment, these services are not explicitly addressed in the Commission’s DSA proposal. The IMCO/EP draft resolution merely proposes to add a new recital according to which “a search engine could act solely as a ‘caching’ service as to information included in the results of an inquiry”, whereas “[e]lements displayed alongside those results, such as online advertisements, would however still qualify as a hosting service”.Footnote 34 The Council, finally, proposes to codify a definition of search engines and to subject their providers to the liability rules for “caching” services.Footnote 35 Consequently, Arts. 8 and 9, concerning orders issued by competent authorities to act against illegal content and to provide information, as well as the basic due diligence obligations under Chapter III Sec. 1 (Arts. 10–13) concerning electronic points of contact, legal representatives, terms and conditions and transparency reporting, would apply to search engine providers. Whereas the due diligence obligations of providers of hosting services, online platforms and online marketplaces (Arts. 14–24c) would not be applicable, the Council wants to extend the rules on VLOPs to VLOSEs (Art. 33a).Footnote 36
All these approaches towards search engines appear problematic. On the one hand, the aims of the DSA can hardly be achieved if search engines are not specifically regulated. On the other hand, the approach adopted by the Council also fails to establish an adequate regulatory framework for current search engine practice, in particular the large-scale notice-and-delisting procedures and the handling of complaints by recipients whose content is delisted or demoted in ranking. Neither Art. 14 nor Arts. 17–20 and 23 of the Council proposal, which address these practices in the cases of hosting services and online platforms, would be applicable to search engines. Absent such medium-level due diligence obligations, the implications of the already vague high-level systemic risk provisions (Arts. 26 and 27) for search engines are completely unclear. According to these rules, VLOSEs will have to mitigate inter alia the systemic risk of the dissemination of illegal content through their services, also by “adapting” their content moderation processes.Footnote 37 These content moderation processes are, however, largely unregulated by the DSA in the first place. In order to address this problem, the DSA would have to be complemented with tailor-made, medium-level due diligence obligations for search engines.
The Concept of “Public”
Another issue relating to the scope of application of DSA rules concerns the concept of the “public”, as defined in Art. 2(i) DSA. This definition is relevant in particular for the notion of “online platform” and thus for the scope of application of the due diligence obligations in Chapter III Sec. 3 DSA. An “online platform” is a provider of a hosting service which, at the request of a recipient of the service, stores and “disseminates to the public” information.Footnote 38 “Dissemination to the public” is defined as “making information available … to a potentially unlimited number of third parties”.Footnote 39
The criterion of a “potentially unlimited number” of third parties has been criticised for excluding services such as Telegram groups from the scope of the platform-related norms of the DSA simply because such channels, although they involve individual and societal risks addressed by the DSA,Footnote 40 have a fixed maximum of 200,000 recipients.Footnote 41
To avoid such flaws and ensure consistency with the sector-specific rules of EU copyright law, Art. 2(h) DSA could incorporate the more flexible and functional concept of “public” as developed in the case law of the CJEU concerning the right of communication to the public under Art. 3 InfoSoc Directive. The number of recipients would then not need to be “potentially unlimited” but “indeterminate” and “(fairly) large”.Footnote 42
Preventive Measures
Another highly contentious issue in the context of both the DSA and copyright law pertains to the legality of preventive measures, in particular automated content moderation activities.Footnote 43
No General Monitoring or Active Fact-Finding Obligations
In this regard, it is to be noted that Art. 7 DSA confirms the prohibition of general monitoring or active fact-finding obligations.Footnote 44 The language is essentially the same as that of Art. 15(1) of the E-Commerce Directive. Hence the prohibition on general monitoring and active fact-finding obligations continues to apply to hosting and other intermediary service providers. This prohibition extends to obligations imposed through any provision of national or EU law within the DSA’s scope. It therefore applies to orders of authorities, injunctions issued by courts, and risk mitigation measures imposed by the DSA. On the other hand, Recital 28 DSA confirms that monitoring obligations imposed on providers in specific cases do not fall foul of the ban of Art. 7 DSA.Footnote 45 The DSA does not specify the line between general and specific monitoring, leaving this task largely to the CJEU, as has been the case until now.
Preventive Measures and Art. 17 CDSMD
The legality of automated, preventive content moderation tools also lies at the heart of the debate about Art. 17 CDSMD. That provision is a complex mixture of ex ante preventive obligations and ex post notice-and-take-down measures, while still operating under the prohibition on general monitoring obligations. Given that the CDSMD is generally lex specialis to the DSA, it is plausible that the scope of permissible specific monitoring in the context of Art. 17 – regarding copyright-protected works and subject-matter available on OCSSPs – is broader than under Art. 7 DSA – regarding other types of illegal content.
In his Opinion in Case C-401/19,Footnote 46 AG Saugmandsgaard Øe built upon the reasoning of the CJEU’s judgment in Glawischnig-Piesczek, which, however, concerned defamation rather than copyright. The AG argued that the only acceptable ex ante blocking of content uploaded by users of a platform is that of information which has already been established as illegal (by a court) or which is manifestly illegal (“obvious from the outset”).Footnote 47 According to this view, the filtering of identical or equivalent content ought to be considered lawful. Such a conclusion broadens the scope of accepted specific filtering measures, but does not yet turn them into a general monitoring obligation. Whether this reading would apply only to measures imposed within the scope of Art. 17 CDSMD or more broadly to other types of illegal content within the scope of Art. 7 DSA remains open to discussion.
Importantly, the intuition behind these arguments is that the CJEU in Glawischnig-Piesczek linked the scope of “identical or equivalent” information to the lack of any need for an additional assessment by humans. This suggests that the Court is drawing a distinction between types of infringements whose assessment can be automated without significant error rates and those whose assessment cannot. Article 7 DSA does not resolve this and leaves the issue to the courts, and ultimately the CJEU.
Risk Mitigation Under Art. 27 DSA
Preventive measures, including the use of automated content moderation tools, are also key to the special set of due diligence obligations the DSA sets out for very large intermediary service providers, namely VLOPs and – according to the Council position – VLOSEs. Under Art. 26 DSA, VLOPs and VLOSEs shall “identify, analyse and assess” any systemic risks stemming from the functioning and use of their services in the Union. This includes risks posed by “the dissemination of illegal content”. From the copyright perspective, this means that video, picture or other content-sharing services and search engines would be under an obligation to assess copyright infringements as part of their risk management obligations.
Such risk assessment will not consider individual instances of copyright infringement but systemic misuses of services. Because providers must equally consider systemic risks to the freedom of expression of affected users, the provision incentivises platforms to mitigate the risks of both under-blocking and over-blocking within the same assessment. Once providers identify the systemic risks, they must “put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks” (Art. 27(1) DSA). Such measures, as is clear from the list of examples provided in the DSA, must comply with the prohibition on general monitoring obligations. The examples include adjustments of terms and conditions and of recommender systems, improvements to internal processes, the strengthening of alternative dispute resolution (ADR) systems, awareness-raising among users, and cooperation with other providers.
The provision thus gives the European Commission a tool to consider copyright infringement risks and the ability to minimise the negative effects of creativity-stifling interventions. It is worth noting that this provision can have important spill-over effects on private enforcement measures, such as injunctions against intermediaries under Art. 8(3) of the InfoSoc Directive. In the IMCO/EP draft resolution, it is added that the risk mitigation measures must also be compatible with the prohibition on general monitoring obligations.Footnote 48 The existing case law on specific monitoring therefore remains highly relevant.
Automated Blocking and Safeguards to Fundamental Rights
The obligation to implement content recognition technologies in order to prevent the dissemination and findability of copyright-infringing and other illegal content finally raises important questions from the perspective of fundamental rights. Indeed, the main reason the Republic of Poland requested the annulment of Art. 17(4)(b) and (c) CDSMD was the concern that OCSSPs would rely on automated filtering mechanisms to evade liability for their end-users’ unauthorised activities, which could, however, disproportionately affect end-users’ freedom of expression. In his Opinion, AG Saugmandsgaard Øe agreed with the applicant that OCSSPs might rely on automated mechanisms to comply with Art. 17 CDSMD and that this would interfere in a particularly severe way with end-users’ freedom of expression.Footnote 49 The AG nevertheless concluded that freedom of expression can be limited to a certain degree, and that Art. 17 CDSMD contains numerous mechanisms to counterbalance such a limitation.Footnote 50
The current version of the DSA sets out an even more elaborate system concerning the use of automated means by various intermediary service providers. Such means are referred to explicitly in the definition of “content moderation” in Art. 2(p) and in the context of the notice-and-action mechanisms of providers of hosting services, including providers of online platforms (Arts. 14–15), the internal complaint-handling systems of providers of online platforms (Art. 17), the transparency reporting obligations of providers of online platforms (Art. 23), and risk assessments by VLOPs and VLOSEs (Art. 26). The DSA remains silent, however, as regards the precise requirements for and limits of any such use.Footnote 51 It is therefore plausible that restrictions on the use of automated content moderation tools would be measured in light of the approach adopted by the AG and ultimately the CJEU in the annulment action against Art. 17 CDSMD.
Transparency Obligations Concerning Automated Measures
Finally, the DSA focuses much more strongly than Art. 17 CDSMD on transparency and the communication of information related to the use of automated means by the relevant service providers. Providers of hosting services and online platforms face almost identical obligations related to the use of automated means,Footnote 52 with one notable exception: providers of online platforms are obliged to specify the precise purposes of, and the indicators of the accuracy of, the automated means used for content moderation.Footnote 53 In contrast, “mere” hosting service providers are exempted from sharing detailed empirical data on their practices related to automated blocking.
This is also where the proper interpretation of “online platforms” will come into play.Footnote 54 The Council version of the DSA defines online platforms as hosting service providers “which, at the request of a recipient of the service, stores and disseminates to the public information, unless that activity is a minor and purely ancillary feature of another service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature into the other service is not a means to circumvent the applicability of this Regulation”.Footnote 55 These terms will almost certainly be tested by national courts, and ultimately the CJEU.
Implementation and Enforcement Provisions
Chapters IV and V of the DSA on the implementation and enforcement of due diligence obligations also raise concerns from a copyright perspective.
Symmetric Remedies for All Actors Involved
Firstly, the implementation and enforcement rules of the DSA should address all interests involved in a balanced, symmetrical way.
Actors Involved
The difficulty with this basic claim is that digital content disputes engage the rights and interests of various actors: (a) persons who post content online (content providers), (b) readers who consult the content, and (c) victims who are affected by the content posted (in our context: copyright holders).
In our understanding, the concept of “the recipient of the service” as defined in Art. 2(b) DSA includes not only providers of content on hosting services but also their readers. Thus, the DSA creates enforcement tools for victims, such as copyright holders, but also countermeasures for speakers, who can equally be copyright holders, and for their readers. More specifically:
- Content providers are protected as “recipients of the service”, whether individually (Arts. 15, 17, 18) or collectively (Art. 68, sometimes even under Art. 72 if they do not act in the course of their trade), and, if they qualify, as “copyright holders” under the applicable copyright legislation;
- Readers are protected as “recipients of the service” collectively (Art. 68) and often through consumer associations if their interests are harmed (Art. 72 DSA and Art. 2(1) of the Representative Action DirectiveFootnote 56);
- Victims are protected through the combination of all of the DSA’s mechanisms addressing illegal content (Arts. 14, 19, etc.) and the respective substantive legislation, such as copyright provisions that set out remedies against copyright infringements.
Remedies available to these actors for violations of DSA obligations should be symmetrical and avoid creating strong rights for one (set of) group(s) at the expense of other groups. The DSA proposal does not, however, establish such a level playing field for all interests involved. Whereas copyright holders can act individually (Arts. 14, 17, 18), via trusted flaggers (Art. 19) and via representative entities (Art. 68), passive users (readers) and in particular content providers are more limited in their ability to implement and enforce intermediaries’ DSA obligations.
Firstly, they do not always have individual claims which they could assert. Readers (users) can, according to Art. 72 DSA in connection with the Representative Action Directive, only act through qualified consumer organisations or public bodies that may bring a representative action seeking injunctive relief or redress in cases of copyright over-blocking in violation of the DSA.Footnote 57 This type of collective enforcement is applicable, however, only in cases of DSA violations that “harm or may harm the collective interests of consumers”.Footnote 58 This requirement is not met if content providers act for commercial purposes yet in line with copyright law, for example where a professional musician uploads a parody or remix of a song. For these actors, Art. 72 DSA will therefore not provide enforcement support.
Trusted Content Creators
In order to ensure effective representation of uploaders, Art. 68 DSA should firstly be complemented by a provision similar to Art. 20(1) and (2) Representative Action Directive, which would oblige Member States to support eligible representative entities with public funding, including structural support, or access to legal aid. In addition, the priority status available to trusted copyright holders under Art. 19 DSA should also be available in a defensive way to certain content providers (Trusted Content Creators, TCCs) who create and distribute content on regulated platforms. The content they upload should be recognised and, with appropriate care, privileged in the DSA’s notice-and-action system. This could benefit creators of various types of content, ranging from journalistic and scientific to artistic expression, all of which tend to be protected by copyright law and may be exposed to the risk of damaging instantaneous removal of legitimate content. The idea mirrors that of trusted flaggers: based on their track record of high-quality notifications, trusted flaggers enjoy higher trust and priority. Similarly, TCCs, based on a track record of non-infringing content that earns everyone’s trust, could enjoy a higher level of protection against time-sensitive errors of the notice-and-action process.
To this end, the DSA could encourage content providers to establish TCC associations. These associations would be continuously evaluated as to how well they police the content of their members. If they do a good job, they become trusted for as long as they maintain such a track record. TCC content should then be privileged in the notice-and-action process. For instance, it should not be subject to takedowns without prior discussion, and the accounts of TCC members should not be terminated as easily. Such privileges should lead to the following benefits:
- Encourage the creation of associations of TCCs who jointly represent certain quality standards, even in professions that normally do not have such structures (e.g. YouTube Creators);
- Drive interest in membership, as members of TCCs have a special status of quality/trust and thus a stronger position vis-à-vis platforms;
- Incentivise internal quality control within TCCs; if one member accumulates mistakes, the other members will act to remove that member, as its consistent mistakes can cost everyone the privileged status;
- Bad actors can set up their own associations, but they will be judged on their track record and not on a formal status (e.g. being an accredited journalist); as a consequence, if they fail to maintain the aggregate quality among their members, they all lose the status of TCCs.
Such policies strengthen the position of TCCs ex ante and reduce over-blocking prior to take-down or de-platforming. The benefit is that, since TCCs are collective entities, their members have to keep each other in check, but they also bargain and deal with the platforms collectively. This strengthens their position and shifts the focus from individual mistakes to the aggregate characteristics of quality that such TCCs represent. As with trusted flaggers, it is left to society at large to develop its own institutions that can be relied upon to articulate trust, without concentrating such power in platforms. To keep some public oversight, the DSA could extend the accreditation mechanism used for trusted flaggers to TCCs and adjust it for these purposes.
Out-of-Court Dispute Settlements
Article 18 DSA sets out a framework for extra-judicial dispute settlement. Modelled partly on the Uniform Domain Name Dispute Resolution Policy (UDRP),Footnote 59 it can contribute towards the effective resolution of digital content disputes. However, the provision as currently drafted relies on the initiative of the affected content providers to take action. In practice, content providers might be disincentivised by costs or other factors. The DSA should therefore embrace solutions that make the use of such dispute settlement more likely in cases where content creators want to ensure that lawful content becomes available. The IMCO/EP draft resolution seems to be going in the right direction as it emphasises accessibility and ease of access, including for persons with disabilities.Footnote 60
To improve the provision further, Art. 18 DSA could better develop the interconnection with Art. 68 DSA by giving consumer associations a right of action on behalf of content providers. Online platforms could be obliged to offer a menu of dispute settlement bodies once the content provider has exhausted its options in the internal dispute system,Footnote 61 and to allow the provider to refer its case automatically to one of the organisations of its choice. Such an organisation could then examine on its own initiative whether it wishes to pursue the case and sponsor the fees or provide the necessary legal advice.
Access Restrictions According to Art. 41(3)(b) DSA
A second, practically important enforcement issue concerns the question whether copyright infringements justify a temporary restriction of access to a service according to Art. 41(3)(b) DSA.
The wholesale blocking of access to a website or online interface of an intermediary service, including VLOPs and VLOSEs (cf. Art. 65(1)), is an “extreme measure” that has to be justified on its own, separately and distinctly from measures against the illegal content accessible there.Footnote 62 Article 41(3)(b) therefore rightly establishes high hurdles for a judicial blocking order. In particular, the DSA infringement has to entail “a serious criminal offence involving a threat to the life or safety of persons” that “cannot be avoided through the exercise of other powers available under Union or national law”.
Copyright infringements, even on a commercial scale, do not meet these requirements. As such, and in contrast to trademark- or patent-infringing goods such as fake medicines, they never involve a threat to the life or safety of persons. Moreover, a “zero risk” approach regarding copyright infringements cannot be seen as justified in a democratic society.Footnote 63 Finally, the effective, proportionate and dissuasive enforcement of copyright, including through criminal laws and through blocking orders against access providers, has provided sufficient tools to curb online piracy.Footnote 64
Recital 82 in fine nevertheless mentions “services … used by a third party to infringe an intellectual property right” as an example of a situation in which a DSA blocking order might be viable. This statement creates legal uncertainty in that it also covers copyright infringements and thus indirectly contradicts the material requirements of Art. 41(3)(b). It should therefore be deleted or limited to an example such as fake medicines.
Private Enforcement of DSA Obligations
The DSA as proposed by the Commission and agreed upon by the Council is furthermore silent on the general question of whether Chapter IV regulates the enforcement of intermediaries’ obligations conclusively or whether a failure of an intermediary to comply with the DSA can also trigger private claims on other legal grounds, including general tort and unfair competition law.Footnote 65
A copyright perspective reveals that this is not merely a theoretical problem. For example, will an intermediary be liable to compensate copyright holders for the damage arising from a non-existent or insufficient notice-and-action mechanism? Do uploaders have such a claim if a platform fails to reverse an unjustified removal decision without undue delay according to Art. 17(3) DSA?
In the interest of legal certainty,Footnote 66 it is submitted that the issue of non-DSA enforcement measures should be specifically addressed. Delegating enforcement powers to Member StatesFootnote 67 or otherwise providing that non-DSA measures remain applicableFootnote 68 would contribute to the practical effectiveness of DSA obligations. Enforcement would then not lie exclusively in the hands of the Commission and the national Digital Services Coordinators and would thus not ultimately depend upon the public resources allocated for that purpose. There are, however, also arguments in favour of ruling out any enforcement beyond the DSA. In particular, unspecified non-DSA remedies based on national laws might disturb the delicate overall balance between reducing the dissemination of illegal content and safeguarding freedom of expression online, which the DSA establishes on a procedural meta-level on top of copyright and other substantive laws. In the area of copyright law, where a comprehensive EU enforcement acquis remains applicable, such a restrictive approach would arguably not create a relevant enforcement lacuna.
Digital Services Coordinators
We have argued that copyright law in the DSA context needs to be understood as part of an EU-wide regulatory regime. Specifically, we have shown that the copyright acquis will come under DSA scrutiny in important respects: with respect to the formalities involved in copyright takedowns and search delistings, with respect to the transparency obligations of intermediaries, and with respect to the assessment of automated content moderation activities (which must include creator, user and fundamental rights perspectives as the core of “systemic risk” mitigation).
This leaves one critical question: Who is the regulator? If copyright law accounts for most content moderation decisions within the future scope of the DSA, who has the authority to oversee and enforce the regime?
The DSA envisages a tiered system in which competences generally sit with the Member States, which under Art. 38 (Council version) “shall designate one or more competent authorities as responsible for the supervision and enforcement of this Regulation (‘competent authorities’)”. For VLOPs and, according to the Council text, also for VLOSEs, the Commission will exercise powers to initiate proceedings (Art. 51), make findings of non-compliance (Art. 58) and issue fines (Art. 59). In our view, the designation of national authorities with DSA regulatory competences with respect to copyright law will prove highly sensitive for the effective implementation of the new regime. National authorities with DSA competences need to possess expertise in copyright law and freedom of expression as well as a comprehensive understanding of creator and user contexts.