
Fake News in Times of Pandemic and Beyond: Enquiry into the Rationales for Regulating Information Platforms

Law and Economics of the Coronavirus Crisis

Part of the book series: Economic Analysis of Law in European Legal Scholarship (EALELS, volume 13)

Abstract

The COVID-19 pandemic plunged our societies into dire times, with deep effects on all societal sectors and on our lives. The pandemic was accompanied by another phenomenon with grave consequences of its own: the "infodemic". Fake news about the cause, prevention, impact and potential cures of the coronavirus spread on social platforms and other media outlets, and continues to do so. The chapter takes this infodemic as a starting point for exploring the broader phenomenon of online misinformation. The legal analysis focuses on the rationales for regulating Internet platforms as critical information intermediaries in a global networked media space. As Internet platforms do not fall under the category of media companies, they are currently not regulated in most countries. Yet the pressure to regulate them has grown in recent years, also in light of other negative phenomena, such as the proliferation of hate speech, political disinformation and targeting. Regulatory approaches differ across jurisdictions, however, and encompass measures that range from mere self-regulatory codes to more binding interventions. Starting with some insights into the existing technological means for mediating speech online and into the power of platforms, specifically their influence on the conditions of freedom of expression, the chapter discusses the regulatory initiatives with regard to information platforms in the United States and in the European Union, as embedded in different traditions of free speech protection. The chapter offers an appraisal of the divergent US and EU approaches and contemplates the adequate design of regulatory intervention in the area of online speech in times of infodemic and beyond.


Notes

  1.

    On the origins and the reasons for the spread of conspiracy theories, see e.g. Sunstein and Vermeule (2009).

  2.

    For an overview of the different misinformation threats, see e.g. European Commission and the High Representative of the Union for Foreign Affairs and Security Policy (2020); see also Brennen et al. (2020); Baines and Elliott (2020); Enders (2020).

  3.

    See European Commission and the High Representative of the Union for Foreign Affairs and Security Policy (2020).

  4.

    WHO stated that “infodemics are an excessive amount of information about a problem, which makes it difficult to identify a solution. They can spread misinformation, disinformation and rumours during a health emergency. Infodemics can hamper an effective public health response and create confusion and distrust among people.” See WHO (2020); see also WHO et al. (2020).

  5.

    See e.g. Knuutila et al. (2020); Brennen et al. (2020); Center for Countering Digital Hate (2020); Nielsen et al. (2020).

  6.

    Posetti and Matthews (2020), at 1; for an understanding of the different types of fake news, distinguishing between disinformation, misinformation and malinformation, see Wardle (2019); for a slightly different classification, see Tandoc et al. (2018).

  7.

    See e.g. Ireton and Posetti (2018) and the next section.

  8.

    See e.g. Posetti and Matthews (2020); Soll (2016); Wendling (2018); Merriam-Webster (2021); Poole (2019).

  9.

    Center for Information Technology and Society (2019).

  10.

    Ibid.

  11.

    Ibid.; see also Chesney and Citron (2019).

  12.

    Center for Information Technology and Society (2019).

  13.

    For updated information on the initiatives around the world, see e.g. https://infogram.com/covid-19-fake-news-regulation-passed-during-covid-19-1h8n6md5q9kj6xo (last access 23 February 2022)

  14.

    See e.g. Benkler (2006); Whitt (2013).

  15.

    See e.g. Benkler (2006); Lessig (2009).

  16.

    For a clarification of the term, see below.

  17.

    See e.g. Balkin (2012); Cohen (2018); Lobel (2016).

  18.

    See e.g. European Commission (2016); UK House of Commons (2019).

  19.

    The focus here is placed not upon the physical intermediaries, such as network operators or Internet service providers (although these matter too: see e.g. Benkler (2006); DeNardis (2009); Frischmann (2012)), but upon the gatekeepers operating at the applications and content levels, the so-called "choice intermediaries" or "new editors". See Helberger (2011); Miel and Farris (2008).

  20.

    Sunstein (2007).

  21.

    Napoli (2012, 2015).

  22.

    See e.g. Sag (2018); Spindler (2020); Frosio (2020); Burri and Zihlmann (2021).

  23.

    See e.g. Ezrachi and Stucke (2016); for a brief overview of the issues, see Burri (2019a).

  24.

    See e.g. Burri (2015).

  25.

    Balkin (2012); also Balkin (2018).

  26.

    Balkin (2012).

  27.

    Balkin (2012, 2018); Klonick (2018).

  28.

    See Helberger (2011); Miel and Farris (2008).

  29.

    For a taxonomy of the different algorithmic filters, see Bozdag (2013).

  30.

    Napoli (2014), pp. 33–38; also Saurwein et al. (2015); Helberger et al. (2015); Burri (2019b).

  31.

    Klonick (2018).

  32.

    See e.g. Bamberger and Lobel (2018); Burri (2019a).

  33.

    Burri (2019a); European Data Protection Supervisor (EDPS) (2014); Pollicino (2021).

  34.

    Cohen (2018); Kreiss and McGregor (2017); Benkler et al. (2018).

  35.

    Latzer et al. (2014), at pp. 29–30.

  36.

    Hoffman et al. (2015), at p. 1365.

  37.

    Hoffman et al. (2015); Sunstein (2007).

  38.

    Hoffman et al. (2015), at p. 1365.

  39.

    Dahlgren (2005).

  40.

    Pariser (2011).

  41.

    Sunstein (2001, 2007, 2009).

  42.

    Filter bubbles, together with "information cascades" and the human attraction to negative and novel information, have been said to fuel the distribution and virality of fake news. For a careful analysis of these phenomena of online communication, see Chesney and Citron (2019), in particular pp. 1765–1768.

  43.

    Goodman (2004); Helberger (2012); Napoli (2012); Napoli et al. (2018).

  44.

    Napoli et al. (2018); Burri (2019b); McKelvey and Hunt (2019).

  45.

    Balkin (2012); Klonick (2018).

  46.

    Cohen (2018).

  47.

    Balkin (2018); see also Kaye (2019).

  48.

    See e.g. De Schutter (2014).

  49.

    See e.g. Farber (2019); Feldman and Sullivan (2019); Keller (2011); Oster (2017); Pollicino (2021).

  50.

    The United Nations (UN) Special Rapporteur on Freedom of Opinion and Expression, the Organization for Security and Co-operation in Europe (OSCE) Representative on Freedom of the Media, the Organization of American States (OAS) Special Rapporteur on Freedom of Expression and the African Commission on Human and Peoples’ Rights (ACHPR) Special Rapporteur on Freedom of Expression and Access to Information, Joint Declaration on Freedom of Expression and “Fake News”, Disinformation and Propaganda, FOM.GAL/3/17, 3 March 2017, at 1. General Principles, para. (a).

  51.

    Ibid., at 2. Standards on Disinformation and Propaganda, para. (a).

  52.

    See e.g. Eichensehr (2015); Daskal (2016); Eichensehr (2017).

  53.

    See e.g. the Yahoo! case, one of the first cases on free speech violations and online jurisdiction. There, the US court, when faced with the recognition and enforcement of the French court order under the "comity of nations" doctrine, stated that: "Absent a body of law that establishes international standards with respect to speech on the Internet and an appropriate treaty or legislation addressing enforcement of such standards to speech originating within the United States, the principle of comity is outweighed by the Court’s obligation to uphold the First Amendment." (see Yahoo! II, 169 F. Supp. 2d, at 1193). For more on the case, see Goldsmith and Wu (2001), at pp. 49–64; Greenberg (2003).

  54.

    See Klonick (2018); see also Chander (2014).

  55.

    See Bontcheva and Posetti (2020), in particular pp. 36–40.

  56.

    Bontcheva and Posetti (2020), ibid., at p. 108; see also Bayer et al. (2019); Roudik et al. (2019).

  57.

    See e.g. Nunziato (2019); Klonick (2020).

  58.

    See e.g. Haupt (2005) (also providing an overview of the comparative literature); Tourkochoriti (2016).

  59.

    Abrams v. United States, 250 U.S. 616 (1919) (dissenting opinion Holmes).

  60.

    Reno v. American Civil Liberties Union, 521 U.S. 844 (1997).

  61.

    Ibid., at 885. In the more recent case of Packingham v. North Carolina, 137 S. Ct. 1730 (2017), the Supreme Court compared social media platforms to a town square and recognized their function as a forum for the exchange of ideas and viewpoints.

  62.

    Tompros et al. (2020), at pp. 88–89.

  63.

    Police Dep’t of Chi. v. Mosley, 408 U.S. 92, 95 (1972).

  64.

    United States v. Eichman, 110 S. Ct. 2404, 2410 (1990).

  65.

    Tompros et al. (2020), at p. 90, referring to Burson v. Freeman, 504 U.S. 191, 199 (1992).

  66.

    See e.g. United States v. Alvarez, 567 U.S. 709, 717 (2012). A regulation of unprotected speech may still violate the First Amendment with regard to content discrimination if it includes distinctions among subcategories of speech that cannot be justified. See e.g. R.A.V. v. City of St. Paul, 505 U.S. 377 (1992).

  67.

    United States v. Williams, 553 U.S. 285, 322 (2008) (dissenting opinion Souter).

  68.

    New York Times Co. v. Sullivan, 376 U.S. 254 (1964), at 279–280. The decision in Gertz extended the NY Times standard of "reckless disregard" from public officials to public figures and defined these as persons who, due to their notoriety, their achievements, or the vigour of their success, seek the attention of the public. See Gertz v. Robert Welch, Inc., 418 U.S. 323, 350 (1974).

  69.

    567 U.S. 709 (2012). For a full analysis of the case, see Tompros et al. (2020), at pp. 93–97.

  70.

    Ibid., at 718.

  71.

    Ibid., at 710, 726.

  72.

    N.Y. Penal Law §240.50 reads: “A person is guilty of falsely reporting an incident in the third degree when, knowing the information reported, conveyed or circulated to be false or baseless, he or she […] [i]nitiates or circulates a false report or warning of an alleged occurrence or impending occurrence of a crime, catastrophe or emergency under circumstances in which it is not unlikely that public alarm or inconvenience will result.” Falsely reporting an incident in the third degree is a class A misdemeanor, punishable by up to 1 year’s imprisonment and a fine of USD 1,000. The statute additionally permits entities providing emergency services to seek restitution of “the amount of funds reasonably expended for the purpose of responding” to false reports.

  73.

    For a fully-fledged analysis of the law, as well as its possible unconstitutionality post-Alvarez, see Tompros et al. (2020).

  74.

    See e.g. Tourkochoriti (2016); Burri (2021).

  75.

    Securing the Protection of Our Enduring and Established Constitutional Heritage Act (SPEECH Act), 124 Stat. 2380 (2010). See Goldman (2020).

  76.

    Communications Decency Act of 1996 (CDA), Pub. L. No. 104-104 (Tit. V), 110 Stat. 133 (8 February 1996), codified at 47 U.S.C. §§223, 230. For a detailed analysis, see Brannon and Holmes (2021).

  77.

    Goldman (2020); see also Goldman (2019); Bone (2021).

  78.

    Kosseff (2019).

  79.

    Klonick (2018).

  80.

    47 U.S.C. §230(c)(2)(A) (emphasis added).

  81.

    Goldman (2020).

  82.

    See e.g. Burri (2022).

  83.

    See e.g. Goldman (2019); On the constitutionality of possible Section 230 amendments, see e.g. Brannon and Holmes (2021); see also Citron and Wittes (2017) (arguing that platforms should enjoy immunity from liability if they could show that their response to unlawful uses of their services was reasonable).

  84.

    Allow States and Victims to Fight Online Sex Trafficking Act of 2017, H.R. 1865 (115th Cong. 2017–18).

  85.

    Ending Support for Internet Censorship Act, S. 1914, 116th Cong. (2019).

  86.

    Biased Algorithm Deterrence Act of 2019, H.R. 492, 116th Cong. (2019).

  87.

    Algorithmic Accountability Act, S. 1108, 116th Cong. (2019).

  88.

    For details on and analysis of the legislative proposals, see Bone (2021).

  89.

    The IF model maintains that platforms should be required to abide by fiduciary duties of care, loyalty and confidentiality with regard to their end users. For a discussion, see Balkin (2016, 2018); Balkin and Zittrain (2016); Khan and Pozen (2019); Whitt (2019); Haupt (2020).

  90.

    See e.g. De Gregorio (2020); Bloch-Wehba (2019).

  91.

    Dink v Turkey [2010] ECtHR 2668/07, 6102/08, 30079/08, 7072/09 and 7124/09.

  92.

    Informationsverein Lentia and Others v Austria [1993] ECtHR 13914/88; 15041/89; 15717/89; 15779/89; 17207/90.

  93.

    European Convention on Human Rights, 4 November 1950, 213 U.N.T.S. 221; see also Article 11 (Freedom of Expression and Information) of the Charter of Fundamental Rights of the European Union (CFREU), OJ C [2012] 326/393, 26 October 2012.

  94.

    See e.g. Autronic AG v Switzerland [1990] ECtHR 12726/87; Schweizerische Radio- und Fernsehgesellschaft SRG v Switzerland [2012] ECtHR 34124/06.

  95.

    Handyside v. United Kingdom [1976] ECtHR 5493/72, at para. 49.

  96.

    For details, see Oster (2017), at Chapter 3; Pollicino (2021).

  97.

    See e.g. Perinçek v. Switzerland [2015] ECtHR 27510/08.

  98.

    See e.g. Lehideux and Isorni v. France [1998] ECtHR 55/1997/839/1045; Garaudy v. France [2003] ECtHR 65831/01; Witzsch v. Germany [2005] ECtHR 7485/03.

  99.

    See e.g. Pavel Ivanov v. Russia [2004] ECtHR 35222/04; Aksu v. Turkey [2012] ECtHR 4149/04 and 41029/04.

  100.

    See e.g. Mayer-Schönberger and Cukier (2013); van der Sloot et al. (2016); Burri (2021).

  101.

    Case C-131/12, Google Spain SL and Google Inc. v. Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, Judgment of the Court (Grand Chamber) of 13 May 2014, ECLI:EU:C:2014:317 [hereinafter Google Spain].

  102.

    See e.g. Wechsler (2015); Hoffman et al. (2016).

  103.

    Google Spain, at para. 74, referring to Joined Cases C-468/10 and C-469/10, Asociación Nacional de Establecimientos Financieros de Crédito (ASNEF) and Federación de Comercio Electrónico y Marketing Directo (FECEMD) v. Administración del Estado, Judgment of 24 November 2011, ECR I-12181, at paras 38, 40.

  104.

    Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), OJ L [2016] 119/1 [hereinafter GDPR].

  105.

    Google Spain, at para. 88. There is a qualification in para. 99: “As the data subject may, in the light of his fundamental rights under Articles 7 and 8 of the Charter, request that the information in question no longer be made available to the general public on account of its inclusion in such a list of results, those rights override, as a rule, not only the economic interest of the operator of the search engine but also the interest of the general public in having access to that information upon a search relating to the data subject’s name. However, that would not be the case if it appeared, for particular reasons, such as the role played by the data subject in public life, that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of its inclusion in the list of results, access to the information in question.”

  106.

    Oster (2017), at p. 18 and Chapter 3.

  107.

    Bladet Tromsø and Stensaas v Norway [1999] ECtHR 21980/93, at para. 65.

  108.

    Stoll v Switzerland [2007] ECtHR 69698/01.

  109.

    Ahmet Yildirim v Turkey [2012] ECtHR 3111/10, at para. 54.

  110.

    See Helberger et al. (2020).

  111.

    See ibid.

  112.

    Delfi v Estonia [2015] ECtHR 64569/09.

  113.

    MTE v Hungary [2016] ECtHR 22947/13, at para. 82.

  114.

    C-507/17, Google v. Commission nationale de l’informatique et des libertés (CNIL), Judgment of 24 September 2019, ECLI:EU:C:2019:772. For a great summary of the case and references to the primary sources, see Columbia Global Freedom of Expression, https://globalfreedomofexpression.columbia.edu/cases/google-llc-v-national-commission-on-informatics-and-liberty-cnil/ (last access 23 February 2022).

  115.

    Ibid., at para. 74: “… the operator is not required to carry out that de-referencing on all versions of its search engine, but on the versions of that search engine corresponding to all the Member States, using, where necessary, measures which, while meeting the legal requirements, effectively prevent or, at the very least, seriously discourage an internet user conducting a search from one of the Member States on the basis of a data subject’s name from gaining access via the list of results displayed following that search, to the links which are the subject of that request.”

  116.

    See ibid., at para. 27; see also Opinion of Advocate General Szpunar delivered on 10 January 2019, ECLI:EU:C:2019:15.

  117.

    Goldman (2019), referring to Search King, Inc. v. Google Technology, Inc., 2003 WL 21464568 (W.D. Okla. 2003); Langdon v. Google, Inc., 474 F. Supp. 2d 622 (D. Del. 2007); Zhang v. Baidu.com, Inc., 10 F. Supp. 3d 433 (S.D.N.Y. 2014); Google, Inc. v. Hood, 96 F. Supp. 3d 584 (S.D. Miss. 2015); e-ventures Worldwide v. Google, Inc., 2017 WL 2210029 (M.D. Fla. 2017); Martin v. Hearst Corporation, 777 F.3d 546 (2d Cir. 2015).

  118.

    Goldman (2019), referring to Maughan v. Google Technology, Inc., 143 Cal. App. 4th 1242 (Cal. App. Ct. 2006); Murawski v. Pataki, 514 F. Supp. 2d 577 (S.D.N.Y. 2007); Shah v. MyLife.Com, Inc., 2012 WL 4863696 (D. Or. 2012); Merritt v. Lexis Nexis, 2012 WL 6725882 (E.D. Mich. 2012); Nieman v. Versuslaw, Inc., 2012 WL 3201931 (C.D. Ill. 2012); Getachew v. Google, Inc., 491 Fed. Appx. 923 (10th Cir. 2012); Mmubango v. Google, Inc., 2013 WL 664231 (E.D. Pa. 2013); O’Kroley v. Fastcase Inc., 831 F.3d 352 (6th Cir. 2016); Fakhrian v. Google Inc., 2016 WL 1650705 (Cal. App. Ct. 2016); Despot v. Baltimore Life Insurance Co., 2016 WL 4148085 (W.D. Pa. 2016); Manchanda v. Google, Inc., 2016 WL 6806250 (S.D.N.Y. 2016).

  119.

    Case C-18/18 Eva Glawischnig-Piesczek v Facebook Ireland Limited, Judgment of 3 October 2019, ECLI:EU:C:2019:821. For a great summary of the case and references to the primary sources, see Columbia Global Freedom of Expression, https://globalfreedomofexpression.columbia.edu/cases/glawischnig-piesczek-v-facebook-ireland-limited/; For a critique of the case, see Keller (2019).

  120.

    Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000 on certain legal aspects of information society services, in particular electronic commerce, in the Internal Market (Directive on electronic commerce), OJ L [2000] 178/1.

  121.

    See in particular Article 14 E-Commerce Directive.

  122.

    Austrian Supreme Court, ORF/Facebook, Judgment 4Ob36/20b of 30 March 2020.

  123.

    No such request was involved in the case at issue.

  124.

    Articles 12–14 E-Commerce Directive.

  125.

    Article 15 E-Commerce Directive. In Scarlet v SABAM (Case C-70/10, Scarlet Extended SA v Société belge des auteurs, compositeurs et éditeurs SCRL (SABAM), ECLI:EU:C:2011:771), the Belgian collecting society SABAM applied for a permanent order requiring a network access provider to monitor and block peer-to-peer transmission of music files from SABAM’s catalogue. The CJEU decided that a broad order of the type requested would go against both the prohibition of general monitoring obligations of the E-Commerce Directive and the fundamental rights of Internet users to the protection of their personal data and freedom of expression guaranteed under the EU Charter of Fundamental Rights. See also Case C-360/10, Belgische Vereniging van Auteurs, Componisten en Uitgevers CVBA (SABAM) v Netlog NV, ECLI:EU:C:2012:85. Specific monitoring obligations have, however, been found not to violate Article 15 E-Commerce Directive (see De Streel et al. (2021)).

  126.

    Article 16 E-Commerce Directive.

  127.

    Article 15(2) E-Commerce Directive.

  128.

    European Commission (2018); see also European Commission (2017).

  129.

    European Commission (2018), ibid., at points 5–17.

  130.

    European Commission (2018), ibid., at points 16–21.

  131.

    European Commission (2018), at points 22–28. For an evaluation of the rules, see de Streel et al. (2021), at pp. 22–23.

  132.

    Council Directive 89/552/EEC of 3 October 1989 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the pursuit of television broadcasting activities, OJ L [1989] 298/23.

  133.

    Directive 2010/13 of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audio-visual media services (Audio-Visual Media Services Directive), OJ L [2010] 95/1, as amended by Directive 2018/1808, OJ L [2018] 303/69 [hereinafter AVMSD].

  134.

    Article 1(1aa) AVMSD defines the “video-sharing platform service” as “a service as defined by Articles 56 and 57 TFEU, where the principal purpose of the service or of a dissociable section thereof or an essential functionality of the service is devoted to providing programmes, user-generated videos, or both, to the general public, for which the video-sharing platform provider does not have editorial responsibility, in order to inform, entertain or educate, by means of electronic communications networks [...] and the organisation of which is determined by the video-sharing platform provider, including by automatic means or algorithms in particular by displaying, tagging and sequencing.”; see also European Commission (2020).

  135.

    Article 28b (1b) and (1c) AVMSD.

  136.

    Article 28b (1a) AVMSD.

  137.

    Article 28b(3) AVMSD. The AVMSD lists certain appropriate measures, such as transparent and user-friendly mechanisms to report and flag content, and easy-to-use and effective procedures for the handling and resolution of users’ complaints.

  138.

    Directive 2017/541 of the European Parliament and of the Council of 15 March 2017 on combating terrorism and replacing Council Framework Decision 2002/475/JHA and amending Council Decision 2005/671/JHA, OJ L [2017] 88/6.

  139.

    Directive 2011/93/EU of the European Parliament and of the Council of 13 December 2011 on combating the sexual abuse and sexual exploitation of children and child pornography, and replacing Council Framework Decision 2004/68/JHA, OJ L [2011] 335/1. For an analysis of both documents, see e.g. de Streel et al. (2021), at pp. 25–29.

  140.

    The Code was signed in 2016 by Facebook, Microsoft, Twitter and YouTube. Google+, Instagram, Dailymotion, Snapchat and Jeuxvideo.com joined subsequently. The Code’s text is available at: https://ec.europa.eu/info/policies/justice-and-fundamental-rights/combatting-discrimination/racism-and-xenophobia/eu-code-conduct-countering-illegal-hate-speech-online_en (last access 23 February 2022)

  141.

    The Code was signed by Facebook, Google, Twitter and Mozilla, as well as by advertisers and parts of the advertising industry, in October 2018; Microsoft joined in May 2019, while TikTok became a signatory in June 2020. The Code’s text is available at: https://digital-strategy.ec.europa.eu/en/policies/code-practice-disinformation (last access 23 February 2022)

  142.

    The Code also includes an annex identifying best practices that signatories will apply to implement the Code’s commitments. For all documents, see https://ec.europa.eu/digital-single-market/en/news/code-practice-disinformation (last access 23 February 2022)

  143.

    European Commission (2021).

  144.

    The Commission’s assessment of the Code of Practice in 2020 revealed in particular inconsistent and incomplete application of the Code across platforms and Member States, limitations intrinsic to the self-regulatory nature of the Code, as well as gaps in the coverage of the Code’s commitments. The assessment also highlighted the lack of an appropriate monitoring mechanism, including key performance indicators, the lack of commitments on access to platforms’ data for research on disinformation, and limited participation from stakeholders, in particular from the advertising sector. See European Commission (2020a, 2021).

  145.

    European Commission (2020b).

  146.

    European Commission (2020c).

  147.

    The DSA defines very large online platforms in Article 25 as online platforms which provide their services to a number of average monthly active recipients of the service in the EU corresponding to 10% of the EU’s population (i.e. 45 million or more).

  148.

    For an overview of the new obligations depending on the type of platform, see https://ec.europa.eu/info/strategy/priorities-2019-2024/europe-fit-digital-age/digital-services-act-ensuring-safe-and-accountable-online-environment_en (last access 23 February 2022)

  149.

    For an overview of the different initiatives, see e.g. de Streel et al. (2021); Roudik et al. (2019).

  150.

    See e.g. Kaye (2017); Haupt (2021); see also Zurth (2021); Tworek and Leerssen (2019); Citron (2018).

  151.

    Gesetz zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken [NetzDG] [Network Enforcement Act], 1 September 2017 [BGBl. I] at 3352. The law entered into force on 1 January 2018.

  152.

    § 1(3) NetzDG, referring to §§ 86, 86a, 89a, 91, 100a, 111, 126, 129 to 129b, 130, 131, 140, 166, 184b, 185 to 187, 201a, 241 and 269 of the German Criminal Code.

  153.

    § 1(1) NetzDG. Platforms that offer original journalistic content, as well as email and messaging services, are not covered.

  154.

    The deadline may be extended if additional facts are necessary to determine the truthfulness of the information or if the social network hires an outside agency to perform the vetting process.

  155.

    § 3 paras 1 and 2 NetzDG.

  156.

    § 2 paras 1 and 2 NetzDG. The report has to be published in German in the Federal Gazette and on the website of the social media network one month after the end of each half-year period. The report must be easily identifiable, immediately accessible, and permanently available. It must include information on the general efforts to prevent illegal actions on the platform, a description of the complaint procedure, the number of complaints received, the number and qualifications of employees who are handling the complaints, the network’s association memberships, the number of times an external party has been used to decide the illegality of the content, the number of complaints that led to the content being deleted, the time it took to delete the content, and measures that were taken to inform the complainant and the member who posted the deleted content.

  157.

    § 4 NetzDG, in conjunction with Gesetz über Ordnungswidrigkeiten [OWiG] [Act on Regulatory Offences], 19 February 1987 [BGBl. I] at 602, as amended, § 30(2). The fine is imposed by the Federal Office of Justice upon a court decision; the decision of the court is final and binding on the Office.

  158.

    See e.g. Haupt (2021); also Guggenberger (2017); Nolte (2017).

  159.

    See e.g. Zurth (2021); see also the references listed in note 150 above.

  160.

    Gesetz zur Bekämpfung des Rechtsextremismus und der Hasskriminalität, 30 March 2021 [BGBl. I], at 441.

  161.

    Gesetz zur Änderung des Netzwerkdurchsetzungsgesetzes, 3 June 2021 [BGBl. I] at 1436.

  162.

    § 3d amended NetzDG.

  163.

    For an overview of the changes, see https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/DE/NetzDGAendG.html (last access 23 February 2022) (in German).

  164.

    Goldsmith and Woods (2020).

  165.

    Ibid.

  166.

    See e.g. Bhagwat (2021).

  167.

    Helberger et al. (2017), at p. 2; see also Finck (2018) (arguing for a co-regulatory model of platform regulation); Saurwein and Spencer-Smith (2020) (mapping the different governance approaches).

  168.

    See e.g., with regard to Russia, Moyakine and Tabachnik (2021); for a discussion of other cases, see Roudik et al. (2019).

  169.

    Daskal (2019), at p. 1605.

  170.

    Ibid.

References

  • Baines D, Elliott RJR (2020) Defining misinformation, disinformation and malinformation: an urgent need for clarity during the COVID-19 infodemic. University of Birmingham Department of Economics Discussion Papers 20-06


  • Balkin JM (2012) Free speech is a triangle. Columbia Law Rev 118:2011–2055


  • Balkin JM (2016) Information fiduciaries and the first amendment. Univ California Davis Law Rev 49:1183–1234


  • Balkin JM (2018) Free speech in the algorithmic society: big data, private governance, and new school speech regulation. Univ California Davis Law Rev 51:1149–1210


  • Balkin JM, Zittrain J (2016) A grand bargain to make tech companies trustworthy. The Atlantic, 3 October 2016


  • Bamberger KA, Lobel O (2018) Platform market power. Berkeley Technol Law J 32:1052–1092


  • Bayer J et al (2019) Disinformation and propaganda – impact on the functioning of the rule of law in the EU and its Member States, study for the European Parliament. European Parliament, Brussels


  • Benkler Y (2006) The wealth of networks: how social production transforms markets and freedom. Yale University Press, New Haven


  • Benkler Y et al (2018) Network propaganda: manipulation, disinformation, and radicalization in American politics. Oxford University Press, Oxford

    Book  Google Scholar 

  • Bhagwat A (2021) The law of Facebook. Univ California Davis Law Rev 54:2353–2403

    Google Scholar 

  • Bloch-Wehba H (2019) Global platform governance: private power in the shadow of the state. SMU Law Rev 73:27–80

    Google Scholar 

  • Bone T (2021) How content moderation may expose social media companies to greater defamation liability. Wash Univ Law Rev 98:937–963

    Google Scholar 

  • Bontcheva K, Posetti J (eds) (2020) Balancing act: countering digital disinformation while respecting freedom of expression, Broadband Commission research report on “Freedom of expression and addressing disinformation on the internet”. ITU/UNESCO, Geneva/Paris

    Google Scholar 

  • Bozdag E (2013) Bias in algorithmic filtering and personalization. Ethics Inf Technol 15:209–227

    Article  Google Scholar 

  • Brannon VC, Holmes EN (2021) Section 230: An Overview. Congressional Research Service Report R46751, 7 April 2021

    Google Scholar 

  • Brennen JS et al (2020) Types, sources, and claims of COVID-19 misinformation. Reuters Institute for the Study of Journalism, Oxford

    Google Scholar 

  • Burri M (2015) Public service broadcasting 3.0: legal design for the digital present. Routledge, London

    Google Scholar 

  • Burri M (2019a) Understanding the implications of big data and big data analytics for competition law: an attempt for a primer. In: Mathis K, Tor A (eds) New developments in competition behavioural law and economics. Springer, Berlin, pp 241–263

    Google Scholar 

  • Burri M (2019b) Discoverability of Local, National and Regional Content Online, A Thought Leadership Paper written for the Canadian Commission for UNESCO and Canadian Heritage, 7 February 2019

    Google Scholar 

  • Burri M (2021) Interfacing privacy and trade. Case West J Int Law 53:35–88

  • Burri M (2021) Approaches to digital trade and data flow regulation across jurisdictions: implications for the future ASEAN-EU agreement. Legal Issues Econ Integr 49 (2022)

  • Burri M, Zihlmann Z (2021) Intermediaries’ liability in light of the recent EU copyright reform. Indian J Intell Prop Law 11

  • Center for Countering Digital Hate (2020) Malgorithm: How Instagram’s Algorithm Publishes Misinformation and Hate to Millions during a Pandemic. Center for Countering Digital Hate, London

  • Center for Information Technology and Society – UC Santa Barbara (2019) A Brief History of Fake News. Center for Information Technology and Society, Santa Barbara

  • Chander A (2014) How law made Silicon Valley. Emory Law J 63:639–694

  • Chesney B, Citron DK (2019) Deep fakes: a looming challenge for privacy, democracy, and national security. Calif Law Rev 107:1753–1820

  • Citron DK (2018) Extremist speech, compelled conformity, and censorship creep. Notre Dame Law Rev 93:1035–1072

  • Citron DK, Wittes B (2017) The internet will not break: denying bad samaritans § 230 immunity. Fordham Law Rev 86:401–423

  • Cohen JE (2018) Law for the platform economy. Univ California Davis Law Rev 51:133–204

  • Dahlgren P (2005) The internet, public spheres, and political communication. Polit Commun 22:147–162

  • Daskal J (2016) The un-territoriality of data. Yale Law J 125:326–398

  • Daskal J (2019) Speech across borders. Va Law Rev 105:1605–1666

  • de Gregorio G (2020) Democratising online content moderation: a constitutional framework. Comput Law Secur Rev 36. https://doi.org/10.1016/j.clsr.2019.105374 (last access 23 February 2022)

  • de Schutter O (2014) International human rights law, 2nd edn. Cambridge University Press, Cambridge

  • de Streel A, Kuczerawy A, Ledger M (2021) Online platforms and services. In: Garzaniti L et al (eds) Electronic communications, audiovisual services and the internet. Sweet and Maxwell, London, pp 125–157

  • DeNardis L (2009) Protocol politics: the globalization of internet governance. MIT Press, Cambridge

  • Eichensehr KE (2015) The cyber-law of nations. Georgetown Law J 103:317–380

  • Eichensehr KE (2017) Data extraterritoriality. Texas Law Rev 95:145–160

  • Enders AM (2020) The Different Forms of COVID-19 Misinformation and Their Consequences, The Harvard Kennedy School (HKS) Misinformation Review

  • European Commission (2016) Online platforms and the digital single market, COM (2016) 288 final, 25 May 2016

  • European Commission (2017) Tackling Illegal Content Online. Towards an Enhanced Responsibility for Online Platforms, COM (2017) 555 final, 28 September 2017

  • European Commission (2018) Recommendation 2018/334 of 1 March 2018 on measures to effectively tackle illegal content online, OJ L [2018] 63/50

  • European Commission (2020a) Guidelines on the practical application of the essential functionality criterion of the definition of a “video-sharing platform service” under the Audiovisual Media Services Directive, OJ C [2020] 223/3, 7 July 2020

  • European Commission (2020b) Assessment of the Code of Practice on Disinformation: Achievements and Areas for Further Improvement, SWD (2020) 180, 10 September 2020

  • European Commission (2020c) Proposal for a Regulation of the European Parliament and the Council on a Single Market for Digital Services (Digital Services Act) and amending Directive 2000/31/EC, COM (2020) 825 final, 15 December 2020

  • European Commission (2020d) Communication on the European Democracy Action Plan, COM (2020) 790 final, 3 December 2020

  • European Commission (2021) Guidance on Strengthening the Code of Practice on Disinformation, COM (2021) 262 final, 26 May 2021

  • European Commission and the High Representative of the Union for Foreign Affairs and Security Policy (2020) Tackling COVID-19 Disinformation: Getting the Facts Right, JOIN (2020) 8 final, 10 June 2020

  • European Data Protection Supervisor (EDPS) (2014) Privacy and competitiveness in the age of big data

  • Ezrachi A, Stucke ME (2016) Virtual competition: the promise and perils of the algorithm-driven economy. Harvard University Press, Cambridge

  • Farber DA (2019) The first amendment: concepts and insights. Foundation Press, St. Paul, MN

  • Feldman NR, Sullivan KM (2019) First amendment law, 7th edn. West Academic, St. Paul, MN

  • Finck M (2018) Digital co-regulation: designing a supranational legal framework for the platform economy. Eur Law Rev 43:47–68

  • Frischmann BM (2012) Infrastructure: the social value of shared resources. Oxford University Press, Oxford

  • Frosio G (ed) (2020) Oxford handbook of online intermediary liability. Oxford University Press, Oxford

  • Goldman E (2019) Why section 230 is better than the first amendment. Notre Dame Law Rev Reflect 95:33–46

  • Goldman E (2020) An overview of the United States’ Section 230 internet immunity. In: Frosio G (ed) The Oxford handbook of online intermediary liability. Oxford University Press, Oxford, pp 155–171

  • Goldsmith J, Woods AK (2020) Internet Speech Will Never Go Back to Normal. The Atlantic, 26 April 2020

  • Goldsmith J, Wu T (2001) Who controls the internet? Illusions of a borderless world. Oxford University Press, Oxford

  • Goodman EP (2004) Media policy out of the box: content abundance, attention scarcity, and the failures of digital markets. Berkeley Technol Law J 19:1389–1472

  • Greenberg MH (2003) A return to Lilliput: the LICRA v. Yahoo! case and the regulation of online content in the world market. Berkeley Technol Law J 18:1191–1258

  • Guggenberger N (2017) Das Netzwerkdurchsetzungsgesetz – schön gedacht, schlecht gemacht. Zeitschrift für Rechtspolitik 2017:98–101

  • Haupt CE (2005) Regulating hate speech: damned if you do and damned if you don’t – lessons learned from comparing the German and U.S. approaches. Boston Univ Int Law J 23:300–335

  • Haupt CE (2020) Platforms as trustees: information fiduciaries and the value of analogy. Harv Law Rev Forum 134:34–41

  • Haupt CE (2021) Regulating speech online: free speech values in constitutional frames. Wash Univ Law Rev 99:751–786

  • Helberger N (2011) Diversity by design. J Inf Policy 1:441–469

  • Helberger N (2012) Exposure diversity as a policy goal. J Media Law 4:65–92

  • Helberger N, Kleinen-von Königlöw K, van der Noll R (2015) Regulating the new information intermediaries as gatekeepers of information diversity. Info 17:50–71

  • Helberger N, Pierson J, Poell T (2017) Governing online platforms: from contested to cooperative responsibility. Inf Soc. https://doi.org/10.1080/01972243.2017.1391913 (last access 23 February 2022)

  • Helberger N et al (2020) A freedom of expression perspective on AI in the media – with a special focus on editorial decision making on social media platforms and in the news media. Eur J Law Technol 11

  • Hoffman CP et al (2015) Diversity by choice: applying a social cognitive perspective to the role of public service media in the digital age. Int J Commun 9:1360–1381

  • Hoffman D, Bruening P, Carter S (2016) The right to obscurity: how we can implement the Google Spain decision. North Carolina J Law Technol 17:437–481

  • Ireton C, Posetti J (eds) (2018) Journalism, “Fake News” and disinformation. UNESCO, Paris

  • Kaye D (2017) How Europe’s New Internet Laws Threaten Freedom of Expression: Recent Regulations Risk Censoring Legitimate Content. Foreign Affairs, 18 December 2017

  • Kaye D (2019) Speech police: the global struggle to govern the internet. Columbia Global Reports, New York

  • Keller D (2019) Dolphins in the net: internet content filters and the Advocate General’s Glawischnig-Piesczek v. Facebook Ireland Opinion. Stanford Center for Internet and Society, 4 September 2019

  • Keller P (2011) European and international media law: liberal democracy, trade, and the new media. Oxford University Press, Oxford

  • Khan LM, Pozen DE (2019) A skeptical view of information fiduciaries. Harv Law Rev 133:497–541

  • Klonick K (2018) The new governors: the people, rules, and processes governing online speech. Harv Law Rev 131:1598–1670

  • Klonick K (2020) The Facebook oversight board: creating an independent institution to adjudicate online free expression. Yale Law J 129:2418–2499

  • Kosseff J (2019) The twenty-six words that created the internet. Cornell University Press, Ithaca

  • Knuutila A et al (2020) Covid-related misinformation on YouTube: the spread of misinformation videos on social media and the effectiveness of platform policies. Oxford Internet Institute, COMPROP Data Memo

  • Kreiss D, Mcgregor SC (2017) Technology firms shape political communication: the work of Microsoft, Facebook, Twitter, and Google with campaigns during the 2016 US presidential cycle. Polit Commun:1–23

  • Latzer M et al (2014) The economics of algorithmic selection on the internet. Media Change and Innovation Working Paper, pp 29–30

  • Lessig L (2009) Remix: making art and commerce thrive in the hybrid economy. Penguin, New York

  • Lobel O (2016) The law of the platform. Minnesota Law Rev 101:87–166

  • Mayer-Schönberger V, Cukier K (2013) Big data: a revolution that will transform how we live, work, and think. Eamon Dolan/Houghton Mifflin Harcourt, New York

  • McKelvey F, Hunt R (2019) Discoverability: toward a definition of content discovery through platforms. Social Media + Society, January/March, pp 1–15

  • Merriam-Webster (2021) The real story of “Fake News”. Available at: https://www.merriam-webster.com/words-at-play/the-realstory-of-fake-news (last access 23 February 2022)

  • Miel P, Farris R (2008) News and information as digital media come of age. The Berkman Center for Internet and Society, Cambridge

  • Moyakine E, Tabachnik A (2021) Struggling to strike the right balance between interests at stake: the “Yarovaya”, “Fake news” and “Disrespect” laws as examples of ill-conceived legislation in the age of modern technology. Comput Law Secur Rev 40. https://doi.org/10.1016/j.clsr.2020.105512 (last access 23 February 2022)

  • Napoli PM (2012) Persistent and emergent diversity policy concerns in an evolving media environment. In: Pager SA, Candeub A (eds) Transnational culture in the internet age. Edward Elgar, Cheltenham, pp 165–181

  • Napoli PM (2014) On automation in media industries: integrating algorithmic media production into media industries scholarship. Media Ind J 1:33–38

  • Napoli PM (2015) Social media and the public interest: governance of news platforms in the realm of individual and algorithmic gatekeepers. Telecommun Policy 39:751–760

  • Napoli PM et al (2018) Assessing local journalism: news deserts, journalism divides, and the determinants of the robustness of local news. News Measures Research Project, New Brunswick

  • Nielsen RK et al (2020) Communications in the coronavirus crisis: lessons for the second wave. Reuters Institute for the Study of Journalism, Oxford

  • Nolte G (2017) Hate-Speech, Fake-News, das Netzwerkdurchsetzungsgesetz und Vielfaltsicherung durch Suchmaschinen. Zeitschrift für Urheber- und Medienrecht 61:552–565

  • Nunziato DC (2019) The marketplace of ideas online. Notre Dame Law Rev 94:1519–1583

  • Oster J (2017) European and international media law. Cambridge University Press, Cambridge

  • Pariser E (2011) The filter bubble: what the internet is hiding from you. Viking, New York

  • Pollicino O (2021) Judicial protection of fundamental rights on the internet. Hart, Oxford

  • Poole S (2019) Before Trump: the real history of fake news. The Guardian, 22 November 2019. Available at: https://www.theguardian.com/books/2019/nov/22/factitious-taradiddle-dictionary-real-history-fake-news (last access 23 February 2022)

  • Posetti J, Matthews A (2020) A short guide to the history of “fake news” and disinformation. International Center for Journalists, Washington

  • Roudik P et al (2019) Initiatives to counter fake news in selected countries: Argentina, Brazil, Canada, China, Egypt, France, Germany, Israel, Japan, Kenya, Malaysia, Nicaragua, Russia, Sweden, United Kingdom. The Law Library of Congress, Washington, DC

  • Sag M (2018) Internet safe harbors and the transformation of copyright law. Notre Dame Law Rev 93:499–564

  • Saurwein F, Spencer-Smith C (2020) Combating disinformation on social media: multilevel governance and distributed accountability in Europe. Digit Journal 8:820–841

  • Saurwein F et al (2015) Governance of algorithms: options and limitations. Info 17:35–49

  • Soll J (2016) The long and brutal history of fake news. POLITICO Magazine, 18 December 2016. Available at: http://politi.co/2FaV5W9 (last access 23 February 2022)

  • Spindler G (2020) Copyright law and internet intermediaries liability. In: Synodinou T-E et al (eds) EU internet law in the digital era. Springer, Berlin, pp 3–25

  • Sunstein CR (2001) Echo chambers: Bush v. Gore, impeachment, and beyond. Princeton University Press, Princeton

  • Sunstein CR (2007) Republic.com 2.0. Princeton University Press, Princeton

  • Sunstein CR (2009) Going to extremes: how like minds unite and divide. Oxford University Press, Oxford

  • Sunstein CR, Vermeule A (2009) Conspiracy theories: causes and cures. J Polit Philos 17:202–227

  • Tandoc EC Jr, Lim ZW, Ling R (2018) Defining “Fake News”. Digit Journal 6:137–153

  • Tompros LW et al (2020) The constitutionality of criminalizing false speech made on social networking sites in a post-Alvarez, social media-obsessed world. Harv J Law Technol 31:66–109

  • Tourkochoriti I (2016) Speech, privacy and dignity in France and in the U.S.A.: a comparative analysis. Loyola LA Int Comp Law Rev 38:101–182

  • Tworek H, Leerssen P (2019) An Analysis of Germany’s NetzDG Law. Transatlantic High Level Working Group on Content Moderation Online and Freedom of Expression

  • UK House of Commons (2019) Disinformation and “fake news”: final report of the Digital, Culture, Media and Sport Committee

  • United Nations (UN) Special Rapporteur on Freedom of Opinion and Expression, the Organization for Security and Co-operation in Europe (OSCE) Representative on Freedom of the Media, the Organization of American States (OAS) Special Rapporteur on Freedom of Expression and the African Commission on Human and Peoples’ Rights (ACHPR) Special Rapporteur on Freedom of Expression and Access to Information (2017) Joint Declaration on Freedom of Expression and “Fake News”, Disinformation and Propaganda, FOM.GAL/3/17, 3 March 2017

  • van der Sloot B, Broeders D, Schrijvers E (eds) (2016) Exploring the boundaries of big data. Amsterdam University Press, Amsterdam

  • Wardle C (2019) Understanding information disorder. First Draft, London

  • Wechsler S (2015) The right to remember: the European convention on human rights and the right to be forgotten. Columbia J Law Soc Probl 49:135–165

  • Wendling M (2018) The (almost) complete history of “fake news”. BBC News, 22 January 2018. Available at: https://www.bbc.co.uk/news/blogs-trending-42724320 (last access 23 February 2022)

  • Whitt RS (2013) A deference to protocol: fashioning a three-dimensional public policy framework for the internet age. Cardozo Arts Entertain Law J 31:689–768

  • Whitt RS (2019) Old school goes online: exploring fiduciary obligations of loyalty and care in the digital platforms era. Santa Clara High Technol Law Rev 36:75–131

  • WHO (2020) Coronavirus disease 2019 (COVID-19). Situation Report 45, 5 March 2020. Available at: https://www.who.int/docs/default-source/coronaviruse/situation-reports/20200305-sitrep-45-covid-19.pdf?sfvrsn=ed2ba78b_4 (last access 23 February 2022)

  • WHO et al (2020) Managing the COVID-19 infodemic: promoting healthy behaviours and mitigating the harm from misinformation and disinformation. Joint Statement by WHO, UN, UNICEF, UNDP, UNESCO, UNAIDS, ITU, UN Global Pulse, and IFRC, 23 September 2020. Available at: https://www.who.int/news/item/23-09-2020-managing-the-covid-19-infodemic-promoting-healthy-behaviours-and-mitigating-the-harm-from-misinformation-and-disinformation (last access 23 February 2022)

  • Zurth P (2021) The German NetzDG as role model or cautionary tale? Implications for the debate on social media liability. Fordham Intellect Prop Media Entertain Law J 31:1084–1153

Author information

Corresponding author

Correspondence to Mira Burri.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Burri, M. (2022). Fake News in Times of Pandemic and Beyond: Enquiry into the Rationales for Regulating Information Platforms. In: Mathis, K., Tor, A. (eds) Law and Economics of the Coronavirus Crisis. Economic Analysis of Law in European Legal Scholarship, vol 13. Springer, Cham. https://doi.org/10.1007/978-3-030-95876-3_2

  • DOI: https://doi.org/10.1007/978-3-030-95876-3_2

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-95875-6

  • Online ISBN: 978-3-030-95876-3

  • eBook Packages: Law and Criminology (R0)
