Abstract
Minors are increasingly exposed to harmful content and must be especially protected from incitement to hatred, violence and terrorism, in particular through misinformation, in their development phase. The principal provisions relevant to the protection of minors are set forth in the Interstate Treaty on the Protection of Minors in the Media (Jugendmedienschutz-Staatsvertrag; JMStV), the Protection of Minors Act (JuSchG), and the Network Enforcement Act (NetzDG). The chief point is for the platforms, as content intermediaries, to be held more responsible than they traditionally have been. This paper examines what measures are imposed on platforms under the JMStV, JuSchG, and NetzDG, as well as how these may be structured going forward. In particular, the consequences for the protection of minors will be addressed.
1 Introduction
Minors can be exposed to violence-glorifying, sexualized and racist content on video-sharing platforms. They can also be influenced by untrue and polarizing information and experience hate speech. Furthermore, this may lead to the infringement of personality rights guaranteed by the German Constitution.Footnote 1 Such infringements, occurring on the internet rather than by ‘analogue’ means, are more difficult to prosecute due to the principle of anonymity still in place regarding the internet—see § 13 (6) of the German Telemedia Act (Telemediengesetz; TMG)Footnote 2—making it less likely that perpetrators will be held accountable for their actions. The German Federal Ministry of Justice currently has no plans to implement ‘real name’ requirements,Footnote 3 and the information-providing obligations of the platforms and portals on which such infringing material is distributed are extremely limited. While information can be requested about stored data on users who have committed certain criminal offenses and/or personal sphere rights infringements pursuant to § 14 (3) TMG,Footnote 4 this right of information is useless if the platform does not have the infringer’s name but only their IP address. The victim in a given case must involve the public prosecutor’s office in the hope that they will be able to determine the identity of the perpetrator(s), but in many instances, such cases are dropped. It is abundantly clear that there is an urgent need for more effective protection of minors on the internet, in view of, for example, increasing press coverage of teen suicides prompted by an infringement of personal sphere rights on the internet.Footnote 5
The Audiovisual Media Services Directive (AVMSD) of November 14, 2018 ((EU) 2018/1808)Footnote 6 is aimed at improving the protection of minors on video sharing platforms, partly in recognition of a changed risk situation for minors on the internet. Minors are increasingly exposed to harmful content and must be especially protected from incitement to hatred, violence and terrorism, in particular through misinformation, in their development phase. The principal provisions relevant to the protection of minors are set forth in the Interstate Treaty on the Protection of Minors in the Media (Jugendmedienschutz-Staatsvertrag; JMStV),Footnote 7 the Protection of Minors Act (JuSchG),Footnote 8 and the Network Enforcement Act (NetzDG). The chief point is for the platforms, as content intermediaries, to be held more responsible than they traditionally have been. A similar trend is observable in other areas of law, including copyright law, for which platforms will in the future bear liability as perpetrators of infringement rather than as mere contributors.Footnote 9 Irrespective thereof, general monitoring obligations are prohibited pursuant to Article 15 of the E-Commerce Directive (transposed in § 7 (2) of the TMG), and this prohibition is to remain in place after the drafting of the Digital Services Act. This paper examines what measures are imposed on platforms under the JMStV, JuSchG, and NetzDG, as well as how these may be structured going forward. In particular, the consequences for the protection of minors will be addressed.
2 Structure of this Paper
The paper begins with a detailed look at the changes brought about by a revision of the AVMS Directive (Sect. 3) before reviewing the corresponding measures used to implement these in national law (Sect. 4). These measures include amendments to the Interstate Treaty on the Protection of Minors in the Media (Sect. 4.1), the Protection of Minors Act (Sect. 4.2), and the Network Enforcement Act (Sect. 4.3). A summary of conclusions is then provided in the subsequent and final Sect. 5.
3 Amendments to the Audiovisual Media Services Directive
Video sharing platforms are subject to the regulatory regime established in the amended Audiovisual Media Services Directive (AVMSD; (EU) 2018/1808).Footnote 10 Requirements under the AVMSD and their implementation are discussed below.
3.1 Expanded Scope of Application
Amendments to the AVMSD have always involved expansions of scope. As technologies increasingly converged, the original scope covering classic television—thus the name “Television Directive”—was expanded via amendment to include non-linear services.Footnote 11 In the amended AVMSD, the scope of application, which is largely prescribed by the definitions of terms per Art. 1, has again been expanded.Footnote 12 An audiovisual media service within the meaning of Art. 1 a) i) AVMSD is now defined as a service whose main purpose or separable element is to provide content via electronic communication networks to the general public for informational, entertainment, or educational purposes under the editorial responsibility of a media service provider. The new part is that the AVMSD now applies to any provider of understandable, discrete video content that lacks direct reference to other content.Footnote 13 The scope of application has additionally been expanded to include video sharing platforms. The primary purpose or essential function of video sharing platforms is to make programming or user-generated video content electronically available to the general public, with the video sharing platform provider bearing no editorial responsibility. The operator of a video sharing platform solely determines the organization of the platform, not what content is available on it, see Art. 1 aa) AVMSD. Thus, in addition to major providers like YouTube, the AVMSD will now also apply to audiovisual content distributed by users on social media platforms like Facebook, or in separate sections of newspaper websites.Footnote 14 The expansion of the AVMSD to include video sharing platforms has most recently led to the addition of Chapter IXa with Art. 28a and 28b AVMSD. The geographic scope of application is extended under Art. 28a (1) and (2) AVMSD to include providers which have, or effectively have, a branch operation within the territory of an EU Member State, or when that company has a parent company, subsidiary, or corporate affiliate domiciled in a Member State.Footnote 15 Preventive protection measures are required under Art. 28b AVMSD, such as certain requirements for the protection of minors applicable to video sharing platform operators.Footnote 16
3.2 Amendments to Media Laws for the Protection of Minors
The former Art. 12Footnote 17 and Art. 27 AVMSD, which provided for separate regulation of television programming and on-demand services with graduated regulation levels for media protection of minors, were eliminated in the amended AVMSD. Art. 6a (1) AVMSD now requires all providers of audiovisual media services to establish media exposure/consumption barriers to content deemed deleterious to development. The law provides that audiovisual services that could harm minors’ physical, mental, or moral development may only be provided in a manner that ensures that minors will generally not be exposed to or consume such audio and image/video content. The measures implemented to this end include broadcast scheduling, age verification procedures, and other technical measures.
Platform operators’ responsibilities were also specifically regulated to implement protections against content that poses a danger to minors or incites hatred or violence (Art. 28b (1) AVMSD), as well as to comply with advertising requirements as per Art. 28b (2) AVMSD.Footnote 18 In particular, Art. 28b (3) AVMSD provides that platforms must enable their users to designate content as unsuitable for minors as per Art. 28b (1) AVMSD. Platforms themselves must have reporting systems in place for unsuitable content in accordance with the aforementioned paragraph 1 and provide systems for parental control enabling parents to keep such content out of their children’s accounts. Additionally, the AVMSD provides for the “set-up and operation of age verification systems” in these clauses.Footnote 19
4 Measures Required Under National Law
4.1 The Interstate Treaty on the Protection of Minors in the Media
The amended provisions of the AVMSD have been implemented in the similarly amended Interstate Treaty on the Protection of Minors in the Media (Jugendmedienschutz-Staatsvertrag; JMStV) in special regulations applicable to video sharing services. A key change is the new clause § 5a (1) of the amended JMStV, which now expressly obligates video sharing services to implement adequate measures to protect children and teens from content deleterious to their development irrespective of the obligations per § 4 and § 5 JMStV.Footnote 20 While the necessity of this detailed clause has been questioned, it does make clear that platforms can be held liable for non-proprietary content.Footnote 21 It must be noted that the obligations per the new § 5a JMStV “apply irrespective of the obligations per § 4 and § 5 JMStV”. Thus, these require commentary (Sects. 4.1.1 and 4.1.2) before addressing the latest amendments to the JMStV (Sect. 4.1.3).
General categorical distinction is made in the JMStV between “content endangering to minors” and “content deleterious to development”. The former is generally illegal to distribute under § 4 JMStV and, pursuant to § 4 (2) 2nd st. JMStV, can only be made accessible in telemedia to a closed user group with robust mandatory age verification. The latter is, however, fundamentally legal to distribute, although providers are obliged under § 5 JMStV to prevent minors from consuming such content “under normal circumstances”.Footnote 22
4.1.1 Protection Against Content Endangering to Minors
The list of “absolutely prohibited content” per § 4 (1) 1st st. JMStV remains unchanged. Such content is generally prohibited in both broadcasting and telemedia. The list is intended to ensure the upholding of human dignity and prevent sexual abuse by banning child pornography and similar or related endangering content. In particular, the list of “absolutely prohibited content” serves to establish that making content public in violation of legal norms for the protection of society constitutes a criminal act under media-specific protection of minors laws.Footnote 23 The administrative offenses per § 24 (1) no. 1 a–k JMStV represent such violations.Footnote 24
In contrast, the distribution of “content endangering to minors” per § 4 (2) 1st st. JMStV is only illegal under certain circumstances. The distribution of such content in telemedia is permitted, as per the 2nd st., if access is restricted to adult users.Footnote 25 Content endangering to minors, in particular ordinary pornography (i.e. pornography not involving the sexualization of children or other criminal content), may therefore be distributed via telemedia in exceptional cases despite the general prohibition.Footnote 26 The provider must ensure that such content is only accessible within “closed user groups”.Footnote 27 The existence of such a user group is ensured by having a reliable age verification system in place that requires personal identification, although under the JMStV there are no officially prescribed recognition rules.Footnote 28 There is the possibility, however, of the Commission on the Protection of Minors (Kommission für Jugendmedienschutz; KJM) assessing the merits of a given age verification system with reference to the set of criteria the Commission has outlined.Footnote 29 Solution concepts based thereupon primarily utilize video calls for identification purposes or draw upon successful identity verificationFootnote 30 carried out elsewhere, such as when opening a bank or savings account.Footnote 31 In a recent development, age verification providers have even integrated autoident technology into their systems.Footnote 32 This technology enables user identification by automatically cross-referencing a photo against biometric and other data stemming from an identifying document.Footnote 33
4.1.2 Protections Against Content Deleterious to Development
Conceptual distinction must be made between content “endangering to minors” and content “deleterious to the development of minors”. Fundamentally, the distribution of content deleterious to development is legal, but pursuant to § 5 (1) 1st st. JMStV, providers of such content must ensure that children and youth of the concerned age levels will generally not be exposed to it.Footnote 34 Content deleterious to the development of minors is content that can lead to dysfunction through overstimulation or other excessive stressors; lack of socio-ethical orientation, such as by confusing fiction and reality; or impairment of the maturation of children and youth into responsible adults.Footnote 35 In implementing the mandatory controls limiting access to such content, providers must take into account which ages are concerned or would thereby be affected.Footnote 36 The provider’s obligations extend solely to ensuring that minors of the concerned age levels will “generally not be exposed” to content deleterious to their development; there is no requirement that access to such content be rendered completely impossible.Footnote 37 Access need not be prevented entirely, only made substantially more difficult. On television, this is implemented by limiting broadcasts to certain times, e.g. late evening hours. Due to the ubiquitous access to content on video sharing platforms, finding technical solutions is increasingly challenging. Providers of such platforms can fulfill these requirements by marking their content as relevant for the protection of minors so that filtering software can recognize it. Such software can be installed by parents and helps determine which content is suitable for which age group.
The filtering software screens internet content and displays only suitable material.Footnote 38 Apart from that, providers may specify time limits and/or implement other technical measures.Footnote 39 Accordingly, the requirements per § 5 JMStV are less stringent than the requirement of enforcing a closed user group with an age verification procedure as per § 4 (2) 2nd st. JMStV.Footnote 40
4.1.3 Expanded Media Protection for Minors
The increasing popularity of video sharing services, among minors in particular, led to the amendments to § 5a and § 5b JMStV, implementing Article 28b AVMSD,Footnote 41 which are the primary changes to the law. Art. 28b AVMSD is the central norm to implement protection for minors against harmful content or content that incites hatred or violence (see Sect. 3.2). The overall scope of the JMStV was also expanded, including particularly its geographic scope of application, which is of key importance for platform operators.
4.1.3.1 Applicability to Foreign Providers
The new § 2 (1) 2nd st. of the amended JMStV clarifies that the provisions of the JMStV likewise apply to providers based outside of GermanyFootnote 42 if they host content intended for use in Germany.Footnote 43 This was implemented to enhance regulators’ ability to enforce the law against foreign providers, among other objectives.Footnote 44 Foreign providers are thus now required to appoint a domestic authorized recipient of correspondence under § 21 (2) of the amended JMStV. However, the expansion of the scope of the JMStV to include foreign providers is subject to the limits of the ‘country of origin’ principle. This became evident through the amendment of a provision to insert an express reference to compliance with the country of origin principle pursuant to a resolution adopted by the Conference of Minister Presidents (Ministerpräsidentenkonferenz; MPK).Footnote 45 The country of origin principle proceeding from Art. 3 of the e-Commerce Directive (ECD), and implemented in § 3 (2) TMG, means that the free movement of telemedia services that are offered or provided in Germany by service providers domiciled in another state subject to the ECD may not be restricted. The effects of the exceptions per Art. 3 (4)–(6) ECD and § 3 (5) and (6) TMG must be specifically considered on a case-by-case basis. But in actual practice, regulatory measures are only allowed in a few individual cases within the framework of these stringent exceptions, and the JMStV is virtually inapplicable to providers domiciled in other EU countries.Footnote 46
4.1.3.2 Expanded Measures for Video Sharing Service Providers
In addition to the familiar pre-existing methods, the suitable measures for protection against content deleterious to minors’ development proposed for providers of video sharing services under § 5a (2) of the amended JMStVFootnote 47 include implementing and operating age verification procedures and systems that allow parents to control access to content. Using the term “age verification” when referring to less stringent procedures for content that is merely deleterious to development can be confusing, as it is also used to refer to the systems employed to create closed user groups for content endangering to minors as per § 4 (2) 2nd st. JMStV, which are distinct in that they are complex and non-circumventable.Footnote 48 The Official Unified Declaration of German States clarifies that the term “age verification procedures” per § 5a JMStV also refers to procedures that establish age group classification as well as to those that create closed user groups.Footnote 49 Pursuant to § 5a (2) 2nd st. JMStV as amended, a user feedback/rating system must also be implemented for monitoring the effectiveness of such procedures in place with video sharing services.
4.1.3.3 Codified ‘Notice and Take Down’ Procedure
The newly inserted § 5b of the amended JMStV governs the determination of the illegality of content for purposes of §§ 10a ff. of the TMG. Pursuant to § 10a (1) TMG, video sharing platform providers are obligated to have reporting procedures in place that enable users to electronically file complaints about illegal content being made available on their respective platform. Whether content is illegal is determined by § 4 and § 5 JMStV, which establish that content deleterious to the development of minors is only illegal if made available to the general public and if the video sharing service provider has not fulfilled the obligations per § 5 (1), (3)–(5) JMStV, see § 5b nos. 1 and 2 JMStV as amended.
These provisions codify a ‘notice and take down’ procedureFootnote 50 for video sharing service providers. Established under § 10 TMG, the fundamental rule is that “host providers”,Footnote 51 a term that includes video sharing providers like YouTube,Footnote 52 are not liable for third-party data which they save on a user’s behalf. This exemption applies, however, only if they either have no knowledge of the illegal act/data (and further, if damages are claimed, are unaware of any facts or circumstances which render the illegal act/content obvious) or have taken action to remove or restrict access to such content immediately after becoming aware of it. Host providers do not in any case have preventive review obligations, i.e. obligations to monitor or investigate activities, pursuant to § 7 (2) 1st st. TMG. However, under § 10a and § 10b TMG in conjunction with § 5b JMStV as amended, video sharing platform providers are now expressly obligated to set up a reporting procedure which is easily recognizable as such, easy to use, directly accessible, and continuously available, as well as to review user reports to ascertain whether content violates media laws concerning the protection of minors. The competent state media authority is responsible for monitoring compliance with these newly created regulations under § 14 (1) of the amended JMStV.
4.1.3.4 Self-Regulation Mechanisms for Social Media
Self-regulation mechanisms supplement the systems/procedures for the protection of minors required by law. Pursuant to § 7 (1) JMStV, the telemedia provider must appoint a Protection of Minors Officer if the provider’s platform is publicly accessible and contains content that is endangering to minors or deleterious to their development. Additionally, voluntary self-regulation panels exist, which are recognized by the KJM in accordance with § 16 2nd st. no. 2 and § 19 JMStV. Lastly, the terms of use of social media networks also provide for the protection of minors, such as the Facebook ban on nudity and pornography.Footnote 53
4.2 Protection of Minors Act
Under § 24a of the Protection of Minors Act (Jugendschutzgesetz; JuSchG), service providers which save or provide third-party content for users with a profit motive must take appropriate and effective structural precautionary measures, irrespective of § 10 TMG, to ensure that the protective objectives per § 10a nos. 1–3 JuSchG are upheld. The protective objectives per § 10a JuSchG are as follows: (1) to afford protection from media likely deleterious to the development of minors and/or to their maturation into responsible, socially adequate adults (media deleterious to development); (2) to afford protection from media endangering the development of minors and/or their maturation into responsible, socially adequate adults (media endangering to minors); (3) to protect the personal integrity of minors as media users; and (4) to provide orientation for children, youth, parents/guardians, and educators regarding media usage and literacy.
General monitoring obligations are prohibited per Art. 15 ECD and § 7 (2) TMG; “structural precautionary measures” therefore do not entail reviewing the content of media available on the platform. The draft Digital Services Act provides that, going forward at any rate, such voluntary reviewing of content will not result in platforms which merely make third-party content available losing their exemption from liability. In the future, such host providers will only be liable for content made available, even if content is checked in advance, once they become aware of a specific legal violation (see Art. 5 and 6 DSA as proposed).
In the explanatory memorandum to the law, the term “structural precautionary measures” per § 24a JuSchG is outlined to mean:
“the structuring of a service/offering so as to facilitate the protection of the personal integrity of minors, their protection against exposure to content of a deleterious or endangering nature, and their ability to take steps accordingly on their own behalf.”Footnote 54
A list is provided specifying a range of measures which may be appropriate given the technical features and the terms of use of the respective service or offering, as well as the content and/or structuring thereof. However, measures cannot be imposed upon platforms that would create excessive hardship, for constitutionality reasons among others. Additionally, the ‘regulatory triangle’ concept applies to platform regulation, according to which the rights and interests of content providers, platform users (minors in this case), and platforms themselves are to be appropriately weighed.Footnote 55 It is thus proper and important that each case be considered individually, as the law prescribes. The listed measures include reporting and complaint systems, classification schemes for user-generated content, age verification procedures, information on where to obtain advice and assistance and where to report issues to an entity independent of the provider, technical means provided to parents and guardians for controlling and monitoring content usage, and terms of use suitable for the protection of minors.
These measures are indicative of a trend toward platform regulation predominantly through design obligations, i.e. requiring the operator to structure and organize the platform in a child-friendly manner (‘child protection by design’).Footnote 56 If it is found in a given case that these requirements are not met, a dialogue with regulators first takes place aimed at improving the content offered and the platform design. Only if the issues remain unresolved will specific prevention measures be ordered. Failure to comply with such orders is punishable by a fine of up to five million euros, see § 28 (3) no. 4 and (5) 1st st. JuSchG. The regulator is the Federal Review Board for Media Harmful to Minors, which is to be reorganized as the Federal Center for Media Protection for Minors. The law applies equally to service providers not domiciled in Germany pursuant to § 24a (4) JuSchG.
4.3 Network Enforcement Act
While the Network Enforcement Act (Netzwerkdurchsetzungsgesetz; NetzDG) is not explicitly focused on the protection of minors, it is intended to afford protections to affected parties, including minors. The regulations it sets forth originally applied solely to social media networks per § 1 (1) NetzDG, but the scope of the law is being expanded to include video sharing platforms as part of implementation of the AVMSD. An outline of the fundamental provisions of the NetzDG is first provided below before discussing the regulations governing video sharing platforms in regard to their applicability requirements and legal ramifications.
4.3.1 The Regulatory Framework of the Network Enforcement Act
Social media networks per § 1 (1) NetzDG that have more than two million registered users in Germany are subject to specific reporting obligations (§ 2) and erasure obligations (§ 3), under penalty of fines (§ 4), pursuant to the NetzDG.Footnote 57 Under § 3 NetzDG, social media network providers are required to have a process in place for filing complaints about illegal content that is “easily recognizable as such, directly accessible, and available at all times.”Footnote 58 Illegal content within the meaning of § 3 (1) NetzDG is defined under § 1 (3) NetzDG and includes content of an insulting nature per §§ 185 ff. of the German Penal Code (StGB), of a threatening nature per § 241 StGB, or of a nature violating an individual’s intimate personal sphere per § 201a StGB. Whether an offense is culpably committed is legally irrelevant.Footnote 59
Social media network operators were already required to have complaint management processes in place prior to enactment of the NetzDG for breach of privacy issues by virtue of the blog entry procedure established by the German Federal Court of Justice (BGH).Footnote 60 However, not every case in which the criteria are met for the above offenses contains a legal breach of privacy, nor is every breach of privacy criminally relevant. Even for the overlap between offenses covered by the NetzDG and illegal breaches of privacy, the process now has to ensure that the social media network provider promptly registers the complaint and reviews whether the content reported in the complaint is illegal and has to be removed or access to it restricted.Footnote 61 In some cases, other deadlines apply for breach of privacy incidents. However, § 10 TMG already requires action to be taken promptly upon receiving notification as a fundamental principle, and EU Member States are not able to attach further nuance to that principle.Footnote 62 Any illegal content has to be promptly removed or access to it restricted, generally within seven days of receipt of the complaint.Footnote 63 Furthermore, social media network providers must remove content which is obviously illegal or restrict access to it within 24 hours of receiving the complaint. Thus, for obviously illegal content, a shorter period of time is given. This is conditional upon the complaint in question stating sufficiently specific information.Footnote 64 The applicable deadline is extended accordingly if the social media network provider forwards the matter to a recognized regulated self-regulation panel per § 3 (6)–(8) NetzDGFootnote 65 to make a decision regarding illegality, accepting the panel’s decision as binding.
The legal position of such panels and the requirements they are subject to remain unclear, however.Footnote 66 Enactment of the NetzDG did not result in significant changes to the complaint management procedure employed by social media network providers. Social media networks still primarily review content with reference to their community standards; the specific review required under NetzDG is implemented as a downstream step.Footnote 67 There is still no final clarity on whether community standards that differ from fundamental legal requirements are in any way valid.
Violations of the NetzDG are punishable by fine as per § 4 (1) no. 2. If the fine-imposing authority intends to base its decision on the illegality of content, a court decision on the illegality must be obtained first (see § 4 (5) NetzDG in conjunction with § 68 (1) of the Administrative Offenses Act (Ordnungswidrigkeitengesetz; OWiG)). The competent court is the court of jurisdiction in the district where the administrative authority (the Federal Office of Justice) is based, i.e. Bonn Local Court.Footnote 68 While the local court decision is binding and cannot be challenged,Footnote 69 the Federal Office of Justice’s decision, as fine-imposing authority that draws on the local court’s decision, can be contested by filing an objection.Footnote 70 Some consider this obligation to obtain a decision from Bonn Local Court regarding the illegality of specific content before a fine can even be imposed to be alien to the legal system.Footnote 71 In any event, there would likely be agreement that if Bonn Local Court was demonstrably overloaded, involving other courts in the matter should be considered.Footnote 72
Pursuant to § 2 (1) NetzDG, a semi-annual complaint handling report must be posted on the provider’s website and in the Federal Gazette. Pursuant to § 5 NetzDG, a permanent contact person in Germany who is easily identifiable as such, as well as an authorized recipient of served documents to facilitate legal enforcement, must be named.
4.3.2 Amendment of the Network Enforcement Act
The Network Enforcement Act (Netzwerkdurchsetzungsgesetz; NetzDG) is currently being altered in two ways: first, by the April 2021Footnote 73 enactment of the Act against Right-Wing Extremism and Hate Crimes,Footnote 74 and second, by an amendment to the NetzDG itself. The explanations outlined below, which particularly concern heightened obligations for platforms and their impact on platform design, proceed mostly from the latter of these two legislative acts.Footnote 75
In addition to heightened reporting obligations, the amendment provides for supplementation of the complaint handling procedure per § 3 NetzDG as follows:
- an obligation to promptly notify users when a complaint is received over content stored for them
- an obligation to retain removed content for a period of ten weeks for evidentiary purposes
- an obligation to inform the complainant and the user concerned of the corresponding decisions made
- a legal basis for forwarding data to a recognized regulated self-regulation organization
- provisions for establishing regulated self-regulation
Platform design is addressed as well, with the social media network provider being obligated, for example, to implement a process for reporting to the Federal Criminal Police Office, see § 3a NetzDG.Footnote 76 The provider must additionally ensure that an easily identifiable procedure is in place for contacting the provider (see § 3b (1) 3rd st. NetzDG). A remonstrance procedure is being introduced that enables users whose content has been deleted to protest its deletion (see § 3b NetzDG). An arbitration procedure is likewise introduced under § 3c NetzDG. There are also rudimentary supplemental provisions regarding fines under § 4 NetzDG, and a supervisory authority is established under § 4a NetzDG. The responsibilities of the named domestic authorized recipient of served documents are supplemented under § 5 NetzDG, and the transition period provisions are supplemented under § 6 NetzDG. Lastly, video sharing platforms are placed within the scope of the NetzDG under § 3d–§ 3f NetzDG, implementing Art. 28b AVMSD. Except as otherwise provided under § 3e (2) and (3), the NetzDG applies to video sharing platforms. The NetzDG does not address minors in particular. Thus, the NetzDG does not state an obligation to design specific minor-friendly procedures; the procedures must only be identifiable for the average user. Still, the amendment of the NetzDG might have positive effects that benefit minors, such as the deletion of harmful content and the comprehensive and practical implementation of the procedures mentioned.
4.3.3 Provisions Regulating Video Sharing Platforms
By virtue of § 3e (1) NetzDG, video sharing platforms fall fundamentally and entirely within the framework of the NetzDG, thereby obtaining the same status as social media networks. The term “video sharing platform service” is defined under § 3d (1) NetzDG. The conceptual content of this term and of the other definitions conforms with the requirements of the AVMSD.Footnote 77 The scope of application of the NetzDG is thus expanded with regard to the provisions of the AVMSD governing the removal of illegal content.Footnote 78 Certain video sharing platforms already met the definition of a social media network per § 1 (1) NetzDG, since that definition covers all forms of communication.Footnote 79 The expansion in scope, however, affects video sharing platforms that specialize in the exclusive distribution of specific content, such as the publication of scenes from a computer game.Footnote 80 Such content is of particular interest to minors and may be harmful due to potential violence or sexualization in computer games. The content regulated by the NetzDG, both in general and after the amendment, is thus of special relevance to children. Here, it must be kept in mind that a social media network under the definition per § 1 (1) NetzDG is a platform designed for any content, rather than for specific content. Because of its orientation around the requirements of the AVMSD, the NetzDG does not apply in exactly the same way to social media networks and video sharing platforms; there are differences in the details. The discussion below therefore focuses first on the differences in how the NetzDG applies to social media networks versus video sharing platforms before turning to the legal ramifications for the protection of minors.
4.3.3.1 Applicability of the Network Enforcement Act
The applicability of the NetzDG to video sharing platforms is limited under § 3e (2) and (3). These limitations may apply to smaller video sharing platforms with fewer than two million registered users in Germany (sub-Sect. 4.3.3.1.1.) and to video sharing platforms domiciled outside Germany in another EU Member State (sub-Sect. 4.3.3.1.2.).
4.3.3.1.1 Limited Applicability to Smaller Video Sharing Platforms in Germany
Pursuant to § 1 (2) NetzDG, social media networks below a certain size, i.e. those with fewer than two million registered users in Germany, are exempt from all requirements under § 2–§ 3b NetzDG. Smaller platforms are to be relieved of the excessive burden that compliance with the NetzDG would impose.Footnote 81 For smaller video sharing platforms, in contrast, the requirements of the AVMSD still fundamentally apply:Footnote 82 video sharing platform providers domiciled in Germany with fewer than two million registered users in Germany are not exempt from all obligations under the NetzDG, pursuant to § 3e (2) 3rd st.
Instead, alleged illegal content on smaller video sharing platforms is assessed by distinct categories. Certain content from the general list per § 1 (3) NetzDG is referenced under § 3e (2) 2nd st. NetzDG, namely user-generated videos and broadcasts that constitute criminal acts under §§ 111, 130 (1) and (2), 131, 140, 166, and 184b in conjunction with § 184d of the Criminal Code (Strafgesetzbuch; StGB). These are categories of content already covered by the NetzDG that are also regulated under the AVMSD. Where such specifically listed illegal content is concerned, the requirements of the NetzDG still apply to smaller video sharing platforms as well, though to a lesser extent: such platforms are exempt from the reporting obligations per § 2 NetzDG and the notification obligations per § 3a NetzDG. The complaint procedure per § 3 (1) NetzDG is essentially limited to the deletion of obviously illegal content. There is neither a review deadline for ‘regular’ illegal content, nor an obligation to retain removed content, nor any specific requirements regarding in-house monitoring of complaint handling. A remonstrance procedure pursuant to § 3b NetzDG is required, however.
Regarding content not relevant to the criminal offenses specified under § 3e (2) 2nd st. NetzDG, smaller video sharing platforms are likewise exempted from all obligations per § 2–§ 3b NetzDG, in accordance with § 1 (2) NetzDG. Such content, which for small video sharing platforms thus does not trigger any obligations under the NetzDG, includes, for example, content illegal for reasons concerning the preservation of the democratic and constitutional state (§§ 86, 86a StGB), preservation of public order (§§ 126, 129–129b StGB), and protection of personal dignity (§§ 185–187 StGB).
Thus, while smaller video sharing platforms enjoy privileges compared to larger video sharing platforms with respect to liability for illegal content and related obligations, such privileging falls short of the exemption for smaller social media networks.
4.3.3.1.2 Limited Applicability to Video Sharing Platforms in Other EU Countries
Because the requirements of the AVMSD are directly referenced, the geographical scope of the NetzDG’s applicability to video sharing platforms is derived differently and with greater specificity. The AVMSD applies the country-of-origin principle.Footnote 83 The NetzDG retains this in § 3e (3): there is a fundamental reliance on the European level of protection being ensured by the Member State in which the video sharing platform provider is domiciled, or is effectively domiciled pursuant to § 3d (2) and (3) NetzDG.Footnote 84 Video sharing platforms that are either actually or effectively domiciled in Germany are therefore fully subject to the NetzDG. Video sharing platforms domiciled in another EU Member State are obligated to comply with the NetzDG only to a limited extent. Here, too, alleged illegal content must be assessed by distinct categories. There is no fundamental obligation under the NetzDG regarding content that falls within the above-referenced list per § 3e (2) 2nd st. NetzDG; such an obligation can only be triggered in specific cases by a special order of the Federal Office of Justice pursuant to § 3e (3) and § 4a NetzDG. Regarding all other content per § 1 (3) NetzDG, in contrast, the obligations under the NetzDG apply in full. This concerns a substantial amount of illegal content, because criminal breaches of dignity and privacy protections are a particularly significant focus within the framework of the NetzDG. After all, the most frequent reasons for content removal are violations of personal sphere rights or hate speech under community standards.Footnote 85 Breaches of dignity and privacy protections do not affect adults alone; they are of special relevance to minors because bullying and hate endanger their development.
This nuanced view regarding geographical applicability to video sharing platforms is distinct from the geographical scope of application of the NetzDG to social media networks. Social media networks have to fully comply with requirements under the NetzDG even if domiciled in another EU Member State.Footnote 86 This deviation from the country-of-origin principle remains a subject of criticism.Footnote 87
Accordingly, the distinction between social media networks and video sharing platforms in the NetzDG is highly important, with special relevance to smaller platforms with fewer than two million registered users in Germany and to those domiciled in other EU countries. Platforms that represent a hybrid of a video sharing platform and a social media network, or feature elements of both, will have to evaluate and independently decide whether they need to set up a complaint process per the NetzDG, and for what platform content. There is thus some concern that the differing obligations outlined for social media networks and video sharing platforms may complicate efforts to establish coherent and uniform complaint procedures and mechanisms for users.Footnote 88 This differentiation can lead to two types of issues and corresponding legal uncertainty. First, platforms must assess whether specific content relates primarily to the function of a social media network or to that of a video sharing platform. In the case of YouTube, for example, which is domiciled in Ireland, different conclusions can be reached.Footnote 89 In a second step, platforms may have to review whether the obligations under the NetzDG apply to the entire list of illegal content or only to parts of it. Smaller platforms will have to confront the same considerations.Footnote 90
4.3.3.2 Legal Ramifications
Once the few complicated questions regarding the applicability of the NetzDG to video sharing platforms have been resolved, the legal ramifications are clear. Under the NetzDG, video sharing platforms have the same status as social media networks except for the differences described: they fall within the regulatory framework of the NetzDG (see Sect. 4.3.1.) and have new obligations under the most recent amendment (see 4.3.2.). Only smaller video sharing platforms can be partially exempt from these obligations in exceptional cases (see 4.3.3.1.1.).
Video sharing platforms are further characterized by two particularities. Regarding illegal content per § 3e (2) 2nd st. NetzDG, video sharing platform providers are obligated to have platform users contractually agree not to use the service for the content in question (see § 3e (4) NetzDG) and to monitor and ensure compliance with that agreement. This does not, however, amount to an obligation to proactively review content, as such mandatory monitoring is prohibited under § 10 TMG.Footnote 91 Again, there is no special regulation of contractual agreements with minors; whether and under what conditions minors can conclude a contract with the video sharing platform provider therefore depends on the contract law of the Member State. The second particularity is that video sharing platforms are subject to regulatory arbitration pursuant to § 3f NetzDG. The regulatory arbitration panel exists for the sole purpose of settling disputes with video sharing platform providers out of court. Social media network providers do not have this option, instead having only non-regulatory arbitration options organized under private law per § 3c NetzDG. However, regulatory arbitration is only an option for disputes with video sharing platform providers if they do not already participate in private-sector arbitration or if the recognized arbitration panel is not a private-sector organization (see § 3f (1) 3rd st. NetzDG). This possibility is thus subsidiary to arbitration organized under private law.
4.3.3.3 Legislative Overlap Between the Network Enforcement Act, the Interstate Treaty on the Protection of Minors in the Media, and the Protection of Minors Act
The discussed amendments mean that a ‘notice and takedown’ procedure for video sharing platforms will be provided under both the JMStV and the NetzDG. The two acts can overlap with regard to a great deal of content,Footnote 92 in particular prohibited content specifically illegal under § 4 (1) JMStV. For example, the content listed under § 4 (1) 1st st. nos. 1–6 and no. 10 JMStV also falls within the scope of § 1 (3) NetzDG.Footnote 93 Content deleterious to the development of minors per § 5 JMStV could also be concerned. Minors suffer particularly from violations of personal sphere rights on the internet, as such negative experiences can be deleterious to their development.Footnote 94 Deletion obligations may therefore exist in parallel under the JMStV in conjunction with the TMG, and under the NetzDG, respectively.
In such cases, the procedure per the NetzDG is seen as fundamentally more specific in nature and thus takes precedence, as is evident from the wording of § 10a and § 10b TMG.Footnote 95 The NetzDG likewise takes precedence in case of overlap with § 24a JuSchG (see § 24a (4) JuSchG), although further requirements under the JuSchG relating to the protective objectives outlined above then apply. Regarding possible overlap, however, especially between the JMStV and the NetzDG, there are concerns that the different administrative structures may lead to intersecting competencies and redundant structures, which could undermine efforts to implement practical and effective complaint mechanisms and thereby give rise to conflict.Footnote 96 Additionally, the Federal Office of Justice, as a regulatory authority of rapidly growing importance, could raise constitutional concerns in view of the principle of separation of state and public media, which requires that media supervision be relatively independent of government authorities.Footnote 97
5 Conclusions
Legislators have formulated significantly less stringent requirements for telemedia than for broadcasting where the protection of minors is concerned. In practice, there is no concept in place that is absolutely effective in protecting minors, in part because ubiquitous access to social media content via smartphones and tablets renders technical solutions increasingly difficult.Footnote 98 Legislators have now, for the first time, explicitly addressed video sharing service providers, i.e. platforms on which users post videos, specifically with respect to the protection of minors, imposing concrete obligations on them.
Laws enacted to implement the AVMSD had largely failed to provide either transparent, user-friendly mechanisms enabling users to report illegal content or age verification systems restricting access to content deleterious to the physical, mental, or moral development of minors.Footnote 99 These deficits are now being addressed through the separate clauses of §§ 5a ff. JMStV in conjunction with §§ 10a ff. TMG. It does seem suboptimal that the term “age verification” is used in § 5a (2) no. 1 of the amended JMStV. Its usage does, however, yield a mechanism for creating a detailed catalog of appropriate measures depending on the type of objectionable content, the harm it could cause, the defining characteristics of the category of persons to be protected, and the rights and legitimate interests concerned, as required under Art. 6a (1) AVMSD.
This tendency towards greater platform regulation is reflected in the development of the JuSchG and the planned amendments to the NetzDG. The JuSchG focuses on the creation of structural precautionary measures. As before, these are not to give rise to general review/monitoring obligations for video sharing platforms, with reference to the Digital Services Act among other considerations. Instead, a set of various measures is available that may include the implementation of registration, classification and age verification systems, minor-friendly terms and conditions, and advisories on external sources and contacts for information and advice. The ‘child protection by design’ obligations to be fulfilled are determined with a view to constitutionality considerations, applying the principle of appropriateness, possibly in coordination with the regulator within the framework of a regulatory dialogue procedure.
In addition to outlining specifics of and fleshing out the existing complaint procedure, such as additional notification and retention requirements, the draft amendment of the NetzDG also addresses platform design. For example, platforms must have a remonstrance procedure in place and provide uncomplicated channels for contacting them that are easily identifiable as such. The Federal Office of Justice, which in the past functioned solely as a fine-imposing authority, is also to play a greater supervisory role. Video sharing platforms will likewise be compelled to meet these requirements going forward, irrespective of whether any kind of content or only content of a specific nature may be shared on the platform. Smaller video sharing platforms and platforms domiciled in EU countries other than Germany are possible exceptions and will have to consider in detail which of their content falls within the scope of these obligations.
Notes
- 1.
Cf. Müller-Terpitz: Persönlichkeitsrechtliche Aspekte der Social Media, in: Hornung/Müller-Terpitz (eds.): Rechtshandbuch Social Media, pp. 253, 254 ff.
- 2.
Since enactment of the GDPR, the independent significance of § 13 (6) TMG (Telemediengesetz; Telemedia Act) has been in question; for a detailed discussion, cf. Keppeler: Was bleibt vom TMG-Datenschutz, MMR (Multimedia und Recht) 2015, pp. 779 ff. and Tinnefeld/Buchner, in: BeckOK-Datenschutz, Datenschutz in Medien und Telekommunikation, mn. 57; for a differing opinion, see Roßnagel/Geminn/Jandt/Richter: Datenschutzrecht 2016 “Smart” genug für die Zukunft.
- 3.
Wissenschaftliche Dienste des Bundestags: Klarnamenpflicht im Internet.
- 4.
Specifically concerned are those falling under § 1 (3) of the Network Enforcement Act (Netzwerkdurchsetzungsgesetz; NetzDG).
- 5.
See, for example, Spiegel: Erneut Selbstmord wegen Cyber-Mobbing.
- 6.
Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) regarding changing market conditions, ABl. 2018/L 303/69.
- 7.
As amended in the Interstate Treaty on the Modernized Media Regulation (Staatsvertrag zur Modernisierung der Medienordnung in Deutschland; MoModStV) dated April 14, 2020, enacted November 7, 2020.
- 8.
In the version of the Second Law amending the Protection of Minors Act; adopted March 26, 2021; announced April 9, 2021; entered into force May 1, 2021.
- 9.
Ultimately, this classification is less relevant than the specific duty of care obligations that are imposed on the platforms, cf. Specht-Riemenschneider/Hofmann: Verantwortung von Online-Plattformen, pp. 102 f.
- 10.
Recital 45, Directive (EU) 2018/1808 of the European Parliament and of the Council of 14 November 2018 amending Directive 2010/13/EU on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive) regarding changing market conditions, ABl. 2018/L 303/69.
- 11.
Kröber, in: BeckOK-RundfunkR, § 6 RStV mn. 38; Holznagel, in: Spindler/Schuster (eds.): Recht der elektronischen Medien, § 2 RStV mn. 3 ff.; Holznagel/Hartmann, in: Hoeren/Sieber/Holznagel (eds.): Handbuch Multimedia-Recht, Rundfunk und Telemedien, mn. 29 ff.
- 12.
Jäger: Die Novellierung der AVMD-RL, ZUM (Zeitschrift für Urheber- und Medienrecht) 2019, pp. 477, 478.
- 13.
Ibid.
- 14.
Hartmann: Welche Dienste zählen künftig zu den audiovisuellen Mediendiensten?, MMR (Multimedia und Recht) 2018, pp. 790, 792.
- 15.
The video platform YouTube provides a spectacular case regarding application of the law, because of which the law was presumably passed. While not domiciled in the EU, the company falls within the scope of Art. 28 a (2) AVMSD due to being a Google Group company, cf. Holznagel/Hartmann, in: Hoeren/Sieber/Holznagel (eds.): Handbuch Multimedia-Recht, Rundfunk und Telemedien, mn. 51.
- 16.
Liesching: Das Herkunftslandprinzip nach E-Commerce- und AVMD-Richtlinie, MMR-Beil. (Multimedia und Recht) (Appendix) 2020, pp. 3, 10.
- 17.
Directive 2010/13/EU of the European Parliament and of the Council of 10 March 2010 on the coordination of certain provisions laid down by law, regulation or administrative action in Member States concerning the provision of audiovisual media services (Audiovisual Media Services Directive), ABl. 2010/L 95/1.
- 18.
Gundel: Die Fortentwicklung der europäischen Medienregulierung, ZUM (Zeitschrift für Urheber- und Medienrecht) 2019, pp. 131, 132.
- 19.
Hilgert/Sümmermann: Technischer Jugendmedienschutz, MMR-Beil. (Multimedia und Recht) (Appendix) 2020, pp. 26, 30.
- 20.
Under the old Interstate Treaty on the Protection of Minors in the Media (Jugendmedienschutz-Staatsvertrag; JMStV), “broadcasting” and “telemedia” were the terms utilized; “video sharing services” were not mentioned as such.
- 21.
Hilgert: Novellierung des Jugendmedienschutz-Staatsvertrags.
- 22.
Hilgert/Sümmermann: Technischer Jugendmedienschutz, MMR-Beil. (Multimedia und Recht) (Appendix) 2020, pp. 26 ff.
- 23.
Müller-Terpitz: Persönlichkeitsrechtliche Aspekte der Social Media, in: Hornung/Müller-Terpitz (eds.): Rechtshandbuch Social Media, pp. 253, 293 f.
- 24.
Liesching, in: BeckOK JMStV, § 4 mn. 1.
- 25.
Liesching, in: BeckOK JMStV, § 4 mn. 16.
- 26.
Some question why the exception to the distribution ban should concern telemedia only, cf. Kaspar, in: Beck RundfunkR, § 4 JMStV mn. 81 a; Liesching: Verfassungskonformer Jugendschutz nach der Medienkonvergenz, MMR 2018, pp. 141 ff.
- 27.
Erdemir, in: Spindler/Schuster (eds.): Recht der elektronischen Medien, JMStV, § 4 mn. 147; Altenhain, in: Hoeren/Sieber/Holznagel (eds.): Handbuch Multimedia-Recht, Jugendschutz, mn. 64.
- 28.
Altenhain, in: Hoeren/Sieber/Holznagel (eds.): Handbuch Multimedia-Recht, Jugendschutz, mn. 64.; Kaspar, in: Beck RundfunkR, § 4 JMStV mn. 81 a.
- 29.
The amendments in the new § 11 (3) JMStV (Jugendmedienschutz-Staatsvertrag) now expressly state that the Kommission für Jugendmedienschutz (KJM) defines the suitability criteria for protection of minors software in consultation with the recognized voluntary self-regulation organizations in the form of guidelines—see also Kommission für Jugendmedienschutz: Bewertung von Konzepten für Altersverifikationssysteme.
- 30.
The German Federal Court of Justice (Bundesgerichtshof; BGH) has largely upheld earlier high court rulings, finding that legal criteria for the protection of minors are not met by pornographic internet content being made available to users after solely having to enter their personal identification or passport number, see BGH, decision from October 18, 2007 (I ZR 102/05), NJW (Neue Juristische Wochenschrift) 2008, pp. 1882, 1884 f.
- 31.
Hilgert/Sümmermann: Technischer Jugendmedienschutz, MMR-Beil. (Multimedia und Recht) (Appendix) 2020, pp. 26 f.; Altenhain, in: Hoeren/Sieber/Holznagel (eds.): Handbuch Multimedia-Recht, Jugendschutz, mn. 71 ff.
- 32.
Kommission für Jugendmedienschutz: Neue Methode für Altersverifikation.
- 33.
Ibid.
- 34.
Hilgert/Sümmermann: Technischer Jugendmedienschutz, MMR-Beil. (Multimedia und Recht) (Appendix) 2020, p. 26.
- 35.
Geidner, in: Beck RundfunkR, § 5 JMStV mn. 5.
- 36.
Ibid., mn. 6.
- 37.
Keller/Liesching, in: Hamburger Kommentar, Pflichten von Anbietern von Rundfunk und Telemedien, mn. 6.
- 38.
Such software must be recognized by the competent state media authority pursuant to § 11 (2) JMStV (Jugendmedienschutz-Staatsvertrag) through the agency of the Kommission für Jugendmedienschutz; one example of such software is the German JUSPROG, see www.jugendschutzprogramm.de.
- 39.
Liesching, in: BeckOK JMStV, § 5 mn. 9 ff.; regarding technical media protection measures for minors see Hilgert/Sümmermann: Technischer Jugendmedienschutz, MMR-Beil. (Multimedia und Recht) (Appendix) 2020, pp. 26, 27 ff.
- 40.
Erdemir, in: Spindler/Schuster (eds.): Recht der elektronischen Medien, JMStV, § 5 mn. 57.
- 41.
MoModStV dated April 14, 2020, Official Unified Declaration of the German Federal States on the Interstate Treaty on the Modernized Media Regulation (Staatsvertrag zur Modernisierung der Medienordnung in Deutschland), III. Argumentation for Article 3, Amendment to the Interstate Treaty on the Protection of Minors in the Media (Jugendmedienschutz-Staatsvertrag; JMStV), B. regarding point 4.
- 42.
Whereas the previous version of the Interstate Treaty on the Protection of Minors in the Media (Jugendmedienschutz-Staatsvertrag; JMStV) only asserted applicability to German providers.
- 43.
Such as those which employ the German language.
- 44.
MoModStV dated April 14, 2020, Official Unified Declaration of the German Federal States on the Interstate Treaty on the Modernized Media Regulation, III. Argumentation for Article 3, Amendment to the Interstate Treaty on the Protection of Minors in the Media (Jugendmedienschutz-Staatsvertrag; JMStV), B. regarding point 2. The extent is questioned to which there is compatibility with the ‘country of origin’ principle, cf. Hilgert: Novellierung des Jugendmedienschutz-Staatsvertrags.
- 45.
See also the detailed discussion in Liesching: Das Herkunftslandprinzip und seine Auswirkung, pp. 88 ff.
- 46.
According at any rate to ibid.
- 47.
I.e. broadcast time restriction, technical or other means, programming of effective protection of minors software in line with § 5 JMStV (Jugendmedienschutz-Staatsvertrag).
- 48.
Hilgert: Novellierung des Jugendmedienschutz-Staatsvertrags.
- 49.
MoModStV dated April 14, 2020, Official Unified Declaration of the German Federal States on the Interstate Treaty on the Modernized Media Regulation, III. Argumentation for Article 3, Amendment to the Interstate Treaty on the Protection of Minors in the Media (Jugendmedienschutz-Staatsvertrag; JMStV), B. regarding point 4.
- 50.
See Holznagel: Melde- und Abhilfeverfahren zur Beanstandung rechtswidrig gehosteter Inhalte, GRUR (Gewerblicher Rechtsschutz und Urheberrecht) Int. 2014, p. 105.
- 51.
A host provider saves third-party data for users on its own servers on a non-temporary basis.
- 52.
YouTube qualifies as a host provider, see German Federal Court of Justice (Bundesgerichtshof; BGH), decision from September 13, 2018 (I ZR 140/15), GRUR (Gewerblicher Rechtsschutz und Urheberrecht) 2018, p. 1132.
- 53.
Cf. Müller-Terpitz: Persönlichkeitsrechtliche Aspekte der Social Media, in: Hornung/Müller-Terpitz (eds.): Rechtshandbuch Social Media, pp. 253, 294 f.
- 54.
Justification for the Law, BT-Drs. 19/24909, p. 27.
- 55.
Specht-Riemenschneider/Hofmann: Nutzerrechte als Baustein einer fairen Plattformökonomie, NJW-aktuell (Neue Juristische Wochenschrift Aktuell) 2021, pp. 15 ff.
- 56.
Regarding the corresponding design obligations cf. Specht-Riemenschneider et al.: Stellungnahme des SVRV, pp. 67 f., 77 f.; Specht-Riemenschneider/Hofmann: Verantwortung von Online-Plattformen, p. 88.
- 57.
- 58.
Regarding the imposable fines, for a detailed discussion, see Guggenberger: Das Netzwerkdurchsetzungsgesetz, ZRP (Zeitschrift für Rechtspolitik) 2017, pp. 98, 99.
- 59.
For a detailed discussion, see Guggenberger: Das Netzwerkdurchsetzungsgesetz in der Anwendung, NJW (Neue Juristische Wochenschrift) 2017, pp. 2577, 2578 ff.
- 60.
BT-Drs. 18/12356, p. 20; thus the critical opinion of eco Verband der Internetwirtschaft e.V.: Stellungnahme zum Referentenentwurf, pp. 4 f.; Spindler: Rechtsdurchsetzung von Persönlichkeitsrechten, GRUR (Gewerblicher Rechtsschutz und Urheberrecht) 2018, pp. 365, 368; Holznagel: Das Compliance-System des Entwurfs des Netzwerkdurchsetzungsgesetzes, ZUM (Zeitschrift für Urheber- und Medienrecht) 2017, pp. 615, 620.
- 61.
German Federal Court of Justice (Bundesgerichtshof; BGH), decision from October 25, 2011 (VI ZR 93/10), GRUR (Gewerblicher Rechtsschutz und Urheberrecht) 2012, p. 311—blog post.
- 62.
For a detailed discussion of the review procedure, see Guggenberger: Das Netzwerkdurchsetzungsgesetz, ZRP (Zeitschrift für Rechtspolitik) 2017, pp. 98, 99.
- 63.
This is, at any rate, the predominant view. For an essentially similar view, see also Wimmers/Heymann: Zum Referentenentwurf eines Netzwerkdurchsetzungsgesetzes, AfP (Zeitschrift für das gesamte Medienrecht) 2017, pp. 93, 95; Liesching: Die Durchsetzung von Verfassungs- und Europarecht gegen das NetzDG, MMR (Multimedia und Recht) 2018, pp. 26, 29; Guggenberger: Das Netzwerkdurchsetzungsgesetz in der Anwendung, NJW (Neue Juristische Wochenschrift) 2017, pp. 2577, 2579; Spindler: Rechtsdurchsetzung von Persönlichkeitsrechten, GRUR (Gewerblicher Rechtsschutz und Urheberrecht) 2018, pp. 365, 369; for a differing view not entirely rejecting strict deadlines cf. Höch: Nachbessern: ja, verteufeln: nein, K&R (Kommunikation und Recht) 2017, pp. 289, 291.
- 64.
Regarding deadline exceptions cf. Schwartmann: Verantwortlichkeit Sozialer Netzwerke, GRUR-Prax (Gewerblicher Rechtsschutz und Urheberrecht, Praxis) 2017, pp. 317 ff.
- 65.
BT-Drs. 18/13013, p. 20; also: Liesching, in: Spindler/Schmitz (eds.): Telemediengesetz mit Netzwerkdurchsetzungsgesetz, § 3 NetzDG mn. 5.
- 66.
The Federal Office of Justice (Bundesamt für Justiz; BfJ) recognized the Association of Multimedia Service Providers for Voluntary Self-regulation (Verein Freiwillige Selbstkontrolle Multimedia-Diensteanbieter e. V.; FSM) as such a panel on January 23, 2020, see: BfJ: Erstmals Selbstregulierung nach dem Netzwerkdurchsetzungsgesetz.
- 67.
For a critical view, see in particular Spindler: Rechtsdurchsetzung von Persönlichkeitsrechten, GRUR (Gewerblicher Rechtsschutz und Urheberrecht) 2018, pp. 365, 370 f.
- 68.
Cf. Löber/Roßnagel: Das Netzwerkdurchsetzungsgesetz in der Umsetzung, MMR (Multimedia und Recht) 2019, p. 71.
- 69.
Cf. also Höld: Das Vorabentscheidungsverfahren nach dem neuen NetzDG, MMR (Multimedia und Recht) 2017, pp. 791, 792 ff.
- 70.
Cf. also Guggenberger: Das Netzwerkdurchsetzungsgesetz, ZRP (Zeitschrift für Rechtspolitik) 2017, pp. 98, 99.
- 71.
Höld: Das Vorabentscheidungsverfahren nach dem neuen NetzDG, MMR (Multimedia und Recht) 2017, pp. 791, 794.
- 72.
Holznagel: Das Compliance-System des Entwurfs des Netzwerkdurchsetzungsgesetzes, ZUM (Zeitschrift für Urheber- und Medienrecht) 2017, pp. 615, 624.
- 73.
Similarly implicit in ibid.
- 74.
The Federal President initially declined to execute the law in view of a decision of the Federal Constitutional Court (BVerfG), decision from May 27, 2020 (1 BvR 1873/13, 1 BvR 2618/13), NJW (Neue Juristische Wochenschrift) 2020, p. 2699—Bestandsdatenauskunft II, indicating possible unconstitutionality. The amending act initiated thereupon to bring the law into conformity with the BVerfG decision was adopted by the Bundestag and Bundesrat on March 26, 2021 as drafted by the Mediation Committee (see: Deutscher Bundestag: Vorgang-Gesetzgebung. Gesetz zur Anpassung der Regelungen über die Bestandsdatenauskunft), so the law was executed later.
- 75.
In the adopted version BR-Drs. 339/20 (see also the later Repair Act, see footnote 74); for an overview of the proceedings see: Deutscher Bundestag: Vorgang-Gesetzgebung. Gesetz zur Bekämpfung des Rechtsextremismus und der Hasskriminalität.
- 76.
The article is based on the government draft of March 31, 2020, BT-Drs. 19/18792; at the time of printing, the bill was being discussed in the first round of deliberation in the Bundestag, thus changes are still possible.
- 77.
This amendment, unlike the others, stems from the former of the two laws mentioned above, see footnote 75.
- 78.
BT-Drs. 19/18792, p. 50; regarding the requirements under the AVMS Directive, see above, 3.1.
- 79.
BT-Drs. 19/18792, p. 50.
- 80.
BT-Drs. 18/12356, p. 18.
- 81.
The legal definition of a social media network hinges upon the shareability of “any content”; regarding effects on the area of application, see BT-Drs. 19/18792, p. 50 f.
- 82.
BMJV: Entwurf eines Gesetzes zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (draft legislation for improved legal enforcement in social media networks), p. 19 f.
- 83.
BT-Drs. 19/18792, p. 51.
- 84.
For detailed commentary, see 3.1. above.
- 85.
BT-Drs. 19/18792, p. 51. Pursuant to § 3d (2) and (3), the country of domicile may be imputed if the parent company, subsidiary, or corporate affiliate is domiciled in a Member State, see also 3.1. above.
- 86.
Such complaints are highly common, see, for example, Facebook: NetzDG Transparenzbericht and YouTube: Transparenzbericht.
- 87.
Liesching, in: NomosBR-NetzDG, § 1 NetzDG mn. 2.
- 88.
For further references, cf. Hoven/Gersdorf, in: BeckOK-Informations- und MedienR, § 1 NetzDG mn. 9; for detailed discussion, see again Liesching: Stellungnahme, pp. 1 ff.
- 89.
Cf. Bitkom: Stellungnahme, p. 27 f.; Google: Stellungnahme, p. 19.
- 90.
See also as an example the Federal Council Position Paper (Stellungnahme des Bundesrates), May 20, 2020, BT-Drs. 19/19367, p. 4 f.
- 91.
Regarding risks, esp. for smaller platforms, cf. Google: Stellungnahme, pp. 18 f. and HateAid: Stellungnahme, pp. 21 f.; regarding concerns in relation to Article 3 of the German Constitution (GG) and appropriateness cf. Liesching: Stellungnahme, p. 9.
- 92.
BT-Drs. 19/18792, p. 52.
- 93.
On areas of overlap, see also Kreißig: Stellungnahme der Medienanstalten, p. 2.
- 94.
Liesching, in: BeckOK JMStV, § 4 mn. 2.
- 95.
Cf. Bitkom: Kinder und Jugendliche in der digitalen Welt, p. 13; Medienpädagogischer Forschungsverband Südwest: JIM-Studie 2019, pp. 49 f.; for a discussion based on specific illegal acts, see Brings-Wiesen: Staatliche Reaktionsmöglichkeiten auf jugendlichen und jugendgefährdenden Hass im Netz, ZJJ (Zeitschrift für Jugendkriminalrecht und Jugendhilfe) 2020, pp. 127 ff.
- 96.
The German Federal Government has cited this as well in a reply to a Federal Council statement, see Gegenäußerung der Bundesregierung auf die Stellungnahme des Bundesrates, BT-Drs. 19/19367, regarding number 3 c (p. 7).
- 97.
Stellungnahme des Bundesrates, May 20, 2020, BT-Drs. 19/19367, p. 2; Kreißig: Stellungnahme der Medienanstalten, p. 2; Facebook: Stellungnahme, pp. 3 f.
- 98.
Cf. also Google: Stellungnahme, pp. 8 ff.; similarly Liesching: Stellungnahme, p. 10; for the contrary view, see the Federal Government's reply: Gegenäußerung der Bundesregierung, BT-Drs. 19/19367, regarding 3 e (p. 8); for a view tolerating overlap (given dialogue between the competent bodies), see HateAid: Stellungnahme, p. 23.
- 99.
Cf. Müller-Terpitz: Persönlichkeitsrechtliche Aspekte der Social Media, in: Hornung/Müller-Terpitz (eds.): Rechtshandbuch Social Media, pp. 253, 296; Beyerbach: Social Media im Verfassungsrecht und der einfachgesetzlichen Medienregulierung, in: Hornung/Müller-Terpitz (eds.): Rechtshandbuch Social Media, pp. 507, 560.
- 100.
Kommission für Jugendmedienschutz: Stellungnahme Novellierung des Medienstaatsvertrages/Jugendmedienschutz-Staatsvertrages, p. 4.
References
Altenhain, Karsten: Teil 20 Jugendschutz, in: Hoeren, Thomas/Sieber, Ulrich/Holznagel, Bernd (eds.): Handbuch Multimedia-Recht, Munich 54th ed. 2020.
Beyerbach, Hannes: Social Media im Verfassungsrecht und der einfachgesetzlichen Medienregulierung, in: Hornung, Gerrit/Müller-Terpitz, Ralf (eds.): Rechtshandbuch Social Media, Berlin 2nd ed. 2021, pp. 507–593.
BfJ (Bundesamt für Justiz): Erstmals Selbstregulierung nach dem Netzwerkdurchsetzungsgesetz, January 23, 2020, https://www.bundesjustizamt.de/DE/Presse/Archiv/2020/20200123.html; last accessed July 20, 2021.
Bitkom Bundesverband Informationswirtschaft, Telekommunikation und Neue Medien e.V.: Stellungnahme zum Entwurf eines Gesetzes zur Änderung des Netzwerkdurchsetzungsgesetzes, February 17, 2020, https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Stellungnahmen/2020/Downloads/021720_Stellungnahme%20_Bitkom_RefE_NetzDG.pdf;jsessionid=25CB94881F7EA6D227F44CC9A65E5DF0.2_cid297?__blob=publicationFile&v=2; last accessed May 12, 2021.
Bitkom Bundesverband Informationswirtschaft, Telekommunikation und Neue Medien e.V.: Kinder und Jugendliche in der digitalen Welt, May 28, 2019, https://www.bitkom.org/sites/default/files/2019-05/bitkom_pk-charts_kinder_und_jugendliche_2019.pdf; last accessed May 12, 2021.
BMJV (Bundesministerium für Justiz und Verbraucherschutz): Entwurf eines Gesetzes zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken, April 5, 2017, https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Dokumente/RegE_NetzDG.pdf; last accessed July 20, 2021.
Brings-Wiesen, Tobias: Staatliche Reaktionsmöglichkeiten auf jugendlichen und jugendgefährdenden Hass im Netz, ZJJ (Zeitschrift für Jugendkriminalrecht und Jugendhilfe), 2020, pp. 127–140.
Deutscher Bundestag: Vorgang-Gesetzgebung. Gesetz zur Anpassung der Regelungen über die Bestandsdatenauskunft an die Vorgaben aus der Entscheidung des Bundesverfassungsgerichts vom 27. Mai 2020, https://dip.bundestag.de/vorgang/gesetz-zur-anpassung-der-regelungen-%C3%BCber-die-bestandsdatenauskunft-an-die/271425; last accessed July 20, 2021.
Deutscher Bundestag: Vorgang-Gesetzgebung. Gesetz zur Bekämpfung des Rechtsextremismus und der Hasskriminalität, http://dipbt.bundestag.de/extrakt/ba/WP19/2599/259975.html; last accessed July 20, 2021.
eco Verband der Internetwirtschaft e. V.: Stellungnahme zum Referentenentwurf eines Gesetzes zur Verbesserung der Rechtsdurchsetzung in sozialen Netzwerken (Netzwerkdurchsetzungsgesetz), March 30, 2017, https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Stellungnahmen/2017/Downloads/03302017_Stellungnahme_eco_RefE_NetzDG.pdf?__blob=publicationFile&v=2; last accessed May 12, 2021.
Erdemir, Murad: § 4 JMStV, § 5 JMStV, in: Spindler, Gerald/Schuster, Fabian (eds.): Recht der elektronischen Medien, Munich 4th ed. 2019.
Facebook: NetzDG Transparenzbericht, 2020, https://about.fb.com/wp-content/uploads/2020/07/facebook_netzdg_July_2020_German.pdf; last accessed May 12, 2021.
Facebook: Stellungnahme zum Referentenentwurf eines Gesetzes zur Änderung des Netzwerkdurchsetzungsgesetzes, February 17, 2020, https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Stellungnahmen/2020/Downloads/021720_Stellungnahme_Facebook_RefE_NetzDG.pdf?__blob=publicationFile&v=2; last accessed May 12, 2021.
Geidner, Silvia: § 5 JMStV, in: Binder, Reinhart/Vesting, Thomas (eds.): Beck’scher Kommentar zum Rundfunkrecht (Beck RundfunkR), Munich 4th ed. 2018.
Google: Stellungnahme zur Anhörung des Ausschusses für Recht und Verbraucherschutz des Deutschen Bundestags am 17. Juni 2020 zum Entwurf eines Gesetzes zur Änderung des Netzwerkdurchsetzungsgesetzes, June 17, 2020, https://www.bundestag.de/resource/blob/701152/0f347aec58fff15d3edbd999628c3bb0/frank_google-data.pdf; last accessed May 12, 2021.
Guggenberger, Nikolaus: Das Netzwerkdurchsetzungsgesetz – schön gedacht, schlecht gemacht, ZRP (Zeitschrift für Rechtspolitik), 2017, pp. 98–101.
Guggenberger, Nikolaus: Das Netzwerkdurchsetzungsgesetz in der Anwendung, NJW (Neue Juristische Wochenschrift), 2017, pp. 2577–2582.
Gundel, Jörg: Die Fortentwicklung der europäischen Medienregulierung: Zur Neufassung der AVMD-Richtlinie, ZUM (Zeitschrift für Urheber- und Medienrecht), 2019, pp. 131–139.
Hartmann, Sarah: Welche Dienste zählen künftig zu den audiovisuellen Mediendiensten? Reform des materiellen Anwendungsbereichs der AVMD-RL und Möglichkeiten der alternativen Ausgestaltung, MMR (Multimedia und Recht), 2018, pp. 790–794.
HateAid gGmbH: Stellungnahme zum Gesetzentwurf zur Änderung des Netzwerkdurchsetzungsgesetzes, June 16, 2020, https://www.bundestag.de/resource/blob/701054/02b45b9aca30a33a56750db25cea404f/ballon_hateaid-data.pdf; last accessed May 12, 2021.
Hilgert, Felix: Novellierung des Jugendmedienschutz-Staatsvertrags, December 6, 2019, https://spielerecht.de/novellierung-des-jugendmedienschutz-staatsvertrags/; last accessed May 11, 2021.
Hilgert, Felix/Sümmermann, Philipp: Technischer Jugendmedienschutz. Maßnahmen der Jugendschutz-Compliance in Rundfunk und Telemedien, MMR-Beil. (Multimedia und Recht – Beilage) (Appendix), 2020 (8), pp. 26–30.
Höch, Dominik: Nachbessern: ja, verteufeln: nein. Das NetzDG ist besser als sein Ruf, K & R (Kommunikation & Recht), 2017, pp. 289–292.
Höld, Florian: Das Vorabentscheidungsverfahren nach dem neuen NetzDG. Rechtliches Neuland bei der Bekämpfung von Kriminalität in sozialen Netzwerken, MMR (Multimedia und Recht), 2017, pp. 791–795.
Holznagel, Bernd: § 2 RStV, in: Spindler, Gerald/Schuster, Fabian (eds.): Recht der elektronischen Medien, Munich 4th ed. 2019.
Holznagel, Bernd: Das Compliance-System des Entwurfs des Netzwerkdurchsetzungsgesetzes, ZUM (Zeitschrift für Urheber- und Medienrecht), 2017, pp. 615–624.
Holznagel, Bernd/Hartmann, Sarah: Teil 3 Rundfunk und Telemedien, in: Hoeren, Thomas/Sieber, Ulrich/Holznagel, Bernd (eds.): Handbuch Multimedia-Recht, Munich 54th ed. 2020.
Holznagel, Daniel: Melde- und Abhilfeverfahren zur Beanstandung rechtswidrig gehosteter Inhalte nach europäischem und deutschem Recht im Vergleich zu gesetzlich geregelten notice and take-down-Verfahren, GRUR Int. (Gewerblicher Rechtsschutz und Urheberrecht Internationaler Teil), 2014, pp. 105–113.
Hoven, Elisa/Gersdorf, Hubertus: § 1 NetzDG, in: Gersdorf, Hubertus/Paal, Boris P. (eds.): Beck’scher Onlinekommentar Informations- und Medienrecht (BeckOK-Informations- und MedienR), Munich 30th ed. 2019.
Jäger, Manuel: Die Novellierung der AVMD-RL—Anwendungsbereich und Werberegulierung—eine erneut vertane Chance?, ZUM (Zeitschrift für Urheber- und Medienrecht), 2019, pp. 477–490.
Kaspar, Marcel: § 4 JMStV, in: Binder, Reinhart/Vesting, Thomas (eds.): Beck’scher Kommentar zum Rundfunkrecht (Beck RundfunkR), Munich 4th ed. 2018.
Keller, Rainer/Liesching, Marc: 82. Abschnitt: Pflichten von Anbietern von Rundfunk und Telemedien, in: Paschke, Marian/Berlit, Wolfgang/Meyer, Klaus/Kröner, Lars (eds.): Hamburger Kommentar Gesamtes Medienrecht (Hamburger Kommentar), Baden-Baden 4th ed. 2021.
Keppeler, Lutz Martin: Was bleibt vom TMG-Datenschutz nach der DS-GVO? Lösung und Schaffung von Abgrenzungsproblemen im Multimedia-Datenschutz, MMR (Multimedia und Recht), 2015, pp. 779–783.
Kommission für Jugendmedienschutz: Kriterien zur Bewertung von Konzepten für Altersverifikationssysteme, December 11, 2019, https://www.kjm-online.de/fileadmin/user_upload/KJM/Aufsicht/Technischer_Jugendmedienschutz/KJM-AVS-Raster.pdf; last accessed May 11, 2021.
Kommission für Jugendmedienschutz: Neue Methode für Altersverifikation im Internet, December 20, 2019, https://www.kjm-online.de/service/pressemitteilungen/meldung/jugendmedienschutz-neue-methode-fuer-altersverifikation-im-internet/; last accessed May 11, 2021.
Kommission für Jugendmedienschutz: Stellungnahme der Kommission für Jugendmedienschutz (KJM) zur Novellierung des Medienstaatsvertrages/Jugendmedienschutz-Staatsvertrages, August 22, 2019, https://www.kjm-online.de/fileadmin/user_upload/KJM/Ueber_uns/Positionen/20190822_KJM-Stellungnahme_MStV.pdf; last accessed May 11, 2021.
Kreißig, Wolfgang: Stellungnahme der Medienanstalten zur Änderung des Netzwerkdurchsetzungsgesetzes (NetzDG), June 15, 2020, https://www.bundestag.de/resource/blob/700782/c3c25a329df06b6b832d7aac69490616/kreissig-data.pdf; last accessed May 12, 2021.
Kröber, Martin: § 6 RStV, in: Binder, Reinhart/Vesting, Thomas (eds.): Beck’scher Kommentar zum Rundfunkrecht (Beck RundfunkR), Munich 4th ed. 2018.
Liesching, Marc: § 1 NetzDG, in: Liesching, Marc (ed.): NomosBundesrecht: Netzwerkdurchsetzungsgesetz (NomosBR-NetzDG), Baden-Baden 1st online ed. 2018.
Liesching, Marc: § 3 NetzDG, in: Spindler, Gerald/Schmitz, Peter (eds.): Telemediengesetz mit Netzwerkdurchsetzungsgesetz, Munich 2nd ed. 2018.
Liesching, Marc: § 4 JMStV, § 5 JMStV, in: Liesching, Marc (ed.): Beck’scher Online-Kommentar JMStV (BeckOK JMStV), Munich 18th ed., as of January 1, 2020.
Liesching, Marc: Das Herkunftslandprinzip der E-Commerce-Richtlinie und seine Auswirkung auf die aktuelle Mediengesetzgebung in Deutschland, Berlin 2020.
Liesching, Marc: Das Herkunftslandprinzip nach E-Commerce- und AVMD-Richtlinie. Anwendbarkeit von NetzDG, JuSchG, MStV und JMStV auf Soziale Netzwerke mit Sitz in anderen EU-Mitgliedstaaten, MMR-Beil. (Multimedia und Recht—Beilage) (Appendix), 2020 (6), pp. 3–27.
Liesching, Marc: Die Durchsetzung von Verfassungs- und Europarecht gegen das NetzDG, MMR (Multimedia und Recht), 2018, pp. 26–30.
Liesching, Marc: Stellungnahme zum Entwurf eines Gesetzes zur Änderung des Netzwerkdurchsetzungsgesetzes, June 15, 2020, https://www.bundestag.de/resource/blob/700788/83b06f596a5e729ef69348849777b045/liesching-data.pdf; last accessed May 12, 2021.
Liesching, Marc: Verfassungskonformer Jugendschutz nach der Medienkonvergenz. Anforderungen an ein differenziertes Regelungssystem, MMR (Multimedia und Recht), 2018, pp. 141–143.
Löber, Isabell/Roßnagel, Alexander: Das Netzwerkdurchsetzungsgesetz in der Umsetzung. Bilanz nach den ersten Transparenzberichten, MMR (Multimedia und Recht), 2019, pp. 71–76.
Medienpädagogischer Forschungsverband Südwest (mpfs): JIM-Studie 2019. Jugend, Information, Medien, https://www.mpfs.de/fileadmin/files/Studien/JIM/2019/JIM_2019.pdf; last accessed May 12, 2021.
Müller-Terpitz, Ralf: Persönlichkeitsrechtliche Aspekte der Social Media, in: Hornung, Gerrit/Müller-Terpitz, Ralf (eds.): Rechtshandbuch Social Media, Berlin 2nd ed. 2021, pp. 253–304.
Roßnagel, Alexander/Geminn, Christian L./Jandt, Silke/Richter, Philipp: Datenschutzrecht 2016 “Smart” genug für die Zukunft?, Kassel 2016.
Schwartmann, Rolf: Verantwortlichkeit Sozialer Netzwerke nach dem Netzwerkdurchsetzungsgesetz, GRUR-Prax (Gewerblicher Rechtsschutz und Urheberrecht, Praxis im Immaterialgüter- und Wettbewerbsrecht), 2017, pp. 317–319.
Specht-Riemenschneider, Louisa/Dehmel, Susanne/Kenning, Peter/Liedtke, Christa/Micklitz, Hans W./Scharioth, Sven: Stellungnahme des SVRV – Grundlegung einer verbrauchergerechten Regulierung interaktionsmittelnder Plattformfunktionalitäten, Berlin 2020.
Specht-Riemenschneider, Louisa/Hofmann, Franz: Verantwortung von Online-Plattformen – ein Plädoyer für funktionszentrierte Verkehrspflichten, 2021, https://www.vzbv.de/sites/default/files/downloads/2021/02/04/specht_hofmann_gutachten_plattformverantwortlichkeitdocx.pdf; last accessed May 11, 2021.
Specht-Riemenschneider, Louisa/Hofmann, Franz: Nutzerrechte als Baustein einer fairen Plattformökonomie, NJW-aktuell (Neue Juristische Wochenschrift Aktuell) 2021, pp. 15 ff. (accepted for publication).
Spiegel: Erneut Selbstmord wegen Cyber-Mobbing, September 21, 2009, https://www.spiegel.de/netzwelt/web/grossringen-erneut-selbstmord-wegen-cyber-mobbing-a-650340.html; last accessed March 9, 2021.
Spindler, Gerald: Rechtsdurchsetzung von Persönlichkeitsrechten. Bußgelder gegen Provider als Enforcement?, GRUR (Gewerblicher Rechtsschutz und Urheberrecht), 2018, pp. 365–373.
Tinnefeld, Jutta/Buchner, Benedikt: Syst. I. Datenschutz in Medien und Telekommunikation, in: Wolff, Heinrich Amadeus/Brink, Stefan (eds.): Beck’scher Onlinekommentar DatenschutzR (BeckOK-DatenschutzR), Munich 35th ed. 2019.
Wimmers, Jörg/Heymann, Britta: Zum Referentenentwurf eines Netzwerkdurchsetzungsgesetzes (NetzDG)—eine kritische Stellungnahme, AfP (Zeitschrift für das gesamte Medienrecht), 2017, pp. 93–102.
Wissenschaftliche Dienste des Bundestags: Überblick über den aktuellen Diskussionsstand und rechtliche Grundlagen zum Thema “Klarnamenpflicht im Internet”, 2020, https://www.bundestag.de/resource/blob/691400/10dd5fe59e4dc35885d4752f25126350/WD-10-003-20-pdf-data.pdf; last accessed May 11, 2021.
YouTube: Transparenzbericht. Entfernungen von Inhalten nach dem Netzwerkdurchsetzungsgesetz, https://transparencyreport.google.com/netzdg/youtube?hl=de; last accessed May 12, 2021.
Rights and permissions
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
Copyright information
© 2023 The Author(s)
Specht-Riemenschneider, L., Marko, A., Wette, S. (2023). Protection of Minors on Video Sharing Platforms. In: Dethloff, N., Kaesling, K., Specht-Riemenschneider, L. (eds) Families and New Media. Juridicum – Schriften zum Medien-, Informations- und Datenrecht. Springer, Wiesbaden. https://doi.org/10.1007/978-3-658-39664-0_11
Publisher Name: Springer, Wiesbaden
Print ISBN: 978-3-658-39663-3
Online ISBN: 978-3-658-39664-0