Automated communication and basic rights

Abstract

Today, communication does not necessarily originate from humans, but also from “machines” such as “social bots” or “things” in the Internet of Things. From a basic rights perspective, this phenomenon raises the question whether such communication enjoys the same level of protection as communication created by human beings. With regard to the basic rights of the Grundgesetz, the Federal Constitutional Court has for years excluded some forms of communication from the scope of protection as neither needing nor deserving it. The rationale of these decisions must not be transferred to cases of automated communication without regard for possible differences. Automated communication can usually be attributed to the human beings or legal entities employing it. Its prohibition or regulation is therefore a matter of the proportionality of infringements of fundamental rights.


Notes

  1. “Der Staat ist um des Menschen willen da, nicht der Mensch um des Staates willen.” (“The state exists for the sake of man, not man for the sake of the state.”)

  2. Article 11.

  3. BVerfGE 7, 198 (208).

  4. “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”

  5. Fundamental: BVerfGE 20, 162 (174): “[…] insbesondere ist eine freie, regelmäßig erscheinende politische Presse für die moderne Demokratie unentbehrlich.” (“[…] in particular, a free, regularly published political press is indispensable for modern democracy.”)

  6. BVerfGE 35, 202 (221); 73, 118 (157).

  7. However, freedom of speech also applies to private communication.

  8. For the negative freedom see BVerfGE 65, 1 (40 f.); Grabenwarter, Maunz/Dürig 1/2013, Article 5 para 95.

  9. Cf. e.g. article 8 ECHR, article 7 CFREU.

  10. Cf. Di Fabio, Maunz/Dürig 1/2013, Art. 2 I para 149; BVerfGE 54, 148 (153); originally, the right was “invented” by the Federal Court of Justice, see BGHZ 13, 334 ff.; BGHZ 26, 349 ff.

  11. https://www.gesetze-im-internet.de/englisch_gg/englisch_gg.html.

  12. As in the official German version of article 42 CFREU.

  13. See also Milker 2017, ZUM: 216–222, p. 217; Schröder 2018, DVBl: 465–472, p. 466.

  14. https://www.wired.de/collection/tech/welche-social-bots-gibt-es-und-wie-funktionieren-sie.

  15. Milker 2017, ZUM: 216–222, p. 216.

  16. http://www.faz.net/aktuell/wirtschaft/netzkonferenz-dld/dld-2017-der-bot-macht-meinung-14630988-p2.html.

  17. https://www.wired.de/collection/tech/welche-social-bots-gibt-es-und-wie-funktionieren-sie.

  18. Gasser/Kraatz, Social Bots: Wegbereiter der Maschinokratie, http://verfassungsblog.de/social-bots-wegbereiter-der-maschinokratie/.

  19. http://www.berliner-zeitung.de/politik/social-bots-parteien-in-deutschland-wollen-auf-technische-wahlkampfhelfer-verzichten-25278052.

  20. https://www.heise.de/newsticker/meldung/Niedersachsen-will-Facebook-Co-verpflichten-Social-Bots-zu-loeschen-3617730.html.

  21. For an overview see already Bullinger/ten Hompel 2007; more recently Reichwald/Pfisterer 2016, CR: 208–212.

  22. BVerfG (Kammer), NJW 2007, 351.

  23. The BGH considered it applicable, NJW 2001, 1587; agreeing Dix 2004, Kriminalistik: 81–85, p. 85; Schenke 2000, AöR 125: 1–44, p. 20; for criticism see Bernsmann 2002, NStZ: p. 103; Roggan 2003, KritV: 76–95, p. 80.

  24. BVerfG (Kammer), NJW 2007, 351 (353), translated by Google, revised by the author.

  25. Ibid.

  26. BVerfG (Kammer), NJW 2007, 351 (353), translated by Google, revised by the author.

  27. BVerfG (Kammer), NJW 2007, 351 (354 f.).

  28. Especially Schenke 2000, AöR 125: 1–44, p. 20; more recently Marosi/Skobel 2018, DÖV: 837–845, p. 840 seq.

  29. BVerfGE 65, 1 (40 f.). For other facts see Starck, v. Mangoldt/Klein/Starck 2016, Art. 5 I, II, para 26.

  30. BVerfGE 99, 185 (197), thereby giving up BVerfGE 54, 208 (219), where all false facts were excluded.

  31. BVerfGE 60, 234 (242); 82, 43 (51). It has to be distinguished from invective criticism (“Schmähkritik”), which, according to the Court, is not excluded from protection but cannot prevail in the proportionality assessment, which therefore becomes dispensable, cf. BVerfGE 82, 43 (51); 90, 241 (248); 93, 266 (294).

  32. Cf. Grabenwarter, Maunz/Dürig 1/2013, Art. 5 I para 86.

  33. Cf. especially the reactions to the Federal Constitutional Court’s decisions Glykol, Osho, Wunsiedel, Sprayer von Zürich.

  34. Kahl 2004, Der Staat 43: 167–202, p. 186 with further references.

  35. This does not exclude possible protection under the freedom to act, article 2 I GG.

  36. BVerfGE 7, 198 (208): “unmittelbarster Ausdruck der menschlichen Persönlichkeit in der Gesellschaft” (“the most direct expression of the human personality in society”).

  37. Starck, v. Mangoldt/Klein/Starck 2016, Art. 5 I, II, para 22.

  38. However, the programming of the social bot may itself enjoy basic rights protection, either under the freedom of profession (article 12 I GG) or, as a prerequisite of an opinion, under the media freedoms in article 5 I GG. Milker 2017, ZUM: 216–222, p. 217 considers the anticipated opinion as protected under article 5 I 1 GG.

  39. http://www.faz.net/aktuell/sport/fussball/social-media-fauxpas-bei-mesut-oezil-und-ilkay-guendogan-14368478.html.

  40. Schröder 2018, DVBl: 465–472, p. 467.

  41. BVerfG, NJW 1998, 2889 (2890). This is generally agreed upon, cf. Starck, v. Mangoldt/Klein/Starck 2016, Article 5 I, II, para 32.

  42. Otherwise the negative freedom of opinion of an existing person may prevent the use of their name for a certain opinion; moreover, their general right to protection of personality may be affected by the use of their name as someone else’s pseudonym.

  43. Grabenwarter, Maunz/Dürig 1/2013, Art. 5 I para 86. This is also the intention of § 13 VI TMG. The argument is also made by Zumkeller-Quast, Die Nutzung von Socialbots als Identitätstäuschung?, https://www.juwiss.de/2-2017/.

  44. Recently: Kersten 2017, JuS: 193–202, p. 195.

  45. Brings-Wiesen, Meinungskampf mit allen Mitteln und ohne Regeln, https://www.juwiss.de/93-2016/.

  46. Schröder 2018, DVBl: 465–472, p. 468.

  47. See also Gasser/Kraatz, Social Bots: Wegbereiter der Maschinokratie, http://verfassungsblog.de/social-bots-wegbereiter-der-maschinokratie/.

  48. Starck, v. Mangoldt/Klein/Starck 2016, Art. 5 I, II para 33.

  49. Cf. Grabenwarter, Maunz/Dürig 1/2013, Art. 5 I para 82.

  50. BVerfG (Kammer), NJW 2003, 1109.

  51. In this direction: Zumkeller-Quast, Die Nutzung von Socialbots als Identitätstäuschung?, https://www.juwiss.de/2-2017/.

  52. BVerfG, NJW 1998, 2889 (2890).

  53. Cf. article 38 I GG.

  54. Cf. BVerfGE 111, 382 (398). The principle is based on articles 21 I 1 and 3 I GG.

  55. Gasser/Kraatz, Social Bots: Wegbereiter der Maschinokratie, http://verfassungsblog.de/social-bots-wegbereiter-der-maschinokratie/.

  56. BVerfG, NJW 1998, 2889 (2890), translated by the author.

  57. The parallelism is supported by Brings-Wiesen, Meinungskampf mit allen Mitteln und ohne Regeln, https://www.juwiss.de/93-2016/.

  58. BVerfGE 54, 208 (219).

  59. Violation of § 4 Nr. 3 UWG, see http://www.lto.de/recht/hintergruende/h/fake-likes-und-gefaelschte-kritiken-machtlose-online-shops-und-entwertete-bewertungen/.

  60. Starck, v. Mangoldt/Klein/Starck 2016, Art. 5 I, II para 37.

  61. BVerfGE 27, 71 (81).

  62. Obermayer 1980, BayVBl: 1–5, p. 3; Starck, v. Mangoldt/Klein/Starck 2016, Art. 5 I, II para 22.

  63. For lying prior to elections see Milker 2017, ZUM: 216–222, p. 219, referring to BVerfG, Urt. v. 08.02.2001, Az. 2 BvF 1/00, para 86.

  64. See above.

  65. Schröder 2018, DVBl: 465–472, p. 469.

  66. Cf. Gasser/Kraatz, Social Bots: Wegbereiter der Maschinokratie, http://verfassungsblog.de/social-bots-wegbereiter-der-maschinokratie/.

  67. Brings-Wiesen, Meinungskampf mit allen Mitteln und ohne Regeln, https://www.juwiss.de/93-2016/.

  68. Gasser/Kraatz, Social Bots: Wegbereiter der Maschinokratie, http://verfassungsblog.de/social-bots-wegbereiter-der-maschinokratie/.

  69. See www.faz.net/aktuell/feuilleton/medien/roboterjournalismus-prosa-als-programm-14873449.html; www.botpoet.com; http://truthy.indiana.edu/botornot/.

  70. Schröder 2018, DVBl: 465–472, p. 472.

  71. See also Milker 2017, ZUM: 216–222, p. 221.

  72. BGH, 12.1.2017, I ZR 253/14, paras 50 f.

  73. Cf. e.g. article 11 of the constitution of Brandenburg.

  74. With regard to article 10 I GG, Dix 2004, Kriminalistik: 81–85, p. 83, speaks about communication initiated by or on behalf of humans.

  75. Essential: BVerfGE 39, 1 (42); 46, 160 (164); 56, 54 (73).

  76. See above.

  77. Established in BVerfGE 120, 274 as another aspect of the general right of personality.

  78. BVerfGE 115, 205 (229); sometimes these are also considered property within the meaning of article 14 GG, see BVerwG, NVwZ 2009, 114.

  79. The question was highly controversial in the 19th century, with Otto von Gierke’s “Lehre von der realen Verbandsperson” (doctrine of the real collective person) being opposed by Friedrich Carl von Savigny’s representation theory.

  80. “Grundrechtstypische Gefährdungslage” (a situation of endangerment typical of basic rights), see Huber, v. Mangoldt/Klein/Starck 2016, Art. 19 III para 214.

  81. BVerfGE 100, 313 (356); it is even discussed whether it has a general right to privacy, see Jarass 2016, Jarass/Pieroth, Art. 2 para 43.

References

  1. Bernsmann K (2002) Anordnung der Überwachung des Fernmeldeverkehrs. Neue Zeitschrift für Strafrecht 22:103–104

  2. Bullinger H-J, ten Hompel M (2007) Internet der Dinge

  3. Di Fabio U (2013) Art. 2. In: Maunz/Dürig Kommentar zum Grundgesetz. C.H. Beck, München

  4. Dix A (2004) Informations- und Kommunikationskriminalität. Kriminalistik 58:81–85

  5. Grabenwarter C (2013) Art. 5. In: Maunz/Dürig Kommentar zum Grundgesetz. C.H. Beck, München

  6. Jarass H (2016) Art. 2. In: Jarass/Pieroth Kommentar zum Grundgesetz. C.H. Beck, München

  7. Kahl W (2004) Vom weiten Schutzbereich zum engen Gewährleistungsgehalt. Zur Kritik einer neuen Richtung der deutschen Grundrechtsdogmatik. Der Staat 43:167–202

  8. Kersten J (2017) Anonymität in der liberalen Demokratie. Juristische Schulung 58:193–202

  9. Marosi J, Skobel E (2018) Von Menschen und Maschinen—Zur Technologieneutralität von Art. 10 Abs. 1 Var. 3 GG. Die Öffentliche Verwaltung 71:837–845

  10. Milker J (2017) Social Bots im Meinungskampf. Zeitschrift für Urheber- und Medienrecht 61:216–222

  11. Obermayer K (1980) Aspekte der Meinungsfreiheit. Bayerische Verwaltungsblätter 111:1–5

  12. Reichwald J, Pfisterer D (2016) Autonomie und Intelligenz im Internet der Dinge—Möglichkeiten und Grenzen autonomer Handlungen. Computer und Recht 32:208–212

  13. Roggan F (2003) Moderne Telekommunikationsüberwachung: Eine kritische Bestandsaufnahme. Kritische Vierteljahresschrift für Gesetzgebung und Rechtswissenschaft 86:76–95

  14. Schenke RP (2000) Verfassungsrechtliche Probleme einer präventiven Überwachung der Telekommunikation. Archiv des öffentlichen Rechts 125:1–44

  15. Schröder M (2018) Rahmenbedingungen der staatlichen Regulierung von Social Bots. Deutsches Verwaltungsblatt 133:465–472

  16. Starck C (2016) Art. 5. In: v. Mangoldt/Klein/Starck Kommentar zum Grundgesetz. C.H. Beck, München


Author information


Correspondence to Meinhard Schröder.



Cite this article

Schröder, M. Automated communication and basic rights. China-EU Law J 6, 175–187 (2019). https://doi.org/10.1007/s12689-018-0081-y


Keywords

  • Basic rights
  • Internet of things
  • Social bots
  • Fake identity
  • Fake news
  • Democracy