
H@te Online: Die Bedeutung digitaler Kommunikation für Hass und Hetze

Chapter in the book Hate Speech, part of the book series Aktivismus- und Propagandaforschung (AKPRO)

Abstract

Online media are a self-evident part of our everyday lives and offer an ideal environment for the spread of hate speech. Hate speech harms the well-being of those attacked and can contribute to radicalization processes. Hate speech can be found across cultures and ideologies. A substantial share of people living in Germany have already witnessed hate speech, above all attacks on disadvantaged groups, but also on politicians and journalists. In Germany, hate speech is spread by organized groups, primarily from the right-wing spectrum, but also by influential actors such as politicians and by individuals with a variety of motives. Measures to foster democratic resilience against hate speech must be correspondingly diverse. In particular, social norms need to be strengthened through legal requirements, but also through the concrete behavior of moderators and other users. Fostering individuals' capacity for empathy could also help. Suppression of content has to be weighed against the right to freedom of expression and carries the risk of contributing to a sense of persecution. Studies on civil-society engagement give reason to hope for better coexistence, in digital media as well.


Notes

  1. Trolling is increasingly used as a synonym for all forms of "dark participation" online (cf., among others, Alizadeh et al., 2020); here we use the term in its original sense in order to draw a clearer distinction between coordinated and uncoordinated attacks.

  2. The zoologist and evolutionary biologist Richard Dawkins describes memes as cultural analogues of genes: building blocks that replicate themselves, mutate, and are subject to the pressure of adapting to their environment. In social media, the term often refers to visual content that places frequently reproduced images in a new context through text (see also Zannettou et al., 2018).

References

  • Abay Gaspar, H., Daase, C., Deitelhoff, N., Junk, J., & Sold, M. (2018). Warum wir einen weiten Begriff von Radikalisierung brauchen. In M. von Drachenfels, P. Offermann, & C. Wunderlich (Hrsg.), Radikalisierung und De-Radikalisierung in Deutschland. Eine gesamtgesellschaftliche Herausforderung. Frankfurt a. M.: Leibniz-Institut Hessische Stiftung Friedens- und Konfliktforschung.


  • Ajzen, I., & Fishbein, M. (1970). The prediction of behavior from attitudinal and normative variables. Journal of Experimental Social Psychology, 6(4), 466–487. https://doi.org/10/dzxrbx.

  • Ali, K., Zain-ul-abdin, K., Li, C., Johns, L., Ali, A. A., & Carcioppolo, N. (2019). Viruses going viral: Impact of fear-arousing sensationalist social media messages on user engagement. Science Communication, 41(3), 314–338. https://doi.org/10/gg5dxx.

  • Alizadeh, M., Shapiro, J. N., Buntain, C., & Tucker, J. A. (2020). Content-Based Features Predict Social Media Influence Operations. Science Advances, 6(30), eabb5824. https://doi.org/10/d4p7.

  • Allport, G. (1979). The nature of prejudice (25th anniversary edition). Basic Books.


  • Anderson, A. A., Brossard, D., Scheufele, D. A., Xenos, M. A., & Yeo, S. K. (2016). Toxic talk? How online incivility can undermine perceptions of media credibility. International Journal of Public Opinion Research. https://doi.org/10/gf3gwf.

  • Appel, M. (2012). Anti-immigrant propaganda by radical right parties and the intellectual performance of adolescents. Political Psychology. https://doi.org/10/f35xnv.

  • Arendt, F. (2013). Dose-dependent media priming effects of stereotypic newspaper articles on implicit and explicit stereotypes. Journal of Communication, 63(5), 830–851. https://doi.org/10.1111/jcom.12056.

  • Arendt, F. (2015). Effects of right-wing populist political advertising. Journal of Media Psychology, 27, 178–189. https://doi.org/10/gf3hbq.

  • Bachl, M. (2018). (Alternative) media sources in AfD-centered Facebook discussions. Studies in Communication and Media, 7 (2), 256–70. https://doi.org/10.5771/2192-4007-2018-2-256.

  • Baldauf, J., Ebner, J., & Guhl, J. (2018). Hassrede und Radikalisierung im Netz – der OCCI Forschungsbericht. Institute for Strategic Dialogue.


  • Barlińska, J., Szuster, A., & Winiewski, M. (2018). Cyberbullying among adolescent bystanders: Role of affective versus cognitive empathy in increasing prosocial cyberbystander behavior. Frontiers in Psychology, 9. https://doi.org/10/gdpsnp.

  • Bartlett, J., & Miller, C. (2010). The power of unreason, conspiracy theories, extremism and counter-terrorism. Demos.


  • Beisch, N., Koch, W., & Schäfer, C. (2019). ARD/ZDF-Onlinestudie 2019: Mediale Internetnutzung und Video-on-Demand gewinnen weiter an Bedeutung. Media Perspektiven, 9, 374–388.


  • Benesch, S. (2012). Dangerous speech: A proposal to prevent group violence.


  • Bilewicz, M., & Soral, W. (2020). Hate speech epidemic. The dynamic effects of derogatory language on intergroup relations and political radicalization. Political Psychology. https://doi.org/10/gg4whj.

  • Bimber, B., & de Zúñiga, H. G. (2020). The Unedited Public Sphere. New Media & Society, 22(4), 700–715. https://doi.org/10.1177/1461444819893980.

  • Boberg, S., Quandt, T., Schatto-Eckrodt, T., & Frischlich, L. (2020). Pandemic populism: Facebook pages of alternative news media and the corona crisis – A computational content analysis. arXiv preprint. https://arxiv.org/abs/2004.02566.

  • Boberg, S., Schatto-Eckrodt, T., Frischlich, L., & Quandt, T. (2018). The moral gatekeeper? Moderation and deletion of user-generated content in a leading news forum. Media and Communication, 6(4), 58–69. https://doi.org/10.17645/mac.v6i4.1493.

  • Bochmann, C., & Staufer, W. (2013). Vom ‚Negerkönig‘ zum ‚Südseekönig‘ zum …? – Politische Korrektheit in Kinderbüchern. BPJM Aktuell, 2.


  • Böhm, R., Rusch, H., & Baron, J. (2018). The psychology of intergroup conflict: A review of theories and measures. Journal of Economic Behavior & Organization, Januar, S0167268118300209. https://doi.org/10/gfgqs9.

  • Borum, R. (2011). Radicalization into violent extremism I: A review of social science theories. Journal of Strategic Security, 4(4), 7–36. https://doi.org/10/fzbkr9.

  • Breuer, J. (2017). Hate Speech in Online Games. In K. Kaspar, L. Gräßer, & A. Riffi (Hrsg.), Online Hate Speech – Perspektiven auf eine Form des Hasses (S. 107–112). kopaed.


  • Brewer, M. B. (1999). The psychology of prejudice: Ingroup love or outgroup hate? Journal of Social Issues, 55(3), 429–444. https://doi.org/10/cjhnj6.

  • Brosius, H.-B. (2002). Zwischen Eskalation und Verantwortung – Die Berichterstattung über fremdenfeindliche Gewalt und Rechtsextremismus [Between escalation and responsibility: Reporting about violence against foreigners and right-wing extremism]. In D. Wiedemann (Hrsg.), Die rechtsextreme Herausforderung: Jugendarbeit und Öffentlichkeit zwischen Konjunkturen und Konzepten.


  • Buckels, E. E. (2014). Trolls just want to have fun. Personality and Individual Differences, 67, 97–102. https://doi.org/10/f58bzw.

  • Cialdini, R. B., Kallgren, C. A., & Reno, R. R. (1991). A Focus Theory of Normative Conduct: A Theoretical Refinement and Reevaluation of the Role of Norms in Human Behavior. In M. P. Zanna (Hrsg.), Advances in Experimental Social Psychology (S. 201–34). Academic Press. https://doi.org/10.1016/S0065-2601(08)60330-5.

  • Coe, K., Kenski, K., & Rains, S. A. (2014). Online and uncivil? Patterns and determinants of incivility in newspaper website comments. Journal of Communication, 64(4), 658–79. https://doi.org/10/f6dxrx.

  • Costello, M., Barrett-Fox, R., Bernatzky, C., Hawdon, J., & Mendes, K. (2018). Predictors of viewing online extremism among America’s youth. Youth and Society. https://doi.org/10/gf3gq3.

  • Costello, M., Hawdon, J., Bernatzky, C., & Mendes, K. (2019). Social Group Identity and Perceptions of Online Hate. Sociological Inquiry, 89(3), 427–452. https://doi.org/10/gghcnc.

  • Cottee, S., & Cunliffe, J. (2018). Watching ISIS: How young adults engage with official english-language ISIS videos. Studies in Conflict and Terrorism, 0731, 1–25. https://doi.org/10.1080/1057610X.2018.1444955.

  • Der Spiegel. (2020). Donald Trump: Twitter kennzeichnet Trump-Tweet als gewaltverherrlichend. Der Spiegel, 29. Mai 2020. https://www.spiegel.de/netzwelt/netzpolitik/donald-trump-twitter-kennzeichnet-trump-tweet-als-gewaltverherrlichend-a-2d25e766-9b72-4fc6-b698-c7fe895ac997.

  • Dieckmann, J., Geschke, D., & Braune, I. (2018). Für die Auseinandersetzung mit Diskriminierung ist die Betroffenen Perspektive von großer Bedeutung. Jena. https://doi.org/10.19222/201702/4.

  • Doosje, B., & van Eerten, J. J. (2017). ‚Counter-narratives' against violent extremism. In L. Colaert (Hrsg.), „De-radicalisation": Scientific insights for policy (S. 83–100). Bruxelles, Belgium: Tomas Baum, Brussels. http://www.tandfonline.com/doi/full/10.1080/14678800600933480.

  • Doosje, B., Moghaddam, F. M., Kruglanski, A. W., De Wolf, A., Mann, L., & Feddes, A. R. (2016). Terrorism, radicalization and de-radicalization. Current Opinion in Psychology, 11, 79–84. https://doi.org/10/gf3g84.

  • Dovidio, J. F., Love, A., Schellhaas, F. M. H., & Hewstone, M. (2017). Reducing intergroup bias through intergroup contact: Twenty years of progress and future directions. Group Processes and Intergroup Relations, 20(5), 606–620. https://doi.org/10/gbvqnb.

  • dpa. (2020). Rechtsextremismus: Twitter sperrt Konten der Identitären Bewegung. Die Zeit, 11. Juli 2020, Abschn. Digital. https://www.zeit.de/digital/internet/2020-07/rechtsextremismus-twitter-identitaere-bewegung-soziale-medien.

  • Dziri, A. (2018). Wir brauchen einen weiten Begriff von Radikalisierung – aber nicht immer und überall! In M. von Drachenfels, P. Offermann, & C. Wunderlich (Hrsg.), Radikalisierung und De-Radikalisierung in Deutschland. Eine gesamtgesellschaftliche Herausforderung (S. 192).


  • Eckert, T., & Röckert, T. (2019). Notre-Dame: Keine Belege dafür, dass das Feuer absichtlich gelegt wurde – und auch keine für einen Terroranschlag. 17. April 2019. https://correctiv.org/faktencheck/migration/2019/04/17/notre-dame-keine-belege-dafuer-dass-das-feuer-absichtlich-gelegt-wurde-und-auch-keine-fuer-einen-terroranschlag.

  • Eggers, M. M. (2013). Pippi Langstrumpf – Emanzipation nur für weiße Kinder? https://blog.derbraunemob.info (blog). 8. August 2013. https://blog.derbraunemob.info/wp-content/uploads/2008/10/pippi_langstrumpf-emanzipation_nur_fuer_weisse_kinder.pdf.

  • Engesser, S., Fawzi, N., & Larsson, A. O. (2017). Populist online communication: Introduction to the special issue. Information, Communication & Society, 20(9), 1279–1292. https://doi.org/10/gf3ghw.

  • Erjavec, K., & Poler Kovačič, M. (2012). “You don’t understand, this is a new war!” Analysis of hate speech in news web sites’ comments. Mass Communication and Society, 15(6), 899–920. https://doi.org/10/gfgnmm.

  • Ernst, J., Schmitt, J. B., Rieger, D., Beier, A. K., Bente, G., & Roth, H. J. (2017). Hate beneath the counter speech? A qualitative content analysis of user comments on YouTube related to counter speech videos. Journal for Deradicalization, 10(Spring), 1–49.


  • Fasoli, F., Maass, A., & Carnaghi, A. (2015). Labelling and discrimination: Do homophobic epithets undermine fair distribution of resources? British Journal of Social Psychology 54(2), 383–393. https://doi.org/10/gg4whf.

  • Feddes, A. R., & Jonas, K. J. (2020). Associations between dutch LGBT hate crime experience, well-being, trust in the police and future hate crime reporting. Social Psychology, 51(3), 171–182. https://doi.org/10/gg4whh.

  • Festl, R. (2016). Perpetrators on the internet: Analyzing individual and structural explanation factors of cyberbullying in school context. Computers in Human Behavior, 59, 237–248. https://doi.org/10/f8hw28.

  • Finke, G. (2013). Muslime in der Mehrheitsgesellschaft: Medienbild und Alltagserfahrungen in Deutschland. Berlin, Germany: Sachverständigenrat deutscher Stiftungen für Integration und Migration (SVR) GmbH. https://d-nb.info/1046705946/34.

  • Frischlich, L. (2018). Propaganda3: Einblicke in die Inszenierung und Wirkung von Online-Propaganda auf der Makro-Meso-Mikro Ebene [Propaganda3: Insights into the staging and effects of online-propaganda on the macro-meso-micro level]. In B. Zywietz (Hrsg.), Fake-News, Hashtags & Social Bots: Neue Methoden der populistischen Propaganda (S. 133–170). Springer Fachmedien VS. https://doi.org/10.1007/978-3-658-22118-8.

  • Frischlich, L. (2021). #Dark inspiration: Eudaimonic entertainment in extremist Instagram posts. New Media and Society, 23(3), 554–577. https://doi.org/10.1177/1461444819899625.

  • Frischlich, L., Boberg, S., & Quandt, T. (2019). Comment sections as targets of dark participation? Journalists' evaluation and moderation of deviant user comments. Journalism Studies, 20(14), 2014–2033. https://doi.org/10/gfwcjc.

  • Frischlich, L., Klapproth, J., & Brinkschulte, F. (2020). Between Mainstream and Alternative – Co-Orientation in Right-Wing Populist Alternative News Media. In C. Grimme, M. Preuß, F. W. Takes, & A. Waldherr (Hrsg.), Disinformation in Open Online Media (S. 150–167). Springer. https://doi.org/10.1007/978-3-030-39627-5_12.

  • Frischlich, L., Schatto-Eckrodt, T., Boberg, S., & Wintterlin, F. (2021). Roots of Incivility: How Personality, Media Use, and Online Experiences Shape Uncivil Participation. Media and Communication, 9(1), 195–208. https://doi.org/10.17645/mac.v9i1.3360

  • Furnham, A., Richards, S. C., & Paulhus, D. L. (2013). The dark triad of personality: A 10 year review. Social and Personality Psychology Compass, 7(3), 199–216. https://doi.org/10/6fb.

  • Gagliardone, I., Pohjonen, M., Zerai, A., Beyene, Z., Aynekulu, G., Bright, J., Awoke Bekalu, M., et al. (2016). MECHACHAL: Online debates and elections in Ethiopia – From hate speech to engagement in social media.


  • Garland, J., Ghazi-Zahedi, K., Young, J. G., Hébert-Dufresne, L., & Galesic, M. (2020). Countering Hate on Social Media: Large Scale Classification of Hate and Counter Speech. ArXiv: 2006.01974 [Cs], Juni. http://arxiv.org/abs/2006.01974.

  • Gelber, K., & Mcnamara, L. (2015). Evidencing the harms of hate speech. Social Identities, 1–14. https://doi.org/10.1080/13504630.2015.1128810.

  • George, C. (2016). Regulating ‚hate spin': The limits of law in managing religious incitement and offense. International Journal of Communication, 10, 2955–2972.


  • Geschke, D., Klaßen, A., Quent, M. & Richter, C. (2019). Hass im Netz – der schleichende Angriff auf unsere Demokratie. Jena, Germany: Institut für Demokratie und Zivilgesellschaft.


  • Glaser, S. (2018). Islamismus im Internet. Mainz, Germany: jugendschutz.net. http://www.jugendschutz.net/fileadmin/download/pdf/Bericht_2018_Islamismus_im_Internet.pdf.

  • Glaser, S. (2019a). Jugendschutz Report 2019. Mainz, Germany: Jugendschutz.net. http://www.jugendschutz.net/fileadmin/download/pdf/bericht2019.pdf.

  • Glaser, S. (2019b). Rechtsextremismus im Netz. Mainz, Germany: jugendschutz.net. http://www.jugendschutz.net/fileadmin/download/pdf/Bericht_2018_2019_Rechtsextremismus_im_Netz.pdf.

  • Goodboy, A. K., & Martin, M. M. (2015). The personality profile of a cyberbully: Examining the Dark Triad. Computers in Human Behavior 49, 1–4. https://doi.org/10.1016/j.chb.2015.02.052.

  • Gowen, A., & Bearak, M. (2017). Fake news on Facebook fans the flames of hate against the Rohingya in Burma. Washington Post, Dezember 2017.


  • Grimme, C., Assenmacher, D., & Adam, L. (2018). Changing perspectives: Is it sufficient to detect social bots? In G. Meiselwitz (Hrsg.), Social Computing and Social Media. User Experience and Behavior (10913) (S. 445–461). Springer International Publishing. https://doi.org/10.1007/978-3-319-91521-0_32.

  • Guhl, J., Ebner, J., & Rau, J. (2020). Das Online-Ökosystem rechtsextremer Akteure. Institute for Strategic Dialogue.


  • Haslam, C., Cruwys, T., Haslam, S. A., Dingle, G., & Xue Ling Chang, M. (2016). Groups 4 Health: Evidence that a social-identity intervention that builds and strengthens social group membership improves mental health. Journal of Affective Disorders, 194, 188–195. https://doi.org/10/gf3hcv.

  • Heft, A., Mayerhöffer, E., Reinhardt, S., & Knüpfer, C. (2019). Beyond Breitbart: Comparing Right‐wing Digital News Infrastructures in Six Western Democracies. Policy & Internet, August, poi3.219. https://doi.org/10/gf8pn9.

  • Heiss, R., & Matthes, J. (2020). Stuck in a nativist spiral: Content, selection, and effects of right-wing populists’ communication on facebook. Political Communication, 37(3), 303–328. https://doi.org/10/ggvqh2.

  • Heiss, R., Schmuck, D., & Matthes, J. (2019). What drives interaction in political actors’ facebook posts? profile and content predictors of user engagement and political actors’ reactions. Information, Communication & Society, 22(10), 1497–1513. https://doi.org/10/gfdbn2.

  • Heitmeyer, W. (2002). Deutsche Zustände [German states] (1. Aufl.). Suhrkamp.


  • Hern, A. (2019). Revealed: Catastrophic Effects of Working as a Facebook Moderator. The Guardian, 17. September 2019, Abschn. Technology. https://www.theguardian.com/technology/2019/sep/17/revealed-catastrophic-effects-working-facebook-moderator.

  • Hölig, S., Hasebrink, U., & Hans-Bredow-Institut. (2019). Reuters Institute Digital News Report 2019: Ergebnisse für Deutschland.


  • Hsueh, M., Yogeeswaran, K., & Malinen, S. (2015). Leave your comment below: Can biased online comments influence our own prejudicial attitudes and behaviors? Human Communication Research, 41(4), 557–576. https://doi.org/10.1111/hcre.12059.


  • Humprecht, E. (2019). Where ‘fake news’ flourishes: A comparison across four Western democracies. Information Communication and Society, 22(13), 1973–88. https://doi.org/10.1080/1369118X.2018.1474241.

  • Igartua, J. J., Moral-Toranzo, F., & Fernández, I. (2011). Cognitive, attitudinal, and emotional effects of news frame and group cues, on processing news about immigration. Journal of Media Psychology, 23(4), 174–185. https://doi.org/10/gfz44g

  • Igartua, J. J., & Cheng, L. (2009). Moderating effect of group cue while processing news on immigration: Is the framing effect a heuristic process? Journal of Communication, 59(4), 726–749. https://doi.org/10/brb4r6.

  • Jacks, W., & Adler, J. R. (2015). A Proposed Typology of Online Hate Crime, 7, 27.


  • Jetten, J., Haslam, A. S., & Haslam, C. (2012). The case for a social identity analysis of health and well-being. In J. Jetten, C. Haslam, & S. A. Haslam (Hrsg.), The Social Cure: Identity, Health and Well-being (S. 3–21). Psychology Press.


  • Joko & Klaas gegen ProSieben – Männerwelten. (2020). https://www.prosieben.de/tv/joko-klaas-gegen-prosieben/video/32-maennerwelten-joko-klaas-15-minuten-clip

  • Jonason, P. K., & Krause, L. (2013). The emotional deficits associated with the dark triad traits: Cognitive empathy, affective empathy, and alexithymia. Personality and Individual Differences, 55(5), 532–537. https://doi.org/10/gftkzw.

  • Kalmar, I., Stevens, C., & Worby, N. (2018). Twitter, Gab, and Racism: The Case of the Soros Myth. In Proceedings of the 9th International Conference on Social Media and Society (S. 330–34). Copenhagen Denmark: ACM. https://doi.org/10/gg4gvx.

  • Stenzel, K. (2020). Attila Hildmann bringt Anhänger gegen Wissenschaftlerin auf. Kölner Stadt-Anzeiger, 10. Juni 2020. https://www.ksta.de/panorama/forschung-zu-verschwoerungsmythen-kochbuchautor-bringt-fans-gegen-wissenschaftlerin-auf-36832414.

  • Kenski, K., Coe, K., & Rains, S. A. (2017). Perceptions of Uncivil Discourse Online: An Examination of Types and Predictors. Communication Research, April, 009365021769993. https://doi.org/10/gghcnf.

  • Koban, K., Stein, J. P., Eckhardt, V., & Ohler, P. (2018). Quid pro quo in Web 2.0. Connecting personality traits and Facebook usage intensity to uncivil commenting intentions in public online discussions. Computers in Human Behavior, 79(Februar), 9–18. https://doi.org/10/gf3gv4.

  • Kramp, L., & Weichert, S. (2018). Hass im Netz: Steuerungsstrategien für Redaktionen. Schriftenreihe Medienforschung der Landesanstalt für Medien Nordrhein-Westfalen, 80. Leipzig: VISTAS Verlag.


  • Kreißel, P., Ebner, J., Urban, A., & Guhl, J. (2018). Hass auf Knopfdruck – Rechtsextreme Trollfabriken und das Ökosystem koordinierter Hasskampagnen im Netz. Institute for Strategic Dialogue, 28–28.


  • Kümpel, A. S., & Rieger, D. (2019). Wandel der Sprach- und Debattenkultur in sozialen Online Medien. Ein Literaturüberblick zu Ursachen und Wirkungen von inziviler Kommunikation. Berlin, Germany: Konrad-Adenauer Stiftung. https://www.kas.de/documents/252038/4521287/Wandel+der+Sprach-+und+Debattenkultur+in+sozialen+Online-Medien.pdf/6a76553c-7c30-b843-b2c8-449ba18c814e?version=1.0&t=1560853247556.

  • Lamm, H., & Myers, D. G. (1978). Group-Induced Polarization of Attitudes and Behavior. In L. Berkowitz (Hrsg.), Advances in Experimental Social Psychology (Bd. 11, S. 145–195). Academic Press. https://doi.org/10.1016/S0065-2601(08)60007-6.

  • Landesanstalt für Medien Nordrhein-Westfalen. (2020). forsa-Befragung zu: Hate Speech 2020. Düsseldorf, Germany. https://www.medienanstalt-nrw.de/fileadmin/user_upload/NeueWebsite_0120/Themen/Hass/forsa_LFMNRW_Hassrede2020_Ergebnisbericht.pdf.

  • Laschyk, T. (2020). ‚Jüdische Weltverschwörung‘: Attila Hildmann verbreitet antisemitische Aussagen – Volksverpetzer. Volksverpetzer.de (blog). 7. Juni 2020. https://www.volksverpetzer.de/analyse/attila-antisemitismus/.

  • Leader Maynard, J., & Benesch, S. (2016). Dangerous Speech and Dangerous Ideology: An Integrated Model for Monitoring and Prevention. Genocide Studies and Prevention, 9(3), 70–95. https://doi.org/10/gf9nxr.

  • Leets, L. (2002). Experiencing hate speech: Perceptions and responses to anti-semitism and antigay speech. Journal of Social Issues, 58(2), 341–361. https://doi.org/10.1111/1540-4560.00264.

  • Levin, B., & Grisham, K. E. (2017). Final U.S. Status Report: Hate crime analysis & forecast 2016/2017.


  • Lyons-Padilla, S., Gelfand, M. J., Mirahmadi, H., Farooq, M., & van Egmond, M. (2015). Belonging nowhere: Marginalization & radicalization risk among Muslim Immigrants. Behavioral Science and Policy, 1(2), 1–12. https://doi.org/10/gf3gj3.

  • Mackie, D. M., & Smith, E. R. (2002). The role of threat in intergroup relations. In From prejudice to intergroup emotions: Differentiated reactions to social groups, 191–208. Psychology Press.


  • Marwick, A. & Lewis, R. (2017). Media manipulation and disinformation online. Data & Society Research Institute, 1–104.


  • Masullo Chen, G., Muddiman, A., Wilner, T., Pariser, E., & Jomini Stroud, N. (2019). We Should Not Get Rid of Incivility Online. Social Media + Society, 5(3), 205630511986264. https://doi.org/10/gghcnh.

  • Matzat, L. (2019). Faktencheck mit Haken: Das Facebook-Dilemma von Correctiv. Übermedien (blog). 12. Dezember 2019. https://uebermedien.de/44183/faktencheck-mit-haken-das-facebook-dilemma-von-correctiv/.

  • Maurer, M., Jost, P., Haßler, J., & Kruschinski, S. (2018). Auf den Spuren der Lügenpresse. Publizistik, 64 (1), 15–35. https://doi.org/10/gf3gmx.

  • McCauley, C., & Moskalenko, S. (2008). Mechanisms of political radicalization: Pathways toward terrorism. Terrorism and Political Violence, 20(3), 415–433. https://doi.org/10/d8nxv9.

  • McDevitt, J., Levin, J., & Bennett, S. (2002). Hate crime offenders: An expanded typology. Journal of Social Issues, 58(2), 303–317. https://doi.org/10/bt2bkh.

  • McGregor, S. C. (2019). Social Media as Public Opinion: How Journalists Use Social Media to Represent Public Opinion. Journalism, 20(8), 1070–1086. https://doi.org/10/ggb6h5.

  • Meleagrou-Hitchens, A., & Kaderbhai, N. (2017). Perspectives on online radicalization, Literature review 2006–2016. Vox Pol. http://icsr.info/wp-content/uploads/2017/05/ResearchPerspectivesonOnlineRadicalisation.pdf.

  • Mohan, S., Guha, A., Harris, M., Popowich, F., Schuster, A., & Priebe, C. (2017). The Impact of Toxic Language on the Health of Reddit Communities. In M. Mouhoub, P. Langlais (Hrsg.), Advances in Artificial Intelligence (S. 51–56). Lecture Notes in Computer Science. Cham: Springer International Publishing. https://doi.org/10/ggn3rx.

  • Muldoon, O. T., Haslam, S. A., Haslam, C., Cruwys, T., Kearns, M., & Jetten, J. (2019). The Social Psychology of Responses to Trauma: Social Identity Pathways Associated with Divergent Traumatic Responses. European Review of Social Psychology, 30(1), 311–348. https://doi.org/10/gg4whk.

  • Mutz, D. C. (2015). In-Your-Face Politics: The Consequences of Uncivil Media. Princeton University Press.


  • Nabben, B., Ringler, N., Schöffler, N., Hakan, N., Zierer, M., Altland, N., Pittelkow, S., Basl, C., & Riedel, K. (2020). Die Hassmaschine. BR. 23. Juni 2020. https://web.br.de/interaktiv/hassmaschine/.

  • Näsi, M., Räsänen, P., Hawdon, J., Holkeri, E., & Oksanen, A. (2015). Exposure to Online Hate Material and Social Trust among Finnish Youth. Information Technology & People, 28(3), 607–622. https://doi.org/10.1108/ITP-09-2014-0198.

  • Neubaum, G., & Krämer, N. C. (2016). Monitoring the opinion of the crowd: Psychological mechanisms underlying public opinion perceptions on social media. Media Psychology, 3269 (August), 1–30. https://doi.org/10/gfbxwf.

  • Neubaum, G., & Krämer, N. C. (2017). Opinion climates in social media: Blending mass and interpersonal communication. Human Communication Research, 43, 464–476. https://doi.org/10/gcmcjk.

  • Neubaum, G., & Krämer, N. C. (2018). What Do We Fear? Expected Sanctions for Expressing Minority Opinions in Offline and Online Communication. Communication Research, 45(2), 139–64. https://doi.org/10/gcvmvb.

  • Newman, B., Merolla, J. L., Shah, S., Casarez Lemi, D., Collingwood, L., & Ramakrishnan, K. (2020). The Trump Effect: An Experimental Investigation of the Emboldening Effect of Racially Inflammatory Elite Communication. British Journal of Political Science, Februar, 1–22. https://doi.org/10/gg2kwm.

  • Newton, C. (2019). The Secret Lives of Facebook Moderators in America. The Verge, 25. Februar 2019. https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona.

  • Noelle-Neumann, E. (1974). The spiral of silence. Journal of Communication, 24(2), 43–51. https://doi.org/10/b4n6tv.

  • Obermaier, M., Hofbauer, M., & Reinemann, C. (2018). Journalists as targets of hate speech. How German journalists perceive the consequences for themselves and how they cope with it. Studies in Communication | Media, 7(4), 499–524. https://doi.org/10/gf3gkx.

  • Olson, J. (2008). Whiteness and the Polarization of American Politics. Political Research Quarterly, 61(4), 704–718. https://doi.org/10/dbsw7z.

  • O’Sullivan, P. B., & Flanagin, A. J. (2003). Reconceptualizing ‚flaming‘ and other problematic messages. New Media & Society, 5(1), 69–94. https://doi.org/10/b3txz4.

  • Pabian, S., De Backer, C. J. S., & Vandebosch, H. (2015). Dark Triad personality traits and adolescent cyber-aggression. Personality and Individual Differences, 75, 41–46. https://doi.org/10/f6x44f.

  • Papacharissi, Z. (2004). Democracy online: Civility, politeness, and the democratic potential of online political discussion groups. New Media & Society, 6(2), 259–283. https://doi.org/10/dz4rp6.

  • Papasavva, A., Zannettou, S., De Cristofaro, E., Stringhini, G., & Blackburn, J. (2020). Raiders of the Lost Kek: 3.5 Years of Augmented 4chan Posts from the Politically Incorrect Board. ArXiv:2001.07487 [Cs], Januar. http://arxiv.org/abs/2001.07487.

  • Petrusich, A. (2020). K-Pop Fans Defuse Racist Hashtags. The New Yorker, 5. Juni 2020. https://www.newyorker.com/culture/cultural-comment/k-pop-fans-defuse-racist-hashtags.

  • Pfundmair, M. (2019). Ostracism Promotes a Terroristic Mindset. Behavioral Sciences of Terrorism and Political Aggression, 11(2), 134–148. https://doi.org/10/gf3hcs.

  • Pfundmair, M., Aßmann, E., Kiver, B., Penzkofer, M., Scheuermeyer, A., Sust, L., & Schmidt, H. (2019). Pathways toward Jihadism in Western Europe: An empirical exploration of a comprehensive model of terrorist radicalization. Terrorism and Political Violence, 0 (0), 1–23. https://doi.org/10/ggbhgt.

  • Pfundmair, M., & Wetherell, G. (2019). Ostracism Drives Group Moralization and Extreme Group Behavior. The Journal of Social Psychology, 159(5), 518–530. https://doi.org/10/ggbhgv.

  • Phillips, W. M. (2012). This Is Why We Can’t Have Nice Things: The Origins, Evolution and Cultural Embeddedness of Online Trolling. Ph.D., Ann Arbor, United States. https://search.proquest.com/docview/1237277556/abstract/E53C6686D73D419BPQ/1.

  • Postmes, T., Spears, R., & Lea, M. (1998). Breaching or building social boundaries? SIDE-effects of computer-mediated communication. Communication Research, 25(6), 689–715. https://doi.org/10/ffsbdn.

  • Priebe, M. (2020). Angst, Hass und Vorurteile: Wie Rassisten das Coronavirus für sich nutzen. Frankfurter Allgemeine Zeitung, 2. März 2020. https://www.faz.net/1.6614102.

  • Prochazka, F., Weber, P., & Schweiger, W. (2018). Effects of Civility and Reasoning in User Comments on Perceived Journalistic Quality. Journalism Studies, 19(1), 62–78. https://doi.org/10/gf3g7t.

  • Quandt, T., & Festl, R. (2017). Cyberhate. In P. Rössler, C. A. Hoffner, & L. van Zoonen (Hrsg.), International Encyclopedia of Media Effects (S. 8). Wiley-Blackwell. https://doi.org/10.1002/9781118783764.

  • Räsänen, P., Hawdon, J., Holkeri, E., Keipi, T., Näsi, M., & Oksanen, A. (2016). Targets of Online Hate: Examining Determinants of Victimization among Young Finnish Facebook Users. Violence and Victims, 31(4), 708–725. https://doi.org/10/f837gt.

  • Rieger, D., Frischlich, L., & Bente, G. (2019). Dealing with the dark side: The effects of right-wing extremist and Islamist extremist propaganda from a social identity perspective. Media, War & Conflict. https://doi.org/10/gf3gmz.

  • Rieger, D., Frischlich, L., Rack, S., & Bente, G. (2020). Digitaler Wandel, Radikalisierungsprozesse und Extremismusprävention im Internet. In B. Ben Slama & U. Kemmesies (Hrsg.), Handbuch Extremismusprävention (Polizei + Forschung, Bd. 54, S. 351–388). Bundeskriminalamt.


  • Salzborn, S. & Maegerle, A. (2016). Die dunkle Seite des WWW: Rechtsextremismus und Internet. Zeitschrift für Vergleichende Politikwissenschaft, 10(S2), 213–31. https://doi.org/10/gg2kwp.

  • Sarovic, A. (2020). Donald Trump: Erstmals hat Twitter den US-Präsidenten einem Faktencheck unterzogen. Der Spiegel, 27. Mai 2020. https://www.spiegel.de/politik/ausland/erstmals-hat-twitter-donald-trump-einem-faktencheck-unterzogen-a-2967c864-ecd3-4ce1-b539-79fad729e643.

  • Schieb, C., & Preuss, M. (2018). Considering the Elaboration Likelihood Model for simulating hate and counter speech on Facebook. Studies in Communication | Media, 7(4), 580–606. https://doi.org/10/gf3gkw.

  • Schmehl, K. (2017). Diese geheimen Chats zeigen, wer hinter dem Meme-Angriff #Verräterduell aufs TV-Duell steckt. BuzzFeed News, 2017.


  • Schmidt-Kleinert, A. (2020). The great divide? The online-offline nexus and insights from research on the far-right in Germany. PRIF BLOG (blog). 25. Mai 2020. https://blog.prif.org/2020/05/25/the-great-divide-the-online-offline-nexus-and-insights-from-research-on-the-far-right-in-germany/.

  • Schmitt, J. B., Harles, D., & Rieger, D. (2020). Themen, Motive und Mainstreaming in rechtsextremen Online-Memes. M&K Medien & Kommunikationswissenschaft, 73–93. https://doi.org/10.5771/1615-634X-2020-1-1.

  • Schmitt, J. B., Rieger, D., Rutkowski, O., & Ernst, J. (2018). Counter-messages as prevention or promotion of extremism?! The potential role of YouTube recommendation algorithms. Journal of Communication, 68(August), 780–808. https://doi.org/10.1093/joc/jqy029.

  • Schönefeld, V., Altmann, T., & Roth, M. (2016). Das vielfältige Spektrum der Empathieprogramme. In M. Roth, V. Schönefeld, & T. Altmann (Hrsg.), Trainings- und Interventionsprogramme zur Förderung von Empathie – Ein praxisorientiertes Kompendium (S. 207–210). Springer.


  • Schwarzenegger, C., & Wagner, A. (2018). Can it be hate if it is fun? Discursive ensembles of hatred and laughter in extreme right satire on Facebook. Studies in Communication | Media, 7(4), 473–498. https://doi.org/10/gf3gk3.

  • Schweiger, W. (2017). Der (des)informierte Bürger im Netz [The (dis) informed citizen in the Web]. Springer Fachmedien. https://doi.org/10.1007/978-3-658-16058-6.

  • Séville, A. (2019). Vom Sagbaren zum Machbaren? Rechtspopulistische Sprache und Gewalt. Aus Politik und Zeitgeschichte, 40(November), 49–50.


  • Sherif, M., & Hovland, C. (1961). Social judgment: Assimilation and contrast effects in communication and attitude change. Yale University Press.


  • Shoemaker, P. J., & Vos, T. (2009). Gatekeeping Theory. Routledge.


  • Silva, L., Mondal, M., Correa, D., Benevenuto, F., & Weber, I. (2016). Analyzing the targets of hate in online social media. arXiv preprint. http://arxiv.org/abs/1603.07709.

  • Soral, W., Bilewicz, M., & Winiewski, M. (2017). Exposure to hate speech increases prejudice through desensitization. Aggressive Behavior, 1–11. https://doi.org/10/gf3gx2.

  • Spears, R., & Postmes, T. (2015). Group identity, social influence, and collective action online. Extensions and applications of the SIDE model. In The Handbook of the Psychology of Communication Technology (1. Aufl.) (S. 23–46). John Wiley & Sons, Ltd. https://doi.org/10.1002/9781118426456.

  • Springer, N., Engelmann, I., & Pfaffinger, C. (2015). User comments: Motives and inhibitors to write and read. Information Communication and Society, 18(7), 798–815. https://doi.org/10/gf3g7v.

  • Stephan, W. G., & Renfro, C. L. (2002). The role of threats in intergroup relations. In D. Mackie & E. R. Smith (Hrsg.), From prejudice to intergroup emotions (S. 191–208). Psychology Press.


  • Stöcker, C. (2020). How Facebook and Google Accidentally Created a Perfect Ecosystem for Targeted Disinformation. In C. Grimme, M. Preuß, F. W. Takes, & A. Waldherr (Hrsg.), Disinformation in Open Online Media (S. 21).


  • Tagesschau.de (2020). Facebook löscht Wahlanzeige von Trumps Team. tagesschau.de, 19. Juni 2020. https://www.tagesschau.de/ausland/trump-facebook-107.html.

  • Tajfel, H., & Turner, J. C. (1979). An integrative theory of intergroup conflict. In S. Worchel & W. G. Austin (Hrsg.), The Social Psychology of Intergroup Relations (S. 33–47). Brooks-Cole. https://doi.org/10.1016/S0065-2601(05)37005-5.

  • Tamborini, R. (2011). Moral intuition and media entertainment. Journal of Media Psychology, 23(1), 39–45. https://doi.org/10.1027/1864-1105/a000031.

  • Tamborini, R., Eden, A., Bowman, N. D., Grizzard, M., & Lachlan, K. (2012). The influence of morality subcultures on the acceptance and appeal of violence. Journal of Communication, 62(1), 136–157. https://doi.org/10.1111/j.1460-2466.2011.01620.x.

  • Taylor, W. D., Johnson, G., Ault, M. K., Griffith, J. A., Rozzell, B., Connelly, S., Jensen, M. L., Dunbar, N. E., & Ness, A. M. (2015). Ideological group persuasion: A within-person study of how violence, interactivity, and credibility features influence online persuasion. Computers in Human Behavior, 51, 448–460. https://doi.org/10/f7pfth.

  • Teding van Berkhout, E., & Malouff, J. M. (2016). The efficacy of empathy training: A meta-analysis of randomized controlled trials. Journal of Counseling Psychology, 63(1), 32–41. https://doi.org/10.1037/cou0000093.

  • Tiffany, K. (2020). ‚My Little Pony‘ Fans Are Ready to Admit They Have a Nazi Problem. The Atlantic. 23. Juni 2020. https://www.theatlantic.com/technology/archive/2020/06/my-little-pony-nazi-4chan-black-lives-matter/613348/

  • Tucker, J. A., Guess, A., Barbera, P., Vaccari, C., Siegel, A., Sanovich, S., Stukal, D., & Nyhan, B. (2018). Social media, political polarization, and political disinformation: A review of the scientific literature. SSRN Electronic Journal, March, 1–95. https://doi.org/10/gf3gqk.

  • Turner, J. C., Hogg, M. A., Oakes, P. J., Reicher, S. D., & Wetherell, M. S. (1987). Rediscovering the social group: A self-categorization theory. Blackwell.


  • Van der Heide, B. L. (2009). Computer-Mediated Impression Formation: A Test of the Sticky Cues Model Using Facebook. Ph.D., Ann Arbor, United States. https://search.proquest.com/docview/304943206/abstract/E72DE7060A1A415DPQ/1.

  • Walters, M. A., Brown, R., & Wiedlitzka, S. (2016). Causes and motivations of hate crime. Equality and Human Rights Commission Research report, 102, 61–61.


  • Wardle, C. (2018). The Need for Smarter Definitions and Practical, Timely Empirical Research on the Information Disorder. Digital Journalism, 6(8), 951–963. https://doi.org/10/gfj4br.

  • Weiß, E. (2020). Identitären-Chef nun auch bei TikTok gesperrt, Webhoster kündigt ihm. heise online, 16. Juli 2020. https://www.heise.de/news/Identitaeren-Chef-nun-auch-bei-TikTok-gesperrt-Webhoster-kuendigt-4845240.html.

  • Wenzel, M. & Żerkowska-Balas, M. (2018). Framing Effect of Media Portrayal of Migrants to the European Union: A Survey Experiment in Poland. East European Politics and Societies, 1–22. https://doi.org/10/gf3gqn.

  • Wesermüller, E. (2017). Wetterfest durch den Shit-Storm: Leitfaden für Journalist*innen zum Umgang mit Hassrede im Netz. Berlin, Germany: Neue deutsche Medienmacher e.V. No Hate Speech Movement.


  • Williams, K. D. (1997). Social Ostracism. In R. M. Kowalski (Hrsg.), Aversive Interpersonal Behaviors (S. 133–170). The Springer Series in Social/Clinical Psychology. Boston, MA: Springer US. https://doi.org/10.1007/978-1-4757-9354-3_7

  • Williams, M. L., Burnap, P., Javed, A., Liu, H., & Ozalp, S. (2019). Hate in the Machine: Anti-Black and Anti-Muslim Social Media Posts as Predictors of Offline Racially and Religiously Aggravated Crime. The British Journal of Criminology, Juli, azz049. https://doi.org/10/c9qh.

  • Wintterlin, F., Schatto-Eckrodt, T., Boberg, S., Frischlich, L., & Quandt, T. (2020). How to cope with dark participation: Moderation practices in German newsrooms. Digital Journalism. https://doi.org/10.1080/21670811.2020.1797519.

  • Wojcieszak, M. (2008). False consensus goes online: Impact of ideologically homogeneous groups on false consensus. Public Opinion Quarterly, 72(4), 781–791. https://doi.org/10/dj9vbq.

  • Xu, X., Zuo, X., Wang, X., & Han, S. (2009). Do you feel my pain? Racial group membership modulates empathic neural responses. The Journal of Neuroscience, 29(26), 8525–8529. https://doi.org/10.1523/JNEUROSCI.2418-09.2009.

  • Yanagizawa-Drott, D. (2014). Propaganda and conflict: Evidence from the Rwandan genocide. Quarterly Journal of Economics, 129(4), 1947–1994. https://doi.org/10/f6tvdp.

  • Zadro, L., Williams, K. D., & Richardson, R. (2004). How low can you go? Ostracism by a computer is sufficient to lower self-reported levels of belonging, control, self-esteem, and meaningful existence. Journal of Experimental Social Psychology, 40 (4), 560–567. https://doi.org/10.1016/j.jesp.2003.11.006.

  • Zannettou, S., Caulfield, T., Blackburn, J., De Cristofaro, E., Sirivianos, M., Stringhini, G., & Suarez-Tangil, G. (2018). On the origins of memes by means of fringe web communities. ACM Internet Measurement Conference. http://arxiv.org/abs/1805.12512.

  • Ziegele, M., Johnen, M., Bickler, A., Jakob, I., Setzer, T., & Schnauber, A. (2013). Männlich, rüstig, kommentiert? Einflussfaktoren auf die Aktivität kommentierender Nutzer von Online-Nachrichtenseiten. Studies in Communication | Media, 1, 67–114. https://doi.org/10/gf3g7z

  • Ziegele, M., & Jost, P. B. (2016). Not funny? The effects of factual versus sarcastic journalistic responses to uncivil user comments. Communication Research, Oktober. https://doi.org/10.1177/0093650216671854

  • Ziegele, M., Jost, P., Bormann, M., & Heinbach, D. (2018). Journalistic counter-voices in comment sections: Patterns, determinants, and potential consequences of interactive moderation of uncivil user comments. Studies in Communication | Media, 7(4), 525–54. https://doi.org/10/gf3gk2.

  • Ziegele, M., Jost, P., Frieß, D., & Naab, T. (2019). Aufräumen im Trollhaus. Zum Einfluss von Community-Managern und Aktionsgruppen in Kommentarspalten. Précis 3. Düsseldorf Institute for Internet and Democracy.


  • Ziegele, M., Koehler, C., & Weber, M. (2018). Socially Destructive? Effects of Negative and Hateful User Comments on Readers’ Donation Behavior toward Refugees and Homeless Persons. Journal of Broadcasting & Electronic Media, 62(4), 636–653. https://doi.org/10/gf8pn4


Author information


Corresponding author

Correspondence to Lena Frischlich.



Copyright information

© 2022 The Author(s), under exclusive license to Springer Fachmedien Wiesbaden GmbH, part of Springer Nature

About this chapter


Cite this chapter

Frischlich, L. (2022). H@te Online: Die Bedeutung digitaler Kommunikation für Hass und Hetze. In: Weitzel, G., Mündges, S. (eds) Hate Speech. Aktivismus- und Propagandaforschung. Springer VS, Wiesbaden. https://doi.org/10.1007/978-3-658-35658-3_5


  • DOI: https://doi.org/10.1007/978-3-658-35658-3_5


  • Publisher Name: Springer VS, Wiesbaden

  • Print ISBN: 978-3-658-35657-6

  • Online ISBN: 978-3-658-35658-3

  • eBook Packages: Social Science and Law (German Language)
