
Human and Technical Aspect of Content Management

Censorship from Plato to Social Media

Part of the book series: Law, Governance and Technology Series (LGTS, volume 61)


Abstract

The future of content moderation is unlikely to be a choice between human moderation and moderation by artificial intelligence, but rather a combination of the two. In 2020, during the COVID-19 pandemic, when big tech companies sent their own employees to work from home and handed more tasks to artificial intelligence, “Facebook and Google roughly doubled the amount of potentially harmful material they removed in the second quarter of this year compared with the three months through March”, and complaints about these decisions multiplied as a result, making it clear that human content scrutiny will remain necessary for some time to come. The chapter examines human moderation and the use of artificial intelligence in content management. “After all, a butt is a butt and a nipple is a nipple. But deciding when a nipple is art, porn or protest gets murky even when humans are doing the deciding. Teaching AI software about human sexual desire is a whole other ballgame.”


Notes

  1. Frieze (2018).
  2. Ibid. It should be added, however, that the episode brought a fortunate increase in traffic to Flemish museums during the period, and the agency produced a video mocking Facebook’s policy: Visitflanders (2018).
  3. Stapley-Brown (2018).
  4. For the interesting case that ended in a settlement between the parties, see Gosztonyi (2022).
  5. Dawson (2018).
  6. Facebook Community Standards, III.14. Adult nudity and sexual activity.
  7. BBC News (2016).
  8. NCAC (2019).
  9. And Angelo Stagnaro probably misunderstands the process when writing: “Facebook has had a long history of censoring Christian organizations and individuals, flagging our beliefs as being ‘hateful’ or otherwise inappropriate, but it is their actions and inaction that is most accurately branded as hateful.” Stagnaro (2018).
  10. Daily Mail Online (2020).
  11. Roy (2020).
  12. Jenik (2021).
  13. https://transparency.fb.com/policies/community-standards/.
  14. Pintér (2018); cf. Rebecca MacKinnon’s term ‘Facebookistan’: MacKinnon (2013).
  15. Gillespie (2018), p. 116.
  16. Huszár (2022).
  17. Interpretative insertion by the author.
  18. Koebler and Cox (2018).
  19. Ibid.
  20. Joaquin Quiñonero Candela, the company’s Head of Artificial Intelligence, was interviewed about the shortcomings of Facebook and its use of artificial intelligence: Hao (2021).
  21. Dias Oliva (2020), p. 612.
  22. Jillson (2021).
  23. Sorbán (2021), pp. 72–73.
  24. Balkin (2018), p. 2024.
  25. Llansó et al. (2020), pp. 2–3.
  26. Huszár (2021), p. 34.
  27. Nlc.hu (2018).
  28. Newton (2019a); cf. Buni and Chemaly (2016).
  29. Newton (2019b).
  30. Hvg.hu (2019).
  31. Ruckenstein and Turunen (2020), pp. 1026–1042.
  32. Gillespie (2020), p. 2.
  33. Scott and Kayali (2020).
  34. Barker and Murphy (2020).
  35. UNHRC (2018), p. 64.
  36. Turner Lee et al. (2019).
  37. Cambridge Consultants (2019), p. 5.
  38. Ibid., pp. 6–8.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Gosztonyi, G. (2023). Human and Technical Aspect of Content Management. In: Censorship from Plato to Social Media. Law, Governance and Technology Series, vol 61. Springer, Cham. https://doi.org/10.1007/978-3-031-46529-1_8


  • DOI: https://doi.org/10.1007/978-3-031-46529-1_8

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-46528-4

  • Online ISBN: 978-3-031-46529-1

  • eBook Packages: Law and Criminology, Law and Criminology (R0)
