Post-Truth: Organisational Social Responsibility in an AI-Driven Society

  • Chapter
Platforms and Artificial Intelligence

Part of the book series: Progress in IS (PROIS)

Abstract

This study examines the nature of a post-truth society, which is imminent given the widespread use of artificial intelligence (AI)-based information systems built on machine learning methods such as deep learning, together with the social attitudes and responsibilities of organisations that develop, implement, operate and/or use those systems. In such a society, the truths about individuals, groups, organisations, communities, societies, nations, things, events and the world become meaningless or worthless; individuals are treated as black boxes to be manipulated and exploited by malicious AI-based system operators; and the four factors that erode accountability in computing, namely many hands, bugs, the computer as a scapegoat, and ownership without liability (Nissenbaum, Science and Engineering Ethics 2(1):25–42, 1996), worsen because of the unpredictability and uncontrollability of AI-based system behaviours, leading to a lack of responsibility and accountability in AI computing. To prevent the full emergence of a post-truth society, or to mitigate the risks associated with such a society, and to restore responsibility and accountability to computing, organisations that are key players in AI computing must be required to proactively address the ethical and social issues caused by the development and use of AI-based systems.


Notes

  1. Thinking of the far-distant future, we may need to specify the conditions for granting legal personality to AI-based systems before it is too late. However, this is beyond the scope of this study.

Acknowledgements

This study is based on the author’s previous work with Dr. Yohko Orito of Ehime University, Mr. Tatsuya Yamazaki of the University of Toyama and Dr. Kazuyuki Shimizu of Meiji University, which was presented at the ETHICOMP 2020 conference and published as a full conference paper under the title of “Post-truth society: the AI-driven society where no one is responsible” (Yamazaki et al., 2020). The author thanks these colleagues for their enthusiastic participation in discussions about the phenomena of the post-truth society. The author also thanks Dr. Fareed Ben-Youssef of Texas Tech University and Dr. Paul B. de Laat of the University of Groningen for helpful suggestions provided during the conference. This study was supported by the JSPS Grant-in-Aid for Scientific Research (C) 20K01920.

Author information

Corresponding author

Correspondence to Kiyoshi Murata.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Murata, K. (2022). Post-Truth: Organisational Social Responsibility in an AI-Driven Society. In: Bounfour, A. (ed.) Platforms and Artificial Intelligence. Progress in IS. Springer, Cham. https://doi.org/10.1007/978-3-030-90192-9_13
