An Investigation of Misinformation Harms Related to Social Media During Humanitarian Crises

  • Thi Tran (email author)
  • Rohit Valecha
  • Paul Rad
  • H. Raghav Rao
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1186)


During humanitarian crises, people face danger and need a large amount of information in a short period of time. This need creates fertile ground for misinformation such as rumors, fake news, and hoaxes to spread within and beyond the affected community. It may be unintended misinformation built on unconfirmed details, or deliberate disinformation crafted to deceive people for gain. Either way, it produces information harms that can have serious short-term or long-term consequences. Although researchers have built misinformation detection systems and algorithms, examined the roles of the parties involved, and studied how misinformation spreads and convinces people, very little attention has been paid to the types of harms misinformation causes. In the context of humanitarian crises, we propose a taxonomy of information harms and assess people's perceptions of the risks those harms pose. Such a taxonomy can serve as a basis for future research that quantitatively measures harms in specific contexts. In addition, the perceptions of affected people were investigated in four purposively chosen scenarios along two dimensions: the likelihood of occurrence and the level of impact of the harms.


Keywords: Misinformation · Humanitarian crises · Disasters · Harms · Injuries · Taxonomy



Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  • Thi Tran¹ (email author)
  • Rohit Valecha¹
  • Paul Rad¹
  • H. Raghav Rao¹

  1. Department of Information Systems and Cyber Security, The University of Texas at San Antonio, San Antonio, USA
