Challenging Misinformation: Exploring Limits and Approaches

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11749)


The manipulation of information and the dissemination of “fake news” are practices that trace back to the earliest records of human history. Significant changes in the technological environment, enabling ubiquity, immediacy and considerable anonymity, have facilitated the spread of misinformation in unforeseen ways, raising concerns about people’s (mis)perception of social issues worldwide. As a wicked problem, limiting the harm caused by misinformation goes beyond technical solutions, requiring regulatory and behavioural changes as well. This workshop proposes to unpack the challenge at hand by bringing diverse perspectives to bear on the problem. Based on participatory design principles, it will challenge participants to critically reflect on the limits of existing socio-technical approaches and to co-create scenarios in which digital platforms support misinformation resilience.


Keywords: Co-creation · Misinformation · Disinformation · Fake news



This workshop proposal has been supported by the EC within the Horizon 2020 programme under grant agreement 770302 (Co-Inform).



Copyright information

© IFIP International Federation for Information Processing 2019

Authors and Affiliations

  1. Knowledge Media Institute, The Open University, Milton Keynes, UK
  2. eGovlab, Stockholm University, Stockholm, Sweden
  3. Cyprus University of Technology, Limassol, Cyprus
