Updating Probabilistic Epistemic States in Persuasion Dialogues

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10369)


In persuasion dialogues, the persuader's ability to model the persuadee allows the persuader to make better choices of move. The epistemic approach to probabilistic argumentation is a promising way of modelling the persuadee's belief in arguments, and proposals have been made for update methods that specify how these beliefs can be updated at each step of the dialogue. However, these proposals need to be better understood, and insights are needed into the space of possible update functions. In this paper, we therefore present a general framework for update functions, within which we consider both existing and novel update functions.
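To make the setting concrete, the following is a minimal, hypothetical sketch of one dialogue step under the epistemic approach: beliefs are modelled as a map from arguments to probabilities, and an update function revises them when a new argument is posited. The function name `posit_argument`, the attack-scaling rule, and the `shift` parameter are illustrative assumptions, not the update functions studied in the paper.

```python
# Hypothetical sketch (not the paper's actual update functions).
# An epistemic state assigns each argument a belief in [0, 1]; positing
# an argument triggers an update function that revises these beliefs.

def posit_argument(beliefs, arg, attacks, shift=0.5):
    """Toy update rule: the new argument `arg` is believed fully (1.0),
    and belief in each argument it attacks is scaled down by `shift`."""
    new_beliefs = dict(beliefs)
    new_beliefs[arg] = 1.0
    for target in attacks.get(arg, []):
        if target in new_beliefs:
            new_beliefs[target] *= shift
    return new_beliefs

# One dialogue step: the persuader posits "b", which attacks "a".
beliefs = {"a": 0.8}
attacks = {"b": ["a"]}
updated = posit_argument(beliefs, "b", attacks)
# updated == {"a": 0.4, "b": 1.0}
```

A framework for update functions, as described in the abstract, would characterise the space of such rules (e.g. how sharply belief in attacked arguments should drop) rather than fix any single one.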


Keywords: Epistemic state · Full version · Satisfaction condition · Argumentation framework · Semantical constraint



This research was partly funded by EPSRC grant EP/N008294/1 for the Framework for Computational Persuasion project.



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. Department of Computer Science, University College London, London, UK
  2. Institute of Cognitive Science, University of Osnabrück, Osnabrück, Germany
