Evaluation of an Argument Visualisation Platform by Experts and Policy Makers

  • Efthimios Tambouris
  • Efpraxia Dalakiouridou
  • Eleni Panopoulou
  • Konstantinos Tarabanis
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6847)

Abstract

Argument visualisation (AV) tools enable structured debates around issues, positions and arguments. These tools have the potential to substantially improve transparency, e.g. by supporting the understanding of complex legislation and debates. In this paper we present the results of the evaluation of an AV platform by experts and policy makers. The results suggest that the potential of such tools is large, particularly for understanding complex legislation and debates. They also indicate that an AV tool could be used for large-scale deliberations, provided its usability is further improved. They further suggest that an AV tool is particularly relevant to the analysis and policy formation stages of policy making, where identification, elaboration and presentation of complex topics are needed. Since we employed a mature AV tool and concentrated on evaluating general aspects of such platforms, we believe the results can also apply to other AV platforms.

Keywords

Argument visualisation tools · WAVE project · Debategraph

Copyright information

© IFIP International Federation for Information Processing 2011

Authors and Affiliations

  • Efthimios Tambouris ¹
  • Efpraxia Dalakiouridou ¹
  • Eleni Panopoulou ¹
  • Konstantinos Tarabanis ¹

  1. University of Macedonia, Thessaloniki, Greece