Paper or Pixel? Comparing Paper- and Tool-Based Participatory Design Approaches

  • Matthias Heintz
  • Effie Lai-Chong Law
  • Samaneh Soleimani
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9298)


Traditionally, in participatory design (PD) workshops, participants use pen and paper to express their design ideas. However, gathering their feedback with a software tool can offer certain advantages. While some attempts to develop such tools have been made, the basic question of whether the tool-based approach yields better or worse feedback than its paper-based counterpart is rarely explored. We address this research question by conducting three PD workshops with both the paper-based and the tool-based approach. Beyond the findings on the comparability of the two approaches, a main contribution to future research on this question is the coding scheme CAt+. It enables systematic comparison of PD data collected with different methods and aims to help designers and developers exploit PD results.


Keywords: Participatory design · Paper-based · Tool-based · Coding scheme



This work was partially funded by the European Union in the context of the Go-Lab project (Grant Agreement no. 317601) under the Information and Communication Technologies (ICT) theme of the 7th Framework Programme for R&D (FP7). This document does not represent the opinion of the European Union, and the European Union is not responsible for any use that might be made of its content.
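A coding scheme such as CAt+ is only useful if independent coders apply it consistently, which is commonly checked with a chance-corrected agreement statistic such as Cohen's kappa. The following is a minimal sketch; the category labels (`add`, `change`, `remove`) and the ratings are hypothetical illustrations, not the actual CAt+ categories or study data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both coders labelled identically.
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement if each coder labelled independently at their own base rates.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two coders assign hypothetical CAt+-style categories to the same ten feedback items.
coder1 = ["add", "remove", "add", "change", "add", "change", "remove", "add", "add", "change"]
coder2 = ["add", "remove", "add", "add",    "add", "change", "remove", "add", "change", "change"]
print(round(cohens_kappa(coder1, coder2), 2))  # → 0.68
```

By convention, kappa values above roughly 0.6 are read as substantial agreement, so a result like the one above would suggest the scheme can be applied reliably by independent coders.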



Copyright information

© IFIP International Federation for Information Processing 2015

Authors and Affiliations

  • Matthias Heintz (University of Leicester, Leicester, UK)
  • Effie Lai-Chong Law (University of Leicester, Leicester, UK)
  • Samaneh Soleimani (University of Leicester, Leicester, UK)
