Evaluating Ubiquitous Systems with Users (Workshop Summary)

  • Christian Kray
  • Lars Bo Larsen
  • Patrick Olivier
  • Margit Biemans
  • Arthur van Bunningen
  • Mirko Fetter
  • Tim Jay
  • Vassilis-Javed Khan
  • Gerhard Leitner
  • Ingrid Mulder
  • Jörg Müller
  • Thomas Plötz
  • Irene Lopez de Vallejo
Part of the Communications in Computer and Information Science book series (CCIS, volume 11)


Evaluating ubiquitous systems with users can be challenging, and the goal of this workshop was to take stock of current issues and of novel approaches to addressing this challenge. In this paper, we report on the discussions held during several plenary and small-group sessions. We first briefly review the evaluation methods we identified as being used in ubiquitous computing, and then discuss several issues and research questions that emerged during the discussion. These issues include: data sources used for evaluation, comparing ubiquitous systems, interdisciplinary evaluation, multi-method evaluation, factoring in context, and dealing with disengaged users.





Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Christian Kray (1)
  • Lars Bo Larsen (2)
  • Patrick Olivier (1)
  • Margit Biemans (3)
  • Arthur van Bunningen (4)
  • Mirko Fetter (5)
  • Tim Jay (6)
  • Vassilis-Javed Khan (7)
  • Gerhard Leitner (8)
  • Ingrid Mulder (3)
  • Jörg Müller (9)
  • Thomas Plötz (10)
  • Irene Lopez de Vallejo (11)

  1. Newcastle University, Newcastle upon Tyne, United Kingdom
  2. Aalborg Universitet, Aalborg, Denmark
  3. Telematica Instituut, Enschede, The Netherlands
  4. University of Twente, Enschede, The Netherlands
  5. Bauhaus-University Weimar, Weimar, Germany
  6. University of Bath, Bath, United Kingdom
  7. Eindhoven University of Technology, Eindhoven, The Netherlands
  8. Alpen-Adria University of Klagenfurt, Klagenfurt, Austria
  9. University of Münster, Münster, Germany
  10. Technische Universität Dortmund, Dortmund, Germany
  11. University College London, London, United Kingdom