Personal and Ubiquitous Computing

Volume 12, Issue 3, pp 237–254

Evaluating teamwork support in tabletop groupware applications using collaboration usability analysis

Original Article

Abstract

Tabletop groupware systems have natural advantages for collaboration, but they present a challenge for application designers because shared work and interaction proceed differently than they do in desktop systems. As a result, tabletop systems still suffer from usability problems. We have developed a usability evaluation technique, T-CUA, that focuses attention on teamwork issues and helps designers determine whether prototypes adequately support the basic actions and interactions that are fundamental to table-based collaboration. We compared T-CUA with expert review in a user study in which 12 evaluators assessed an early tabletop prototype using one of the two methods. The group using T-CUA found more teamwork problems, and found problems in more areas, than the group using expert review; in addition, participants found T-CUA to be effective and easy to use. The success of T-CUA shows the benefits of using a set of activity primitives as the basis for discount usability techniques.

Keywords

Usability evaluation · Tabletop groupware · Computer-supported cooperative work · Collaboration usability analysis · T-CUA

Abbreviations

CUA: Collaboration usability analysis

T-CUA: Table-collaboration usability analysis

Copyright information

© Springer-Verlag London Limited 2007

Authors and Affiliations

  1. School of Computer Science, University of Nevada Las Vegas, Las Vegas, USA
  2. Department of Computer Science, University of Saskatchewan, Saskatoon, Canada
