Journal on Multimodal User Interfaces, Volume 8, Issue 3, pp 305–317

Gesture analysis in a case study with a tangible user interface for collaborative problem solving

  • Dimitra Anastasiou
  • Valérie Maquil
  • Eric Ras
Original Paper


This paper describes a case study conducted at the Public Research Centre Henri Tudor, Luxembourg, in November 2012. A tangible user interface (TUI) was used in the context of collaborative problem solving. The participants' task was to explore the effect of external parameters on the electricity production of a windmill presented on a tangible tabletop; these parameters were represented by physical objects. The goal of the study was to observe, analyze, and understand how multiple participants interacted with the table while collaboratively solving a task. In this paper we focus on the gestures the users performed during the experiment and the reactions of the other users to those gestures. Gestures were categorized as deictic/pointing, iconic, emblems, adaptors, and TUI-related. TUI-related/manipulative gestures, such as tracing and rotating, accounted for the largest share, followed by pointing gestures. In addition, we evaluated how actively the participants engaged and whether gestures were accompanied by speech during the user study. Our case study can be characterized as a collaborative, problem-solving, cognitive activity; it showed that gesturing facilitates group focus, enhances collaboration among the participants, and encourages the use of epistemic actions.


Keywords: Deictic/pointing gesture · Emotion · Iconic gesture · Tangible user interface (TUI)



Copyright information

© OpenInterface Association 2014

Authors and Affiliations

  1. Computer Science/Language Sciences, University of Bremen, Bremen, Germany
  2. Public Research Centre Henri Tudor, Luxembourg, Luxembourg
