Towards Many Gestures to One Command: A User Study for Tabletops

  • Yosra Rekik
  • Laurent Grisoni
  • Nicolas Roussel
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8118)

Abstract

Multi-touch gestures are often designed by application designers with a one-to-one mapping between gestures and commands in mind. This ignores the high variability of the gestures users produce for actions in the physical world, and it can lead to overly simplistic interaction choices. Our motivation is to take a step toward many-to-one mappings between user gestures and commands by understanding the variability of user gestures on multi-touch systems. To this end, we set up a user study targeting symbolic gestures on tabletops. From a first study phase, we provide a qualitative analysis of user gesture variability and derive from it a taxonomy of user gestures, which we discuss and compare to existing taxonomies. We introduce the notion of atomic movement: elementary movements that can be combined over time, either sequentially or in parallel, to structure a user gesture. A second study phase is then performed with a specific class of gesture-drawn symbols; from this phase, and according to the proposed taxonomy, we evaluate user gesture variability with a fine-grained quantitative analysis. Our findings indicate that users use one or two hands equally often, and that more than half of the gestures are performed using a parallel or sequential combination of atomic movements. We also show how user gestures distribute over the different movement categories, and how they correlate with the number of fingers and hands engaged in the interaction. Finally, we discuss the implications of this work for interaction design, its practical consequences for gesture recognition, and potential applications.
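
To make the compositional notion concrete, the sketch below models a gesture as either an atomic movement or a sequential/parallel combination of sub-gestures. This is a minimal illustrative rendering of the idea, not the authors' implementation; all names (AtomicMovement, Sequence, Parallel, hands_used) are hypothetical.

```python
from dataclasses import dataclass
from typing import List, Tuple, Union

# Hypothetical model: an atomic movement is an elementary stroke
# performed by some set of fingers of one hand.
@dataclass
class AtomicMovement:
    hand: str                                      # "left" or "right"
    fingers: int                                   # fingers engaged
    trajectory: List[Tuple[float, float, float]]   # sampled (x, y, t) points

# Gestures compose atomic movements over time:
@dataclass
class Sequence:
    parts: List["Gesture"]   # performed one after another

@dataclass
class Parallel:
    parts: List["Gesture"]   # performed simultaneously

Gesture = Union[AtomicMovement, Sequence, Parallel]

def hands_used(g: Gesture) -> set:
    """Collect the hands engaged anywhere in a composed gesture."""
    if isinstance(g, AtomicMovement):
        return {g.hand}
    return set().union(*(hands_used(p) for p in g.parts))

# An "X" symbol drawn as two sequential one-finger strokes of one hand...
stroke_a = AtomicMovement("right", 1, [(0.0, 0.0, 0.0), (1.0, 1.0, 0.3)])
stroke_b = AtomicMovement("right", 1, [(1.0, 0.0, 0.5), (0.0, 1.0, 0.8)])
x_sequential = Sequence([stroke_a, stroke_b])

# ...or as two strokes drawn in parallel, one per hand.
x_parallel = Parallel([
    stroke_a,
    AtomicMovement("left", 1, [(1.0, 0.0, 0.0), (0.0, 1.0, 0.3)]),
])

print(hands_used(x_sequential))  # {'right'}
print(hands_used(x_parallel))    # {'left', 'right'}
```

Under this (assumed) model, a many-to-one mapping amounts to recognizing several such compositions, like x_sequential and x_parallel, as the same symbol and hence the same command.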

Keywords

Tabletop · Multi-touch gesture · Gesture recognition · Interaction design


Copyright information

© IFIP International Federation for Information Processing 2013

Authors and Affiliations

  • Yosra Rekik ¹
  • Laurent Grisoni ¹ ²
  • Nicolas Roussel ¹
  1. INRIA Lille - Nord Europe, France
  2. CNRS, LIFL, University of Lille 1, France