User-Defined Gestures for Augmented Reality

  • Thammathip Piumsomboon
  • Adrian Clark
  • Mark Billinghurst
  • Andy Cockburn
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8118)


Abstract

Recently there has been an increase in research on hand gestures for interaction in Augmented Reality (AR). This work has primarily focused on researcher-designed gestures, while little is known about users' preferences and behavior for gestures in AR. In this paper, we present a guessability study for hand gestures in AR in which 800 gestures were elicited from 20 participants for 40 selected tasks. Based on the agreement found among the elicited gestures, a user-defined gesture set was created to guide designers toward consistent, user-centered gestures in AR. Wobbrock's surface taxonomy was extended to cover the dimensionality of AR and used to characterize the collected gestures. Common motifs arising from the empirical findings were examined to better understand users' thinking and behavior. This work aims to support the design of consistent, user-centered gestures in AR.
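The agreement measure underlying such user-defined gesture sets follows Wobbrock et al.'s guessability methodology [9, 11]: for each task (referent), participants' gesture proposals are grouped by identity, and agreement is the sum of squared group proportions. A minimal Python sketch, assuming gestures are already labeled as identical or distinct (the function name and example labels are illustrative, not from the paper):

```python
from collections import Counter

def agreement_score(proposals):
    """Per-referent agreement score (Wobbrock et al.):
    A_r = sum over groups of identical gestures of (|P_i| / |P|)^2,
    where P is the set of all proposals for the referent."""
    n = len(proposals)
    counts = Counter(proposals)  # group identical gesture proposals
    return sum((c / n) ** 2 for c in counts.values())

# Hypothetical example: of 20 participants for a "move object" task,
# 12 propose a pinch-and-drag, 5 a grab, and 3 a tap.
proposals = ["pinch"] * 12 + ["grab"] * 5 + ["tap"] * 3
score = agreement_score(proposals)  # (12/20)^2 + (5/20)^2 + (3/20)^2 ≈ 0.445
```

Higher scores indicate stronger consensus (1.0 when all participants propose the same gesture), which is what justifies picking the most frequent proposal for each task when assembling the user-defined set.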


Keywords: Augmented reality, gestures, guessability


References

  1. Azuma, R.: A Survey of Augmented Reality. Presence 6, 355–385 (1997)
  2. Heidemann, G., Bax, I., Bekel, H.: Multimodal interaction in an augmented reality scenario. In: Proceedings of the Sixth International Conference on Multimodal Interfaces (ICMI 2004), pp. 53–60. ACM (2004)
  3. Kolsch, M., Bane, R., Hollerer, T., Turk, M.: Multimodal interaction with a wearable augmented reality system. IEEE Computer Graphics and Applications 26, 62–71 (2006)
  4. Kaiser, E., Olwal, A., McGee, D., Benko, H., Corradini, A., Li, X., Cohen, P., Feiner, S.: Mutual disambiguation of 3D multimodal interaction in augmented and virtual reality. In: Proceedings of the Fifth International Conference on Multimodal Interfaces (ICMI 2003), pp. 12–19. ACM (2003)
  5. Lee, T., Hollerer, T.: Handy AR: Markerless inspection of augmented reality objects using fingertip tracking. In: 11th IEEE International Symposium on Wearable Computers (ISWC 2007), pp. 83–90. IEEE Computer Society (2007)
  6. Fernandes, B., Fernandez, J.: Bare hand interaction in tabletop augmented reality. In: SIGGRAPH 2009: Posters. ACM (2009)
  7. Hilliges, O., Kim, D., Izadi, S., Weiss, M., Wilson, A.: HoloDesk: direct 3D interactions with a situated see-through display. In: Proceedings of the 2012 ACM Annual Conference on Human Factors in Computing Systems, pp. 2421–2430. ACM, Austin (2012)
  8. Benko, H., Jota, R., Wilson, A.: MirageTable: freehand interaction on a projected augmented reality tabletop. In: Proceedings of the 2012 ACM Annual Conference on Human Factors in Computing Systems, pp. 199–208. ACM, Austin (2012)
  9. Wobbrock, J.O., Morris, M.R., Wilson, A.D.: User-defined gestures for surface computing. In: Proceedings of the 27th International Conference on Human Factors in Computing Systems, pp. 1083–1092. ACM, Boston (2009)
  10. Ruiz, J., Li, Y., Lank, E.: User-defined motion gestures for mobile interaction. In: Proceedings of the 2011 Annual Conference on Human Factors in Computing Systems, pp. 197–206. ACM, Vancouver (2011)
  11. Wobbrock, J.O., Aung, H.H., Rothrock, B., Myers, B.A.: Maximizing the guessability of symbolic input. In: CHI 2005 Extended Abstracts on Human Factors in Computing Systems, pp. 1869–1872. ACM, Portland (2005)
  12. Lee, J.Y., Rhee, G.W., Seo, D.W.: Hand gesture-based tangible interactions for manipulating virtual objects in a mixed reality environment. International Journal of Advanced Manufacturing Technology 51, 1069–1082 (2010)
  13. Lee, T., Hollerer, T.: Multithreaded Hybrid Feature Tracking for Markerless Augmented Reality. IEEE Transactions on Visualization and Computer Graphics 15, 355–368 (2009)
  14. Olwal, A., Benko, H., Feiner, S.: SenseShapes: using statistical geometry for object selection in a multimodal augmented reality system. In: Proceedings of the Second IEEE and ACM International Symposium on Mixed and Augmented Reality (ISMAR 2003), pp. 300–301 (2003)
  15. Lee, M.: Multimodal Speech-Gesture Interaction with 3D Objects in Augmented Reality Environments. PhD thesis, Department of Computer Science and Software Engineering, University of Canterbury, Christchurch (2010)
  16. Newcombe, R.A., Izadi, S., Hilliges, O., Molyneaux, D., Kim, D., Davison, A.J., Kohli, P., Shotton, J., Hodges, S., Fitzgibbon, A.: KinectFusion: Real-time dense surface mapping and tracking. In: 10th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2011), pp. 127–136. IEEE Computer Society (2011)
  17. Schuler, D., Namioka, A.: Participatory Design: Principles and Practices. Lawrence Erlbaum, Hillsdale (1993)
  18. van Krevelen, D.W.F., Poelman, R.: A Survey of Augmented Reality Technologies, Applications and Limitations. The International Journal of Virtual Reality 9, 1–20 (2010)
  19. Broll, W., Lindt, I., Ohlenburg, J., Wittkamper, M., Yuan, C., Novotny, T., Fatah, G., Mottram, C., Strothmann, A.: ARTHUR: A Collaborative Augmented Environment for Architectural Design and Urban Planning. Journal of Virtual Reality and Broadcasting 1 (2004)
  20. Lee, G.A., Billinghurst, M., Kim, G.J.: Occlusion based interaction methods for tangible augmented reality environments. In: Proceedings of VRCAI 2004: ACM SIGGRAPH International Conference on Virtual Reality Continuum and its Applications in Industry, pp. 419–426. ACM (2004)
  21. Clark, A., Green, R.: Optical-Flow Perspective Invariant Registration. In: Proceedings of the International Conference on Digital Image Computing: Techniques and Applications, pp. 117–123 (2011)
  22. Piumsomboon, T., Clark, A., Umakatsu, A., Billinghurst, M.: Poster: Physically-based natural hand and tangible AR interaction for face-to-face collaboration on a tabletop. In: 2012 IEEE Symposium on 3D User Interfaces (3DUI), pp. 155–156 (2012)

Copyright information

© IFIP International Federation for Information Processing 2013

Authors and Affiliations

  • Thammathip Piumsomboon (1, 2)
  • Adrian Clark (1)
  • Mark Billinghurst (1)
  • Andy Cockburn (2)

  1. HIT Lab NZ, University of Canterbury, Christchurch, New Zealand
  2. Department of Computer Science and Software Engineering, University of Canterbury, Christchurch, New Zealand