How to Click in Mid-Air

  • Florian van de Camp
  • Alexander Schick
  • Rainer Stiefelhagen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8028)


In this paper, we investigate interaction with distant interfaces. In particular, we focus on how to issue mouse-click-like commands in mid-air, and we propose a taxonomy of distant, one-armed clicking gestures. The gestures are divided into three main groups according to the part of the arm that performs the gesture: the fingers, the hand, or the arm. We evaluated nine specific gestures in a Wizard of Oz study, asking participants to rate each gesture with a TLX questionnaire and to give an overall ranking. Based on this evaluation, we identified groups of gestures of varying acceptability that can serve as a reference for interface designers when selecting the most suitable gesture.
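The taxonomy described above can be sketched as a simple data structure: a mapping from the articulating body part (fingers, hand, or arm) to the gestures in that group. This is a minimal illustration only; the gesture labels below are hypothetical placeholders, since the paper's nine concrete gestures are not named in this excerpt.

```python
from enum import Enum


class Articulator(Enum):
    """Top-level groups of the one-armed clicking-gesture taxonomy:
    the part of the arm responsible for performing the click."""
    FINGERS = "fingers"
    HAND = "hand"
    ARM = "arm"


# Hypothetical example gestures per group (placeholders, not the
# nine gestures evaluated in the paper's Wizard of Oz study).
taxonomy = {
    Articulator.FINGERS: ["pinch", "finger tap"],
    Articulator.HAND: ["palm push", "fist clench"],
    Articulator.ARM: ["forward thrust"],
}


def classify(gesture: str) -> Articulator:
    """Return the taxonomy group a gesture label belongs to."""
    for group, gestures in taxonomy.items():
        if gesture in gestures:
            return group
    raise KeyError(f"unknown gesture: {gesture}")
```

A designer could use such a lookup to compare candidate click gestures group by group, mirroring how the study aggregates TLX ratings within each of the three groups.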





Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Florian van de Camp (1)
  • Alexander Schick (1)
  • Rainer Stiefelhagen (2)
  1. Fraunhofer Institute of Optronics, System Technologies and Image Exploitation, Karlsruhe, Germany
  2. Karlsruhe Institute of Technology (KIT), Germany
