Detecting Deictic Gestures for Control of Mobile Robots

  • Tobias Nowack
  • Stefan Lutherdt
  • Stefan Jehring
  • Yue Xiong
  • Sabine Wenzel
  • Peter Kurtz
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 499)


Abstract

In industrial environments, especially under the conditions of "Industry 4.0", a mobile, hands-free interaction solution for controlling machines is needed. Within this project, a mobile robot system for picking, lifting, and transporting small boxes in logistics domains was created. It consists of a gesture detection and recognition system based on the Microsoft Kinect™ together with gesture detection algorithms. To implement these algorithms, several studies on the intuitive use, execution, and understanding of mid-air gestures were conducted. The basis of detection was to determine whether a gesture is executed dynamically or statically and to derive a mathematical model for these different kinds of gestures. Suitable parameters describing the individual gesture phases were found and will be used for their robust recognition. A first prototype implementing this technology is also presented in this paper.
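The abstract's core distinction is whether a tracked gesture is executed statically (e.g. a held pointing pose) or dynamically (a moving stroke). A minimal sketch of that decision, assuming hypothetical inputs of 3-D hand-joint positions and timestamps such as a Kinect skeleton stream would provide (the function names and the speed threshold are illustrative assumptions, not the authors' published model):

```python
import math


def mean_speed(positions, timestamps):
    """Mean speed (m/s) of a joint over a window of 3-D samples.

    positions  -- list of (x, y, z) tuples in metres
    timestamps -- matching list of sample times in seconds
    """
    total = 0.0
    for p0, p1, t0, t1 in zip(positions, positions[1:], timestamps, timestamps[1:]):
        total += math.dist(p0, p1) / (t1 - t0)
    return total / (len(positions) - 1)


def classify_gesture(positions, timestamps, speed_threshold=0.15):
    """Label a gesture window 'static' (held pose) or 'dynamic' (stroke).

    speed_threshold is an assumed cut-off; a real system would tune it
    from recorded gesture data.
    """
    return "static" if mean_speed(positions, timestamps) < speed_threshold else "dynamic"
```

A pointing gesture held almost still yields a near-zero mean speed and is labelled static, while a sweeping motion exceeds the threshold and is labelled dynamic; the paper's phase parameters would refine this coarse split.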


Keywords: Human-robot interaction · Mid-air gestures · Deictic gestures (pointing) · Definition of gestures · Kinect 2™



Copyright information

© Springer International Publishing Switzerland 2017

Authors and Affiliations

  • Tobias Nowack (1)
  • Stefan Lutherdt (2)
  • Stefan Jehring (1)
  • Yue Xiong (1)
  • Sabine Wenzel (1)
  • Peter Kurtz (1)

  1. Ergonomics Group, Technische Universität Ilmenau, Ilmenau, Germany
  2. Biomechatronics Group, Technische Universität Ilmenau, Ilmenau, Germany