Analyzing Human Gestural Motions using Acceleration Sensors

  • F. G. Hofmann
  • G. Hommel
Conference paper


A novel approach to human gestural motion analysis and recognition based on acceleration sensors is proposed. Using accelerometers for motion analysis offers very high temporal resolution (measurement rates ≥ 1 kHz are easily achievable) and low instrumentation overhead. This paper presents a theoretical model for describing gestural motion trajectories, deriving the expected accelerations, and estimating trajectories from measured acceleration data. The theory is applied to measurements performed with a system of two combined triaxial accelerometers. Applying results from the theory of curves and differential geometry, features for pattern classification are extracted from the data. As an example, a nearest-neighbour classifier is used to match the features of gestural trajectories against stored templates. Preliminary trials indicate good recognition rates for a set of 9 different elementary gesture motions of varying size and speed. A short overview of the interdisciplinary research project “Gesture Recognition with SensorGloves” at the Technical University of Berlin is also given.
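The abstract's pipeline (trajectory estimation from acceleration data, followed by nearest-neighbour matching of extracted features against stored templates) can be sketched in a few lines. This is an illustrative sketch only, not the authors' implementation: the function names, the trapezoidal double integration, and the Euclidean distance metric are assumptions, and real accelerometer data would additionally require gravity compensation and drift correction.

```python
import numpy as np

def integrate_trajectory(accel, dt):
    """Estimate a motion trajectory from acceleration samples (N x 3 array)
    by numerical double integration (trapezoidal rule).
    Assumes gravity has already been removed and zero initial velocity/position;
    a practical system would also need drift correction."""
    zero = np.zeros((1, accel.shape[1]))
    vel = np.cumsum((accel[:-1] + accel[1:]) / 2 * dt, axis=0)
    vel = np.vstack([zero, vel])                                # v(0) = 0
    pos = np.cumsum((vel[:-1] + vel[1:]) / 2 * dt, axis=0)
    return np.vstack([zero, pos])                               # p(0) = 0

def nearest_neighbour(feature, templates):
    """Match a feature vector against stored templates and return the label
    of the closest one (Euclidean distance, as a simple stand-in metric)."""
    labels = list(templates)
    dists = [np.linalg.norm(feature - templates[label]) for label in labels]
    return labels[int(np.argmin(dists))]

# Sanity check: constant 1 m/s^2 along x for 1 s should yield p = a*t^2/2 = 0.5 m.
accel = np.tile(np.array([1.0, 0.0, 0.0]), (101, 1))
pos = integrate_trajectory(accel, dt=0.01)
print(pos[-1, 0])  # approximately 0.5
```

In the paper, the matched features come from differential geometry of the estimated curve rather than raw positions; the sketch only illustrates the integration and template-matching structure.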





Copyright information

© Springer-Verlag London 1997

Authors and Affiliations

  • F. G. Hofmann, Department of Computer Science, Technical University of Berlin, Real-Time Systems and Robotics Research Group, Berlin, Germany
  • G. Hommel, Department of Computer Science, Technical University of Berlin, Real-Time Systems and Robotics Research Group, Berlin, Germany