Context-Based Approach for Human Gesture Analysis

  • Chil-Woo Lee
  • Jae-Yong Oh
  • Yang-Weon Lee
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4251)


In this paper, we propose a state transition model using a context-based approach for gesture analysis. The method defines the analysis situation as five different states: NULL, OBJECT, POSTURE, LOCAL, and GLOBAL. We first infer the situation of the system by estimating the transition of the state model, and then apply different analysis algorithms according to the system state. Gestures are analyzed with a queue-based matching method, newly proposed in this paper, instead of general gesture spotting algorithms. In this algorithm, the movement of feature points (the face and both hands) is compared directly, so gestures can be recognized quickly without imposing any constraints, making the method suitable for real-world applications. The transitions between states can be interpreted as the context of motion, which is estimated with conditional probabilities between the states.
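The abstract's two main ideas can be sketched in code: a five-state transition model driven by conditional probabilities, and a queue that buffers recent face/hand feature points for direct trajectory comparison. This is a minimal illustration only; the transition probabilities, queue length, and distance measure below are hypothetical placeholders, not values from the paper.

```python
from collections import deque
from enum import Enum, auto
import math

class State(Enum):
    """The five analysis states named in the abstract."""
    NULL = auto()
    OBJECT = auto()
    POSTURE = auto()
    LOCAL = auto()
    GLOBAL = auto()

# Hypothetical conditional transition probabilities P(next | current);
# the paper estimates these from the observed motion context.
TRANSITIONS = {
    State.NULL:    {State.NULL: 0.7, State.OBJECT: 0.3},
    State.OBJECT:  {State.OBJECT: 0.5, State.POSTURE: 0.5},
    State.POSTURE: {State.POSTURE: 0.4, State.LOCAL: 0.3, State.GLOBAL: 0.3},
    State.LOCAL:   {State.LOCAL: 0.6, State.NULL: 0.4},
    State.GLOBAL:  {State.GLOBAL: 0.6, State.NULL: 0.4},
}

def most_likely_next(current: State) -> State:
    """Pick the most probable successor state given the current one."""
    options = TRANSITIONS[current]
    return max(options, key=options.get)

class QueueMatcher:
    """Queue-based matching: buffer the last `maxlen` (face, left hand,
    right hand) point triples and compare them directly against a
    template trajectory using mean Euclidean distance (an assumed
    similarity measure, not necessarily the paper's)."""

    def __init__(self, maxlen: int = 16):
        self.queue = deque(maxlen=maxlen)

    def push(self, frame_points):
        # frame_points: ((fx, fy), (lx, ly), (rx, ry)) for one frame
        self.queue.append(frame_points)

    def distance_to(self, template) -> float:
        # Compare the most recent frames against the template tail.
        n = min(len(self.queue), len(template))
        if n == 0:
            return float("inf")
        total = 0.0
        for obs, ref in zip(list(self.queue)[-n:], template[-n:]):
            for (ox, oy), (rx, ry) in zip(obs, ref):
                total += math.hypot(ox - rx, oy - ry)
        return total / (3 * n)
```

Because the queue is compared against templates on every frame, no explicit start/end spotting step is needed, which matches the abstract's claim that matching the motion of the three feature points directly avoids the constraints of general gesture spotting.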


Recognition System · Gesture Recognition · Weight Information · State Transition Model · Human Gesture




  1. Li, B., Holstein, H.: Articulated point pattern matching in optical motion capture systems. In: Meng, Q. (ed.) 7th International Conference on Control, Automation, Robotics and Vision (ICARCV 2002), December 2-5, 2002, vol. 1 (2002)
  2. Su, Y., Allen, C.R., Geng, D., Burn, D., Brechany, U., Bell, G.D., Rowland, R.: 3-D motion system (data-gloves): application for Parkinson's disease. IEEE Transactions on Instrumentation and Measurement 52(3) (June 2003)
  3. Davis, J.: Recognizing Movement using Motion Histograms. MIT Media Lab Technical Report No. 487 (March 1999)
  4. Cutler, R., Turk, M.: View-based Interpretation of Real-time Optical Flow for Gesture Recognition. In: Third IEEE International Conf. on Automatic Face and Gesture Recognition (1998)
  5. Lee, C.-W., Lee, H.-J., Yoon, S.H., Kim, J.H.: Gesture Recognition in Video Image with Combination of Partial and Global Information. In: Proc. of VCIP 2003 (July 2003)
  6. Yoon, H.-S., Min, B.-W., Soh, J., Bae, Y.-I., Yang, H.S.: Human computer interface for gesture-based editing system. Image Analysis and Processing, 969–974 (1999)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Chil-Woo Lee (1)
  • Jae-Yong Oh (1)
  • Yang-Weon Lee (2)
  1. Department of Computer Engineering, Chonnam National University, Gwangju, Korea
  2. Department of Information and Communication Engineering, Honam University, Gwangju, Korea
