Human-Computer Interaction System with Artificial Neural Network Using Motion Tracker and Data Glove

  • Cemil Oz
  • Ming C. Leu
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3776)

Abstract

A Human-Computer Interaction (HCI) system has been developed with an Artificial Neural Network (ANN) using a motion tracker and a data glove. The HCI system is able to recognize American Sign Language (ASL) letter and number gestures. The finger joint angle data obtained from the strain gauges in the sensory glove define the hand shape, while the data from the motion tracker describe the hand position and orientation. During signing, the data flow from the sensory glove is controlled by a software trigger driven by the motion tracker data; the captured glove data are then processed by a recognition neural network.
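
To make the trigger-then-classify flow concrete, here is a minimal sketch in Python. It is not the authors' implementation: it assumes the software trigger fires once the tracker reports near-zero hand velocity for a few consecutive frames, and it uses a small feed-forward network as a stand-in for the recognition ANN. All thresholds, dimensions, and names below are illustrative assumptions.

# Minimal sketch (not the authors' code) of the trigger-then-classify flow
# described above. Assumptions: the tracker yields a stream of 3D hand
# positions, the glove yields a vector of finger joint angles per frame, and
# the software trigger fires once the hand has been nearly still for a few
# frames; all thresholds and sizes are illustrative.
import numpy as np

VELOCITY_THRESHOLD = 0.01  # assumed units: metres per frame
HOLD_FRAMES = 5            # frames of near-zero motion before sampling glove

def motion_trigger(positions, threshold=VELOCITY_THRESHOLD, hold=HOLD_FRAMES):
    """Yield frame indices where the hand has just been still for `hold` frames."""
    speeds = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    run = 0
    for i, still in enumerate(speeds < threshold):
        run = run + 1 if still else 0
        if run == hold:
            yield i + 1  # index into `positions` / the glove stream

class MLP:
    """Tiny feed-forward network standing in for the recognition ANN."""
    def __init__(self, n_in, n_hidden, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.W1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.W2 = rng.normal(0.0, 0.1, (n_hidden, n_out))
    def predict(self, x):
        return int(np.argmax(np.tanh(x @ self.W1) @ self.W2))

# Usage: 15 joint angles per glove frame, 36 output classes (26 letters +
# 10 digits). A held (motionless) sign triggers one classification.
net = MLP(n_in=15, n_hidden=40, n_out=36)
positions = np.zeros((20, 3))                      # stand-in tracker data
glove = np.random.default_rng(1).random((20, 15))  # stand-in glove data
for t in motion_trigger(positions):
    print("frame", t, "-> gesture class", net.predict(glove[t]))

In a real system the classifier would be trained on labeled glove samples; here the weights are random, so the printed class only demonstrates the data flow from trigger to network.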

Keywords

Artificial Neural Network · Gesture Recognition · Hand Gesture · Motion Tracker · American Sign Language


Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Cemil Oz (1)
  • Ming C. Leu (1)
  1. Department of Mechanical and Aerospace Engineering, University of Missouri-Rolla, Rolla, USA
