Towards an Efficient Implementation of a Video-Based Gesture Interface

  • Jong-Seung Park
  • Jong-Hyun Yoon
  • Chungkyue Kim
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4304)


Human-computer interaction in augmented reality games is generally based on human gestures. For each video frame captured from a live camera, image analysis techniques are used to infer the user's intention. Developing augmented reality user interfaces is difficult because gesture analysis is unstable, and such interfaces cannot be developed efficiently with traditional development techniques. In this paper, we investigate an effective development methodology for gesture-based augmented reality interfaces by means of three different approaches. The implementation requires real-time tracking of bare hands or real rackets so that fast movements and interactions are handled without delay. We also verify the applicability of the prototyping mechanism by implementing and demonstrating an augmented reality game played with either bare hands or real rackets.
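
The abstract does not specify the tracking algorithm itself. As a rough illustration of the kind of per-frame analysis such an interface needs, the sketch below is an assumption on our part (Python with OpenCV and a hypothetical HSV skin-color range, not the authors' actual method): it segments the largest skin-colored blob in each camera frame and reports its centroid, which a game loop could then map to a hand or racket position.

    # Minimal sketch of per-frame hand localization for a gesture interface.
    # Assumption: a simple HSV skin-color threshold; the paper does not
    # describe its tracking method in this abstract.
    import cv2
    import numpy as np

    # Hypothetical skin-color range; would need tuning per camera and lighting.
    SKIN_LOWER = np.array([0, 40, 60], dtype=np.uint8)
    SKIN_UPPER = np.array([25, 180, 255], dtype=np.uint8)

    def track_hand(frame):
        """Return the centroid (x, y) of the largest skin-colored region, or None."""
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, SKIN_LOWER, SKIN_UPPER)
        # Remove small noise so only a hand-sized blob survives.
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
        # OpenCV 4.x return signature: (contours, hierarchy).
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        hand = max(contours, key=cv2.contourArea)
        m = cv2.moments(hand)
        if m["m00"] == 0:
            return None
        return int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])

    if __name__ == "__main__":
        cap = cv2.VideoCapture(0)            # live video camera, as in the paper's setup
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            center = track_hand(frame)
            if center is not None:
                cv2.circle(frame, center, 10, (0, 255, 0), 2)
            cv2.imshow("hand tracking sketch", frame)
            if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
                break
        cap.release()
        cv2.destroyAllWindows()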


Keywords: Feature Point · Augmented Reality · Hand Gesture · Hand Gesture Recognition · Table Tennis



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Jong-Seung Park¹
  • Jong-Hyun Yoon¹
  • Chungkyue Kim¹
  1. Department of Computer Science & Engineering, University of Incheon, Incheon, Republic of Korea
