
Pointing Gesture Interface for Large Display Environments Based on the Kinect Skeleton Model

  • Conference paper
Future Information Technology

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 309)

Abstract

Although many three-dimensional pointing gesture recognition methods have been studied, the problem of self-occlusion has not been considered. When the two body positions that define a pointing vector fall on a single camera perspective line, one position can occlude the other, which degrades detection accuracy. In this paper, we propose a pointing gesture recognition method for large display environments based on the Kinect skeleton model. To account for self-occlusion, the person's detected shoulder position is compensated when it is occluded by the hand. With this exception handling for self-occlusion, experimental results indicate that pointing accuracy at a fixed reference position is greatly improved: the average root mean square error was approximately 13 pixels at a screen resolution of 1920×1080.
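The geometry the abstract describes amounts to casting a ray from the shoulder through the hand (both taken from the Kinect skeleton in metric camera coordinates) and intersecting it with the plane of the large display, with a fallback when the hand occludes the shoulder along the camera's line of sight. The sketch below illustrates only that geometry; the screen calibration parameters, the occlusion test, and all function names are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def is_self_occluded(shoulder, hand, angle_thresh_deg=5.0):
    """Heuristic occlusion test (an assumption for illustration, not the
    paper's criterion): the hand is treated as occluding the shoulder when
    both joints lie almost on the same line of sight from the Kinect,
    which sits at the origin of its camera coordinate frame."""
    a = shoulder / np.linalg.norm(shoulder)
    b = hand / np.linalg.norm(hand)
    angle = np.degrees(np.arccos(np.clip(np.dot(a, b), -1.0, 1.0)))
    return angle < angle_thresh_deg

def pointing_pixel(shoulder, hand, screen_origin, x_axis, y_axis,
                   screen_size_m=(1.6, 0.9), screen_px=(1920, 1080)):
    """Cast a ray from the shoulder through the hand and return the pixel it
    hits on the display, or None if the ray misses the display plane.

    screen_origin is the 3D position of the display's top-left corner, and
    x_axis / y_axis are unit vectors along its width and height, all expressed
    in the same metric camera frame as the skeleton joints (the screen
    calibration is assumed to have been done beforehand)."""
    normal = np.cross(x_axis, y_axis)            # display plane normal
    direction = hand - shoulder                  # pointing vector
    denom = np.dot(normal, direction)
    if abs(denom) < 1e-6:                        # ray parallel to the display
        return None
    t = np.dot(normal, screen_origin - shoulder) / denom
    if t <= 0:                                   # display is behind the user
        return None
    hit = shoulder + t * direction               # 3D intersection point (m)
    u = np.dot(hit - screen_origin, x_axis)      # metres along screen width
    v = np.dot(hit - screen_origin, y_axis)      # metres along screen height
    return (u / screen_size_m[0] * screen_px[0],
            v / screen_size_m[1] * screen_px[1])
```

When the occlusion test fires, a caller might substitute the last reliably tracked shoulder position (or one predicted from previous frames) before computing the intersection; this stands in for the compensation step the abstract describes, whose exact form is given in the full paper.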




Author information

Corresponding author

Correspondence to Hansol Kim.


Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Kim, H., Kim, Y., Ko, D., Kim, J., Lee, E.C. (2014). Pointing Gesture Interface for Large Display Environments Based on the Kinect Skeleton Model. In: Park, J., Pan, Y., Kim, CS., Yang, Y. (eds) Future Information Technology. Lecture Notes in Electrical Engineering, vol 309. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-55038-6_80

  • DOI: https://doi.org/10.1007/978-3-642-55038-6_80

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-55037-9

  • Online ISBN: 978-3-642-55038-6

  • eBook Packages: Engineering, Engineering (R0)
