Tracking the User and Environment

Abstract

Virtual reality allows different methods of interaction and communication between the user and the virtual world. An important aspect of a virtual reality system in this regard is the tracking of the user’s pose and actions. In this chapter, methods and technologies that define the inputs to a virtual reality system are presented. These include pose sensors (mechanical, ultrasonic, optical, electromagnetic, and inertial) as well as force and torque sensors. The chapter concludes with a summary of motion tracking concepts and physical input devices.

Author information

Corresponding author

Correspondence to Matjaž Mihelj.

Copyright information

© 2014 Springer Science+Business Media Dordrecht

About this chapter

Cite this chapter

Mihelj, M., Novak, D., Begus, S. (2014). Tracking the User and Environment. In: Virtual Reality Technology and Applications. Intelligent Systems, Control and Automation: Science and Engineering, vol 68. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-6910-6_4

  • DOI: https://doi.org/10.1007/978-94-007-6910-6_4

  • Published:

  • Publisher Name: Springer, Dordrecht

  • Print ISBN: 978-94-007-6909-0

  • Online ISBN: 978-94-007-6910-6

  • eBook Packages: Engineering, Engineering (R0)
