Abstract
Virtual reality enables various methods of interaction and communication between the user and the virtual world. An important aspect of a virtual reality system in this regard is the tracking of the user's pose and actions. This chapter presents methods and technologies that define inputs to a virtual reality system, including pose sensors (mechanical, ultrasonic, optical, electromagnetic, and inertial) as well as force and torque sensors. The chapter concludes with a summary of motion tracking concepts and physical input devices.
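As a minimal illustration of the inertial tracking mentioned above, the sketch below implements a complementary filter, a common way to fuse a gyroscope (accurate short-term, but drifting) with an accelerometer-derived angle (noisy, but drift-free) to estimate one tilt axis. The function name, sample format, and the blending parameter `alpha` are illustrative assumptions, not taken from the chapter.

```python
def complementary_filter(gyro_rates, accel_angles, dt, alpha=0.98):
    """Estimate a tilt angle over time (illustrative sketch).

    gyro_rates   -- angular rate samples in rad/s (drift over time)
    accel_angles -- accelerometer-derived angle samples in rad (noisy)
    dt           -- sampling period in seconds
    alpha        -- blending factor: trust the gyro short-term,
                    the accelerometer long-term
    """
    angle = accel_angles[0]  # initialize from the drift-free source
    estimates = []
    for rate, acc_angle in zip(gyro_rates, accel_angles):
        # Integrate the gyro rate, then pull the estimate toward the
        # accelerometer angle to suppress accumulated drift.
        angle = alpha * (angle + rate * dt) + (1 - alpha) * acc_angle
        estimates.append(angle)
    return estimates
```

In practice, virtual reality systems often use more sophisticated estimators (e.g., Kalman filtering) and full 3-D orientation representations such as quaternions, but the underlying idea of combining complementary sensor strengths is the same.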
Copyright information
© 2014 Springer Science+Business Media Dordrecht
Cite this chapter
Mihelj, M., Novak, D., Begus, S. (2014). Tracking the User and Environment. In: Virtual Reality Technology and Applications. Intelligent Systems, Control and Automation: Science and Engineering, vol 68. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-6910-6_4
Print ISBN: 978-94-007-6909-0
Online ISBN: 978-94-007-6910-6
eBook Packages: Engineering (R0)