Tracking the User and Environment

  • Matjaž Mihelj
  • Domen Novak
  • Samo Beguš
Part of the Intelligent Systems, Control and Automation: Science and Engineering book series (ISCA, volume 68)


Virtual reality allows different methods of interaction and communication between the user and the virtual world. An important aspect of a virtual reality system in this regard is the tracking of the user's pose and actions. This chapter presents the methods and technologies that define the inputs to a virtual reality system. These include pose sensors (mechanical, ultrasonic, optical, electromagnetic, and inertial) as well as force and torque sensors. The chapter concludes with a summary of motion-tracking concepts and physical input devices.
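Among the pose sensors listed above, inertial sensors are notable in that no single measurement is sufficient on its own: gyroscope integration drifts over time, while accelerometer-derived tilt is drift-free but noisy. As a minimal illustrative sketch (not taken from the chapter; function name and parameters are hypothetical), the two are commonly fused with a complementary filter:

```python
import math

def complementary_filter(accel, gyro_rate, prev_angle, dt, alpha=0.98):
    """Estimate tilt by fusing accelerometer and gyroscope data.

    accel      : (a_horizontal, a_vertical) accelerometer reading [m/s^2]
    gyro_rate  : angular rate about the perpendicular axis [rad/s]
    prev_angle : previous tilt estimate [rad]
    dt         : sampling interval [s]
    alpha      : blending factor; close to 1 trusts the gyro short-term
    """
    # Tilt implied by the gravity direction: noisy, but does not drift.
    accel_angle = math.atan2(accel[0], accel[1])
    # Integrated gyroscope rate: smooth, but accumulates drift.
    gyro_angle = prev_angle + gyro_rate * dt
    # Blend the two: gyro dominates at high frequency,
    # accelerometer corrects slowly at low frequency.
    return alpha * gyro_angle + (1 - alpha) * accel_angle
```

Called once per sample, the filter keeps the gyroscope's responsiveness while the accelerometer term slowly pulls the estimate back toward the true gravity-referenced tilt.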





Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  1. Faculty of Electrical Engineering, University of Ljubljana, Ljubljana, Slovenia
