Indoor Positioning and Navigation Using Time-Of-Flight Cameras

  • Tobias K. Kohoutek
  • David Droeschel
  • Rainer Mautz
  • Sven Behnke


The development of indoor positioning techniques is booming. There is a significant demand for systems that can determine the 3D location of objects in indoor environments for automation, warehousing and logistics. Tracking people in indoor environments has become vital during firefighting operations, in hospitals, and in homes for vulnerable people, particularly the vision-impaired and the elderly [1]. Along with the implementation of innovative methods to increase the capabilities of indoor positioning, the number of application areas is growing significantly. The search for alternative indoor positioning methods is driven by the poor performance of Global Navigation Satellite Systems (GNSS) within buildings. Geodetic methods such as total stations or rotational lasers can reach millimeter-level accuracy, but are not economical for most applications. In recent years, network-based methods that obtain range or time-of-flight measurements between network nodes have become a significant alternative for applications requiring decimeter-level accuracy. The measured distances can be used to determine the 3D position of a device by spatial resection or multilateration. Wireless devices enjoy widespread use in numerous diverse applications, including sensor networks, which can consist of countless embedded devices equipped with sensing capabilities, deployed in all environments and organizing themselves in an ad-hoc fashion [2]. However, knowing the correct positions of the network nodes and their deployment is an essential precondition. There are a large number of alternative positioning technologies (Fig. 1) that cannot be detailed within the scope of this paper. An exhaustive overview of current indoor positioning technology is given in [3]. The focus of this paper is on optical methods.
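To illustrate the multilateration step mentioned above, the sketch below estimates a 3D position from range measurements to anchor nodes with known coordinates, using the standard linearization that subtracts the first range equation. This is a generic illustration, not the paper's implementation; the function name `multilaterate` and the anchor layout are hypothetical.

```python
# Minimal multilateration sketch (illustrative, not from the paper):
# given n >= 4 anchors at known positions and measured ranges to an
# unknown point x, solve for x by linearizing the range equations.
import numpy as np

def multilaterate(anchors, ranges):
    """anchors: (n, 3) known node positions, n >= 4;
    ranges: (n,) measured distances to the unknown point."""
    anchors = np.asarray(anchors, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    a0, r0 = anchors[0], ranges[0]
    # Subtracting ||x - a0||^2 = r0^2 from ||x - a_i||^2 = r_i^2
    # yields the linear system:
    #   2 (a_i - a0) . x = r0^2 - r_i^2 + ||a_i||^2 - ||a0||^2
    A = 2.0 * (anchors[1:] - a0)
    b = (r0**2 - ranges[1:]**2
         + np.sum(anchors[1:]**2, axis=1) - np.sum(a0**2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x

# Synthetic check: exact ranges from a known point are recovered.
anchors = [(0, 0, 0), (10, 0, 0), (0, 10, 0), (0, 0, 10)]
truth = np.array([3.0, 4.0, 2.0])
ranges = [np.linalg.norm(truth - np.array(a)) for a in anchors]
print(multilaterate(anchors, ranges))  # approximately [3. 4. 2.]
```

With noisy ranges and more than four anchors, the same least-squares solve yields the best linear estimate; in practice a nonlinear refinement (e.g. Gauss-Newton on the original range equations) is often applied afterwards.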


Keywords: Point Cloud · Global Navigation Satellite System · Indoor Environment · Scale Invariant Feature Transform



The support of Andreas Donaubauer, Andreas Schmidt, Dirk Holz and Stefan May is gratefully acknowledged.


  1. T.K. Kohoutek, R. Mautz, A. Donaubauer, Real-time indoor positioning using range imaging sensors, in Proceedings of SPIE Photonics Europe, Real-Time Image and Video Processing, vol. 7724 (SPIE, 2010), p. 77240K. doi:10.1117/12.853688
  2. R. Mautz, W.Y. Ochieng, Indoor positioning using wireless distances between motes, in Proceedings of TimeNav’07 / IEEE International Frequency Control Symposium (Geneva, Switzerland, 29 May–1 June 2007), pp. 1530–1541
  3. R. Mautz, Indoor positioning technologies. Habilitation thesis submitted to ETH Zurich for the Venia Legendi in Positioning and Engineering Geodesy (ETH, Zurich, 2012)
  4. R. Mautz, S. Tilch, Survey of optical indoor positioning systems, in Proceedings of the 2011 International Conference on Indoor Positioning and Indoor Navigation (IPIN) (Guimarães, Portugal, 21–23 Sept 2011)
  5. B. Julesz, Binocular depth perception of computer-generated patterns. Bell Syst. Tech. J. 39(5), 1125–1161 (1960)
  6. C. Keßler, C. Ascher, G.F. Trommer, Multi-sensor indoor navigation system with vision- and laser-based localisation and mapping capabilities. Eur. J. Navig. 9(3), 4–11 (2011)
  7. G. Gröger, T.H. Kolbe, A. Czerwinski, C. Nagel, OpenGIS® City Geography Markup Language (CityGML) Encoding Standard, Version 1.0.0 (Open Geospatial Consortium, Doc. No. 08-007r1, 2008)
  8. G. Gröger, T.H. Kolbe, A. Czerwinski, Candidate OpenGIS® CityGML Implementation Specification (City Geography Markup Language), Version 0.4.0 (Open Geospatial Consortium, Doc. No. 07-062, 2007)
  9. S. Cox, P. Daisey, R. Lake, C. Portele, A. Whiteside, OpenGIS® Geography Markup Language (GML) Implementation Specification, Version 3.1.1 (Open Geospatial Consortium, Doc. No. 03-105r1, 2004)
  10. C. Nagel, A. Stadler, T.H. Kolbe, Conceptual requirements for the automatic reconstruction of building information models from uninterpreted 3D models, in The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 34, Part XXX (2009)
  11. A. Donaubauer, T.K. Kohoutek, R. Mautz, CityGML als Grundlage für die Indoor Positionierung mittels Range Imaging, in Proceedings of 15. Münchner Fortbildungsseminar Geoinformationssysteme (Munich, Germany, 8–11 March 2010), pp. 168–181
  12. S. May, D. Droeschel, D. Holz, S. Fuchs, E. Malis, A. Nüchter, J. Hertzberg, Three-dimensional mapping with Time-of-Flight cameras. J. Field Robot. 26(11–12), 934–965 (2009). Special Issue on Three-Dimensional Mapping, Part 2
  13. D. Droeschel, D. Holz, S. Behnke, Probabilistic phase unwrapping for Time-of-Flight cameras, in Proceedings of the Joint Conference of the 41st International Symposium on Robotics (ISR 2010) and the 6th German Conference on Robotics (ROBOTIK 2010) (Munich, Germany, 2010), pp. 318–324
  14. D.G. Lowe, Distinctive image features from scale-invariant keypoints. Int. J. Comput. Vision 60, 91–110 (2004)
  15. K.S. Arun, T.S. Huang, S.D. Blostein, Least-squares fitting of two 3-D point sets. IEEE Trans. Pattern Anal. Mach. Intell. 9(5), 698–700 (1987)
  16. D. Droeschel, J. Stückler, S. Behnke, Learning to interpret pointing gestures with a Time-of-Flight camera, in Proceedings of the 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI) (Lausanne, Switzerland, March 2011), pp. 481–488

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Tobias K. Kohoutek (1)
  • David Droeschel (2)
  • Rainer Mautz (1)
  • Sven Behnke (2)

  1. ETH Zurich, Institute for Geodesy and Photogrammetry, Zurich, Switzerland
  2. Rheinische Friedrich-Wilhelms-Universität Bonn, Institute for Informatics VI, Bonn, Germany
