
Visual Navigation for Mobile Robots: A Survey

  • Unmanned Systems Paper
  • Published in: Journal of Intelligent and Robotic Systems

Abstract

Vision-based navigation for mobile robots has been the source of countless research contributions, from the domains of both vision and control. Vision is becoming increasingly common in applications such as localization, automatic map construction, autonomous navigation, path following, inspection, monitoring and risky-situation detection. This survey presents the pieces of work, from the 1990s to the present, that constitute significant progress in visual navigation techniques for land, aerial and underwater autonomous vehicles. The paper deals with two major approaches: map-based navigation and mapless navigation. Map-based navigation is in turn subdivided into metric map-based navigation and topological map-based navigation. Our outline of mapless navigation includes reactive techniques based on the extraction of qualitative characteristics, appearance-based localization, optical flow, feature tracking, ground-plane detection/tracking, etc. The recent concept of visual sonar is also reviewed.
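Among the mapless reactive techniques listed above, optical-flow-based navigation is perhaps the simplest to illustrate. The sketch below is not from the survey itself; it is a minimal, self-contained example (function names and the synthetic data are illustrative) of two building blocks the survey discusses: a single-window Lucas-Kanade flow estimate from the brightness-constancy constraint, and the insect-inspired "balance strategy" that steers away from the image half with faster apparent motion.

```python
import numpy as np

def lucas_kanade_flow(prev, curr):
    """Estimate one global optical-flow vector (u, v) between two
    grayscale frames by least squares on the brightness-constancy
    constraint Ix*u + Iy*v + It = 0 (single-window Lucas-Kanade)."""
    Ix = np.gradient(prev, axis=1)          # spatial gradient in x
    Iy = np.gradient(prev, axis=0)          # spatial gradient in y
    It = curr - prev                        # temporal gradient
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    b = -It.ravel()
    (u, v), *_ = np.linalg.lstsq(A, b, rcond=None)
    return u, v

def balance_steering(left_flow_mag, right_flow_mag):
    """Bee-inspired balance strategy: closer obstacles produce faster
    image motion, so turn away from the side with larger flow."""
    if left_flow_mag > right_flow_mag:
        return "turn right"
    if right_flow_mag > left_flow_mag:
        return "turn left"
    return "go straight"

# Synthetic textured frame, lightly smoothed so gradients are informative,
# and a copy shifted one pixel to the right (simulated lateral motion).
rng = np.random.default_rng(0)
f = rng.random((64, 64))
frame0 = 0.25 * (f + np.roll(f, 1, 0) + np.roll(f, 1, 1)
                 + np.roll(f, (1, 1), (0, 1)))
frame1 = np.roll(frame0, 1, axis=1)

u, v = lucas_kanade_flow(frame0, frame1)
print(f"recovered flow = ({u:.2f}, {v:.2f})")  # close to (1, 0)
print(balance_steering(2.0, 1.0))              # -> turn right
```

A real reactive controller would compute the flow per image half (or per window) on live camera frames and feed the imbalance into a steering command; the single global vector here just keeps the sketch short.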



Author information


Corresponding author

Correspondence to Francisco Bonin-Font.

Additional information

This work is partially supported by DPI 2005-09001-C03-02 and FEDER funding.


About this article

Cite this article

Bonin-Font, F., Ortiz, A. & Oliver, G. Visual Navigation for Mobile Robots: A Survey. J Intell Robot Syst 53, 263–296 (2008). https://doi.org/10.1007/s10846-008-9235-4


Keywords

Navigation