Autonomous Robots, Volume 19, Issue 1, pp 7–25

Robot Homing by Exploiting Panoramic Vision

  • Antonis A. Argyros
  • Kostas E. Bekris
  • Stelios C. Orphanoudakis
  • Lydia E. Kavraki

Abstract

We propose a novel, vision-based method for robot homing, the problem of computing a route so that a robot can return to its initial “home” position after the execution of an arbitrary “prior” path. The method assumes that the robot tracks visual features in panoramic views of the environment that it acquires as it moves. By exploiting only angular information regarding the tracked features, a local control strategy moves the robot between two positions, provided that there are at least three features that can be matched in the panoramas acquired at these positions. The strategy is successful when certain geometric constraints on the configuration of the two positions relative to the features are fulfilled. In order to achieve long-range homing, the features’ trajectories are organized in a visual memory during the execution of the “prior” path. When homing is initiated, the robot selects Milestone Positions (MPs) on the “prior” path by exploiting information in its visual memory. The MP selection process aims at picking positions that guarantee the success of the local control strategy between two consecutive MPs. The sequential visit of successive MPs successfully guides the robot even if the visual context in the “home” position is radically different from the visual context at the position where homing was initiated. Experimental results from a prototype implementation of the method demonstrate that homing can be achieved with high accuracy, independent of the distance traveled by the robot. The contribution of this work is that it shows how a complex navigational task such as homing can be accomplished efficiently, robustly and in real-time by exploiting primitive visual cues. Such cues carry implicit information regarding the 3D structure of the environment. Thus, the computation of explicit range information and the existence of a geometric map are not required.
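The local control strategy described above relies only on the bearings of at least three features matched between two panoramas. As a rough illustration of how bearing-only information can yield a motion direction, the sketch below implements a generic average-landmark-vector style heuristic in Python. It assumes both sets of bearings are expressed in a common, compass-aligned reference frame, and it is not the control law developed in the paper; the function name and toy bearings are invented for illustration only.

```python
import numpy as np

def bearing_only_home_vector(current_bearings, target_bearings):
    """Generic bearing-only homing heuristic (average-landmark-vector style).

    current_bearings, target_bearings: bearings (radians) of the same matched
    features, seen from the current position and from the goal position,
    both expressed in a common compass-aligned frame.
    Returns a unit vector that roughly points from the current position
    toward the goal. This is an illustrative heuristic, not the paper's
    control strategy.
    """
    # Unit vectors toward each feature in both views.
    cur = np.column_stack([np.cos(current_bearings), np.sin(current_bearings)])
    tgt = np.column_stack([np.cos(target_bearings), np.sin(target_bearings)])
    # Difference of the average landmark vectors approximates the home direction.
    diff = cur.mean(axis=0) - tgt.mean(axis=0)
    norm = np.linalg.norm(diff)
    return diff / norm if norm > 1e-9 else np.zeros(2)

# Toy usage with three matched features (hypothetical bearings).
cur = np.radians([10.0, 130.0, 250.0])
tgt = np.radians([40.0, 110.0, 240.0])
print(bearing_only_home_vector(cur, tgt))
```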

Keywords

robot homing, omni-directional vision, panoramic cameras, vision-based robot navigation

Copyright information

© Springer Science + Business Media, Inc. 2005

Authors and Affiliations

  • Antonis A. Argyros (1)
  • Kostas E. Bekris (2)
  • Stelios C. Orphanoudakis (1)
  • Lydia E. Kavraki (2)

  1. Institute of Computer Science (ICS), Foundation for Research and Technology - Hellas (FORTH), Heraklion, Crete, Greece
  2. Department of Computer Science, Rice University, Houston
