Innovations in Intelligent Machines – 1, pp. 223–270

Part of the Studies in Computational Intelligence book series (SCI, volume 70)

Toward Robot Perception through Omnidirectional Vision

  • José Gaspar
  • Niall Winters
  • Etienne Grossmann
  • José Santos-Victor

Vision is an extraordinarily powerful sense. The ability to perceive the environment allows movement to be regulated by the world. Humans do this effortlessly, but we still lack an understanding of how perception works. Our approach to gaining insight into this complex problem is to build artificial visual systems for semi-autonomous robot navigation, supported by human-robot interfaces for destination specification. We examine how robots can use images, which convey only 2D information, in a robust manner to drive their actions in 3D space. Our work provides robots with the perceptual capabilities to undertake everyday navigation tasks, such as "go to the fourth office in the second corridor". We present a complete navigation system with a focus on building – in line with Marr's theory [57] – mediated perception modalities. We address the fundamental design issues associated with this goal, namely sensor design, environmental representations, navigation control and user interaction.


References

  1. S. Baker and S. K. Nayar, A theory of catadioptric image formation, Proc. Int. Conf. Computer Vision (ICCV'97), January 1998, pp. 35-42.
  2. ——, A theory of single-viewpoint catadioptric image formation, International Journal of Computer Vision 35 (1999), no. 2, 175-196.
  3. R. Benosman and S. B. Kang (eds.), Panoramic Vision, Springer Verlag, 2001.
  4. M. Betke and L. Gurvits, Mobile robot localization using landmarks, IEEE Trans. on Robotics and Automation 13 (1997), no. 2, 251-263.
  5. J. Borenstein, H. R. Everett, and Liqiang Feng, Navigating Mobile Robots: Sensors and Techniques, A. K. Peters, Ltd., Wellesley, MA, 1996 (also: Where am I? Systems and Methods for Mobile Robot Positioning, ftp://ftp.eecs.umich.edu/people/johannb/pos96rep.pdf).
  6. G. Borgefors, Hierarchical chamfer matching: A parametric edge matching algorithm, IEEE Transactions on Pattern Analysis and Machine Intelligence 10 (1988), no. 6, 849-865.
  7. R. Brooks, Visual map making for a mobile robot, Proc. IEEE Conf. on Robotics and Automation, 1985.
  8. R. A. Brooks, A robust layered control system for a mobile robot, IEEE Transactions on Robotics and Automation 2 (1986), 14-23.
  9. A. Bruckstein and T. Richardson, Omniview cameras with curved surface mirrors, Proceedings of the IEEE Workshop on Omnidirectional Vision at CVPR 2000, June 2000, first published in 1996 as a Bell Labs Technical Memo, pp. 79-86.
  10. D. Burschka, J. Geiman, and G. Hager, Optimal landmark configuration for vision-based control of mobile robots, Proc. IEEE Int. Conf. on Robotics and Automation, 2003, pp. 3917-3922.
  11. Z. L. Cao, S. J. Oh, and E. L. Hall, Dynamic omni-directional vision for mobile robots, Journal of Robotic Systems 3 (1986), no. 1, 5-17.
  12. J. S. Chahl and M. V. Srinivasan, Reflective surfaces for panoramic imaging, Applied Optics 36 (1997), no. 31, 8275-8285.
  13. P. Chang and M. Hebert, Omni-directional structure from motion, Proceedings of the 1st International IEEE Workshop on Omni-directional Vision (OMNIVIS'00) at CVPR 2000, June 2000.
  14. R. Collins and R. Weiss, Vanishing point calculation as a statistical inference on the unit sphere, Int. Conf. on Computer Vision (ICCV), 1990, pp. 400-403.
  15. T. Conroy and J. Moore, Resolution invariant surfaces for panoramic vision systems, IEEE ICCV'99, 1999, pp. 392-397.
  16. Olivier Cuisenaire, Distance transformations: Fast algorithms and applications to medical image processing, Ph.D. thesis, U. Catholique de Louvain, October 1999.
  17. K. Daniilidis (ed.), 1st International IEEE Workshop on Omnidirectional Vision at CVPR 2000, June 2000.
  18. ——, Page of omnidirectional vision hosted by the GRASP laboratory, http://www.cis.upenn.edu/~kostas/omni.html, 2005.
  19. P. David, D. DeMenthon, and R. Duraiswami, Simultaneous pose and correspondence determination using line features, Proc. IEEE Conf. Comp. Vision Patt. Recog., 2003.
  20. A. Davison, Real-time simultaneous localisation and mapping with a single camera, IEEE Int. Conf. on Computer Vision, 2003, vol. 2, pp. 1403-1410.
  21. C. Canudas de Wit, H. Khennouf, C. Samson, and O. J. Sordalen, Chap. 5: Nonlinear control design for mobile robots, Nonlinear Control for Mobile Robots (Yuan F. Zheng, ed.), World Scientific Series in Robotics and Intelligent Systems, 1993.
  22. P. E. Debevec, C. J. Taylor, and J. Malik, Modeling and rendering architecture from photographs: a hybrid geometry and image-based approach, SIGGRAPH, 1996.
  23. S. Derrien and K. Konolige, Approximating a single viewpoint in panoramic imaging devices, Proceedings of the 1st International IEEE Workshop on Omni-directional Vision at CVPR 2000, June 2000, pp. 85-90.
  24. G. DeSouza and A. Kak, Vision for mobile robot navigation: A survey, IEEE Transactions on Pattern Analysis and Machine Intelligence 24 (2002), no. 2, 237-267.
  25. O. Faugeras, Three-Dimensional Computer Vision - A Geometric Viewpoint, MIT Press, 1993.
  26. Mark Fiala, Panoramic computer vision, Ph.D. thesis, University of Alberta, 2002.
  27. S. Fleck, F. Busch, P. Biber, H. Andreasson, and W. Straßer, Omnidirectional 3D modeling on a mobile robot using graph cuts, Proc. IEEE Int. Conf. on Robotics and Automation, 2005, pp. 1760-1766.
  28. J. Foote and D. Kimber, FlyCam: Practical panoramic video and automatic camera control, Proc. of the IEEE Int. Conference on Multimedia and Expo, vol. III, August 2000, pp. 1419-1422.
  29. S. Gaechter and T. Pajdla, Mirror design for an omnidirectional camera with a uniform cylindrical projection when using SVAVISCA sensor, Tech. report, Czech Tech. Univ. - Faculty of Electrical Eng., ftp://cmp.felk.cvut.cz/pub/cmp/articles/pajdla/Gaechter-TR-2001-03.pdf, March 2001.
  30. S. Gaechter, T. Pajdla, and B. Micusik, Mirror design for an omnidirectional camera with a space variant imager, IEEE Workshop on Omnidirectional Vision Applied to Robotic Orientation and Nondestructive Testing, August 2001, pp. 99-105.
  31. J. Gaspar, Omnidirectional vision for mobile robot navigation, Ph.D. thesis, Instituto Superior Técnico, Dept. Electrical Engineering, Lisbon - Portugal, 2003.
  32. J. Gaspar, C. Deccó, J. Okamoto Jr, and J. Santos-Victor, Constant resolution omnidirectional cameras, 3rd International IEEE Workshop on Omni-directional Vision at ECCV, 2002, pp. 27-34.
  33. J. Gaspar, E. Grossmann, and J. Santos-Victor, Interactive reconstruction from an omnidirectional image, 9th International Symposium on Intelligent Robotic Systems (SIRS'01), July 2001.
  34. J. Gaspar and J. Santos-Victor, Visual path following with a catadioptric panoramic camera, Int. Symp. Intelligent Robotic Systems, July 1999, pp. 139-147.
  35. J. Gaspar, N. Winters, and J. Santos-Victor, Vision-based navigation and environmental representations with an omni-directional camera, IEEE Transactions on Robotics and Automation 16 (2000), no. 6, 890-898.
  36. D. Gavrila and V. Philomin, Real-time object detection for smart vehicles, IEEE Int. Conf. on Computer Vision (ICCV), 1999, pp. 87-93.
  37. C. Geyer and K. Daniilidis, A unifying theory for central panoramic systems and practical applications, ECCV 2000, June 2000, pp. 445-461.
  38. ——, Catadioptric projective geometry, International Journal of Computer Vision 43 (2001), 223-243.
  39. Gene H. Golub and Charles F. Van Loan, Matrix Computations, third ed., Johns Hopkins Studies in the Mathematical Sciences, The Johns Hopkins University Press, 1996. MR 1 417 720.
  40. P. Greguss, Panoramic imaging block for 3D space, US Patent 4,566,763, January 1986, Hungarian patent granted in 1983.
  41. P. Greguss (ed.), IEEE ICAR 2001 Workshop on Omnidirectional Vision Applied to Robotic Orientation and Non-destructive Testing, August 2001.
  42. E. Grossmann, D. Ortin, and J. Santos-Victor, Algebraic aspects of reconstruction of structured scenes from one or more views, British Machine Vision Conference (BMVC2001), September 2001, pp. 633-642.
  43. Etienne Grossmann, Maximum likelihood 3D reconstruction from one or more uncalibrated views under geometric constraints, Ph.D. thesis, Instituto Superior Técnico, Dept. Electrical Engineering, Lisbon - Portugal, 2002.
  44. E. Hecht and A. Zajac, Optics, Addison Wesley, 1974.
  45. R. Hicks, The page of catadioptric sensor design, http://www.math.drexel.edu/~ahicks/design/, 2004.
  46. R. Hicks and R. Bajcsy, Catadioptric sensors that approximate wide-angle perspective projections, Proceedings of the Computer Vision and Pattern Recognition Conference (CVPR'00), June 2000, pp. 545-551.
  47. A. Howard, M. J. Mataric, and G. Sukhatme, Putting the 'i' in 'team': an ego-centric approach to cooperative localization, IEEE Int. Conf. on Robotics and Automation, 2003.
  48. D. Huttenlocher, G. Klanderman, and W. Rucklidge, Comparing images using the Hausdorff distance, IEEE Transactions on Pattern Analysis and Machine Intelligence 15 (1993), no. 9, 850-863.
  49. D. Huttenlocher, R. Lilien, and C. Olsen, View-based recognition using an eigenspace approximation to the Hausdorff measure, IEEE Transactions on Pattern Analysis and Machine Intelligence 21 (1999), no. 9, 951-956.
  50. S. B. Kang and R. Szeliski, 3D scene data recovery using omnidirectional multibaseline stereo, CVPR, 1996, pp. 364-370.
  51. N. Karlsson, E. Di Bernardo, J. Ostrowski, L. Goncalves, P. Pirjanian, and M. Munich, The vSLAM algorithm for robust localization and mapping, Proc. IEEE Int. Conf. on Robotics and Automation, 2005, pp. 24-29.
  52. A. Kosaka and A. Kak, Fast vision-guided mobile robot navigation using model-based reasoning and prediction of uncertainties, CVGIP: Image Understanding 56 (1992), no. 3, 271-329.
  53. J. J. Leonard and H. F. Durrant-Whyte, Mobile robot localization by tracking geometric beacons, IEEE Trans. on Robotics and Automation 7 (1991), no. 3, 376-382.
  54. R. Lerner, E. Rivlin, and I. Shimshoni, Landmark selection for task-oriented navigation, Proc. Int. Conf. on Intelligent Robots and Systems, 2006, pp. 2785-2791.
  55. LIRA-Lab, Document on specification, Tech. report, Esprit Project n. 31951 - SVAVISCA - available at http://www.lira.dist.unige.it - SVAVISCA-GIOTTO Home Page, May 1999.
  56. A. Majumder, W. Seales, G. Meenakshisundaram, and H. Fuchs, Immersive teleconferencing: A new algorithm to generate seamless panoramic video imagery, Proceedings of the 7th ACM Conference on Multimedia, 1999.
  57. D. Marr, Vision, W. H. Freeman, 1982.
  58. B. McBride, Panoramic cameras time line, http://panphoto.com/TimeLine.html.
  59. B. Micusik and T. Pajdla, Structure from motion with wide circular field of view cameras, IEEE Transactions on Pattern Analysis and Machine Intelligence 28 (2006), no. 7, 1135-1149.
  60. K. Miyamoto, Fish-eye lens, Journal of the Optical Society of America 54 (1964), no. 8, 1060-1061.
  61. L. Montesano, J. Gaspar, J. Santos-Victor, and L. Montano, Cooperative localization by fusing vision-based bearing measurements and motion, Int. Conf. on Intelligent Robotics and Systems, 2005, pp. 2333-2338.
  62. H. Murase and S. K. Nayar, Visual learning and recognition of 3D objects from appearance, International Journal of Computer Vision 14 (1995), no. 1, 5-24.
  63. V. Nalwa, A true omni-directional viewer, Technical report, Bell Laboratories, February 1996.
  64. S. K. Nayar, Catadioptric image formation, Proc. of the DARPA Image Understanding Workshop, May 1997, pp. 1431-1437.
  65. ——, Catadioptric omnidirectional camera, Proc. IEEE Conf. Computer Vision and Pattern Recognition, June 1997, pp. 482-488.
  66. S. K. Nayar and V. Peri, Folded catadioptric cameras, Proceedings of the IEEE Computer Vision and Pattern Recognition Conference, June 1999.
  67. E. Oja, Subspace methods for pattern recognition, Research Studies Press, 1983.
  68. M. Ollis, H. Herman, and S. Singh, Analysis and design of panoramic stereo using equi-angular pixel cameras, Tech. report CMU-RI-TR-99-04, Carnegie Mellon University Robotics Institute, 1999.
  69. T. Pajdla and V. Hlavac, Zero phase representation of panoramic images for image based localization, 8th Inter. Conf. on Computer Analysis of Images and Patterns (CAIP'99), 1999.
  70. V. Peri and S. K. Nayar, Generation of perspective and panoramic video from omnidirectional video, Proc. DARPA Image Understanding Workshop, 1997, pp. 243-246.
  71. R. Pless, Using many cameras as one, Proc. CVPR, 2003, vol. II, pp. 587-593.
  72. D. Rees, Panoramic television viewing system, US Patent 3,505,465, April 1970.
  73. W. Rucklidge, Efficient Visual Recognition Using the Hausdorff Distance, Lecture Notes in Computer Science, vol. 1173, Springer-Verlag, 1996.
  74. J. Shi and C. Tomasi, Good features to track, Proc. of the IEEE Int. Conference on Computer Vision and Pattern Recognition, June 1994, pp. 593-600.
  75. S. Sinha and M. Pollefeys, Towards calibrating a pan-tilt-zoom camera network, OMNIVIS'04, Workshop on Omnidirectional Vision and Camera Networks (held with ECCV 2004), 2004.
  76. S. N. Sinha and M. Pollefeys, Synchronization and calibration of camera networks from silhouettes, International Conference on Pattern Recognition (ICPR'04), vol. 1, August 2004, pp. 116-119.
  77. T. Sogo, H. Ishiguro, and M. Trivedi, Real-time target localization and tracking by N-ocular stereo, Proceedings of the 1st International IEEE Workshop on Omni-directional Vision (OMNIVIS'00) at CVPR 2000, June 2000.
  78. M. Spetsakis and J. Aloimonos, Structure from motion using line correspondences, International Journal of Computer Vision 4 (1990), no. 3, 171-183.
  79. P. Sturm, A method for 3D reconstruction of piecewise planar objects from single panoramic images, 1st International IEEE Workshop on Omnidirectional Vision at CVPR, 2000, pp. 119-126.
  80. P. Sturm and S. Ramalingam, A generic concept for camera calibration, Proceedings of the European Conference on Computer Vision, Prague, Czech Republic, vol. 2, Springer, May 2004, pp. 1-13.
  81. W. Stürzl, H. Dahmen, and H. Mallot, The quality of catadioptric imaging - application to omnidirectional stereo, European Conference on Computer Vision, 2004, LNCS 3021, pp. 614-627.
  82. T. Svoboda, T. Pajdla, and V. Hlaváč, Epipolar geometry for panoramic cameras, Proc. European Conf. Computer Vision, July 1998, pp. 218-231.
  83. R. Talluri and J. K. Aggarwal, Mobile robot self-location using model-image feature correspondence, IEEE Transactions on Robotics and Automation 12 (1996), no. 1, 63-77.
  84. G. Thomas, Real-time panospheric image dewarping and presentation for remote mobile robot control, Journal of Advanced Robotics 17 (2003), no. 4, 359-368.
  85. S. Thrun and A. Bucken, Integrating grid-based and topological maps for mobile robot navigation, Proceedings of the 13th National Conference on Artificial Intelligence (AAAI'96), 1996.
  86. S. Watanabe, Karhunen-Loève expansion and factor analysis, Transactions of the 4th Prague Conference on Information Theory, Statistical Decision Functions and Random Processes, 1965, pp. 635-660.
  87. R. Wehner and S. Wehner, Insect navigation: use of maps or Ariadne's thread?, Ethology, Ecology, Evolution 2 (1990), 27-48.
  88. N. Winters, A holistic approach to mobile robot navigation using omnidirectional vision, Ph.D. thesis, University of Dublin, Trinity College, 2002.
  89. N. Winters, J. Gaspar, G. Lacey, and J. Santos-Victor, Omni-directional vision for robot navigation, 1st International IEEE Workshop on Omni-directional Vision at CVPR, 2000, pp. 21-28.
  90. N. Winters and J. Santos-Victor, Omni-directional visual navigation, 7th International Symposium on Intelligent Robotics Systems (SIRS'99), July 1999, pp. 109-118.
  91. N. Winters and G. Lacey, Overview of tele-operation for a mobile robot, TMR Workshop on Computer Vision and Mobile Robots (CVMR'98), September 1999.
  92. N. Winters and J. Santos-Victor, Omni-directional visual navigation, Proc. Int. Symp. on Intelligent Robotic Systems, July 1999, pp. 109-118.
  93. P. Wunsch and G. Hirzinger, Real-time visual tracking of 3-D objects with dynamic handling of occlusion, IEEE Int. Conf. on Robotics and Automation, April 1997, pp. 2868-2873.
  94. Y. Yagi, Omnidirectional sensing and its applications, IEICE Transactions on Information and Systems (1999), no. E82-D-3, 568-579.
  95. Y. Yagi, Y. Nishizawa, and M. Yachida, Map-based navigation for mobile robot with omnidirectional image sensor COPIS, IEEE Trans. Robotics and Automation 11 (1995), no. 5, 634-648.
  96. K. Yamazawa, Y. Yagi, and M. Yachida, Obstacle detection with omnidirectional image sensor HyperOmni Vision, IEEE ICRA, 1995, pp. 1062-1067.
  97. J. Zheng and S. Tsuji, Panoramic representation for route recognition by a mobile robot, International Journal of Computer Vision 9 (1992), no. 1, 55-76.

Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • José Gaspar (1)
  • Niall Winters (2)
  • Etienne Grossmann (1)
  • José Santos-Victor (1)

  1. Instituto de Sistemas e Robótica, Instituto Superior Técnico, Lisboa, Portugal
  2. London Knowledge Lab, London, UK
