Abstract
This article presents experiments with a real-time navigation system driven by two cameras pointing laterally with respect to the direction of travel (Divergent Stereo). Similarly to what has been proposed in (Franceschini et al. 1991; Coombs and Roberts 1992), our approach (Sandini et al. 1992; Santos-Victor et al. 1993) assumes that, for navigation purposes, the driving information is not distance (as obtainable from a stereo setup) but motion; more precisely, qualitative optical-flow information computed over nonoverlapping areas of the visual fields of two cameras.
Following this idea, a mobile vehicle has been equipped with a pair of cameras looking laterally (much like honeybees), and a controller based on fast, real-time computation of optical flow has been implemented. The control of the mobile robot (Robee) is based on comparing the apparent image velocities measured by the left and right cameras. The solution adopted is derived from recent studies (Srinivasan et al. 1991) describing the behavior of freely flying honeybees and the mechanisms they use to perceive range.
This qualitative information (no explicit measure of depth is performed) is used in a number of experiments that show the robustness of the approach, and a detailed description of the control structure is presented to demonstrate its feasibility for driving the mobile robot through a cluttered environment.
A discussion of the potential of the approach and its implications in terms of sensor structure is also presented.
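The centering behavior described above can be sketched as balancing the apparent image speeds seen by the two lateral cameras: the nearer wall produces faster optical flow, so the robot turns away from the side with the larger flow. The following is a minimal illustration of that control law, not the authors' implementation; the function name, the averaging of flow magnitudes, and the normalization by the total flow (to decouple the command from forward speed) are all assumptions made for the sketch.

```python
import numpy as np

def centering_command(flow_left, flow_right, gain=1.0):
    """Hypothetical steering law for divergent-stereo centering.

    flow_left, flow_right: arrays of apparent image speeds
    (e.g. pixels/frame) sampled in the lateral fields of view
    of the left and right cameras.

    Returns a signed turn command: positive steers toward the
    right camera's side, i.e. away from a nearer left wall.
    """
    v_left = float(np.mean(np.abs(flow_left)))
    v_right = float(np.mean(np.abs(flow_right)))
    # Normalizing by the total flow makes the command qualitative:
    # it depends on the flow imbalance, not on forward speed.
    total = max(v_left + v_right, 1e-9)
    return gain * (v_left - v_right) / total
```

For equal flows the command is zero (the robot is centered); if the left flow is twice the right, the command is positive one third of the gain, steering the robot rightward.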
References
Aloimonos, J. 1990. Purposive and qualitative active vision, Proc. International Workshop on Active Control in Visual Perception, Antibes, France.
Astrom, K. and Wittenmark, B. 1986. Computer Controlled Systems: Theory and Design. Prentice-Hall, Englewood Cliffs, NJ.
Bajcsy, R.K. 1985. Active perception vs. passive perception, Proc. 3rd IEEE Workshop on Computer Vision: Representation and Control, Bellaire, MI, pp. 13–16.
Ballard, D.H., Nelson, R.C., and Yamauchi, B. 1989. Animate vision, Optics News 15(5):17–25.
Coombs, D. and Roberts, K. 1992. Centering behavior using peripheral vision. In D.P. Casasent (Ed.), Intelligent Robots and Computer Vision XI: Algorithms, Techniques and Active Vision, SPIE, Vol. 1825, Nov. 1992, pp. 714–721.
DeMicheli, E., Sandini, G., Tistarelli, M., and Torre, V. 1988. Estimation of visual motion and 3D motion parameters from singular points, Proc. IEEE Intern. Workshop on Intelligent Robots and Systems, Tokyo, Japan.
Enkelmann, W. 1990. Obstacle detection by evaluation of optical flow fields from image sequences, Proc. 1st Europ. Conf. Computer Vis., Antibes, France, Springer-Verlag, pp. 134–138.
Fermüller, C. 1993. Navigation preliminaries. In Yiannis Aloimonos (Ed.), Active Perception, Lawrence Erlbaum Assoc.: Hillsdale, NJ, pp. 103–150.
Ferrari, M., Fossa, M., Grosso, E., Magrassi, M., and Sandini, G. 1991. A practical implementation of a multilevel architecture for vision-based navigation, Proc. 5th Intern. Conf. Advan. Robot., Pisa, Italy, June, pp. 1092–1098.
Fossa, M., Grosso, E., Ferrari, F., Sandini, G., and Zapendouski, M. 1992. A visually guided mobile robot acting in indoor environments, Proc. IEEE Workshop on Applications of Computer Vision, Palm Springs, USA.
Franceschini, N., Pichon, J., and Blanes, C. 1991. Real time visuo-motor control: from flies to robots, Proc. 5th Intern. Conf. Advan. Robot., Pisa, Italy, June.
Horn, B.K.P. and Schunck, B.G. 1981. Determining optical flow, Artificial Intelligence 17(1–3):185–203.
Horridge, G.A. 1987. The evolution of visual processing and the construction of seeing systems, Proc. Roy. Soc. London, pp. 279–292.
Lehrer, M., Srinivasan, M.V., Zhang, S.W., and Horridge, G.A. 1988. Motion cues provide the bee's visual world with a third dimension, Nature 332(6162):356–357.
Nagel, H.-H. 1987. On the estimation of optical flow: Relations between different approaches and some new results, Artificial Intelligence 33:299–323.
Sandini, G., Gandolfo, F., Grosso, E., and Tistarelli, M. 1993. Vision during action. In Yiannis Aloimonos (Ed.), Active Perception, Lawrence Erlbaum Assoc.: Hillsdale, NJ, pp. 151–190.
Sandini, G., Santos-Victor, J., Curotto, F., and Garibaldi, G. 1992. Robotic bees, Tech. Rept., LIRA-Lab, University of Genova, October.
Sandini, G. and Tistarelli, M. 1990. Robust obstacle detection using optical flow, Proc. IEEE Intern. Workshop on Robust Computer Vision, Seattle, Oct. 1–3.
Santos-Victor, J., Sandini, G., Curotto, F., and Garibaldi, S. 1993. Divergent stereo for robot navigation: Learning from bees, Proc. Comput. Vis. Patt. Recog., New York.
Srinivasan, M.V. 1992. Distance perception in insects, Cur. Dir. Psycholog. Sci. 1:22–26.
Srinivasan, M.V., Lehrer, M., Kirchner, W.H., and Zhang, S.W. 1991. Range perception through apparent image speed in freely flying honeybees, Visual Neuroscience 6:519–535.
Uras, S., Girosi, F., Verri, A., and Torre, V. 1988. A computational approach to motion perception, Biological Cybernetics 60:69–87.
Santos-Victor, J., Sandini, G., Curotto, F. et al. Divergent stereo in autonomous navigation: From bees to robots. Int J Comput Vision 14, 159–177 (1995). https://doi.org/10.1007/BF01418981