MICAI 2005: Advances in Artificial Intelligence, pp. 1001–1011
Visual Planning for Autonomous Mobile Robot Navigation
Abstract
For an autonomous mobile robot following a planned path, self-localization is an essential task: the cumulative errors produced by its noisy sensors make absolute localization necessary. Absolute robot localization is commonly performed by measuring the relative distance from the robot to previously learnt landmarks in the environment. Landmarks can be interest points, colored objects, or rectangular regions such as posters or emergency signs, which are useful and non-intrusive beacons in human environments. This paper presents an active localization method: given a collision-free path and a set of planar landmarks, a visual planning function selects a subset of visible landmarks and the best combination of camera parameters (pan, tilt and zoom) for positions sampled along the path. A visibility measurement and several utility measurements are defined in order to select, for each position, the camera modality and the subset of landmarks that maximize these local criteria. Finally, a dynamic programming method is proposed to minimize saccadic camera movements over the whole trajectory.
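The dynamic programming step described above can be sketched as follows. This is a minimal illustrative sketch, not the authors' actual formulation: the function names, the additive score (per-position utility minus a pairwise saccadic-movement cost), and the data layout are all assumptions made for the example.

```python
# Hedged sketch of the abstract's final step: choose one camera modality
# (pan, tilt, zoom) per sampled path position so that the summed local
# utility minus the saccadic-movement cost between consecutive modalities
# is maximized. All names and cost functions here are illustrative.

def plan_camera_modalities(n_positions, candidates, utility, move_cost):
    """
    n_positions : number of positions sampled along the path
    candidates  : candidates[i] -> modalities acceptable at position i
                  (i.e. those passing the visibility/utility criteria)
    utility     : utility(i, m) -> local utility of modality m at position i
    move_cost   : move_cost(p, m) -> cost of the saccadic movement from
                  modality p to modality m between consecutive positions
    Returns the modality sequence maximizing total utility minus movement cost.
    """
    # best[i][m] = (best score ending with modality m at step i, predecessor)
    best = [dict() for _ in range(n_positions)]
    for m in candidates[0]:
        best[0][m] = (utility(0, m), None)
    for i in range(1, n_positions):
        for m in candidates[i]:
            score, prev = max(
                (best[i - 1][p][0] - move_cost(p, m), p)
                for p in candidates[i - 1]
            )
            best[i][m] = (score + utility(i, m), prev)
    # Backtrack from the best final modality.
    m = max(best[-1], key=lambda k: best[-1][k][0])
    seq = [m]
    for i in range(n_positions - 1, 0, -1):
        m = best[i][m][1]
        seq.append(m)
    return seq[::-1]
```

With a cost on modality changes, the optimizer naturally prefers keeping the same camera configuration over several positions, which is exactly the reduction in saccadic movements the paper aims for.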
Keywords
Plan Path, Interest Point, Camera Parameter, Saccadic Movement, Camera Modality