Combination of active sensing and sensor fusion for collision avoidance in mobile robots

  • Terence Chek Hion Heng
  • Yoshinori Kuno
  • Yoshiaki Shirai
Poster Session D: Biomedical Applications, Detection, Control & Surveillance, Inspection, Optical Character Recognition
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1311)

Abstract

Mobile robots are currently navigated by a variety of methods, typically using sonar-based or vision-based sensing systems. Each of these systems has its own strengths and weaknesses. To exploit the strengths of both, this paper proposes a fusion of navigation methods that uses the sonar and visual systems together as primary sources, producing a fast, efficient and reliable obstacle-avoidance and navigation system. Active sensing modules are also included to give the robot a better perception of its surroundings and to improve its navigation capabilities. The result is an active sensor fusion system for the collision-avoiding behaviour of mobile robots. This behaviour can then be incorporated into other purposive behaviours (e.g. object seeking, path finding). The validity of the system is demonstrated in experiments on a real robot.
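The abstract's core idea, combining sonar range data with visual sensing so that each compensates for the other's weaknesses, can be illustrated with a minimal sketch. This is not the authors' implementation: the sector layout, the veto threshold, and the equal weighting of the two modalities are all illustrative assumptions.

```python
def fuse_for_avoidance(sonar_ranges, vision_free, safe_dist=0.5):
    """Pick the safest heading sector from fused sonar and vision data.

    sonar_ranges: distance (m) to the nearest obstacle in each angular sector.
    vision_free:  vision-based free-space confidence in [0, 1] per sector.
    Returns the index of the sector with the highest fused safety score.
    """
    scores = []
    for dist, free in zip(sonar_ranges, vision_free):
        if dist < safe_dist:
            # Sonar vetoes sectors with an obstacle inside the safety radius.
            scores.append(0.0)
        else:
            # Otherwise blend normalised sonar range with visual confidence.
            scores.append(0.5 * min(dist / 3.0, 1.0) + 0.5 * free)
    return max(range(len(scores)), key=scores.__getitem__)
```

In this sketch the sonar acts as a hard safety constraint while vision ranks the remaining candidates, mirroring the paper's motivation that sonar gives reliable proximity data and vision gives richer scene information.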

Keywords

Mobile Robot, Collision Avoidance, Intelligent Robot, Sonar Data, Mobile Robot Navigation


Copyright information

© Springer-Verlag Berlin Heidelberg 1997

Authors and Affiliations

  • Terence Chek Hion Heng (1)
  • Yoshinori Kuno (1)
  • Yoshiaki Shirai (1)
  1. Osaka University, Suita, Osaka, Japan
