Image-Based Absolute Positioning System for Mobile Robot Navigation

  • JaeMu Yun
  • EunTae Lyu
  • JangMyung Lee
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4109)

Abstract

Position estimation is one of the most important functions for a mobile robot navigating in an unstructured environment. Most previous localization schemes estimate the current position and pose of a mobile robot either by applying localization algorithms to information obtained from sensors mounted on the robot, or by recognizing artificial landmarks attached to walls or objects in the environment, or natural landmarks. Several drawbacks of these approaches have been pointed out. To overcome them, a new localization method is proposed that estimates the absolute position of the mobile robot using a camera fixed on the ceiling of a corridor. The proposed method also improves the success rate of position estimation, since it calculates the real size of an object. This is not a relative localization scheme, which reduces position error by filtering noisy sensor data, but a form of absolute localization. The effectiveness of the proposed localization scheme is demonstrated through experiments.
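The core geometric idea of the abstract, locating the robot with a fixed, downward-facing ceiling camera and recovering the real size of an object from its apparent size, can be sketched with a simple pinhole-camera model. This is an illustrative assumption, not the paper's actual implementation; the function names, parameters, and the no-distortion, straight-down camera geometry are all hypothetical.

```python
def pixel_to_floor(u, v, fx, fy, cx, cy, cam_height):
    """Project an image pixel onto the floor plane for a downward-facing
    ceiling camera (assumed pinhole model, no lens distortion).

    (u, v): pixel coordinates of the robot's centroid in the image
    (fx, fy): focal lengths in pixels; (cx, cy): principal point
    cam_height: camera height above the floor, in metres
    Returns (x, y) floor coordinates in metres, relative to the point
    directly beneath the camera.
    """
    # Normalised image coordinates (ray direction per unit depth)
    x_n = (u - cx) / fx
    y_n = (v - cy) / fy
    # With the optical axis pointing straight down, the ray meets the
    # floor at depth equal to the camera height.
    return x_n * cam_height, y_n * cam_height


def apparent_size_to_real(size_px, f_px, depth):
    """Recover the real size of an object from its apparent pixel size
    by similar triangles: real_size = size_px * depth / focal_length."""
    return size_px * depth / f_px
```

For example, with a 2.5 m ceiling height and a focal length of 500 px, a robot imaged 100 px from the principal point lies 0.5 m from the spot directly under the camera, and an object spanning 100 px on the floor is 0.5 m across.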


Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • JaeMu Yun 1
  • EunTae Lyu 1
  • JangMyung Lee 1

  1. Department of Electronics Engineering, Pusan National University, Busan, Korea