
Task-oriented navigation algorithms for an outdoor environment with colored borders and obstacles

Abstract

This paper presents task-oriented navigation algorithms for an outdoor environment. The goals of the navigation are to recognize the colored border lines on both sides of a path, to avoid obstacles on the path, and to traverse the given path. To recognize the colored border lines with a single camera, we apply a support vector data description method that employs six color features extracted from two color models. To avoid collisions with obstacles on the path, we fuse the line data measured by the camera with the obstacle data measured by a laser range finder. These algorithms were applied to autonomous navigation of an approximately 100 m long curved track, and we demonstrate that a four-wheel skid-steering mobile robot successfully completes the mission.
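
A minimal illustration of the color-classification step, not the authors' implementation: the paper trains a support vector data description (SVDD) on six color features taken from two color models, but the specific models and solver are not stated here, so the sketch below assumes RGB + HSV as the two color models and substitutes scikit-learn's OneClassSVM, a close relative of SVDD with an RBF kernel, for the actual classifier.

    # Minimal sketch, NOT the authors' implementation: one-class color
    # classification of border-line pixels. RGB + HSV as the two color
    # models and OneClassSVM in place of SVDD are assumptions.
    import numpy as np
    from matplotlib.colors import rgb_to_hsv
    from sklearn.svm import OneClassSVM

    def color_features(rgb_pixels):
        # Stack RGB and HSV into a six-dimensional feature per pixel.
        # rgb_pixels: (N, 3) array of RGB values scaled to [0, 1].
        hsv = rgb_to_hsv(rgb_pixels)
        return np.hstack([rgb_pixels, hsv])      # shape (N, 6)

    # Training samples: pixels known to lie on the colored border line
    # (synthetic yellow-ish values here, purely for illustration).
    rng = np.random.default_rng(0)
    line_rgb = np.clip(rng.normal([0.9, 0.8, 0.1], 0.05, size=(200, 3)), 0.0, 1.0)

    detector = OneClassSVM(kernel="rbf", gamma="scale", nu=0.05)
    detector.fit(color_features(line_rgb))

    # Classify new pixels: +1 = border-line color, -1 = background.
    test_rgb = np.array([[0.88, 0.79, 0.12],   # close to the line color
                         [0.20, 0.55, 0.25]])  # grass-like background
    print(detector.predict(color_features(test_rgb)))

Because the detector is one-class, only pixels sampled from the colored border line are needed for training; background pixels (asphalt, grass, shadows) are rejected as outliers at prediction time.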



Author information

Correspondence to Byung-Ju Yi.


About this article

Cite this article

Jung, E., Yi, B. Task-oriented navigation algorithms for an outdoor environment with colored borders and obstacles. Intell Serv Robotics 6, 69–77 (2013). https://doi.org/10.1007/s11370-012-0114-2


Keywords

  • Color detection
  • Sensor fusion
  • Mobile robot
  • Outdoor navigation