Intelligent Service Robotics

Volume 10, Issue 1, pp 67–76

Localization of AUVs using visual information of underwater structures and artificial landmarks

  • Jongdae Jung
  • Ji-Hong Li
  • Hyun-Taek Choi
  • Hyun Myung
Original Research Paper


Autonomous underwater vehicles (AUVs) can perform flexible operations in complex underwater environments due to their autonomy. Localization is one of the key components of autonomous navigation. Since the inertial navigation system of an AUV suffers from drift, observing fixed objects in an inertial reference system can enhance the localization performance. In this paper, we propose a method of localizing AUVs by exploiting visual measurements of underwater structures and artificial landmarks. In a framework of particle filtering, a camera measurement model that emulates the camera’s observation of underwater structures is designed. The particle weight is then updated based on the extracted visual information of the underwater structures. Detected artificial landmarks are also used in the particle weight update. The proposed method is validated by experiments performed in a structured basin environment.
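The particle-weight update described in the abstract can be sketched as follows. This is a minimal illustrative example only: the landmark map, the range-only measurement model, and the Gaussian likelihood are assumptions made for the sketch, not the paper's actual camera measurement model of underwater structures.

```python
import math

# Assumed landmark map (x, y) and range-measurement noise; both are
# hypothetical values chosen for illustration, not from the paper.
LANDMARKS = [(2.0, 3.0), (8.0, 1.0)]
MEAS_STD = 0.5

def measure_ranges(pose):
    """Expected range to each landmark from a particle's (x, y) pose."""
    x, y = pose
    return [math.hypot(lx - x, ly - y) for lx, ly in LANDMARKS]

def likelihood(observed, expected, std=MEAS_STD):
    """Unnormalized Gaussian likelihood of observed ranges given expected ones."""
    w = 1.0
    for z, zhat in zip(observed, expected):
        w *= math.exp(-0.5 * ((z - zhat) / std) ** 2)
    return w

def update_weights(particles, weights, observed):
    """Reweight each particle by its measurement likelihood, then normalize."""
    new_w = [w * likelihood(observed, measure_ranges(p))
             for p, w in zip(particles, weights)]
    total = sum(new_w)
    if total == 0.0:  # degenerate case: fall back to uniform weights
        return [1.0 / len(weights)] * len(weights)
    return [w / total for w in new_w]

# Example: a particle near the true pose (5, 5) gains weight,
# while particles far from it are suppressed.
true_pose = (5.0, 5.0)
observation = measure_ranges(true_pose)
particles = [(5.1, 4.9), (1.0, 1.0), (9.0, 9.0)]
weights = update_weights(particles, [1.0 / 3] * 3, observation)
```

After the update, resampling would concentrate particles in high-likelihood regions; the paper's method additionally derives the likelihood from simulated camera views of the structure rather than from ranges.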


Keywords: Autonomous navigation · Localization · Vision · Underwater structures · Particle filter



This research was supported in part by Grant No. 10043928 from the Industrial Source Technology Development Programs of the Ministry of Trade, Industry & Energy (MOTIE), Korea, and in part by the project "Development of basic SLAM technologies for autonomous underwater robot and software environment for MOOS-IvP," sponsored by the Korea Research Institute of Ships & Ocean Engineering (KRISO). The students were supported by the Korea Ministry of Land, Infrastructure and Transport (MOLIT) under the U-City Master and Doctor Course Grant Program.


References

  1. Caccia M (2007) Vision-based ROV horizontal motion control: near-seafloor experimental results. Control Eng Pract 15(6):703–714
  2. Ferreira F, Veruggio G, Caccia M, Bruzzone G (2012) Real-time optical SLAM-based mosaicking for unmanned underwater vehicles. Intell Serv Robot 5(1):55–71
  3. Leabourne KN, Rock SM, Fleischer SD, Burton R (1997) Station keeping of an ROV using vision technology. Proc MTS/IEEE OCEANS '97 1:634–640
  4. Negahdaripour S, Firoozfam P (2006) An ROV stereo vision system for ship-hull inspection. IEEE J Ocean Eng 31(3):551–564
  5. Whitcomb L, Yoerger D, Singh H, Howland J (1999) Advances in underwater robot vehicles for deep ocean exploration: navigation, control, and survey operations. In: 9th international symposium on robotics research, pp 346–353
  6. Hollinger GA, Englot B, Hover FS, Mitra U, Sukhatme GS (2013) Active planning for underwater inspection and the benefit of adaptivity. Int J Robot Res 32(1):3–18
  7. Jun BH, Park JY, Lee FY, Lee PM, Lee CM, Kim K, Lim YK, Oh JH (2009) Development of the AUV ISiMI and a free running test in an ocean engineering basin. Ocean Eng 36(1):2–14
  8. Kim A, Eustice R (2009) Pose-graph visual SLAM with geometric model selection for autonomous underwater ship hull inspection. In: Proceedings of the IEEE/RSJ international conference on intelligent robots and systems, pp 1559–1565
  9. Marani G, Choi S (2010) Underwater target localization. IEEE Robot Autom Mag 17(1):64–70
  10. Kim DH, Lee DH, Myung H, Choi HT (2014) Artificial landmark-based underwater localization for AUV using weighted template matching. Intell Serv Robot 7(3):175–184
  11. Paull L (2014) AUV navigation and localization: a review. IEEE J Ocean Eng 39(1):131–149
  12. Fallon MF, Kaess M, Johannsson H, Leonard JJ (2011) Efficient AUV navigation fusing acoustic ranging and side-scan sonar. In: Proceedings of the IEEE international conference on robotics and automation, pp 2398–2405
  13. Johannsson H, Kaess M, Englot B, Hover F, Leonard JJ (2010) Imaging sonar-aided navigation for autonomous underwater harbor surveillance. In: Proceedings of the IEEE/RSJ international conference on intelligent robots and systems, pp 4396–4403
  14. Bülow H, Birk A (2011) Spectral registration of noisy sonar data for underwater 3D mapping. Auton Robot 30:307–331
  15. Pfingsthorn M, Birk A, Bülow H (2012) Uncertainty estimation for a 6-DOF spectral registration method as basis for sonar-based underwater 3D SLAM. In: Proceedings of the IEEE international conference on robotics and automation, pp 3049–3054
  16. Pathak K, Birk A, Vaskevicius N (2010) Plane-based registration of sonar data for underwater 3D mapping. In: Proceedings of the IEEE/RSJ international conference on intelligent robots and systems, pp 4880–4885
  17. Balasuriya B, Takai M, Lam W, Ura T, Kuroda Y (1997) Vision based autonomous underwater vehicle navigation: underwater cable tracking. Proc MTS/IEEE OCEANS '97 2:1418–1424
  18. Hover FS, Eustice RM, Kim A, Englot B, Johannsson H, Kaess M, Leonard JJ (2012) Advanced perception, navigation and planning for autonomous in-water ship hull inspection. Int J Robot Res 31(12):1445–1464
  19. Kim A, Eustice R (2013) Real-time visual SLAM for autonomous underwater hull inspection using visual saliency. IEEE T Robot 29(3):719–733
  20. Lowe DG (1999) Object recognition from local scale-invariant features. Proc IEEE Int Conf Comput Vis 2:1150–1157
  21. Bay H, Ess A, Tuytelaars T, Van Gool L (2008) Speeded-up robust features (SURF). Comput Vis Image Und 110:346–359
  22. Fallon MF, Johannsson H, Leonard JJ (2012) Efficient scene simulation for robust Monte Carlo localization using an RGB-D camera. In: Proceedings of the IEEE international conference on robotics and automation, pp 1663–1670
  23. Gerard P, Gagalowicz A (2000) Three dimensional model-based tracking using texture learning and matching. Pattern Recogn Lett 21:1095–1103
  24. Noyer J, Lanvin P, Benjelloun M (2004) Model-based tracking of 3D objects based on a sequential Monte-Carlo method. Conf Signals Syst Comput 2:1744–1748
  25. Zang C, Hashimoto K (2011) Camera localization by CAD model matching. In: 2011 IEEE/SICE international symposium on system integration, pp 30–35
  26. Hoermann S, Borges P (2014) Vehicle localization and classification using off-board vision and 3-D models. IEEE T Robot 30(2):432–447
  27. Kondo H, Maki T, Ura T, Nose Y, Sakamaki T, Inaishi M (2004) Relative navigation of an autonomous underwater vehicle using a light-section profiling system. In: Proceedings of the international conference on intelligent robots and systems, pp 1103–1108
  28. Thrun S, Burgard W, Fox D (2005) Probabilistic robotics. MIT Press, Cambridge
  29. Dellaert F, Fox D, Burgard W, Thrun S (1999) Monte Carlo localization for mobile robots. Proc IEEE Int Conf Robot Autom 2:1322–1328
  30. Li JH, Lee MJ, Kim JG, Kim JT, Suh J (2014) Development of P-SURO II hybrid AUV and its experimental study. In: Proceedings of the OCEANS, pp 1–6
  31. Fernandez-Madrigal J, Claraco J (2013) Simultaneous localization and mapping for mobile robots: introduction and methods. Information Science Reference, Hershey, PA
  32.
  33. Vincent L, Soille P (1991) Watersheds in digital spaces: an efficient algorithm based on immersion simulations. IEEE T Pattern Anal 13(6):583–598
  34. Ilea DE, Whelan PF (2006) Color image segmentation using a spatial k-means clustering algorithm. In: IMVIP 2006—10th international machine vision and image processing conference, pp 1–8
  35. Achanta R, Shaji A, Smith K, Lucchi A, Fua P, Süsstrunk S (2012) SLIC superpixels compared to state-of-the-art superpixel methods. IEEE T Pattern Anal 34(11):2274–2282
  36. Lee YJ, Lee JH, Choi HT (2014) A framework of recognition and tracking for underwater objects based on sonar images: part 1. Design and recognition of artificial landmark considering characteristics of sonar images. J Inst Electron Inf Eng 51(2):422–429 (in Korean)
  37. Olson E (2011) AprilTag: a robust and flexible visual fiducial system. In: Proceedings of the IEEE international conference on robotics and automation, pp 3400–3407

Copyright information

© Springer-Verlag Berlin Heidelberg 2016

Authors and Affiliations

  • Jongdae Jung (1)
  • Ji-Hong Li (2)
  • Hyun-Taek Choi (1)
  • Hyun Myung (3)

  1. Ocean System Engineering Research Division, Korea Research Institute of Ships and Ocean Engineering (KRISO), Daejeon, Republic of Korea
  2. Applied Technology Division, Korea Institute of Robot and Convergence (KIRO), Pohang, Republic of Korea
  3. Urban Robotics Laboratory, KAIST, Daejeon, Republic of Korea
