Experimental verification of turbidity tolerance of stereo-vision-based 3D pose estimation system

  • Original article
  • Published in: Journal of Marine Science and Technology

Abstract

This paper presents the verification of the turbidity tolerance of a stereo-vision-based 3D pose estimation system for underwater docking applications. To the best of the authors’ knowledge, no studies have yet been conducted on 3D pose (position and orientation) estimation under turbid conditions for underwater vehicles. This work therefore studies the effect of turbidity on the 3D pose estimation performance of underwater vehicles and a method of operating under turbid conditions. The 3D pose estimation method based on the real-time multi-step genetic algorithm (RM-GA), proposed by the authors in previous works, shows robust pose estimation performance under changing environmental conditions. This paper discusses how and why the RM-GA is well suited to effective 3D pose estimation, even when turbid conditions disturb visual servoing. Experimental results confirm the performance of the proposed 3D pose estimation system under different levels of turbidity. To demonstrate the practical usefulness of the RM-GA, docking experiments were conducted in a turbid pool and in a real sea environment to verify the performance and tolerance of the proposed system under turbid conditions. The experimental results verify the robustness of the system against turbidity, presenting a possible solution to a major problem in the field of robotics.
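The abstract does not detail the RM-GA itself, but the core idea of an evolutionary 3D pose search can be sketched as follows. This is an illustrative stand-in, not the authors’ implementation: the pose parameterization, the population settings, and the toy fitness function are all assumptions for demonstration — in the real system, fitness would be computed by projecting a known 3D marker into both camera images of the stereo pair and scoring photometric agreement, which is what gives the method its tolerance to turbidity-degraded pixels.

```python
import random

random.seed(0)  # deterministic run for illustration

# Hypothetical ground-truth pose (x, y, z, yaw) used only by the toy fitness.
TRUE_POSE = (0.5, -0.2, 1.0, 0.1)

def fitness(pose):
    # Toy surrogate for image correlation: higher (closer to 0) when the
    # candidate pose is near TRUE_POSE. The real system instead measures how
    # well the marker, projected under this pose, matches both camera images.
    return -sum((p - t) ** 2 for p, t in zip(pose, TRUE_POSE))

def evolve(pop_size=60, generations=40, mutation=0.05):
    # Random initial population of candidate poses.
    pop = [tuple(random.uniform(-1, 1) for _ in range(4)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 3]          # selection: keep the fittest third
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            child = tuple(
                (x + y) / 2 + random.gauss(0, mutation)  # crossover + mutation
                for x, y in zip(a, b)
            )
            children.append(child)
        pop = elite + children                # elitism keeps the best-so-far
    return max(pop, key=fitness)

best = evolve()
```

Because each candidate is scored against raw image evidence rather than against extracted features, a search of this kind degrades gracefully as turbidity lowers contrast: the best-fitting pose still wins even when the absolute match quality drops.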




Acknowledgements

This work was supported by JSPS KAKENHI Grant Number JP16K06183. The authors would like to thank Monbukagakusho; Mitsui Engineering and Shipbuilding Co., Ltd.; and Kowa Corporation for their collaboration and support for this study. The authors would also like to express their thanks to Dr. Yuya Nishida, Prof. Toshihiko Maki, and Prof. Tamaki Ura for their help and support.

Author information

Correspondence to Myo Myint.

About this article


Cite this article

Myint, M., Lwin, K.N., Mukada, N. et al. Experimental verification of turbidity tolerance of stereo-vision-based 3D pose estimation system. J Mar Sci Technol 24, 756–779 (2019). https://doi.org/10.1007/s00773-018-0586-7


Keywords: Navigation