
Autonomous Underwater Vehicle Control for Fishnet Inspection in Turbid Water Environments

  • Regular Papers
  • Robot and Applications
  • Published in: International Journal of Control, Automation and Systems

Abstract

Fisheries are an essential and economical source of dietary protein. Inspecting fishnets for damage with autonomous underwater vehicles (AUVs) can be an efficient and safe alternative to sending human divers. In turbid underwater environments, however, visibility is severely degraded by suspended particles that attenuate light, which is one of the main obstacles to accurate underwater inspection with optical cameras. To obtain clear images for net inspection, we propose an AUV pose-control strategy for fish-farm net inspection in turbid water, based on the mean gradient feature computed over part or all of the image. To relieve the laborious manual process of setting the desired set-point for distance control, a convolutional neural network (CNN) is trained offline with supervised learning and combined with the controller. The proposed method maintains a nearly constant relative pose with respect to the fishnet, which is sufficient to acquire clear net images in turbid water and to determine whether any part of the net is damaged. Experimental results in both swimming pools and a real fish-farm environment demonstrate the effectiveness of the proposed methods.
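As a rough illustration of the sharpness cue the abstract describes, the mean gradient of a grayscale image can be computed with a Sobel operator and used as a scalar measure of image clarity. The pure-NumPy function below is a minimal sketch under stated assumptions, not the authors' implementation; the function name, kernel choice, and magnitude formula are illustrative.

```python
import numpy as np

def mean_gradient(image: np.ndarray) -> float:
    """Mean Sobel gradient magnitude of a grayscale image.

    A higher value indicates sharper edges (e.g., a fishnet seen
    clearly at close range); a lower value indicates a blurred or
    featureless view, as in turbid water. Hypothetical sketch.
    """
    # 3x3 Sobel kernels (horizontal and vertical gradients).
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = image.shape
    gx = np.zeros((h - 2, w - 2))
    gy = np.zeros((h - 2, w - 2))
    # Correlate the image with each kernel over the valid region.
    for i in range(3):
        for j in range(3):
            patch = image[i:i + h - 2, j:j + w - 2]
            gx += kx[i, j] * patch
            gy += ky[i, j] * patch
    # Average gradient magnitude over all valid pixels.
    return float(np.mean(np.hypot(gx, gy)))
```

A controller could compare this scalar against a desired set-point: if the mean gradient drops, the vehicle is too far from the net (or the water is too turbid at that range) and should move closer.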



Author information


Correspondence to Jeha Ryu.

Additional information


This work was supported by a GIST Research Institute (GRI) grant funded by GIST.

Hoosang Lee received his B.S. degree in robotics from Kwangwoon University, Seoul, Korea in 2018, and the M.S. degree in intelligent robotics technology from Gwangju Institute of Science and Technology (GIST), Gwangju, Korea in 2020. His research interests include robot control and robot learning.

Daehyeon Jeong received his B.S. degree in mechanical engineering from Korea University, Seoul, Korea in 2013, and the M.S. degree in mechatronics, and the Ph.D. degree in intelligent robotics technology from Gwangju Institute of Science and Technology, Gwangju, Korea, in 2015 and 2022, respectively. His research interests include artificial intelligence and autonomous driving.

Hongje Yu received his B.S. degree in mechanical engineering, and the M.S. degree in intelligent robotics technology from Gwangju Institute of Science and Technology (GIST) in 2018 and 2020, respectively. His research interests include control engineering.

Jeha Ryu received his B.S., M.S., and Ph.D. degrees in mechanical engineering from Seoul National University, Seoul, Korea, in 1982, from Korea Advanced Institute of Science and Technology (KAIST), in 1984, and from the University of Iowa, in 1991, respectively. He was a professor in the Department of Mechatronics and is now a professor in the School of Integrated Technology at Gwangju Institute of Science and Technology (GIST). His research interests include robot/vehicle kinematics, dynamics, control, haptics, virtual reality, and artificial intelligence.


Cite this article

Lee, H., Jeong, D., Yu, H. et al. Autonomous Underwater Vehicle Control for Fishnet Inspection in Turbid Water Environments. Int. J. Control Autom. Syst. 20, 3383–3392 (2022). https://doi.org/10.1007/s12555-021-0357-9


Keywords

Navigation