
Journal of Intelligent & Robotic Systems, Volume 91, Issue 2, pp 279–289

Object Recognition and Semantic Mapping for Underwater Vehicles Using Sonar Data

  • Matheus dos Santos
  • Paulo Drews Jr.
  • Pedro Núñez
  • Silvia Botelho

Abstract

The application of robots as tools to explore underwater environments has increased in the last decade. Underwater tasks such as inspection, maintenance, and monitoring can be automated by robots. Understanding the underwater environment and recognizing the objects in it are required capabilities that are becoming critical for these systems. In this work, a method that provides semantic mapping of underwater environments is presented. This novel system is independent of water turbidity and uses acoustic images acquired by a Forward-Looking Sonar (FLS). The proposed method efficiently segments and classifies the structures in the scene using geometric information of the recognized objects. From these classifications, a semantic map of the scene is created, which allows the robot to describe its environment in terms of high-level semantic features. Finally, the proposal is evaluated on a real dataset acquired by an underwater vehicle in a marina area. Experimental results demonstrate the robustness and accuracy of the method described in this paper.
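The abstract only summarizes the segmentation and classification stages, so the sketch below illustrates one plausible way such a pipeline could be assembled: threshold a sonar frame, extract geometric features from each segmented region, and label the regions with an SVM. It is a minimal sketch under stated assumptions, not the authors' actual method; the libraries (OpenCV, scikit-learn), the threshold value, the feature set, and the label names are all illustrative choices (the full paper's reference list does include OpenCV and LIBSVM, but the parameters here are guesses).

    # Hypothetical sketch: segment a forward-looking sonar (FLS) frame and
    # classify each segment from simple geometric features with an SVM.
    # Thresholds, feature choices, and labels are illustrative assumptions.
    import cv2
    import numpy as np
    from sklearn.svm import SVC

    def segment_fls_frame(gray, thresh=80):
        """Return contours of bright (high acoustic return) regions."""
        blurred = cv2.GaussianBlur(gray, (5, 5), 0)  # suppress speckle noise
        _, binary = cv2.threshold(blurred, thresh, 255, cv2.THRESH_BINARY)
        contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        return [c for c in contours if cv2.contourArea(c) > 50.0]  # drop tiny blobs

    def geometric_features(contour):
        """Area, perimeter, bounding-box aspect ratio, and Hu moments."""
        area = cv2.contourArea(contour)
        perimeter = cv2.arcLength(contour, True)
        x, y, w, h = cv2.boundingRect(contour)
        aspect = w / float(h)
        hu = cv2.HuMoments(cv2.moments(contour)).flatten()
        return np.hstack([area, perimeter, aspect, hu])

    def train_classifier(X_train, y_train):
        """X_train/y_train are placeholders for annotated sonar regions
        (labels such as 'pier', 'hull', 'background')."""
        clf = SVC(kernel="rbf", gamma="scale")
        clf.fit(X_train, y_train)
        return clf

    def label_frame(clf, gray):
        """Return one predicted semantic label per segmented region."""
        contours = segment_fls_frame(gray)
        if not contours:
            return []
        feats = np.vstack([geometric_features(c) for c in contours])
        return clf.predict(feats)

A complete system would additionally need the conversion of raw sonar beams into the image plane and a mapping stage that accumulates the labeled regions into the semantic map; both are beyond this sketch.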

Keywords

Robot vision · Underwater robot · Semantic mapping · Object recognition · Forward-looking sonar



Acknowledgments

We thank CNPq, CAPES, FAPERGS, the Brazilian Oil Agency, PRH-27 FURG-ANP/MCT, and IBP – the Brazilian Petroleum, Gas and Biofuels Institute for supporting this research. This paper is a contribution of the INCT-Mar COI, funded by CNPq Grant Number 610012/2011-8, and of the CAPES-DGPU project BS-NAVLOC (CAPES no 321/15, DGPU 7523/14-9, MEC project PHBP14/00083): Brazil-Spain cooperation on navigation and localization for autonomous robots in terrestrial and underwater environments.


Copyright information

© Springer Science+Business Media B.V. 2017

Authors and Affiliations

  1. NAUTEC, Intelligent Robotics and Automation Group, Center of Computational Science, Univ. Federal do Rio Grande - FURG, Rio Grande, Brazil
  2. ROBOLAB, Robotics Laboratory, Computer and Communication Technology Department, Universidad de Extremadura, Cáceres, Spain
