State Estimation and Localization for ROV-Based Reactor Pressure Vessel Inspection

  • Timothy E. Lee
  • Nathan Michael
Conference paper
Part of the Springer Proceedings in Advanced Robotics book series (SPAR, volume 5)

Abstract

A vision-based extended Kalman filter is proposed to estimate the state of a remotely operated vehicle (ROV) used for inspection of a nuclear reactor pressure vessel. The state estimation framework employs an overhead pan-tilt-zoom (PTZ) camera as the primary sensing modality. In addition to the camera state, a map of the nuclear reactor vessel is estimated from a prior. We conduct experiments to validate the framework in terms of accuracy and robustness to environmental image degradation caused by speckling and color attenuation. Subscale mockup experiments demonstrate consistency of the estimates with ground truth despite visually degraded operating conditions. Full-scale platform experiments are conducted with the actual inspection system in a dry setting; in this case, the ROV achieves lower state uncertainty than in the subscale mockup evaluation. For both subscale and full-scale experiments, the state uncertainty is robust to environmental image degradation effects.
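To make the filtering approach described above concrete, the following is a minimal sketch (not the authors' implementation) of an extended Kalman filter step: a constant-velocity motion model for the ROV's planar position, updated with a pixel observation from a fixed overhead camera. The state layout, noise covariances, camera parameters, and the simple pinhole projection are illustrative assumptions only.

```python
import numpy as np

# Assumed update period and constant-velocity model; state is [x, y, vx, vy].
dt = 0.1
F = np.block([[np.eye(2), dt * np.eye(2)],
              [np.zeros((2, 2)), np.eye(2)]])
Q = np.diag([1e-4, 1e-4, 1e-3, 1e-3])   # assumed process noise
R = np.diag([2.0, 2.0])                  # assumed pixel measurement noise [px^2]

fx = fy = 800.0        # assumed focal lengths [px]
cx = cy = 320.0        # assumed principal point [px]
cam_height = 5.0       # assumed camera height above the ROV plane [m]

def h(x):
    """Project the planar ROV position into the overhead camera (pinhole)."""
    return np.array([fx * x[0] / cam_height + cx,
                     fy * x[1] / cam_height + cy])

def H_jacobian(x):
    """Jacobian of the measurement function with respect to the state."""
    H = np.zeros((2, 4))
    H[0, 0] = fx / cam_height
    H[1, 1] = fy / cam_height
    return H

def ekf_step(x, P, z):
    # Predict with the constant-velocity model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the camera pixel measurement z.
    Hj = H_jacobian(x_pred)
    y = z - h(x_pred)                      # innovation
    S = Hj @ P_pred @ Hj.T + R             # innovation covariance
    K = P_pred @ Hj.T @ np.linalg.inv(S)   # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(4) - K @ Hj) @ P_pred
    return x_new, P_new

# Example: one filter step from an uncertain initial state,
# with a hypothetical pixel detection of the ROV.
x0 = np.zeros(4)
P0 = np.eye(4)
z = np.array([350.0, 300.0])
x1, P1 = ekf_step(x0, P0, z)
```

In practice the paper's framework also estimates the PTZ camera state and a map prior of the vessel; the sketch above covers only the generic predict-update structure of such a filter.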

Acknowledgements

We gratefully acknowledge support from Westinghouse Electric Company, LLC.

Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  1. Robotics Institute, Carnegie Mellon University, Pittsburgh, USA