Automatic Detection of Structural Deficiencies Using 4D Hue-Assisted Analysis of Color Point Clouds
Recent developments in robotics and remote sensing technologies, such as 3D laser scanners and photogrammetric approaches, have provided an unprecedented opportunity to collect massive amounts of data from infrastructure systems in a contactless and nondestructive manner, which can potentially improve the structural health monitoring process. However, the complex nature of these geometrically accurate, high-resolution 3D models makes them inefficient and time-consuming to analyze and manipulate manually, and automating this process continues to pose a challenge. Thus, procedures that automate the data processing to detect a variety of damage types are desired to make full use of these modern inspection technologies as tools for infrastructure integrity assessment and asset management. The aim of this paper is to present a new algorithm to automatically identify and evaluate structural deficiencies in massive 3D point clouds of complex infrastructure systems. The approach takes advantage of both the local geometry and the color properties associated with each point to improve damage detection capabilities in a variety of scenarios. Linear and non-linear transformations from the RGB color space to non-RGB spaces were performed to increase separability between the damage and the structure and to achieve robustness to changes in illumination. A complex, large-scale gravity dam in Maryland, USA served as a test bed for the developed methodology. In this experiment, a multi-scale photogrammetric computer vision approach was utilized to generate accurate and highly detailed 3D models of the targeted dam. To maximize accessibility and overcome geometric constraints, different multi-rotor Unmanned Aerial Vehicle (UAV) platforms with varied payload and maneuverability capabilities, each equipped with different optical sensors, were used in this study.
Experimental results demonstrate that the presented 4D point cloud analysis method can accurately detect and quantify a variety of anomalies from spalling to moisture infiltration in exposed concrete structures.
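The core idea of the 4D representation described above can be sketched in a few lines: each point's RGB color is transformed to a hue value (hue being largely invariant to illumination intensity), and that hue is appended to the XYZ coordinates to form a 4D point. The following is a minimal illustrative sketch, not the paper's actual implementation; the function names and the hue-band segmentation threshold are hypothetical, and the published method combines color with local geometric features rather than hue alone.

```python
import colorsys
import numpy as np

def rgb_to_hue(rgb):
    """Convert per-point RGB colors (0-255) to hue values in [0, 1).

    Hue decouples chromaticity from brightness, which is why non-RGB
    spaces such as HSV improve robustness to illumination changes.
    """
    rgb = np.asarray(rgb, dtype=float) / 255.0
    return np.array([colorsys.rgb_to_hsv(r, g, b)[0] for r, g, b in rgb])

def to_4d_cloud(xyz, rgb):
    """Augment an Nx3 point cloud with hue to form Nx4 'hue-assisted' points."""
    hue = rgb_to_hue(rgb)
    return np.column_stack([np.asarray(xyz, dtype=float), hue])

def segment_by_hue(cloud_4d, hue_min, hue_max):
    """Keep points whose hue falls inside a band of interest.

    A fixed hue band is a simplification for illustration; in practice
    the band would be learned or tuned per deficiency type (e.g. the
    darker, desaturated tones of moisture infiltration on concrete).
    """
    hue = cloud_4d[:, 3]
    mask = (hue >= hue_min) & (hue <= hue_max)
    return cloud_4d[mask]
```

For example, a cloud of two points colored pure red and pure green yields hue values 0.0 and 1/3 in the fourth column, and a band of [0.3, 0.4] retains only the green point.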
Keywords: 3D point cloud · Unmanned Aerial Vehicle (UAV) · Computer vision · Structural health monitoring · Damage detection
This material is based upon the work supported by the National Science Foundation (NSF) under Grant No. CMMI-1433765. The authors would also like to acknowledge the support made by NVIDIA Corporation with the donation of a Tesla K40 GPU used in this research. Any opinions, findings, and conclusions or recommendations expressed in this publication are those of the authors and do not necessarily reflect the views of the NSF or NVIDIA.