Journal of Nondestructive Evaluation, Volume 31, Issue 3, pp 225–244

Sparse Deconvolution Methods for Ultrasonic NDT

Application on TOFD and Wall Thickness Measurements


  • Florian Boßmann
    • Institute for Numerical and Applied Mathematics, University of Göttingen
  • Gerlind Plonka
    • Institute for Numerical and Applied Mathematics, University of Göttingen
  • Thomas Peter
    • Institute for Numerical and Applied Mathematics, University of Göttingen
  • Oliver Nemitz
    • Salzgitter Mannesmann Forschung GmbH
  • Till Schmitte
    • Salzgitter Mannesmann Forschung GmbH
Open Access Article

DOI: 10.1007/s10921-012-0138-8

Cite this article as:
Boßmann, F., Plonka, G., Peter, T. et al. J Nondestruct Eval (2012) 31: 225. doi:10.1007/s10921-012-0138-8


In this work we present two sparse deconvolution methods for nondestructive testing. The first method is a special matching pursuit (MP) algorithm that deconvolves the mixed data (signal and noise) and thus removes the unwanted noise. The second method is based on the approximate Prony method (APM). Both methods employ the sparsity of the measured ultrasonic signal as prior knowledge. The MP algorithm derives a sparse representation of the measured data by a deconvolution and subtraction scheme; an orthogonal variant of the algorithm (OMP) is presented as well. The APM technique likewise relies on the assumption that the desired signals are sparse linear combinations of (reflections of) the transmitted pulse. For blind deconvolution, where the transducer impulse response is unknown, we offer a general Gaussian echo model whose parameters can be iteratively adjusted to the real measurements. Several test results show that the methods work well even at high noise levels. Finally, an outlook on possible applications of these deconvolution methods is given.
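The deconvolution-and-subtraction scheme described above can be sketched as a greedy matching pursuit over a dictionary of time-shifted echoes. The following minimal Python sketch is illustrative only, not the authors' implementation: the Gaussian-modulated cosine echo (`gaussian_echo`) and its parameter values (`alpha`, `f`, `phi`) are assumptions standing in for the paper's general Gaussian echo model, and the dictionary of shifted pulses is a simplification of the deconvolution setting.

```python
import numpy as np

def gaussian_echo(t, tau, alpha=1e14, f=5e6, phi=0.0):
    """Gaussian-modulated cosine pulse centred at time-of-flight tau.
    The envelope width (alpha) and centre frequency (f) are assumed
    example values, not parameters taken from the paper."""
    return np.exp(-alpha * (t - tau) ** 2) * np.cos(2 * np.pi * f * (t - tau) + phi)

def matching_pursuit(y, dictionary, n_iter):
    """Greedy MP: at each step, pick the atom (column) best correlated
    with the current residual, record its coefficient, and subtract its
    contribution from the residual."""
    residual = y.astype(float).copy()
    coeffs = np.zeros(dictionary.shape[1])
    norms = np.linalg.norm(dictionary, axis=0)   # normalise atoms so
    D = dictionary / norms                       # inner products compare fairly
    for _ in range(n_iter):
        corr = D.T @ residual
        k = int(np.argmax(np.abs(corr)))
        coeffs[k] += corr[k] / norms[k]          # amplitude in original scaling
        residual -= corr[k] * D[:, k]
    return coeffs, residual
```

For two well-separated echoes that are exact dictionary atoms, two iterations already recover both arrival times and amplitudes, and the residual drops to numerical noise; the OMP variant mentioned in the abstract would additionally re-fit all selected atoms jointly at each step.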


Keywords: Time of flight diffraction · Matching pursuit · Orthogonal matching pursuit · Approximate Prony method · Sparse blind deconvolution · Parameter estimation · Sparse representation

Copyright information

© The Author(s) 2012