Accelerating data uncertainty quantification by solving linear systems with multiple right-hand sides
The subject of this work is accelerating data uncertainty quantification. In particular, we are interested in expediting the stochastic estimation of the diagonal of the inverse covariance (precision) matrix, which holds a wealth of information concerning the quality of data collections, especially when the matrices are symmetric positive definite and dense. Schemes built on direct methods incur a prohibitive cubic cost. Recently proposed iterative methods can remedy this, but the overall cost rises again because stochastic estimators can converge slowly. The motivation behind our approach stems from the fact that the computational bottleneck in stochastic estimation is the application of the precision matrix to a set of appropriately selected vectors. The proposed method combines block conjugate gradient with a block-seed approach for multiple right-hand sides, taking advantage of the nature of the right-hand sides and the fact that the diagonal is not sought to high accuracy. Our method is applicable when the matrix is known only implicitly, and it also produces a matrix-free diagonal preconditioner that can be applied to accelerate the method further. Numerical experiments confirm that the approach is promising and helps contain the overall cost of diagonal estimation as the number of samples grows.
Keywords: Uncertainty quantification · Matrix-free methods · Iterative · Preconditioning · Multiple right-hand sides · Conjugate gradient · Block conjugate gradient · Seed methods · Stochastic diagonal of inverse estimator · Rademacher vectors
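The core estimator described above admits a compact sketch. The snippet below is an illustrative, minimal version of the standard stochastic diagonal-of-inverse estimator with Rademacher probe vectors, where each probe requires solving a linear system with the (implicitly available) matrix. For clarity it uses plain conjugate gradient per right-hand side; the paper's contribution is precisely to replace this stage with block and block-seed CG variants. All function names are illustrative, not taken from the paper.

```python
import numpy as np

def conjugate_gradient(A_mv, b, tol=1e-6, maxiter=1000):
    """Plain CG for an SPD matrix available only as a matvec A_mv.

    Solves A x = b; the paper replaces this per-vector solve with
    block / block-seed variants to share work across right-hand sides.
    """
    x = np.zeros_like(b)
    r = b - A_mv(x)
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Ap = A_mv(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) <= tol * np.linalg.norm(b):
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

def estimate_diag_inverse(A_mv, n, num_samples=50, rng=None):
    """Stochastic estimate of diag(A^{-1}) using Rademacher vectors.

    diag(A^{-1}) ~ (sum_k v_k * A^{-1} v_k) / (sum_k v_k * v_k),
    with elementwise products; each term needs one solve A x = v_k,
    which is the computational bottleneck the paper targets.
    """
    rng = np.random.default_rng(rng)
    num = np.zeros(n)
    den = np.zeros(n)
    for _ in range(num_samples):
        v = rng.choice([-1.0, 1.0], size=n)  # Rademacher probe vector
        x = conjugate_gradient(A_mv, v)      # matrix-free solve A x = v
        num += v * x
        den += v * v
    return num / den
```

Since only matvecs with `A_mv` are needed, the scheme is matrix-free, and the solves for the different probe vectors are exactly the "multiple right-hand sides" that block and seed methods exploit; the resulting diagonal estimate can in turn serve as a diagonal preconditioner for subsequent solves.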