Abstract
Although sparse grid least squares regression algorithms have frequently been used over the last 15 years to tackle big data problems with very large numbers of input points, a thorough theoretical analysis of their stability properties, error decay behavior, and the appropriate coupling between dataset size and grid size has not yet been provided. In this paper, we present a framework that allows us to close this gap and rigorously derive upper bounds on the expected error of sparse grid least squares regression. Furthermore, we verify that our theoretical convergence rates match the rates observed in numerical experiments.
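To convey the idea behind the method analyzed in the paper, the following is a minimal, purely illustrative sketch of least squares regression in a hierarchical hat-function basis, the one-dimensional building block of sparse grids. It is not the authors' algorithm: a genuine sparse grid in d dimensions combines tensor products of such bases over a selected set of level multi-indices, whereas this sketch works in 1D only, and the target function, sample counts, and level choice below are hypothetical.

```python
import numpy as np

def hat(x, level, index):
    """Hierarchical hat function on [0, 1] with support
    [(index-1)/2^level, (index+1)/2^level]."""
    return np.maximum(0.0, 1.0 - np.abs(2.0**level * x - index))

def basis_functions(max_level):
    """Hierarchical basis up to max_level: odd indices on each level."""
    return [(l, i) for l in range(1, max_level + 1)
                   for i in range(1, 2**l, 2)]

def fit_least_squares(x, y, max_level):
    """Solve min_c ||A c - y||_2 with design matrix A[j, k] = phi_k(x[j])."""
    basis = basis_functions(max_level)
    A = np.column_stack([hat(x, l, i) for (l, i) in basis])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return basis, coeffs

def evaluate(basis, coeffs, x):
    return sum(c * hat(x, l, i) for (l, i), c in zip(basis, coeffs))

rng = np.random.default_rng(0)
n = 1000                          # many more samples than basis functions:
x = rng.uniform(0.0, 1.0, n)      # the regime the stability analysis targets
f = lambda t: np.sin(2.0 * np.pi * t)
y = f(x) + 0.01 * rng.normal(size=n)

basis, coeffs = fit_least_squares(x, y, max_level=5)
x_test = np.linspace(0.05, 0.95, 50)
err = np.max(np.abs(evaluate(basis, coeffs, x_test) - f(x_test)))
print(f"{len(basis)} basis functions, max error {err:.3f}")
```

The oversampling here (1000 samples for 31 basis functions) reflects the kind of coupling between dataset size and grid size that the paper's error bounds make precise: with too few random samples per degree of freedom, the least squares system becomes ill-conditioned.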
Notes
1. For higher order spline bases, a larger choice of s can be exploited here. However, one first needs to prove an analogous result to Theorem 4 for the corresponding basis functions.
Acknowledgements
The author was supported by the Sonderforschungsbereich 1060 The Mathematics of Emergent Effects funded by the Deutsche Forschungsgemeinschaft.
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
Cite this paper
Bohn, B. (2018). On the Convergence Rate of Sparse Grid Least Squares Regression. In: Garcke, J., Pflüger, D., Webster, C., Zhang, G. (eds) Sparse Grids and Applications - Miami 2016. Lecture Notes in Computational Science and Engineering, vol 123. Springer, Cham. https://doi.org/10.1007/978-3-319-75426-0_2
Print ISBN: 978-3-319-75425-3
Online ISBN: 978-3-319-75426-0
eBook Packages: Mathematics and Statistics (R0)