
On the Convergence Rate of Sparse Grid Least Squares Regression

  • Conference paper
Sparse Grids and Applications - Miami 2016

Part of the book series: Lecture Notes in Computational Science and Engineering (LNCSE, volume 123)

Abstract

Although sparse grid least squares regression algorithms have frequently been used over the last 15 years to tackle big-data problems with huge numbers of input points, a thorough theoretical analysis of their stability properties, their error decay behavior, and appropriate couplings between the dataset size and the grid size has not yet been provided. In this paper, we present a framework that allows us to close this gap and to rigorously derive upper bounds on the expected error of sparse grid least squares regression. Furthermore, we verify that our theoretical convergence results match the rates observed in numerical experiments.
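The page itself carries no details of the method, but the setting the abstract describes can be illustrated in one dimension, where a sparse grid reduces to a hierarchical hat-function basis. The following is a minimal sketch under that simplification (all function names are hypothetical, not taken from the paper): random data sites are drawn, a design matrix of hat-function evaluations is assembled, and the least squares system is solved.

```python
import numpy as np

def hat(level, index, x):
    """Hierarchical hat function phi_{l,i} on [0, 1]:
    centered at index / 2**level, support width 2 / 2**level."""
    return np.maximum(0.0, 1.0 - np.abs(2.0**level * x - index))

def hierarchical_basis(max_level):
    """All (level, odd index) pairs up to max_level: 2**max_level - 1 functions."""
    return [(l, i) for l in range(1, max_level + 1)
            for i in range(1, 2**l, 2)]

def fit_least_squares(x, y, max_level):
    """Solve min_c ||A c - y||_2 with A[n, j] = phi_j(x_n)."""
    basis = hierarchical_basis(max_level)
    A = np.column_stack([hat(l, i, x) for (l, i) in basis])
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
    return basis, coeffs

def evaluate(basis, coeffs, x):
    """Evaluate the fitted expansion sum_j c_j phi_j(x)."""
    return sum(c * hat(l, i, x) for (l, i), c in zip(basis, coeffs))

# Noisy samples of a smooth target at random points in [0, 1].
rng = np.random.default_rng(0)
x_train = rng.uniform(0.0, 1.0, 500)
y_train = np.sin(np.pi * x_train) + 0.01 * rng.standard_normal(500)

basis, coeffs = fit_least_squares(x_train, y_train, max_level=4)

# Root-mean-square error against the noise-free target on a fine grid.
x_test = np.linspace(0.0, 1.0, 1000)
rmse = np.sqrt(np.mean((evaluate(basis, coeffs, x_test)
                        - np.sin(np.pi * x_test))**2))
```

The coupling the abstract refers to is visible here: with 500 samples and only 15 basis functions the normal equations are well conditioned, whereas shrinking the dataset relative to the grid would degrade stability.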


Notes

  1. For higher-order spline bases, a larger choice of s can be exploited here. However, one first needs to prove an analogue of Theorem 4 for the corresponding basis functions.


Acknowledgements

The author was supported by the Sonderforschungsbereich 1060 The Mathematics of Emergent Effects funded by the Deutsche Forschungsgemeinschaft.

Author information


Correspondence to Bastian Bohn.



Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this paper


Cite this paper

Bohn, B. (2018). On the Convergence Rate of Sparse Grid Least Squares Regression. In: Garcke, J., Pflüger, D., Webster, C., Zhang, G. (eds) Sparse Grids and Applications - Miami 2016. Lecture Notes in Computational Science and Engineering, vol 123. Springer, Cham. https://doi.org/10.1007/978-3-319-75426-0_2
