Performance Analysis of Updating-QR Supported OLS Against Stochastic Gradient Descent

  • Conference paper
  • First Online:
Intelligent Systems Technologies and Applications

Part of the book series: Advances in Intelligent Systems and Computing (AISC, volume 384)

Abstract

Regression is a well-studied method for the prediction of real-valued data. Depending on the structure of the data involved, different approaches have been adopted for estimating the parameters, including linear equation solvers, gradient descent, the Least Absolute Shrinkage and Selection Operator (LASSO), and the like. The performance of each varies with the data size and the computation involved, and several techniques, such as QR factorization for the least squares problem, have been introduced to improve it. Our focus is on analyzing the performance of gradient descent and QR-based ordinary least squares (OLS) for estimating and updating the parameters under varying data sizes, considering both tall/skinny and short/fat matrices. We have implemented the Block Householder method of QR factorization on the Compute Unified Device Architecture (CUDA) platform using a GTX 645 Graphics Processing Unit (GPU) for the initial set of data. Newly arriving data is applied directly to the existing Q and R factors rather than recomputing the QR factorization from scratch, and this updating-QR platform is then used to perform regression analysis on-the-fly. The results are compared against a gradient descent implementation. They show that the parallel-QR method for regression analysis achieves a speed-up of up to 22x over gradient descent when the attribute size is larger than the sample size, and of up to 2x when the sample size is larger than the attribute size. Our implementation results also show that the updating-QR method achieves a speed-up approaching 2x over gradient descent on large datasets when the sample size is less than the attribute size.
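To make the updating-QR step concrete, the sketch below shows the underlying linear algebra in plain NumPy: after a thin QR factorization of the initial data block, each newly arriving sample is absorbed into the triangular factor R and the projected target z = Q^T y with Givens rotations, so the OLS estimate can be refreshed by a single triangular solve. This is only a minimal CPU-side illustration of the general row-update technique, not the authors' CUDA Block Householder implementation, and all function names here are illustrative.

```python
import numpy as np

def qr_ols_init(X, y):
    """Thin QR of the initial data block; keep only R and z = Q^T y,
    which is all that is needed to solve R @ beta = z for OLS."""
    Q, R = np.linalg.qr(X)                      # X: (m, n) with m >= n for the initial block
    return R, Q.T @ y

def qr_ols_update(R, z, x_new, y_new):
    """Fold one new sample (x_new, y_new) into R and z with Givens
    rotations instead of refactorizing the whole data matrix."""
    n = R.shape[0]
    R = np.vstack([R, x_new[np.newaxis, :]])    # working array, shape (n+1, n)
    z = np.append(z, y_new)
    for j in range(n):
        a, b = R[j, j], R[n, j]
        r = np.hypot(a, b)
        if r == 0.0:                            # nothing to rotate in this column
            continue
        c, s = a / r, b / r
        # Rotate rows j and n of the augmented system [R | z]
        Rj, Rn = R[j, j:].copy(), R[n, j:].copy()
        R[j, j:] = c * Rj + s * Rn
        R[n, j:] = -s * Rj + c * Rn
        zj, zn = z[j], z[n]
        z[j], z[n] = c * zj + s * zn, -s * zj + c * zn
    return R[:n, :], z[:n]                      # discard the zeroed-out extra row

def qr_ols_solve(R, z):
    """Back-substitute the upper-triangular system R @ beta = z."""
    return np.linalg.solve(R, z)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X0, y0 = rng.normal(size=(100, 5)), rng.normal(size=100)
    R, z = qr_ols_init(X0, y0)                  # factorize the initial block once
    for x_new, y_new in zip(rng.normal(size=(20, 5)), rng.normal(size=20)):
        R, z = qr_ols_update(R, z, x_new, y_new)
    beta = qr_ols_solve(R, z)                   # OLS estimate over all 120 samples
```

In use, the initial block is factorized once and the update is called per new observation; only the n-by-n factor R and the n-vector z need to be retained, regardless of how many samples have already been processed.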



Author information

Correspondence to Remya R. K. Menon.

Copyright information

© 2016 Springer International Publishing Switzerland

About this paper

Cite this paper

Menon, R.R.K., Namitha, K. (2016). Performance Analysis of Updating-QR Supported OLS Against Stochastic Gradient Descent. In: Berretti, S., Thampi, S., Srivastava, P. (eds) Intelligent Systems Technologies and Applications. Advances in Intelligent Systems and Computing, vol 384. Springer, Cham. https://doi.org/10.1007/978-3-319-23036-8_25

  • DOI: https://doi.org/10.1007/978-3-319-23036-8_25

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-23035-1

  • Online ISBN: 978-3-319-23036-8

  • eBook Packages: Engineering, Engineering (R0)
