
C-LASSO Estimator for Generalized Additive Logistic Regression Based on B-Spline

  • Chapter
Data Science and Digital Business

Abstract

The Generalized Additive Logistic Regression Model (GALRM) is an important nonparametric regression model. It can be used for binary classification, or for predicting the probability of a binary outcome, by combining generalized additive models, well-known modern techniques from statistical learning, with the penalized log-likelihood criterion. In this chapter, we develop an estimation procedure for GALRM based on B-splines and the Least Absolute Shrinkage and Selection Operator (LASSO). Unlike traditional solutions, we express the LASSO problem as a conic quadratic optimization problem, a well-structured convex optimization program, and solve it with highly efficient interior point methods.
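To make the abstract's setup concrete, the following is a minimal illustrative sketch, not the chapter's method: it fits a one-predictor additive logistic model on a hand-rolled cubic B-spline basis with an L1 penalty on the spline coefficients. Where the chapter reformulates the LASSO problem as a conic quadratic program solved by interior point methods, this sketch deliberately substitutes a simpler proximal gradient (ISTA) solver so it stays numpy-only; all function names and the toy data are our own assumptions, and the intercept is penalized too for simplicity.

```python
import numpy as np

def bspline_basis(x, knots, degree=3):
    """B-spline basis via the Cox-de Boor recursion on a clamped knot vector."""
    t = np.r_[[knots[0]] * degree, knots, [knots[-1]] * degree]
    m = len(t) - 1
    # Degree-0 basis: indicator of each knot interval.
    B = np.array([(t[j] <= x) & (x < t[j + 1]) for j in range(m)], float).T
    last = np.max(np.nonzero(t[1:] > t[:-1]))      # last non-empty interval
    B[x == t[-1], last] = 1.0                      # include the right endpoint
    for d in range(1, degree + 1):
        Bn = np.zeros((len(x), m - d))
        for j in range(m - d):
            if t[j + d] > t[j]:
                Bn[:, j] += (x - t[j]) / (t[j + d] - t[j]) * B[:, j]
            if t[j + d + 1] > t[j + 1]:
                Bn[:, j] += (t[j + d + 1] - x) / (t[j + d + 1] - t[j + 1]) * B[:, j + 1]
        B = Bn
    return B

def soft_threshold(z, a):
    return np.sign(z) * np.maximum(np.abs(z) - a, 0.0)

def logistic_loss(X, y, beta):
    """Numerically stable mean negative log-likelihood of the logistic model."""
    eta = X @ beta
    return np.mean(np.log1p(np.exp(-np.abs(eta))) + np.maximum(eta, 0) - y * eta)

def fit_lasso_logistic(X, y, lam, iters=2000):
    """L1-penalized logistic regression via proximal gradient (ISTA),
    standing in for the chapter's interior-point conic solver."""
    n, p = X.shape
    beta = np.zeros(p)
    L = np.linalg.norm(X, 2) ** 2 / (4 * n)        # Lipschitz constant of the gradient
    for _ in range(iters):
        prob = 1.0 / (1.0 + np.exp(-(X @ beta)))
        beta = soft_threshold(beta - X.T @ (prob - y) / (n * L), lam / L)
    return beta

# Toy one-predictor example: y ~ Bernoulli(sigmoid(2 sin(2 pi x))).
rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 400)
p_true = 1.0 / (1.0 + np.exp(-2.0 * np.sin(2 * np.pi * x)))
y = (rng.uniform(size=400) < p_true).astype(float)

knots = np.linspace(0, 1, 8)
B = bspline_basis(x, knots, degree=3)              # 10 cubic basis functions
X = np.c_[np.ones(len(x)), B]                      # intercept + spline columns

lam = 0.01
beta = fit_lasso_logistic(X, y, lam)
loss0 = logistic_loss(X, y, np.zeros(X.shape[1]))
loss1 = logistic_loss(X, y, beta)
print(f"log-loss: {loss0:.3f} -> {loss1:.3f}")
```

The clamped basis satisfies the B-spline partition of unity (each row of `B` sums to 1), and the penalized fit drives the log-loss below its value at the zero coefficient vector; swapping in a conic-program solver as the chapter does would change only `fit_lasso_logistic`.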



Author information


Corresponding author

Correspondence to Pakize Taylan.


Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Taylan, P., Weber, G.W. (2019). C-LASSO Estimator for Generalized Additive Logistic Regression Based on B-Spline. In: García Márquez, F., Lev, B. (eds) Data Science and Digital Business. Springer, Cham. https://doi.org/10.1007/978-3-319-95651-0_10
