C-LASSO Estimator for Generalized Additive Logistic Regression Based on B-Spline

  • Pakize Taylan
  • Gerhard Wilhelm Weber


The Generalized Additive Logistic Regression Model (GALRM) is an important nonparametric regression model. It can be used for binary classification, or for predicting the probability of a binary outcome, by combining generalized additive models, well-known modern techniques from statistical learning, with the penalized log-likelihood criterion. In this chapter, we develop an estimation procedure for GALRM based on B-splines and the Least Absolute Shrinkage and Selection Operator (LASSO). Unlike traditional solutions, we express the LASSO problem as a conic quadratic optimization problem, a well-structured convex optimization program, and solve it with highly efficient interior point methods.
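The chapter itself solves the LASSO problem as a conic quadratic program via interior point methods; the details are not given in this abstract. As a minimal numerical illustration of the underlying idea, an ℓ1-penalized logistic log-likelihood producing sparse coefficients, the sketch below uses proximal gradient descent (ISTA) with soft-thresholding instead. This is a different, simpler solver than the chapter's conic formulation, and all function names and parameter values here are illustrative assumptions, not the authors' method.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of the l1 norm: sign(z) * max(|z| - t, 0).
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_logistic(X, y, lam=0.05, step=0.1, n_iter=500):
    """Sparse logistic regression: minimize the average negative
    log-likelihood plus lam * ||beta||_1 by proximal gradient (ISTA)."""
    n, p = X.shape
    beta = np.zeros(p)
    for _ in range(n_iter):
        p_hat = 1.0 / (1.0 + np.exp(-X @ beta))  # predicted probabilities
        grad = X.T @ (p_hat - y) / n             # gradient of the NLL term
        # gradient step on the smooth part, then the l1 proximal step
        beta = soft_threshold(beta - step * grad, step * lam)
    return beta

# Toy data: only the first two features drive the outcome.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = (X[:, 0] - X[:, 1] + 0.1 * rng.normal(size=200) > 0).astype(float)
beta = lasso_logistic(X, y)
```

The ℓ1 penalty shrinks the coefficients of irrelevant features toward zero while keeping the informative ones, which is the variable-selection behavior that motivates the LASSO in the GALRM setting.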



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Science Faculty, Dicle University, Diyarbakır, Turkey
  2. Faculty of Management Engineering, Poznan University of Technology, Poznan, Poland
  3. IAM, METU, Ankara, Turkey