Abstract
The generalized additive logistic regression model (GALRM) is an important nonparametric regression model. It can be used for binary classification, or for estimating the probability of a binary outcome, by combining generalized additive models, well-known modern techniques from statistical learning, with the penalized log-likelihood criterion. In this chapter, we develop an estimation procedure for GALRM based on B-splines and the Least Absolute Shrinkage and Selection Operator (LASSO). Unlike traditional solutions, we express the LASSO problem as a conic quadratic optimization problem, a well-structured convex optimization program, and solve it with highly efficient interior-point methods.
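The ingredients named in the abstract (a B-spline basis for a smooth component and an L1 penalty on its coefficients, fitted under the penalized log-likelihood criterion) can be illustrated with a minimal sketch. Two caveats: the chapter's actual method recasts the LASSO problem as a conic quadratic program solved by interior-point methods, whereas the sketch below substitutes a simple proximal-gradient (ISTA) solver for the same convex objective; and all function names, knot choices, and parameter values here are illustrative assumptions, not taken from the chapter.

```python
import numpy as np
from scipy.interpolate import BSpline

def bspline_design(x, n_basis=10, degree=3):
    """Design matrix of cubic B-spline basis functions evaluated at x,
    built on a uniform knot grid extended beyond the data range."""
    lo, hi = x.min(), x.max()
    inner = np.linspace(lo, hi, n_basis - degree + 1)
    h = inner[1] - inner[0]
    t = np.r_[lo - h * np.arange(degree, 0, -1), inner,
              hi + h * np.arange(1, degree + 1)]
    cols = []
    for i in range(n_basis):
        b = BSpline.basis_element(t[i:i + degree + 2], extrapolate=False)
        cols.append(np.nan_to_num(b(x)))  # zero outside each element's support
    return np.column_stack(cols)

def penalized_nll(beta, X, y, lam):
    """Penalized log-likelihood criterion: mean negative log-likelihood
    of the logistic model plus a LASSO (L1) penalty."""
    eta = X @ beta
    return np.mean(np.logaddexp(0.0, eta) - y * eta) + lam * np.abs(beta).sum()

def fit_lasso_logistic(X, y, lam, n_iter=500):
    """L1-penalized logistic regression via proximal gradient (ISTA).
    NOTE: the chapter instead solves a conic quadratic reformulation with
    interior-point methods; ISTA is only a simple, dependency-free stand-in."""
    n, p = X.shape
    beta = np.zeros(p)
    step = 4.0 * n / np.linalg.norm(X, 2) ** 2  # 1/L for the logistic loss
    for _ in range(n_iter):
        p_hat = 1.0 / (1.0 + np.exp(-(X @ beta)))
        grad = X.T @ (p_hat - y) / n
        z = beta - step * grad
        # Soft-thresholding: the proximal operator of the L1 penalty;
        # it sets small coefficients exactly to zero (variable selection).
        beta = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return beta

# Demo: one smooth additive component driving a binary response.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 300)
p_true = 1.0 / (1.0 + np.exp(-3.0 * np.sin(2.0 * np.pi * x)))
y = rng.binomial(1, p_true).astype(float)
X = bspline_design(x)
beta = fit_lasso_logistic(X, y, lam=0.01)
```

The soft-thresholding step is what gives LASSO its exact-zero coefficients; an interior-point conic solver, as used in the chapter, minimizes the same penalized criterion and so reaches the same estimator, but with the polynomial-time guarantees of well-structured convex programming.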
Copyright information
© 2019 Springer Nature Switzerland AG
About this chapter
Cite this chapter
Taylan, P., Weber, G.W. (2019). C-LASSO Estimator for Generalized Additive Logistic Regression Based on B-Spline. In: García Márquez, F., Lev, B. (eds) Data Science and Digital Business. Springer, Cham. https://doi.org/10.1007/978-3-319-95651-0_10
DOI: https://doi.org/10.1007/978-3-319-95651-0_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-95650-3
Online ISBN: 978-3-319-95651-0
eBook Packages: Business and Management (R0)