Abstract
We study causal relationships of a known form between random variables. Within a model, we distinguish one or more dependent (endogenous) variables \(\mathbf{Y} = (Y_{1},\ldots,Y_{l})\), \(l \in \mathbb{N}\), which the model explains, and independent (exogenous, explanatory) variables \(\mathbf{X} = (X_{1},\ldots,X_{p})\), \(p \in \mathbb{N}\), which explain or predict the dependent variables by means of the model. Such relationships and models are commonly referred to as regression models.
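As a minimal, hypothetical illustration of this setup (not taken from the chapter), the Python sketch below assumes the simplest linear specification \(\mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \varepsilon\) with a single dependent variable and recovers the coefficients by ordinary least squares; the sample size, coefficients, and data are simulated for illustration only.

```python
import numpy as np

# Hypothetical example: one dependent variable Y explained by p = 2
# exogenous variables X_1, X_2 through a linear regression model.
rng = np.random.default_rng(0)

n, p = 100, 2                         # observations, explanatory variables
X = rng.normal(size=(n, p))           # exogenous variables X_1, ..., X_p
beta_true = np.array([1.5, -0.7])     # assumed "true" coefficients
y = X @ beta_true + rng.normal(scale=0.1, size=n)  # endogenous variable Y

# Ordinary least squares estimate of the coefficient vector beta
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta_hat)                        # should lie close to beta_true
```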
© 2012 Springer-Verlag Berlin Heidelberg
Cite this chapter
Čížek, P. (2012). (Non) Linear Regression Modeling. In: Gentle, J., Härdle, W., Mori, Y. (eds) Handbook of Computational Statistics. Springer Handbooks of Computational Statistics. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21551-3_23
Print ISBN: 978-3-642-21550-6
Online ISBN: 978-3-642-21551-3