
(Non) Linear Regression Modeling


Part of the book series: Springer Handbooks of Computational Statistics ((SHCS))

Abstract

We will study causal relationships of a known form between random variables. Given a model, we distinguish one or more dependent (endogenous) variables \(\mathbf{Y} = (Y_1,\ldots,Y_l), l \in \mathbb{N}\), which are explained by the model, and independent (exogenous, explanatory) variables \(\mathbf{X} = (X_1,\ldots,X_p), p \in \mathbb{N}\), which explain or predict the dependent variables by means of the model. Such relationships and models are commonly referred to as regression models.
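As a minimal illustration of this setup (not code from the chapter), the sketch below simulates a dependent variable \(Y\) generated by a linear model in two explanatory variables and recovers the coefficients by ordinary least squares; all names and the simulated data are assumptions made for the example.

```python
import numpy as np

# Illustrative linear regression model: Y = b0 + b1*X1 + b2*X2 + noise,
# with Y the dependent (endogenous) variable and X1, X2 the
# independent (exogenous, explanatory) variables.
rng = np.random.default_rng(0)
n = 200
X = rng.normal(size=(n, 2))                     # explanatory variables X1, X2
beta_true = np.array([1.0, 2.0, -0.5])          # intercept b0 and slopes b1, b2
Xd = np.column_stack([np.ones(n), X])           # design matrix with intercept column
Y = Xd @ beta_true + 0.1 * rng.normal(size=n)   # dependent variable with noise

# Ordinary least squares: minimize ||Y - Xd @ beta||^2 over beta.
beta_hat, *_ = np.linalg.lstsq(Xd, Y, rcond=None)
print(beta_hat)  # estimates close to the true coefficients
```

With 200 observations and a small noise level, the least-squares estimate `beta_hat` recovers `beta_true` closely; the nonlinear case discussed in the chapter replaces the design matrix by a known nonlinear function of the parameters.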



Author information

Correspondence to Pavel Čížek.


Copyright information

© 2012 Springer-Verlag Berlin Heidelberg


Cite this chapter

Čížek, P. (2012). (Non) Linear Regression Modeling. In: Gentle, J., Härdle, W., Mori, Y. (eds.) Handbook of Computational Statistics. Springer Handbooks of Computational Statistics. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-21551-3_23
