Abstract
This chapter discusses several extensions of the classical linear model. We first describe in Sect. 4.1 the general linear model and its applications. This model allows for correlated errors and heteroscedastic error variances. Section 4.2 discusses several techniques for regularizing the least squares estimator. Such regularization may be useful when the design matrix is highly collinear or even rank deficient. Moreover, regularization techniques allow for built-in variable selection. Section 4.4 describes Bayesian linear models as an alternative to the frequentist linear model framework. In modern statistics, Bayesian approaches have become increasingly important and widely used.
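As a brief illustration of the built-in variable selection mentioned above, the following is a minimal sketch, not taken from the chapter: it uses scikit-learn on simulated data, and the penalty strength alpha = 0.5 is an illustrative choice rather than a recommendation. The L1 penalty of the lasso shrinks the coefficients of irrelevant covariates exactly to zero, so selection happens as a by-product of estimation; ridge regression, by contrast, shrinks coefficients without setting any of them exactly to zero.

```python
# Minimal sketch (not from the chapter): lasso variable selection vs. OLS
# on simulated data; alpha = 0.5 is chosen purely for demonstration.
import numpy as np
from sklearn.linear_model import Lasso, LinearRegression

rng = np.random.default_rng(0)
n, p = 100, 10
X = rng.normal(size=(n, p))
beta = np.array([3.0, -2.0, 1.5] + [0.0] * (p - 3))  # only 3 covariates matter
y = X @ beta + rng.normal(size=n)

ols = LinearRegression().fit(X, y)
lasso = Lasso(alpha=0.5).fit(X, y)

print("OLS coefficients:  ", np.round(ols.coef_, 2))    # all ten nonzero
print("Lasso coefficients:", np.round(lasso.coef_, 2))  # irrelevant ones set to 0
```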
Copyright information
© 2013 Springer-Verlag Berlin Heidelberg
About this chapter
Cite this chapter
Fahrmeir, L., Kneib, T., Lang, S., Marx, B. (2013). Extensions of the Classical Linear Model. In: Regression. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-34333-9_4
DOI: https://doi.org/10.1007/978-3-642-34333-9_4
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-34332-2
Online ISBN: 978-3-642-34333-9
eBook Packages: Mathematics and Statistics (R0)