Abstract
This chapter previews the book, but it is presented in a rather abstract setting and will be easier to follow after reading the rest of the book. The reader may omit this chapter on a first reading and refer back to it as needed. Chapters 2 to 9 consider multiple linear regression and experimental design models fit with least squares. Chapter 1 is useful for extending several techniques used in those chapters, such as response plots and plots for response transformations, to alternative fitting methods and to alternative regression models. Chapter 13 illustrates some of these extensions for the generalized linear model (GLM) and the generalized additive model (GAM).
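A response plot graphs the fitted values (horizontal axis) against the response Y (vertical axis); for a well-fitting multiple linear regression model, the points scatter about the identity line. The following is a minimal sketch of the idea using simulated data (the sample size, coefficients, and noise level below are illustrative assumptions, not values from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
# Design matrix with an intercept column and three predictors.
X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
beta = np.array([1.0, 2.0, -1.0, 0.5])            # assumed true coefficients
Y = X @ beta + rng.normal(scale=0.5, size=n)      # linear model plus noise

# Least squares fit, as used in Chapters 2 to 9.
beta_hat, *_ = np.linalg.lstsq(X, Y, rcond=None)
fitted = X @ beta_hat                             # horizontal axis of the plot

# If the model fits, (fitted, Y) hugs the identity line, so the
# correlation between the fitted values and the response is near 1.
r = np.corrcoef(fitted, Y)[0, 1]
print(r > 0.9)
```

In practice one would scatterplot `fitted` against `Y` and overlay the identity line; the correlation check above is just a numerical stand-in for that visual diagnostic.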
© 2017 Springer International Publishing AG
Cite this chapter
Olive, D.J. (2017). Introduction. In: Linear Regression. Springer, Cham. https://doi.org/10.1007/978-3-319-55252-1_1
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-55250-7
Online ISBN: 978-3-319-55252-1
eBook Packages: Mathematics and Statistics (R0)