
Introduction

Chapter in: Linear Regression

Abstract

This chapter provides a preview of the book but is presented in a rather abstract setting and will be easier to follow after reading the rest of the book. The reader may omit this chapter on first reading and refer back to it as necessary. Chapters 2 to 9 consider multiple linear regression and experimental design models fit with least squares. Chapter 1 is useful for extending several techniques, such as response plots and plots for response transformations used in those chapters, to alternative fitting methods and to alternative regression models. Chapter 13 illustrates some of these extensions for the generalized linear model (GLM) and the generalized additive model (GAM).
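The abstract mentions response plots, which the book extends from least squares to alternative fitting methods and regression models. As a hedged illustration (not taken from the chapter itself): a response plot graphs the fitted values on the horizontal axis against the response on the vertical axis, with the identity line added; for a well-fitting multiple linear regression model the plotted points scatter about the identity line. A minimal sketch with NumPy on simulated data, where the variable names and simulation setup are this example's own assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 100, 3

# Simulated predictors (with an intercept column) and a linear response.
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
beta = np.array([1.0, 2.0, -1.0, 0.5])
y = X @ beta + rng.normal(scale=0.5, size=n)

# Ordinary least squares fit; the fitted values are yhat = X b.
b, *_ = np.linalg.lstsq(X, y, rcond=None)
yhat = X @ b

# The response plot is the scatter of (yhat, y) with the identity line
# y = yhat overlaid; under a good fit the points cluster about that line,
# so the correlation between yhat and y is close to 1.
corr = np.corrcoef(yhat, y)[0, 1]
print(round(float(corr), 3))
```

In practice the pairs `(yhat, y)` would be passed to a plotting routine with the identity line overlaid; the strong correlation here simply summarizes numerically what the plot shows visually.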




Copyright information

© 2017 Springer International Publishing AG

Cite this chapter

Olive, D.J. (2017). Introduction. In: Linear Regression. Springer, Cham. https://doi.org/10.1007/978-3-319-55252-1_1
