Abstract
The multiple regression model assumes that the explanatory variables are (i) independent of the error term and (ii) linearly independent of one another. This chapter examines what happens when these assumptions fail. When the first assumption is violated, the explanatory variables are correlated with the error term; the ordinary least squares estimators are then no longer consistent, and another estimator, the instrumental variables estimator, must be used. When the second assumption is violated, the explanatory variables are not linearly independent: they are collinear. Finally, the chapter turns to a third problem related to the explanatory variables, namely the stability of the estimated model.
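As an illustration of the first problem, the sketch below (not from the chapter; a minimal simulation with made-up coefficients, using numpy) shows the OLS slope drifting away from its true value when a regressor is correlated with the error term, while the simple instrumental variables estimator \((\boldsymbol{Z}^{\prime}\boldsymbol{X})^{-1}\boldsymbol{Z}^{\prime}\boldsymbol{y}\) recovers it:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
z = rng.normal(size=n)                        # instrument: correlated with x, not with u
u = rng.normal(size=n)                        # error term
x = 0.8 * z + 0.5 * u + rng.normal(size=n)    # x is endogenous: correlated with u
y = 2.0 * x + u                               # true slope is 2.0

X = np.column_stack([np.ones(n), x])
Z = np.column_stack([np.ones(n), z])

beta_ols = np.linalg.solve(X.T @ X, X.T @ y)  # OLS: inconsistent for the slope
beta_iv = np.linalg.solve(Z.T @ X, Z.T @ y)   # IV estimator (Z'X)^{-1} Z'y

# beta_ols[1] is pulled above 2.0 by the x-u correlation; beta_iv[1] stays near 2.0
```

Here `z`, the coefficient values, and the sample size are all illustrative choices, not taken from the text.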
Notes
1. In the case where it is the explained variable that is observed with error, the OLS estimator remains consistent and unbiased: the measurement error is simply absorbed into the disturbance term.
2. Of course, this example is purely illustrative, in the sense that only six observations are considered.
3. The demonstration is given in the appendix to this chapter.
4. This is only possible if the model does not include a constant term.
5. Note that a dummy variable is assigned to each quarter, which requires that no constant term be introduced into the regression. We could also have written the model with a constant term and only three dummy variables.
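The point in note 5 about the two equivalent dummy-variable parameterizations can be checked numerically. The sketch below (illustrative quarterly data, using numpy; not from the chapter) fits both versions and maps one set of coefficients onto the other:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 40                                     # 10 years of quarterly data
quarter = np.tile([0, 1, 2, 3], n // 4)
y = np.array([10.0, 12.0, 9.0, 11.0])[quarter] + rng.normal(size=n)

# Version 1: one dummy per quarter, no constant term
D = np.eye(4)[quarter]
beta_full = np.linalg.lstsq(D, y, rcond=None)[0]   # the four quarterly means

# Version 2: constant term plus dummies for quarters 2-4 only
X = np.column_stack([np.ones(n), D[:, 1:]])
beta_const = np.linalg.lstsq(X, y, rcond=None)[0]

# Same fitted values: beta_full[0] equals the constant, and
# beta_full[j] = beta_const[0] + beta_const[j] for j = 1, 2, 3
```

Including both a constant and all four dummies would make the regressors perfectly collinear, which is exactly why one of the two must be dropped.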
Appendix: Demonstration of the Formula for Constrained Least Squares Estimators
In order to determine the constrained least squares estimator, we need to solve the program minimizing the sum of squared residuals:

$$\min_{\hat{\boldsymbol{\beta}}_{0}}\left( \boldsymbol{Y}-\boldsymbol{X}\hat{\boldsymbol{\beta}}_{0}\right)^{\prime }\left( \boldsymbol{Y}-\boldsymbol{X}\hat{\boldsymbol{\beta}}_{0}\right)$$

under the constraint: \(\boldsymbol {R\hat {\beta }}_{0}=\boldsymbol {r}\).

We define the Lagrange function:

$$\mathcal{L}=\left( \boldsymbol{Y}-\boldsymbol{X}\hat{\boldsymbol{\beta}}_{0}\right)^{\prime }\left( \boldsymbol{Y}-\boldsymbol{X}\hat{\boldsymbol{\beta}}_{0}\right) +2\boldsymbol{\lambda }^{\prime }\left( \boldsymbol{R}\hat{\boldsymbol{\beta}}_{0}-\boldsymbol{r}\right)$$

where \(\boldsymbol {\lambda }\) is a column vector formed by the q Lagrange multipliers. We calculate the partial derivatives:

$$\frac{\partial \mathcal{L}}{\partial \hat{\boldsymbol{\beta}}_{0}}=-2\boldsymbol{X}^{\prime }\boldsymbol{Y}+2\boldsymbol{X}^{\prime }\boldsymbol{X}\hat{\boldsymbol{\beta}}_{0}+2\boldsymbol{R}^{\prime }\boldsymbol{\lambda }$$

and:

$$\frac{\partial \mathcal{L}}{\partial \boldsymbol{\lambda }}=2\left( \boldsymbol{R}\hat{\boldsymbol{\beta}}_{0}-\boldsymbol{r}\right)$$

Canceling these partial derivatives, we have:

$$\boldsymbol{X}^{\prime }\boldsymbol{X}\hat{\boldsymbol{\beta}}_{0}=\boldsymbol{X}^{\prime }\boldsymbol{Y}-\boldsymbol{R}^{\prime }\boldsymbol{\lambda } \quad (5.123)$$

and:

$$\boldsymbol{R}\hat{\boldsymbol{\beta}}_{0}=\boldsymbol{r}$$

Let us multiply each member of (5.123) by \(\boldsymbol {R}\left ( \boldsymbol {X}^{\prime }\boldsymbol {X}\right )^{-1}\):

$$\boldsymbol{R}\hat{\boldsymbol{\beta}}_{0}=\boldsymbol{R}\hat{\boldsymbol{\beta}}-\boldsymbol{R}\left( \boldsymbol{X}^{\prime }\boldsymbol{X}\right)^{-1}\boldsymbol{R}^{\prime }\boldsymbol{\lambda }$$

Hence, since \(\boldsymbol {R}\hat {\boldsymbol {\beta }}_{0}=\boldsymbol {r}\):

$$\boldsymbol{\lambda }=\left[ \boldsymbol{R}\left( \boldsymbol{X}^{\prime }\boldsymbol{X}\right)^{-1}\boldsymbol{R}^{\prime }\right]^{-1}\left( \boldsymbol{R}\hat{\boldsymbol{\beta}}-\boldsymbol{r}\right)$$

with \(\hat {\boldsymbol {\beta }}=\left ( \boldsymbol {X}^{\prime }\boldsymbol {X}\right )^{-1}\boldsymbol {X}^{\prime }\boldsymbol {Y}\) denoting the OLS estimator of the unconstrained model. It is then sufficient to replace \(\boldsymbol {\lambda }\) by its value in (5.123):

$$\boldsymbol{X}^{\prime }\boldsymbol{X}\hat{\boldsymbol{\beta}}_{0}=\boldsymbol{X}^{\prime }\boldsymbol{Y}-\boldsymbol{R}^{\prime }\left[ \boldsymbol{R}\left( \boldsymbol{X}^{\prime }\boldsymbol{X}\right)^{-1}\boldsymbol{R}^{\prime }\right]^{-1}\left( \boldsymbol{R}\hat{\boldsymbol{\beta}}-\boldsymbol{r}\right)$$

Hence:

$$\hat{\boldsymbol{\beta}}_{0}=\hat{\boldsymbol{\beta}}-\left( \boldsymbol{X}^{\prime }\boldsymbol{X}\right)^{-1}\boldsymbol{R}^{\prime }\left[ \boldsymbol{R}\left( \boldsymbol{X}^{\prime }\boldsymbol{X}\right)^{-1}\boldsymbol{R}^{\prime }\right]^{-1}\left( \boldsymbol{R}\hat{\boldsymbol{\beta}}-\boldsymbol{r}\right)$$

which defines the constrained least squares estimator.
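The final formula can be verified numerically. The sketch below (illustrative data and restriction, using numpy; not part of the chapter) computes the constrained estimator and checks that it satisfies the constraint exactly:

```python
import numpy as np

rng = np.random.default_rng(0)
n, k = 100, 3
X = rng.normal(size=(n, k))
Y = X @ np.array([1.0, 2.0, 3.0]) + rng.normal(size=n)  # true coefficients (1, 2, 3)

# Constraint R beta = r: here a single restriction, beta_1 + beta_2 = 3 (true here)
R = np.array([[1.0, 1.0, 0.0]])
r = np.array([3.0])

XtX_inv = np.linalg.inv(X.T @ X)
beta_ols = XtX_inv @ X.T @ Y                  # unconstrained OLS estimator

# beta_cls = beta_ols - (X'X)^{-1} R' [R (X'X)^{-1} R']^{-1} (R beta_ols - r)
middle = np.linalg.inv(R @ XtX_inv @ R.T)
beta_cls = beta_ols - XtX_inv @ R.T @ middle @ (R @ beta_ols - r)

assert np.allclose(R @ beta_cls, r)           # the constraint holds exactly
```

By construction, premultiplying `beta_cls` by `R` cancels the correction term and returns `r`, which mirrors the last step of the derivation above.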
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this chapter
Mignon, V. (2024). Problems with Explanatory Variables: Random Variables, Collinearity, and Instability. In: Principles of Econometrics. Classroom Companion: Economics. Springer, Cham. https://doi.org/10.1007/978-3-031-52535-3_5
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-52534-6
Online ISBN: 978-3-031-52535-3
eBook Packages: Economics and Finance, Economics and Finance (R0)