Testing spatial regression models under nonregular conditions

Published in: Empirical Economics

Abstract

In the time series context, estimation and testing issues for autoregressive and moving average (ARMA) models are well understood. Similar issues for spatial ARMA models of the regression disturbances, however, remain largely unexplored. In this paper, we discuss the problem of testing for no spatial dependence in the disturbances against the alternative of a spatial ARMA process, incorporating the possible presence of spatial dependence in the dependent variable. The difficulties in conducting such a test are twofold. First, under the null hypothesis, the nuisance parameter is not identified, resulting in a singular information matrix (IM), a nonregular case in statistical inference. To account for the singular IM, we follow Davies (Biometrika 64(2):247–254, 1977; Biometrika 74(1):33–43, 1987) and propose a test procedure based on the supremum of the Rao score test statistic. Second, the possible presence of spatial lag dependence has an adverse effect on the performance of the test. Using the general test procedure of Bera and Yoon (Econom Theory 9:649–658, 1993) under local misspecification, we avoid explicit estimation of the spatial autoregressive parameter; our suggested tests are thus based entirely on ordinary least squares estimation. The tests suggested here can be viewed as a generalization of Anselin et al. (Reg Sci Urban Econ 26:77–104, 1996). We conduct Monte Carlo simulations to investigate the finite sample properties of the proposed tests. The simulation results show that our tests have good finite sample size and power properties compared to other tests in the literature. We also illustrate the application of our tests with several data sets.


References

  • Anselin L (1988) Spatial econometrics: methods and models. Kluwer Academic Publishers, Dordrecht, The Netherlands

  • Anselin L (2003) Spatial externalities, spatial multipliers, and spatial econometrics. Int Reg Sci Rev 26(2):153–166

  • Anselin L, Bera AK, Florax R, Yoon MJ (1996) Simple diagnostic tests for spatial dependence. Reg Sci Urban Econ 26:77–104

  • Baltagi B, Liu L (2011) An improved generalized moments estimator for a spatial moving average error model. Econ Lett 113:282–284

  • Behrens K, Ertur C, Koch W (2012) ‘Dual’ gravity: Using spatial econometrics to control for multilateral resistance. J Appl Econom 27:773–794

  • Bera AK, Yoon MJ (1993) Specification testing with locally misspecified alternatives. Econom Theory 9:649–658

  • Davidson R, MacKinnon JG (1987) Implicit alternatives and the local power of test statistics. Econometrica 55:1305–1329

  • Davies RB (1977) Hypothesis testing when a nuisance parameter is present only under the alternative. Biometrika 64(2):247–254

  • Davies RB (1987) Hypothesis testing when a nuisance parameter is present only under the alternative. Biometrika 74(1):33–43

  • Fingleton B (2008) A generalized method of moments estimator for a spatial model with moving average errors, with application to real estate prices. Empir Econ 34:35–57

  • Florax RJGM (1992) The university: a regional booster? Economic impacts of academic knowledge infrastructure. Aldershot, Hants, England and Brookfield, VT, USA

  • Harrison D, Rubinfeld DL (1978) Hedonic housing prices and the demand for clean air. J Environ Econ Manag 5:81–102

  • Kelejian HH, Prucha IR (2001) On the asymptotic distribution of the Moran I test statistic with applications. J Econom 104(2):219–257

  • Lam C, Souza PCL (2013) Regularization for spatial panel time series using adaptive Lasso. Mimeo, New York

  • Pace RK, Gilley OW (1997) Using the spatial configuration of the data to improve estimation. J Real Estate Financ Econ 14:333–340

  • Poskitt DS, Tremayne AR (1980) Testing the specification of a fitted autoregressive-moving average model. Biometrika 67(2):359–363

  • Saikkonen P (1989) Asymptotic relative efficiency of the classical tests under misspecification. J Econom 42:351–369

  • Sen M, Bera AK, Kao YH (2012) A Hausman test for spatial regression model. In: Baltagi BH, Hill CR, Newey WK, White HL (eds) Advances in econometrics, vol 29: Essays in honor of Jerry Hausman. Emerald Group Publishing, Bingley, pp 547–559

  • Sharpe K (1978) Some properties of the crossings process generated by a stationary chi-squared process. Adv Appl Probab 10:373–391

  • Silvey SD (1959) The Lagrange multiplier test. Ann Math Stat 30:389–407

  • Yao Q, Brockwell PJ (2005) Gaussian maximum likelihood estimation for ARMA models I: time series. J Time Ser Anal 27(6):857–875

  • Yao Q, Brockwell PJ (2006) Gaussian maximum likelihood estimation for ARMA models II: spatial processes. Bernoulli 12(3):403–429

Corresponding author

Correspondence to Anil K. Bera.

Appendices

A Score functions

The model we consider is

$$\begin{aligned} \begin{aligned} y&=\rho W y+X \beta +u\\ u&=\tau Wu+ \epsilon - \lambda W \epsilon , \end{aligned} \end{aligned}$$
(A.1)

and the log-likelihood function is

$$\begin{aligned} \begin{aligned} l(\theta )=&-\frac{n}{2} \log 2 \pi -\frac{n}{2} \log \sigma ^2 - \frac{1}{2 \sigma ^2} \epsilon ' \epsilon + \log |I-\rho W| \\&+ \log |I-\tau W| + \log |(I-\lambda W)^{-1}|\\ =&\text {Constant} -\frac{n}{2} \log \sigma ^2- \frac{1}{2 \sigma ^2}(Ay-X \beta )' B' C^{-1'} C^{-1} B(Ay-X \beta )\\&+ \log |A| + \log |B|+ \log |C^{-1}|, \end{aligned} \end{aligned}$$
(A.2)

where \(\theta =(\beta ',\sigma ^2,\rho ,\tau ,\lambda )\) and \(A=I-\rho W\), \(B=I-\tau W\), \(C=I-\lambda W\).
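For concreteness, the data-generating process (A.1) can be simulated directly. The sketch below is illustrative only: the lattice size, the rook-contiguity weight matrix \(W\), and all parameter values are our assumptions, not choices made in the paper.

```python
import numpy as np

def rook_weights(side):
    """Row-normalized rook-contiguity weight matrix W for a side x side lattice."""
    n = side * side
    W = np.zeros((n, n))
    for i in range(side):
        for j in range(side):
            for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                ii, jj = i + di, j + dj
                if 0 <= ii < side and 0 <= jj < side:
                    W[i * side + j, ii * side + jj] = 1.0
    return W / W.sum(axis=1, keepdims=True)

def simulate_sarma(W, beta, rho, tau, lam, sigma=1.0, seed=0):
    """Draw (y, X) from (A.1): A y = X beta + u, with B u = C eps."""
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    X = np.column_stack([np.ones(n), rng.normal(size=n)])
    eps = rng.normal(scale=sigma, size=n)
    I = np.eye(n)
    A = I - rho * W          # spatial lag operator on y
    B = I - tau * W          # spatial AR operator on u
    C = I - lam * W          # spatial MA operator on eps
    u = np.linalg.solve(B, C @ eps)       # u = B^{-1} C eps
    y = np.linalg.solve(A, X @ beta + u)  # y = A^{-1}(X beta + u)
    return y, X

W = rook_weights(5)
y, X = simulate_sarma(W, beta=np.array([1.0, 0.5]), rho=0.3, tau=0.2, lam=0.1)
```

Note that under the null \(\tau =\lambda \) the AR and MA factors cancel (\(B=C\)), so \(u=\epsilon \) regardless of the common value.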

The first derivatives are

$$\begin{aligned} \frac{\partial l}{\partial \beta }= & {} \frac{1}{\sigma ^2}X'B'C^{-1'}\epsilon \\ \frac{\partial l}{\partial \sigma ^2}= & {} -\frac{n}{2\sigma ^2}+\frac{1}{2\sigma ^4}\epsilon ' \epsilon \\ \frac{\partial l}{\partial \rho }= & {} -tr(A^{-1}W)+\frac{1}{\sigma ^2}\epsilon 'C^{-1}BWy\\ \frac{\partial l}{\partial \tau }= & {} -tr(B^{-1}W)+\frac{1}{\sigma ^2}\epsilon 'C^{-1}W(Ay-X\beta )\\ \frac{\partial l}{\partial \lambda }= & {} -tr(WC^{-1})-\frac{1}{\sigma ^2}\epsilon 'C^{-1}W\epsilon . \end{aligned}$$

Under \(H_0: \tau =\lambda \), we have \(B=C\) and hence

$$\begin{aligned} \frac{\partial l}{\partial \beta }|_{H_0}= & {} \frac{1}{\sigma ^2}X'\epsilon \\ \frac{\partial l}{\partial \sigma ^2}|_{H_0}= & {} -\frac{n}{2\sigma ^2}+\frac{1}{2\sigma ^4}\epsilon ' \epsilon \\ \frac{\partial l}{\partial \rho }|_{H_0}= & {} -tr(A^{-1}W)+\frac{1}{\sigma ^2}\epsilon 'Wy\\ \frac{\partial l}{\partial \tau }|_{H_0}= & {} -tr(B^{-1}W)+\frac{1}{\sigma ^2}\epsilon 'C^{-1}W(Ay-X\beta )\\ \frac{\partial l}{\partial \lambda }|_{H_0}= & {} -tr(WC^{-1})-\frac{1}{\sigma ^2}\epsilon 'C^{-1}W\epsilon . \end{aligned}$$

Moreover, if we set \(\rho =0\), then \(A=I\) and, under \(H_0\), the above becomes

$$\begin{aligned} \frac{\partial l}{\partial \beta }|_{H_0}= & {} \frac{1}{\sigma ^2}X'\epsilon \\ \frac{\partial l}{\partial \sigma ^2}|_{H_0}= & {} -\frac{n}{2\sigma ^2}+\frac{1}{2\sigma ^4}\epsilon ' \epsilon \\ \frac{\partial l}{\partial \rho }|_{H_0}= & {} -tr(A^{-1}W)+\frac{1}{\sigma ^2}\epsilon 'Wy\\ \frac{\partial l}{\partial \tau }|_{H_0}= & {} -tr(B^{-1}W)+\frac{1}{\sigma ^2}\epsilon 'C^{-1}W\epsilon \\ \frac{\partial l}{\partial \lambda }|_{H_0}= & {} -tr(WC^{-1})-\frac{1}{\sigma ^2}\epsilon 'C^{-1}W\epsilon . \end{aligned}$$

B Information matrix

From the previous section, the second derivatives of the log-likelihood function are

$$\begin{aligned} \frac{\partial ^2 l}{\partial \beta \partial \beta '}= & {} -\frac{1}{\sigma ^2}X'B'C^{-1'}C^{-1}BX\\ \frac{\partial ^2 l}{\partial \beta \partial \sigma ^2}= & {} \frac{\partial ^2 l}{\partial \sigma ^2 \partial \beta '}=-\frac{1}{\sigma ^4}X'\epsilon \\ \frac{\partial ^2 l}{\partial \beta \partial \rho }= & {} \frac{\partial ^2 l}{\partial \rho \partial \beta '}=-\frac{1}{\sigma ^2}X'B'C^{-1'}C^{-1}BWy\\ \frac{\partial ^2 l}{\partial \beta \partial \tau }= & {} \frac{\partial ^2 l}{\partial \tau \partial \beta '}=-\frac{1}{\sigma ^2}X'W'C^{-1'}\epsilon -\frac{1}{\sigma ^2} X'B'C^{-1'}C^{-1}W(Ay-X\beta )\\ \frac{\partial ^2 l}{\partial \beta \partial \lambda }= & {} \frac{\partial ^2 l}{\partial \lambda \partial \beta '}=\frac{1}{\sigma ^2}X'B'C^{-1'}W'C^{-1'}B\epsilon +\frac{1}{\sigma ^2}X'B'C^{-1'}C^{-1}WC^{-1}B(Ay-X\beta )\\ \frac{\partial ^2 l}{\partial (\sigma ^2)^2}= & {} \frac{n}{2\sigma ^4}-\frac{1}{\sigma ^6}\epsilon '\epsilon \\ \frac{\partial ^2 l}{\partial \sigma ^2 \partial \rho }= & {} \frac{\partial ^2 l}{\partial \rho \partial \sigma ^2}=-\frac{1}{\sigma ^4}\epsilon 'C^{-1}BWy\\ \frac{\partial ^2 l}{\partial \sigma ^2 \partial \tau }= & {} \frac{\partial ^2 l}{\partial \tau \partial \sigma ^2}=-\frac{1}{\sigma ^4}\epsilon 'C^{-1}W(Ay-X\beta )\\ \frac{\partial ^2 l}{\partial \sigma ^2 \partial \lambda }= & {} \frac{\partial ^2 l}{\partial \lambda \partial \sigma ^2}=-\frac{1}{\sigma ^4}\epsilon 'C^{-1}W\epsilon \\ \frac{\partial ^2 l}{\partial \rho ^2}= & {} -tr(A^{-1}WA^{-1}W)-\frac{1}{\sigma ^2}y'W'B'C^{-1'}C^{-1}BWy\\ \frac{\partial ^2 l}{\partial \rho \partial \tau }= & {} \frac{\partial ^2 l}{\partial \tau \partial \rho }=-\frac{1}{\sigma ^2}(Ay-X\beta )'W'C^{-1'}C^{-1}BWy -\frac{1}{\sigma ^2}\epsilon 'C^{-1}WWy\\ \frac{\partial ^2 l}{\partial \rho \partial \lambda }= & {} \frac{\partial ^2 l}{\partial \lambda \partial \rho }=\frac{1}{\sigma ^2}(Ay-X\beta )'B'C^{-1'}W'C^{-1'}C^{-1}BWy +\frac{1}{\sigma ^2}\epsilon 'C^{-1}WC^{-1}BWy\\ \frac{\partial ^2 l}{\partial \tau ^2}= & {} -tr(B^{-1}WB^{-1}W)-\frac{1}{\sigma ^2}(Ay-X\beta )'W'C^{-1'} C^{-1}W(Ay-X\beta )\\ \frac{\partial ^2 l}{\partial \tau \partial \lambda }= & {} \frac{\partial ^2 l}{\partial \lambda \partial \tau }=\frac{1}{\sigma ^2}(Ay-X\beta )'B'C^{-1'}W'C^{-1'}C^{-1}W(Ay-X\beta )\\&+\frac{1}{\sigma ^2}\epsilon 'C^{-1}WC^{-1}W(Ay-X\beta )\\ \frac{\partial ^2 l}{\partial \lambda ^2}= & {} -tr(WC^{-1}WC^{-1}) -\frac{1}{\sigma ^2}(Ay-X\beta )'B'C^{-1'}W'C^{-1'}C^{-1}W\epsilon \\&-\frac{1}{\sigma ^2}\epsilon 'C^{-1}WC^{-1}W\epsilon -\frac{1}{\sigma ^2} \epsilon 'C^{-1}WC^{-1}WC^{-1}B(Ay-X\beta ).\\ \end{aligned}$$

Therefore, the information matrix can be derived as

$$\begin{aligned} I(\theta ) = \frac{1}{n \sigma ^2}\left[ \begin{array}{lllll} J_{\beta } &{} J_{\beta \sigma ^2} &{} J_{\beta \rho } &{} J_{\beta \tau } &{} J_{\beta \lambda }\\ J_{\sigma ^2 \beta } &{} J_{\sigma ^2} &{} J_{\sigma ^2 \rho } &{} J_{\sigma ^2 \tau } &{} J_{\sigma ^2 \lambda } \\ J_{\rho \beta } &{} J_{\rho \sigma ^2} &{} J_{\rho } &{} J_{\rho \tau } &{} J_{\rho \lambda } \\ J_{\tau \beta } &{} J_{\tau \sigma ^2} &{} J_{\tau \rho } &{} J_{\tau } &{} J_{\tau \lambda } \\ J_{\lambda \beta } &{} J_{\lambda \sigma ^2} &{} J_{\lambda \rho } &{} J_{\lambda \tau } &{} J_{\lambda } \\ \end{array}\right] , \end{aligned}$$
(B.1)

where

$$\begin{aligned} J_{\beta }= & {} X'B'C^{-1'}C^{-1}BX\\ J_{\beta \sigma ^2}= & {} J_{\sigma ^2 \beta }=J_{\beta \tau }=J_{\tau \beta }=J_{\beta \lambda }=J_{\lambda \beta }=0\\ J_{\beta \rho }= & {} J_{\rho \beta }=X'B'C^{-1'}C^{-1}BA^{-1}WX\beta \\ J_{\sigma ^2}= & {} \frac{n}{2 \sigma ^2}\\ J_{\sigma ^2 \rho }= & {} J_{\rho \sigma ^2}=tr(C^{-1}BWA^{-1}B^{-1}C)\\ J_{\sigma ^2 \tau }= & {} J_{\tau \sigma ^2}=tr(C^{-1}WB^{-1}C)\\ J_{\sigma ^2 \lambda }= & {} J_{\lambda \sigma ^2}=-tr(C^{-1}W)\\ J_{\rho }= & {} \sigma ^2 tr(A^{-1}WA^{-1}W)+\beta 'X'A^{-1'}W'B'C^{-1'}C^{-1}BWA^{-1}X\beta \\&+\sigma ^2 tr(C'B^{-1'}A^{-1'}W'B'C^{-1'}C^{-1}BWA^{-1}B^{-1}C) \\ J_{\rho \tau }= & {} J_{\tau \rho }=\sigma ^2[tr(C'B^{-1'}W'C^{-1'}C^{-1}BWA^{-1}B^{-1}C)+tr(C^{-1}WWA^{-1}B^{-1}C)]\\ J_{\rho \lambda }= & {} J_{\lambda \rho }=-\sigma ^2[tr(C'B^{-1'}W'C^{-1'}C^{-1}BWA^{-1}B^{-1}C)+tr(C^{-1}WC^{-1}BWA^{-1}B^{-1}C)]\\ J_{\tau }= & {} \sigma ^2[tr(C^{-1}BWA^{-1}B^{-1}C)+tr(C'B^{-1'}B'C^{-1'}W'C^{-1'}C^{-1}W)]\\ J_{\tau \lambda }= & {} J_{\lambda \tau }=-\sigma ^2[tr(C'B^{-1'}W'C^{-1'}C^{-1}BWA^{-1}B^{-1}C)+tr(C^{-1}WC^{-1}WC^{-1}BB^{-1}C)]\\ J_{\lambda }= & {} \sigma ^2[tr(C'B^{-1'}B'C^{-1'}W'C^{-1'}C^{-1}W)+tr(C^{-1}WC^{-1}WC^{-1}BB^{-1}C)]. \end{aligned}$$

Under \(H_0:\tau =\lambda \), we have \(B=C\), and hence (B.1) becomes

$$\begin{aligned} I(\theta ) = \frac{1}{n \sigma ^2}\left[ \begin{array}{c|c} J_{11} &{} J_{12} \\ \hline J_{21} &{} J_{22} \\ \end{array}\right] , \end{aligned}$$
(B.2)

where the partition blocks \(J_{11}, J_{12}, J_{21}, J_{22}\) collect the corresponding entries of (B.1) evaluated at \(B=C\).

When \(\tau =\lambda \) and \(\rho =0\), we further have \(A=I\); notice also that \(tr(W)=0\). Therefore, (B.2) reduces to

$$\begin{aligned} I(\theta )=\frac{1}{n \sigma ^2}\left[ \begin{array}{ccccc} X'X &{} 0 &{} X'WX \beta &{} 0 &{} 0 \\ 0 &{} \frac{n}{2 \sigma ^2} &{} 0 &{} tr(C^{-1}W) &{} -tr(C^{-1}W) \\ X'WX \beta &{} 0 &{} J_{\rho } &{} J_{\rho \tau } &{} J_{\rho \lambda } \\ 0 &{} tr(C^{-1}W) &{} J_{\tau \rho } &{} J_{\tau } &{} J_{\tau \lambda } \\ 0 &{} -tr(C^{-1}W) &{} J_{\lambda \rho } &{} J_{\lambda \tau } &{} J_{\lambda } \\ \end{array}\right] , \end{aligned}$$
(B.3)

where

$$\begin{aligned} \begin{aligned} J_{\rho }&= \sigma ^2 [tr(WW)+tr(W' W)]+\beta ' X' W'W X \beta \\ J_{\rho \tau }&= J_{\tau \rho } = \sigma ^2 [tr(W'C^{-1}W) +tr(C^{-1} WW)]\\ J_{\rho \lambda }&= J_{\lambda \rho } = -\sigma ^2 [tr(W'C^{-1}W) +tr(C^{-1} WW)]\\ J_{\tau }&= \sigma ^2 [tr(W' C^{-1'} C^{-1} W)+tr(C^{-1} W C^{-1} W)]\\ J_{\tau \lambda }&= J_{\lambda \tau } = -\sigma ^2 [tr(W' C^{-1'} C^{-1} W)+tr(C^{-1} W C^{-1} W)]\\ J_{\lambda }&= \sigma ^2 [tr(W' C^{-1'} C^{-1} W)+tr(C^{-1} W C^{-1} W)].\\ \end{aligned} \end{aligned}$$
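The nonregularity can be checked numerically: assembling (B.3) for any fixed \(\lambda \), the \(\tau \) column is the exact negative of the \(\lambda \) column, so the information matrix is singular under the null. A sketch, with an illustrative ring-shaped \(W\) and arbitrary parameter values (our assumptions):

```python
import numpy as np

def info_matrix_reduced(W, X, beta, sig2, lam):
    """Assemble (B.3); parameter order: beta (k entries), sigma^2, rho, tau, lambda."""
    n, k = X.shape
    Ci = np.linalg.inv(np.eye(n) - lam * W)   # C^{-1}
    G = Ci @ W                                 # C^{-1} W
    tG = np.trace(G)
    WXb = W @ X @ beta
    J_rho = sig2 * np.trace(W @ W + W.T @ W) + WXb @ WXb
    J_rt = sig2 * (np.trace(W.T @ Ci @ W) + np.trace(Ci @ W @ W))
    J_t = sig2 * (np.trace(W.T @ Ci.T @ Ci @ W) + np.trace(G @ G))
    m = k + 4
    Im = np.zeros((m, m))
    Im[:k, :k] = X.T @ X
    Im[:k, k + 1] = X.T @ WXb                    # J_{beta rho}
    Im[k + 1, :k] = X.T @ WXb
    Im[k, k] = n / (2.0 * sig2)
    Im[k, k + 2] = Im[k + 2, k] = tG             # J_{sigma^2 tau}
    Im[k, k + 3] = Im[k + 3, k] = -tG            # J_{sigma^2 lambda}
    Im[k + 1, k + 1] = J_rho
    Im[k + 1, k + 2] = Im[k + 2, k + 1] = J_rt
    Im[k + 1, k + 3] = Im[k + 3, k + 1] = -J_rt
    Im[k + 2, k + 2] = J_t
    Im[k + 2, k + 3] = Im[k + 3, k + 2] = -J_t
    Im[k + 3, k + 3] = J_t
    return Im / (n * sig2)

rng = np.random.default_rng(1)
n = 20
W = (np.roll(np.eye(n), 1, axis=1) + np.roll(np.eye(n), -1, axis=1)) / 2.0  # ring W
X = np.column_stack([np.ones(n), rng.normal(size=n)])
Im = info_matrix_reduced(W, X, beta=np.array([1.0, 0.5]), sig2=1.0, lam=0.2)
k = 2
# the tau column equals minus the lambda column -> singular IM
print(np.allclose(Im[:, k + 2], -Im[:, k + 3]))  # True
```

This is exactly the identification failure under the null that motivates the sup-type test: the score directions for \(\tau \) and \(\lambda \) are perfectly collinear.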

C Derivation of test statistics

We first derive the adjusted RS test statistic assuming \(\lambda \) is given. Defining \(\underline{\theta }=(\beta ', \rho , \tau , \sigma ^2)'\), the log-likelihood can be rewritten as

$$\begin{aligned} \begin{aligned} l(\underline{\theta }|\lambda ) =&\text {Constant} -\frac{n}{2} \log \sigma ^2- \frac{1}{2 \sigma ^2}[(I-\rho W)Y-X \beta ]' (I-\tau W)' C^{-1'} \cdot \\&C^{-1} (I-\tau W)[(I-\rho W)Y-X \beta ]+\log |I-\rho W|\\&+\log |I-\tau W|+\log |C^{-1}|. \end{aligned} \end{aligned}$$
(C.1)

For a given value of \(\lambda \), the score functions under the joint null are

$$\begin{aligned} \begin{aligned} d_{\beta }(\lambda )&= \frac{1}{\sigma ^2}X'(Y-X \beta )\\ d_{\sigma ^2}(\lambda )&= -\frac{n}{2 \sigma ^2}+\frac{1}{2 \sigma ^4} (Y-X \beta )'(Y-X \beta )\\ d_{\rho }(\lambda )&= \frac{1}{\sigma ^2} (Y-X \beta )'WY\\ d_{\tau }(\lambda )&= \frac{1}{\sigma ^2} (Y-X \beta )' C^{-1} W (Y-X \beta ).\\ \end{aligned} \end{aligned}$$

The information matrix under the null when \(\rho =0\) and \(\lambda \) is given, denoted as \(I(\underline{\theta }|\lambda )|_{H_0}\), can be derived as

$$\begin{aligned} I(\underline{\theta }|\lambda )|_{H_0}= & {} \frac{1}{n \sigma ^2}\left[ \begin{array}{cccc} J_{\beta }(\lambda ) &{} J_{\beta \sigma ^2}(\lambda ) &{} J_{\beta \rho }(\lambda ) &{} J_{\beta \tau }(\lambda ) \\ J_{\sigma ^2 \beta }(\lambda ) &{} J_{\sigma ^2}(\lambda ) &{} J_{\sigma ^2 \rho }(\lambda ) &{} J_{\sigma ^2 \tau }(\lambda ) \\ J_{\rho \beta }(\lambda ) &{} J_{\rho \sigma ^2}(\lambda ) &{} J_{\rho }(\lambda ) &{} J_{\rho \tau }(\lambda ) \\ J_{\tau \beta }(\lambda ) &{} J_{\tau \sigma ^2}(\lambda ) &{} J_{\tau \rho }(\lambda ) &{} J_{\tau }(\lambda ) \\ \end{array}\right] \nonumber \\= & {} \frac{1}{n \sigma ^2}\left[ \begin{array}{cccc} X'X &{} 0 &{} X'WX \beta &{} 0 \\ 0 &{} \frac{n}{2 \sigma ^2} &{} 0 &{} tr(C^{-1}W) \\ X'WX \beta &{} 0 &{} \sigma ^2 [tr(WW)+tr(W' W)]+\beta ' X' W'W X \beta &{} \sigma ^2 [tr(W'C^{-1}W) +tr(C^{-1} WW)] \\ 0 &{} tr(C^{-1}W) &{} \sigma ^2 [tr(W'C^{-1}W) +tr(C^{-1} WW)] &{} \sigma ^2 [tr(W' C^{-1'} C^{-1} W)+tr(C^{-1} W C^{-1} W)] \\ \end{array}\right] .\nonumber \\ \end{aligned}$$
(C.2)

Denoting \(\gamma =(\beta ', \sigma ^2)'\), the standard RS test of \(\tau =0\) (with \(\rho =0\) imposed), for given \(\lambda \), has the form

$$\begin{aligned} RS(\lambda ) = \frac{1}{n}d'_{\tau }(\lambda )J^{-1}_{\tau \cdot \gamma }(\lambda )d_{\tau }(\lambda ), \end{aligned}$$

where

$$\begin{aligned} \begin{aligned} J_{\tau \cdot \gamma }(\lambda )&=J_{\tau }(\lambda )-J_{\tau \gamma }(\lambda )J^{-1}_{\gamma }(\lambda )J_{\gamma \tau }(\lambda )\\&= \frac{1}{n \sigma ^2} \{ \sigma ^2 [tr(W'C^{-1'}C^{-1}W)+tr(C^{-1}WC^{-1}W)]-\frac{2\sigma ^2}{n}[tr(C^{-1}W)]^2 \} \\ \end{aligned} \end{aligned}$$

using (C.2); the standard RS test statistic for fixed \(\lambda \) can then be derived as

$$\begin{aligned} RS(\lambda )=\frac{\{ \frac{1}{\hat{\sigma }^2}\hat{u}'C^{-1}W\hat{u}-tr[C^{-1}W] \}^2}{tr\{W'C^{-1'}C^{-1}W+[C^{-1}W]^2\}-\frac{2}{n}[tr(C^{-1}W)]^2}. \end{aligned}$$
(C.3)
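For fixed \(\lambda \), (C.3) requires only OLS residuals, so it is simple to compute. A minimal sketch; the helper name `rs_stat` and the example data are our assumptions:

```python
import numpy as np

def rs_stat(lam, W, u_hat, sig2):
    """Score statistic RS(lambda) of (C.3), from OLS residuals u_hat and sig2 = u'u/n."""
    n = len(u_hat)
    G = np.linalg.solve(np.eye(n) - lam * W, W)   # G = C^{-1} W
    num = (u_hat @ G @ u_hat / sig2 - np.trace(G)) ** 2
    den = np.trace(G.T @ G + G @ G) - (2.0 / n) * np.trace(G) ** 2
    return num / den

# usage on an OLS fit of y = X b + u (illustrative ring weight matrix)
rng = np.random.default_rng(2)
n = 20
W = (np.roll(np.eye(n), 1, axis=1) + np.roll(np.eye(n), -1, axis=1)) / 2.0
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 0.5]) + rng.normal(size=n)
u_hat = y - X @ np.linalg.lstsq(X, y, rcond=None)[0]
sig2 = u_hat @ u_hat / n
stat = rs_stat(0.3, W, u_hat, sig2)
```

At \(\lambda =0\) the statistic reduces to the familiar error-dependence RS test, since \(C=I\) and the denominator becomes \(tr(W'W+WW)\).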

Now we consider the RS test adjusted for the presence of \(\rho \). Assuming \(\lambda \) is given, the adjusted RS test statistic, denoted by \(RS^*_{\tau }(\lambda )\), has the form

$$\begin{aligned} \begin{aligned} RS^*_{\tau }(\lambda ) =&\frac{1}{n}[d_{\tau }(\lambda )-J_{\tau \rho \cdot \gamma }(\lambda )J^{-1}_{\rho \cdot \gamma }(\lambda )d_{\rho }(\lambda )]' \\&\times [J_{\tau \cdot \gamma }(\lambda )-J_{\tau \rho \cdot \gamma }(\lambda )J^{-1}_{\rho \cdot \gamma }(\lambda )J_{\rho \tau \cdot \gamma }(\lambda )]^{-1} \\&\times [d_{\tau }(\lambda )-J_{\tau \rho \cdot \gamma }(\lambda )J^{-1}_{\rho \cdot \gamma }(\lambda )d_{\rho }(\lambda )],\\ \end{aligned} \end{aligned}$$

where

$$\begin{aligned} \begin{aligned} J_{\tau \rho \cdot \gamma }(\lambda )&=J_{\rho \tau \cdot \gamma }(\lambda )=J_{\tau \gamma }(\lambda )J^{-1}_{\gamma }(\lambda )J_{\gamma \rho }(\lambda )\\ J_{\rho \cdot \gamma }(\lambda )&= J_{\rho }(\lambda )-J_{\rho \gamma }(\lambda )J^{-1}_{\gamma }(\lambda )J_{\gamma \rho }(\lambda ).\\ \end{aligned} \end{aligned}$$

Using (C.2), we have

$$\begin{aligned} \begin{aligned} J_{\tau \rho \cdot \gamma }(\lambda )&=\frac{1}{n}[tr(W'C^{-1'}W)+tr(C^{-1}WW)]\\ J_{\rho \cdot \gamma }(\lambda )&= \frac{1}{n \sigma ^2} \{ \sigma ^2 [tr(W'W+WW)]+\beta 'X'W'MWX\beta \},\\ \end{aligned} \end{aligned}$$

where \(M=I-X(X'X)^{-1}X'\), and hence the adjusted RS statistic for fixed \(\lambda \) can be derived as

$$\begin{aligned} RS^* (\lambda )=\frac{\{\hat{u}'C^{-1}W\hat{u}/\hat{\sigma }^2 -tr(C^{-1}W)-[tr(W'C^{-1'}W+C^{-1}WW)](n\hat{J}_{\rho \cdot \gamma })^{-1}\hat{u}'Wy/\hat{\sigma }^2 \}^2}{tr(W'C^{-1'}C^{-1}W+C^{-1}WC^{-1}W)-\frac{2}{n}[tr(C^{-1}W)]^2-[tr(W'C^{-1'}W+C^{-1}WW)]^2 (n\hat{J}_{\rho \cdot \gamma })^{-1}},\nonumber \\ \end{aligned}$$
(C.4)

where

$$\begin{aligned} (n\hat{J}_{\rho \cdot \gamma })^{-1} = \hat{\sigma }^2 \cdot \{ \hat{\beta }' X'W'MWX\hat{\beta } +\hat{\sigma }^2 [tr((W'+W)W)] \}^{-1}. \end{aligned}$$
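Combining (C.4) with a Davies-type supremum over a grid of \(\lambda \) values gives the test statistic in sketch form; every ingredient comes from a single OLS fit. The function names and the grid are our assumptions, and the sketch does not implement the critical-value bound of Davies (1977, 1987) that the supremum requires.

```python
import numpy as np

def rs_star(lam, W, X, y, u_hat, beta_hat, sig2):
    """Adjusted score statistic RS*(lambda) of (C.4) for a fixed lambda."""
    n = len(y)
    I = np.eye(n)
    Ci = np.linalg.inv(I - lam * W)               # C^{-1}
    G = Ci @ W                                     # C^{-1} W
    M = I - X @ np.linalg.solve(X.T @ X, X.T)      # M = I - X(X'X)^{-1}X'
    WXb = W @ X @ beta_hat
    # (n J_hat_{rho.gamma})^{-1}, cf. the display following (C.4)
    nJ_inv = sig2 / (WXb @ M @ WXb + sig2 * np.trace((W.T + W) @ W))
    t1 = np.trace(W.T @ Ci.T @ W) + np.trace(G @ W)  # tr(W'C^{-1'}W + C^{-1}WW)
    num = (u_hat @ G @ u_hat / sig2 - np.trace(G)
           - t1 * nJ_inv * (u_hat @ W @ y) / sig2) ** 2
    den = (np.trace(G.T @ G + G @ G)
           - (2.0 / n) * np.trace(G) ** 2 - t1 ** 2 * nJ_inv)
    return num / den

def sup_rs_star(W, X, y, grid):
    """Supremum of RS*(lambda) over the lambda grid, Davies-style."""
    beta_hat = np.linalg.lstsq(X, y, rcond=None)[0]
    u_hat = y - X @ beta_hat
    sig2 = u_hat @ u_hat / len(y)
    return max(rs_star(l, W, X, y, u_hat, beta_hat, sig2) for l in grid)
```

Note that the supremum statistic does not follow a \(\chi ^2\) limiting distribution, so standard critical values cannot be applied to its output.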

Cite this article

Kao, S.YH., Bera, A.K. Testing spatial regression models under nonregular conditions. Empir Econ 55, 85–111 (2018). https://doi.org/10.1007/s00181-018-1455-2
