
A Bootstrap Study of Variance Estimation under Heteroscedasticity using Genetic Algorithm

  • Himadri Ghosh
  • M. A. Iquebal
  • Prajneshu

Abstract

The conventional ordinary least squares (OLS) estimator of the variance-covariance matrix in a linear regression model is biased and inconsistent when the errors are heteroscedastic. Several heteroscedasticity-consistent estimators have therefore been proposed, but none of them performs well in finite samples. In this paper, the genetic algorithm (GA), a powerful optimization technique, is used to modify these estimators. Properties of the newly developed estimators are studied thoroughly by the Monte Carlo method for various sample sizes. It is shown that the GA versions of the estimators are superior to the corresponding non-GA versions, as they yield significant reductions in both total relative bias and total root mean square error.
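For context, the heteroscedasticity-consistent covariance estimators the abstract refers to are the classical White-type sandwich estimators (commonly labelled HC0-HC3). The sketch below is a minimal, self-contained NumPy illustration of those classical (non-GA) estimators only; the function name, the toy data, and the labels are illustrative assumptions, and the GA-modified versions proposed in the paper are not reproduced here.

```python
# Minimal sketch (not the authors' code): classical White-type
# heteroscedasticity-consistent (HC) covariance estimators HC0-HC3,
# on which GA-modified versions would build.
import numpy as np

def hc_covariances(X, y):
    """Return OLS and HC0-HC3 covariance matrix estimates for y = X*beta + u."""
    n, k = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y                     # OLS coefficients
    e = y - X @ beta                             # OLS residuals
    h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)  # hat-matrix diagonal h_i

    def sandwich(omega):                         # (X'X)^-1 X' diag(omega) X (X'X)^-1
        return XtX_inv @ (X.T * omega) @ X @ XtX_inv

    s2 = e @ e / (n - k)
    return {
        "OLS": s2 * XtX_inv,                      # valid only under homoscedasticity
        "HC0": sandwich(e**2),                    # White's original estimator
        "HC1": (n / (n - k)) * sandwich(e**2),    # degrees-of-freedom correction
        "HC2": sandwich(e**2 / (1.0 - h)),        # leverage-adjusted
        "HC3": sandwich(e**2 / (1.0 - h)**2),     # jackknife-type adjustment
    }

if __name__ == "__main__":
    # Toy heteroscedastic data: error variance grows with the regressor.
    rng = np.random.default_rng(0)
    n = 50
    x = rng.uniform(1.0, 5.0, n)
    X = np.column_stack([np.ones(n), x])
    y = 1.0 + 2.0 * x + rng.normal(scale=0.5 * x)
    for name, V in hc_covariances(X, y).items():
        print(name, np.sqrt(np.diag(V)).round(4))  # estimated standard errors
```

In a Monte Carlo comparison of the kind described in the abstract, such estimators would be evaluated over repeated samples and summarised by measures such as total relative bias and total root mean square error.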

AMS Subject Classification

62J05; 65C05

Keywords

Linear regression model; Least squares estimators; Heteroscedasticity; Real-coded genetic algorithm; Bootstrap methods; Total relative bias; Total root mean square error



Copyright information

© Grace Scientific Publishing 2008

Authors and Affiliations

  1. Indian Agricultural Statistics Research Institute, New Delhi, India
