
Jackknifing in generalized linear models


Abstract

In a generalized linear model, the jackknife estimator of the asymptotic covariance matrix of the maximum likelihood estimator is shown to be consistent, and the corresponding jackknife studentized statistic is asymptotically normal. These results remain valid even when the dispersion parameters are unequal across observations. In contrast, the variance estimator and the studentized statistic based on the standard method (substitution and linearization) do not possess this robustness property against unequal dispersion parameters.
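
To make the setting concrete, the sketch below illustrates the classical delete-one jackknife covariance estimator for the maximum likelihood estimator in a logistic-regression GLM, alongside the standard substitution estimator (the inverse Fisher information at the MLE). This is a minimal NumPy illustration under simulated data, not necessarily the exact (e.g. weighted or one-step) jackknife variant analysed in the paper; the function names fit_logistic_mle and jackknife_cov and the data-generating values are illustrative assumptions.

```python
import numpy as np

def fit_logistic_mle(X, y, n_iter=50, tol=1e-10):
    """Fit a logistic-regression GLM by Newton-Raphson and return the MLE."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta
        mu = 1.0 / (1.0 + np.exp(-eta))        # mean function (inverse logit)
        W = mu * (1.0 - mu)                    # GLM working weights
        grad = X.T @ (y - mu)                  # score vector
        info = X.T @ (X * W[:, None])          # Fisher information
        step = np.linalg.solve(info, grad)
        beta = beta + step
        if np.max(np.abs(step)) < tol:
            break
    return beta

def jackknife_cov(X, y):
    """Classical delete-one jackknife estimate of Cov(beta_hat)."""
    n = X.shape[0]
    loo = np.array([fit_logistic_mle(np.delete(X, i, axis=0),
                                     np.delete(y, i, axis=0))
                    for i in range(n)])
    centred = loo - loo.mean(axis=0)
    return (n - 1) / n * centred.T @ centred

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 200
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
    beta_true = np.array([-0.5, 1.0, -1.0])          # illustrative true coefficients
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ beta_true)))

    beta_hat = fit_logistic_mle(X, y)
    # Standard (substitution) estimate: inverse Fisher information at the MLE.
    mu = 1.0 / (1.0 + np.exp(-X @ beta_hat))
    fisher_inv = np.linalg.inv(X.T @ (X * (mu * (1.0 - mu))[:, None]))

    print("MLE:", beta_hat)
    print("Jackknife covariance estimate:\n", jackknife_cov(X, y))
    print("Inverse-information covariance estimate:\n", fisher_inv)
```

With equal dispersion the two estimates are close; the paper's point is that only the jackknife estimate remains consistent when the dispersion parameters differ across observations.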



Additional information

This research was supported by an Operating Grant from the Natural Sciences and Engineering Research Council of Canada.

About this article

Cite this article

Shao, J. Jackknifing in generalized linear models. Ann Inst Stat Math 44, 673–686 (1992). https://doi.org/10.1007/BF00053397
