Minimum Φ-Divergence Estimator and Φ-Divergence Statistics in Generalized Linear Models with Binary Data

Methodology and Computing in Applied Probability

Abstract

In this paper we assume that the data follow a binomial distribution whose probabilities are given by a generalized linear model. To fit the data, the minimum φ-divergence estimator is studied as a generalization of the maximum likelihood estimator. The minimum φ-divergence estimator is then used as the basis of new statistics for solving testing problems in generalized linear models with binary data. An extensive simulation study examines the behavior of the new family of estimators as well as of the new family of test statistics.
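The following is a minimal numerical sketch, not the authors' implementation, of the idea summarized above: for grouped binary data with a logistic link, the parameter is chosen to minimize a φ-divergence between the empirical cell probabilities and the model cell probabilities. The Cressie–Read power-divergence subfamily is used here, and the function names power_divergence and min_phi_divergence_fit, as well as the choice λ = 2/3, are illustrative assumptions.

```python
# Illustrative sketch (not the authors' code): minimum phi-divergence
# estimation for a logistic GLM with grouped binary data, using the
# Cressie-Read power-divergence family. lam = 0 recovers the MLE
# (Kullback-Leibler divergence); lam = 1 gives a Pearson-type objective.
import numpy as np
from scipy.optimize import minimize

def power_divergence(p_hat, p_model, lam):
    """Cressie-Read power divergence between two probability vectors."""
    p_hat = np.clip(p_hat, 1e-10, 1.0)        # guard against empty cells
    p_model = np.clip(p_model, 1e-10, 1.0)
    if abs(lam) < 1e-8:                        # limit lam -> 0: Kullback-Leibler
        return np.sum(p_hat * np.log(p_hat / p_model))
    if abs(lam + 1.0) < 1e-8:                  # limit lam -> -1: reversed KL
        return np.sum(p_model * np.log(p_model / p_hat))
    return np.sum(p_hat * ((p_hat / p_model) ** lam - 1.0)) / (lam * (lam + 1.0))

def min_phi_divergence_fit(X, y, n, lam=2.0 / 3.0):
    """Minimize the divergence between empirical and model cell probabilities.

    X : (I, k) design matrix, y : successes per cell, n : trials per cell.
    The 2I cells are (success, failure) in each of the I covariate classes.
    """
    N = n.sum()
    p_hat = np.concatenate([y, n - y]) / N               # empirical probabilities
    def objective(beta):
        pi = 1.0 / (1.0 + np.exp(-X @ beta))             # logistic link
        p_model = np.concatenate([n * pi, n * (1.0 - pi)]) / N
        return power_divergence(p_hat, p_model, lam)
    beta0 = np.zeros(X.shape[1])
    return minimize(objective, beta0, method="BFGS").x

# Usage with simulated grouped data at five design points.
rng = np.random.default_rng(0)
x = np.linspace(-2.0, 2.0, 5)
X = np.column_stack([np.ones_like(x), x])                # intercept + slope
n = np.full(5, 50)
pi_true = 1.0 / (1.0 + np.exp(-(0.5 + 1.0 * x)))
y = rng.binomial(n, pi_true)
print(min_phi_divergence_fit(X, y, n, lam=2.0 / 3.0))    # Cressie-Read estimator
print(min_phi_divergence_fit(X, y, n, lam=0.0))          # coincides with the MLE
```

Setting lam = 0 reduces the objective to the Kullback-Leibler divergence, so the last call should essentially reproduce the maximum likelihood estimate, which is the sense in which the minimum φ-divergence estimator generalizes the MLE.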



Author information

Corresponding author

Correspondence to J. A. Pardo.

Additional information

This work was partially supported by Grants MTM2006-06872 and UCM2006-910707.


About this article

Cite this article

Pardo, J.A., Pardo, M.C. Minimum Φ-Divergence Estimator and Φ-Divergence Statistics in Generalized Linear Models with Binary Data. Methodol Comput Appl Probab 10, 357–379 (2008). https://doi.org/10.1007/s11009-007-9054-2

