The Extended Bregman Divergence and Parametric Estimation in Continuous Models

Published in Sankhya B.

Abstract

Under standard regularity conditions, the maximum likelihood estimator (MLE) is the most efficient estimator at the model. Modern practice, however, recognizes that the hypothesized model rarely holds exactly, and small departures from it are never entirely unexpected; yet classical estimators such as the MLE are extremely sensitive to noise in the data. Within the class of robust estimators, which comprises parametric inference techniques designed to overcome the problems caused by model misspecification and noise, minimum distance estimators have become quite popular in recent times. In particular, density-based distances under the umbrella of the Bregman divergence have been shown to offer several advantages. Here we consider an extension of the ordinary Bregman divergence and investigate the scope of parametric estimation under continuous models using this extended divergence proposal. Many of our illustrations are based on the GSB divergence, a particular member of the extended Bregman family, which appears to hold high promise within the robustness area. To establish the usefulness of the proposed minimum distance estimation procedure, we provide detailed theoretical investigations followed by substantial numerical verification.
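The abstract describes a minimum distance recipe: choose a density-based divergence, build an empirical objective from the sample under a parametric model, and minimise that objective over the parameter. The GSB and extended Bregman divergence formulas are not reproduced on this page, so the sketch below illustrates the same recipe with the closely related density power divergence (DPD) under a normal model instead; the function name dpd_objective, the tuning parameter alpha = 0.5, and the contaminated-normal example data are illustrative assumptions, not the authors' GSB-based implementation.

import numpy as np
from scipy import optimize, stats

def dpd_objective(theta, x, alpha=0.5):
    # Empirical density power divergence (DPD) objective for a N(mu, sigma^2) model:
    # minimise  int f_theta^{1+alpha} dx - (1 + 1/alpha) * mean(f_theta(X_i)^alpha)
    # over theta, dropping the term that does not involve theta.
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)  # reparametrise to keep sigma > 0
    # For the normal density the first integral has the closed form
    # (2*pi*sigma^2)^(-alpha/2) / sqrt(1 + alpha); for a general continuous
    # model it could be computed by numerical integration instead.
    integral = (2.0 * np.pi * sigma ** 2) ** (-alpha / 2.0) / np.sqrt(1.0 + alpha)
    empirical = np.mean(stats.norm.pdf(x, loc=mu, scale=sigma) ** alpha)
    return integral - (1.0 + 1.0 / alpha) * empirical

rng = np.random.default_rng(0)
# Clean N(0, 1) observations contaminated by a few gross outliers -- the kind of
# small departure from the model under which robust minimum distance estimators
# are expected to behave much better than the MLE.
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])

res = optimize.minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x, 0.5))
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"minimum DPD estimates: mu = {mu_hat:.3f}, sigma = {sigma_hat:.3f}")
print(f"MLE for comparison:    mu = {np.mean(x):.3f}, sigma = {np.std(x):.3f}")

With the contaminated sample above, the minimum DPD estimates remain close to the parameters of the clean N(0, 1) component, whereas the MLE is pulled towards the outliers; replacing the DPD objective with the extended Bregman (GSB) objective studied in the paper leaves the rest of the minimisation template unchanged.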

Funding

The authors did not receive support from any organization for the submitted work.

Author information

Corresponding author

Correspondence to Sancharee Basak.

Ethics declarations

Conflicts of interest

The authors have no competing interests to declare that are relevant to the content of this article.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Below is the link to the electronic supplementary material.

Supplementary file 1 (pdf 404 KB)

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

About this article

Cite this article

Basak, S., Basu, A. The Extended Bregman Divergence and Parametric Estimation in Continuous Models. Sankhya B (2024). https://doi.org/10.1007/s13571-024-00333-z
