Abstract

This chapter provides a general framework for the minimum divergence method in a statistical model. In particular, we explore the U-minimum divergence method, in which the U-loss function for parameter estimation is introduced as an empirical analogue of the U-divergence given a data set, and the U-estimator is defined by minimizing the U-loss function. The variety of U-estimators arises from the diversity of choices for the generator function U. We give an intuitive understanding of the dualistic relation between the U-estimator and the maximum U-entropy model. Then, we investigate the robustness of the \(\beta \)-power estimator under a typical statistical model that differs from the U-model. Furthermore, we investigate the statistical properties of the \(\gamma \)-power estimator defined by the projective \(\gamma \)-power divergence.
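
To make the estimation recipe concrete, the sketch below minimizes an empirical \(\beta \)-power loss for a normal location-scale model on contaminated data. It uses one common normalization of the density power divergence loss; the chapter's exact constants may differ, but a constant rescaling does not change the minimizer. The function name `beta_power_loss`, the simulated sample, and the choice \(\beta = 0.5\) are illustrative assumptions, not taken from the chapter; the closed-form integral term is specific to the normal density.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def beta_power_loss(params, x, beta):
    """Empirical beta-power loss for the normal model N(mu, sigma^2):
    integral of f_theta^(1+beta) minus (1 + 1/beta) * mean(f_theta(x_i)^beta).
    Illustrative sketch; the name and normalization are assumptions."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)   # optimize log(sigma) so sigma stays positive
    f = norm.pdf(x, loc=mu, scale=sigma)
    # closed form of the integral of f_theta(x)^(1+beta) for the normal density
    integral = (2.0 * np.pi * sigma**2) ** (-beta / 2.0) / np.sqrt(1.0 + beta)
    return integral - (1.0 + 1.0 / beta) * np.mean(f**beta)

# Contaminated sample: 95 points from N(0, 1) plus 5 outliers near x = 10.
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])

res = minimize(beta_power_loss, x0=[np.median(x), 0.0],
               args=(x, 0.5), method="Nelder-Mead")
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)   # close to (0, 1); the outliers are downweighted
```

Each observation enters the loss only through \(f_\theta(x_i)^\beta \), so points far in the tails contribute almost nothing; this is the mechanism behind the robustness of the \(\beta \)-power estimator, and letting \(\beta \rightarrow 0\) recovers the (non-robust) maximum likelihood estimator.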



Author information

Correspondence to Shinto Eguchi.

Copyright information

© 2022 Springer Japan KK, part of Springer Nature

About this chapter

Cite this chapter

Eguchi, S., Komori, O. (2022). Minimum Divergence Method. In: Minimum Divergence Methods in Statistical Machine Learning. Springer, Tokyo. https://doi.org/10.1007/978-4-431-56922-0_4
