
New indices of sample informational energy with application to testing uniformity

  • M. Mahdizadeh

Abstract

In recent years, there has been growing interest in using information-theoretic measures as tools for statistical inference. Informational energy is an important quantity in this area and has found many applications. This article presents some new estimators of this quantity, which are then employed to test for uniformity. The performance of the resulting tests is assessed through a simulation study. Finally, the procedures are illustrated using a real data set.
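The abstract does not reproduce the new indices, but the underlying quantity is standard: Onicescu's informational energy of a density f is the integral of f(x)^2, which equals 1 for the standard uniform density and is minimized by the uniform law among densities on [0, 1], so energy-based tests of uniformity reject for large estimates. The sketch below is only illustrative: it uses a generic spacings-based plug-in estimator in the spirit of Vasicek-type entropy estimators together with Monte Carlo calibration; the function names, the window choice m, and the estimator itself are assumptions, not the estimators proposed in the article.

```python
import numpy as np

def ie_spacings(x, m=None):
    """Spacings-based estimate of the informational energy IE(f) = int f(t)^2 dt.

    Uses f(X_(i)) ~ 2m / (n * (X_(i+m) - X_(i-m))) with clamping at the sample
    boundaries, in the spirit of Vasicek-type entropy estimators (a sketch,
    not the paper's new indices)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    if m is None:
        m = max(1, int(np.sqrt(n)))        # heuristic window size
    idx = np.arange(n)
    upper = x[np.minimum(idx + m, n - 1)]
    lower = x[np.maximum(idx - m, 0)]
    dens = 2.0 * m / (n * (upper - lower)) # pointwise density estimates
    return dens.mean()                     # estimates E[f(X)] = int f^2

def uniformity_test(x, m=None, n_mc=5000, alpha=0.05, seed=None):
    """Monte Carlo test of H0: X ~ Uniform(0, 1); rejects for large estimates,
    since the uniform density minimizes int f^2 among densities on [0, 1]."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    stat = ie_spacings(x, m)
    null_stats = np.array([ie_spacings(rng.uniform(size=x.size), m)
                           for _ in range(n_mc)])
    p_value = (1 + np.sum(null_stats >= stat)) / (n_mc + 1)
    return stat, p_value, p_value < alpha

# Example: Beta(2, 2) data have informational energy 1.2 > 1, so the test
# should reject more often than the nominal level.
if __name__ == "__main__":
    data = np.random.default_rng(0).beta(2, 2, size=50)
    print(uniformity_test(data, seed=1))
```

The Monte Carlo calibration avoids needing the estimator's null distribution in closed form, at the cost of simulating n_mc uniform samples per test.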

Keywords

Density estimation · Information theory · Test of fit

Mathematics Subject Classification

62F03 · 62F40

Notes

Acknowledgements

The author thanks the reviewer for constructive comments that greatly improved the manuscript.

Copyright information

© African Mathematical Union and Springer-Verlag GmbH Deutschland, ein Teil von Springer Nature 2019

Authors and Affiliations

  1. Department of Statistics, Hakim Sabzevari University, Sabzevar, Iran
