
Hypothesis Testing, Information Divergence and Computational Geometry

  • Conference paper
Geometric Science of Information (GSI 2013)

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 8085)

Included in the following conference series: International Conference on Geometric Science of Information

Abstract

In this paper, we consider the Bayesian multiple hypothesis testing problem from the standpoint of computational geometry. We first recall that the probability of error of the optimal decision rule, the maximum a posteriori probability (MAP) criterion, is related to both the total variation and the Chernoff statistical distances. We then consider exponential family manifolds and show that the MAP rule amounts to a nearest neighbor classifier that can be implemented either by point location in an additive Bregman Voronoi diagram or by nearest neighbor queries using various techniques of computational geometry. Finally, we show that computing the Chernoff distance, the best error exponent upper bounding the probability of error, amounts to (1) finding the unique intersection point of a geodesic with a bisector in the binary hypothesis case, and (2) solving a closest Bregman pair problem in the multiple hypothesis case.
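To make the geodesic/bisector characterization of the binary case concrete, here is a minimal Python sketch, worked out under stated assumptions rather than taken from the paper, for two univariate Gaussian hypotheses. It relies on two standard facts about exponential families: the Kullback-Leibler divergence between members is a Bregman divergence on the swapped natural parameters for the log-normalizer F, and the exponential geodesic linearly interpolates natural parameters. A bisection search then locates the geodesic point that is KL-equidistant from both hypotheses (the bisector intersection); the common divergence value there is the Chernoff distance. The function names and the choice of the univariate Gaussian family are illustrative assumptions.

import math

def gaussian_natural(mu, sigma2):
    # Natural parameters (theta1, theta2) of N(mu, sigma2), for the
    # parameterization p(x) = exp(theta1*x + theta2*x^2 - F(theta)).
    return (mu / sigma2, -1.0 / (2.0 * sigma2))

def F(theta):
    # Log-normalizer of the univariate Gaussian family (theta2 < 0).
    t1, t2 = theta
    return -t1 * t1 / (4.0 * t2) + 0.5 * math.log(-math.pi / t2)

def grad_F(theta):
    # Gradient of F: the expectation parameters (E[x], E[x^2]).
    t1, t2 = theta
    return (-t1 / (2.0 * t2), t1 * t1 / (4.0 * t2 * t2) - 1.0 / (2.0 * t2))

def bregman(tp, tq):
    # B_F(tp : tq) = F(tp) - F(tq) - <tp - tq, grad F(tq)>,
    # which equals KL(p_{tq} : p_{tp}) for an exponential family.
    g = grad_F(tq)
    return F(tp) - F(tq) - sum((a - b) * gi for a, b, gi in zip(tp, tq, g))

def chernoff_information(t0, t1, iters=60):
    # Bisection along the exponential geodesic t(a) = (1-a)*t0 + a*t1
    # for the point equidistant in KL from both hypotheses; the Chernoff
    # distance is the common KL value at that intersection point.
    lo, hi = 0.0, 1.0
    for _ in range(iters):
        a = 0.5 * (lo + hi)
        ta = tuple((1 - a) * x + a * y for x, y in zip(t0, t1))
        # Compare KL(p_a : p0) with KL(p_a : p1).
        if bregman(t0, ta) < bregman(t1, ta):
            lo = a  # p_a is still closer to p0: move toward p1
        else:
            hi = a
    a = 0.5 * (lo + hi)
    ta = tuple((1 - a) * x + a * y for x, y in zip(t0, t1))
    return bregman(t0, ta)

# Two simple hypotheses: N(0, 1) versus N(2, 1).
print(chernoff_information(gaussian_natural(0.0, 1.0), gaussian_natural(2.0, 1.0)))

For two same-variance Gaussians N(0, sigma^2) and N(mu, sigma^2), the Chernoff information has the closed form mu^2 / (8 sigma^2), which the final line uses as a sanity check (here 2^2 / 8 = 0.5).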

Copyright information

© 2013 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Nielsen, F. (2013). Hypothesis Testing, Information Divergence and Computational Geometry. In: Nielsen, F., Barbaresco, F. (eds) Geometric Science of Information. GSI 2013. Lecture Notes in Computer Science, vol 8085. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40020-9_25

  • DOI: https://doi.org/10.1007/978-3-642-40020-9_25

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-40019-3

  • Online ISBN: 978-3-642-40020-9

  • eBook Packages: Computer Science, Computer Science (R0)
