
Part of the book series: Advances in Computer Vision and Pattern Recognition ((ACVPR))

Abstract

A theory of pattern analysis has to suggest criteria for how patterns in data can be defined in a meaningful way and how they should be compared. Similarity-based Pattern Analysis and Recognition is expected to adhere to fundamental principles of the scientific process, namely the expressiveness of models and the reproducibility of their inference. Patterns are assumed to be elements of a pattern space or hypothesis class, and the data provide “information” about which of these patterns should be used to interpret the data. The mapping between data and patterns is constructed by an inference algorithm, in particular by a cost-minimization process. Fluctuations in the data usually limit the precision with which we can uniquely identify a single pattern as the interpretation of the data. We advocate an information-theoretic perspective on pattern analysis to resolve this dilemma, where the tradeoff between the informativeness of statistical inferences and their stability mirrors the information-theoretic optimum of a high information rate at zero communication error. The inference algorithm is regarded as a noisy channel, which naturally limits the resolution of the pattern space given the uncertainty of the data.
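
The noisy-channel view described above can be made concrete in a small numerical experiment: weight every candidate pattern by a Gibbs factor exp(−β·R) on two fluctuating samples of the same source and compare how informative and how stable the resulting weightings are as the inverse temperature β varies. The Python sketch below is purely illustrative and is not the chapter's construction; the one-dimensional grid of candidate patterns, the quadratic cost R, and the overlap score used as a stability proxy are all assumptions made for this example.

```python
# Hedged sketch (not the chapter's algorithm): Gibbs weights exp(-beta * R)
# over a toy hypothesis grid, computed on two noisy samples of the same source.
import numpy as np

rng = np.random.default_rng(0)

# Toy hypothesis class: candidate cluster centers on a 1-D grid.
hypotheses = np.linspace(-3.0, 3.0, 61)

def cost(c, X):
    """Toy cost R(c, X): mean squared deviation of the data from center c."""
    return np.mean((X - c) ** 2)

# Two fluctuating samples X', X'' drawn from the same source.
X1 = rng.normal(loc=0.5, scale=1.0, size=50)
X2 = rng.normal(loc=0.5, scale=1.0, size=50)

def gibbs_weights(X, beta):
    """Weights p_beta(c | X) proportional to exp(-beta * R(c, X))."""
    costs = np.array([cost(c, X) for c in hypotheses])
    w = np.exp(-beta * (costs - costs.min()))   # shift by min for numerical stability
    return w / w.sum()

for beta in (0.1, 1.0, 10.0, 100.0):
    p1, p2 = gibbs_weights(X1, beta), gibbs_weights(X2, beta)
    entropy = -(p1 * np.log(p1 + 1e-12)).sum()
    informativeness = np.log(len(hypotheses)) - entropy   # resolution of the pattern space
    stability = np.minimum(p1, p2).sum()                  # overlap of the two weightings
    print(f"beta={beta:6.1f}  informativeness~{informativeness:5.2f} nats  overlap~{stability:4.2f}")
```

At small β almost all hypotheses receive similar weight (stable but uninformative); at very large β the weight concentrates on the empirical minimizer of each sample, which may differ between the two samples (informative but unstable). This mirrors the tradeoff stated in the abstract.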


Notes

  1. In the following, we restrict hypotheses to map an object to a pattern. The more general situation of object configurations can be analyzed in an analogous way but involves a more complex notation.

  2. For binary weights, Z_β(X^(n)) corresponds to the microcanonical partition function under the assumption that almost all solutions have costs close to R(c, X^(n)) + 1/β (see the sketch after these notes).

  3. The reader should keep in mind that we are not interested in deriving a new principle for coding, but we exploit the communication metaphor to derive a quantitative criterion of how precisely we can approximate the global minimizer of a cost function by an approximation set.

  4. The superscript (n) is dropped for readability.

  5. Please note that the asymptotic equipartition property (AEP) has to be proved for a selected cost function R.

  6. W.l.o.g. we use the symmetric encoding {−1,1} rather than {0,1} to simplify the calculations.
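
For the binary-weight case in footnote 2, the partition function can be written in the standard Gibbs form Z_β(X^(n)) = Σ_c exp(−β·R(c, X^(n))), with the sum running over all binary weight vectors c. The sketch below spells this sum out for a tiny, made-up two-mean cost over assignments c ∈ {−1, +1}^n (the symmetric encoding of footnote 6); apart from the form of the sum, the cost function and the data are illustrative assumptions only.

```python
# Hedged sketch: the partition function of footnote 2, written out explicitly
# for a toy hypothesis class of binary weight vectors c in {-1, +1}^n.
import itertools
import numpy as np

def R(c, X):
    """Made-up cost: squared error of a two-mean model with +/-1 assignments c."""
    c = np.asarray(c, dtype=float)
    mu_plus = X[c > 0].mean() if np.any(c > 0) else 0.0
    mu_minus = X[c < 0].mean() if np.any(c < 0) else 0.0
    centers = np.where(c > 0, mu_plus, mu_minus)
    return float(np.sum((X - centers) ** 2))

def partition_function(X, beta):
    """Z_beta(X) = sum over all binary assignments c of exp(-beta * R(c, X))."""
    n = len(X)
    return sum(np.exp(-beta * R(c, X))
               for c in itertools.product((-1, 1), repeat=n))

rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(-1, 0.3, 4), rng.normal(1, 0.3, 4)])  # n = 8 objects
for beta in (0.1, 1.0, 10.0):
    print(f"beta={beta:5.1f}  Z_beta ~ {partition_function(X, beta):.4g}")
```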


Acknowledgement

This work has been partially supported by the FP7 EU project SIMBAD and by the SNF project 200021_138117. JB acknowledges very stimulating discussions with A. Busetto, L. Busse, M.H. Chehreghani, M. Frank, M. Mihalák, V. Roth, R. Srámek, W. Szpankowski and P. Widmayer.

Author information

Corresponding author

Correspondence to Joachim M. Buhmann.


Copyright information

© 2013 Springer-Verlag London

About this chapter

Cite this chapter

Buhmann, J.M. (2013). SIMBAD: Emergence of Pattern Similarity. In: Pelillo, M. (eds) Similarity-Based Pattern Analysis and Recognition. Advances in Computer Vision and Pattern Recognition. Springer, London. https://doi.org/10.1007/978-1-4471-5628-4_3


  • DOI: https://doi.org/10.1007/978-1-4471-5628-4_3

  • Publisher Name: Springer, London

  • Print ISBN: 978-1-4471-5627-7

  • Online ISBN: 978-1-4471-5628-4

  • eBook Packages: Computer Science (R0)
