Random matrices and information theory

Il Nuovo Cimento B (1965-1970)

Summary

A general method is proposed for assigning a probability distribution to a random matrix. Its principle is that the amount of information contained in the probability distribution should not exceed the minimum amount needed to satisfy relevant properties of the matrix. As examples, classical random matrices are recovered, and the case of a given density of eigenvalues is treated.
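
To illustrate the principle concretely (a minimal numerical sketch under assumptions made here, not taken from the paper): when the only constraint on a real symmetric random matrix M is a fixed expectation of Tr M², the minimum-information distribution is the Gaussian orthogonal ensemble, whose eigenvalue density approaches Wigner's semicircle law for large dimension. The Python sketch below (the parameters n, trials and sigma2 are illustrative choices, not from the paper) samples such matrices and compares the empirical spectrum with the semicircle prediction.

import numpy as np

def sample_goe(n, sigma2=1.0, rng=None):
    # Minimum-information (maximum-entropy) distribution for real symmetric M
    # with <Tr M^2> fixed: P[M] ~ exp(-Tr M^2 / (4*sigma2)), i.e. independent
    # Gaussian entries with Var(M_aa) = 2*sigma2 and Var(M_ab) = sigma2 (a != b).
    rng = np.random.default_rng() if rng is None else rng
    a = rng.normal(0.0, np.sqrt(sigma2), size=(n, n))
    return (a + a.T) / np.sqrt(2.0)

def semicircle_density(x, n, sigma2=1.0):
    # Limiting eigenvalue density: rho(x) = 2*sqrt(r2 - x^2) / (pi*r2),
    # supported on |x| <= sqrt(r2), with r2 = 4*n*sigma2.
    r2 = 4.0 * n * sigma2
    return np.where(x * x < r2, 2.0 * np.sqrt(np.maximum(r2 - x * x, 0.0)) / (np.pi * r2), 0.0)

rng = np.random.default_rng(0)
n, trials = 50, 200
eigs = np.concatenate([np.linalg.eigvalsh(sample_goe(n, rng=rng)) for _ in range(trials)])
hist, edges = np.histogram(eigs, bins=40, density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print("max deviation from semicircle:", np.abs(hist - semicircle_density(centers, n)).max())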


References

  1. Statistical Theories of Spectra: Fluctuations, C. E. Porter, Ed. (New York, 1965); hereafter referred to as STSF.

  2. M. L. Mehta: Random Matrices and the Theory of Energy Levels (New York, 1967).

  3. E. P. Wigner: SIAM Rev., 9, 1 (1967).

  4. E. P. Wigner: Proc. Fourth Can. Math. Congr. (Toronto, 1957), p. 174; STSF, p. 188.

  5. M. L. Mehta and M. Gaudin: Nucl. Phys., 18, 420 (1960); STSF, p. 342; B. V. Bronk: Journ. Math. Phys., 5, 215 (1964).

  6. F. J. Dyson: Journ. Math. Phys., 3, 140 (1962); STSF, p. 379.

  7. E. P. Wigner: STSF, p. 446.

  8. J. Ginibre: Journ. Math. Phys., 6, 440 (1965).

  9. C. E. Porter and N. Rosenzweig: Suomalaisen Tiedeakatemian Toimituksia, vol. A-VI, no. 44 (1960), p. 19; STSF, p. 251.

  10. N. Rosenzweig and C. E. Porter: Phys. Rev., 120, 1698 (1960); STSF, p. 312; F. J. Dyson: Journ. Math. Phys., 3, 1191 (1962); STSF, p. 424; N. Rosenzweig, J. E. Monahan and M. L. Mehta: Nucl. Phys., A109, 437 (1968).

  11. D. Fox and P. B. Kahn: Phys. Rev., 134, B1151 (1964); H. S. Leff: Journ. Math. Phys., 5, 756, 763 (1964); B. V. Bronk: Journ. Math. Phys., 6, 228 (1965).

  12. B. V. Bronk: Thesis, Princeton University (1964).

  13. C. E. Porter: STSF, p. 49. Bronk was the first to use information theory (refs. (11,12)), but his work needed two improvements. On the one hand, instead of the single constraint (8), which is sufficient to obtain the Gaussian ensemble (9), the variances and correlations of all matrix elements M_αβ are assumed to be given, with no justification for the absence of correlations, nor for the factor √2 between variances of diagonal and off-diagonal matrix elements. On the other hand, the volume element d[M] entering the definition of information for positive random matrices is chosen somewhat arbitrarily; this is troublesome, since one may derive any P[M] from a fixed set of constraints simply by changing the definition of the basic measure d[M]. (A schematic derivation from a single trace constraint is sketched after this reference list.)

  14. L. Brillouin: Science and Information Theory (New York, 1956); E. T. Jaynes: Phys. Rev., 106, 620 (1957); A. Katz: Principles of Statistical Mechanics, The Information Theory Approach (New York, 1967).

  15. C. E. Shannon and W. Weaver: The Mathematical Theory of Communication (Urbana, Ill., 1949); A. Feinstein: Foundations of Information Theory (New York, 1958); D. Middleton: Introduction to Statistical Communication Theory, Chap. 6 (New York, 1960).

  16. C. E. Porter: STSF, p. 32.

  17. N. Rosenzweig: in Statistical Physics, Brandeis Lectures, vol. 3 (New York, 1963), p. 91.

  18. F. J. Dyson: Journ. Math. Phys., 3, 154 (1962); STSF, p. 393.

    ADS  Google Scholar 

  19. H. A. Bethe: Rev. Mod. Phys., 9, 53 (1937); C. Bloch: Phys. Rev., 93, 1094 (1954); J. M. B. Lang and K. J. Le Couteur: Proc. Phys. Soc., 67A, 586 (1954); T. D. Newton: Can. Journ. Phys., 34, 804 (1956).

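To make the remark in ref. (13) concrete, here is a schematic maximum-entropy derivation (a sketch consistent with the Summary; the multiplier λ and constant c are introduced here, and the paper's own equations (8) and (9) are not reproduced). With the flat measure d[M] = ∏_{α≤β} dM_{αβ} over real symmetric matrices, maximizing the entropy S[P] = −∫ P ln P d[M] subject to normalization and to the single constraint ⟨Tr M²⟩ = c gives, in LaTeX form:

\begin{align}
  P[M] &\propto \exp\!\left(-\lambda \operatorname{Tr} M^{2}\right)
        = \exp\!\Big(-\lambda \sum_{\alpha} M_{\alpha\alpha}^{2}
          - 2\lambda \sum_{\alpha<\beta} M_{\alpha\beta}^{2}\Big), \\
  \langle M_{\alpha\alpha}^{2}\rangle &= \frac{1}{2\lambda}
        = 2\,\langle M_{\alpha\beta}^{2}\rangle \qquad (\alpha \neq \beta),
\end{align}

so the statistical independence of the matrix elements and the ratio between diagonal and off-diagonal variances (a factor 2 in the variances, i.e. √2 in the widths) follow from the single constraint instead of having to be assumed.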


Cite this article

Balian, R. Random matrices and information theory. Nuovo Cimento B (1965-1970) 57, 183–193 (1968). https://doi.org/10.1007/BF02710326
