
Literature cited

  1. C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, Univ. of Illinois Press, Urbana (1949).

  2. I. M. Gel'fand, A. N. Kolmogorov, and A. M. Yaglom, “On the general definition of the quantity of information,” Dokl. Akad. Nauk SSSR, 111, No. 4, 745–748 (1956).


  3. A. N. Kolmogorov, A. M. Yaglom, and I. M. Gel'fand, “Quantity of information and entropy for continuous distributions,” in: Proceedings of the Third Mathematical Meeting [in Russian], Vol. 3, Izd. AN SSSR, Moscow (1956), pp. 300–320.


  4. I. M. Gel'fand and A. M. Yaglom, “On the evaluation of the quantity of information about a random function contained in a second such function,” Uspekhi Matem. Nauk, 12, No. 1, 3–52 (1957).


  5. A. Pèrez, “Generalized concepts of uncertainty, entropy, and information, from the point of view of the theory of martingales” [in French], in: Transactions of the First Prague Conference on Information Theory, Statistical Decision Functions, and Random Processes, Publishing House of the Czechoslovak Academy of Sciences, Prague (1957), pp. 183–208.


  6. Ch'iang Chieh-p'ei, “Note on the determination of the quantity of information,” Teoriya Veroyatn. i Ee Primenen., 3, No. 1, 99–104 (1958).


  7. M. S. Pinsker, Information and Information Stability of Random Variables and Processes [in Russian], Izd. AN SSSR, Moscow (1970).


  8. R. L. Stratonovich, “Quantity of information and entropy of sections of stationary Gaussian processes,” Probl. Peredachi Informats., 3, No. 2, 3–21 (1967).


  9. H. L. Van Trees, Detection, Estimation, and Modulation Theory, Vol. 1, Wiley, New York (1968).


  10. T. E. Duncan, “On the calculation of mutual information,” SIAM J. Appl. Math., 19, No. 1, 215–220 (1970).


  11. C. R. Baker, “Mutual information for Gaussian processes,” SIAM J. Appl. Math., 19, No. 2, 451–458 (1970).


  12. T. E. Duncan, “Mutual information for stochastic differential equations,” Information and Control, 19, No. 3, 265–271 (1971).


  13. T. T. Kadota, M. Zakai, and J. Ziv, “Mutual information of the white Gaussian channel with and without feedback,” IEEE Trans. on Information Theory, IT-17, 368–371 (1971).


  14. B. Grigelionis, “On nonlinear filtering theory and absolute continuity of measures corresponding to stochastic processes,” in: Second Japan-USSR Symposium on Probability Theory, Vol. 1, Kyoto (1972), pp. 107–125.


  15. B. Grigelionis, “On stochastic equations of nonlinear filtering of random processes,” Litovsk. Matem. Sb., 12, No. 4, 37–51 (1972).


  16. B. Grigelionis, “On the structure of the densities of measures corresponding to random processes,” Litovsk. Matem. Sb., 13, No. 1, 71–78 (1973).


Additional information

Translated from Lietuvos Matematikos Rinkinys (Litovskii Matematicheskii Sbornik), Vol. 14, No. 1, pp. 5–11, January–March, 1974.

Cite this article

Grigelionis, B. Mutual information for locally infinitely divisible random processes. Lith Math J 14, 1–6 (1974). https://doi.org/10.1007/BF01414306
