Minds and Machines

Volume 24, Issue 1, pp 19–36

Information Dynamics

  • Amos Golan


Though we have access to a wealth of information, the central issue is always how to process it: how to make sense of all we observe and know. Consider the English alphabet. We know there are 26 letters, but unless we put those letters together in a meaningful way, they convey no information. There are infinitely many ways of combining them; only a small number make sense, and only some of those convey exactly what we wish to convey, though the message may still be interpreted differently by different individuals. The same issue arises with information more generally: how can we process the information we have? How can we infer and reason under conditions of incomplete observed information? In his seminal book on the philosophy of information, Floridi (2011a) raises a number of open questions. I discuss one of them here: how to process information. To do so, I take the more realistic view that information is always limited, incomplete and possibly noisy. I define types of information, relate them to Floridi's definitions, and discuss a basic formulation for processing information within a unified framework, relating it to some of the basic concepts discussed in the book.


Keywords: Efficiency · Entropy · Information · Generalized Maximum Entropy · Generalized Cross Entropy · Maximum Entropy · Noise · Truth
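The unified processing framework the abstract refers to builds on Jaynes's maximum entropy principle (Jaynes 1957a, b): among all distributions consistent with the observed, incomplete information, choose the one with maximal Shannon entropy. As a minimal illustrative sketch (not the paper's own generalized formulation, and with the function name `max_entropy_dist` chosen here for illustration), the following solves Jaynes's classic die example: given only the observed mean of the throws, the entropy-maximizing distribution has the exponential (Gibbs) form, and its single Lagrange multiplier can be found by bisection.

```python
import math

def max_entropy_dist(values, target_mean, tol=1e-10):
    """Maximum-entropy distribution over `values` with mean `target_mean`.

    The solution has the form p_i proportional to exp(-lam * x_i);
    we solve for the Lagrange multiplier lam by bisection.
    """
    def mean_for(lam):
        weights = [math.exp(-lam * x) for x in values]
        z = sum(weights)  # normalizing constant (partition function)
        return sum(w * x for w, x in zip(weights, values)) / z

    lo, hi = -50.0, 50.0  # bracket for the multiplier
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        # mean_for is decreasing in lam, so shift the bracket accordingly
        if mean_for(mid) > target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    weights = [math.exp(-lam * x) for x in values]
    z = sum(weights)
    return [w / z for w in weights]

# Jaynes's die: faces 1..6 with observed average 4.5 (a fair die gives 3.5).
# The result tilts probability toward the high faces, as entropy maximization
# requires, while exactly matching the observed mean.
values = list(range(1, 7))
p = max_entropy_dist(values, 4.5)
```

The paper's generalized (GME/GCE) methods extend this idea to noisy constraints, but that extension is not reproduced here.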



I thank Luciano Floridi and Ariel Caticha for many enchanting conversations on the topics discussed here, and for providing me with comments on earlier versions of this paper. I also benefited from comments during recent seminars on the topic. Finally, I thank Patrick Allo for his thoughtful comments, and the Editor, Tony Beavers.


  1. Adriaans, P. (2011). Some open problems in the study of information and computation.
  2. Adriaans, P. (forthcoming). Philosophy of information. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy (Fall 2013 Edition).
  3. Caticha, A. (2012). Entropic inference and the foundations of physics (monograph commissioned by the 11th Brazilian Meeting on Bayesian Statistics). EBEB-2012, USP Press, São Paulo, Brazil.
  4. Crnkovic, G. D., & Hofkirchner, W. (2011). Floridi's open problems in philosophy of information, ten years later. Information, 2, 327–359.
  5. Floridi, L. (2004). Open problems in the philosophy of information. Metaphilosophy, 35, 554–582.
  6. Floridi, L. (2010). Information: A very short introduction. Oxford: Oxford University Press.
  7. Floridi, L. (2011a). The philosophy of information. Oxford: Oxford University Press.
  8. Floridi, L. (2011b). Semantic conceptions of information. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy (Spring 2013 Edition).
  9. Golan, A. (1994). A multivariable stochastic theory of size distribution of firms with empirical evidence. Advances in Econometrics, 10, 1–46.
  10. Golan, A. (2008). Information and entropy econometrics: A review and synthesis. Foundations and Trends in Econometrics, 2(1–2), 1–145.
  11. Golan, A. (2012). On the foundations and philosophy of info-metrics. In Lecture Notes in Computer Science (LNCS), Vol. 7318.
  12. Golan, A., & Dose, V. (2001). A generalized information theoretical approach to tomographic reconstruction. Journal of Physics A: Mathematical and General, 34, 1271–1283.
  13. Golan, A., Judge, G., & Miller, D. (1996a). Maximum entropy econometrics: Robust estimation with limited data. New York: Wiley.
  14. Golan, A., Judge, G. G., & Perloff, J. (1996b). A generalized maximum entropy approach to recovering information from multinomial response data. Journal of the American Statistical Association, 91, 841–853.
  15. Golan, A., Judge, G., & Perloff, J. (1997). Estimation and inference with censored and ordered multinomial response data. Journal of Econometrics, 79, 23–51.
  16. Hartley, R. V. L. (1928). Transmission of information. Bell System Technical Journal, 7(3), 535–563.
  17. Jaynes, E. T. (1957a). Information theory and statistical mechanics. Physical Review, 106, 620–630.
  18. Jaynes, E. T. (1957b). Information theory and statistical mechanics II. Physical Review, 108, 171–190.
  19. Proceedings. (2011). Info-Metrics Institute Workshop on the Philosophy of Information. American University, Washington, DC.
  20. Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379–423.
  21. Shore, J. E., & Johnson, R. W. (1980). Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Transactions on Information Theory, IT-26(1), 26–37.
  22. Toussaint, U. V., Golan, A., & Dose, V. (2004). Maximum entropy decomposition of quadrupole mass spectra. Journal of Vacuum Science and Technology A, 22(2), 401–406.
  23. van Benthem, J. (2011). Logical dynamics of information and interaction. Cambridge: Cambridge University Press.
  24. van Benthem, J., & Adriaans, P. (2008). Philosophy of information. Amsterdam: North Holland.
  25. Zellner, A. (1988). Optimal information processing and Bayes' theorem. American Statistician, 42, 278–284.
  26. Zellner, A. (2002). Information processing and Bayesian analysis. Journal of Econometrics, 107, 41–50.

Copyright information

© Springer Science+Business Media Dordrecht 2013

Authors and Affiliations

  1. Info-Metrics Institute and Department of Economics, American University, Washington, DC, USA
