
Information Dynamics


Every block of stone has a statue inside it and it is the task of the sculptor to discover it.

I saw the angel in the marble and carved until I set him free.

Michelangelo (1475–1564)

Abstract

Though we have access to a wealth of information, the central issue is always how to process it: how to make sense of all we observe and know. Consider the English alphabet: there are 26 letters, but unless we put those letters together in a meaningful way, they convey no information. There are infinitely many ways of combining the letters, only a small number of which make sense, and only some of those convey exactly what we wish to convey, though the message may still be interpreted differently by different individuals. The same issue arises with information: how can we process the information we have? How can we infer and reason under conditions of incomplete observed information? In his seminal book on the philosophy of information, Floridi (2011a) raises a number of open questions. I discuss one of these questions here: how to process information. To do so, I take the more realistic view that information is always limited, incomplete and possibly noisy. I define types of information, relate them to Floridi's definitions, and discuss a basic formulation for processing information within a unified framework, relating it to some of the basic concepts discussed in the book.
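To make the alphabet point concrete, Hartley's (1928) measure counts the distinguishable messages an alphabet can form: a string of length n over s symbols has s^n possible realizations and can therefore carry at most n log2(s) bits, while only a vanishing fraction of those strings is meaningful. A minimal sketch (the string length n = 10 is an arbitrary illustrative choice, not a value from the paper):

```python
import math

# Hartley's (1928) measure of information capacity: a sequence of n
# symbols from an alphabet of s symbols has s**n possible realizations,
# so it can carry at most n * log2(s) bits. Which few of those strings
# are meaningful is the processing problem raised in the abstract.

s = 26                        # letters in the English alphabet
n = 10                        # illustrative string length (assumed)

num_strings = s ** n          # all possible letter sequences
capacity_bits = n * math.log2(s)

print(f"{num_strings:,} possible strings of length {n}")
print(f"Hartley capacity: {capacity_bits:.1f} bits")
# -> 141,167,095,653,376 possible strings of length 10
# -> Hartley capacity: 47.0 bits
```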


Notes

  1. For a related discussion of open questions in the philosophy of information (PI), see also Floridi (2004), Crnkovic and Hofkirchner (2011), and a complementary summary of open questions at the intersection of the philosophy of information and computation (Adriaans 2011), as well as the different chapters in van Benthem and Adriaans (2008).

  2. See also the discussion and definitions of “information” in Floridi (2011a, b) and Adriaans (forthcoming), as well as the different definitions in van Benthem and Adriaans (2008), the nice discussion in Caticha (2012), and the discussion in Golan (2008). For a cross-disciplinary discussion of information and information dynamics, see also the papers in the Info-Metrics proceedings on the philosophy of information (2011) and van Benthem (2011).

  3. For some early discussion of the statistical properties of that approach, see, for example, Golan et al. (1996a, b, 1997); for some early applications in the natural sciences, see Golan and Dose (2001) and Toussaint et al. (2004).

  4. Whether it is a purely “logical” contradiction or a “statistical” one is another interesting question, but the point is that we need an inferential approach that allows us to handle such problems.

  5. Stated simply, entropic methods are designed to process information in the form of expected-value constraints. Observed sample averages, however, are not expected values. The question then becomes how one can use information in the form of sample averages within an entropic formalism. One answer is the information-theoretic generalized cross entropy (GCE) framework (developed in previous publications); a minimal sketch of the underlying entropic step follows these notes.
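The sketch below is mine, not the paper's, and uses illustrative numbers. It shows the noise-free entropic step the note describes: a sample average m is treated as the expected-value constraint E[X] = m, and classical maximum entropy (Jaynes 1957a) yields probabilities exponential in a Lagrange multiplier, which is recovered numerically.

```python
import numpy as np
from scipy.optimize import brentq

# Minimal sketch of classical maximum entropy inference (Jaynes 1957a):
# treat an observed sample average m as the expected-value constraint
# E[X] = m and choose the distribution p maximizing entropy subject to
# it. The solution is exponential in a Lagrange multiplier lam:
# p_k proportional to exp(-lam * x_k).

x = np.arange(1, 7)      # hypothetical support: faces of a six-sided die
m = 4.5                  # hypothetical observed sample average

def mean_gap(lam):
    """E_p[X] - m for the exponential-family solution at multiplier lam."""
    w = np.exp(-lam * x)
    p = w / w.sum()
    return float(p @ x) - m

lam = brentq(mean_gap, -10.0, 10.0)  # root-find the multiplier
p = np.exp(-lam * x)
p /= p.sum()
print("multiplier:", round(lam, 4))
print("max-entropy probabilities:", p.round(4))
```

The GCE framework mentioned in the note relaxes exactly this step: rather than forcing the model mean to equal the sample average, each constraint is augmented with a noise term carrying its own probability distribution, so the sketch above corresponds to the noise-free limit.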

References

  • Adriaans, P. (2011). Some open problems in the study of information and computation. http://staff.science.uva.nl/~pietera/open_problems.html.

  • Adriaans, P. (forthcoming). Philosophy of information. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy (Fall 2013 ed.).

  • Caticha, A. (2012). Entropic inference and the foundations of physics (monograph commissioned by the 11th Brazilian Meeting on Bayesian Statistics, EBEB-2012). São Paulo: USP Press. http://www.albany.edu/physics/ACaticha-EIFP-book.pdf.

  • Crnkovic, G. D., & Hofkirchner, W. (2011). Floridi’s open problems in philosophy of information, ten years later. Information, 2, 327–359.

  • Floridi, L. (2004). Open problems in the philosophy of information. Metaphilosophy, 35, 554–582.

  • Floridi, L. (2010). Information: A very short introduction. Oxford: Oxford University Press.

  • Floridi, L. (2011a). The philosophy of information. Oxford: Oxford University Press.

  • Floridi, L. (2011b). Semantic conceptions of information. In E. N. Zalta (Ed.), The Stanford encyclopedia of philosophy (Spring 2013 ed.).

  • Golan, A. (1994). A multivariable stochastic theory of size distribution of firms with empirical evidence. Advances in Econometrics, 10, 1–46.

  • Golan, A. (2008). Information and entropy econometrics: A review and synthesis. Foundations and Trends in Econometrics, 2(1–2), 1–145.

  • Golan, A. (2012). On the foundations and philosophy of info-metrics. Lecture Notes in Computer Science, 7318.

  • Golan, A., & Dose, V. (2001). A generalized information theoretical approach to tomographic reconstruction. Journal of Physics A: Mathematical and General, 34, 1271–1283.

  • Golan, A., Judge, G., & Miller, D. (1996a). Maximum entropy econometrics: Robust estimation with limited data. New York: Wiley.

  • Golan, A., Judge, G. G., & Perloff, J. (1996b). A generalized maximum entropy approach to recovering information from multinomial response data. Journal of the American Statistical Association, 91, 841–853.

  • Golan, A., Judge, G., & Perloff, J. (1997). Estimation and inference with censored and ordered multinomial response data. Journal of Econometrics, 79, 23–51.

  • Hartley, R. V. L. (1928). Transmission of information. Bell System Technical Journal, 7(3), 535–563.

  • Jaynes, E. T. (1957a). Information theory and statistical mechanics. Physical Review, 106, 620–630.

  • Jaynes, E. T. (1957b). Information theory and statistical mechanics II. Physical Review, 108, 171–190.

  • Proceedings. (2011). Info-Metrics Institute workshop on the philosophy of information. American University, Washington, DC. http://www.american.edu/cas/economics/info-metrics/workshop/proceedings-2011-october.cfm.

  • Shannon, C. E. (1948). A mathematical theory of communication. Bell System Technical Journal, 27, 379–423.

  • Shore, J. E., & Johnson, R. W. (1980). Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy. IEEE Transactions on Information Theory, IT-26(1), 26–37.

  • Toussaint, U. V., Golan, A., & Dose, V. (2004). Maximum entropy decomposition of quadrupole mass spectra. Journal of Vacuum Science and Technology A, 22(2), 401–406.

  • van Benthem, J. (2011). Logical dynamics of information and interaction. Cambridge: Cambridge University Press.

  • van Benthem, J., & Adriaans, P. (2008). Philosophy of information. Amsterdam: North-Holland.

  • Zellner, A. (1988). Optimal information processing and Bayes’ theorem. The American Statistician, 42, 278–284.

  • Zellner, A. (2002). Information processing and Bayesian analysis. Journal of Econometrics, 107, 41–50.


Acknowledgments

I thank Luciano Floridi and Ariel Caticha for many enchanting conversations on the topics discussed here and for their comments on earlier versions of this paper. I also benefited from comments during recent seminars on the topic. Finally, I thank Patrick Allo for his thoughtful comments, and the Editor, Tony Beavers.

Author information

Correspondence to Amos Golan.


Cite this article

Golan, A. Information Dynamics. Minds & Machines 24, 19–36 (2014). https://doi.org/10.1007/s11023-013-9326-2
