Bounded Turing Reductions and Data Processing Inequalities for Sequences

Theory of Computing Systems

Abstract

A data processing inequality states that the quantity of shared information between two entities (e.g., signals, strings) cannot be significantly increased when one of the entities is processed by certain kinds of transformations. In this paper, we prove several data processing inequalities for sequences, where the transformations are bounded Turing functionals and the shared information is measured by the lower and upper mutual dimensions between sequences. We show that, for all sequences X, Y, and Z, if Z is computable Lipschitz reducible to X, then

$$mdim(Z:Y) \leq mdim(X:Y) \quad\text{and}\quad Mdim(Z:Y) \leq Mdim(X:Y).$$
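
For orientation, here is a hedged restatement of the quantities being compared, assuming the characterization of mutual dimension for sequences used in Case and Lutz's earlier work on mutual dimension (the paper itself gives the precise definitions):

$$mdim(X:Y) = \liminf_{n \to \infty} \frac{I(X \upharpoonright n : Y \upharpoonright n)}{n}, \qquad Mdim(X:Y) = \limsup_{n \to \infty} \frac{I(X \upharpoonright n : Y \upharpoonright n)}{n},$$

where \(I(u : w)\) denotes the algorithmic mutual information between finite strings (roughly \(K(u) + K(w) - K(u, w)\), with \(K\) the prefix-free Kolmogorov complexity). Read this way, the inequality says that a sequence Z obtained from X by a computable Lipschitz reduction shares no greater an asymptotic density of information with any sequence Y than X itself does.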

We also show how to derive different data processing inequalities by making adjustments to the computable bounds on the use of a Turing functional. The yield of a Turing functional \(\Phi^S\) with access to at most n bits of the oracle S is the smallest input \(m \in \mathbb{N}\) such that \(\Phi^{S \upharpoonright n}(m)\uparrow\). We show how to derive reverse data processing inequalities (i.e., data processing inequalities where the transformation may significantly increase the shared information between two entities) for sequences by applying computable bounds to the yield of a Turing functional.
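
To make the notion of yield concrete, the following small Python sketch (not from the paper; the names yield_of, use, cl_use, and search_limit are illustrative) works in a simplified model: the functional halts on every input when given the full oracle and, on input m, queries exactly the first use(m) oracle bits. Under that assumption, \(\Phi^{S \upharpoonright n}(m)\) diverges exactly when use(m) > n, so the yield at n is the least such m.

def yield_of(use, n, search_limit=10**6):
    """Smallest input m whose computation diverges when the oracle is cut
    off after n bits, in a simplified model of a bounded Turing functional.

    Simplifying assumption (illustrative only): on input m the functional
    halts given the full oracle and queries exactly the first use(m) oracle
    bits, so Phi^{S|n}(m) diverges precisely when use(m) > n.
    """
    for m in range(search_limit):
        if use(m) > n:       # too few oracle bits available: computation diverges
            return m
    return None              # no divergent input found below the search limit

# A computable-Lipschitz-style use bound: at most the input plus a constant.
cl_use = lambda m: m + 5
print(yield_of(cl_use, 20))  # -> 16, the least m with m + 5 > 20

In this simplified model, a use bound of the computable Lipschitz kind (the input plus a constant) forces the yield at n to be at least n minus that constant; this is intended only to illustrate the kind of computable bound on the yield that the abstract refers to.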


Acknowledgments

I would like to thank Xiang Huang, Jack Lutz, Timothy McNicholl, and Donald Stull for useful discussions. I also thank two anonymous reviewers for their helpful comments and suggestions.

Corresponding author

Correspondence to Adam Case.

Additional information

This research was supported in part by National Science Foundation Grants 1247051 and 1545028. A preliminary version of part of this work was presented at the 11th International Conference on Computability, Complexity, and Randomness.

Cite this article

Case, A. Bounded Turing Reductions and Data Processing Inequalities for Sequences. Theory Comput Syst 62, 1586–1598 (2018). https://doi.org/10.1007/s00224-017-9804-7
