Abstract
We develop a variational principle between mean dimension theory and rate distortion theory. We consider a minimax problem for the rate distortion dimension with respect to two variables (metrics and measures), and prove that the minimax value equals the mean dimension for dynamical systems with the marker property. The proof exhibits a new combination of ergodic theory, rate distortion theory and geometric measure theory. Along the way, we also show that if a dynamical system has the marker property then it admits a metric for which the upper metric mean dimension is equal to the mean dimension.
Notes
Throughout the paper we assume that the base of the logarithm is two. The natural logarithm (i.e. the logarithm of base e) is written as \(\ln (\cdot )\).
\(f{:}\,\mathcal {X}\rightarrow (C_N)^\mathbb {Z}\) is called an embedding of a dynamical system if it is a topological embedding and satisfies \(f\circ T = \sigma \circ f\).
The idea of introducing mean Hausdorff dimension was partly motivated by the study of Kawabata–Dembo [KD94, Proposition 3.2]. Roughly speaking, their result [KD94, Proposition 3.2] corresponds to Step 2.2 for \((\mathcal {X}, T) = (A^\mathbb {Z},\mathrm {shift})\) with \(A\subset \mathbb {R}^n\). In other words, Step 2.2 is a generalization of their result to arbitrary dynamical systems.
There is also a small issue concerning the tame growth of covering numbers condition, but we ignore it here.
We always assume that the \(\sigma \)-algebra of a finite set is the largest one (the set of all subsets).
The continuity of \(\rho \) and \(\lambda \) is inessential, but we assume it for simplicity. Indeed, in our applications, \(\mathcal {X}= \mathcal {Y}\), \(\rho \) is a distance function and \(\lambda \) is a constant.
E.g. expanding signals in a wavelet basis, discarding small terms and quantizing the remaining terms.
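The wavelet-based scheme mentioned in this note can be illustrated by a minimal sketch. The function names, the one-level Haar transform, and the threshold/step parameters below are all illustrative choices, not taken from the paper.

```python
# Illustrative sketch: expand a signal in a (one-level, orthonormal) Haar
# wavelet basis, discard coefficients below a threshold, and quantize the
# survivors on a uniform grid.  All parameters here are made up.

def haar_transform(signal):
    """One level of the orthonormal Haar transform of an even-length signal."""
    s = 2 ** 0.5
    averages = [(a + b) / s for a, b in zip(signal[::2], signal[1::2])]
    details = [(a - b) / s for a, b in zip(signal[::2], signal[1::2])]
    return averages + details

def compress(signal, threshold=0.1, step=0.25):
    """Zero out small Haar coefficients; quantize the rest with grid size `step`."""
    coeffs = haar_transform(signal)
    return [round(c / step) * step if abs(c) >= threshold else 0.0
            for c in coeffs]

print(compress([1.0, 1.1, 0.0, 0.05]))
```

Only coefficients carrying significant energy survive thresholding, which is the source of the rate savings alluded to above.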
Indeed here we use only \(\underline{{\mathrm {mdim}}}_{\mathrm {H}}(\mathcal {X},T,d) < s\).
An important point for us is that the statement is valid for each fixed \(\delta \) (not only in the limit \(\delta \rightarrow 0\)).
Since we assume that A is a finite set, this just means that \(\mu _n(x)\rightarrow \mu (x)\) at each \(x\in A\).
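On a finite set this pointwise convergence coincides with convergence in total variation, which the following toy computation (with made-up measures on a three-point set; all names are illustrative) makes concrete.

```python
# On a finite alphabet A, pointwise convergence mu_n(x) -> mu(x) is the same
# as convergence in total variation.  Toy illustration on A = {0, 1, 2}.

def tv_distance(mu, nu):
    """Total variation distance between two pmfs on a common finite set."""
    return 0.5 * sum(abs(mu[a] - nu[a]) for a in mu)

mu = {0: 0.5, 1: 0.3, 2: 0.2}
# A sequence of perturbed measures converging to mu pointwise.
mu_n = [{0: 0.5 + 1 / n, 1: 0.3 - 1 / n, 2: 0.2} for n in (10, 100, 1000)]
print([round(tv_distance(m, mu), 4) for m in mu_n])
```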
The coupling between X(k) and Y is given by the probability mass function
$$\begin{aligned} \sum _{x'\in A^m} \pi _k(x,x') \mathbb {P}(Y=y|\mathcal {P}^m(X)=x'), \end{aligned}$$which converges to \(\mathbb {P}(\mathcal {P}^m(X)=x, Y=y)\).
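The composition of a coupling with a conditional law, as in the displayed formula, can be checked on a toy example. The distributions and alphabets below are made up purely for illustration; the point is that \(\sum_{x'} \pi(x,x')\,\mathbb{P}(Y=y\mid X=x')\) is again a probability mass function coupling the two variables.

```python
# Toy illustration of the composed coupling: given a coupling pi of (X', X)
# and a conditional law P(Y | X), the sum over x' of pi(x, x') P(Y=y | X=x')
# defines a joint pmf of (X', Y).  All distributions here are made up.

def compose(pi, cond):
    """pi[(x, xp)] couples X' and X; cond[(xp, y)] = P(Y=y | X=xp)."""
    out = {}
    for (x, xp), p in pi.items():
        for (xp2, y), q in cond.items():
            if xp2 == xp:
                out[(x, y)] = out.get((x, y), 0.0) + p * q
    return out

pi = {("a", "a"): 0.4, ("a", "b"): 0.1, ("b", "a"): 0.1, ("b", "b"): 0.4}
cond = {("a", 0): 0.9, ("a", 1): 0.1, ("b", 0): 0.2, ("b", 1): 0.8}
joint = compose(pi, cond)
print(joint)  # a pmf on pairs (x, y); total mass 1
```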
“Open” is easy. To show “dense”, take arbitrary \(f\in C(\mathcal {X},V)\) and \(\delta >0\). Choose \(0<\varepsilon <1/n\) such that \(d(x,y)<\varepsilon \) implies \(\left| \left| f(x)-f(y)\right| \right| <\delta \). There exists an \(\varepsilon \)-embedding \(\pi {:}\,\mathcal {X}\rightarrow P\) into a simplicial complex P of dimension \(\le \dim \mathcal {X}\). From Lemma 5.3 (2) and (3) in Section 5.2 we can find a linear embedding \(g:P\rightarrow V\) with \(\left| \left| g(\pi (x))-f(x)\right| \right| < \delta \). From Lemma 5.3 (1), \(\log \#(g(P), \left| \left| \cdot \right| \right| ,\varepsilon ')/\log (1/\varepsilon ')\) is less than \(\dim \mathcal {X} + 1/n\) for sufficiently small \(\varepsilon '\). This shows \(g\circ \pi \in A_n\), so \(A_n\) is dense. Thus the main point of the proof of Theorem 5.2 is a “polyhedral approximation”. The basic idea of the proof of Theorem 5.1 is also a polyhedral approximation, but carried out in a much more accurate way. See Section 5.3.
Here is a technical point. The number \(A(P_n*Q_n)\) is defined by using the simplicial complex structure of \(P_n*Q_n\). We use the natural simplicial complex structure of the join \(P_n*Q_n\) here, not its subdivision introduced in (5.8).
References
T. Berger. Rate Distortion Theory: A Mathematical Basis for Data Compression. Prentice-Hall, Englewood Cliffs, NJ (1971).
T.M. Cover and J.A. Thomas. Elements of Information Theory, 2nd edition. Wiley, New York (2006).
E.I. Dinaburg. A correlation between topological entropy and metric entropy. Dokl. Akad. Nauk SSSR, 190 (1970), 19–22.
M. Effros, P.A. Chou and R.M. Gray. Variable-rate source coding theorems for stationary nonergodic sources. IEEE Trans. Inf. Theory, 40 (1994), 1920–1925.
M. Einsiedler and T. Ward. Ergodic Theory with a View Towards Number Theory, Graduate Texts in Mathematics, 259. Springer, London (2011).
T.N.T. Goodman. Relating topological entropy and measure entropy. Bull. London Math. Soc., 3 (1971), 176–180.
L.W. Goodwyn. Topological entropy bounds measure-theoretic entropy. Proc. Amer. Math. Soc., 23 (1969), 679–688.
R.M. Gray. Entropy and Information Theory. Springer-Verlag, New York (1990).
M. Gromov. Topological invariants of dynamical systems and spaces of holomorphic maps: I. Math. Phys. Anal. Geom., 2 (1999), 323–415.
Y. Gutman. Mean dimension and Jaworski-type theorems. Proceedings of the London Mathematical Society, (4)111 (2015), 831–850.
Y. Gutman, E. Lindenstrauss and M. Tsukamoto. Mean dimension of \(\mathbb{Z}^k\)-actions. Geom. Funct. Anal., (3)26 (2016), 778–817.
Y. Gutman, Y. Qiao and M. Tsukamoto. Application of signal analysis to the embedding problem of \(\mathbb{Z}^k\)-actions. arXiv:1709.00125, to appear in Geom. Funct. Anal.
Y. Gutman and M. Tsukamoto. Embedding minimal dynamical systems into Hilbert cubes, preprint. arXiv:1511.01802.
J.D. Howroyd. On dimension and on the existence of sets of finite, positive Hausdorff measures. Proc. London Math. Soc., 70 (1995), 581–604.
R.I. Jewett. The prevalence of uniquely ergodic systems. J. Math. Mech., 19 (1970), 717–729.
T. Kawabata and A. Dembo. The rate distortion dimension of sets and measures. IEEE Trans. Inf. Theory., (5)40 (1994), 1564–1572.
A.N. Kolmogorov and V.M. Tihomirov. \(\varepsilon \)-entropy and \(\varepsilon \)-capacity of sets in functional spaces. Amer. Math. Soc. Transl., (2)33 (1963), 277–367.
W. Krieger. On unique ergodicity. In: Proc. Sixth Berkeley Symposium on Mathematical Statistics and Probability. Univ. of California Press (1970), pp. 327–346.
A. Leon-Garcia, L.D. Davisson and D.L. Neuhoff. New results on coding of stationary nonergodic sources. IEEE Trans. Inform. Theory., 25 (1979), 137–144.
H. Li and B. Liang. Mean dimension, mean rank and von Neumann–Lück rank. J. Reine Angew. Math., 739 (2018), 207–240.
E. Lindenstrauss. Mean dimension, small entropy factors and an embedding theorem. Inst. Hautes Études Sci. Publ. Math. 89 (1999), 227–262.
E. Lindenstrauss and M. Tsukamoto. Mean dimension and an embedding problem: an example. Israel J. Math., 199 (2014), 573–584.
E. Lindenstrauss and M. Tsukamoto. From rate distortion theory to metric mean dimension: variational principle. IEEE Trans. Inf. Theory, (5)64 (2018), 3590–3609.
E. Lindenstrauss and B. Weiss. Mean topological dimension. Israel J. Math., 115 (2000), 1–24.
P. Mattila. Geometry of Sets and Measures in Euclidean Spaces, Fractals and Rectifiability. Cambridge Studies in Advanced Mathematics, 44. Cambridge University Press, Cambridge (1995).
T. Meyerovitch and M. Tsukamoto. Expansive multiparameter actions and mean dimension. Trans. Amer. Math. Soc., 371 (2019), 7275–7299.
M. Misiurewicz. A short proof of the variational principle for \(\mathbb{Z}^N_+\) actions on a compact space. In: International Conference on Dynamical Systems in Mathematical Physics (Rennes, 1975), Astérisque, vol. 40, pp. 145–157, Soc. Math. France, Paris (1976).
L. Pontrjagin and L. Schnirelmann. Sur une propriété métrique de la dimension. Ann. Math., 33 (1932), 152–162.
A. Rényi. On the dimension and entropy of probability distributions. Acta Math. Sci. Hung., 10 (1959), 193–215.
F.E. Rezagah, S. Jalali, E. Erkip and H.V. Poor. Rate-distortion dimension of stochastic processes. arXiv:1607.06792.
K. Schmidt. Dynamical Systems of Algebraic Origin, Progress in Mathematics, 128, Birkhäuser Verlag, Basel (1995).
C.E. Shannon. A mathematical theory of communication. Bell Syst. Tech. J., 27 (1948), 379–423, 623–656.
C.E. Shannon. Coding theorems for a discrete source with a fidelity criterion. IRE Nat. Conv. Rec. Pt. 4, pp. 142–163 (1959).
M. Tsukamoto. Deformation of Brody curves and mean dimension. Ergod. theory Dyn. Syst. 29 (2009), 1641–1657.
M. Tsukamoto. Mean dimension of the dynamical system of Brody curves. Invent. Math., 211 (2018), 935–968.
M. Tsukamoto. Large dynamics of Yang–Mills theory: mean dimension formula. J. Anal. Math., 134 (2018), 455–499.
A. Velozo and R. Velozo. Rate distortion theory, metric mean dimension and measure theoretic entropy. arXiv:1707.05762.
C. Villani. Optimal Transport: Old and New. Springer-Verlag, Berlin (2009).
Y. Wu and S. Verdú. Rényi information dimension: fundamental limits of almost lossless analogue compression. IEEE Trans. Inf. Theory, (8)56 (2010), 3721–3747.
Acknowledgements
This project was initiated at the Banff International Research Station meeting “Mean Dimension and Sofic Entropy Meet Dynamical Systems, Geometric Analysis and Information Theory” in 2017. We thank BIRS for hosting this workshop, and for providing ideal conditions for collaborations. We also thank the referee for her/his helpful comments.
E.L. was partially supported by ISF Grant 891/15. M.T. was partially supported by JSPS KAKENHI 18K03275.
Cite this article
Lindenstrauss, E., Tsukamoto, M. Double variational principle for mean dimension. Geom. Funct. Anal. 29, 1048–1109 (2019). https://doi.org/10.1007/s00039-019-00501-8
Keywords and phrases
- Dynamical system
- Mean dimension
- Rate distortion dimension
- Variational principle
- Invariant measure
- Geometric measure theory