Helvik, T. & Lindgren, K. J Stat Phys (2014) 155: 687. doi:10.1007/s10955-014-0972-4
Whether a system is to be considered complex or not depends on how one searches for correlations. We propose a general scheme for the calculation of entropies in lattice systems that offers high flexibility in how correlations are successively taken into account. Compared to the traditional approach for estimating the entropy density, in which successive approximations build on step-wise extensions of blocks of symbols, we show that one can take larger steps when collecting the statistics needed to calculate the entropy density of the system. In one dimension this means that, instead of a single sweep over the system in which states are read sequentially, one takes several sweeps with larger steps so that eventually the whole lattice is covered. The information in correlations is thereby captured in a different way, and in some situations this leads to considerably faster convergence of the entropy density estimate as a function of the size of the configurations used in the estimate. The formalism is exemplified both with a free energy minimisation scheme for the two-dimensional Ising model and with the increasingly complex spatial correlations generated by the time evolution of elementary cellular automaton rule 60.
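The traditional approach the abstract contrasts against can be made concrete: for a stationary one-dimensional symbol sequence, the entropy density is approximated by differences of block entropies, h_n = H(n+1) − H(n), with blocks read in a single sequential sweep. The sketch below (an illustrative reconstruction of that standard estimator, not code from the paper; the function names are my own) shows the idea for a binary sequence.

```python
import math
from collections import Counter

def block_entropy(seq, n):
    """Shannon entropy (in bits) of the length-n blocks obtained by
    sliding a window sequentially over the sequence."""
    blocks = [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]
    counts = Counter(blocks)
    total = len(blocks)
    return -sum(c / total * math.log2(c / total) for c in counts.values())

def entropy_density_estimate(seq, n):
    """Traditional estimate h_n = H(n+1) - H(n); for a stationary source
    this converges to the entropy density as the block length n grows."""
    return block_entropy(seq, n + 1) - block_entropy(seq, n)
```

For a period-2 sequence such as 0101…, already h_1 is essentially zero, since length-2 blocks fully determine the next symbol; the scheme proposed in the paper instead gathers these statistics over several strided sweeps of the lattice, which can shorten the block length needed for convergence.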
Keywords: Entropy density · Lattice systems · Correlations · Complexity · Spin systems · Cellular automata