1 Introduction

Rényi entropy [1] and complexity measures [2,3,4,5,6,7,8,9,10,11] have proved very useful in studying electronic structure (see e.g. [2, 12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27]). These quantities are generally used in position space, though there have also been several investigations in momentum space (see e.g. [2, 12,13,14,15,16,17,18,19,20, 28]). Moreover, the sum of the position-space and momentum-space Rényi entropies has also turned out to be beneficial (see e.g. [2, 12,13,14,15,16]). Phase-space Rényi entropy has also been defined and several inequalities have been derived for it [29]. Recently, phase-space Shannon and Fisher information have been defined and analyzed in a combined information-theoretical and thermodynamic view of density functional theory (DFT) [30,31,32].

In this paper phase-space Rényi entropy and complexity are defined, and the thermodynamic picture of density functional theory proposed by Ghosh, Berkowitz and Parr [33] is utilized. According to this interpretation, DFT can be regarded as a local thermodynamics; even a local temperature can be defined. The method has extensions and applications [34,35,36,37,38,39,40,41,42,43,44,45,46,47].

The structural entropy defined by Pipek, Varga and Nagy [48,49,50], the LMC statistical complexity introduced by López-Ruiz, Mancini and Calbet [3] and the generalized complexity proposed by López-Ruiz, Nagy, Romera and Sanudo [9, 10] are now extended to the phase space. It is shown that the logarithm of the phase-space LMC complexity reduces, apart from an additive constant, to the position-space structural entropy defined by Pipek et al., provided that the local temperature is constant.

2 Rényi entropy and statistical complexity

Consider a D-dimensional density function \(f(\mathbf{r})\) which is nonnegative and normalized to 1 (\(\int f(\mathbf{r}) d\mathbf{r}=1\)). The Rényi entropy of order q is given by

$$\begin{aligned} R_f^{(q)} = \frac{1}{1-q}\ln \int [f(\mathbf{r})]^{q} d\mathbf{r}, \quad \text {for } 0<q <\infty \quad \text {and } q \ne 1 , \end{aligned}$$
(1)

where \(\mathbf{r}\) stands for \(r_1, ..., r_D\). Obviously, the limit \(q\rightarrow 1\) provides the Shannon entropy [51]

$$\begin{aligned} S_f= -\int f(\mathbf{r})\ln f(\mathbf{r})d\mathbf{r} . \end{aligned}$$
(2)

The so-called LMC complexity [4, 5] is defined by

$$\begin{aligned} C_f^{LMC}=H_fQ_f, \end{aligned}$$
(3)

where

$$\begin{aligned} H_f=e^{S_f} \end{aligned}$$
(4)

and

$$\begin{aligned} Q_f=e^{-R_f^{(2)}} . \end{aligned}$$
(5)

That is, it contains Rényi entropies of \(q \rightarrow 1\) and \(q=2\).
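As a quick numerical illustration of Eqs. (1)-(5) (a sketch, not part of the derivation), the LMC complexity of a one-dimensional Gaussian density can be evaluated on a grid; analytically it equals \(\sqrt{e/2} \approx 1.166\) independently of the width, a consequence of the invariance under rescaling transformations discussed later in this section.

```python
import numpy as np

# Sketch: evaluate Eqs. (1)-(5) on a grid for a 1D Gaussian density.
# Analytically S = (1/2) ln(2*pi*e*s^2) and R^(2) = ln(2*s*sqrt(pi)),
# so C^LMC = exp(S - R^(2)) = sqrt(e/2), independent of the width s.

def renyi(f, dx, q):
    """Rényi entropy of order q (q != 1), Eq. (1), on a uniform grid."""
    return np.log(np.sum(f**q) * dx) / (1.0 - q)

def shannon(f, dx):
    """Shannon entropy, Eq. (2), the q -> 1 limit of Eq. (1)."""
    return -np.sum(f * np.log(f)) * dx

s = 1.7                                   # arbitrary width
x = np.linspace(-25, 25, 100001)
dx = x[1] - x[0]
f = np.exp(-x**2 / (2 * s**2)) / np.sqrt(2 * np.pi * s**2)

C_LMC = np.exp(shannon(f, dx)) * np.exp(-renyi(f, dx, 2.0))  # Eqs. (3)-(5)
print(C_LMC, np.sqrt(np.e / 2))           # both ~ 1.1658
```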

Previously, Pipek, Varga and Nagy [48, 49] introduced the structural entropy (see also [49, 50]):

$$\begin{aligned} S_f^{str} = R^{(1)}_f - R^{(2)}_f . \end{aligned}$$
(6)

Obviously,

$$\begin{aligned} S_f^{str} = \ln {C_f^{LMC}} . \end{aligned}$$
(7)

Replacing the Shannon entropy in Eq.(3) with the Rényi entropy of order q, a q-dependent measure of complexity is obtained [9]:

$$\begin{aligned} C_f^{(q)}=H_f^{(q)}Q_f, \quad \text {with} \quad H_f^{(q)} = e^{R_f^{(q)}} . \end{aligned}$$
(8)

\(C_f^{(q)}\) tends to the LMC complexity \(C_f^{LMC}\) in the limit \(q\rightarrow 1\).

A further generalization leads to a two-parameter measure of complexity [10]:

$$\begin{aligned} \tilde{C}^{(q_1,q_2)}_f=e^{R_f^{(q_1)}-R_f^{(q_2)}},\quad \quad 0<q_1,q_2<\infty . \end{aligned}$$
(9)

Obviously, \(\tilde{C}^{(1,2)}_f=C_f^{LMC}\) and \(\tilde{C}^{(q,2)}_f=C_f^{(q)}\).
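These limits can be checked numerically; below is a small sketch assuming, purely for illustration, a one-dimensional exponential density \(f(x)=e^{-x}\), \(x \ge 0\), for which \(R_f^{(q)} = \ln q/(q-1)\) is known in closed form.

```python
import numpy as np

# Sketch: check the q -> 1 limit of Eq. (9) on a 1D exponential density
# f(x) = exp(-x), x >= 0 (an illustrative choice). Analytically
# R^(q) = ln(q)/(q - 1), S = 1 and R^(2) = ln 2, so C^LMC = e/2.
x = np.linspace(0.0, 50.0, 500001)
dx = x[1] - x[0]
f = np.exp(-x)

def trap(y):
    """Trapezoidal rule on the uniform grid x."""
    return (np.sum(y) - 0.5 * (y[0] + y[-1])) * dx

def renyi(q):
    return np.log(trap(f**q)) / (1.0 - q)

S = trap(-f * np.log(f))                   # Shannon entropy, = 1 analytically
C_LMC = np.exp(S - renyi(2.0))             # tilde-C^(1,2) of Eq. (9)
C_q   = np.exp(renyi(1.001) - renyi(2.0))  # tilde-C^(q,2) just above q = 1

print(C_LMC, C_q)                          # both close to e/2 ~ 1.35914
```

The trapezoidal correction at the endpoints matters here: the factor \(1/(1-q)\) amplifies any quadrature error strongly when \(q\) is close to 1.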

The generalized statistical complexity measure \(\tilde{C}^{(q_1,q_2)}_f\) has several important properties: inversion symmetry, monotonicity, a universal bound, invariance under translations and rescaling transformations, and near-continuity. These properties are inherent in any analysis based on Rényi entropies; therefore, the localization-map parameters of Pipek and coworkers already possess most of them. The generalized statistical complexity measure has been applied to studying different quantum systems [9, 10].

3 Phase-space Rényi entropy and complexity

Phase-space Rényi entropy can be defined from the Wigner [52] and the Husimi [53] distribution functions. Phase-space Rényi entropy based on a special family of phase-space distribution functions has also been introduced and several theorems have been proved [29]. These distribution functions are nonnegative and produce the correct marginal distribution functions, that is, the integration of the distribution function with respect to the position variables leads to the momentum-space distribution function and the integration with respect to the momentum variables provides the position-space distribution function. We can remark here that the Wigner distribution function also yields the correct marginal distribution functions, but is not everywhere nonnegative. Wigner showed that bilinear distribution functions are not universally nonnegative. However, Cohen and Zaparovanny [54,55,56] proved that there exist distribution functions that are not bilinear but nonnegative and give the correct marginal distributions.

In this paper a phase-space distribution function derived by Ghosh, Berkowitz and Parr [33] is utilized. As we will see below, this distribution function provides only the correct marginal position-space distribution function; the marginal momentum-space distribution function differs from the correct one. Instead, in the Ghosh-Berkowitz-Parr (GBP) theory the kinetic energy density equals the correct one.

Now the GBP theory [33] is summarized: an electronic system with density \(\varrho\) is taken and a phase-space distribution function \(g(\mathbf{r}, \mathbf{p})\) with the correct position-space marginal

$$\begin{aligned} \int d\mathbf{p} g(\mathbf{r}, \mathbf{p}) = \varrho (\mathbf{r}) \end{aligned}$$
(10)

and the correct kinetic energy density \(t(\mathbf{r})\)

$$\begin{aligned} \int d\mathbf{p} \frac{p^2}{2m} g(\mathbf{r}, \mathbf{p}) = t(\mathbf{r}) \end{aligned}$$
(11)

is sought. In DFT, \(\varrho\) integrates to the number of electrons N:

$$\begin{aligned} \int d\mathbf{r} \varrho (\mathbf{r}) = N \;. \end{aligned}$$
(12)

Obviously, the kinetic energy is given by

$$\begin{aligned} E^{kin} = \int d\mathbf{r} t(\mathbf{r}) . \end{aligned}$$
(13)

GBP obtained g by maximizing the entropy

$$\begin{aligned} S= -k \int d\mathbf{r} d\mathbf{p} g(\mathbf{r}, \mathbf{p})(\ln {g(\mathbf{r}, \mathbf{p})} -1) \end{aligned}$$
(14)

subject to the constraints (10) and (11). As the variation leads to a Maxwell-Boltzmann-like distribution function

$$\begin{aligned} g(\mathbf{r}, \mathbf{p}) = e^{-\alpha (\mathbf{r})} e^{-\beta (\mathbf{r})p^2/2} \;, \end{aligned}$$
(15)

this approach is called a thermodynamic transcription of DFT. Here k is the Boltzmann constant, and \(\alpha (\mathbf{r})\) and \(\beta (\mathbf{r})\) are \(\mathbf{r}\)-dependent Lagrange multipliers. Substituting Eq. (15) into Eq. (11), the well-known ideal-gas expression is obtained:

$$\begin{aligned} t(\mathbf{r}) = \frac{3}{2} \; \frac{\varrho (\mathbf{r})}{\beta (\mathbf{r})} . \end{aligned}$$
(16)
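Spelling out the step behind Eq. (16) (in atomic units, \(m = 1\), consistent with the exponent in Eq. (15)): Eq. (10) fixes \(e^{-\alpha} = \varrho\,(\beta/2\pi)^{3/2}\), and the Gaussian momentum integral in Eq. (11) gives

```latex
% Gaussian integrals used:
% \int e^{-\beta p^2/2}\, d\mathbf{p} = (2\pi/\beta)^{3/2},
% \int p^2\, e^{-\beta p^2/2}\, d\mathbf{p} = (3/\beta)\,(2\pi/\beta)^{3/2}.
t(\mathbf{r})
 = e^{-\alpha(\mathbf{r})} \int \frac{p^2}{2}\, e^{-\beta(\mathbf{r})p^2/2}\, d\mathbf{p}
 = e^{-\alpha(\mathbf{r})}\, \frac{3}{2\beta(\mathbf{r})}
   \left[\frac{2\pi}{\beta(\mathbf{r})}\right]^{3/2}
 = \frac{3}{2}\,\frac{\varrho(\mathbf{r})}{\beta(\mathbf{r})}\, .
```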

Equation (16) follows from the fact that the kinetic energy density is kept fixed; \(\beta (\mathbf{r})\) is called the local inverse temperature. Another useful form of g is

$$\begin{aligned} g(\mathbf{r}, \mathbf{p}) = \left[ \frac{2 \pi }{\beta (\mathbf{r})} \right] ^{-3/2} \varrho (\mathbf{r}) e^{- \beta (\mathbf{r})p^2/2} \;. \end{aligned}$$
(17)

Substituting Eq. (15) into Eq. (14), the well-known ideal-gas expression is obtained for the entropy.

However, the kinetic energy density \(t(\mathbf{r})\) is not uniquely defined: any term that integrates to zero can be added to it, resulting in the same kinetic energy. Thus \(\beta\) is not unique either. See Refs. [57,58,59] for the most frequently applied forms of t. Recently, another form has been proposed: the one for which the phase-space Fisher/Shannon information is minimal/maximal [60, 61]. It turned out that this condition provides a constant \(\beta\):

$$\begin{aligned} \beta =\frac{3}{2} \frac{N}{E^{kin}} . \end{aligned}$$
(18)
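For constant \(\beta\), Eq. (18) follows directly from Eqs. (12), (13) and (16):

```latex
E^{kin} = \int t(\mathbf{r})\, d\mathbf{r}
        = \frac{3}{2\beta} \int \varrho(\mathbf{r})\, d\mathbf{r}
        = \frac{3N}{2\beta}
\quad \Longrightarrow \quad
\beta = \frac{3}{2}\, \frac{N}{E^{kin}}\, .
```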

We note in passing that the derivation above is valid for excited states as well: only the density and the kinetic energy density were used, and these quantities can equally be the excited-state density and kinetic energy density.

We can also observe that the derivation is valid for both the interacting and the non-interacting kinetic energy density. If the interacting kinetic energy is taken, the virial theorem can be utilized, i.e.,

$$\begin{aligned} E = - E^{kin} \end{aligned}$$
(19)

in Coulomb systems at equilibrium nuclear geometry, where E is the total energy. Then Eq. (18) can be rewritten as

$$\begin{aligned} \beta =\frac{3}{2} \frac{N}{|E|} . \end{aligned}$$
(20)

In the following instead of Eq. (17) the normalized phase-space distribution function

$$\begin{aligned} f(\mathbf{r}, \mathbf{p}) = \frac{1}{N} g(\mathbf{r}, \mathbf{p}) = \frac{1}{N} \left[ \frac{\beta (\mathbf{r})}{ 2 \pi } \right] ^{3/2} \varrho (\mathbf{r}) e^{- \beta (\mathbf{r}) p^2/2} \end{aligned}$$
(21)

is applied. Substituting f into Eq. (1)

$$\begin{aligned} R_f^{(q)} = \frac{1}{1-q} \left\{ \ln {\int [\sigma (\mathbf{r})]^{q} [\beta (\mathbf{r})]^{3(q-1)/2} d\mathbf{r}} - \frac{3}{2} [(q-1)\ln {(2 \pi )} + \ln {q}] \right\} , \end{aligned}$$
(22)

where

$$\begin{aligned} \sigma (\mathbf{r}) = \frac{1}{N} \varrho (\mathbf{r}) \end{aligned}$$
(23)

is the density normalized to 1; \(\sigma\) is called the shape function [62]. If \(q \rightarrow 1\), the phase-space Shannon entropy is obtained:

$$\begin{aligned} S_f = -\int \sigma (\mathbf{r}) \ln { \left[ \sigma (\mathbf{r}) [\beta (\mathbf{r})]^{3/2} \right] } d\mathbf{r} + \frac{3}{2} [\ln {(2 \pi )} + 1] . \end{aligned}$$
(24)

On the other hand, Eq. (22) provides

$$\begin{aligned} R_f^{(2)} = - \ln {\int [\sigma (\mathbf{r})]^2 [\beta (\mathbf{r})]^{3/2} d\mathbf{r}} + \frac{3}{2} [ \ln {(2 \pi )} + \ln {2}] \end{aligned}$$
(25)

for \(q = 2\).
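Eq. (22) can also be verified numerically. The sketch below assumes, purely for the test, a spherically symmetric normalized Gaussian shape function and a constant \(\beta\), so that the six-dimensional integral in Eq. (1) factorizes into radial position and momentum parts:

```python
import numpy as np

# Sketch: numerical check of Eq. (22) for a normalized 3D Gaussian
# shape function sigma(r) and a constant beta (illustrative choices).
q, beta = 1.7, 0.9
r = np.linspace(1e-8, 15.0, 150001)
dr = r[1] - r[0]
sigma = np.exp(-r**2 / 2) / (2 * np.pi)**1.5   # integrates to 1 over 3D space

def radial(y):
    """Integral of a spherically symmetric function over 3D space."""
    return np.sum(4 * np.pi * r**2 * y) * dr

# Left-hand side: Eq. (1) applied directly to f of Eq. (21) (with N = 1);
# the Gaussian momentum part integrates to (2*pi/(q*beta))**1.5.
I_r = radial((sigma * (beta / (2 * np.pi))**1.5)**q)
I_p = (2 * np.pi / (q * beta))**1.5
R_direct = np.log(I_r * I_p) / (1 - q)

# Right-hand side: Eq. (22) with constant beta.
R_eq22 = (np.log(radial(sigma**q) * beta**(1.5 * (q - 1)))
          - 1.5 * ((q - 1) * np.log(2 * np.pi) + np.log(q))) / (1 - q)

print(R_direct, R_eq22)   # the two expressions agree
```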

If \(\beta\) is constant the Rényi entropy can be written as

$$\begin{aligned} R_f^{(q)} = R_{\sigma }^{(q)} + A_{\beta }^{(q)} , \end{aligned}$$
(26)

where

$$\begin{aligned} R_{\sigma }^{(q)} = \frac{1}{1-q}\ln \int [\sigma (\mathbf{r})]^{q} d\mathbf{r} \end{aligned}$$
(27)

is the position-space Rényi entropy for the normalized density \(\sigma\) and

$$\begin{aligned} A_{\beta }^{(q)} = \frac{3}{2} \left[ \ln {\left( \frac{2 \pi }{\beta }\right) } -\frac{\ln {q}}{1 - q} \right] . \end{aligned}$$
(28)

Using the kinetic energy instead of \(\beta\) (Eq. (18)), we are led to

$$\begin{aligned} A_{\beta }^{(q)} = \frac{3}{2} \left[ \ln {\left( \frac{4 \pi }{3N} E^{kin}\right) } -\frac{\ln {q}}{1 - q} \right] . \end{aligned}$$
(29)

In a Coulomb system, using Eq. (20), Eq. (29) takes the form

$$\begin{aligned} A_{\beta }^{(q)} = \frac{3}{2} \left[ \ln {\left( \frac{4 \pi }{3N} |E|\right) } -\frac{\ln {q}}{1 - q} \right] . \end{aligned}$$
(30)

That is, the phase-space Rényi entropy is the sum of the position-space Rényi entropy and a term depending on the total energy and the order q.

Using Eq. (9) the logarithm of the generalized complexity takes the form

$$\begin{aligned}&\ln {\tilde{C}^{(q_1,q_2)}_f} = \frac{1}{1-q_1} \left\{ \ln {\int [\sigma (\mathbf{r})]^{q_1} [\beta (\mathbf{r})]^{3(q_1-1)/2} d\mathbf{r}} \right\} \nonumber \\&\quad - \frac{1}{1-q_2} \left\{ \ln {\int [\sigma (\mathbf{r})]^{q_2} [\beta (\mathbf{r})]^{3(q_2-1)/2} d\mathbf{r}} \right\} + B^{(q_1,q_2)} , \end{aligned}$$
(31)

where

$$\begin{aligned} B^{(q_1,q_2)} = \frac{3}{2} \left[ \frac{\ln {q_2}}{1-q_2} - \frac{\ln {q_1}}{1-q_1} \right] . \end{aligned}$$
(32)

As a special case the phase-space generalization of the structural entropy is obtained:

$$\begin{aligned}&\ln {C} = \ln {\tilde{C}^{(1,2)}_f} = - \int \sigma (\mathbf{r}) \ln { \left[ \sigma (\mathbf{r}) [\beta (\mathbf{r})]^{3/2} \right] } d\mathbf{r} \nonumber \\&\quad + \ln {\int [\sigma (\mathbf{r})]^2 [\beta (\mathbf{r})]^{3/2} d\mathbf{r}} + \frac{3}{2} (1 - \ln {2}) . \end{aligned}$$
(33)

If \(\beta\) is constant, we can apply Eq. (26) for the Rényi entropy; therefore

$$\begin{aligned} \ln {\tilde{C}^{(q_1,q_2)}_f} = R_{\sigma }^{(q_1)} - R_{\sigma }^{(q_2)} + A_{\beta }^{(q_1)} - A_{\beta }^{(q_2)} . \end{aligned}$$
(34)

It follows from Eq. (28) that \(\beta\) disappears from \(\ln {\tilde{C}^{(q_1,q_2)}_f}\):

$$\begin{aligned} \ln {\tilde{C}^{(q_1,q_2)}_f} = \ln {\tilde{C}^{(q_1,q_2)}_{\sigma }} + B^{(q_1,q_2)} , \end{aligned}$$
(35)

where

$$\begin{aligned} \ln {\tilde{C}^{(q_1,q_2)}_{\sigma }} = R_{\sigma }^{(q_1)} - R_{\sigma }^{(q_2)} \end{aligned}$$
(36)

is the logarithm of the position-space generalized complexity. That is, the logarithms of the phase-space and the position-space generalized complexity measures differ only by a term depending solely on \(q_1\) and \(q_2\). In other words, the phase-space generalized complexity is proportional to the position-space generalized complexity. If we take \(q_1=1\) and \(q_2=2\), we obtain that the phase-space LMC complexity is proportional to the position-space LMC complexity; that is, the difference between the phase-space and the position-space structural entropies is the constant \(3 (1 - \ln {2})/2\). This constant is due to the Gaussian form given in Eq. (15), which is a direct consequence of the maximization of Eq. (14), as Pipek and Varga have shown in their basic work [49].
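The \(\beta\)-independence in Eq. (35) is easy to confirm numerically. A sketch, again assuming an illustrative normalized 3D Gaussian shape function and a few arbitrary constant \(\beta\) values:

```python
import numpy as np

# Sketch: for constant beta, Eq. (31) minus Eq. (36) equals B(q1, q2)
# of Eq. (32), whatever beta is. Normalized 3D Gaussian sigma assumed.
q1, q2 = 1.5, 3.0
r = np.linspace(1e-8, 15.0, 150001)
dr = r[1] - r[0]
sigma = np.exp(-r**2 / 2) / (2 * np.pi)**1.5

def I(q):                                   # integral of sigma^q over 3D space
    return np.sum(4 * np.pi * r**2 * sigma**q) * dr

ln_C_sigma = np.log(I(q1)) / (1 - q1) - np.log(I(q2)) / (1 - q2)   # Eq. (36)
B = 1.5 * (np.log(q2) / (1 - q2) - np.log(q1) / (1 - q1))          # Eq. (32)

for beta in (0.5, 2.0, 7.3):
    ln_C_f = (np.log(I(q1) * beta**(1.5 * (q1 - 1))) / (1 - q1)    # Eq. (31)
              - np.log(I(q2) * beta**(1.5 * (q2 - 1))) / (1 - q2) + B)
    print(beta, ln_C_f - ln_C_sigma - B)    # ~ 0 for every beta
```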

4 Discussion

In DFT the density is the basic quantity: it determines the external potential and the Hamiltonian, and thus any property of the system under investigation. So the phase-space distribution function f is also determined by the density. In Coulomb systems with constant temperature, f is immediately given once the density is known, because the density decays as

$$\begin{aligned} \lim _{r \rightarrow \infty } \frac{\partial \ln {{{\bar{\varrho }}}(r)}}{\partial r} = - \sqrt{8 (E_0^{N-1} - E)} \; , \end{aligned}$$
(37)

where \(E_0^{N-1} - E\) is the vertical ionization potential of the N-electron system; E is the energy of the state considered and \(E_0^{N-1}\) is the ground-state energy of the \(N-1\)-electron system [63,64,65]. Thus the asymptotic decay of the density determines the energy E, and hence \(\beta\) via the virial theorem. Therefore, \(\varrho\) (or \(\sigma\)) determines f. Moreover, \(\sigma\) (or \(\varrho\)) also determines the phase-space Rényi entropy, as \(R_f^{(q)}\) is the sum of the position-space Rényi entropy \(R_{\sigma }^{(q)}\), which can easily be calculated from \(\sigma\), and the term \(A_{\beta }^{(q)}\), which can be obtained from E [Eqs. (26), (27) and (30)]. In the case of a Coulomb system these statements are true not only for the ground state but for excited states as well [63,64,65].

In summary, phase-space Rényi entropy and complexity have been defined utilizing the thermodynamic transcription of density functional theory. Phase-space structural entropy and phase-space LMC and generalized complexities have been defined. It has been shown that the logarithm of the phase-space LMC complexity reduces, apart from an additive constant, to the position-space structural entropy defined by Pipek et al. in the case of constant local temperature.