Quantum information descriptors in position and momentum spaces

The resultant measures of the entropy/information content of complex electronic states are discussed in the canonical position and momentum representations of molecular quantum mechanics. The nonclassical (phase/current) supplements of the classical (probability) descriptors of the overall entropy/information content of electronic states are identified, and the associated entropy-deficiency (information-distance) quantities are introduced. The Shannon (global, logarithmic) and Fisher (local, gradient) information descriptors in both spaces are summarized, and the momentum continuity equation is used to establish the associated probability source. General relations between the global and local information densities are examined, and the entropic principles determining molecular phase equilibria are investigated.


Introduction
The classical (probability-based) Information Theory (IT) [1-8] has proven its utility in extracting a "chemical" interpretation of the calculated (quantum-mechanical) electron distributions in molecules [9-19]. However, as recently argued elsewhere [9,20-22], the resultant measures of the overall entropy/information content of electronic states should include contributions from densities of both the particle probability, related to the wave-function modulus, and the state phase or its gradient, which determines the system's electronic current. The nonclassical contributions due to the state phase/current in such generalized IT concepts complement the familiar classical measures, functionals of the particle probability distribution alone. In this combined quantum treatment the classical and nonclassical descriptors extract the full information contained in the system's complex wave function, combining contributions due to the state probability and its phase/current distributions, respectively. The resultant measure is then capable of distinguishing states corresponding to the same electron density but differing in their current distributions. The corresponding extension of the classical information-distance functionals has also been proposed [22-25], and the communication channels of probability propagation in molecules [3,4,7,11-13] have been supplemented by their nonclassical companions of phase/current scattering [23].

Throughout the article x denotes a scalar quantity, x stands for a row- or column-vector, and x represents a square or rectangular matrix. The natural logarithm log = ln used in the Shannon entropy expresses the amount of information in nats (natural units): 1 nat ≈ 1.44 bit.

Roman F. Nalewajski (nalewajs@chemia.uj.edu.pl), Department of Theoretical Chemistry, Jagiellonian University, R. Ingardena 3, 30-060 Cracow, Poland.
Establishing the overall information content of electronic states, combining the classical and nonclassical contributions is of paramount importance for the complete IT treatment of molecular states. This generalized description also generates a "thermodynamic" framework for describing rates of specific reorganizations in the system electronic structure [22,24,25]. It recognizes both the density and phase/current degrees-of-freedom of molecular states and its conceptual framework formally resembles that used in the ordinary irreversible thermodynamics [26]. The nonclassical entropy/information components have been shown to be essential for describing the system phase-equilibria [9,[20][21][22] and extracting the resultant patterns of the chemical bond multiplicities [23] or reactivity behavior [27].
This position (r-space) development calls for a similar extension of the momentum (p-space) IT description. In the present analysis, after a brief summary of the molecular Fourier transforms and the resultant entropy/information descriptors in r-space, we introduce their p-space counterparts and explore the momentum continuity equation. We also examine relations between the IT quantities in these Fourier-conjugate representations of molecular states. The elementary events of observing the specified position/momentum of an electron define the corresponding probability distributions [28], which generate the relevant classical IT descriptors [1-8,11-13,29,30]. These electron densities, when combined with the state position or momentum currents, should similarly determine the complementary nonclassical IT contributions in these two canonical representations. One recalls that the chemically most important large-distance, valence (external) regions of atoms in the position density correspond to the low-momentum (internal) regions of the momentum density.
To simplify this theoretical analysis we shall generally limit ourselves to the one-electron case. Its generalization to N-electron systems involves the wave functions in the Harriman representation [22,25], using the Harriman-Zumbach-Maschke (HZM) [31,32] construction of Density Functional Theory (DFT) [33-35]. Such antisymmetric wave functions of N fermions, the Slater determinants of equidensity orbitals yielding the specified particle distribution, adopt the crucial insights due to Macke [36] and Gilbert [37].

Principal quantities and relations in position representation
In the context of Heisenberg's uncertainty principle one invokes the two canonical (continuous) basis sets in the Hilbert space of quantum molecular states, corresponding to the sharply specified electron positions $\{r_i\}$ and momenta $\{p_i = \hbar k_i\}$, respectively. For a single particle ($N = 1$) these position and momentum bases thus combine the state vectors $\{|r_1\rangle\}$ and $\{\hbar^{3/2}|p_1\rangle \equiv |k_1\rangle\}$ corresponding to the sharply specified electron position $r_1$ and momentum $p_1 = \hbar k_1$. These eigenvectors of the corresponding quantum observables $\hat{r}$ and $\hat{p}$ form the complete basis sets,
$$\int dr_1\,|r_1\rangle\langle r_1| = \int dp_1\,|p_1\rangle\langle p_1| = \int dk_1\,|k_1\rangle\langle k_1| = 1,$$
and define the associated wavefunctions (Dirac deltas) in these two complementary representations. Above, $dr \equiv d^3r$ and $dp \equiv d^3p$ or $dk \equiv d^3k$ stand for the corresponding infinitesimal volume elements in the position and momentum/wave-number spaces, respectively. One also recalls that the mixed representations of these basis vectors, i.e., their projections onto the state vectors of the complementary basis set, define the familiar plane-wave functions, where $\bar{u}_r(p) = \langle p|r\rangle = u_p(r)^*$ and $\bar{u}_r(k) = \langle k|r\rangle = u_k(r)^*$. The relevant definitions of the position and momentum operators in these two representations are then given by the corresponding (diagonal) kernels. The particle wave functions $\varphi(r) = \langle r|\varphi\rangle$ and $\bar{\varphi}(p) = \langle p|\varphi\rangle$ corresponding to the same state vector $|\varphi\rangle$, called the Fourier pair, are transforms of each other [38,39]:
$$\varphi(r) = \langle r|\varphi\rangle = \int dp\,\langle r|p\rangle\langle p|\varphi\rangle = \int dp\,u_p(r)\,\bar{\varphi}(p)$$
and
$$\bar{\varphi}(p) = \langle p|\varphi\rangle = \int dr\,\langle p|r\rangle\langle r|\varphi\rangle = \int dr\,u_p(r)^*\,\varphi(r).$$
These two conjugate transformations constitute mutually inverse operations. Applying the quantum-mechanical superposition principle to the continuous combinations of Eqs.
(4) and (5) then allows one to interpret their coefficients as amplitudes of the conditional probabilities $P(x|y)$ of observing the basis state $|x\rangle$ in the basis state $|y\rangle$, $\int dx\,P(x|y) = 1$. Consider a general (complex) state in the position representation of the prototype one-electron system described by the Hamiltonian $\hat{H}(r) = \hat{T}(r) + v(r)$, where $\hat{T}(r)$ stands for the kinetic-energy operator and $v(r)$ denotes the external potential due to the system's fixed nuclei. This wavefunction is defined by two (real) functions representing the state modulus, $R(r)$, and phase, $\phi(r)$:
$$\varphi(r) = R(r)\,\exp[i\phi(r)].$$
They generate the electronic probability distribution $\rho(r) = R(r)^2$, the conditional probability $P(r|\varphi)$ of observing $|r\rangle$ in $|\varphi\rangle$, and the density current
$$j(r) = \frac{\hbar}{m}\,\rho(r)\,\nabla\phi(r),$$
the expectation value of the (Hermitian) current operator. The current-per-particle, $V(r) = j(r)/\rho(r) = (\hbar/m)\nabla\phi(r)$, measures the effective "velocity" distribution of this probability fluid, shaped by the gradient of the wavefunction phase. These two fundamental electronic distributions can be regarded as an alternative pair of state parameters specifying the system's quantum state. The principal quantities $[R, \phi]$ (or $[\rho, \phi]$) in the complex state of Eq. (8) thus generate alternative sets of "variables", $[\rho, \nabla\phi]$, $[\rho, V]$ or $[\rho, j]$, each providing a complete specification of the quantum state in question, so its resultant entropy/information description can be formulated in terms of any of these complementary (classical, nonclassical) degrees-of-freedom of the molecular electronic structure. One also observes that the probability density of Eq. (10) in fact measures the conditional probability of observing $|r\rangle$ in $|\varphi\rangle$: $P(r|\varphi) = \langle\varphi|r\rangle\langle r|\varphi\rangle$, $\int P(r|\varphi)\,dr = 1$.
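As a numerical aside (an illustrative sketch, not part of the original derivation), the mutually inverse Fourier transformations of Eqs. (4) and (5) can be checked on a 1D grid; the Gaussian state, $\hbar = 1$, and all grid parameters are assumptions of this example:

```python
import numpy as np

# Numerical sketch of the 1D Fourier pair (hbar = 1): phi_bar(p) is obtained
# from phi(x) by direct quadrature, the inverse transform recovers phi(x),
# and both representations carry the same (unit) norm.
hbar = 1.0
x = np.linspace(-20.0, 20.0, 2001)
dx = x[1] - x[0]
p = np.linspace(-10.0, 10.0, 1001)
dp = p[1] - p[0]

phi = np.pi**-0.25 * np.exp(-x**2 / 2)                      # normalized state

kernel = np.exp(-1j * np.outer(p, x) / hbar) / np.sqrt(2 * np.pi * hbar)
phi_bar = kernel @ phi * dx                                 # forward transform

norm_r = np.sum(np.abs(phi)**2) * dx                        # Parseval check
norm_p = np.sum(np.abs(phi_bar)**2) * dp

phi_back = kernel.conj().T @ phi_bar * dp                   # inverse transform
err = np.max(np.abs(phi_back - phi))
print(norm_r, norm_p, err)
```

Both norms come out equal to unity and the inverse transform reproduces the state on the grid, illustrating the unitarity of the transformation pair.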
The eigensolutions of the energy operator $\hat{H}(r)$ determine the stationary electronic states $\{\varphi_i(r) = \langle r|\varphi_i\rangle\}$ at the specified time $t_0 = 0$. They correspond to the sharply specified electronic energies $\{E_i\}$, with the lowest ($i = 0$) eigenvalue marking the energy of the system's ground state, and to the stationary (time-independent) probability distribution $\rho_i(r) = |\varphi_i(r)|^2 = R_i(r)^2$. Each nondegenerate stationary state $\varphi_i(r)$ exhibits a purely time-dependent phase $\phi_i(t)$ in the full quantum state $\psi_i(r,t) = \langle r|\psi_i(t)\rangle$, i.e., an exactly vanishing spatial phase, $\phi_i(r) = 0$. Therefore, it represents the stationary probability distribution, $\rho_i(r,t) = \rho_i(r)$, and a vanishing current. Consider now the familiar free-particle case, for $v(r) = 0$. The associated plane-wave eigenfunction $\varphi_k(r,t) = A\exp[i(k\cdot r - \omega_k t)]$ gives $\rho_k(r,t) = \rho_k = |A|^2$, $j_k(r) = j_k = (\hbar/m)\rho_k k = |A|^2 V_k$, and hence $\nabla_r\cdot j_k(r) = 0$; here the particle's classical velocity $V_k = p_k/m$ measures the momentum $p_k = \hbar k$ per unit mass. This example shows that the necessary ("weak") condition of the system's stationary character, a time-independent probability distribution, does not always imply a vanishing state current, the sufficient ("strong") condition of exact molecular stationarity.
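The weak/strong distinction can be made concrete symbolically (an illustrative sketch with $\hbar = m = 1$ and unit amplitude): the plane wave carries a uniform finite current, while the standing wave of the same amplitude carries none.

```python
import sympy as sp

# Current j = Im(psi* dpsi/dx) (hbar = m = 1) for a traveling vs a standing
# wave; illustrates the weak/strong stationarity remark in the text.
x, k = sp.symbols('x k', real=True, positive=True)

psi_trav = sp.exp(sp.I * k * x)         # plane wave, rho = 1
psi_stand = sp.sqrt(2) * sp.cos(k * x)  # equal "left"+"right" superposition

def current(psi):
    return sp.simplify(sp.im(sp.conjugate(psi) * sp.diff(psi, x)))

j_trav = current(psi_trav)    # uniform finite current
j_stand = current(psi_stand)  # zero: strong-stationary standing wave
print(j_trav, j_stand)
```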
This introduces the equilibrium phase in the system's equilibrium wave function, $\varphi_{eq.}$, representing the phase-transformed state $\varphi_0(r) = \langle r|\varphi[\rho_0]\rangle$. The equilibrium wavefunction and its Fourier transform generate the associated plane-wave expansions:
$$\varphi_{eq.}(r) = \langle r|\varphi_{eq.}\rangle = \int \langle r|p\rangle\langle p|\varphi_{eq.}\rangle\,dp = \int \bar{u}_r(p)^*\,\bar{\varphi}_{eq.}(p)\,dp$$
and
$$\bar{\varphi}_{eq.}(p) = \langle p|\varphi_{eq.}\rangle = \int \langle p|r\rangle\langle r|\varphi_{eq.}\rangle\,dr = \int u_p(r)^*\,\varphi_{eq.}(r)\,dr. \quad (19)$$
For the fixed external potential $v$ such states conserve the ground-state probability density, and hence the density-functional value of the system's electronic energy, and exhibit the maximum of the resultant IT entropy. This is in analogy to the equilibrium states of ordinary thermodynamics, which also maximize the (thermodynamic) entropy for the specified value of the system's internal energy [26]. One further recalls that the time-dependent Schrödinger equation (SE) implies the sourceless, $\sigma_\rho(r,t) \equiv 0$, continuity relation for the electron probability distribution:
$$\partial\rho(r,t)/\partial t = -\nabla_r\cdot j(r,t).$$
It demonstrates that the time dependence of the electron distribution (l.h.s.) originates from the density outflow alone (r.h.s.).
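The sourceless continuity relation can be verified symbolically; the harmonic-oscillator superposition below is an illustrative model choice ($\hbar = m = \omega = 1$), not taken from the text.

```python
import sympy as sp

# Symbolic check (hbar = m = omega = 1) of the sourceless continuity relation
# d(rho)/dt + d(j)/dx = 0 for a nonstationary superposition of the two lowest
# harmonic-oscillator eigenstates.
x, t = sp.symbols('x t', real=True)

phi0 = sp.pi**sp.Rational(-1, 4) * sp.exp(-x**2 / 2)                   # E0 = 1/2
phi1 = sp.pi**sp.Rational(-1, 4) * sp.sqrt(2) * x * sp.exp(-x**2 / 2)  # E1 = 3/2
psi = (phi0 * sp.exp(-sp.I * t / 2) + phi1 * sp.exp(-3 * sp.I * t / 2)) / sp.sqrt(2)

rho = sp.re(sp.expand(psi * sp.conjugate(psi)))   # probability density
j = sp.im(sp.conjugate(psi) * sp.diff(psi, x))    # probability current

source = sp.simplify(sp.diff(rho, t) + sp.diff(j, x))
print(source)
```

The r.h.s. vanishes identically: the time dependence of this genuinely nonstationary density is fully accounted for by the current divergence.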
The transformed Hamiltonian is defined by the multiplicative kinetic-energy part $T(p)$ and the external-potential operator $\hat{v}(p) = \hat{v}[r(p)]$ involving the Fourier transform $\bar{v}(p) = \mathcal{F}[v(r)]$. For an even external potential, $v(-r) = v(r)$, one thus obtains an even transformed function. This is the case for the spherically symmetric $v(r) = v(|r|)$, e.g., the Coulombic potential due to the nuclei in the Born-Oppenheimer approximation, when the transformed external potential also depends only on the modulus $p = |p|$ of the momentum [39]. The momentum density, measuring the conditional probability $\bar{P}(p|\varphi)$ of observing $|p\rangle$ in $|\varphi\rangle$, now reads $\pi(p) = |\bar{\varphi}(p)|^2$. It exhibits the inversion symmetry, $\pi(p) = \pi(-p)$, by the principle of microreversibility [28]. Its flow aspect is revealed by the momentum current

$$J(p) = (p/m)\,\pi(p) = V(p)\,\pi(p), \qquad \hat{J}(p) = (p/m)\,\pi(p). \quad (25)$$
It should be recalled that $\pi(p) \equiv \gamma(p,p) \neq \tilde{\rho}(p) \equiv \mathcal{F}[\rho(r)]$ [28], where $\gamma(p,p') = \bar{\varphi}(p)\,\bar{\varphi}^*(p')$ stands for the one-electron density matrix in p-space. Indeed, the former involves the 6-dimensional Fourier transform of the whole density matrix in r-space, and not just of its diagonal part $\gamma(r,r) \equiv \rho(r)$:
$$\pi(p) = (2\pi\hbar)^{-3}\!\int\!\!\int \gamma(r,r')\,\exp[-i\,p\cdot(r-r')/\hbar]\,dr\,dr' = (2\pi\hbar)^{-3}\!\int B(s)\,\exp(i\,p\cdot s/\hbar)\,ds,$$
where $s = r' - r$ and the internally folded electron density $B(s) = \int \gamma(r, r+s)\,dr$ stands for the reciprocal (p-space) form factor. The latter is used in the reconstruction of momentum densities from experimental data and in the r-space analysis of Compton profiles, while the Fourier-transformed electron density $\tilde{\rho}(p)$ determines the form factor of X-ray crystallography [28]. The same is true for the inverse transforms [see Eq. (4)]. In view of the importance of the phase aspect for the nonclassical entropy/information terms, it is of interest to examine the Fourier transforms of the spatial phase component of electronic states. Obviously, the (complex) Fourier transform $\tilde{\phi}(p) = \mathcal{F}[\phi(r)]$ of the spatial phase part of the wave function in position representation [Eq.
differs from the (real) phase $\chi(p)$ of the quantum state in momentum space. The former defines the "coefficients" in the plane-wave expansion of $\phi(r)$. In Eq. (28) the integrands determining the real and imaginary parts of $\tilde{\phi}(p)$ exhibit a definite parity when $\phi(r)$ does: an even $\phi(r)$ combined with the odd factor $\sin[(p/\hbar)\cdot r]$ implies $\mathrm{Im}\,\tilde{\phi}(p) = 0$, while an odd $\phi(r)$ combined with the even factor $\cos[(p/\hbar)\cdot r]$ gives rise to $\mathrm{Re}\,\tilde{\phi}(p) = 0$; hence the even $\phi(r)$ generates a real $\tilde{\phi}(p)$, while the odd spatial phase generates a purely imaginary $\tilde{\phi}(p)$ [38]. In the general case of a spatial phase combining even and odd parts one thus obtains a complex transformed phase $\tilde{\phi}(p)$.
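This parity rule is easily confirmed on a grid (a 1D illustrative sketch; the two Gaussian-type "phase" profiles are assumptions of the example):

```python
import numpy as np

# Parity check for the Fourier transform of a real spatial phase phi(x):
# an even phi(x) yields a real transform, an odd phi(x) a purely imaginary one.
x = np.linspace(-15.0, 15.0, 1501)
dx = x[1] - x[0]
p = np.linspace(-5.0, 5.0, 201)
fw = np.exp(-1j * np.outer(p, x)) / np.sqrt(2 * np.pi)

phi_even = np.exp(-x**2)        # even profile
phi_odd = x * np.exp(-x**2)     # odd profile

t_even = fw @ phi_even * dx
t_odd = fw @ phi_odd * dx
print(np.max(np.abs(t_even.imag)), np.max(np.abs(t_odd.real)))  # both ~ 0
```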
This deduction follows from the even parity of the transformation Jacobian, which is opposite to that of $f_0(r)$ itself. For such a symmetric probability density $P_0(r)$ the odd orthogonality phase of the equidensity orbitals in the HZM construction thus generates a purely imaginary phase part in the momentum representation, while the (even) thermodynamic phase contribution gives rise to the real momentum-phase component. Both these contributions define the overall (complex) transform of $\Phi_{eq.}^{l}[r;\rho_0]$. Consider next the Fourier-transformed one-electron state of Eq. (8). It is defined [see Eq. (29)] by the effective modulus $M(p)$ and phase $\chi(p)$ components in momentum space, which differ from the Fourier transforms of the corresponding parts of $\varphi(r)$. Only for the real orbital $\varphi(r) = R(r)$, e.g., the nondegenerate ground state of the single-particle system, when $\phi(r) = 0$, does one find $M(p) = \tilde{R}(p)$ and $\chi(p) = \tilde{\phi}(p) = 0$. Therefore, the strong-stationary (zero-current) state in r-space in general generates a weak-stationary (finite-current) Fourier transform. Finally, let us briefly explore the continuity equation in momentum space. It results from the transformed SE [39] of Eq. (45) and its Hermitian conjugate. They generate the time derivative of $\pi(p,t) = \bar{\psi}(p,t)\,\bar{\psi}(p,t)^*$, expressed in terms of the full momentum density matrix $\gamma(p,p';t)$. One also observes that the divergence of the momentum current of Eq.
Therefore, the source term in the momentum-continuity equation is generally nonvanishing. As an illustrative example consider the stationary state $|\psi_i(t)\rangle$ in position representation [Eq. (15)]. The conjugate wave function in momentum space, $\bar{\psi}_i(p,t) = \langle p|\psi_i(t)\rangle = \bar{\varphi}_i(p)\exp(-i\omega_i t)$, is complementary to the position representation of the same state vector, $\psi_i(r,t) = \langle r|\psi_i(t)\rangle = \varphi_i(r)\exp(-i\omega_i t)$. It implies the weak-stationary momentum density, $\pi_i(p,t) = |\bar{\varphi}_i(p)|^2 = \pi_i(p)$, and hence the vanishing first term in the r.h.s. of the first line of Eq. (48). In such states, however, the momentum current remains finite, $J_i(p) = |\bar{\varphi}_i(p)|^2\,p/m \neq 0$, so that the local source of the momentum density is determined by the divergence of Eq. (47). For the real $\bar{v}(p)$ of Eq. (23b) this is the case for any freely evolving quantum state $\bar{\psi}(p,t)$ [see Eq. (46)]. Therefore, for the strong-stationary electronic state in position space, for which the nonclassical r-space entropy/information contributions identically vanish, one should expect a presence of nonvanishing current-related entropy/information contributions in p-space.
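As a numerical aside to the density-matrix relation recalled above, $\pi(p) = \gamma(p,p) \neq \mathcal{F}[\rho(r)]$, a 1D sketch makes the distinction obvious; the phased Gaussian state and all grid parameters are illustrative assumptions ($\hbar = 1$):

```python
import numpy as np

# The momentum density pi(p) = |phi_bar(p)|^2 is NOT the Fourier transform of
# the position density rho(x) = |phi(x)|^2.  A Gaussian carrying the linear
# phase exp(2ix) shows this: pi(p) peaks near p = 2, while |F[rho]|(p) stays
# centered at p = 0 (the phase drops out of rho entirely).
x = np.linspace(-15.0, 15.0, 1501)
dx = x[1] - x[0]
phi = np.pi**-0.25 * np.exp(-x**2 / 2) * np.exp(2j * x)

p = np.linspace(-8.0, 8.0, 801)
fw = np.exp(-1j * np.outer(p, x)) / np.sqrt(2 * np.pi)

pi_p = np.abs(fw @ phi * dx)**2           # momentum density pi(p)
rho_t = fw @ (np.abs(phi)**2) * dx        # form factor F[rho](p)

p_peak_pi = p[np.argmax(pi_p)]            # near 2.0
p_peak_rho = p[np.argmax(np.abs(rho_t))]  # near 0.0
print(p_peak_pi, p_peak_rho)
```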

Resultant entropy/information descriptors in physical space
The adequate information measure in quantum IT must reflect the complete "structure" aspect of the molecular state, accounting for the entropic characteristics related to both the modulus and phase components of the system's wave function. The resultant information content of molecular electronic states has to measure both the compactness/width descriptors of the particle probability distribution, manifested by the classical information measures, and the associated contributions due to the state probability currents, embodied in the nonclassical information complements. These generalized entropic concepts give rise to the extremum information principles which determine the quantum phase equilibria in electronic distributions [9,20-27]. Such a combined IT perspective also applies to electron "communications" in molecules [23]. The classical information channels [3,4,7,8,11-13] reflect the elementary probability scattering between the "input" and "output" events relevant for the adopted resolution level of electron distributions, while their nonclassical analogs involve the associated phase/current propagations in the molecular bond system. These networks generate the nonclassical entropic indices of the molecular bond multiplicities and their covalent/ionic components.
We continue the representative one-electron development of the preceding sections. In the quantum IT approach one introduces the "horizontal" equilibrium state for the specified ground-state particle distribution $\rho_0 = |\varphi[\rho_0]|^2$ [see Eqs. (17) and (18)]. For the fixed external potential $v$ this maximum-entropy state conserves the ground-state density, and hence the DFT value of the electronic energy, and exhibits a nonvanishing current density.

We begin with a brief summary of the key entropic concepts formulated in the classical IT. The local (gradient) measure of Fisher [1,2] reflects the average determinicity-information in the probability density $\rho(r) = |\varphi(r)|^2 = R(r)^2$, for local events defining the position representation:
$$I[\rho] = \int \rho(r)\,[\nabla\ln\rho(r)]^2\,dr = \int \frac{[\nabla\rho(r)]^2}{\rho(r)}\,dr \equiv \int \rho(r)\,I_\rho(r)\,dr,$$
where $I_\rho(r) = [\nabla\rho(r)/\rho(r)]^2 \equiv I^{class.}(r)$ stands for the associated information density per electron. It is reminiscent of von Weizsäcker's [40] inhomogeneity correction to the kinetic-energy functional in the Thomas-Fermi-Dirac theory. This intrinsic-accuracy descriptor characterizes an effective localization (compactness, "narrowness") of the particle distribution. It simplifies when expressed in terms of the probability amplitude $R(r) = \sqrt{\rho(r)}$,
$$I[\rho] = 4\int [\nabla R(r)]^2\,dr,$$
thus revealing that it effectively measures the gradient content in the modulus factor of the wavefunction.
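Both forms of the Fisher functional can be checked numerically; the 1D Gaussian density, whose analytic Fisher information is $1/\sigma^2$, is an illustrative assumption:

```python
import numpy as np

# Fisher (gradient) information of a 1D Gaussian density of variance sigma**2:
# I[rho] = Int rho * (d ln rho / dx)**2 dx = 1 / sigma**2, and the amplitude
# form 4 * Int (dR/dx)**2 dx gives the same number.
sigma = 1.7
x = np.linspace(-20 * sigma, 20 * sigma, 4001)
dx = x[1] - x[0]
rho = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
R = np.sqrt(rho)

I_rho = np.sum(np.gradient(rho, dx)**2 / rho) * dx   # gradient form
I_amp = 4 * np.sum(np.gradient(R, dx)**2) * dx       # amplitude form
print(I_rho, I_amp, 1 / sigma**2)                    # all ~ 0.346
```

The narrower the distribution, the larger $I[\rho]$, in line with its reading as a localization ("narrowness") descriptor.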
The global measure of Shannon [3,4], complementary to the local descriptor $I[\rho]$, reflects the average indeterminicity-information in $\rho(r)$, called the classical Shannon entropy of state $|\varphi\rangle$:
$$S[\rho] = -\int \rho(r)\,\ln\rho(r)\,dr \equiv \int \rho(r)\,S_\rho(r)\,dr, \qquad S_\rho(r) = -\ln\rho(r).$$
One further observes that these classical information and entropy densities are mutually related [9,20-23]:
$$I_\rho(r) = [\nabla\ln\rho(r)]^2 = [\nabla S_\rho(r)]^2. \quad (58)$$
Thus, the square of the gradient of the local Shannon probe of the state-probability "indeterminicity" (disorder) generates the density of the associated Fisher measure of the state's classical "determinicity" (order). An important generalization of the global classical measure, called the entropy deficiency (directed divergence, cross-entropy, or missing information), has been proposed by Kullback and Leibler [5,6]. It reflects the information "distance" between two normalized distributions defined for the same set of events. In the position representation the missing information in $\rho(r) = |\varphi(r)|^2$ relative to the reference distribution $\rho_0(r) = |\varphi_0(r)|^2$,
$$S[\rho|\rho_0] = \int \rho(r)\,\ln[\rho(r)/\rho_0(r)]\,dr,$$
measures the average value of the probability surprisal $s_\rho(r) = \ln[\rho(r)/\rho_0(r)] \equiv \ln w_\rho(r)$, where $w_\rho(r) = \rho(r)/\rho_0(r)$ stands for the local probability "enhancement". This global quantity reflects the information similarity between the two compared probability densities: the more they differ from one another, the higher the information distance; it identically vanishes only when the two distributions are identical, $S[\rho|\rho] = 0$. Similar concepts of the information distance can be advanced using the local Fisher measure [41-43]. The classical directed divergence $S[\rho|\rho_0] \equiv S^{class.}[\psi|\psi_0]$ is then generalized into the corresponding gradient analog. The classical Fisher information is proportional to the inhomogeneity term of the electronic kinetic energy [40]. The same analogy helps in designing a general form of the associated nonclassical gradient measure $I^{nclass.}$
[$\varphi$], of the information-determinicity content in the quantum state $|\varphi\rangle$, and ultimately to surmise the complementary Shannon descriptor $S^{nclass.}[\varphi]$ measuring the nonclassical information-indeterminicity in $|\varphi\rangle$ [9,20-25]. More specifically, the amplitude form of the classical Fisher information renders its natural generalization in terms of the system's wavefunction, i.e., the generally complex amplitude of the electron probability distribution. In this way one conjectures the general form of $I^{nclass.}[\varphi]$. The presence of a nonvanishing electronic current signifies a displacement from the system's strong stationarity. It introduces an additional "structure" element into the quantum molecular state. This current pattern implies less "uncertainty" (more "order") in the molecular electronic state compared to its classical information content, i.e., a negative sign of the nonclassical entropy (information-indeterminicity) supplement due to the system phase, and a positive sign of the associated nonclassical information-determinicity (gradient) contribution. In other words, a weak-stationary quantum state is expected to exhibit less "disorder" (more "order") than the strong-stationary state of the same probability distribution. This expectation can indeed be justified by comparing the prototype one-dimensional traveling wave, for which one has full knowledge of the current vector, with the standing wave of the same amplitude, representing a total ignorance of the flow direction.
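The Kullback-Leibler entropy deficiency recalled above can be verified numerically; the pair of shifted equal-width Gaussians, with the known analytic divergence $d^2/(2\sigma^2)$, is an illustrative assumption:

```python
import numpy as np

# Numerical check of the entropy deficiency S[rho|rho0] = Int rho ln(rho/rho0) dx
# for two equal-width Gaussians shifted by d; analytic value d**2/(2 sigma**2).
sigma, d = 1.0, 0.8
x = np.linspace(-25.0, 25.0, 5001)
dx = x[1] - x[0]

def gauss(mu):
    return np.exp(-(x - mu)**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

rho, rho0 = gauss(0.0), gauss(d)
S_kl = np.sum(rho * np.log(rho / rho0)) * dx
print(S_kl, d**2 / (2 * sigma**2))   # ~ 0.32 nats each
```

For $d = 0$ the surprisal vanishes everywhere and $S[\rho|\rho] = 0$, as required of an information distance.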
For a single electron in state $\varphi(r)$ of Eq. (8) the overall gradient measure of the information content is related to the system's overall kinetic energy and contains both the classical (probability) and nonclassical (phase/current) components [9,20-25]:
$$I[\varphi] = 4\int |\nabla\varphi(r)|^2\,dr = I[\rho] + 4\int \rho(r)\,[\nabla\phi(r)]^2\,dr \equiv I[\rho] + I[\rho,\phi].$$
Together they generate the resultant quantum measure of Eq. (62). This overall gradient determinicity-information is proportional to the average kinetic energy $T[\varphi] = \langle\varphi|\hat{T}|\varphi\rangle$, the expectation value of the particle kinetic-energy operator $\hat{T}(r)$ of Eq. (9),
$$I[\varphi] = (8m/\hbar^2)\,T[\varphi],$$
which can also be partitioned into contributions due to the system's particle-probability and phase/current densities. The two components of the generalized gradient information thus reflect the associated kinetic-energy terms. To establish the unknown nonclassical entropy complement $S^{nclass.}(r)$ of the classical Shannon density $S^{class.}(r)$ one applies Eq. (58) to the nonclassical densities of the functionals $I^{nclass.}$
[$\varphi$] [9,20-23], where for definiteness we have adopted the positive phase convention, $|\phi(r)| \equiv \phi(r) \geq 0$. This finally gives the non-positive functional of the phase-related complement of the Shannon entropy (indeterminicity) content:
$$S[\rho,\phi] = -2\int \rho(r)\,\phi(r)\,dr \leq 0.$$
This nonclassical entropy is seen to be proportional to the local magnitude of the phase function, $\phi(r) \geq 0$, the square root of the phase density $\phi(r)^2$, with the particle probability $\rho(r)$ providing the local "weighting" factor. This functional characterizes the displacement from strong stationarity in terms of the negative average spatial phase. It complements the (positive) classical Shannon entropy of Eq. (57) in the resultant measure of the global entropy (indeterminicity-information) content of both the probability and phase/current distributions of the complex electronic state $\varphi$:
$$S[\varphi] = S[\rho] + S[\rho,\phi].$$
To summarize, the system's electron distribution, related to the wave-function modulus, reveals the classical (probability) aspect of the molecular information content, while its phase (current) facet gives rise to the specifically quantum (nonclassical) entropy/information terms. Together these two contributions monitor the full (resultant) indeterminicity-information in a non-equilibrium or variational quantum state, thus providing a complete information description of its evolution towards the final equilibrium.
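The classical/nonclassical partition of the gradient information discussed above can be checked on a grid; the 1D state with a Gaussian modulus and a smooth spatial phase is an illustrative assumption ($\hbar = 1$):

```python
import numpy as np

# Grid check (1D, hbar = 1) of the partition of the gradient information for
# psi = R * exp(i*phase):
#   4*Int |psi'|^2 = Int (rho')^2/rho + 4*Int rho*(phase')^2.
x = np.linspace(-15.0, 15.0, 4001)
dx = x[1] - x[0]
R = np.pi**-0.25 * np.exp(-x**2 / 2)       # modulus, rho = R**2
phase = 0.5 * x + 0.3 * np.sin(x)          # spatial phase phi(x)
psi = R * np.exp(1j * phase)
rho = R**2

I_total = 4 * np.sum(np.abs(np.gradient(psi, dx))**2) * dx   # resultant I[psi]
I_class = np.sum(np.gradient(rho, dx)**2 / rho) * dx         # I[rho] (= 2 here)
I_nclass = 4 * np.sum(rho * np.gradient(phase, dx)**2) * dx  # I[rho, phase]
print(I_total, I_class + I_nclass)
```

Setting the phase to a constant removes the nonclassical term, recovering the purely classical Fisher measure of the same density.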
We now return to the intriguing question: what is an appropriate measure of the nonclassical gradient "entropy" (indeterminicity-information)? To justify the negative current-related entropy term we again compare the (one-dimensional) "traveling" wave of the precisely specified current direction with the associated strong-stationary ("standing") wave resulting from equal 50% probabilities of the "left" and "right" traveling waves of the same amplitude. The latter case implies our total ignorance of the direction of the wave vector, as well as the vanishing average current and nonclassical entropy/information supplements, $S[\rho,\phi] = I[\rho,\phi] = 0$, while, say, the 100% "right" traveling wave represents a finite current in this direction, and hence nonvanishing $S[\rho,\phi]$ and $I[\rho,\phi]$. Clearly, the pure traveling-wave situation represents a lower degree of electronic "uncertainty" ("ignorance"), i.e., lower entropy (indeterminicity-information), $S[\rho,\phi] < 0$ or $S[\varphi] < S[\rho]$, and thus higher determinicity-information. In the local (gradient) approach to the information measure, the classical part $I[\rho]$ represents the information received after hypothetically removing the uncertainty in the particle position, so that it also measures the classical part of the gradient uncertainty (entropy). Since the presence of a finite current introduces less uncertainty into the overall (probability and current) electronic structure, the negative of the nonclassical contribution $I[\rho,\phi] > 0$ provides a good candidate for the gradient, current-related indeterminicity measure, thus giving rise to the resultant gradient "entropy":
$$\tilde{I}[\varphi] = I[\rho] - I[\rho,\phi].$$
It has indeed been demonstrated [23,25] that for the given electron density $\rho(r)$ the unconstrained (horizontal) extrema of both $\tilde{I}[\varphi]$ and $S[\varphi]$ have the same equilibrium-phase solution of Eq. (17), which defines the equilibrium state of Eq. (18).
One similarly introduces the appropriate nonclassical contributions to the information-distance measures [22-25,43], related to the phase/current degrees-of-freedom of the two compared quantum states $\varphi$ and $\varphi_0$, which generate the associated (probability, phase, current) characteristics $(\rho, \phi, j)$ and $(\rho_0, \phi_0, j_0)$, respectively. It appears that the most appropriate generalized measure of the information distance results from generalizing the symmetrized measure of Kullback [6]. Its nonclassical complement then generates the symmetrized information distance due to the phase surprisal in the associated resultant measure determining the resultant entropy deficiency between the two complex wave functions. The corresponding generalization of the symmetrized gradient information distance [11] [see Eq. (61)] includes the associated nonclassical complement in the resultant measure, where $\bar{I}^{class.}(p)$ denotes the density of the classical gradient information in momentum space. These complementary densities in p-space obey the same relation as their r-space counterparts [compare Eq. (58)]. The corresponding nonclassical supplement to the classical gradient measure of Eq. (81) again results from examining this determinicity-information for the complex momentum-probability amplitude $\bar{\varphi}(p)$ [compare Eq. (62)]. The above classical and nonclassical entropies then determine the resultant measure $\bar{S}[\bar{\varphi}]$ of the global "uncertainty" in state $\bar{\varphi}$ and its combined density $\bar{S}(p)$. The densities of these nonclassical gradient and global functionals thus obey the momentum analog of the r-space relation of Eq. (70). Let us now examine the phase equilibria in momentum space. We begin by exploring the extrema of the nonclassical entropic terms alone, which determine the so-called "vertical" equilibria, for the fixed classical information contributions due to the (fixed) momentum density. One observes that the IT rule of the maximum value of $\bar{S}^{nclass.}$
[$\bar{\varphi}$] with respect to $\chi(p)$, i.e., the vertical entropy principle in p-space, again identifies the strong-stationary state, corresponding to the vanishing phase: $\max_{\chi(p)} \bar{S}^{nclass.}[\bar{\varphi}] \Rightarrow \chi_{eq.}(p) = 0$.
This vertical solution also determines the minimum of the complementary gradient determinicity-information $\bar{I}^{nclass.}[\bar{\varphi}]$. One observes, however, that the zero-current (strong-stationary) state $\varphi_i(r)$ in position space generally implies a finite-current (weak-stationary) state $\bar{\varphi}_i(p) \equiv \mathcal{F}[\varphi_i(r)]$ in momentum space. Only by subsequently enforcing the extremum of the nonclassical entropy contribution in p-space, i.e., the momentum vertical principle, does one determine the vertical equilibrium in momentum space, which exhibits exactly vanishing values of both the momentum current and the associated nonclassical information contribution. Therefore, in such vertical-equilibrium states the classical (probability) functionals in the momentum representation indeed amount to the overall information content of the electronic momentum-space structure.
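A small symbolic sketch of the horizontal-equilibrium current ($\hbar = m = 1$), assuming the r-space analog $\phi_{eq.}(x) = -(1/2)\ln\rho(x)$ of the optimum momentum phase $\chi_{eq.}(p) = \pm(1/2)\ln\pi(p)$ obtained from the Euler equation discussed in this section; the Gaussian model density is likewise an assumption of this example:

```python
import sympy as sp

# Equilibrium-phase current (hbar = m = 1), assuming phi_eq = -(1/2) ln rho:
# the induced current j = rho * phi_eq' then reduces to -rho'/2, i.e., a pure
# density-gradient flow.
x = sp.symbols('x', real=True)
rho = sp.exp(-x**2) / sp.sqrt(sp.pi)      # model ground-state density

phi_eq = -sp.log(rho) / 2                 # assumed equilibrium phase
j_eq = sp.simplify(rho * sp.diff(phi_eq, x))
print(j_eq, sp.simplify(j_eq + sp.diff(rho, x) / 2))  # second entry -> 0
```

This illustrates the closing statement of the section: the logarithmic equilibrium phase generates a non-vanishing current, and hence finite nonclassical entropy/information contributions.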
The physically unconstrained extrema of the resultant entropy/information functionals, combining the classical and nonclassical terms, similarly determine the "horizontal" equilibria in molecular systems. It has been demonstrated elsewhere [23,25] that the equilibrium phase of Eq. (17) determines the equilibrium state $\varphi_{eq.}$ in r-space [Eqs. (17) and (18)]. The same solution follows from the maximum principle of the gradient "entropy" (indeterminicity-information) in momentum space. It gives rise to the Euler equation for the optimum phase, and hence the optimum phase $\chi_{eq.}(p) = \pm(1/2)\ln\pi(p)$. The correct sign of this horizontal solution [see Eq. (93)] follows from the adopted positive phase convention. One also observes that in this equilibrium state the momentum current remains unaffected by the equilibrium phase transformation of Eq. (18). Finally, let us summarize the corresponding information distances (entropy deficiencies) between two complex states $\bar{\varphi}$ and $\bar{\varphi}_0$ in the momentum representation, defined by their phases $\chi$ and $\chi_0$ and the classical amplitudes $M$ and $M_0$, giving rise to the corresponding momentum probabilities $\pi = M^2$ and $\pi_0 = (M_0)^2$. The resultant entropy deficiency between these two electronic states in p-space combines the probability and phase terms, where $\bar{s}_\pi(p) = \ln[\pi(p)/\pi_0(p)] = \ln w_\pi(p)$ and $\bar{s}_\chi(p) = \ln[\chi(p)/\chi_0(p)] = \ln w_\chi(p)$ respectively denote the probability and phase surprisals in momentum space. The corresponding terms of the resultant gradient measure of the Fisher-type information distance follow in the same way.
The majority of molecular wave functions are constructed from one-particle states, the molecular orbitals, e.g., the natural orbitals (NO) $\psi = \{\psi_n(r)\}$ [44,45], in terms of which the one-electron density matrix reads $\gamma(r,r') = \sum_n \lambda_n\,\psi_n(r)\,\psi_n^*(r')$, where $\{\lambda_n\}$ stand for the NO occupations. The Fourier transforms $\{\bar{\psi}_n(p) = \mathcal{F}[\psi_n(r)]\}$, called momentals [28], then determine the conjugate density matrix in p-space, $\bar{\gamma}(p,p') = \sum_n \lambda_n\,\bar{\psi}_n(p)\,\bar{\psi}_n^*(p')$. In the HZM representation of DFT one constructs the N-electron Slater determinants for the given electron density $\rho(r) = \rho_0(r) = N P_0(r)$ using the equidensity orbitals $\{\varphi_l(r)\}$ [32] of Eq.

Conclusion
To accommodate the complex wave functions of molecular electronic states, the nonclassical (phase/current)-related supplements of the classical (probability) descriptors of the entropy/information content are required. In the position representation the quantum-generalized gradient measure of the Fisher determinicity-information involves a contribution due to the probability current (phase gradient), which gives rise to a non-vanishing information source. It is related to the dimensionless ("reduced") expectation value of the system's electronic kinetic energy. The resultant entropy of the Shannon indeterminicity-information descriptor similarly involves the negative average-phase contribution, which complements the familiar Shannon functional of the electronic probability distribution. This extension satisfies the requirement that the relation between the classical Shannon and Fisher information densities extends into the nonclassical (quantum) domain, between the entropy/information densities due to the state phase/current. The gradient "entropy" descriptor has also been introduced, including the negative nonclassical contribution. Both the global and gradient measures of the resultant entropy have been shown to give rise to the same phase solution marking the system's horizontal equilibrium. Similar generalized descriptors of both the resultant information content and the entropy deficiency (information distance) have been introduced in p-space. The continuity equation for the momentum density has been shown to exhibit a nonvanishing source. The Fourier transforms of the strong-stationary (zero-current) states in the position representation were shown to give rise to weak-stationary (finite-current) states in momentum space. This observation strengthens the need for the nonclassical information supplements in molecular quantum mechanics.
This fully quantum information development for a single electron has been naturally generalized into many-electron systems by using the HZM construction of modern DFT, for generating the Slater determinants giving the prescribed electron distribution.
The generalized information principles in momentum space have been examined, which determine the equilibrium states of molecules and their constituent fragments in this representation. Two types of such entropic rules have been examined: for the maxima of either the nonclassical entropy/information contribution alone, or of the associated resultant measure. The former rule determines the so-called vertical equilibrium state, while the latter principle generates the system's horizontal equilibrium. The optimum p-space horizontal phase is related to the logarithm of the momentum probability density, thus generating a non-vanishing momentum current, and hence finite nonclassical entropy/information contributions.