On entropy-continuity descriptors of molecular equilibrium states

Resultant information concepts combining descriptors of both the probability and phase/current distributions in molecular electronic states are summarized. Continuity equations for these fundamental densities are explored and the equilibrium states of molecular systems are determined from the relevant information principles. The extremum of the nonclassical entropy/information terms alone determines the system vertical-equilibrium state of the vanishing current. The maximum of the resultant entropy, which defines the system horizontal-equilibrium, corresponds to the optimum “thermodynamic” phase determined by the system particle distribution and predicts its finite current related to the gradient of electron density. This equilibrium phase-transformation conserves the probability density while modifying the current of the original state. Generalized measures of the entropy/information content give rise to a thermodynamic description of molecular equilibrium states, which establishes the entropy source and flux concepts in the underlying information continuity equation. The influence of the equilibrium-phase transformation on the continuity relations for the state probability distribution, phase density, and local entropy production is examined.


Introduction
The classical information theory (CIT) [1][2][3][4][5][6][7][8], using measures of the entropy/information content of molecular probability distributions, has been remarkably successful in addressing many classical issues in the theory of electronic structure and reactivity, e.g., [9][10][11][12]. It fails, however, to account for the structural information contained in the phase or current distributions of complex quantum states, being designed to probe solely the particle density. The electronic current, generated by the phase gradient, ultimately contributes to the resultant information content and quantum communications between bonded atoms in molecules. Such a quantum extension of information concepts, which recognizes the entropic contributions generated by the state phase density or electronic current, calls for the nonclassical complements to the familiar classical measures of the information content [13][14][15][16][17][18][19][20]. This quantum information theory (QIT) distinguishes states exhibiting the same electron distribution but differing in the composition of electronic currents, which CIT fails to do.
In QIT an adequate measure of the resultant information content of molecular electronic states must reflect their complete "structure" aspect, accounting for the entropy/information characteristics of the spatial distributions of both the probability and phase components of the system wavefunction. The electron density, related to the wavefunction modulus, reveals the classical (probability) aspect of the state information content, while the phase/current facet gives rise to the nonclassical entropy/information terms. Together these two contributions allow one to monitor the full (resultant) information content of, say, degenerate, non-equilibrium or general (variational) states, thus providing the complete information description of their evolution towards the ground-state equilibrium.
Such a combined perspective also applies to the cross-entropy (information-distance) quantities and electronic communications in molecules, which generate the entropic descriptors of chemical bonds. In QIT the classical information channel reflects the probability-scattering from the "input" to "output" events relevant for the adopted resolution level. Its nonclassical supplement involves the associated networks of elementary phase- and current-propagations in the molecular bond system [19]. Such phase/current networks generate the nonclassical supplements to the classical entropic descriptors of molecular bond multiplicities and their covalent/ionic composition [19,20].
A conservation in time of the overall classical measures of the entropy/information content in the position-probability density, due to the sourceless character of the continuity equation for the quantum distribution of electrons, has been stressed [11] and the continuity equations for the resultant Shannon and Fisher type measures have been explored [21,22]. These full measures exhibit finite source terms due to the nonclassical information terms. The phenomenological description of equilibria in molecular systems has been proposed [17,18,21], which resembles that developed in the ordinary irreversible thermodynamics [23]. This nonequilibrium treatment of the time evolution of entropic probes of molecular electronic states has introduced a new framework for describing in information terms the dynamics of chemical processes [20]. The relevant entropy current and source concepts have been identified in the continuity equation for the resultant entropy density, and the local sources of the overall Shannon (global) and Fisher (gradient) type entropies have been interpreted [22].
The sourceless character of the continuity equation for the quantum probability density in position representation implies a conservation of the classical information measures in time. This is no longer the case when the information contributions due to the phase/current distributions are taken into account. Indeed, the continuity equations for these nonclassical degrees-of-freedom of molecular electronic states exhibit finite sources [11,14,17,18,24], which ultimately contribute to the time dependence of the resultant entropy measures. These nonclassical state-variables also introduce a nonvanishing source term into the associated continuity equation for the resultant entropy, which expresses a local balance between the flow and source contributions in the time evolution of the resultant information density [22]. However, the specific form of this continuity relation for the Fisher or Shannon type of the overall entropy/information measure is phase/current dependent, with different flux definitions only reshuffling the local probability and phase rates, known from the molecular Schrödinger equation, between the inflow and source parts of the underlying phase-continuity equation.
The molecular phase-equilibria are determined by the extremum information principles involving either the nonclassical or resultant entropy/information concepts [15][16][17][18][25,26]. The latter determine the so-called horizontal-equilibrium state, which represents the phase-transformed quantum state of a molecule, corresponding to the optimum, density-dependent "thermodynamic" phase. This manipulation of the electronic state, without an accompanying change of quantum observables, preserves the electron probability distribution but modifies its current. It is the main purpose of this work to examine how this "thermodynamic" transformation affects the continuity equations for the electron probability, phase, and entropy densities.

Probability and phase continuity
Let us consider a general one-electron state ϕ(r, t) = R(r, t) exp[iφ(r, t)], with R(r, t) and φ(r, t) representing its modulus and phase components, respectively. We adopt the usual Born-Oppenheimer (BO) approximation of fixed nuclear positions. In this prototype molecular scenario one envisages a single electron moving in the external potential v(r) due to the "frozen" nuclei, described by the Hamiltonian

Ĥ(r) = −(ħ²/2m)Δ + v(r). (1)

The state probability distribution and its current density read:

p(r, t) = |ϕ(r, t)|² = R(r, t)², (2)

j(r, t) = (ħ/m) p(r, t) ∇φ(r, t). (3)

These principal distributions reflect the state's two independent components: its modulus and the phase, whose gradient determines the current density j ≡ j_p. The wavefunction modulus factor R, the classical amplitude of the particle probability distribution p = R², and the state phase φ, or its gradient ∇φ, thus constitute the two fundamental "degrees-of-freedom" of electronic states. They both contribute to the resultant entropy/information descriptors of molecular quantum states, which combine the classical contributions, due to the particle probability distribution, with the nonclassical terms, related to the state phase/current [13][14][15][16][17][18][19][20]. The dynamics of this quantum state in wave mechanics is determined by the Schrödinger equation (SE):

iħ ∂ϕ(r, t)/∂t = Ĥ(r) ϕ(r, t). (4)

One further recalls that the weighted sum of this equation and its Hermitian conjugate gives rise to the sourceless form of the continuity equation for the particle probability distribution p(r, t):

∂p(r, t)/∂t = −∇·j(r, t), i.e., dp(r, t)/dt ≡ σ_p(r, t) = ∂p(r, t)/∂t + ∇·j(r, t) = 0. (5)

This continuity relation for the position-space probability expresses a local balance in electron redistributions from the angle of the wavefunction modulus: the local rate of the probability density in the fixed "monitoring" volume element, ∂p(r, t)/∂t, is solely due to the probability outflow measured by the negative divergence of the probability current density, −∇·j(r, t), while its rate in the volume element moving with the particle, dp(r, t)/dt, i.e., the probability source, identically vanishes: σ_p(r, t) = 0.
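The modulus/phase decomposition above can be illustrated numerically. The sketch below is a hypothetical 1D example (atomic units, ħ = m = 1): it builds the probability density p = R² and the current j = (ħ/m) p ∇φ on a grid, for an assumed Gaussian modulus and an assumed linear, plane-wave-like phase φ = kx, and checks that the total probability is normalized while the integrated flux equals ħk/m.

```python
import numpy as np

# Hypothetical 1D illustration (atomic units, hbar = m = 1) of the two
# fundamental state components: modulus R and spatial phase phi.
hbar = m = 1.0
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

R = (1.0/np.pi)**0.25 * np.exp(-x**2 / 2.0)   # assumed Gaussian modulus factor
k = 0.7                                       # assumed constant phase gradient
phase = k * x                                 # plane-wave-like spatial phase

p = R**2                                      # probability density p = R^2
j = (hbar/m) * p * np.gradient(phase, dx)     # current j = (hbar/m) p * dphi/dx

norm = p.sum() * dx    # total probability (should be 1)
flux = j.sum() * dx    # integrated current, (hbar*k/m) * <1> = k here
```

For this constant phase gradient the current is simply proportional to the density itself, the discrete analog of the weak-stationary, plane-wave situation mentioned later in the text.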
The weighted difference of SE and its conjugate similarly determines the time derivative of the state phase φ(r, t) [14],

∂φ(r, t)/∂t = [ħ/(2m)][R(r, t)⁻¹ΔR(r, t) − |∇φ(r, t)|²] − v(r)/ħ, (6)

or the associated rate of the phase-density f(r, t) = φ(r, t)² ≥ 0:

∂f(r, t)/∂t = 2φ(r, t) ∂φ(r, t)/∂t. (7)
Ascribing the flux concept to the phase component of molecular states is not unique [14,20,25,26]. Alternative phase-flow concepts J_f(r, t) only reshuffle the known rate of the state phase-density between the outflow, −∇·J_f(r, t), and source, Σ_f(r, t), contributions in the underlying phase-continuity equation:

∂f(r, t)/∂t = −∇·J_f(r, t) + Σ_f(r, t). (8)

Therefore, for definiteness, one ascribes the whole derivative of Eq. (7) exclusively to the absolute (abs.) phase-source, for the identically vanishing phase-current [20,22,25,26]:

J_f^abs.(r, t) ≡ 0, Σ_f^abs.(r, t) = ∂f(r, t)/∂t. (9)

Classical entropy/information measures and their nonclassical complements
At the specified time t, which for simplicity we temporarily leave out from the list of the state arguments, the Fisher measure [1,2] of the gradient information in p(r), reflecting the average determinicity-information content in this probability density, is reminiscent of the von Weizsäcker [27] inhomogeneity correction to the kinetic energy functional in Thomas-Fermi-Dirac (TFD) theory:

I[p] = ∫ p(r) [∇ln p(r)]² dr ≡ ∫ p(r) I_p(r) dr,

where I_p(r) = [∇p(r)/p(r)]² stands for the functional density-per-electron. This intrinsic-accuracy descriptor measures an effective localization (compactness, narrowness, "order") of the probability distribution. It simplifies when expressed in terms of the classical probability amplitude R(r) = p(r)^1/2,

I[p] = 4 ∫ [∇R(r)]² dr,

thus revealing that it effectively measures the average length of the gradient of the state modulus factor R(r). The complementary CIT descriptor is the classical entropy of Shannon [3,4],

S[p] = −∫ p(r) ln p(r) dr ≡ ∫ p(r) S_p(r) dr,

where the associated density-per-electron, S_p(r) ≡ S^class.(r) = −ln p(r), reflects the average indeterminacy-information content in p(r). This functional measures the average uncertainty (diffuseness, spread, "disorder") in the probability distribution. It also provides the average amount of the information received when this uncertainty is removed by the particle-localization experiment: I^S[p] = S[p]. One further observes that the densities of these complementary gradient and global entropy/information probes are mutually related:

I_p(r) = [∇S_p(r)]². (14)

The amplitude form of the Fisher information renders its natural, resultant generalization in terms of the system complex probability amplitude, i.e., the wavefunction ϕ(r) itself:

I[ϕ] = 4 ∫ |∇ϕ(r)|² dr = I[p] + 4 ∫ p(r) [∇φ(r)]² dr ≡ I[p] + I^nclass.[ϕ].

It is proportional to the state overall kinetic energy T[ϕ], I[ϕ] = (8m/ħ²) T[ϕ], and introduces the nonclassical supplement I^nclass.[ϕ] to the classical gradient information, due to the phase/current. Applying the classical relation of Eq. (14) to the nonclassical entropy/information densities [15][16][17][18][19][20][25,26],

S_φ(r) = −2|φ(r)| and I_φ(r) = 4[∇φ(r)]², (17)

one finds from Eqs. (14) and (17) that the square of the gradient of the Shannon probe of the state indeterminicity (disorder) generates the density of the complementary Fisher measure of the state determinicity (order). For the positive phase convention, φ = |φ| = f^1/2 ≥ 0, this relation finally gives:

I_φ(r) = [∇S_φ(r)]².

The nonclassical entropy

S^nclass.[ϕ] = ∫ p(r) S_φ(r) dr = −2 ∫ p(r)|φ(r)| dr = −2⟨φ⟩

thus reflects the state average phase ⟨φ⟩, the relevant densities-per-electron of the nonclassical entropy/information contributions being given in Eq. (17). For the specified probability density p and the adopted phase convention, the negative sign of S^nclass.[ϕ] signifies the maximum entropy in the strong-stationary molecular state, when φ(r) = 0 and j(r) = 0. One thus predicts that the probability-constrained (vertical) displacements φ(r) ≥ 0 from this strong-stationary, vertical-equilibrium state ϕ, for which the nonclassical entropy/information terms exactly vanish, lower the resultant entropy S[ϕ], the overall information-indeterminicity measure, below the initial phase-maximum level S^class.[ϕ] = S[p]. Therefore, in close analogy to the maximum-entropy principle of the ordinary thermodynamics [23], such transitions of the state spatial phase, 0 → φ(r), diminish the nonclassical entropy component, with −S^nclass.[ϕ] ≥ 0 then measuring the entropy change corresponding to the reverse displacement φ(r) → 0, which restores the system vertical equilibrium (see Fig. 1).

A complementary picture emerges when one examines the Fisher-type, gradient measure of the state information-determinicity. For the specified probability distribution p(r), the vertical equilibrium φ(r) = 0 now represents the phase-minimum of the resultant information content. Thus, a finite displacement of the spatial phase, φ > 0, increases the gradient information content by I^nclass.[φ] ≥ 0, with −I^nclass.[φ] ≤ 0 measuring the change in the information content corresponding to the reverse displacement, φ(r) → 0, which restores the vertical equilibrium marking the vanishing nonclassical information.
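The classical definitions and the relation of Eq. (14) are easy to verify numerically. The sketch below is a hypothetical Gaussian example (the width `sigma` is an assumed parameter): it computes the Shannon and Fisher densities-per-electron of a normal distribution, checks that the Fisher density equals the squared gradient of the Shannon density, and recovers the closed-form values S[p] = (1/2) ln(2πeσ²) and I[p] = 1/σ².

```python
import numpy as np

sigma = 1.3                     # assumed width of the model Gaussian density
x = np.linspace(-12.0, 12.0, 8001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2*sigma**2)) / np.sqrt(2*np.pi*sigma**2)

S_dens = -np.log(p)                      # Shannon density-per-electron, -ln p
I_dens = (np.gradient(p, x) / p)**2      # Fisher density-per-electron, (grad p / p)^2

S = np.sum(p * S_dens) * dx              # classical Shannon entropy
I = np.sum(p * I_dens) * dx              # classical Fisher (gradient) information

# Eq. (14): the Fisher density is the squared gradient of the Shannon density
rel_err = np.sum(p * np.abs(I_dens - np.gradient(S_dens, x)**2)) * dx
```

For the Gaussian, narrowing the distribution (smaller `sigma`) raises I and lowers S, illustrating the complementary "order"/"disorder" roles of the two probes.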
Therefore, the presence of a finite spatial phase, i.e., of a nonvanishing electronic current, signifies a displacement from the system strong stationarity. It introduces an additional "structure" element of quantum systems, which is not recognized in CIT. The current pattern implies less "uncertainty" (more "order") in the molecular electronic state, compared to its classical information content. The CIT approach formally corresponds to vanishing local currents, i.e., to the "standing-wave" solution representing our total "ignorance" of the current direction, when the "forward" and "backward" currents have exactly the same probability. Thus, a state with a definite current direction and a finite current magnitude exhibits a higher information content, i.e., a lower entropy descriptor, compared to the strong-stationary state, in which the direction of the current remains undefined, with the "forward" and "backward" contributions exactly cancelling each other.

Indeed, since a higher information content implies less state uncertainty, one predicts the negative sign of the nonclassical phase-entropy (indeterminicity) supplement S^nclass.[ϕ] and the positive sign of the nonclassical current-information (determinicity) term I^nclass.[ϕ]. A complex quantum state exhibiting a nonvanishing spatial phase thus carries less resultant entropy than its maximum value in the strong-stationary state of the same probability distribution, and hence relatively more resultant information. Indeed, the close relation of the classical gradient measure to the kinetic energy suggests a similar relation between its nonclassical supplement and the kinetic energy due to the electronic current, thus implying a positive sign of the current contribution to the resultant Fisher-type information. The negative nonclassical entropy contribution, which lowers the resultant entropy relative to its classical value, further implies less resultant information received when this current "order" is removed in the strong-stationary state of the same density.
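These sign predictions can be checked directly with the nonclassical functionals S^nclass. = −2∫p|φ|dr and I^nclass. = 4∫p(∇φ)²dr, taken here as working assumptions for a hypothetical 1D state with an assumed nonzero phase profile: the nonclassical entropy comes out negative, the nonclassical gradient information positive, and the resultant entropy drops below its classical, strong-stationary value.

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
p = np.exp(-x**2) / np.sqrt(np.pi)         # fixed model probability distribution
phi = 0.3 * x**2                           # assumed nonzero spatial phase, phi >= 0

S_class = np.sum(-p * np.log(p)) * dx      # classical Shannon entropy S[p]
S_nclass = -2.0 * np.sum(p * np.abs(phi)) * dx            # phase entropy (<= 0)
I_nclass = 4.0 * np.sum(p * np.gradient(phi, x)**2) * dx  # current information (>= 0)

S_resultant = S_class + S_nclass           # resultant entropy S[phi] < S[p]
```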
To summarize, the average resultant entropy is determined by the sum of the classical, S^class.[ϕ] = S[p], and nonclassical, S^nclass.[ϕ] = S[φ], entropy components, each separately related [see Eqs. (14) and (17)] to the corresponding contribution in the resultant gradient information. Of interest also is the state resultant gradient entropy [25], a local indeterminicity descriptor Ĩ[ϕ]. Since the classical Shannon entropy also measures the information content in the probability distribution, S[p] = I^S[p], one similarly attributes the classical Fisher functional to the classical part of this gradient entropy: Ĩ^class.[ϕ] = I[p]. Moreover, since the presence of a finite current diminishes the state resultant uncertainty, we ascribe the negative nonclassical gradient term to the nonclassical gradient entropy, Ĩ^nclass.[ϕ] = −I^nclass.[ϕ] ≤ 0, so that Ĩ[ϕ] = I[p] − I^nclass.[ϕ]. One also observes that the density-per-electron of this nonclassical gradient entropy is now related to its Shannon-type analog by the modified variant of Eq. (17): Ĩ_φ(r) = −[∇S_φ(r)]². The nonpositive character of the nonclassical entropies S^nclass.[ϕ] and Ĩ^nclass.[ϕ] manifests that the presence of a finite electronic current decreases the uncertainty ("disorder") level in the electronic state. This is also in accord with the nonnegative, kinetic-energy-type Fisher measure I^nclass.[ϕ] ≥ 0, which reflects a degree of the state nonclassical determinicity ("order").

Molecular phase-equilibria
The maxima of the generalized measures of the quantum entropy content determine the associated molecular phase-equilibria [15][16][17][18][25,26]. Let us first summarize the vertical equilibrium principles, for the fixed probability distribution p = p₀ = R₀² of the nondegenerate ground state ϕ₀ = R₀, which determines the trial (variational) states ϕ(r) = R₀(r) exp[iφ(r)]. They consistently predict the strong-stationary ground-state solution φ^opt.(r) = φ₀(r) = 0 as the system vertical equilibrium, for which the electronic current exactly vanishes. Thus, the nonclassical, (phase/current)-related information principles (26a) all identify the lowest eigenstate of the electronic Hamiltonian as the vertical-equilibrium state of this one-electron system. The ground-state probability distribution p₀ then determines all physical properties of the system, functionals of the electron density alone, in accordance with the first Hohenberg-Kohn theorem of the modern density functional theory (DFT) [28,29]. The vanishing spatial phase in the nondegenerate ground state thus signifies the complete absence of the current aspect in the molecular electronic structure. Any displacement from this extreme, strong-stationary situation is manifested by the presence of a finite nonclassical entropy/information contribution, reflecting the average magnitude of either the state spatial phase (in Shannon's measure) or of its gradient (in Fisher's descriptor). These nonclassical terms reflect the extra current "disorder" or "order" in the quantum state under consideration: an increased presence of currents implies more overall "structure" (order) in the system and hence less overall electronic "uncertainty" (disorder). This expectation is reflected by the negative signs of the nonclassical complements S^nclass.[ϕ] and Ĩ^nclass.[ϕ] of the classical Shannon or gradient entropy terms.

Therefore, any finite displacement φ(r) ≠ 0 of the spatial phase from the φ₀(r) = 0 reference level generates negative displacements of the resultant entropies from their classical values (Fig. 1). The resultant gradient entropy Ĩ[ϕ] likewise reaches its phase-maximum value I[p₀] for the lowest equilibrium state ϕ₀ (Fig. 1).
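The vertical principles can be illustrated by scaling a trial phase, φ_λ = λφ, at fixed p₀. In the hypothetical 1D sketch below (an assumed nonnegative phase profile), the nonclassical entropy −2∫p₀|φ_λ| decreases monotonically away from λ = 0 while the nonclassical gradient information 4∫p₀(∇φ_λ)² grows, so both extrema sit at the strong-stationary point φ = 0, as in Fig. 1.

```python
import numpy as np

x = np.linspace(-8.0, 8.0, 2001)
dx = x[1] - x[0]
p0 = np.exp(-x**2) / np.sqrt(np.pi)        # fixed ground-state probability
phi = np.cos(x)**2                         # assumed trial phase profile, phi >= 0

lams = np.linspace(0.0, 1.0, 11)           # phase scaling 0 -> phi(r)
S_n = np.array([-2.0*np.sum(p0*np.abs(l*phi))*dx for l in lams])
I_n = np.array([ 4.0*np.sum(p0*np.gradient(l*phi, x)**2)*dx for l in lams])
```

The entropy displacement is linear in λ and the information displacement quadratic, reflecting the |φ| and (∇φ)² structures of the two nonclassical functionals.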
We now put these measures to a consistency test by exploring the horizontal phase-equilibria, φ^eq.[p; r], which mark extrema of the resultant global and gradient entropies. The associated Euler equations determine the optimum phase φ^eq. of such equilibrium states in terms of the state probability distribution: φ^eq. = φ^eq.[p]. One expects that predictions resulting from these complementary Shannon-type and gradient resultant entropy measures should have common solutions.
Let us first examine implications of the maximum principle of the resultant Shannon-type (global) entropy. The functional derivative, with respect to ϕ*(r), of the resultant entropy functional S[ϕ] = ⟨ϕ|Ŝ|ϕ⟩, with the multiplicative entropy operator Ŝ(r), ultimately determines the horizontal-equilibrium state ϕ^eq. corresponding to the "thermodynamic" phase φ^eq.[p]. The relevant Euler equation determining this equilibrium phase in terms of the probability density p, φ^eq. = φ^eq.[p], predicts the equilibrium thermodynamic phase related to the electron probability distribution:

φ^eq.(r) = −(1/2) ln p(r). (31)

For the ground-state probability distribution p = p₀ this prediction is in the spirit of the Hohenberg-Kohn theorem of modern DFT, that the ground-state density, or the associated probability distribution, uniquely identifies the system electronic state.
The optimum phase φ^eq.[p] of the Euler equation resulting from the extremum of the resultant gradient entropy indeed recovers the solution of Eq. (31). Thus, the maximum-entropy principles derived from the quantum-generalized global and gradient entropies predict the same horizontal-equilibrium state, corresponding to the spatial phase determined by the negative logarithm of the system electron probability density. This unitary transformation of the system wavefunction, ϕ^eq.(r, t) = ϕ(r, t) exp{iφ^eq.[p; r]}, modifies the initial probability current in ϕ into the equilibrium current:

j^eq.(r, t) = j(r, t) + (ħ/m) p(r, t) ∇φ^eq.(r, t) = j(r, t) − (ħ/m) R(r, t)∇R(r, t). (39)
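The current added by this phase transformation is easily checked numerically. In the hypothetical 1D sketch below (atomic units, assumed Gaussian density), the current generated by the thermodynamic phase φ^eq. = −(1/2) ln p, namely (ħ/m)p∇φ^eq., is compared with the closed form −(ħ/m)R∇R = −(ħ/2m)∇p appearing in Eq. (39).

```python
import numpy as np

hbar = m = 1.0
x = np.linspace(-8.0, 8.0, 4001)
p = np.exp(-x**2) / np.sqrt(np.pi)        # assumed model probability distribution
R = np.sqrt(p)                            # classical amplitude

phi_eq = -0.5 * np.log(p)                 # "thermodynamic" phase, -(1/2) ln p
j_phase = (hbar/m) * p * np.gradient(phi_eq, x)   # current added by the transform
j_closed = -(hbar/m) * R * np.gradient(R, x)      # closed form -(hbar/m) R grad R

max_dev = np.max(np.abs(j_phase - j_closed))      # the two expressions agree
```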
Let us now examine how this unitary equilibrium transformation of the initial wavefunction influences the continuity equations of the probability and phase distributions.
We recall that the stationary-probability condition p_s(r, t) = p(r) covers both the plane-wave, weak-stationary (j = const.) and the strong-stationary (j = 0) states. It can now be explicitly demonstrated that the phase transformation of the wavefunction, which generates the horizontal-equilibrium state, does not affect the local time derivative in the continuity equation for the particle probability distribution. It indeed follows from Eq. (44) and its Hermitian conjugate that their weighted difference gives:

∂p/∂t = (iħ)⁻¹[ϕ^eq.* Ĥ^eq. ϕ^eq. − ϕ^eq.(Ĥ^eq. ϕ^eq.)*] = −ħ/(2mi)[ϕ^eq.* Δϕ^eq. − ϕ^eq. Δϕ^eq.*] = −∇·j^eq. + σ_p,eq..

Therefore, the probability source σ_p,eq. in the horizontal-equilibrium state ϕ^eq. is modified relative to σ_p = 0 [Eq. (5)] in the initial state ϕ:

σ_p,eq. = ∂p/∂t + ∇·j^eq. = ∇·(j^eq. − j) = −[ħ/(2m)] Δp.

It should be observed, however, that, by the Gauss theorem, this equilibrium local source of the probability density does not affect its overall normalization, since the particles are neither produced nor destroyed: ∫ σ_p,eq.(r, t) dr = 0. One similarly recovers in the equilibrium state the phase-derivative equations (6) and (7). From the weighted sum of Eq. (44) and its Hermitian conjugate one again finds that, at the specified particle location, the partial time-derivative of the original phase distribution f = φ² remains unaffected by the equilibrium phase transformation. However, in this equilibrium state the time derivative of the resultant phase Φ^eq. = φ + φ^eq., φ^eq. = −(1/2) ln p, reads:

∂Φ^eq./∂t = ∂φ/∂t + ∂φ^eq./∂t = ∂φ/∂t − [1/(2p)](∂p/∂t) = ∂φ/∂t + [1/(2p)] ∇·j,

where we have used the probability continuity equation. In the absolute source scale of Eq. (9) the equilibrium phase-source is determined by the partial time derivative of the equilibrium phase density f^eq. = (Φ^eq.)²:

∂f^eq./∂t = 2Φ^eq.(∂Φ^eq./∂t) = Σ_f^abs. − (ln p)(∂φ/∂t) + 2Φ^eq.(∂φ^eq./∂t).

This equation also identifies the displacement of Σ_f,eq.^abs. relative to the original phase-source of Eq. (9), Σ_f^abs. = ∂f/∂t. We thus conclude that the equilibrium phase transformation of electronic states, while preserving the molecular probability distribution, introduces finite displacements in the sources of both the probability and phase distributions.
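The Gauss-theorem argument can be illustrated numerically. For the thermodynamic phase φ^eq. = −(1/2) ln p the equilibrium probability source reduces to the divergence of the current displacement, ∇·(j^eq. − j) = −(ħ/2m)Δp (a form implied by Eq. (39)); the hypothetical 1D sketch below (atomic units, assumed Gaussian density) confirms that this locally nonzero source still integrates to zero for a localized density.

```python
import numpy as np

hbar = m = 1.0
x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]
p = np.exp(-x**2) / np.sqrt(np.pi)        # localized model density

# equilibrium probability source: divergence of the current displacement,
# sigma = div(j_eq - j) = -(hbar/(2m)) * Laplacian(p)  (1D second derivative)
sigma = -(hbar/(2.0*m)) * np.gradient(np.gradient(p, x), x)

local_max = np.max(np.abs(sigma))         # the source is locally nonzero...
total = sigma.sum() * dx                  # ...but integrates to zero (Gauss theorem)
```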
Phase-equilibria in N-electron systems

In this section we briefly summarize the equilibrium one-determinantal states of molecular systems, e.g., in the familiar Kohn-Sham (KS) or Hartree-Fock (HF) self-consistent field (SCF) theories, which result from extrema of the resultant entropy/information functionals including the nonclassical contributions due to the current/phase components of the molecular orbitals (MO) defining the electron configuration in question. The relevant variational space consists of the antisymmetric wavefunctions of N fermions yielding the specified particle distribution, provided by the Harriman-Zumbach-Maschke (HZM) construction.
The density-conserving Slater determinants generated in the HZM construction provide natural variational functions for the vertical thermodynamic searches in the IT principles of electronic-structure theory. These trial N-electron functions are constructed using the complex, plane-wave-type equidensity MO {ϕ_k(r) = R(r) exp[iΦ_k(r)]}. They offer a convenient framework for an extension of the present analysis to the general N-electron case. In constructing the mutually orthogonal Slater determinants generating the specified electron density ρ(r) = N p(r), these MO adopt the equal, density-dependent modulus R(r) = [ρ(r)/N]^1/2 = p(r)^1/2 and the resultant spatial phase defined by the orbital reduced momentum k, the density-dependent vector function f(r) = f[ρ; r], common to all orbitals and linked to the Jacobian of the r → f(r) transformation, and the system "thermodynamic" phase φ(r), identical in all occupied orbitals. The "orthogonality" phases {F_k(r)} assure the independence of these single-particle states. The optimum wave-number vectors k⁰ = {k_l⁰} and the associated density ρ₀(r) are determined at the SCF MO stage, i.e., from the energy-minimization principle using the HZM determinant as the variational wavefunction. The thermodynamic phase φ₀(r) is subsequently determined from the resultant entropic rule.
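The orthogonality mechanism of the equidensity orbitals can be sketched in one dimension, where the analog of the density-dependent function f reduces to 2π times the cumulative distribution F of p (a standard 1D reduction, used here as an illustrative assumption): the orbitals √p·exp[i·2πk·F(x)] with integer k then all share the same density |ϕ_k|² = p while remaining mutually orthogonal, since ∫p e^{i2π(k−k')F} dx = ∫₀¹ e^{i2π(k−k')u} du = δ_kk'.

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
p = np.exp(-x**2) / np.sqrt(np.pi)        # prescribed probability distribution

# 1D analog of the density-dependent function f: f(x) = 2*pi*F(x),
# with F the cumulative distribution of p (midpoint rule for accuracy)
F = (np.cumsum(p) - 0.5*p) * dx

def orbital(k):
    # equidensity orbital: common modulus sqrt(p), phase k * 2*pi * F(x)
    return np.sqrt(p) * np.exp(1j * 2.0*np.pi * k * F)

overlap01 = np.sum(np.conj(orbital(0)) * orbital(1)) * dx   # ~ 0
norm1 = np.sum(np.abs(orbital(1))**2) * dx                  # ~ 1
```

The substitution u = F(x), du = p dx makes the overlap integral a full period of a plane wave, which is the 1D counterpart of the Jacobian condition mentioned above.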
The equilibrium HZM determinant is thus defined by the N lowest equidensity MO for the prescribed ground-state probability distribution p₀(r) = [R₀(r)]² = ρ₀(r)/N, for which f[p₀; r] ≡ f₀(r) and Φ_k[p₀; r] = k·f₀(r) + φ(r). The overall phase Φ_k[p₀; r] of the (ground-state occupied) equidensity orbital determines its electronic current,

j_k(r) = (ħ/m) p₀(r) ∇Φ_k(r).

The configuration overall flow measure of all N electrons, j(r; N) ≡ j(r), is now determined by the expectation value of the N-electron current operator in the ground-state HZM determinant Ψ(N); it is governed by the average "wave-number" vector K⁰ = (1/N) Σ_l k_l⁰ in Ψ(N). For the fixed electron distribution in the vertical entropy principle only the nonclassical components depend upon the MO "wave-number" vectors k⁰ and their thermodynamic phase φ(r), which together determine the resultant phase in Harriman's construction. In the vertical search only φ(r) is varied, for the fixed k⁰. The maximum principle of the average nonclassical entropy S^nclass. determines the optimum thermodynamic phase of the vertical equilibrium, which also minimizes I[(p₀, k⁰), φ]. The associated equidensity orbitals are shaped by the displacements {δk_l = k_l − K⁰} of the orbital wave-number vectors from the average value K⁰. Therefore, in this equilibrium state the overall current [Eq. (59)] exactly vanishes.

Let us now examine the horizontal-equilibrium phases of the occupied equidensity orbitals in a general HZM determinant Ψ(N). The vertical energy principle fixes the MO orthogonality phases in the resultant phases {Φ_l(r) = δk_l·f₀(r) + φ(r)}, which remain conserved at the final, "thermodynamic" optimization stage determining their missing thermodynamic contribution φ(r), equal in all occupied MO of Ψ(N). This assures the orthonormality of the equilibrium orbitals and the normalization of Ψ(N): ⟨Ψ(N)|Ψ(N)⟩ = Π_l ⟨ϕ_l|ϕ_l⟩ = 1.

For the prescribed ground-state probability distribution p₀, determined at the earlier SCF energy-minimization stage, in the associated horizontal principle for the resultant entropy one thus manipulates only the thermodynamic phase component φ(r) of all MO phases {Φ_l(r)}. The horizontal phase-equilibrium extremizes the functional of the resultant entropy or information. Performing the variations of the complex-conjugate orbitals gives the associated Euler equation for the optimum equilibrium phase φ^eq.(r) of the "thermodynamic" orbitals in the general N-electron case, which finally gives

φ^eq.(r) = −(1/2) ln p₀(r). (69)

Therefore, the horizontal-equilibrium state marks the resultant orbital phases determined by the electron probability distribution alone. For p = p₀ this is again a manifestation of the Hohenberg-Kohn theorem of DFT [28], that the ground-state electron density uniquely determines the equilibrium equidensity orbitals in the HZM construction. The associated equilibrium orbital now generates the MO current contribution

j_l^eq.(r) = (ħ/m) p₀(r) [∇(δk_l·f₀(r)) + ∇φ^eq.(r)],

and hence the resultant equilibrium current of N electrons,

j^eq.(r; N) = (ħ/m) ρ₀(r) ∇φ^eq.(r) = −[ħ/(2m)] ∇ρ₀(r),

where we have again observed that Σ_l δk_l = N δK⁰ = 0. In the horizontal ground phase-equilibrium the overall current of N electrons is thus also determined by the gradient of the molecular electron density ρ₀ = N p₀. It can also be directly verified that the same equilibrium thermodynamic phase of Eq. (69) results from the extremum of the gradient entropy and its associated Euler equation.

Elements of nonequilibrium thermodynamic description
When combined with the probability- and phase-continuity equations, these relations give a thermodynamic-like expression for the rate of the local production of the resultant entropy, in which the second term vanishes in the absolute-source scale of Eq. (9). Therefore, besides the thermodynamic-like product of the probability affinity and its flux, this expression also involves a product of the phase affinity and the (absolute) source of the phase distribution. Elsewhere we have explored expressions for the corresponding sources of the resultant entropies in the position and momentum spaces, in terms of the state QIT intensities, affinities, and currents [22,24,32]. They predict that the vanishing intensities give rise to vanishing sources of the quantum Shannon- and Fisher-type entropies. A generalization of this one-electron analysis to general N-electron systems would again involve the wavefunctions of the HZM construction of modern DFT.
We have also demonstrated in Sect. 5 that this change in the state phase/current affects the sources of both the probability and phase distributions in the position representation.

Conclusion
Elsewhere we have explored in more detail the densities, currents, information measures, and continuity relations in the p-space [22,24,32]. The compositions of the probability and current distributions in the position (r) and momentum (p) spaces are quite different [33]. For example, the chemically most important external (large-r, valence) region of the position-density corresponds to the internal (low-p) region of the momentum-density. The former gives rise to a sourceless continuity relation, while the latter exhibits a finite source term, conditional on the adopted flux definition [22,24,32]. Indeed, the Fourier transforms of the strong-stationary (zero-current) states in the position representation generally give rise to weak-stationary (finite-current) states in the momentum space. This observation strengthens the need for the nonclassical information supplements in the QIT treatments of molecular electronic states.
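This r ↔ p contrast can be demonstrated with a discrete Fourier transform. In the sketch below, a hypothetical asymmetric real wavefunction (whose position-space current therefore vanishes identically) is transformed with `numpy.fft`; its momentum-space image carries a manifestly nonconstant phase over the significant momentum components, i.e., a finite momentum-space current in the sense discussed above.

```python
import numpy as np

# real (zero-current) position wavefunction: an asymmetric sum of two Gaussians
n = 2**12
x = np.linspace(-20.0, 20.0, n, endpoint=False)
dx = x[1] - x[0]
psi = np.exp(-(x - 1.5)**2) + 0.5*np.exp(-(x + 2.0)**2 / 4.0)
psi /= np.sqrt(np.sum(psi**2) * dx)

# momentum-space wavefunction (up to the discrete-transform phase convention)
psik = np.fft.fftshift(np.fft.fft(psi)) * dx / np.sqrt(2.0*np.pi)

# keep only significant momentum components and remove one global phase;
# any remaining imaginary part signals a genuinely k-dependent phase
mask = np.abs(psik) > 1e-3 * np.abs(psik).max()
q = psik[mask]
q = q * np.exp(-1j * np.angle(q[np.argmax(np.abs(q))]))
residual = np.max(np.abs(q.imag))   # nonzero -> nonconstant momentum-space phase
```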
One recalls that the change from the position representation of the present analysis to its canonical, momentum analog involves a simultaneous unitary operation performed on both the wavefunctions and the operators. Therefore, from the known properties of such unitary transformations one predicts that the above position-space conclusions remain generally valid also in the momentum space. Indeed, the unitary canonical transformations between these two representations change the form of the system wavefunction without affecting its independent variables. Quantum-mechanical operators related by unitary transformations thus represent the same physical quantity and preserve their linear and Hermitian character, the algebraic relations between observables (e.g., commutation rules), the spectrum of eigenvalues, and the matrix elements, e.g., the expectation values E = ⟨ϕ|Ĥ|ϕ⟩ = ⟨ϕ^eq.|Ĥ^eq.|ϕ^eq.⟩, j(r, t) = ⟨ϕ|ĵ(r)|ϕ⟩ = ⟨ϕ^eq.|ĵ^eq.(r)|ϕ^eq.⟩, etc.
One further observes that, due to the nonvanishing equilibrium source of the probability density in r-space, the densities of the classical information contributions exhibit nonvanishing sources in the associated r-space entropy/information continuity equations of such phase-optimized states. The nonclassical information terms also introduce finite information sources, due to the modified probability current in the equilibrium molecular state. Moreover, a generally nonvanishing source of the momentum density generates finite information-source contributions in p-space, from both the classical and nonclassical information/entropy terms. In the momentum representation a nonvanishing momentum-probability current, due to a finite p-phase component of the momentum wavefunction, together with the current-dependent terms of the resultant measures of the entropic content of molecular states, should have a profound influence on the time-evolution of the resultant entropy/information functionals.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.