Information equilibria, subsystem entanglement, and dynamics of the overall entropic descriptors of molecular electronic structure

Overall descriptors of the information (determinicity) and entropy (uncertainty) content of complex molecular states are reexamined. These resultant concepts combine the classical (probability) contributions of Fisher and Shannon, and the relevant nonclassical supplements due to the state phase/current. The information-theoretic principles determining equilibria in molecules and their fragments are explored and the nonadditive part of the global entropy is advocated as a descriptor of the classical index of the quantum entanglement of molecular subsystems. Affinities associated with the probability and phase fluxes are identified and the criterion of vanishing overall information-source is shown to identify the system stationary electronic states. The production of resultant density of the gradient-information is expressed in terms of the conjugate affinities (forces, perturbations) and fluxes (currents, responses). The Schrödinger dynamics of probability and phase components of molecular electronic states is used to determine the temporal evolution of the overall gradient information and complex entropy. The global sources of the resultant information/entropy descriptors are shown to be of purely nonclassical origin, thus identically vanishing in real electronic states, e.g., the nondegenerate ground state of a molecule.


Introduction
The electronic structure of molecules is embodied in their quantum states generating both the system particle density and current distributions. The continuity relation for the state probability density, which relates these two structural aspects of molecular wavefunctions, implies that the density dynamics is determined by the current's divergence. To paraphrase Prigogine [1], while the electron density determines a static facet of "being", the probability current reflects the state dynamic aspect of "becoming". A general electronic wavefunction is a complex entity characterized by both its modulus and phase components. The square of the former determines the particle probability distribution marking the structure of "being", while the gradient of the latter generates the state current density reflecting the structure of "becoming". These two structural manifestations give rise to the associated classical and nonclassical contributions in the resultant measures of the information/entropy content in complex electronic states [2]. Both the probability and phase/current distributions carry partial information contributions to the resultant entropic content of the underlying quantum state of a molecule.
In entropic theories of molecular electronic structure, e.g., [2][3][4][5], one thus requires an appropriate quantum generalization [2] of the familiar classical descriptors of information theory (IT) [6][7][8][9][10][11][12][13] of the information content in the probability distribution. The quantum extensions [2] of the Fisher (gradient) [6,7] and Shannon (global) [8,9] measures, appropriate for complex amplitudes (wavefunctions) of molecular quantum mechanics (QM), combine the partial contributions due to the probability (wavefunction modulus) and current (wavefunction phase) degrees-of-freedom. In the position representation the electron probability distribution p(r) alone generates the state classical amount of information, i.e., the information received from outcomes of incoherent (phase-unrelated) local events, the measurements of the particle position r. Their nonclassical complements in the resultant entropy/information measures, describing the coherent (phase-related) local events, generate the corresponding coherence entropy/information supplements [2,[14][15][16][17][18][19][20] due to the state phase ϕ(r) or the current density j(r) ∝ ∇ϕ(r).

[This paper belongs to the Topical Collection of the International Conference on Systems and Processes in Physics, Chemistry and Biology (ICSPPCB-2018) in honor of Professor Pratim K. Chattaraj on his sixtieth birthday. The following notation is adopted: A denotes a scalar, A is a row or column vector, A represents a square or rectangular matrix, and the symbol Â stands for the quantum-mechanical operator of the physical property A. The logarithm of the Shannon information measure is taken to an arbitrary but fixed base: log = log₂ corresponds to the information content measured in bits (binary digits), while log = ln expresses the amount of information in nats (natural units): 1 nat ≈ 1.44 bits.]
Similar generalized descriptors of both the overall information content and entropy-deficiency (information-distance) [10,11] can be introduced in the momentum space [2,21].
The classical IT [6][7][8][9][10][11][12][13], an important branch of applied probability theory, has already provided new insights into molecular electronic structure and generated useful descriptors of atoms in molecules, reactivity preferences, and patterns of chemical bonds, e.g., [2][3][4][5]. The classical information terms are conceptually related to modern density functional theory (DFT) [22][23][24]. They probe the entropic content of incoherent localization events, the outcomes of experiments measuring the particle position, while their nonclassical companions provide the information supplement due to the phase-coherence between such local events, which is inherent in general wavefunctions of QM. The familiar average information/entropy measures of Fisher and Shannon reflect only the information/entropy content in the system wavefunction due to the probability distribution, thus failing to distinguish states exhibiting the same electron density but different current compositions.
Therefore, in the quantum IT (QIT) description of the phase equilibria in molecular systems and their constituent fragments [2,[16][17][18][19][20][21][25][26][27][28][29] one has to unite both the probability and phase/current aspects of the system quantum states, in order to fully characterize the overall information content in molecular wavefunctions, the equilibrium states of both the system as a whole and its constituent parts, a degree of the quantum entanglement (mutual bonding status) of subsystems, or the electron diffusion processes [28]. The recently introduced resultant IT descriptors combine the classical probability contributions with their respective nonclassical supplements due to the state phase/current. The densities of nonclassical information/entropy terms exhibit the same mutual relations as their classical analogs and they introduce the nonvanishing source terms into their respective continuity relations [2,26]. They have been successfully used to establish the phase equilibria in molecules, and to distinguish the mutually bonded (phase-related, "entangled") status of molecular fragments and reactants from its nonbonded (phase-unrelated, "disentangled") analog [2,[27][28][29][30][31].
The complex global entropy [2,25], the expectation value of a non-Hermitian operator, generates the probability and phase contributions to the resultant measure as its real and imaginary parts. This two-component ("vector") extension satisfies the requirement that the classical relation between the densities-per-electron of the ordinary Shannon and Fisher entropy/information measures also covers the interrelation between their nonclassical supplements. The phase-dependent concept of complex entropy will be related to the Shannon entropy of information theory and von Neumann's entropy of density-matrix theory. The gradient entropy (indeterminicity-information) analog of the resultant Fisher (determinicity-information) descriptor has also been conjectured [2]. It combines the classical Fisher information with the negative nonclassical phase/current supplement. Indeed, the presence of a finite current introduces an additional structure, an "order" element, thus increasing the state information (determinicity) content and decreasing its entropy (uncertainty) property of electronic "disorder".
In this work, relations between densities of the complex entropy and resultant information measures will be reexamined, and the phase and information equilibria, marking extrema of the resultant entropy and information measures, respectively, will be explored. The nonadditive part of the global entropy in the subsystem resolution will be advocated as a classical information descriptor of the quantum entanglement of molecular fragments in a given partition. The "vector" character of the complex entropy density raises the natural question of what "scalar" function of its real (probability) and imaginary (phase) parts determines the information principle for establishing molecular phase-equilibria. This information quantity will be identified as the resultant gradient entropy.
The source term in the continuity relation for the overall gradient information will be examined, and the underlying state affinities ("forces") and fluxes ("responses") will be identified; the equilibrium criterion of the vanishing information production will be shown to determine the stationary states of molecular QM. The time evolution of the QIT entropy/information measures will be explored using the dynamical equations for the probability and phase components of electronic wavefunctions implied by the molecular Schrödinger Eq. (SE). The time dependence of the resultant information/entropy will be expressed in terms of the state probability and phase degrees-of-freedom, and the nonclassical origins of these derivatives will be revealed.

Probability and phase components of electronic states
Let us consider a single electron (N = 1) at time t₀ = 0 in state |ψ(t₀)〉 ≡ |ψ(0)〉 ≡ |ψ〉, described by the complex wavefunction in position representation,

ψ(r) = 〈r|ψ〉 = R(r) exp[iϕ(r)],    (1)

where R(r) and ϕ(r) stand for its modulus and phase parts. It determines the probability distribution,

p(r) = |ψ(r)|² = R(r)²,

and its current,

j(r) = [ħ/(2mi)] [ψ*(r) ∇ψ(r) − ψ(r) ∇ψ*(r)] = (ħ/m) p(r) ∇ϕ(r);

here the momentum operator p̂ is defined by its action on the wavefunction, 〈r|p̂ψ〉 = −iħ∇ψ(r), and the average velocity V(r) of the probability fluid, measuring the current-per-particle, reflects the state phase-gradient:

V(r) = j(r)/p(r) = (ħ/m) ∇ϕ(r).

The wavefunction modulus, the classical amplitude of the particle probability density, and the state phase or its gradient, determining the effective velocity and probability flux, thus constitute two fundamental degrees-of-freedom in the full quantum IT treatment of electronic states: ψ ⇔ (R, ϕ) ⇔ (p, j).
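These components can be made concrete on an exactly solvable model. The sketch below (our illustration, not the paper's own example; natural units ħ = m = ω = 1) extracts the ψ ⇔ (R, ϕ) ⇔ (p, j) decomposition from a harmonic-oscillator coherent state and verifies the sourceless probability continuity ∂p/∂t + ∇·j = 0 discussed next:

```python
import numpy as np

# Illustrative sketch (natural units hbar = m = omega = 1; a model choice, not
# the paper's own example): extract the components psi <-> (R, phi) <-> (p, j)
# from a harmonic-oscillator coherent state and verify the sourceless
# probability continuity dp/dt + div j = 0, with j = p*V and V = grad(phi).
a = 1.0                                    # oscillation amplitude of the packet
x = np.linspace(-10.0, 10.0, 20001); dx = x[1] - x[0]

def psi(t):
    q, v = a * np.cos(t), -a * np.sin(t)   # classical center and velocity
    return np.pi**-0.25 * np.exp(-(x - q)**2 / 2 + 1j * v * (x - q))

t, dt = 0.4, 1e-5
R = np.abs(psi(t))                         # modulus component
phi = np.unwrap(np.angle(psi(t)))          # phase component
p = R**2                                   # probability density
V = np.gradient(phi, dx)                   # velocity field V = grad(phi)
j = p * V                                  # probability current

dpdt = (np.abs(psi(t + dt))**2 - np.abs(psi(t - dt))**2) / (2 * dt)
print(np.max(np.abs(dpdt + np.gradient(j, dx))))   # ~0: continuity holds
```

For this rigidly moving Gaussian the velocity field is spatially uniform, so the continuity residue is limited only by the finite-difference resolution.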
One envisages the electron moving in the external potential v(r) due to the "frozen" nuclei of the molecule, described by the electronic Hamiltonian

Ĥ(r) = T̂(r) + v(r) = −(ħ²/2m)∇² + v(r),

where T̂(r) denotes its kinetic part. The quantum dynamics of a general electronic state |ψ(t)〉, giving rise to the associated wavefunction

ψ(r, t) = R(r, t) exp[iϕ(r, t)],    (6)

is generated by SE,

iħ ∂ψ(r, t)/∂t = Ĥ(r) ψ(r, t),

which also determines temporal evolutions of the state two physical components: the instantaneous probability density p(r, t) = |ψ(r, t)|² = R(r, t)² ≡ p(t), and the state phase ϕ(r, t) ≡ ϕ(t). The total time derivative of the former expresses the sourceless continuity relation for the probability distribution,

σ_p(r, t) ≡ dp(r, t)/dt = ∂p(r, t)/∂t + ∇·j(r, t) = 0.    (8a)

The total derivative dp(r, t)/dt, determining the local probability "source" σ_p(r, t), has been interpreted above as the time rate of change in a moving infinitesimal volume element of the probability fluid, while the partial derivative ∂p(r, t)/∂t represents the corresponding rate at the fixed point in space. The probability continuity thus implies

dp(r, t)/dt + p(r, t) ∇·V(r, t) = 0.

Thus, the vanishing probability source of Eq. (8a) also implies the vanishing divergence of the velocity field: ∇·V(r, t) = 0. This probability continuity equation also determines the dynamics of the state modulus component:

∂R(r, t)/∂t = −(ħ/m) [∇R(r, t)·∇ϕ(r, t) + (1/2) R(r, t) ∇²ϕ(r, t)].

The particle effective velocity also determines the current concept associated with the state phase, J(r, t) = ϕ(r, t) V(r, t). The scalar field ϕ(r, t) and its conjugate current density J(r, t) determine a nonvanishing phase source [2] in the associated continuity equation:

σ_ϕ(r, t) ≡ dϕ(r, t)/dt = ∂ϕ(r, t)/∂t + ∇·J(r, t) ≠ 0.    (12)

The phase dynamics from SE,

∂ϕ(r, t)/∂t = (ħ/2m) [R(r, t)⁻¹ ∇²R(r, t) − |∇ϕ(r, t)|²] − v(r)/ħ,    (13)

finally identifies the phase source. As an illustration, consider the stationary wavefunction corresponding to the sharply specified energy E_s, representing an eigenstate of the Hamiltonian:

ψ_s(r, t) = R_s(r) exp(−iω_s t),  ω_s = E_s/ħ,    (15)

Ĥ(r) R_s(r) = E_s R_s(r).    (16)

The phase dynamics of Eq. (13) then recovers the stationary SE and identifies a constant phase source: σ_ϕ[ψ_s] = ∂ϕ_s/∂t = −E_s/ħ.


Resultant information/entropy concepts and uncertainty principle

At a given instant t = t₀ the average Fisher [6] measure of the classical gradient information for locality events, contained in the molecular probability density p(r) = R(r)²,

I[p] = ∫ |∇p(r)|²/p(r) dr = ∫ p(r) [∇ ln p(r)]² dr ≡ ∫ p(r) I_p(r) dr ≡ ∫ ℐ_p(r) dr = 4 ∫ |∇R(r)|² dr ≡ I[R],    (18)

is reminiscent of von Weizsäcker's [32] inhomogeneity correction to the kinetic energy functional in Thomas-Fermi theory; here ℐ_p(r) = p(r) I_p(r) denotes the functional density and I_p(r) stands for the associated density-per-electron. The amplitude form I[R] reveals that this classical descriptor measures the average length of the modulus gradient ∇R. This classical, probability descriptor characterizes an effective "narrowness" of p(r), i.e., a degree of determinicity of the particle position. The classical Shannon (S) [8] descriptor of the global entropy in p(r),

S[p] = −∫ p(r) ln p(r) dr ≡ ∫ p(r) S_p(r) dr,

similarly reflects the distribution "spread" (uncertainty), i.e., a degree of the position indeterminacy. It also provides the amount of information received when this uncertainty is removed by an appropriate particle-localization experiment. The densities-per-electron of these complementary information and entropy functionals are seen to satisfy the classical relation

I_p(r) = [∇S_p(r)]² = [∇ ln p(r)]².

These probability functionals of the classical information/entropy content generalize naturally into the corresponding resultant quantum descriptors combining the probability and phase/current contributions to the overall entropy/information descriptors of the electronic state |ψ〉 [2,[14][15][16][17][18][19][20]. Such generalized concepts are applicable to complex wavefunctions of molecular QM.
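The two classical descriptors can be checked against their known closed forms for a Gaussian distribution (an illustrative sketch in natural units; the Gaussian model and grid are our choices):

```python
import numpy as np

# Numerical check (illustrative, natural units): the classical Fisher gradient
# information I[p] = ∫ (dp/dx)^2 / p dx and Shannon global entropy
# S[p] = -∫ p ln p dx of a Gaussian density of variance sigma^2, against the
# analytic values I = 1/sigma^2 and S = (1/2) ln(2*pi*e*sigma^2).
sigma = 1.5
x = np.linspace(-20.0, 20.0, 20001); dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / np.sqrt(2 * np.pi * sigma**2)

I_p = np.sum(np.gradient(p, dx)**2 / p) * dx   # classical gradient information
S_p = -np.sum(p * np.log(p)) * dx              # classical global entropy

print(I_p, 1 / sigma**2)                       # both ~0.4444
print(S_p, 0.5 * np.log(2 * np.pi * np.e * sigma**2))
```

The opposite trends are visible here: widening the Gaussian (larger σ) lowers the Fisher "narrowness" measure and raises the Shannon "spread" measure, as the text asserts.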
They are defined as average values of the associated operators: the Hermitian operator of the gradient information [33],

Î(r) = −4∇² = (8m/ħ²) T̂(r),

related to the kinetic energy operator T̂(r), and the non-Hermitian (multiplicative) operator of the state complex entropy [25],

Ŝ_ψ(r) = −2 ln ψ(r) = −[ln p(r) + 2iϕ(r)] ≡ S_ψ(r).

Therefore, their quantum expectation values in state |ψ〉 give rise to real and complex average IT descriptors, respectively. The overall gradient information combines the classical (probability) and nonclassical (phase/current) contributions:

I[ψ] = 〈ψ|Î|ψ〉 = I[p] + 4 ∫ p(r) |∇ϕ(r)|² dr ≡ I[p] + I[ϕ].

This resultant (real) gradient information is proportional to the state average kinetic energy:

I[ψ] = (8m/ħ²) T[ψ],  T[ψ] = 〈ψ|T̂|ψ〉.

The resultant complex ("vector") measure of the state global entropy is similarly determined by its real (classical) and imaginary (nonclassical) contributions:

H[ψ] = 〈ψ|Ŝ_ψ|ψ〉 = S[p] − 2i〈ϕ〉_ψ ≡ S[p] + iS[ϕ].

The densities-per-electron of these functionals,

I_ψ(r) = I_p(r) + 4|∇ϕ(r)|²  and  S_ψ(r) = −ln p(r) − 2iϕ(r) ≡ S_p(r) + iS_ϕ(r),

now satisfy the complex generalized relation

I_ψ(r) = ∇S_ψ(r)·∇S_ψ*(r) = |∇S_ψ(r)|².

One also introduces the resultant gradient entropy [2], the state local uncertainty descriptor (indeterminicity-information), exhibiting a nonpositive phase supplement M_ϕ(r) = −I_ϕ(r) in its overall density-per-electron: M_ψ(r) ≡ M_p(r) + M_ϕ(r) = I_p(r) − I_ϕ(r). Indeed, the presence of a finite probability current j(r) ≠ 0, generated by the local phase ϕ(r) > 0, implies an additional "structure" element of the current distribution in the electronic state, thus increasing its resultant information ("order") density, I_ϕ(r) > 0, and lowering the associated entropy ("disorder") contribution: M_ϕ(r) < 0. The global entropy of the probability distribution has also been generalized in the resultant "scalar" measure of the uncertainty content in the specified quantum state ψ [2]:

S[ψ] = S[p] + S[ϕ].

It represents the expectation value of the Hermitian operator of the scalar measure of resultant global entropy, and combines the classical contribution S[p] ≥ 0 of Shannon with its nonclassical supplement S[ϕ] = −2〈ϕ〉_ψ ≤ 0, reflecting the average phase in state |ψ〉: 〈ϕ〉_ψ = ∫p(r) ϕ(r) dr ≥ 0.
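The proportionality between the resultant gradient information and the average kinetic energy can be verified numerically (an illustrative sketch in natural units ħ = m = 1; the Gaussian-modulus, linear-phase model is our choice, not the paper's):

```python
import numpy as np

# Sketch (hbar = m = 1): the resultant gradient information
# I[psi] = I[p] + I[phi], with classical part I[p] = ∫ (dp/dx)^2 / p dx and
# nonclassical part I[phi] = 4 ∫ p (dphi/dx)^2 dx, equals 8mT/hbar^2, where
# T is the average kinetic energy.  Model: Gaussian modulus, linear phase.
x = np.linspace(-15.0, 15.0, 30001); dx = x[1] - x[0]
R = np.pi**-0.25 * np.exp(-x**2 / 2)     # Gaussian modulus
k = 0.7                                  # wavenumber of the linear phase k*x
psi = R * np.exp(1j * k * x)
p = R**2

I_cl = np.sum(np.gradient(p, dx)**2 / p) * dx   # classical (probability) part
I_nc = 4 * k**2 * np.sum(p) * dx                # nonclassical (phase) part
I_res = I_cl + I_nc                             # resultant information

dpsi = np.gradient(psi, dx)
T = 0.5 * np.sum(np.abs(dpsi)**2) * dx          # kinetic energy, by parts
print(I_res, 8 * T)                             # should coincide
```

Setting k = 0 removes the nonclassical term, illustrating how two states with the same density but different currents carry different resultant information.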
To summarize, the modulus (probability) and phase (current) components of electronic states are both accounted for in the resultant measures of the gradient or global descriptors of the information/entropy content in the generally complex wavefunctions of molecular QM. These overall descriptors combine the familiar classical functionals of the system probability density and their nonclassical supplements due to the state phase or its current density. Their densities-per-electron satisfy the classical relations linking the gradient and global descriptors, appropriately generalized to cover the complex character of electronic states. The Hermitian operator Î(r) gives rise to the real expectation value of the state content of resultant determinicity-information, I[ψ], related to the average kinetic energy T[ψ], while the non-Hermitian entropy operator Ŝ_ψ(r) generates the complex average measure H[ψ] of the global uncertainty in ψ. The classical and nonclassical densities-per-electron of the resultant gradient information and the overall global entropy then separately obey the classical relations:

I_p(r) = [∇S_p(r)]²  and  I_ϕ(r) = [∇S_ϕ(r)]²,  S_ϕ(r) = −2ϕ(r),  I_ϕ(r) = 4|∇ϕ(r)|².

The squared gradients of the classical and nonclassical components of the Shannon-type entropy density are thus seen to determine the densities of the associated contributions to the resultant Fisher-type information. Therefore, the gradient of the complex entropy can be regarded as the quantum amplitude of the resultant information content; in other words, ∇Ŝ_ψ(r) appears as the "square root" of Î(r). This development is thus in the spirit of the quadratic approach of Prigogine [1].
The (Hermitian) operator of the gradient entropy is seen to involve the sum of the ordinary squares of the gradients of the operator components. This relation establishes the scalar information principle for determining the phase-equilibria [2,[14][15][16][17][18][19][20], corresponding to the "thermodynamic" phase shift ϕ_eq.(r) ≥ 0. More specifically, the extremum of M[ψ] with respect to ψ*, 〈δψ|M̂_ψ|ψ〉 = 0, gives the Euler equation which identifies the equilibrium phase:

ϕ_eq.(r) = −(1/2) ln p(r).    (36)

The same optimum solution follows from the corresponding extremum rule for the resultant global entropy. It should be observed, however, that the associated extremum principle for the resultant gradient information I[ψ], 〈δψ|Î|ψ〉 = 0, predicts a pure-imaginary optimum phase ϕ_opt.(r) = iϕ_eq.(r). It is also of interest to examine how the familiar Heisenberg ("indeterminacy") inequality for the product of squared dispersions in the particle position r and momentum p translates into the probability (modulus) and current (phase) components of a general wavefunction of Eq. (1). The explicit form of the momentum operator in position representation, p̂(r) = −iħ∇, gives the following expression for the momentum factor,

〈p̂²〉_ψ = (ħ²/4) {I[p] + I[ϕ]} = (ħ²/4) I[ψ],

involving the classical Fisher information, the average velocity 〈V〉_ψ = ∫p(r)V(r)dr = ∫j(r)dr, and the state nonclassical gradient information I[ϕ]. The squared dispersion of the particle momentum can thus be expressed in terms of the state resultant information content I[ψ],

σ_p² = 〈p̂²〉_ψ − 〈p̂〉_ψ² = (ħ²/4) I[ψ] − m²〈V〉_ψ².

The Heisenberg uncertainty relation then reads:

σ_r² σ_p² = σ_r² [(ħ²/4) I[ψ] − m²〈V〉_ψ²] ≥ ħ²/4.
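This uncertainty bookkeeping can be illustrated numerically (a sketch with ħ = m = 1; the Gaussian-with-linear-phase model is our assumption, chosen because it saturates the bound):

```python
import numpy as np

# Sketch (hbar = m = 1): <p^2> = (hbar^2/4) I[psi], so the Heisenberg product
# sigma_x^2 * sigma_p^2 follows from the resultant gradient information; the
# Gaussian-with-linear-phase model used here saturates the bound hbar^2/4.
x = np.linspace(-15.0, 15.0, 30001); dx = x[1] - x[0]
R = np.pi**-0.25 * np.exp(-x**2 / 2)
k = 0.7                                         # linear-phase wavenumber
p = R**2

I_res = np.sum(np.gradient(p, dx)**2 / p) * dx + 4 * k**2   # I[p] + I[phi]
p2_avg = I_res / 4                              # <p^2> = (hbar^2/4) I[psi]
p_avg = k                                       # <p> = hbar*k (uniform flow)
var_p = p2_avg - p_avg**2                       # momentum dispersion
var_x = np.sum(p * x**2) * dx - (np.sum(p * x) * dx)**2
print(var_x * var_p)                            # ~0.25 = hbar^2/4
```

Note that the nonclassical term 4k² in I_res is exactly cancelled by the subtraction of 〈p̂〉², so the uniform flow does not degrade the position-momentum uncertainty product.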

Entanglement entropy of molecular fragments
Consider a division of the electron density ρ(r) of a molecular system M = A—B, containing N = ∫ρ(r)dr electrons, into the fragment distributions ρ(r) = {ρ_X(r)} of the complementary subsystems A and B:

ρ(r) = ρ_A(r) + ρ_B(r).    (38)

For example, these fragments can represent atoms-in-molecules (AIM) or their collections, functional groups, reactants, etc. This partition also applies to the associated division of the probability (shape-factor) distribution p(r) = ρ(r)/N, ∫p(r)dr = 1:

p(r) = π_A(r) + π_B(r) = P_A p_A(r) + P_B p_B(r).

Here the vectors π(r) = {π_X(r)} and p(r) = {p_X(r)} combine the fragment probability densities, unity-normalized within the whole molecule and in the individual subsystems, respectively, while P = {P_X = ∫π_X(r)dr = N_X/N} contains the condensed probabilities of these constituent parts of M: P_A + P_B = 1. These overall and subsystem probabilities generate the classical Shannon entropies reflecting the corresponding classical uncertainty descriptors. The global entropy S[p] of the molecule as a whole also defines the total entropy in division p, S_total[p] = S[p], while the component probabilities define the indeterminacy measures S[p_X] of the probability density in the individual fragments. Together they determine the additive entropy S_add.[p] for this partitioning, and hence the associated nonadditive part of S_total[p] = S[p]:

S_nadd.[p] = S_total[p] − S_add.[p].

The latter can also be expressed as the weighted average of entropy deficiencies [10,11] of the fragment probability densities relative to the molecular distribution, measuring the corresponding information distances. The overall entropy S[ρ] of the molecular electron density and its additive contribution, S_add.[ρ] = Σ_X S[ρ_X], similarly give the associated nonadditive component:

S_nadd.[ρ] = S[ρ] − S_add.[ρ].

This nonadditive entropy thus measures the additive (molecularly-referenced) entropy deficiency in electron densities. It reflects the average information distance between the fragment and molecular densities.
It can be used to describe the information similarity between constituent parts and the whole system: the smaller this missing information, the more the two fragments resemble the molecule [3, [34][35][36].
The IT descriptors of Eqs. (46) and (49) can be regarded as measures of the "binding" entropy in the mutually-open (entangled) subsystems for the specified molecular state Ψ yielding ρ, denoted as Ψ→ρ, in the bonded (molecular) composite system, since the additive entropies of Eqs. (43) and (48) characterize the mutually-closed (disentangled) subsystems in the nonbonded ("promolecular") reference [27,29,31]. Above, the mutual "bonding" and "nonbonding" character of the two fragments has been denoted by the vertical broken- and solid-lines, respectively, which separate these subsystems in the corresponding composite system. One further observes that the molecularly-referenced additive information-distance of Eq. (49), supplemented by the local constraint of Eq. (38) of conserving the molecular electron density in the partition, gives the variational similarity criterion establishing the equal division of ρ between the two subsystems: ρ_X = ρ/2, X = A, B. It has been shown elsewhere [3,[34][35][36]], however, that the information variational rule in terms of the nonadditive entropy-deficiency referenced to the densities ρ⁰ = {ρ_X⁰} of the separate subsystems generates the Hirshfeld [37] ("stockholder") pieces of the molecular density:

ρ_X^H(r) = ρ_X⁰(r) w(r) = ρ(r) d_X⁰(r),  X = A, B.

The optimum pieces of the molecular density can thus be regarded either as the local molecular enhancement w(r) = ρ(r)/ρ⁰(r) of the subsystem density ρ_X⁰, or as the promolecular share d_X⁰(r) = ρ_X⁰(r)/ρ⁰(r) in the molecular density ρ(r). The promolecular distribution ρ⁰(r) = Σ_X ρ_X⁰(r) is determined by the separate-fragment densities shifted to their actual positions in the molecule [3,37]. The "stockholder" fragments thus exhibit the maximum information similarity to their isolated (promolecular) analogs, giving rise to the minimum of the relevant entropy-deficiency (missing-information) descriptors [3,[34][35][36]]. A reference to Eq. (54) indicates that for this particular division scheme the nonadditive missing information of Eq. (39) exactly vanishes [3], since w_X^H = w and d_X^H = d_X⁰. One further observes that expressing ρ_X(r) in ΔS_nadd.[ρ|ρ⁰] as ρ(r) d_X(r) gives

ΔS_nadd.[ρ|ρ⁰] = −∫ ρ(r) ΔS_add.[d(r)|d⁰(r)] dr ≤ 0,  ΔS_add.[d(r)|d⁰(r)] = Σ_X d_X(r) ln[d_X(r)/d_X⁰(r)],

since both the local density ρ(r) and the local additive information-distance ΔS_add.[d(r)|d⁰(r)] are separately nonnegative. Therefore, the Hirshfeld subsystems also result from the maximum information principle for the nonadditive entropy deficiency:

ΔS_nadd.[ρ^H|ρ⁰] = max ΔS_nadd.[ρ|ρ⁰] = 0.

The stockholder pieces of the molecular density thus exhibit the maximum nonadditivity relative to the promolecular distributions in the separate subsystems.
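The vanishing of the nonadditive entropy deficiency for the stockholder division can be verified on model densities (an illustrative sketch; the 1D "densities" below are arbitrary shapes, not real molecular data):

```python
import numpy as np

# Hirshfeld "stockholder" sketch on model 1D densities (illustrative shapes,
# not real molecular data): with shares d_X0 = rho_X0/rho0, each piece
# rho_X = rho*d_X0 has enhancement rho_X/rho_X0 equal to the global
# w = rho/rho0, so the nonadditive entropy deficiency
# ∫ rho ln(rho/rho0) - sum_X ∫ rho_X ln(rho_X/rho_X0) vanishes identically.
x = np.linspace(-10.0, 10.0, 20001); dx = x[1] - x[0]
g = lambda mu, s, n: n * np.exp(-(x-mu)**2/(2*s**2)) / np.sqrt(2*np.pi*s**2)

rho_A0, rho_B0 = g(-1.2, 1.0, 3.0), g(1.2, 0.9, 5.0)  # free-fragment densities
rho0 = rho_A0 + rho_B0                                # promolecule
rho = 1.03 * rho0 * np.exp(-0.02 * x**2)              # model molecular density

rho_A = rho * rho_A0 / rho0                           # stockholder piece of A
rho_B = rho * rho_B0 / rho0                           # stockholder piece of B

dS_add = sum(np.sum(r * np.log(r / r0)) * dx
             for r, r0 in ((rho_A, rho_A0), (rho_B, rho_B0)))
dS_tot = np.sum(rho * np.log(rho / rho0)) * dx
print(dS_tot - dS_add)                                # ~0: nonadditive part
```

Any non-stockholder division (e.g., the equal split ρ_X = ρ/2) makes the local shares deviate from d_X⁰ and drives this nonadditive term strictly negative, as the maximum principle above requires.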
Let us now examine the entanglement entropy of molecular fragments in the phase-equilibrium state ψ_eq.[p], which exhibits the resultant local phase identified by the optimum "thermodynamic" phase-shift of Eq. (36) related to the system probability density:

Φ_eq.[p; r] = ϕ_eq.(r) = −(1/2) ln p(r).

In such an entangled state of subsystems, in the bonded (molecular) reference system M*_eq. = (A*¦B*)_eq., the resultant phase Φ_eq.[p] also characterizes each mutually-open (nonadditive) fragment X*, related to a common molecular "ancestor" state ψ[p]. The phase transformation of Eq. (59) generates the extra current contribution proportional to the probability gradient:

j_eq.(r) = (ħ/m) p(r) ∇ϕ_eq.(r) = −[ħ/(2m)] ∇p(r).

The equilibrium states of the mutually-closed (additive) subsystems {X⁺_eq.} in M⁺_eq. = (A⁺_eq.|B⁺_eq.) are similarly described by the respective "thermodynamic" phase-shifts marking their own (internal) equilibria, ϕ_eq.^X(r), X = A, B. They generate the associated currents in the molecular fragments, and the resultant average current. In such a fragment resolution the overall entropy of M_eq. = M*_eq. in the phase-equilibrium state ψ_eq.[p] thus identically vanishes,

S[ψ_eq.] = S[p] + S[Φ_eq.] = S[p] − 2〈ϕ_eq.〉 = S[p] − S[p] = 0,

since its classical and nonclassical contributions cancel each other. One also observes that the additive equilibrium component, describing the mutually nonbonded fragments in M⁺_eq., exactly vanishes. Indeed, the phenomenon of quantum entanglement [28,30,31], for the given electron distribution in the molecule as a whole, has an exclusively nonclassical (phase/current) origin. This entropic descriptor of the fragment entanglement vanishes for the nondegenerate ground state ψ = ψ₀, when the probability "degree-of-freedom" alone exactly identifies the molecular electronic state: ϕ = ϕ₀ = 0, and hence j = j₀ = 0. This result is in accordance with the basic theorems of DFT [22], which predict that all physical properties in such a "classical" (real) state are determined by the molecular electron density alone.
Therefore, when the molecular current exactly vanishes, a division of the molecular density, a "static" distribution of electrons, into the equilibrium fragment pieces amounts to a classical partition into a collection of disentangled subsystems, which is devoid of any phase (coherence) content.

Affinities, fluxes, information production, and equilibrium
We now reexamine the related problem of the source term in the associated continuity equation for the resultant gradient information, generated by the coupled probability p(r, t) = ψ(r, t) ψ*(r, t) = R(r, t)² and phase ϕ(r, t) = (2i)⁻¹ ln[ψ(r, t)/ψ*(r, t)] components of the molecular electronic state ψ(r, t). It can be approached using the standard treatment of irreversible thermodynamics [38].
Before addressing the problem of a production of the resultant information, let us briefly examine the continuity of the classical gradient information I[p] of Eq. (18). Its functional derivative, F_p^class.(r) = δI[p]/δp(r), defines the local probability "intensity" of the classical information, which determines the functional differential dI[p] = ∫ F_p^class.(r) δp(r) dr and the Fisher information current J_I^class.(r) = F_p^class.(r) j(r), with the divergence

∇·J_I^class.(r) = ∇F_p^class.(r)·j(r) + F_p^class.(r) ∇·j(r).

Taking into account the probability continuity then gives the classical information source,

σ_I^class.(r) = G_p^class.(r)·j(r),  G_p^class.(r) ≡ ∇F_p^class.(r).

This product of the classical probability "affinity" G_p^class.(r) and "flux" j(r) is thus seen to identically vanish for an r-independent phase, i.e., the zero current, e.g., in the stationary state of Eq. (15). Turning now to the continuity problem of the resultant gradient information, one again recognizes the continuity relations of Eqs. (8) and (12) for the independent (instantaneous) probability and phase parameters of a general, complex wavefunction of Eq. (6), expressing the associated dynamical equations resulting from SE. They determine the differential of this resultant gradient information and suggest the associated information current, measuring the resultant information transported through a unit area per unit time.
The rate of a local production of the resultant gradient information is given by the sum of the outflow of information leaving the region and the rate of the information source within this infinitesimal volume. Finally, using the continuity Eq. (75) identifies the source term of the resultant gradient information. The first term in this expression is classical in character. As in irreversible thermodynamics [38], it combines products of the regional affinities G(r) ≡ {G_k(r) = ∇F_k(r)}, gradients of the local intensities F(r) ≡ {F_k(r)}, and the fluxes J(r) ≡ {J_k(r)} associated with the state parameters x(r) ≡ {x_k(r)}. The former determine the information "forces" driving these conjugate flows, while the latter appear as information "responses" to these generalized perturbations. Due to the nonclassical (phase) contribution in the overall information measure, the rate of production of the resultant gradient information does not vanish for zero affinities. However, the equilibrium source of the gradient information vanishes for the zero phase intensity, F_ϕ(r) = 0, when ∇ϕ(r) = 0, e.g., in the stationary state of Eq. (15): {J_p[ψ_s] = 0, G_ϕ[ψ_s] = 0}. The stationary state, corresponding to the sharply specified electronic energy of Eq. (16), thus exhibits a nonvanishing probability-intensity of Eq. (71), and zero values of the phase-intensity and affinity. The vanishing production of the resultant gradient information in the stationary quantum states identifies them as the system information equilibria. Thus, the stationary wavefunctions of molecular QM represent the zero-production states of the overall gradient information. It should be observed, however, that contrary to the concept of equilibrium in irreversible thermodynamics [38], such IT equilibrium states do not correspond to the vanishing affinities G = 0, since σ_I[ψ_s] vanishes due to J_p[ψ_s] = G_ϕ[ψ_s] = 0 and F_ϕ[ψ_s] = 0.
A given displacement from this information equilibrium, specified by the applied "forces" G, triggers probability and phase flows, with each flux depending on all affinities and intensities: J_k = J_k(G, F). One recalls that in ordinary thermodynamics [38] each flux depends most strongly on its own affinity and is known to vanish as the affinities vanish, so one can expand the currents in powers of the affinities with no constant term. The intensity and affinity concepts corresponding to the phase equilibrium of Eq. (61) are also of interest. The former is related to Fick's diffusion equation [28] (see Eq. (63)), which formally identifies the electron diffusion coefficient D = ħ/(2m) and the associated current for this migration.


Dynamics of resultant entropy/information descriptors

Let us now reexamine the temporal evolution of the overall measures of the information and entropy content in the specified molecular quantum state |ψ(t)〉. One recalls that the average energy E[ψ(t)] of an isolated molecular system remains conserved in time: dE[ψ(t)]/dt = 0. One similarly explores the time dependence of the overall measures of the complementary entropy/information descriptors: the expectation values of the state complex entropy and of its resultant gradient information. One first observes that a direct differentiation of the complex-entropy functional H(t) = ∫ψ*(r, t) Ŝ_ψ(r, t) ψ(r, t) dr = ∫p(r, t) S_ψ(r, t) dr gives a derivative involving the frequency ω(t) = E(t)/ħ; this complex derivative exhibits the corresponding real and imaginary components. In the Schrödinger dynamical picture, the time change of the resultant gradient information, the operator of which does not depend on time explicitly, Î(r) = −4∇² = (8m/ħ²) T̂(r), results solely from the time dependence of the system state vector itself.
Therefore, the time derivative of the average Fisher-type gradient (determinicity) information is generated by the expectation value of the commutator [Ĥ, Î] alone,

dI(t)/dt = (i/ħ) 〈ψ(t)|[Ĥ, Î]|ψ(t)〉,

and the integration by parts implies 〈ψ|∇ψ〉 = −〈∇ψ|ψ〉 ≡ 〈∇†ψ|ψ〉, or ∇† = −∇. Again, this total time-derivative of the resultant gradient information is seen to be determined by the current content of the molecular electronic state. Therefore, it identically vanishes for the zero current density everywhere, i.e., for ϕ(r) = 0, thus again confirming its nonclassical origin.
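The Fick-diffusion identification quoted earlier can be checked numerically. The sketch below (ħ = m = 1, model Gaussian density) takes the "thermodynamic" equilibrium phase as ϕ_eq. = −(1/2) ln p, an assumption consistent with the equilibrium relations above, and confirms that the induced phase current reduces to a diffusion flux with D = ħ/(2m):

```python
import numpy as np

# Closing numerical sketch (hbar = m = 1, model Gaussian p): taking the
# "thermodynamic" equilibrium phase as phi_eq = -(1/2) ln p (an assumption
# consistent with the Fick analogy in the text), the phase-generated current
# j = (hbar/m) p grad(phi_eq) reduces to the diffusion flux j = -D grad(p)
# with diffusion coefficient D = hbar/(2m) = 1/2.
x = np.linspace(-8.0, 8.0, 16001); dx = x[1] - x[0]
p = np.exp(-x**2 / 2) / np.sqrt(2 * np.pi)   # model probability density

phi_eq = -0.5 * np.log(p)                    # assumed equilibrium phase
j_phase = p * np.gradient(phi_eq, dx)        # (hbar/m) p grad(phi_eq)
j_fick = -0.5 * np.gradient(p, dx)           # -D grad(p)
print(np.max(np.abs(j_phase - j_fick)))      # small finite-difference residue
```

Note that ϕ_eq. is nonnegative wherever p < 1, in line with the sign convention ϕ_eq.(r) ≥ 0 adopted in the text.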

Conclusions
The IT approach has proven its utility in a variety of molecular scenarios, e.g., [39][40][41][42]. In this analysis we examined mutual relations between densities of the classical and nonclassical components of the resultant information/entropy measures, combining the probability contributions of Fisher or Shannon and their associated phase/current supplements. For example, the complex ("vector") entropy approach combines the classical (real) and nonclassical (imaginary) contributions due to the state probability (wavefunction modulus) and current (wavefunction phase), respectively. Such generalized entropic concepts allow one to distinguish the information content of states generating the same electron density but differing in their phase/current composition. They also allow a more precise information-theoretic description of the bonding status of molecular fragments. The IT principles using the resultant quantum descriptors of the entropy/information content in electronic states have also been used to determine the phase and information equilibria in molecules and their constituent parts [12][13][14][15][16][17][18]. The phase aspect of molecular states is also vital for the quantum (amplitude) communications between atoms in molecules [2][3][4][5]42], which determine entropic descriptors of the chemical bond multiplicities and their covalent/ionic composition.
The need for the nonclassical (phase/current) supplements of the classical (probability) measures of the information content in molecular states has been stressed. The electron density distribution determines a static facet of the molecular structure, while the current distribution describes its dynamic aspect. Both these structural manifestations contribute to the overall information content of the generally complex electronic states of molecular systems, reflected by the resultant IT concepts. The total time derivatives of such entropic descriptors of electronic states have been examined. These time dependencies have been established via the Schrödinger equation and the dynamics/continuity it implies for the classical and nonclassical degrees-of-freedom of complex wavefunctions. The nonclassical origin of the net temporal changes in the overall entropy/information quantities has been demonstrated. Thus, in real electronic states, exhibiting vanishing local phase and current, the time derivatives of the resultant gradient information and global entropy exactly vanish.
Although, for simplicity, we have assumed the one-electron case, the modulus (density) and phase (current) aspects of general electronic states can be similarly separated [2] using the Harriman–Zumbach–Maschke (HZM) construction [43,44] of Slater determinants yielding the specified electron density. The present single-electron development can then be naturally generalized to many-electron states of atomic and molecular systems. One observes that in the HZM construction the common modulus part of the N occupied (orthonormal) equidensity orbitals (EO) for the given ground-state density ρ, ψ[ρ] = {ψ_w[p]}, reflects the molecular probability distribution p(r) = ρ(r)/N, and so do the system optimum (orthonormal) Kohn–Sham (KS) orbitals. For example, the expectation value of the N-electron operator of the overall gradient information then reads

I(N) = (8m/ħ²) T_s(N),

where T_s(N) stands for the kinetic energy of the noninteracting electrons in the KS limit [23]. Thus, the amount of resultant gradient information in the occupied HZM orbitals derived from the optimum molecular distribution of electrons equals that contained in the orbitals describing the separable KS system corresponding to the same ground-state density. Notice, however, that the proportions between the classical (probability) and nonclassical (phase) information contributions vary with different partitions of the molecular density into orbital components.
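The proportionality between the gradient information and the noninteracting kinetic energy can be checked numerically for a single real orbital: in atomic units (m = ħ = 1) the classical Fisher information I[p] = ∫(∇p)²/p dr equals 8T. A minimal sketch, with a harmonic-oscillator ground state assumed purely as a model orbital:

```python
import numpy as np

# Numerical check (atomic units) of I = 8*T for a single real orbital.
# The Fisher information of p = psi^2 and the kinetic energy of psi are
# computed independently on a grid and their ratio compared with 8.
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
psi = np.pi**-0.25 * np.exp(-x**2 / 2)       # normalized model orbital
p = psi**2

I = np.sum(np.gradient(p, dx)**2 / p) * dx    # classical Fisher information
T = 0.5 * np.sum(np.gradient(psi, dx)**2) * dx  # kinetic energy <T>

print(I / T)   # close to 8, up to grid discretization error
```

For this model the exact values are T = 1/4 and I = 2, so the ratio reproduces the proportionality constant 8m/ħ² in atomic units.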
In the DFT-based theory of chemical reactivity one distinguishes several hypothetical stages involving either the mutually bonded (entangled) or nonbonded (disentangled) states of reactants exhibiting the same electron distribution in the constituent subsystems. These two categories are discerned by the phase aspect of the quantum entanglement between such molecular fragments, e.g., [2,27]. We have identified the classical entropic descriptor of this phenomenon, the nonadditive global entropy, which has been interpreted as the additive entropy deficiency of the partition, measuring the average information distance between the fragment and molecular densities. The equilibrium phases and currents of reactants can be related to the relevant electron densities using the entropic principles of quantum IT. This generalized approach deepens our understanding of the molecular/promolecular promotions of the constituent molecular fragments and provides a more precise framework for describing the hypothetical stages invoked in the theory of the chemical bond and reactivity.
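The entropy-deficiency (missing-information) descriptor mentioned above, ΔS = ∫p ln(p/p₀) dr, can be illustrated on toy densities. In the sketch below both the "molecular" density p and the "promolecular" reference p₀ are hypothetical two-fragment Gaussian mixtures chosen only for illustration:

```python
import numpy as np

# Toy entropy-deficiency (Kullback-Leibler) illustration:
#   Delta S = int p ln(p/p0) dx >= 0, with equality only for p = p0.
# p0 mimics a "promolecule" of two free fragments; p a "bonded" molecule
# with slightly contracted, broadened fragment pieces (model choices).
x = np.linspace(-12.0, 12.0, 4801)
dx = x[1] - x[0]

def gauss(x, mu, s):
    return np.exp(-(x - mu)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

p0 = 0.5 * (gauss(x, -1.0, 1.0) + gauss(x, 1.0, 1.0))   # promolecule
p  = 0.5 * (gauss(x, -0.6, 1.1) + gauss(x, 0.6, 1.1))   # "bonded" density

dS = np.sum(p * np.log(p / p0)) * dx
print(dS)   # positive information distance between p and p0
```

The nonnegativity of this average information distance is what qualifies it as a classical descriptor of the displacement of the bonded fragments from their free (promolecular) reference.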
The phenomenological apparatus of irreversible thermodynamics [38] also provides an attractive basis for an entropic representation of elementary molecular processes [2]. In this analysis we approached anew the problem of the production of the overall information/entropy measures, which take into account both the modulus and phase components of complex wavefunctions. We introduced the relevant intensity and affinity conjugates of both the probability and phase fluxes, which together define a local production of the state information content. The nonvanishing local phase source has been identified, giving rise to the nonclassical contribution to the local production of the resultant entropy/information content. It has been argued that the criterion of vanishing production of the gradient information identifies the stationary states of molecular QM as the system information equilibria. The local information source has also been interpreted "thermodynamically", by separating a classical summation over products of affinities ("perturbations") and fluxes ("responses") associated with the probability and phase/current degrees-of-freedom of molecular states. Since the spontaneous flows driven by displacements in a given information affinity should act to restore the equilibrium, these elementary products should be negative, thus decreasing the state information (determinicity) level and increasing the state entropy (uncertainty, indeterminicity) content. This suggests a positive entropy production, and hence a negative information source.
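This "thermodynamic" bookkeeping can be sketched schematically. Assuming, as in linear irreversible thermodynamics, fluxes responding linearly to their conjugate affinities through a positive-definite (Onsager-like) response matrix, the entropy production σ_S = Σ_k F_k J_k is nonnegative, so the conjugate information source σ_I = −σ_S is nonpositive. The two-variable matrix and random affinities below are purely illustrative:

```python
import numpy as np

# Schematic affinity-flux bookkeeping of the information source.
# Linear response J = L F with positive-definite L (Onsager-like model)
# guarantees sigma_S = F.J >= 0, hence sigma_I = -sigma_S <= 0.
rng = np.random.default_rng(0)
L = np.array([[2.0, 0.5],
              [0.5, 1.0]])            # positive-definite response matrix

for _ in range(5):
    F = rng.normal(size=2)            # affinities ("perturbations")
    J = L @ F                         # conjugate fluxes ("responses")
    sigma_S = F @ J                   # entropy production, nonnegative
    sigma_I = -sigma_S                # information source, nonpositive
    print(sigma_S, sigma_I)
```

The sign pattern mirrors the argument in the text: spontaneous responses to affinity displacements increase the entropy (uncertainty) content and correspondingly deplete the information (determinicity) level.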
To conclude this analysis, let us briefly comment on the resultant entropy concepts containing an explicit phase contribution [Eqs. (24) and (29)], the Shannon entropy of the electron probability distribution [Eq. (19)], and von Neumann's (vN) ensemble-average entropy [45] contained in the density operator. The latter is defined as the mathematical trace involving the density operator ρ̂ of the statistical mixture in question, expressed in terms of its eigenvectors {|ψ_j〉} and eigenvalues (probabilities) {η_j},

ρ̂ = Σ_j |ψ_j〉 η_j 〈ψ_j|.

It generates the information entropy contained in the ensemble state probabilities:

S_vN[ρ̂] = −Tr(ρ̂ ln ρ̂) = −Σ_j η_j ln η_j.

This measure identically vanishes for the pure quantum state |ψ〉, when ρ̂_ψ = |ψ〉〈ψ| and η_ψ = 1: S_vN[ρ̂_ψ] = 0. The (idempotent) density operator ρ̂_ψ then determines the Hermitian density matrix in position representation,

γ(r, r′) = 〈r|ρ̂_ψ|r′〉 = ψ(r) ψ*(r′),   γ(r, r) = p(r),   (106)

in terms of which the Shannon entropy S[p] contained in the probability density p(r) reads

S[p] = −∫ γ(r, r) ln γ(r, r) dr = −∫ p(r) ln p(r) dr.

Indeed, in this pure-state case the above "ensemble" average measure reduces to the expectation value in state |ψ〉, S[p] = 〈ψ|Ŝ_class(ψ)|ψ〉. Therefore, in the familiar Shannon entropy of classical IT, which reconstructs the ensemble-average measure of von Neumann's quantum entropy in the density matrix, the phase/current information terms of the complex entropies S[ψ] and S[ψ*] = S[ψ]* cancel out, as indeed expected of the expectation value of the Hermitian operator Ŝ_class(ψ). One recalls that in QM physical properties are represented by the associated (linear) Hermitian operators. However, the information entropy is neither an observable, determined in an experiment, nor is it linear in the underlying probability argument. Therefore, attributing to the overall quantum-entropy content of a specified quantum state a non-Hermitian operator is an admissible, workable proposition, capable of a unique phase characterization of the entangled molecular subsystems [31].
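The pure-state contrast between the two entropy measures discussed above can be demonstrated on a small finite-dimensional model. The three-component amplitude vector below is arbitrary; the sketch builds the pure-state density matrix ρ̂_ψ = |ψ〉〈ψ|, whose eigenvalue spectrum gives S_vN = 0, while the diagonal (classical) probabilities still carry a positive Shannon entropy:

```python
import numpy as np

# Pure-state contrast between von Neumann's ensemble entropy and the
# Shannon entropy of the classical probabilities it generates, on a
# 3-state model vector (the amplitudes are arbitrary).
c = np.array([0.6, 0.8j, 0.0])
c = c / np.linalg.norm(c)              # normalized state amplitudes
rho = np.outer(c, c.conj())            # pure-state density matrix

eta = np.linalg.eigvalsh(rho)          # ensemble probabilities {eta_j}
S_vN = -sum(e * np.log(e) for e in eta if e > 1e-12)

p = np.abs(c)**2                       # classical probabilities p_j
S_p = -sum(q * np.log(q) for q in p if q > 1e-12)

print(S_vN, S_p)   # S_vN = 0 for a pure state, while S[p] > 0
```

The vanishing S_vN reflects the single unit eigenvalue η_ψ = 1 of the idempotent ρ̂_ψ, whereas S[p] probes only the diagonal γ(r, r) = p(r) of the density matrix and hence retains the classical uncertainty content of the state.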
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.