Entropic concepts in electronic structure theory

It is argued that some elusive "entropic" characteristics of chemical bonds, e.g., bond multiplicities (orders), which connect the bonded atoms in molecules, can be probed using quantities and techniques of Information Theory (IT). This complementary perspective deepens our insight into, and understanding of, the molecular electronic structure. The specific IT tools for detecting effects of chemical bonds and predicting their entropic multiplicities in molecules are summarized. Alternative information densities, including measures of the local entropy deficiency or its displacement relative to the system atomic promolecule, and the nonadditive Fisher information in the atomic orbital resolution (called contragradience), are used to diagnose the bonding patterns in illustrative diatomic and polyatomic molecules. The elements of the orbital communication theory of the chemical bond are briefly summarized and illustrated for the simplest case of the two-orbital model. The information-cascade perspective also suggests a novel, indirect mechanism of orbital interactions in molecular systems, through "bridges" (orbital intermediates), in addition to the familiar direct chemical bonds realized through "space" as a result of the constructive orbital interference in the subspace of the occupied molecular orbitals. Some implications of these two sources of chemical bonds in propellanes, π-electron systems and polymers are examined. The current-density concept associated with the wave-function phase is introduced and the relevant phase-continuity equation is discussed. For the first time, quantum generalizations of the classical measures of the information content, hitherto functionals of the probability distribution alone, are introduced to distinguish systems with the same electron density but differing in their current (phase) composition. The corresponding information/entropy sources are identified in the associated continuity equations.

Keywords: Electron localization function · Entropic probes of electron densities · Information continuity · Information theory · Quantum information measures

Introduction
For the chemical understanding of molecular electronic structure it is vital to interpret the system equilibrium electron density ρ(r) = Np(r) in terms of pieces representing the relevant subsystems, e.g., Atoms-in-Molecules (AIM) (Bader 1990), functional groups or other fragments of interest, e.g., the σ and π electrons in benzene, and through such intuitive concepts as multiplicities (orders) of the internal (intra-subsystem) and external (interfragment) chemical bonds, which describe the bonding pattern between these molecular fragments (Nalewajski 2006a, 2010a, 2012a). Here, N stands for the system overall number of electrons and the electron probability distribution p(r) determines the so-called shape factor of the molecular density. In general, the semantics of these traditional chemical descriptors is not sharply defined in modern quantum mechanics, although quite useful definitions are available from several alternative perspectives (Nalewajski 2012a), which reflect the accepted chemical intuition quite well. However, since these chemical concepts do not represent specific observables, i.e., specific quantum-mechanical operators, they ultimately have to be classified as Kantian noumena of chemistry (Parr et al. 2005). It has been demonstrated recently that Information Theory (IT) (Fisher 1925; Frieden 1998; Shannon 1948; Shannon and Weaver 1949; Kullback and Leibler 1951; Kullback 1959) can be used to elucidate their more precise meaning in terms of entropy/information quantities (Nalewajski 2002, 2003a, b, 2006a, 2010a, 2012a; Nalewajski and Parr 2000, 2001; Nalewajski and Świtka 2002; Nalewajski and Broniatowska 2003a, 2007; Nalewajski and Loska 2001). This article summarizes the diverse IT perspectives on the molecular electronic structure, in which the molecular states, their electron distributions and probability currents carry the complete information about the system bonding patterns.
Some of these chemical characteristics are distinctly "entropic" in character, being primarily designed to reflect the "pairing" patterns between electrons rather than the molecular energetics.
Information Theory (see Appendix 1) is one of the youngest branches of applied probability theory, in which probability ideas have been introduced into the fields of communication, control, and data processing. Its foundations were laid in the 1920s by Fisher (1925) in his classical measurement theory, and in the 1940s by Shannon (Shannon 1948; Shannon and Weaver 1949) in his mathematical theory of communication. The electronic quantum state of a molecule is determined by the system wave function, the amplitude of the particle probability distribution, which carries the information. It is intriguing to explore the information content of electronic densities in molecules and to extract from them the pattern of chemical bonds, reactivity trends and "entropic" molecular descriptors, e.g., bond multiplicities ("orders") and their covalent/ionic composition. In this brief survey we summarize some of the recent developments in such IT probes of the molecular electronic structure. In particular, we shall explore the information displacements due to subtle electron redistributions accompanying bond formation and diagnose the locations of the direct chemical bonds using the nonadditive information contributions in the Atomic Orbital (AO) resolution. We shall also examine patterns of entropic connectivities between AIM, which result from "communications" between these basis functions, whose combinations represent the Molecular Orbitals (MO) in typical Self-Consistent Field (SCF) calculations. In these SCF LCAO MO theories the electronic structure is expressed in terms of either the Hartree-Fock (HF) MO of the Wave-Function Theories (WFT) or their Kohn-Sham (KS) analogs of the modern Density Functional Theory (DFT), the two main avenues of contemporary computational quantum mechanics.
The other fundamental problem is the question: what is the adequate measure of the "information content" of a given quantum state of a molecule? The classical IT and the information measures it introduces deal solely with the electron density (probability) distribution, reflecting the modulus part of the complex wave function. However, in the degenerate (complex) electronic states the quantum measure of information should reflect not only the spatial distribution of electrons but also their probability currents, related to the gradient of the phase part of the complex wave function, in order to distinguish between the information content of two systems exhibiting the same electron density but differing in their current composition. Such a quantum extension of the classical gradient (local) measure, the Fisher (1925) information, has indeed been proposed by the Author (Nalewajski 2008a). It introduces a nonclassical information term proportional to the square of the particle current. However, no quantum generalization of the complementary (global) measure, represented by the familiar Shannon entropy (Shannon 1948; Shannon and Weaver 1949), is currently available. We shall address this question in the final sections of this work.
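The split of the quantum Fisher information into a classical density term and a nonclassical current (phase) term can be illustrated numerically. The sketch below is only an illustration (not the Author's code): for a wave function ψ = √ρ exp(iφ) one has I[ψ] = ∫(∇ρ)²/ρ dx + 4∫ρ(∇φ)² dx, with the second term proportional to the square of the probability current. For a hypothetical one-dimensional Gaussian density of width σ with a linear phase φ(x) = kx, the two terms equal 1/σ² and 4k², respectively:

```python
import numpy as np

# Sketch: quantum Fisher information I[psi] = classical density term + phase term,
# for psi = sqrt(rho) * exp(i*phi); the phase term is proportional to the
# square of the probability current j ~ rho * grad(phi).
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
sigma, k = 1.5, 0.7

rho = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))  # Gaussian density
phi = k * x                                    # linear phase -> uniform current

I_cl = np.sum(np.gradient(rho, dx)**2 / rho) * dx   # classical term, analytically 1/sigma^2
I_nc = 4.0 * np.sum(rho * np.gradient(phi, dx)**2) * dx  # current term, analytically 4*k^2
```

Two systems sharing the same rho but differing in k thus share I_cl while their total quantum information differs through I_nc, which is exactly the distinction the classical measures miss.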
It has been amply demonstrated (Nalewajski 2006a, 2010a, 2012a) that many classical problems of theoretical chemistry can be approached afresh using this novel IT perspective. For example, the displacements in the information distribution in molecules, relative to the promolecular reference consisting of the nonbonded constituent atoms in their molecular positions, have been investigated, and the least-biased partition of the molecular electron distribution into subsystem contributions, e.g., densities of AIM, has been determined. The IT approach has been shown to lead to the "stockholder" molecular fragments of Hirshfeld (1977). These optimum density pieces have been derived from alternative global and local variational principles of IT. Information theory facilitates a deeper insight into the nature of bonded atoms (Nalewajski 2002, 2003a, b, 2006a, 2010a, 2012a; Nalewajski and Parr 2000, 2001; Nalewajski and Broniatowska 2003a, 2007; Nalewajski and Loska 2001), the electron fluctuations between them (Nalewajski 2003c), and a thermodynamic-like description of molecules (Nalewajski 2003c, 2004a, 2006b). It also increases our understanding of elementary reaction mechanisms (Nalewajski and Broniatowska 2003b; López-Rosa et al. 2010). By using the complementary Shannon (global) and Fisher (local) measures of the information content of the electronic distributions in the position and momentum spaces, respectively, it has been demonstrated that these classical IT probes allow one to precisely locate the substrate bond-breaking and the product bond-forming stages along the reaction coordinate, which are not seen on the reaction energy profile alone (López-Rosa et al. 2010).
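The Hirshfeld "stockholder" rule itself is simple: each free atom claims the local share of the molecular density proportional to its contribution to the promolecule, ρ_X(r) = ρ(r)·ρ_X⁰(r)/ρ⁰(r). A minimal one-dimensional sketch, with hypothetical Gaussian "atomic" densities standing in for real ones:

```python
import numpy as np

# Hypothetical 1D Gaussian "free-atom" densities standing in for real ones
x = np.linspace(-6.0, 6.0, 2001)
dx = x[1] - x[0]
g = lambda x0, w, N: N * np.exp(-((x - x0) / w) ** 2) / (w * np.sqrt(np.pi))

rho0_A, rho0_B = g(-1.2, 1.0, 1.0), g(1.2, 1.0, 1.0)  # isolated-atom densities
rho0 = rho0_A + rho0_B                                # promolecule density
# model molecular density: slightly relaxed atoms plus a small "bond charge"
rho = g(-1.2, 1.1, 0.9) + g(1.2, 1.1, 0.9) + g(0.0, 0.8, 0.2)

# stockholder partition: atom A takes its local share w_A = rho0_A / rho0
w_A = rho0_A / rho0
rho_A = w_A * rho               # Hirshfeld AIM density of atom A
N_A = np.sum(rho_A) * dx        # its electron population (1.0 here, by symmetry)
```

The shares w_A and w_B = 1 − w_A sum to one at every point, so the AIM densities reconstruct ρ exactly; this is the least-biased partition singled out by the entropy-deficiency variational principle mentioned above.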
These applications have advocated the use of several IT concepts and techniques as efficient tools for exploring and understanding the electronic structure of molecules, e.g., in facilitating the spatial localization of the system electrons and chemical bonds, the extraction of the entropic bond orders and their covalent/ionic composition, in monitoring the promotion (polarization/hybridization) and charge-transfer processes determining the valence state of bonded atoms, etc. The spatial localization and multiplicity of specific bonds, not to mention some qualitative questions about their very existence, e.g., between the bridgehead carbon atoms in small propellanes, present another challenging problem that has been successfully tackled by this novel treatment of molecular systems. The nonadditive Fisher information in the AO resolution has been recently used as the Contra-Gradience (CG) criterion for localizing the direct bonding regions in molecules (Nalewajski 2006a, 2008a, 2010a, b, 2012a; Nalewajski et al. 2010, 2012a), while the related information density in the Molecular Orbital (MO) resolution has been shown (Nalewajski 2006a, 2010a, 2012a; Nalewajski et al. 2005) to determine the vital ingredient of the successful Electron-Localization Function (ELF) (Becke and Edgecombe 1990; Silvi and Savin 1994; Savin et al. 1997).
The Communication Theory of the Chemical Bond (CTCB) has been developed using the basic entropy/information descriptors of the molecular information (communication) channels at various levels of resolving the molecular probability distributions (Nalewajski 2000, 2004b, c, d, 2005a, b, c, 2006a, c, d, e, f, g, 2007, 2008b, c, d, 2009a, b, c, d, 2010a, 2012a; Nalewajski and Jug 2002). The entropic probes of the molecular bond structure have provided new, attractive tools for describing the chemical bond phenomenon in information terms. It is the main purpose of this survey to illustrate the efficiency of alternative local entropy/information probes of the molecular electronic structure, to explore the information origins of the chemical bonds, and to present recent developments in the Orbital Communication Theory (OCT) (Nalewajski 2009e, f, g, 2010a, c, 2011a, b, 2012a; Nalewajski et al. 2012b). In OCT the chemical bonding is synonymous with some degree of communication (probability scattering) between AO. It can be realized either directly, through the constructive interference of the interacting orbitals, i.e., as a "dialogue" between the given pair of orbitals, or indirectly, through the information cascade involving other orbitals, which can be compared to a "rumor" spread through these orbital intermediates. The importance of the nonadditive effects in the chemical-bond phenomena will be emphasized throughout, and some implications of the information-cascade (bridge) propagation of electronic probabilities in molecular information systems, which generates the indirect bond contributions due to orbital intermediaries (Nalewajski 2010d, e, 2011c, 2012c; Nalewajski and Gurdek 2011, 2012), will be examined.
This work surveys representative applications of alternative entropy densities as probes of the bonding patterns in molecules. The nonadditive information contributions defined in alternative orbital resolutions will be examined, the IT tools for locating electrons and chemical bonds will be introduced, and the direct/indirect origins of bonds will be explored in prototype molecules. The conditional probabilities in AO resolution, which define the molecular information (communication) system of OCT, can be generated using the bond-projected superposition principle of quantum mechanics. The entropy/information measures of the bond covalency/ionicity reflect the average noise and the flow of information in such a molecular network, respectively (Nalewajski 2006a, 2009e, f, g, 2010a, c, 2011a, b, 2012a). Finally, the quantum extension of the classical Shannon entropy will be proposed, following a similar generalization of the Fisher measure related to the system average kinetic energy (Nalewajski 2008a), and the phase-current concept will be introduced in the information-continuity context (Nalewajski 2010a, 2012a).

Direct and indirect orbital interactions
In typical molecular scenarios one probes the information contained in the ground-state distribution of electrons and examines its displacement relative to the system promolecule, the initial state in the bond-formation process. The latter displacement is dominated by reconstructions of the valence shells of the constituent AIM. In the simplest, single-determinant description of the familiar orbital approximation the equilibrium electron distribution is determined by the optimum shapes of the occupied MO resulting from the relevant SCF approach, i.e., from the familiar HF or KS equations of computational quantum mechanics. In this MO subspace the superposition principle of quantum mechanics then generates the direct (through-space) communication network between the AO participating in the bond-formation process. Its consecutive (cascade) combinations subsequently give rise to the relevant indirect probability scatterings, which are responsible for the through-bridge bonds in the molecule.
In OCT the molecule is treated as an information system involving elementary events of localizing electrons on AO, both in its input and output (see Fig. 1). In this description each AO constitutes both the "emitter" and "receiver" of the signal of assigning electrons in a molecule to these elementary basis functions of the separated atoms. Besides the direct communications between AO, i.e., a "dialogue" between basis functions, the given pair of orbitals also exhibits the indirect (cascade) communications involving the remaining orbitals acting as intermediaries in this "gossip" exchange of information (Nalewajski 2010a, d, e, 2011b, c, 2012a; Nalewajski and Gurdek 2011, 2012). The direct communication network of the system chemical bonds is determined by the conditional probabilities P(χ′|χ) = P(a→b) = {P(i→j)} of observing the alternative AO outputs b = χ′ = {χ_j} for the given AO inputs a = χ = {χ_i}. In SCF LCAO MO theory they result from the superposition principle of quantum mechanics (Dirac 1958), supplemented by the projection onto the subspace of the occupied MO. They are determined by the squares of the associated scattering amplitudes {A(i→j)}, which are proportional to the corresponding elements of the system density matrix γ_AO (Appendix 2). For the given input probability P(a) = p the scattered signals from all inputs generate the resultant output probabilities P(b) = q, with the molecular input giving rise to the same distribution in the channel output of such a "stationary" probability propagation: q_j = Σ_i p_i P(j|i) = p_j. This molecular channel can be probed using different input signals, some specifically prepared to extract desired properties of the chemical bond. These descriptors can be global in character, when they describe the molecule as a whole, or they can refer to localized bonds in and between molecular fragments (Nalewajski 2006a, 2010a, 2012a).
Both the internal and external bonds of molecular fragments can be determined in this way. The molecular communication system can be applied in the full AO resolution, or it can be used in alternative "reductions" (Nalewajski 2005b, 2006a, 2010a, 2012a), in which some input and/or output orbital events are combined into groups defining a less-resolved level of molecular communications. The direct probability-scattering networks can also be combined into information cascades (Fig. 2), involving a single or several AO intermediates, in order to generate the relevant networks for the indirect communications between the basis functions of the SCF LCAO MO calculations. These indirect communications in molecules generate the so-called bridge contributions to the resultant bond orders (Nalewajski 2010d, e, 2011c, 2012a; Nalewajski and Gurdek 2011, 2012). The conditional probabilities between the AO basis functions in the occupied subspace of MO, P(χ′|χ) = {P(χ_j|χ_i) = P(χ_i ∧ χ_j)/p_i ≡ P(j|i)} (Nalewajski 2009e, f, g, 2010a, c, 2011a, 2012a), which result from the quantum-mechanical superposition principle (Dirac 1958), determine the proper communication network (see Appendix 1) for discussing the entropic origins of the chemical bond. Here, P(χ_i ∧ χ_j) stands for the joint molecular probability of simultaneously observing the orbitals χ_i and χ_j in the "input" and "output" of the underlying orbital communication system, respectively, while p_i is the probability of an electron occupying the ith AO in the molecule (Nalewajski 2006a, 2010a, 2012a) (see also Appendix 1).
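For a closed-shell state these conditional probabilities can be written compactly in terms of the CBO matrix as P(j|i) = γ_{i,j}γ_{j,i}/(2γ_{i,i}); the rows are then automatically normalized, because the doubly occupied MO subspace makes γ idempotent in the sense γ² = 2γ. The sketch below (the explicit formula and the Hückel π parametrization of benzene are used here as illustrative assumptions) verifies this normalization numerically:

```python
import numpy as np

# Hückel π model of benzene: H = -A (alpha = 0, energies in |beta| units)
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
_, C = np.linalg.eigh(-A)            # ascending orbital energies
Cocc = C[:, :3]                      # three doubly occupied π MOs
gamma = 2.0 * Cocc @ Cocc.T          # CBO matrix; idempotency: gamma @ gamma = 2*gamma

# AO -> AO conditional probabilities of the direct communication channel
P = gamma * gamma / (2.0 * np.diag(gamma))[:, None]
row_sums = P.sum(axis=1)             # each row is a proper probability distribution
```

Here γ_{i,i} = 1 for every carbon, so, e.g., P(ortho|i) = (2/3)²/2 = 2/9, and each input scatters its signal over the whole AO basis with unit total probability.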
The complementary information-scattering (covalent) and information-flow (ionic) descriptors of the molecular information system, represented by the channel conditional entropy S and mutual information I, respectively, then generate the IT multiplicities of these two bond components, which together give rise to the overall IT bond order N = I + S (Nalewajski 2000, 2004b, c, d, 2005a, b, c, 2006a, c, d, e, f, g, 2007, 2008b, c, d, 2009a, b, c, d, e, f, g, 2010a, c, 2011a, 2012a; Nalewajski and Jug 2002). The amplitudes of the conditional probabilities exhibit the interference property typical of electrons (Nalewajski 2011c). An example of such an information channel, determined by the doubly occupied bonding MO originating from an interaction of two (singly occupied) AO, is shown in Fig. 1.
The global IT descriptors of the whole AO network for the direct communications in a molecule (see Appendix 1) involve the conditional entropy S(P(b)|P(a)) ≡ S(q|p) ≡ S, which measures the channel average communication noise, and the mutual information relative to the promolecular input signal p⁰, I(P⁰(a):P(b)) ≡ I(p⁰:q) ≡ I, which reflects the network information flow. In OCT they signify the IT bond covalency and ionicity, respectively, and generate the associated overall IT index of the bond multiplicity in the molecular system under consideration: N = S + I. An appropriate probing of this communication network allows one to extract the associated indices of the localized bonds, between pairs of AIM, and the internal and external bond characteristics of molecular fragments (Nalewajski 2010a, 2012a). Alternative channel reductions can be used to eliminate the subsystem internal bond multiplicities, in order to extract the complementary external descriptors of the fragment bonds with its molecular environment (Nalewajski 2005b, 2006a). This communication approach also allows one to index the molecular couplings between internal bonds in different subsystems (Nalewajski 2010c, 2011a). Thus, in the global description the molecular AO channel is probed by the molecular input signal p = {p_i}, when one extracts the purely molecular (covalency) descriptor, and by the promolecular signal p⁰, when one is interested in the IT "displacement" quantity, relative to this reference of the molecularly placed, initially nonbonded atoms (see Fig. 1). The direct information scattering P(j|i) between the given pair of AO originating from different atoms, χ_i ∈ A and χ_j ∈ B, is then proportional to the square of the coupling element γ_{j,i} = γ_{i,j} of the Charge and Bond-Order (CBO) density matrix γ_AO (Appendix 2).
Therefore, this probability is also proportional to the associated contribution M_{i,j} = γ_{i,j}² of these two AO to the Wiberg index (Wiberg 1968) measuring the "multiplicity" of the direct chemical bond A—B, M_{A,B} = Σ_{i∈A} Σ_{j∈B} γ_{i,j}². As an illustration consider the simplest 2-AO model of the chemical bond, resulting from the interaction between a given pair of (orthonormal) AO, A(r) ∈ A and B(r) ∈ B, with each atom contributing a single electron to form the chemical bond in the ground molecular state determined by the doubly occupied bonding MO.
Its communication system, shown in Fig. 1, conserves the overall (single) IT bond order for all admissible bond polarizations measured by the probability parameter P. However, as explicitly shown in Fig. 3, which plots the variations of the IT-covalent S(P) and IT-ionic I(P) components of the chemical bond in the 2-AO model with the changing MO polarization P, together with the conservation of the overall bond order N(P) = 1 bit, the IT covalent/ionic composition of such a model chemical bond changes with the MO polarization P, so that these two bond components compete with one another. In accordance with the accepted chemical intuition the symmetrical bond, for P = Q = 1/2, gives rise to the maximum bond covalency, S(P = 1/2) = 1 bit, e.g., in the σ bond of H₂ or the π bond of ethylene, while the ion-pair configurations, for P ∈ {0, 1}, signify the purely ionic bond: I(P) = 1 bit.
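These 2-AO descriptors can be evaluated directly. In this model every row of the channel's conditional-probability matrix equals (P, Q), so the covalency S(P) is simply the binary entropy of P and, against the promolecular input p⁰ = (1/2, 1/2), the ionicity complements it to the conserved 1 bit. A small sketch (an illustration, not the Author's code):

```python
import numpy as np

def h2(p):
    """Binary entropy in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)

def two_ao_descriptors(P):
    # both rows of the 2-AO conditional-probability matrix equal (P, 1-P),
    # so the conditional entropy (IT covalency) is the binary entropy of P
    S = h2(P)
    I = 1.0 - S          # IT ionicity relative to the promolecular p0 = (1/2, 1/2)
    return S, I, S + I   # overall IT bond order N = S + I = 1 bit for every P

S_cov, I_ion, N = two_ao_descriptors(0.5)   # symmetric bond: S = 1 bit, I = 0
S1, I1, N1 = two_ao_descriptors(1.0)        # ion-pair limit: S = 0, I = 1 bit
```

Scanning P from 0 to 1 reproduces the competition of Fig. 3: covalency peaks at the symmetric bond while ionicity takes over in the ion-pair limits, with N(P) fixed at 1 bit throughout.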
In this nonsymmetrical binary channel the molecular input signal p = (P, Q = 1 − P) probes the model bond covalency, while the promolecular input p⁰ = (1/2, 1/2) determines the model IT ionicity descriptor. It should be stressed, however, that this system of indexing, intended for the fixed molecular (equilibrium) geometry, is insensitive to changes in the actual AO coupling strength. For an appropriate modification, which correctly predicts the monotonically decreasing bond order with increasing bond length, see (Nalewajski in press b).
The average entropy/information descriptors of the localized bond A—B in polyatomic systems were similarly shown to adequately approximate the associated Wiberg bond orders, at the same time providing a resolution of this resultant index into its covalent (S_{A,B}) and ionic (I_{A,B}) components.
In MO theory the direct chemical coupling between the "terminal" atomic orbitals on different centers is strongly influenced by their mutual interaction strength and overlap, which together condition the associated energy of their bonding combination (in the constructive AO interference), relative to the initial AO energies. This direct textbook mechanism, e.g., of the π bond between the nearest neighbors (ortho carbons) in the benzene ring, is realized "through space", without any interference of the remaining AO. It generally implies an accumulation of electrons between the atomic nuclei, the so-called bond charge, which may exhibit different polarizations, up to the full electron transfer, in accordance with the electronegativity/hardness differences between the two atoms. In MO theory the chemical multiplicity of such a direct interaction is generally adequately represented by the Wiberg index M_{A,B}.
For more distant atoms, e.g., in the cross-ring (meta and para) π interactions in benzene, these conditions for an effective mixing of AO into the bonding MO combination are not fulfilled. The natural question then arises: are there any additional possibilities for bonding such more distant neighbors in an atomic chain or ring? An example of such a controversy has arisen in explaining the central-bond problem in the small [1.1.1] propellane, for which both the density and entropy-deficiency/displacement diagrams do not confirm the presence of direct chemical bonding, in contrast to the larger [2.2.2] system, where the full central bond has been diagnosed (Nalewajski 2006a, 2010a, 2012a). We also recall that, to justify the existence of "some" direct bonding between atoms without the bond-charge feature, the alternative (correlational) "charge-shift" mechanism has been proposed by Shaik et al. within the generalized Valence-Bond (VB) description. It is realized via charge fluctuations resulting from a strong resonance between the covalent and ionic VB structures of the bond.
However, since in general the HF and KS theories describe the bond patterns in molecules quite well, one should search for some additional long-range bonding capabilities present even at this lowest, one-determinant level of description of the molecular electronic structure, in terms of the occupied subspace of MO in the ground-state electron configuration. One would indeed expect that this level of theory, which has been demonstrated to be perfectly capable of tackling the larger propellanes, should be adequate to treat the smaller systems as well. It has been demonstrated (Nalewajski 2010d, e, 2011b, c, 2012c; Nalewajski and Gurdek 2011, 2012) that in OCT there are indeed additional, indirect sources of entropic multiplicities of chemical interactions, which explain the presence of nonvanishing bond orders between more distant AIM.
More specifically, within the OCT description all communications between AO originating from different centers ultimately generate the associated bond-order contributions. The direct components result from the mutual probability propagation between orbitals (the orbital "dialogue"), which does not involve any orbital intermediaries (Fig. 1). However, the transfer of information can also be realized indirectly, as a "gossip" spread via the remaining orbitals, e.g., in the single-bridge cascade of Fig. 2, obtained from the consecutive combination of the two direct channels. Examples of such "combined" communications in the π system of benzene are shown in Fig. 4. The bond multiplicity M_{A,B|bridges} of such multicenter "bonds" in MO theory can be measured by the sum of the relevant products of the Wiberg indices of all intermediate interactions in the occupied subspace of MO (Nalewajski 2010d, e, 2011c, 2012c; Nalewajski and Gurdek 2011, 2012). Therefore, the larger the bridge, the less information is transferred in this indirect manner, and the weaker the through-bridge interaction. The latter is the most effective when realized through the real chemical bridges, i.e., the mutually (directly) bonded pairs of AIM.
Together the direct (M_{A,B}) and indirect (M_{A,B|bridges}) components determine the resultant bond order in this generalized perspective on the communicational multiplicities of chemical bonds: M(A,B) = M_{A,B} + M_{A,B|bridges}. As an illustration we report the relevant bond-order data (from Hückel theory) for the π interactions in benzene, in the consecutive atom/orbital numbering of Fig. 4. It follows from these results that the artificial distinction of the meta π interaction as completely nonbonding does not hold in this generalized OCT perspective. The strongest, roughly "half" bond, of predominantly direct origin, is indeed detected for the nearest (ortho) neighbors in the ring, while both cross-ring interactions are predicted to give rise to similar, weaker resultant interactions: the meta bond order is exclusively of the bridge origin, while the para bond exhibits comparable direct and indirect contributions. Theoretical analysis of illustrative polymer chains (Nalewajski 2012c; Nalewajski and Gurdek 2012) indicates that this indirect mechanism effectively extends the range of chemical interactions to about three-atom bridges. It has also been demonstrated that the source of these additional interactions lies in the mutual dependencies between the (nonorthogonal) bond projections of AO (Nalewajski and Gurdek 2011). An important conclusion from this analysis is that a given pair of (terminal) AO can still be chemically bonded even when their direct-coupling CBO matrix element vanishes, provided that these two basis functions couple directly to a chain of mutually bonded orbital intermediates. This effectively extends the range of the bonding influence between AO, which is of paramount importance for polymers and for supramolecular, catalytic and biological systems.
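The direct π bond orders behind these statements follow from the Hückel CBO matrix of benzene, for which γ(ortho) = 2/3, γ(meta) = 0 and γ(para) = −1/3, giving Wiberg indices of 4/9, 0 and 1/9, respectively. The sketch below reproduces these numbers and adds the simplest single-bridge product estimate of the indirect meta coupling; note that the normalization of the bridge products used in the cited work is omitted here, so the product is purely illustrative:

```python
import numpy as np

# Hückel π Hamiltonian of the benzene ring (alpha = 0, energies in |beta| units)
n = 6
A = np.zeros((n, n))
for i in range(n):
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0
_, C = np.linalg.eigh(-A)            # ascending orbital energies
gamma = 2.0 * C[:, :3] @ C[:, :3].T  # CBO matrix of the 3 doubly occupied MOs
M = gamma ** 2                       # direct Wiberg indices M_ij = gamma_ij^2

M_ortho, M_meta, M_para = M[0, 1], M[0, 2], M[0, 3]   # 4/9, 0, 1/9

# simplest single-bridge (C1-C2-C3) product estimate of the indirect meta term
M_meta_bridge = M[0, 1] * M[1, 2]    # (4/9)^2: the meta coupling is bridge-only
```

The vanishing direct meta index next to a nonzero bridge product illustrates the text's conclusion: a zero CBO element between terminal AO does not preclude chemical bonding when a chain of mutually bonded intermediates connects them.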

Electron densities as information carriers
The electrons carry the information in molecular systems (Nalewajski 2006a, 2010a, 2012a). Several IT quantities (see Appendix 1) have provided sensitive probes into changes in the equilibrium particle distributions, relative to the promolecular reference, and generated information detectors of chemical bonds or devices monitoring the valence state of AIM. In the orbital approximation the molecular electron density is given by the sum of MO contributions ρ_MO = {ρ_a}: ρ = Σ_a ρ_a. This partition defines the associated MO-additive (add) component, A^add[ρ_MO] = Σ_a A[ρ_a], of the density functional A[ρ] ≡ A^total[ρ_MO] attributed to the given physical property A, and hence also its MO-nonadditive (nadd) part at this resolution level (see Appendix 2): A^nadd[ρ_MO] = A^total[ρ_MO] − A^add[ρ_MO]. Such an MO partitioning of the Fisher information density f[ρ; r] ≡ f^total[ρ_MO; r] (Appendix 1) has been shown to generate the key conditional probability used to define ELF (Becke and Edgecombe 1990). In the search for the entropic origins of chemical bonds the AO resolution is required; it generates the AO-additive partitioning of the promolecular density. This partitioning of the intrinsic-accuracy density f[ρ; r] ≡ f^total[ρ_AO; r] (Appendix 2) leads to the related CG criterion, which represents an effective tool for the bond localization (Nalewajski 2010a, b, 2012a; Nalewajski et al. 2010, 2012a).
Thus, the AO-nonadditive component of the Fisher information density used to define the CG criterion, related to the atomic (promolecular) reference, reads: f^nadd[ρ_AO; r] = f^total[ρ_AO; r] − Σ_i f[ρ_i; r]. It should be recalled at this point that the set of AO densities on the constituent atoms {X} gives rise to the densities of the isolated atoms {ρ_X⁰(r) = Σ_{i∈X} ρ_i(r)}, which by the Hohenberg-Kohn (HK) theorem of DFT (Hohenberg and Kohn 1964) uniquely identify the atomic external potentials (due to the atom's own nucleus), {v_X(r) = v_X[ρ_X⁰; r]}, and hence also the atomic Hamiltonians {H_X(N_X⁰, v_X)}, where N_X⁰ stands for the overall number of electrons in the isolated atom X. Since the atomic positions R = {R_X} are parametrically fixed in the Born-Oppenheimer (BO) approximation, this information suffices to uniquely identify the molecular external potential as well, v(r) = Σ_X v_X(r − R_X), and hence also the molecular Hamiltonian H(N, v), where N = Σ_X N_X⁰ ≡ N⁰, which generates the equilibrium electron distribution ρ(r) of the whole system. Therefore, in the adiabatic approximation there is also a one-to-one mapping from the densities ρ_AO of the isolated atoms to the molecular density, ρ(r) = ρ[ρ_AO; r]. Hence, the definition of the AO-nonadditive component of Eq. (10) can also be interpreted in the spirit of the multifunctional of Eq. (9). The negative values of f^nadd[ρ_AO; r] reflect an extra delocalization of electrons via the system chemical bonds (Nalewajski 2008a, 2010a, b, 2012a; Nalewajski et al. 2010, 2012a). Therefore, the valence basins of its negative values can serve as sensitive detectors of the spatial localization of the direct chemical bonds in a molecule. For two interacting AO this is the case when the gradient of one orbital exhibits a negative projection on the direction of the gradient of the other orbital, which explains the name of the CG probe (Nalewajski 2008a).
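The sign logic of the CG probe is easy to verify in a one-dimensional two-AO sketch: at the bond midpoint the gradient of the (symmetric) molecular density vanishes, while the gradients of the two AO densities do not, so the nonadditive Fisher density is negative there. The Gaussian AOs below are illustrative assumptions, not real atomic orbitals:

```python
import numpy as np

x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]

# two hypothetical normalized Gaussian AOs centered on the "nuclei" at x = -1, +1
chi = lambda x0: np.exp(-(x - x0) ** 2 / 2.0) / np.pi ** 0.25
chiA, chiB = chi(-1.0), chi(1.0)
S_AB = np.sum(chiA * chiB) * dx                       # AO overlap integral
phi = (chiA + chiB) / np.sqrt(2.0 * (1.0 + S_AB))     # normalized bonding MO

f = lambda rho: np.gradient(rho, dx) ** 2 / rho       # Fisher information density
# nonadditive part: molecular (doubly occupied MO) density minus the AO terms
f_nadd = f(2.0 * phi ** 2) - f(chiA ** 2) - f(chiB ** 2)

mid = len(x) // 2        # bond midpoint x = 0, where f_nadd < 0 (CG bond signature)
```

The basin of negative f_nadd between the two centers is precisely the kind of valence basin that the CG criterion uses to locate a direct bond.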
This criterion localizes chemical bonds, regions of an increased electron delocalization, in a manner analogous to the way the inverse of the negative $f^{\mathrm{nadd}}[\rho^{\mathrm{MO}}; r]$ indexes the localization of electrons in the ELF approach (Becke and Edgecombe 1990).
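The sign convention behind the CG criterion can be illustrated with a minimal one-dimensional sketch (our own model, not taken from the source): between the centers of two overlapping Gaussian "orbital" densities the orbital gradients point in opposite directions, so the nonadditive Fisher density is negative in the would-be bonding region.

```python
import numpy as np

# Model: two overlapping 1D Gaussian "orbital" densities centered at +/-1.
# Fisher information density of a distribution rho: f[rho](x) = rho'(x)^2 / rho(x).
x = np.linspace(-4.0, 4.0, 4001)

def gauss(x, mu, sigma=1.0):
    return np.exp(-((x - mu) ** 2) / (2 * sigma**2))

def fisher_density(rho, x):
    drho = np.gradient(rho, x)
    return drho**2 / rho

rho_a, rho_b = gauss(x, -1.0), gauss(x, 1.0)
rho_total = rho_a + rho_b

f_total = fisher_density(rho_total, x)
f_add = fisher_density(rho_a, x) + fisher_density(rho_b, x)  # orbital-additive part
f_nadd = f_total - f_add                                     # nonadditive (CG) part

mid = len(x) // 2    # the "bond" midpoint x = 0
print(f_nadd[mid])   # negative: a basin of electron delocalization between the centers
```

At the midpoint the total gradient vanishes by symmetry while each orbital gradient does not, so the nonadditive term is strictly negative there, mimicking the valence basins discussed above.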
A transformation of the initial (promolecular) distribution of electrons $\rho^0$ into the final molecular equilibrium density $\rho$, as reflected by the familiar density-difference function (see Appendix 2), $\Delta\rho(r) = \rho(r) - \rho^0(r)$, also called the deformation density, can alternatively be probed using several local IT probes (Nalewajski 2006a, 2010a, 2012a) of Appendix 1. For example, the missing-information functional and its density $\Delta s(r) = \rho(r)I(r)$ can be used to monitor changes in the information content due to chemical bonds. It measures the so-called cross- (relative-) entropy referenced to the promolecular state of nonbonded atoms. Alternatively, the corresponding change in the Shannon entropy (see Appendix 1) and its density $\Delta h(r)$ can be used to extract the displacement descriptors of the system chemical bonds (Nalewajski 2006a, 2010a, 2012a). The $\Delta\rho(r)$ and $\Delta s(r)$ probes were shown to be practically equivalent in monitoring the effects of chemical bonds and of the associated AIM promotion. Indeed, these two displacement maps so strongly resemble one another that they are hardly distinguishable. This confirms a close relation between the local density and entropy-deficiency relaxation patterns, thus attributing to the former the complementary IT interpretation of the latter. This strong resemblance between the molecular diagrams also indicates that a local inflow of electrons increases the cross-entropy, while an outflow of electrons gives rise to a diminished level of this relative uncertainty content of the electron density in molecules.
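The claimed sign correlation between $\Delta\rho(r)$ and the entropy-deficiency density $\Delta s(r) = \rho(r)\log[\rho(r)/\rho^0(r)]$ can be checked on a toy one-dimensional "promolecule" (our own construction; the Gaussians and weights are arbitrary and only ensure that the two densities integrate to the same electron number):

```python
import numpy as np

# Toy 1D model: promolecule = two unit Gaussians; the "molecule" shifts some
# charge from the tails into the bond midpoint, conserving the electron number.
x = np.linspace(-6.0, 6.0, 6001)
g = lambda mu: np.exp(-((x - mu) ** 2) / 2)

rho0 = g(-1.0) + g(1.0)              # promolecular density
rho = 0.8 * rho0 + 0.4 * g(0.0)      # molecular density (same integral)

drho = rho - rho0                    # density-difference (deformation) function
ds = rho * np.log(rho / rho0)        # entropy-deficiency (missing-information) density

i_mid = len(x) // 2                  # bond midpoint: inflow of electrons
i_tail = np.argmin(np.abs(x + 3.0))  # outer region: outflow of electrons
print(drho[i_mid] > 0, ds[i_mid] > 0)    # True True: inflow raises the cross-entropy
print(drho[i_tail] < 0, ds[i_tail] < 0)  # True True: outflow lowers it
```

The two probes carry the same sign pattern at both sampled points, in line with the near-equivalence of the $\Delta\rho$ and $\Delta s$ maps noted above.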
The density-displacement and missing-information distributions can thus be viewed as equivalent probes of the system direct chemical bonds. In Fig. 5 we compare the illustrative $\Delta\rho(r)$ and $\Delta h(r)$ plots for a representative set of linear molecules. The main feature of the $\Delta h$ diagrams, an increase in the electron uncertainty in the bonding region between the two nuclei, is due to the inflow of electrons to this region. This manifests the bond-covalency phenomenon, attributed to the electron-sharing effect and to a delocalization of the bonding electrons, now moving effectively in the field of both nuclei. In all these maps one detects a similar nodal structure. The nonbonding regions are seen to exhibit a decreased uncertainty, either due to a transfer of the electron density from this area to the vicinity of the two nuclei and the region between them, or as a result of the orbital hybridization.
Therefore, the molecular entropy difference function also displays all typical features in the reconstruction of electron distribution in a molecule, relative to free atoms. Its diagrams thus provide an alternative information tool for diagnosing the presence of chemical bonds through displacements in the entropy (uncertainty) content of the molecular electron densities.
As an additional illustration we present the combined density-difference, entropy-displacement, and information-distance analysis of the central C′-C′ bond in small propellanes. The main objective of this study was to examine the effect of a successive increase in the size of the carbon bridges in the series of the [1.1.1], [2.1.1], [2.2.1], and [2.2.2] propellanes. Figure 7 reports the contour maps of the density-difference function $\Delta\rho(r)$, the missing-information density $\Delta s(r)$, and the local entropy displacement $\Delta h(r)$, for the planes of sections displayed in Fig. 6. The relevant ground-state densities have been generated in DFT-LDA calculations using the extended (DZVP) basis set.
The density-difference plots show that in the small [1.1.1] and [2.1.1] propellanes there is, on average, a depletion of the electron density between the bridgehead carbon atoms, relative to the atomic promolecule, while the larger [2.2.1] and [2.2.2] systems exhibit a net density buildup in this region. A similar conclusion follows from the entropy-displacement and entropy-deficiency plots of the figure. The two entropic maps are again seen to be qualitatively similar to the corresponding density-difference plots. This resemblance is particularly strong between the $\Delta\rho(r)$ and $\Delta s(r)$ diagrams shown in the first two columns of the figure.
Therefore, all these bond detectors predict an absence of the direct chemical bonding in the two small propellanes and its full presence in the larger systems. This qualitatively agrees with predictions from the simple models of the localized chemical bonds in the smallest and largest of these propellanes, which are summarized in Fig. 8. Indeed, in the minimum-basis-set framework the bond structure in these two systems can be qualitatively understood in terms of the localized MO resulting from interactions between the directed orbitals on neighboring atoms, and of the nonbonding electrons occupying the non-overlapping atomic hybrids.
In the smallest [1.1.1] system the nearly tetrahedral (sp 3 ) hybridization on both bridgehead and bridging carbons is required to form the chemical bonds of the three carbon bridges and to accommodate two hydrogens on each of the bridge carbons. Thus three sp 3 hybrids on each of the bridgehead atoms are used to form the chemical bonds with the bridge carbons and the fourth hybrid, directed away from the central-bond region, remains nonbonding and singly occupied.
In the largest [2.2.2] propellane the two central carbons acquire a nearly trigonal (sp²) hybridization to form bonds with their bridge neighbors; each is then left with a single 2p$_\sigma$ orbital, directed along the central-bond axis and not used in this hybridization scheme, which is now available to form a regular σ bond, the through-space component of the overall multiplicity of the central C′-C′ bond. This explains the missing direct interaction in the smaller (diradical) [1.1.1] propellane and its full presence in the larger [2.2.2] system.

The nonadditive Fisher-information densities in the MO and AO resolutions have also been shown to provide sensitive tools for localizing electrons and chemical bonds in molecular systems, through the ELF (or IT-ELF) (Nalewajski 2010a, 2012a; Nalewajski et al. 2005; Becke and Edgecombe 1990; Silvi and Savin 1994; Savin et al. 1997) and CG (Nalewajski 2008a, 2010b, 2012b; Nalewajski et al. 2010, 2012a) concepts, respectively. The electron-localization property of the former, proportional to the inverse of the square (ELF) or to the simple inverse (IT-ELF) of the negative $f^{\mathrm{nadd}}[\rho^{\mathrm{MO}}; r]$ (Becke and Edgecombe 1990), is demonstrated in Fig. 9, which reports an application of these tools in a study of the central bond in the [1.1.1] and [2.2.2] propellanes of the preceding section. This analysis again indicates an absence of the direct bond in the smaller system and its full presence in the larger propellane.
The density $f^{\mathrm{nadd}}[\boldsymbol{\rho}^{\mathrm{AO}}; r]$ of the nonadditive Fisher information in AO resolution is shown in Figs. 10, 11, 12 and 13. It is seen to represent an efficient CG tool for bond localization, with the valence basins of its negative values, signifying an increased electron delocalization, now identifying the bonding regions in molecules (Nalewajski 2010a, b, 2012a; Nalewajski et al. 2010, 2012a). Accordingly, the areas of its positive values identify the nonbonding regions of the molecule. They represent a relative contraction/localization of electrons as a result of the AIM polarization/hybridization in their valence states, due to the presence of the bond partners in the molecule. In Fig. 13 this CG analysis has been employed to study the central-bond problem in propellanes. Each row in the figure corresponds to the specified molecular system, with the left diagram displaying the contour map in the section perpendicular to the central bond, at its midpoint, and the right diagram corresponding to the bond section containing the bridge carbon (Fig. 6 shows the propellane structures and the planes of sections containing the bridge and bridgehead (C′) carbon atoms, identified by black circles). This independent analysis generally confirms our previous conclusions of the practical absence of the direct C′-C′ bond in the smallest propellane, and of its full presence in the largest compound. However, this transition is now seen to be less abrupt, since even in the [1.1.1] system one detects a small central bonding region, which gradually increases with successive bridge enlargements. The figure also demonstrates the efficiency of the CG criterion in localizing the remaining C-C and C-H bonds.

It follows from the virial-theorem analysis of the diatomic BO potential that changes in the bond energy with the internuclear distance can be uniquely partitioned into the associated displacements of its kinetic and potential components.
In this overall perspective the electronic kinetic energy drives the atomic approach only at an early stage of the bond-formation process, while at the equilibrium separation between the nuclei it constitutes a destabilizing factor. Indeed, it is the potential energy of electrons which is globally responsible for the chemical bonding at this stage. In other words, the overall contraction (promotion) of atomic densities in a molecule, which increases the kinetic energy relative to the separated-atom limit, dominates the truly bonding kinetic-energy contribution reflected by the nonadditive Fisher information. This is why most physical interpretations of the origins of the direct chemical bond emphasize the potential component, e.g., a stabilization of the diatomic system due to an attraction of the screened nuclei to the shared ''bond charge'' and the contraction of atomic distributions in the molecular external potential. In this interpretation the kinetic energy is regarded only as a prior ''catalyst'' of the bond formation, at an earlier stage of the atomic approach.
In the CG approach a separation of the nonadditive component of the kinetic energy eliminates the atom-promotion effects, which effectively hide the bonding influence of this energy contribution, effective also at the equilibrium separation between the nuclei. As we have seen in this section, this partition is also vital for gaining an insight into the information origins of the chemical bond.

Nonclassical Shannon entropy
As we have already stressed in the ''Introduction'' section, there is a need for designing quantum-generalized measures of the information content, applicable to the complex probability amplitudes (wave functions) encountered in quantum mechanics. The classical measures (summarized in Appendix 1) probe only the probability distribution, related to the modulus of the system state function, neglecting the information content of its phase (current) component. Elsewhere (Nalewajski 2008a) the author has already examined a natural quantum extension of the classical Fisher information [Eq. (33)], which introduces the nonclassical term due to the probability current [Eq. (63)] (Nalewajski 2008a, 2010a, 2012a). One could expect that a similar generalization of the classical Shannon entropy S[p] [Eq. (34)] is required in quantum mechanics, to include the relevant phase/current contribution. In the remaining part of this overview we therefore discuss, for the first time, the entropy content of the phase feature of the molecular quantum states. We shall address this problem using the already known quantum contribution to the Fisher information, by adopting the natural requirement that the relations between the known classical information densities of the Fisher and Shannon measures should also hold for their quantum complements. The related issues of the phase current and information continuity are addressed in Appendix 3.
It follows from Eq. (63) that both the electron distribution p(r) and its current j(r) determine the resultant quantum Fisher information content I[p, j]. Its first, classical part I[p] explores the information contained in the probability distribution, while the nonclassical contribution I[j] measures the gradient information in the probability current, i.e., in the phase gradient of Eq. (61). Thus, the quantum Fisher functional I[ψ] symmetrically probes the gradient content of both aspects of the complex wave function: the classical Fisher information measures the ''length'' of the ''reduced'' gradient of the probability density, $\nabla p/\sqrt{p}$, while the other contribution represents the corresponding ''length'' of the similarly reduced vector of the probability current density, $\bar{j}$. In the one-electron system of Appendix 3 this generalized measure becomes identical with the classical functional I[p] only for the stationary quantum states, characterized by the time-independent probability amplitude R(r) = u(r) and the position-independent phase Φ(t) = −ωt. These two information contributions in Eq. (63) can alternatively be expressed in terms of the real and imaginary parts of the gradient of the wave-function logarithm, $\nabla\ln\psi = (\nabla\psi)/\psi$. Therefore, these complementary components of the quantum Fisher information have a common interpretation in quantum mechanics, as the p-weighted averages of the gradient content of the real and imaginary parts of the logarithmic gradient of the system wave function, thus indeed representing a natural (complex) generalization of the classical (real) gradient information measure [Eq. (31)].
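The symmetric splitting described above can be checked numerically. In the sketch below (our own construction, atomic units ħ = m = 1, so that the current is $j = \mathrm{Im}(\psi^*\nabla\psi)$), a Gaussian density is given a position-dependent phase, and the resultant gradient measure $I[\psi] = 4\int|\nabla\psi|^2 dx$ is recovered as the sum of the classical term $\int(\nabla p)^2/p\, dx$ and the current term $4\int j^2/p\, dx$:

```python
import numpy as np

# Sketch: the resultant quantum Fisher information I[psi] = 4*Int|psi'|^2 dx
# splits into the classical part I[p] = Int (p')^2/p dx and the nonclassical
# current part I[j] = 4*Int j^2/p dx, with j = Im(psi* psi')  (hbar = m = 1).
x = np.linspace(-8.0, 8.0, 16001)
dx = x[1] - x[0]
p = np.exp(-x**2) / np.sqrt(np.pi)        # normalized probability density
phase = 0.7 * x + 0.3 * x**2              # an arbitrary position-dependent phase
psi = np.sqrt(p) * np.exp(1j * phase)     # complex amplitude psi = R exp(i*Phi)

dpsi = np.gradient(psi, x)
dp = np.gradient(p, x)
j = np.imag(np.conj(psi) * dpsi)          # probability current density

I_quantum = 4.0 * np.sum(np.abs(dpsi) ** 2) * dx
I_classical = np.sum(dp**2 / p) * dx
I_current = 4.0 * np.sum(j**2 / p) * dx
print(abs(I_quantum - (I_classical + I_current)) < 1e-3 * I_quantum)  # True
```

For a constant phase the current term vanishes and the quantum measure collapses to the classical I[p], matching the stationary-state limit noted in the text.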
Let us examine some properties of the resultant density of this generalized Fisher information, or of the associated information density per electron. The latter is generated by the squares of the local values of the related quantities per electron: the probability gradient, $(\nabla p)^2$, and the current density, $(\bar{j})^2$, which ultimately shape the classical and nonclassical information terms in quantum mechanics. This expression emphasizes the basic equivalence of the roles played by the probability density and its current in shaping the resultant value of the generalized Fisher information density. We now search for a possible relation between this classical gradient density and the classical Shannon entropy density per electron. The logarithm of the probability density is seen to also shape the classical part of the Shannon entropy density per electron, $\tilde{s}^{\mathrm{class.}} = -\ln p$, giving rise to the gradient $\nabla\tilde{s}^{\mathrm{class.}} = -(\nabla p)/p$.
Hence, the two classical information densities per electron are indeed related, with the classical Fisher measure $\tilde{f}^{\mathrm{class.}} = (\nabla\ln p)^2$ representing the squared gradient of the associated classical Shannon density $\tilde{s}^{\mathrm{class.}} = -\ln p$. This relation can now be used to introduce the unknown nonclassical part $\tilde{s}^{\mathrm{nclass.}}$ of the density per electron of the generalized Shannon entropy in quantum mechanics, $\tilde{s}[\psi] = \tilde{s}^{\mathrm{class.}}[p] + \tilde{s}^{\mathrm{nclass.}}[p, \Phi]$, which includes the nonclassical term $\tilde{s}^{\mathrm{nclass.}}[p, \Phi] = \tilde{s}^{\mathrm{nclass.}}[p, j]$. This can be accomplished by postulating that the relation of Eq. (22) also holds for the nonclassical information contributions. Therefore, the gradient of the nonclassical part of the generalized Shannon entropy is proportional to the probability current per electron, i.e., to the velocity of the probability fluid, and hence, to an additive constant [see Eq. (60)], to the phase gradient itself. Alternatively, the negative of the phase modulus or its multiplicity can be used as the nonclassical (quantum) complement of the classical entropy density, to conform to the maximum entropy level at the system ground state. We thus conclude that, to a physically irrelevant constant, the phase function of Eq. (60) can itself be regarded as the nonclassical part of the Shannon information density per particle. It gives a transparent division of the quantum-generalized entropy density:

$$\tilde{s}[\psi] = \tilde{s}^{\mathrm{class.}}[R] + \tilde{s}^{\mathrm{nclass.}}[\Phi] = -2\ln R + 2\Phi \quad \mathrm{or} \quad s[\psi] = s^{\mathrm{class.}}[p] + s^{\mathrm{nclass.}}[p, \Phi] = -p\ln p + 2p\Phi \equiv -p\ln p + 2\phi. \quad (26)$$

The density per particle of the generalized Shannon entropy, $\tilde{s}[\psi]$, is thus divided into the familiar classical component $\tilde{s}^{\mathrm{class.}}[R]$, determined solely by the wave-function modulus factor R(r), and the nonclassical supplement $\tilde{s}^{\mathrm{nclass.}}[\Phi]$, reflecting the phase function of the system wave function.
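The chain of identifications invoked above can be sketched compactly (our notation, consistent with Eq. (26); the per-electron current satisfies $j/p = \nabla\Phi$ in units ħ = m = 1):

```latex
% Classical link between the two information densities per electron:
\tilde{f}^{\,\mathrm{class.}} = (\nabla \ln p)^2
  = \bigl(\nabla \tilde{s}^{\,\mathrm{class.}}\bigr)^2 ,
\qquad
\tilde{s}^{\,\mathrm{class.}} = -\ln p = -2\ln R .
% Postulated nonclassical analog (hbar = m = 1, j/p = \nabla\Phi):
\tilde{f}^{\,\mathrm{nclass.}} = 4\,(\nabla\Phi)^2
  \equiv \bigl(\nabla \tilde{s}^{\,\mathrm{nclass.}}\bigr)^2
\;\;\Longrightarrow\;\;
\tilde{s}^{\,\mathrm{nclass.}} = \pm 2\Phi + \mathrm{const.}
```

Choosing the $+2\Phi$ branch and dropping the irrelevant constant reproduces the division of Eq. (26), $\tilde{s}[\psi] = -2\ln R + 2\Phi$.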
Let us now examine the associated source (see Appendix 3) of the quantum Shannon entropy functional:

Conclusion
In this short overview of the information origins of chemical bonds and of the quantum generalization of the classical information measures we have surveyed a wide range of IT concepts and techniques which can be used to probe various aspects of chemical interactions between atomic orbitals in molecules. Alternative measures of the classical information densities were used to explore the ''promotional'' and ''interference'' displacements of the bonded atoms in the molecular environment, relative to the system promolecular reference. Nonadditive parts of the gradient (Fisher) information density in the MO and AO resolutions, which define the ELF and CG localization criteria, respectively, were shown to provide efficient and sensitive tools for locating electrons and bonds in molecular systems.
In OCT, which regards a molecule as a communication network in AO resolution, with AO constituting both the elementary ''emitters'' and ''receivers'' of the electron AO-assignment signals, the typical descriptors of the channel average ''noise'' (conditional entropy) and of the information flow (mutual information) constitute adequate IT descriptors of the molecular overall entropy covalency and information ionicity, due to all chemical bonds in the system under consideration. By shaping the input signal in these networks and by ''reducing'' (condensing) (Nalewajski 2006a) the resolution level of these AO probabilities, it is also possible to extract the internal and external IT descriptors of specific bonds in molecular fragments, e.g., the localized bond multiplicities of diatomic subsystems, as well as measures of the information coupling between bonds located in different parts of the molecule.
The cascade generalization of OCT and the indirect extension of the classical Wiberg bond-order concept both suggest a novel, indirect (bridge) mechanism of chemical interactions, via orbital intermediates, which complements the familiar direct bonding mechanism via the constructive interference of the valence AO. The model study of linear polymers indicates that this new component effectively extends the bonding range up to three intermediate bonds in the polymer chain. In this generalized perspective on chemical interactions the overall bond multiplicity reflects all nonadditivities between AO on different atoms. On the one hand, this dependency is realized directly, via the bonding combination of AO, which generally generates the bond-charge accumulation between the nuclei of the interacting atoms. On the other hand, it also has an indirect source, due to the chemical coupling to the remaining, directly interacting orbitals in the molecule, in the whole bond system determined by the subspace of the occupied MO. The intermediate communications (bonds) between AO were shown to result from the implicit interdependencies between the (nonorthogonal) bond projections of AO, i.e., from their simultaneous involvement in all chemical bonds of the molecule.
In OCT the intermediate bonds can be linked to the information transfer via the AO cascade, in which the AO inputs and outputs are linked by single or multiple AO bridges. Both MO theory and IT allow one to generate the through-bridge multiplicities of such indirect couplings between more distant AO, which complement the familiar direct bond orders in the corresponding resultant indices combining the through-space and through-bridge components. This generalized outlook on the bond pattern in molecules was shown to give a more balanced picture of the weaker (cross-ring) π bonds in benzene, with the meta interactions (directly nonbonding) exhibiting a strong indirect bond multiplicity.
In the closing section of this article we have emphasized the need for quantum extensions of the classical information measures, in order to accommodate the complex wave functions (probability amplitudes) of the quantum-mechanical description. The appropriate generalization of the gradient (Fisher) information introduces the information contribution due to the probability current, giving rise to a nonvanishing information source. We have similarly introduced the phase current of the complex wave function, and generalized the Shannon entropy by including the (additive) phase contribution. This extension has been accomplished by requiring that the relation between the classical Shannon and Fisher information densities should also reflect the relation between their nonclassical (quantum) contributions.
The variational rules involving these quantum measures of information, for the fixed ground-state electron density (probability distribution), provide ''thermodynamic'' Extreme Physical Information (EPI) principles (Frieden 1998) for determining the equilibrium states of the quantum system in question (Nalewajski in press a, c; Nalewajski submitted).
Open Access This article is distributed under the terms of the Creative Commons Attribution License which permits any use, distribution, and reproduction in any medium, provided the original author(s) and the source are credited.

Appendix 1: Classical information measures
Here we briefly summarize the classical IT quantities, related to the particle probability distributions alone, which are used in this survey to describe the bonding patterns in molecules. We begin with the historically first (local) measure of Fisher (Fisher 1925; Frieden 1998), formulated at about the same time as the final shape of Quantum Theory emerged. For the local events of finding an electron at r with probability p(r), the shape factor of the density $\rho(r) = Np(r)$ in an N-electron system, this approach defines the gradient measure of the average information content, $I[p] = \int [\nabla p(r)]^2/p(r)\, dr$. This classical Fisher information in p(r), also called the intrinsic accuracy, is reminiscent of the familiar von Weizsäcker (1935) inhomogeneity correction to the kinetic-energy functional in the Thomas-Fermi-Dirac theory. It characterizes the effective ''localization'' (compactness) of the random (position) variable around its average value. For example, in the normal (Gauss) distribution the Fisher information measures the inverse of its variance. The functional I[p] can be simplified when expressed in terms of the classical probability amplitude $A(r) = \sqrt{p(r)}$: $I[p] = 4\int [\nabla A(r)]^2\, dr$. In order to cover the quantum (complex) amplitudes of quantum mechanics, this classical measure has to be appropriately generalized as the square of the modulus of the wave-function gradient (Nalewajski 2008a, 2010a, 2012b). For example, in a single-electron state $\psi(r)$, when $p(r) = \psi^*(r)\psi(r) = |\psi(r)|^2$, $I[\psi] = 4\int |\nabla\psi(r)|^2\, dr \propto T[\psi]$, where T[ψ] stands for the expectation value of the particle kinetic energy. Therefore, this quantum-generalized (local) measure of Fisher, proportional to the particle average kinetic energy, probes the length of the amplitude gradient $\nabla\psi(r)$.
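The variance statement can be confirmed numerically; the following minimal sketch (our own, not from the source) integrates the gradient measure for a normal distribution:

```python
import numpy as np

# For a normal distribution p(x), the Fisher measure I[p] = Int (p')^2/p dx
# equals the inverse of its variance, 1/sigma^2.
sigma = 1.5
x = np.linspace(-12.0, 12.0, 24001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))

dp = np.gradient(p, x)
I = np.sum(dp**2 / p) * dx
print(round(I, 4), round(1.0 / sigma**2, 4))  # both 0.4444
```

A narrower (more ''localized'') distribution thus carries a larger Fisher information, in line with the compactness interpretation above.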
The other popular information measure has been introduced by Shannon (Shannon 1948; Shannon and Weaver 1949). This complementary (global) descriptor of the average information content in the normalized probability distribution p(r), or in its discrete representation p = {p_i} of the orbital events in the molecule, called the Shannon entropy, reflects the indeterminacy (''spread'') of the random variable(s) involved around the corresponding average value(s). It measures the average amount of information received when this uncertainty is removed by an appropriate ''localization'' experiment.
An important generalization of this information measure, called the directed divergence, the cross- (relative-) entropy, or the entropy deficiency, has been proposed by Kullback and Leibler (Kullback and Leibler 1951; Kullback 1959). It reflects the information ''distance'' between two normalized distributions defined for the same set of elementary events. For example, the missing information $\Delta S[p|p^0]$ in the continuous probability distribution p(r), relative to the reference probability density $p^0(r)$, is given by the average value of the surprisal, $I(r) = \log[p(r)/p^0(r)] \equiv \log w(r)$, the logarithm of the local enhancement factor w(r): $\Delta S[p|p^0] = \int p(r) I(r)\, dr$. The information distance $\Delta S(p|p^0)$ between the two discrete distributions p = {p_i} and $p^0 = \{p_i^0\}$ similarly reads $\Delta S(p|p^0) = \sum_i p_i \log(p_i/p_i^0)$. The (non-negative) entropy deficiency reflects the information similarity between the two compared probability distributions: the more they differ from one another, the higher the information distance, which identically vanishes only when they are identical.

For two dependent (discrete) probability schemes, for events a = {a_i} and b = {b_j}, respectively, one decomposes the joint probabilities of the simultaneous events $a\wedge b = \{a_i \wedge b_j\}$, $P(a\wedge b) = \{P(a_i \wedge b_j) = p_{i,j}\}$, as products of the ''marginal'' probabilities of events in one set, say a, and the corresponding conditional probabilities $P(b|a) = \{P(j|i)\}$ of outcomes in the other set b, given that the events a have already occurred: $\{p_{i,j} = p_i P(j|i)\}$. The relevant normalization conditions for the joint and conditional probabilities then read $\sum_i \sum_j p_{i,j} = 1$ and $\sum_j P(j|i) = 1$. The Shannon entropy of this ''product'' distribution of simultaneous events can be expressed as the sum of the average entropy in the marginal probability distribution, S(p), and the average conditional entropy in q given p, $S(q|p) = -\sum_i \sum_j p_{i,j} \log P(j|i)$. The latter represents the extra amount of uncertainty about the occurrence of events b, given that the events a are known to have occurred.
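This grouping (additivity) property can be checked with a small numerical example (the joint probabilities below are arbitrary, chosen only to make the two events dependent):

```python
import numpy as np

# Sketch: entropy additivity S(a^b) = S(a) + S(b|a) for a small joint
# distribution of two dependent binary events.
P = np.array([[0.40, 0.10],
              [0.15, 0.35]])          # joint probabilities p_ij, sum = 1

p = P.sum(axis=1)                     # marginal distribution of events a
cond = P / p[:, None]                 # conditional probabilities P(j|i)

def S(dist):                          # Shannon entropy in bits
    dist = np.asarray(dist).ravel()
    dist = dist[dist > 0]
    return -np.sum(dist * np.log2(dist))

S_joint = S(P)
S_cond = np.sum(p * [S(row) for row in cond])   # average conditional entropy S(b|a)
print(np.isclose(S_joint, S(p) + S_cond))       # True
```

The check confirms that the uncertainty of the joint scheme equals the marginal uncertainty plus the residual conditional uncertainty, as stated above.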
In other words, the amount of information obtained as a result of simultaneously observing the events a and b equals the amount of information in one set, say a, supplemented by the extra information provided by the occurrence of events in the other set b, when a are known to have occurred already. This additivity property is qualitatively illustrated in Fig. 14.
The common amount of information in two events a_i and b_j, I(i:j), measuring the information about a_i provided by the occurrence of b_j, or the information about b_j provided by the occurrence of a_i, is called the mutual information in the two events: $I(i{:}j) = \log[p_{i,j}/(p_i q_j)]$. It may take on any real value: positive, negative, or zero. It vanishes when both events are independent, i.e., when the occurrence of one event does not influence (or condition) the probability of the occurrence of the other event, and it is negative when the occurrence of one event makes a nonoccurrence of the other event more likely. It also follows from the preceding equation that $I(i{:}j) = I(i) + I(j) - I(i\wedge j)$, where the self-information of the joint event $I(i\wedge j) = -\log p_{i,j}$. Thus, the information in the joint occurrence of the two events a_i and b_j is the information in the occurrence of a_i, plus that in the occurrence of b_j, minus the mutual information. Therefore, for independent events, when $p_{i,j} = p_i q_j$, $I(i{:}j) = 0$ and $I(i\wedge j) = I(i) + I(j)$. The mutual information of an event with itself defines its self-information, $I(i{:}i) \equiv I(i) = -\log p_i$. It vanishes when $p_i = 1$, i.e., when there is no uncertainty about the occurrence of a_i, so that the occurrence of this event removes no uncertainty and hence conveys no information. This quantity provides a measure of the uncertainty about the occurrence of the event, i.e., of the information received when the event actually occurs. The Shannon entropy of Eq. (34) can thus be interpreted as the mean value of the self-informations in all individual events: $S(p) = \sum_i p_i I(i)$.

Fig. 14 Schematic diagram of the conditional-entropy and mutual-information quantities for two dependent probability distributions p = P(a) and q = P(b). Two circles enclose areas representing the entropies S(p) and S(q) of the two separate probability vectors, while their common (overlap) area corresponds to the mutual information I(p:q) in these two distributions. The remaining part of each circle represents the corresponding conditional entropy, S(p|q) or S(q|p), measuring the residual uncertainty about events in one set, when one has the full knowledge of the occurrence of events in the other set of outcomes. The area enclosed by the circle envelope then represents the entropy of the ''product'' (joint) distribution.

One similarly defines the average mutual information in the two probability distributions as the p-weighted mean value of the mutual-information quantities for the individual joint events: $I(p{:}q) = \sum_i \sum_j p_{i,j} I(i{:}j) \geq 0$, where the equality holds for the independent distributions, when $p_{i,j} = p_{i,j}^0 = p_i q_j$. Indeed, the amount of uncertainty in q can only decrease when p has been known beforehand, $S(q) \geq S(q|p) = S(q) - I(p{:}q)$, with the equality being observed when the two sets of events are independent, thus giving rise to non-overlapping entropy circles. These average entropy/information relations are also illustrated in Fig. 14.
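The average mutual information can be computed either from the joint-event surprisals or from the entropy combination $S(p) + S(q) - S(p\wedge q)$; the sketch below (our own arbitrary numbers) verifies that both routes agree:

```python
import numpy as np

# Sketch: the average mutual information equals the entropy deficiency
# (directed divergence) of the joint distribution relative to the
# independent-events reference, and also S(p) + S(q) - S(joint).
P = np.array([[0.40, 0.10],
              [0.15, 0.35]])               # joint probabilities of dependent events
p, q = P.sum(axis=1), P.sum(axis=0)        # marginal distributions
P0 = np.outer(p, q)                        # independent reference p_i * q_j

I_mutual = np.sum(P * np.log2(P / P0))     # Delta S(joint | reference) = I(p:q)

def S(d):
    d = np.asarray(d).ravel()
    return -np.sum(d[d > 0] * np.log2(d[d > 0]))

print(np.isclose(I_mutual, S(p) + S(q) - S(P)))  # True
```

Because the two events here are dependent, I_mutual is strictly positive; replacing P by P0 would make it vanish.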
It should be observed that the average mutual information is an example of the entropy deficiency, measuring the missing information between the joint probabilities $P(a\wedge b) = \pi$ of the dependent events a and b and the joint probabilities $P^{\mathrm{ind.}}(a\wedge b) = \pi^0 = p^{\mathrm{T}} q$ for the independent events: $I(p{:}q) = \Delta S(\pi|\pi^0)$. The average mutual information thus reflects the degree of dependence between the events defining the two probability schemes. A similar information-distance interpretation can be attributed to the conditional entropy: $S(p|q) = S(p) - \Delta S(\pi|\pi^0)$.

We conclude this section with some rudiments of the Shannon communication theory (Nalewajski 2006a, 2010a, 2012a; Shannon 1948; Shannon and Weaver 1949; Abramson 1963; Pfeifer 1978). The main elements of a typical communication device are shown in Fig. 15. Its probability scattering is reflected by the rectangular matrix P(b|a) of the conditional probabilities of observing the events b in its output (''outputs'', for short), given the inputs a. This communication network determines the complementary descriptors of the two dependent probability distributions of such an information system: the input distribution p = P(a), reflecting the way the device is used (exploited), and the output probability vector $q = P(b) = p\,P(b|a)$. They involve the conditional entropy S(q|p) of the output, given the input, a measure of the channel average communication ''noise'' (dissipated information), and the mutual information in the two distributions, I(p:q), which reflects the amount of information flowing through the network. In the orbital channel both a and b involve all AO contributed by the constituent atoms of the molecule to form the system chemical bonds, $a \equiv \boldsymbol{\chi}$ and $b \equiv \boldsymbol{\chi}'$, so that the orbital channel is defined by the square matrix of conditional probabilities $P(\boldsymbol{\chi}'|\boldsymbol{\chi}) = \{P(j|i)\}$.
Its scattered information S(q|p) has been interpreted as the overall IT measure of the bond covalency, while the complementary descriptor I(p:q) reflects the average IT ionicity of all chemical bonds (Nalewajski 2000, 2004b, c, 2005a, b, c, 2006a, c, d, e, f, g, 2007, 2008b, c, d, 2009a, b, c, d, e, f, g, 2010a, c, 2011a, b, 2012a; Nalewajski and Jug 2002; Nalewajski et al. 2012b).
By appropriately shaping the input ''signal'', e.g., $p = p^0$, one can influence the amount of the channel information flow. The sum of both these descriptors gives the overall IT index of the bond multiplicity: $N(p; q) = I(p{:}q) + S(q|p) = \Delta S(q|p) + S(p)$. Therefore, for the stationary molecular channel in AO resolution, when p = q and hence $\Delta S(q|p) = 0$, the overall index amounts to the Shannon entropy of the molecular distribution p.
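These channel descriptors are easy to compute for a model network. The 2×2 conditional-probability matrix below is a hypothetical symmetric binary channel (our own illustration, not the OCT two-orbital network of the source); it realizes the stationary case p = q, for which the overall index reduces to S(p):

```python
import numpy as np

# Covalent ("noise") and ionic ("flow") descriptors of a model channel,
# and their sum N(p;q) = I(p:q) + S(q|p), the overall bond-multiplicity index.
PC = np.array([[0.9, 0.1],
               [0.1, 0.9]])            # P(b|a): rows sum to 1
p = np.array([0.5, 0.5])               # input ("signal") distribution
q = p @ PC                             # output distribution (= p here)

joint = p[:, None] * PC                # joint probabilities p_i * P(j|i)

def S(d):
    d = np.asarray(d).ravel()
    return -np.sum(d[d > 0] * np.log2(d[d > 0]))

S_cond = S(joint) - S(p)               # conditional entropy S(q|p): "noise"
I_flow = S(q) - S_cond                 # mutual information I(p:q): "flow"
N_total = I_flow + S_cond              # overall IT bond-multiplicity index
print(S_cond, I_flow, N_total)         # N_total = S(p) = 1 bit here
```

Making the channel noisier (rows closer to uniform) shifts weight from the flow (ionic) descriptor to the noise (covalent) one, while their sum stays at 1 bit for this symmetric stationary input.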
The most effective use of the communication device involves the input signal p* for which the information flow reaches its maximum value (the channel capacity). The logarithm in these information measures is taken to an arbitrary but fixed base. In keeping with the custom in works on IT, the logarithm taken to base 2 corresponds to information measured in bits (binary digits), while selecting log = ln expresses the amount of information in nats (natural units): 1 nat = 1.44 bits.

Appendix 2: Orbital nonadditivities
There are two limiting configurations (references) in the bond-formation process: a collection of the separated constituent atoms in their respective ground states, and the ground state of the molecule as a whole. Indeed, the chemical-bond phenomena reflect all changes in the electronic structure marking a transition from the initial state of the equilibria in the isolated atoms to the final stage of the resultant equilibrium density in the molecular system as a whole. The electron redistribution between AIM, relative to the electron density ρ⁰(r) = Σᵢ ρᵢ(r) of the isoelectronic promolecular reference, a collection of the free atoms shifted to their molecular positions, where the occupied AO densities ρᵢ(r) = Nᵢ⁰|χᵢ(r)|², Nᵢ⁰ stands for the ground-state occupation of the ith AO and Σᵢ Nᵢ⁰ = N, is customarily probed by the density-difference function of quantum chemistry, Δρ(r) = ρ(r) − ρ⁰(r). The bonded atoms are known to strongly resemble the corresponding free (neutral) atoms. In other words, they are only slightly modified as a result of the bond formation, mainly in their valence shells, exhibiting some polarization toward their bonding partners and an effective charge due to the electron transfer between AIM (open subsystems). The main sources of these changes are the most "polarizable" and superposable valence electrons of the constituent atoms, which delocalize via the system chemical bonds determined by the subspace of the occupied MO. Since each probability distribution carries an associated information content, this shift in the electron density implies a corresponding change in the distribution of electronic information. Examining these changes and designing proper IT tools for extracting bonded atoms, locating electrons and bonds, and determining their multiplicities and covalent/ionic composition were the main objectives of the IT analysis, e.g., (Nalewajski 2006a, 2010a, 2012a).
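The promolecular reference and the density-difference function can be illustrated with a deliberately simple one-dimensional toy model (Python/NumPy; the Gaussian "atomic" densities, bond length, and polarization shift are all hypothetical, meant only to mimic the slight valence polarization of bonded atoms toward the bond region):

```python
import numpy as np

x = np.linspace(-8.0, 8.0, 4001)
dx = x[1] - x[0]

def gauss(x, mu, N, sigma=1.0):
    """Normalized 1D 'atomic' density carrying N electrons, centered at mu."""
    return N * np.exp(-(x - mu)**2 / (2.0*sigma**2)) / (sigma*np.sqrt(2.0*np.pi))

R_bond = 2.0     # hypothetical internuclear distance
shift = 0.15     # hypothetical polarization of each atom toward the midpoint

# promolecule: free-atom densities placed at the molecular positions
rho0 = gauss(x, -R_bond/2, 1.0) + gauss(x, +R_bond/2, 1.0)
# "molecule": each atomic density slightly polarized toward the bond midpoint
rho = gauss(x, -R_bond/2 + shift, 1.0) + gauss(x, +R_bond/2 - shift, 1.0)

drho = rho - rho0           # density-difference function, delta rho(r)
N0 = rho0.sum() * dx        # total electron number of the promolecule
dN = drho.sum() * dx        # isoelectronic reference: integrates to ~0
```

In this toy model Δρ is positive at the bond midpoint (charge accumulation between the nuclei) and negative on the outer flanks, while integrating to zero, as required for the isoelectronic promolecular reference.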
The chemical interpretation of the bond-formation process calls for an appropriate resolution level. The most natural in the SCF LCAO MO theories are the MO, φ = {φ_a}, expressed as linear combinations φ = χC of the (orthonormal) AO χ = {χᵢ} contributed by the constituent free atoms. Here, the LCAO matrix C = (C_occ | C_virt) groups the columns of expansion coefficients determining the occupied (occ) and virtual (virt) MO subspaces, respectively. Finding electrons in these molecular or atomic states determines the set of elementary "events" in the molecule and promolecule, respectively. Their probabilities and the associated orbital occupations, n^MO = {n_a} in the molecule and N^AO = {Nᵢ⁰} in the promolecule, with Σ_a n_a = N and Σᵢ Nᵢ⁰ = N⁰, then represent the associated condensed (discrete) descriptors. These occupations are given by the diagonal elements of the corresponding (canonical) density matrices, where P̂_φ^occd and P̂_χ^occd stand for the projection operators onto the occupied subspaces of MO in the molecule and of AO in the promolecule, respectively. In other words, at the simplest SCF MO level of theoretical description the MO constitute the "Natural Orbitals" (NO) of the molecule, while the AO provide the corresponding NO set of the promolecule; these sets diagonalize the associated canonical density matrices γ^MO and γ^{0,AO}, respectively.
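A minimal numerical sketch of these occupation relations (Python/NumPy; the LCAO matrix is random and orthonormalized by QR, so all numbers are hypothetical) transforms closed-shell MO occupations into effective AO occupations via the density matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
nAO, nocc = 4, 2

# hypothetical orthogonal LCAO matrix C = (C_occ | C_virt) over orthonormal AO
C, _ = np.linalg.qr(rng.standard_normal((nAO, nAO)))
C_occ = C[:, :nocc]

n_MO = 2.0 * np.ones(nocc)                     # closed-shell MO occupations n_a
gamma_AO = C_occ @ np.diag(n_MO) @ C_occ.T     # density matrix in the AO representation
q_AO = np.diag(gamma_AO)                       # effective AO occupations gamma_ii
N = q_AO.sum()                                 # trace conserved: sum_i gamma_ii = N
```

The trace of the density matrix is invariant under the MO-to-AO transformation, so the effective AO occupations γ_{i,i} still sum to the total electron number N, here 4.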
Transforming these matrices into the corresponding (non-canonical) representations of the other set of elementary states gives the square matrices γ^AO and γ^{0,MO}. Their diagonal elements γ_{i,i} and γ⁰_{a,a} then reflect the effective AO occupations in the molecule and the effective MO occupations in the promolecule, respectively.
The corresponding local descriptors of the molecular electronic structure thus involve the elementary MO densities, or the square matrix of AO density products. The associated distributions in the promolecular reference are similarly resolved in terms of the elementary AO densities, or the matrix of MO products, where the promolecular density matrix in the MO representation is defined in Eq. (48). Therefore, in the SCF MO approach the MO and AO bases give rise to the additive partitioning of the molecular and promolecular densities, respectively. Both these reference states can subsequently be used to define the additive components of density functionals, in order to extract the associated nonadditive, truly bonding contributions ("Electron densities as information carriers" section). When purely molecular displacement quantities are of interest, e.g., in the ELF problem, one uses the additive MO partitioning of the molecular density [Eq. (49)], while probing chemical bonds via the CG criterion calls for the promolecular reference, and hence the additive AO division of the promolecular density [Eq. (51)]. Indeed, in intuitive chemical considerations the bonding phenomena reflect displacements in the electron distribution relative to the initial state of the system promolecule, e.g., in the density-difference function Δρ(r) [Eq. (46)]. The AO basis thus provides a proper framework to describe and ultimately understand the bonding mechanism.
Clearly, dividing the quantities of Eqs. (49)–(52) by N gives rise to the corresponding partitions of the associated probability distributions (shape factors of the corresponding densities), p(r) = ρ(r)/N and p⁰(r) = ρ⁰(r)/N. As already remarked above (see also the "Electron densities as information carriers" section), these orbital pieces define the associated division of the corresponding density/probability functionals into their additive and nonadditive contributions (Nalewajski 2006a, 2008a, 2010a, b, 2012a; Nalewajski et al. 2005, 2010, 2012a), in the spirit of earlier such partitionings (Gordon and Kim 1972; Wesołowski 2004a, b). For example, a given functional of the promolecular density, F[ρ⁰] ≡ F^total[ρ^AO], exhibits in this AO resolution the additive (promolecular) component F^add[ρ^AO] = Σᵢ F[ρᵢ], which subsequently implies the functional nonadditive part relative to this reference, F^nadd[ρ^AO] = F^total[ρ^AO] − F^add[ρ^AO]. In the ELF problem the molecular reference calls for the analogous MO resolution. Therefore, each additive resolution of the electronic distribution determines its unique set of nonadditivities. Hence, the nonadditive characteristics in the MO resolution, related to the molecular reference, differ from those in the AO resolution, corresponding to the promolecular reference.
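The additive/nonadditive partition can be made concrete using the Fisher-information functional, I[ρ] = ∫(∇ρ)²/ρ dr, as an example functional on hypothetical one-dimensional Gaussian "AO" densities (Python/NumPy sketch; the densities and their separation are illustrative only):

```python
import numpy as np

x = np.linspace(-10.0, 10.0, 8001)
dx = x[1] - x[0]

def fisher(rho):
    """Fisher-information functional I[rho] = integral (rho')^2 / rho dx."""
    grad = np.gradient(rho, dx)
    return float(np.sum(grad**2 / rho) * dx)

def gauss(mu, sigma=1.0):
    return np.exp(-(x - mu)**2 / (2.0*sigma**2)) / (sigma*np.sqrt(2.0*np.pi))

rho1, rho2 = gauss(-1.0), gauss(+1.0)    # hypothetical overlapping "AO" pieces
rho = rho1 + rho2                         # additively built total density

F_total = fisher(rho)
F_add = fisher(rho1) + fisher(rho2)       # additive component: sum of pieces
F_nadd = F_total - F_add                  # nonadditive part relative to it
```

Because (∇ρ)²/ρ is jointly convex, the total Fisher information never exceeds the sum over the pieces, so the nonadditive part is negative wherever the densities overlap; this sign behavior is what makes such nonadditive gradient measures useful for locating bonding regions.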
It follows from the preceding equation that ∇Φ measures the scaled density-current per particle, i.e., the electron velocity V = j/p.
The following question now arises: can one introduce a related phase-flow concept, associated with the other degree-of-freedom of quantum states? By symmetry, it can be expected to be related to the gradient of the modulus part R(r) = [p(r)]^{1/2} of the wave function. The quantum-generalized Fisher measure [Eqs. (31, 32)] combines the classical contribution, due to the probability density p(r) or its amplitude A(r), with the quantum correction involving both p(r) and the current j(r). This quantum gradient measure exhibits the full time dependence derived elsewhere (Nalewajski 2012a), in which the force acting on the particle reads F(r) = −∇v(r); that derivation uses the probability continuity, dp(r)/dt ≡ ṗ(r) = σ_p(r) = ∂p(r)/∂t + ∇·j(r) = 0. Therefore, only the nonclassical Fisher information generates a nonvanishing source σ_I of this gradient information measure (Nalewajski 2012a). The local production of this quantum measure of the Fisher information is thus proportional to the scalar product of the local force and flow vectors, in perfect analogy to the familiar expression for the entropy source in irreversible thermodynamics (Callen 1960), given by the sum of products of the corresponding rates of change (or flows) of extensive quantities (thermodynamic fluxes) and the conjugate gradients or differences of intensive "forces" (thermodynamic affinities). Notice that in a nondegenerate ground state, for which the time-dependent phase is position independent, the current and its information contribution identically vanish, so that the current correction I[j] to the gradient (Fisher-type) measure of information becomes important in degenerate stationary states and in general nonstationary quantum states of molecular systems.
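The split of the gradient (Fisher-type) measure into its classical part, due to p alone, and the current correction I[j] can be checked numerically. The sketch below (Python/NumPy, atomic units ℏ = m = 1) assumes the standard decomposition I[ψ] = 4∫|∇ψ|² = I[p] + 4∫j²/p, evaluated for a hypothetical Gaussian amplitude carrying a linear phase:

```python
import numpy as np

# atomic units: hbar = m = 1
x = np.linspace(-10.0, 10.0, 8001)
dx = x[1] - x[0]

sigma, k = 1.0, 0.7                  # hypothetical amplitude width and phase gradient
R = (1.0/(np.pi*sigma**2))**0.25 * np.exp(-x**2/(2.0*sigma**2))  # modulus, int R^2 = 1
phi = k * x                           # linear phase Phi(r)
psi = R * np.exp(1j*phi)

p = R**2                              # probability density
j = p * np.gradient(phi, dx)          # current j = (hbar/m) p grad(Phi) = p k

I_full = 4.0 * np.sum(np.abs(np.gradient(psi, dx))**2) * dx   # I[psi] = 4 int |grad psi|^2
I_class = np.sum(np.gradient(p, dx)**2 / p) * dx              # classical I[p]
I_curr = 4.0 * np.sum(j**2 / p) * dx                          # current correction I[j]
```

For k = 0 (position-independent phase) the current term vanishes and the quantum measure reduces to the classical Fisher information of the distribution p, consistent with the remark above about nondegenerate ground states.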
We recall that the current concept, a crucial component of the probability continuity in quantum mechanics, emerges in the context of the time dependence of the system wave function, determined by the Schrödinger equation and its complex conjugate: Ĥ(r)ψ(r) = iℏ ∂ψ(r)/∂t and Ĥ(r)ψ*(r) = −iℏ ∂ψ*(r)/∂t. Via straightforward manipulations one then derives Eq. (65) from the "weighted" difference of these two equations. Thus, the local change in the probability density (l.h.s.) is solely due to the probability outflow (r.h.s.), measured by the divergence of the probability-current density. This probability-continuity equation expresses the local balance of the electron redistributions. It signifies the source-less probability redistribution in molecular systems, with the vanishing total time derivative expressing the time rate of change of the particle density in an infinitesimal "monitoring" volume element flowing with the particle. It also follows from the preceding equation that the overall probability (wave-function normalization) is conserved in time, while the local time derivative of the modulus factor R reads: ∂R(r)/∂t = −[2R(r)]⁻¹ ∇·j(r) = −(ℏ/m){∇R(r)·∇Φ(r) + [R(r)/2] ΔΦ(r)}. The "weighted" sum of the Schrödinger equations (67) similarly reads ψ*(r)[∂ψ(r)/∂t] − ψ(r)[∂ψ*(r)/∂t] = 2ip(r)[∂Φ(r)/∂t] = (iℏ/m){R(r)ΔR(r) − [R(r)∇Φ(r)]²} − 2ip(r)v(r)/ℏ, (71) giving rise to the explicit time derivative of the wave-function phase: ∂Φ(r)/∂t = (ℏ/2m){R⁻¹(r)ΔR(r) − [∇Φ(r)]²} − v(r)/ℏ, (72) in the last term including the external-potential contribution. It follows from Eqs. (70) and (72) that the Schrödinger equation gives rise to a coupled dynamics of these two components of the complex wave function. Notice, however, that only the phase function explicitly depends on the system external potential.
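Once the current j = (ℏ/m)R²∇Φ is accepted, the relation ∂R/∂t = −(2R)⁻¹∇·j = −(ℏ/m){∇R·∇Φ + (R/2)ΔΦ} is a pure differential identity. The following symbolic check (Python/SymPy, one-dimensional, with arbitrary functions R(x) and Φ(x)) confirms it term by term:

```python
import sympy as sp

x, hbar, m = sp.symbols('x hbar m', positive=True)
R = sp.Function('R')(x)
Phi = sp.Function('Phi')(x)

# 1D probability current j = (hbar/m) R^2 Phi'
j = (hbar/m) * R**2 * sp.diff(Phi, x)

lhs = -sp.diff(j, x) / (2*R)                                            # -(2R)^-1 div j
rhs = -(hbar/m) * (sp.diff(R, x)*sp.diff(Phi, x) + R*sp.diff(Phi, x, 2)/2)

difference = sp.simplify(lhs - rhs)    # vanishes identically
```

The identity holds for any amplitude and phase, i.e., independently of the external potential, which enters the dynamics only through the phase equation (72).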
Our goal now is to express this phase derivative as a combination of the divergence of a related phase current and the accompanying phase-source term in the associated phase-continuity equation.
It should also be recalled that a continuity balance expresses some basic conservation law. In order to similarly introduce the phase continuity, one can associate with the particle probability distribution p(r) the following expectation (average) value of the system phase, ⟨Φ⟩ = ∫ u(r) dr = ∫ p(r) Φ(r) dr, with u(r) = p(r)Φ(r) standing for the particle phase density at r and the phase factor Φ(r) representing the associated density per electron. In a search for the source σ_u(r) of the phase distribution u(r) one again calculates the total time derivative (for the moving monitoring volume element), du(r)/dt ≡ σ_u(r) = ∂u(r)/∂t + ∇·J_u(r), with the partial derivative representing the local change in the fixed monitoring volume around r and the divergence term representing the phase outflow, due to the associated current J_u. Since the source σ_p vanishes [Eq. (65)] and u(r) depends upon both the probability density p(r) and the local phase Φ(r), the continuity equation (75) is fully determined by the associated balance equation for the phase factor itself. A proper identification of the phase current J_Φ and the associated source σ_Φ, both conforming to the time derivative of Eq. (72), remains our main goal. It should be emphasized that this problem is not unique: adopting a specific form of the phase current implies the conjugate form of the source. Since the speed V = j/p of the probability flow reflects the gradient of the complementary phase factor Φ, one would expect the phase current to involve the gradient of the amplitude R alone, or the gradients of both these degrees of freedom of the quantum state ψ.
One first observes the following derivative identities involving the first two terms on the r.h.s. of Eq. (72): ∇·(Φ∇Φ) = (∇Φ)² + ΦΔΦ and ∇·(∇ ln R) = ∇·(R⁻¹∇R) = R⁻¹ΔR − (R⁻¹∇R)². Therefore, introducing the phase current J_Φ involving the gradients of both components gives the corresponding source in the continuity equation (76). Similarly, linking the current solely to the gradient of the amplitude factor yields σ′_Φ = (ℏ/2m)[(∇ ln R)² − (∇Φ)²] − v/ℏ. In a stationary state, say the ground state of a molecule, ψ₀(r) = R₀(r) exp[iΦ₀(t)] = R₀(r) exp(−iE₀t/ℏ) ≡ R₀(r) exp(−iω₀t), (81) corresponding to the sharply specified energy E = E₀, the local phase is equalized throughout the whole physical space, Φ₀(t) = −ω₀t, and hence ∇Φ₀ = 0 and ⟨Φ⟩ = Φ₀(t). In such states both definitions of the phase current and of the associated source become identical and time independent. It should finally be observed that this phase equalization in the stationary quantum state is related to the associated equalization of the local energy, e(r) ≡ ψ(r)⁻¹ Ĥ(r) ψ(r) or e*(r) = ψ*(r)⁻¹ Ĥ(r) ψ*(r), (84) at the ground-state level: e(r) = e*(r) = E₀.
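The equalization of the local energy e(r) in a stationary state can be verified on a simple model. The sketch below (Python/NumPy) uses the harmonic-oscillator ground state in scaled units ℏ = m = ω = 1, as a hypothetical stand-in for a molecular ground state; there e(x) = ψ⁻¹Ĥψ should equal E₀ = 1/2 at every interior grid point:

```python
import numpy as np

# scaled units: hbar = m = omega = 1, so v(x) = x^2/2 and E_0 = 1/2
x = np.linspace(-3.0, 3.0, 1201)
dx = x[1] - x[0]
psi = np.exp(-x**2 / 2.0)            # (unnormalized) oscillator ground state

# local energy e(x) = psi(x)^-1 H psi(x), with H = -(1/2) d^2/dx^2 + v(x)
d2psi = np.gradient(np.gradient(psi, dx), dx)
e_local = (-0.5*d2psi + 0.5*x**2*psi) / psi
```

Up to the finite-difference error (largest near the grid edges, where one-sided stencils are used), e(x) is spatially equalized at the eigenvalue E₀, mirroring the phase equalization Φ₀(t) = −ω₀t of the stationary state.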