2. Information-Theoretic Approach
Information theory, developed by Shannon and others as a branch of applied mathematics, electrical engineering, and computer science, is concerned with the quantification of information, which is often encoded in a probability distribution function. Since the electron density is a continuous probability distribution function, information theory has been applied to DFT to study atoms and molecules since as early as 1980. We call this category of work the information-theoretic approach (ITA).
Three different representations are used in ITA:
Electron density representation
Shape function representation
Atoms-in-molecules representation
2.1. Electron Density Representation
Using the electron density \(\rho(\mathbf{r})\) as the probability distribution function in information theory, one obtains the first ITA representation. A key measure of information is entropy, which quantifies the uncertainty involved in predicting the value of the distribution function. The Shannon entropy is the first such measure widely used in the literature, and it reads

\[ S_{\text{S}} = -\int \rho(\mathbf{r}) \ln \rho(\mathbf{r})\, d\mathbf{r} \equiv \int s_{\text{S}}(\mathbf{r})\, d\mathbf{r}, \tag{1} \]

where \(s_{\text{S}}(\mathbf{r})\) is the Shannon entropy density and \(\rho(\mathbf{r})\) is the total electron density, satisfying the following condition in relation to the total number of electrons, N, of the system,

\[ \int \rho(\mathbf{r})\, d\mathbf{r} = N. \tag{2} \]
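As a quick numerical check, the following Python sketch evaluates Eqs. (1) and (2) on a radial grid for the hydrogen 1s density; the grid and the model density are illustrative assumptions of the sketch, not part of the formalism.

```python
import numpy as np

# A minimal sketch (not from the text): the Shannon entropy of Eq. (1) and the
# normalization of Eq. (2), evaluated on an assumed radial grid for the
# hydrogen 1s density rho(r) = exp(-2r)/pi in atomic units, where the analytic
# answers are N = 1 and S_S = ln(pi) + 3.

r = np.linspace(1e-6, 20.0, 20001)      # radial grid (bohr); an assumption
dr = r[1] - r[0]
rho = np.exp(-2.0 * r) / np.pi          # hydrogen 1s density

def radial_integral(f):
    """4*pi * Integral f(r) r^2 dr for a spherically symmetric integrand."""
    return 4.0 * np.pi * np.sum(f * r**2) * dr

N = radial_integral(rho)                # Eq. (2)
s_S = -rho * np.log(rho)                # Shannon entropy density
S_S = radial_integral(s_S)              # Eq. (1)

print(f"N   = {N:.6f}  (analytic: 1)")
print(f"S_S = {S_S:.6f}  (analytic: {np.log(np.pi) + 3:.6f})")
```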
The Shannon entropy measures the spatial delocalization of the electron density. The second important measure in information theory is the Fisher information \(I_{\text{F}}\), defined as follows,

\[ I_{\text{F}} = \int \frac{|\nabla \rho(\mathbf{r})|^2}{\rho(\mathbf{r})}\, d\mathbf{r} \equiv \int i_{\text{F}}(\mathbf{r})\, d\mathbf{r}, \tag{3} \]

which is a gauge of the sharpness or concentration of the electron density distribution. In Eq. (3), \(i_{\text{F}}(\mathbf{r})\) is the Fisher information density and \(\nabla \rho(\mathbf{r})\) is the density gradient. Earlier, we have proved that there is an equivalent expression for the Fisher information in terms of the Laplacian of the electron density, \(\nabla^2 \rho(\mathbf{r})\),

\[ I^{\prime}_{\text{F}} = -\int \nabla^2 \rho(\mathbf{r}) \ln \rho(\mathbf{r})\, d\mathbf{r} \equiv \int i^{\prime}_{\text{F}}(\mathbf{r})\, d\mathbf{r}. \tag{4} \]
Equations (3) and (4) are equal in the sense that one can be derived from the other by partial integration, so the two integrals have the same value. As has been shown, however, the local behaviors of the two integrands, \(i_{\text{F}}(\mathbf{r})\) and \(i^{\prime}_{\text{F}}(\mathbf{r})\), are markedly different. More importantly, we have proved the existence of the following rigorous relationship among the three quantities \(s_{\text{S}}(\mathbf{r})\), \(i_{\text{F}}(\mathbf{r})\), and \(i^{\prime}_{\text{F}}(\mathbf{r})\),

\[ i^{\prime}_{\text{F}}(\mathbf{r}) = i_{\text{F}}(\mathbf{r}) + \nabla^2 \rho(\mathbf{r}) + \nabla^2 s_{\text{S}}(\mathbf{r}), \tag{5} \]
whose validity has subsequently been verified by numerical results.
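The sketch below illustrates this numerically for the hydrogen 1s density: the two integrals of Eqs. (3) and (4) agree while their integrands differ locally, and the identity of Eq. (5) holds pointwise. The grid, the model density, and the finite-difference derivatives are assumptions of the sketch.

```python
import numpy as np

# A minimal numerical sketch (assumed radial grid, hydrogen 1s density in
# atomic units): it evaluates the two Fisher information forms, Eqs. (3) and
# (4), which should both give 4 for this density, and checks the local
# identity of Eq. (5). Derivatives are finite differences via np.gradient.

r = np.linspace(1e-3, 20.0, 40001)
dr = r[1] - r[0]
rho = np.exp(-2.0 * r) / np.pi                  # 1s density, N = 1

drho = np.gradient(rho, dr)                     # d(rho)/dr
lap_rho = np.gradient(r**2 * drho, dr) / r**2   # radial Laplacian of rho

i_F = drho**2 / rho                             # integrand of Eq. (3)
i_Fp = -lap_rho * np.log(rho)                   # integrand of Eq. (4)

def radial_integral(f):
    """4*pi * Integral f(r) r^2 dr for a spherically symmetric integrand."""
    return 4.0 * np.pi * np.sum(f * r**2) * dr

print(f"I_F  = {radial_integral(i_F):.4f}  (analytic: 4)")
print(f"I'_F = {radial_integral(i_Fp):.4f}  (analytic: 4)")

# Local check of Eq. (5): i'_F = i_F + lap(rho) + lap(s_S). The region near
# r = 0 is excluded because the 1/r^2 factor amplifies finite-difference noise.
s_S = -rho * np.log(rho)
lap_s = np.gradient(r**2 * np.gradient(s_S, dr), dr) / r**2
mask = r > 0.5
residual = (i_Fp - (i_F + lap_rho + lap_s))[mask]
print(f"max |Eq. (5) residual| = {np.abs(residual).max():.2e}")
```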
The third quantity in the same spirit is the Ghosh-Berkowitz-Parr (GBP) entropy,

\[ S_{\text{GBP}} = \int \frac{3}{2} k \rho(\mathbf{r}) \left[ c + \ln \frac{t(\mathbf{r}, \rho)}{t_{\text{TF}}(\mathbf{r}, \rho)} \right] d\mathbf{r}, \tag{6} \]
where \(t(\mathbf{r}, \rho)\) is the kinetic energy density, which is related to the total kinetic energy \(T_s\) via

\[ T_s = \int t(\mathbf{r}, \rho)\, d\mathbf{r}, \tag{7} \]
and \(t_{\text{TF}}(\mathbf{r}, \rho)\) is the Thomas-Fermi kinetic energy density,

\[ t_{\text{TF}}(\mathbf{r}, \rho) = c_K \rho^{5/3}(\mathbf{r}), \tag{8} \]
with k the Boltzmann constant, \(c = (5/3) + \ln(4\pi c_K/3)\), and \(c_K = (3/10)(3\pi^2)^{2/3}\). The GBP entropy originates from the effort to transcribe ground-state density functional theory into a local thermodynamics through the phase-space distribution function \(f(\mathbf{r}, \mathbf{p})\), which takes both the electron position \(\mathbf{r}\) and momentum \(\mathbf{p}\) as its basic variables. The conditions for such a recast of DFT into thermodynamics are that the phase-space distribution function be associated with the ground-state electron density \(\rho(\mathbf{r})\) and kinetic energy density \(t(\mathbf{r}, \rho)\) through the following relationships,

\[ \rho(\mathbf{r}) = \int f(\mathbf{r}, \mathbf{p})\, d\mathbf{p} \tag{9} \]

and

\[ t(\mathbf{r}, \rho) = \int f(\mathbf{r}, \mathbf{p})\, \frac{p^2}{2}\, d\mathbf{p}. \tag{10} \]
The specific form of the local kinetic energy \(t(\mathbf{r}, \rho)\) used is the following,

\[ t(\mathbf{r}, \rho) = \sum_i \frac{1}{8} \frac{\nabla \rho_i \cdot \nabla \rho_i}{\rho_i} - \frac{1}{8} \nabla^2 \rho(\mathbf{r}), \tag{11} \]

where \(\rho_i\) is the electron density of the i-th occupied orbital.
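The following sketch puts Eqs. (6)-(8) and (11) together for the hydrogen 1s density, for which Eq. (11) reduces analytically to \(t(\mathbf{r}) = \rho/(2r)\); the grid and the model density are again illustrative assumptions, with k = 1.0 as stated in the Notes below.

```python
import numpy as np

# A minimal sketch of the GBP entropy, Eq. (6), on an assumed radial grid for
# the hydrogen 1s density. For a single 1s orbital the kinetic energy density
# of Eq. (11) reduces analytically to t(r) = rho/(2r), which integrates to
# T_s = 0.5 hartree (Eq. (7)). k = 1.0 follows the Notes below.

r = np.linspace(1e-3, 25.0, 25001)
dr = r[1] - r[0]
rho = np.exp(-2.0 * r) / np.pi

k = 1.0                                          # Boltzmann constant (see Notes)
c_K = 0.3 * (3.0 * np.pi**2) ** (2.0 / 3.0)      # Thomas-Fermi constant
c = 5.0 / 3.0 + np.log(4.0 * np.pi * c_K / 3.0)

t = rho / (2.0 * r)                              # Eq. (11) for one 1s orbital
t_TF = c_K * rho ** (5.0 / 3.0)                  # Eq. (8)

def radial_integral(f):
    return 4.0 * np.pi * np.sum(f * r**2) * dr

T_s = radial_integral(t)                                         # Eq. (7)
S_GBP = radial_integral(1.5 * k * rho * (c + np.log(t / t_TF)))  # Eq. (6)

print(f"T_s   = {T_s:.4f} hartree  (analytic: 0.5)")
print(f"S_GBP = {S_GBP:.4f}")
```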
Very recently, three information-theoretic quantities, the Renyi entropy, the Tsallis entropy, and the Onicescu information energy, have been introduced as new reactivity descriptors in DFRT. The Renyi entropy of order n, where n > 0 and \(n \neq 1\), is defined as

\[ R_n = \frac{1}{1-n} \ln\left[\int \rho^n(\mathbf{r})\, d\mathbf{r}\right]. \tag{12} \]
As n approaches 1, the Renyi entropy, Eq. (12), reduces to the Shannon entropy, Eq. (1). The Tsallis entropy of order n is defined as follows,

\[ T_n = \frac{1}{n-1} \left[1 - \int \rho^n(\mathbf{r})\, d\mathbf{r}\right]. \tag{13} \]
It is a generalization of the standard Boltzmann-Gibbs entropy. The common term in Eqs. (12) and (13) is the integral of the n-th power of the electron density, which is called the Onicescu information energy of order n,

\[ E_n = \frac{1}{n-1} \int \rho^n(\mathbf{r})\, d\mathbf{r}. \tag{14} \]
Onicescu introduced this quantity in an attempt to define a finer measure of the dispersion of a distribution than that given by the Shannon entropy.
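Since all three quantities share the moment \(\int \rho^n(\mathbf{r})\, d\mathbf{r}\), they can be computed together, as the following illustrative sketch does for n = 2 with the hydrogen 1s density (an assumption of the sketch).

```python
import numpy as np

# A minimal sketch of the Renyi, Tsallis, and Onicescu quantities of order n,
# Eqs. (12)-(14), built from the shared moment omega_n = Integral rho^n dr.
# The assumed density is the hydrogen 1s one, for which omega_n has the
# closed form 1 / (n^3 * pi^(n-1)).

r = np.linspace(1e-3, 20.0, 20001)
dr = r[1] - r[0]
rho = np.exp(-2.0 * r) / np.pi

def omega(n):
    """n-th power moment of the density, Integral rho^n dr."""
    return 4.0 * np.pi * np.sum(rho**n * r**2) * dr

n = 2
w = omega(n)
R_n = np.log(w) / (1.0 - n)      # Renyi entropy, Eq. (12)
T_n = (1.0 - w) / (n - 1.0)      # Tsallis entropy, Eq. (13)
E_n = w / (n - 1.0)              # Onicescu information energy, Eq. (14)

print(f"omega_2 = {w:.6f}  (analytic: {1.0 / (8.0 * np.pi):.6f})")
print(f"R_2 = {R_n:.4f}, T_2 = {T_n:.4f}, E_2 = {E_n:.4f}")
```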
Closely related to the concept of entropy in information theory is the relative entropy, a non-symmetric measure of the difference between two probability distribution functions. Well-known examples in the literature are the relative Shannon entropy, also called the information gain, Kullback-Leibler divergence, or information divergence, defined by

\[ S_{\text{rel}} = \int \rho(\mathbf{r}) \ln \frac{\rho(\mathbf{r})}{\rho_0(\mathbf{r})}\, d\mathbf{r}, \tag{15} \]
and the relative Renyi entropy of order n,

\[ R_n^{\text{rel}} = \frac{1}{n-1} \ln\left[\int \frac{\rho^n(\mathbf{r})}{\rho_0^{n-1}(\mathbf{r})}\, d\mathbf{r}\right], \tag{16} \]
where \(\rho_0(\mathbf{r})\) is the reference-state density, satisfying the same normalization condition as \(\rho(\mathbf{r})\). The reference density can come from the same molecule in a different conformation, or from the reactant of a chemical reaction when the transition state is investigated.
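The sketch below evaluates Eq. (15) for an invented pair of hydrogen-like 1s densities with effective charges Z = 1.0 (target) and Z0 = 1.2 (reference), both normalized to one electron; the closed-form answer for this pair makes the numerical value easy to check.

```python
import numpy as np

# A minimal sketch of the information gain of Eq. (15). The target and
# reference densities are hydrogen-like 1s densities with effective charges
# Z = 1.0 and Z0 = 1.2 -- an invented pair for illustration, both normalized
# to N = 1, for which the analytic divergence is 3*(Z0 - 1) - 3*ln(Z0).

r = np.linspace(1e-3, 20.0, 20001)
dr = r[1] - r[0]

def rho_1s(Z):
    """Normalized hydrogen-like 1s density with effective charge Z."""
    return Z**3 * np.exp(-2.0 * Z * r) / np.pi

rho, rho0 = rho_1s(1.0), rho_1s(1.2)

S_rel = 4.0 * np.pi * np.sum(rho * np.log(rho / rho0) * r**2) * dr

analytic = 3.0 * (1.2 - 1.0) - 3.0 * np.log(1.2)
print(f"S_rel = {S_rel:.5f}  (analytic: {analytic:.5f})")
```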
Notes
k is the Boltzmann constant, by default 1.0 for convenience.
2.2. Shape Function Representation
Information-theoretic quantities defined in Eqs. (1)-(16) employ the electron density as the probability distribution function. There is another distribution function in DFRT, the shape function \(\sigma(\mathbf{r})\), which is related to the electron density \(\rho(\mathbf{r})\) and the total number of electrons N through the following relationship,

\[ \sigma(\mathbf{r}) = \frac{\rho(\mathbf{r})}{N}, \tag{17} \]
with the following normalization condition,

\[ \int \sigma(\mathbf{r})\, d\mathbf{r} = 1. \tag{18} \]
Information-theoretic quantities defined in Eqs. (1)-(16) can similarly be redefined with the shape function, yielding, for example,

\[ S_{\text{S}}^{\sigma} = -\int \sigma(\mathbf{r}) \ln \sigma(\mathbf{r})\, d\mathbf{r}, \tag{19} \]

\[ I_{\text{F}}^{\sigma} = \int \frac{|\nabla \sigma(\mathbf{r})|^2}{\sigma(\mathbf{r})}\, d\mathbf{r}, \tag{20} \]

and

\[ R_n^{\sigma} = \frac{1}{1-n} \ln\left[\int \sigma^n(\mathbf{r})\, d\mathbf{r}\right]. \tag{21} \]
Because of Eq. (17), quantities in these two representations are correlated, except for the GBP entropy, for which no analytical relationship between the two representations exists. As can readily be proved, we have

\[ S_{\text{S}} = N S_{\text{S}}^{\sigma} - N \ln N \tag{22} \]
and

\[ I_{\text{F}} = N I_{\text{F}}^{\sigma}. \tag{23} \]
These rigorous relationships between the two representations of information-theoretic quantities enable us to obtain them interchangeably from one representation to the other.
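The following sketch verifies Eqs. (22) and (23) numerically for an assumed 1s-shaped \(\sigma(\mathbf{r})\) and N = 2, computing each quantity independently in both representations.

```python
import numpy as np

# A minimal sketch verifying the cross-representation identities, Eqs. (22)
# and (23). The shape function is an assumed normalized 1s profile and N = 2,
# so rho = N * sigma per Eq. (17); each quantity is computed independently in
# both representations and compared.

r = np.linspace(1e-3, 20.0, 20001)
dr = r[1] - r[0]
sigma = np.exp(-2.0 * r) / np.pi          # shape function, integrates to 1
N = 2.0
rho = N * sigma                           # Eq. (17)

def radial_integral(f):
    return 4.0 * np.pi * np.sum(f * r**2) * dr

# Electron density representation
S_S = radial_integral(-rho * np.log(rho))
I_F = radial_integral(np.gradient(rho, dr)**2 / rho)

# Shape function representation
S_S_sig = radial_integral(-sigma * np.log(sigma))
I_F_sig = radial_integral(np.gradient(sigma, dr)**2 / sigma)

print(f"S_S = {S_S:.5f}  vs  N*S_S^sigma - N*ln(N) = {N*S_S_sig - N*np.log(N):.5f}")
print(f"I_F = {I_F:.5f}  vs  N*I_F^sigma = {N*I_F_sig:.5f}")
```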
2.3. Atoms-in-Molecules Representation
Another important aspect of the information-theoretic approach is to re-evaluate the above quantities from the perspective of atoms in molecules. To obtain the atomic contributions of an information-theoretic quantity in a molecular system, three approaches are available for partitioning a molecule into atoms: Becke's fuzzy atom approach, Bader's zero-flux AIM approach, and Hirshfeld's stockholder approach.
TODO
Implement the Becke and Bader AIM approaches.
The total electron population N of the system is the sum of the electron populations \(N_A\) of the individual atomic contributions,

\[ N = \sum_A N_A = \sum_A \int \rho_A(\mathbf{r})\, d\mathbf{r}, \tag{24} \]

where, in Hirshfeld's stockholder approach,

\[ \rho_A(\mathbf{r}) = \frac{\rho_A^0(\mathbf{r})}{\sum_B \rho_B^0(\mathbf{r})}\, \rho(\mathbf{r}), \tag{25} \]

and, in Bader's zero-flux approach,

\[ N_A = \int_{\Omega_A} \rho(\mathbf{r})\, d\mathbf{r}, \tag{26} \]
where \(\rho_A(\mathbf{r})\) is the electron density of atom (or group) A in a molecule whose total molecular electron density is \(\rho(\mathbf{r})\), \(\rho_A^0(\mathbf{r})\) is the counterpart of atom (or group) A in the reference state, which can be a neutral atom, an ion, a group, etc., and \(\Omega_A\) is the atomic basin of atom A in the molecule. The counterparts in terms of the shape function can be derived similarly.
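As an illustration of the stockholder partition of Eq. (25), the following sketch uses a 1D toy model with two Gaussian "atoms"; all densities are invented for illustration, and the atomic populations recover the total N of Eq. (24).

```python
import numpy as np

# A minimal sketch of Hirshfeld's stockholder partition, Eq. (25), on a 1D toy
# model: two "atoms" with Gaussian reference densities. All densities here are
# invented for illustration (they are not real atomic densities); the point is
# that the stockholder weights share rho(r) point by point and the atomic
# populations N_A add up to N, Eq. (24).

x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]

def gaussian(center, electrons, width=1.0):
    """A normalized Gaussian carrying `electrons` electrons."""
    return electrons * np.exp(-((x - center) / width) ** 2) / (width * np.sqrt(np.pi))

rho0_A = gaussian(-1.5, 1.0)                      # reference (free) atom A
rho0_B = gaussian(+1.5, 2.0)                      # reference (free) atom B
rho = gaussian(-1.5, 0.8) + gaussian(+1.5, 2.2)   # toy molecular density, N = 3

w_A = rho0_A / (rho0_A + rho0_B)                  # stockholder weight of atom A
rho_A = w_A * rho                                 # Eq. (25): density of atom A
rho_B = (1.0 - w_A) * rho                         # remainder belongs to atom B

N_A = np.sum(rho_A) * dx
N_B = np.sum(rho_B) * dx
print(f"N_A = {N_A:.4f}, N_B = {N_B:.4f}, N = {N_A + N_B:.4f}  (expected 3.0)")
```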