2. ITA

The information-theoretic approach (ITA) quantities

The Shannon entropy \(S_S\) characterizes the spatial delocalization of the electron density, while the Fisher information \(I_F\) measures its sharpness, or degree of localization. They are defined by

\[S_{S} = -\int \rho(\mathbf{r}) \ln\rho(\mathbf{r})\, d\mathbf{r}\]
\[I_{F} = \int \frac{|\nabla\rho(\mathbf{r})|^2}{\rho(\mathbf{r})}\, d\mathbf{r}\]

respectively. The Ghosh–Berkowitz–Parr (GBP) entropy \(S_{GBP}\) is a functional of both the electron density and the kinetic energy density,

\[S_{GBP} = \int \frac{3}{2}k\rho(\mathbf{r})\left[c + \ln\frac{t(\mathbf{r};\rho)}{t_{TF}(\mathbf{r};\rho)} \right] d\mathbf{r}\]

where \(t(\mathbf{r};\rho)\) and \(t_{TF}(\mathbf{r};\rho)\) are the (non-interacting) kinetic-energy density and the Thomas–Fermi (TF) kinetic-energy density, respectively. The former is related to the total kinetic energy \(T_S\) via \(\int t(\mathbf{r};\rho)\, d\mathbf{r} = T_S\), and the latter is defined as \(t_{TF}(\mathbf{r};\rho) = c_K \rho^{5/3}(\mathbf{r})\). Here, \(k\) is the Boltzmann constant, \(c = (5/3) + \ln(4\pi c_K/3)\), and \(c_K = (3/10)(3\pi^2)^{2/3}\). The kinetic-energy density \(t(\mathbf{r};\rho)\) can be computed from the orbital densities as \(t(\mathbf{r};\rho) = \sum_i \frac{1}{8} \frac{\nabla \rho_i \cdot \nabla \rho_i}{\rho_i} - \frac{1}{8} \nabla^2 \rho\). Different forms of the kinetic-energy density are used in different contexts, but within the mathematical framework of Ghosh, Berkowitz, and Parr this is the maximum-entropy choice. In addition, some other ITA quantities have been introduced in conceptual density functional theory (CDFT). One example is the Onicescu information energy of order \(n\),

\[E_n = \frac{1}{n-1}\int \rho^n(\mathbf{r}) d\mathbf{r}\]

the relative Rényi entropy of order \(n\),

\[R_n^r = \frac{1}{n-1}\ln\int \frac{\rho^n(\mathbf{r})}{\rho^{n-1}_0(\mathbf{r})}\, d\mathbf{r}\]

and the information gain (also called the Kullback–Leibler divergence, or relative Shannon entropy) \(I_G\), given in eqn (6):

\[I_G = \int \rho(\mathbf{r}) \ln \frac{\rho(\mathbf{r})}{\rho_0(\mathbf{r})}\, d\mathbf{r}\]

where \(\rho_0(\mathbf{r})\) is the reference-state density satisfying the same normalization condition as \(\rho(\mathbf{r})\).
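As a quick numerical sanity check (an illustrative sketch, not a production implementation), the quantities defined above can be evaluated on a radial grid using normalized Slater-type model densities \(\rho(r) = \zeta^3 e^{-2\zeta r}/\pi\). The exponents \(\zeta = 1.0\) and \(\zeta = 1.2\) are hypothetical choices; every computed value has a known closed form for comparison.

```python
import numpy as np

# Evaluate S_S, I_F, T_S (= integral of t), I_G, E_2, and R_2^r on a
# radial grid using normalized Slater-type model densities
# rho(r) = zeta**3 * exp(-2*zeta*r)/pi; the zeta values are hypothetical.
r = np.linspace(1e-4, 40.0, 400_000)
dr = r[1] - r[0]
w = 4.0 * np.pi * r**2                        # spherical volume element

def integrate(f):                             # trapezoidal rule, uniform grid
    return float(np.sum(0.5 * (f[1:] + f[:-1])) * dr)

def slater(zeta):                             # normalized to one electron
    return zeta**3 * np.exp(-2.0 * zeta * r) / np.pi

rho, rho0 = slater(1.0), slater(1.2)          # rho0 plays the reference density
drho = np.gradient(rho, r)                    # radial derivative d(rho)/dr
lap = np.gradient(drho, r) + 2.0 * drho / r   # spherical Laplacian of rho

S_S = -integrate(w * rho * np.log(rho))       # exact: 3 + ln(pi) ~ 4.1447
I_F = integrate(w * drho**2 / rho)            # exact: 4
t = drho**2 / (8.0 * rho) - lap / 8.0         # kinetic-energy density
T_S = integrate(w * t)                        # exact: 0.5 hartree
I_G = integrate(w * rho * np.log(rho / rho0)) # information gain
E_2 = integrate(w * rho**2)                   # Onicescu, n = 2; exact: 1/(8 pi)
R_2 = np.log(integrate(w * rho**2 / rho0))    # relative Renyi, n = 2
print(S_S, I_F, T_S, I_G, E_2, R_2)
```

For these model densities the exact values are \(S_S = 3 + \ln\pi\), \(I_F = 4\), \(T_S = 1/2\), \(I_G = 3(\zeta-1) - 3\ln\zeta\), and \(E_2 = 1/(8\pi)\), so each line can be verified directly.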

Recently, one of the present authors proposed another three functions, \(G_1\), \(G_2\), and \(G_3\), whose analytical forms are shown below:

\[\begin{split}\begin{aligned} G_1 &= \sum_A \int \nabla^2 \rho_A(\mathbf{r}) \frac{\rho_A(\mathbf{r})}{\rho_A^0(\mathbf{r})} d\mathbf{r}\\ G_2 &= \sum_A \int \rho_A(\mathbf{r})\left[\frac{\nabla^2 \rho_A(\mathbf{r})}{\rho_A(\mathbf{r})}-\frac{\nabla^2 \rho_A^0(\mathbf{r})}{\rho_A^0(\mathbf{r})} \right] d\mathbf{r}\\ G_3 &= \sum_A \int \rho_A(\mathbf{r})\left[\nabla \ln\frac{\rho_A(\mathbf{r})}{\rho_A^0(\mathbf{r})} \right]^2 d\mathbf{r}\\ \end{aligned}\end{split}\]
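To get a feel for these functions, \(G_3\) has a simple closed form for Slater-type model densities: with \(\rho_A \propto e^{-2r}\) and \(\rho_A^0 \propto e^{-2\zeta r}\), the gradient \(\nabla \ln(\rho_A/\rho_A^0)\) is the constant \(2(\zeta-1)\) along \(r\), so \(G_3 = 4(\zeta-1)^2\). A numerical sketch with hypothetical exponents (not the authors' code):

```python
import numpy as np

# G_3 for a single model "atom": rho_A and rho_A^0 are normalized
# Slater-type densities with hypothetical exponents 1.0 and 1.2.
# Analytically grad ln(rho_A/rho_A^0) = 2*(1.2 - 1.0) = 0.4 here,
# so G_3 = 0.4**2 * integral(rho_A) = 0.16.
r = np.linspace(1e-6, 40.0, 400_000)
dr = r[1] - r[0]
w = 4.0 * np.pi * r**2
rho_A = np.exp(-2.0 * r) / np.pi
rho_A0 = 1.2**3 * np.exp(-2.4 * r) / np.pi
grad_ln = np.gradient(np.log(rho_A / rho_A0), r)   # radial derivative
f = w * rho_A * grad_ln**2
G_3 = float(np.sum(0.5 * (f[1:] + f[:-1])) * dr)   # trapezoidal rule
print(G_3)   # ~0.16
```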

The quantifications and applications of eqns (7)–(9) can be found in refs. 44 and 46. Note that during the past decade we have attempted to seamlessly merge density functional theory and information theory, with the electron density serving as the link between the two. The progress and applications can be found in our recent review [59]. In addition, Hirshfeld's stockholder approach [60–64] is often adopted in the literature to partition atoms in a molecule, as defined in eqn (10),

\[\rho_A(\mathbf{r}) = \omega_A(\mathbf{r})\rho(\mathbf{r}) = \frac{\rho_A^0(\mathbf{r}-\mathbf{R}_A)}{\sum_B \rho_B^0(\mathbf{r}-\mathbf{R}_B)}\, \rho(\mathbf{r})\]

Here \(\rho_A(\mathbf{r})\) is the Hirshfeld density of atom A in the molecule, \(\omega_A(\mathbf{r})\) is the sharing function for atom A, \(\rho_B^0(\mathbf{r}-\mathbf{R}_B)\) is the density of free atom B centered at the nuclear position \(\mathbf{R}_B\), and \(\rho(\mathbf{r})\) is the actual molecular density. The sum over all the nuclei-centered free-atom densities, typically spherically averaged ground-state atomic densities, is termed the promolecular density. While the stockholder approach is natural in the context of ITA because it is itself based on information-theoretic arguments, other molecular partitioning schemes, such as Becke's fuzzy-atom approach and Bader's zero-flux atoms-in-molecules (AIM) method, can also be used.
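To make the stockholder idea concrete, here is a minimal sketch (with a hypothetical geometry and Slater-type free-atom model densities) of the sharing function \(\omega_A\) along the axis of a homonuclear diatomic. At the bond midpoint the two identical atoms share the density equally, \(\omega_A = 1/2\):

```python
import numpy as np

# Hirshfeld sharing function omega_A along the bond axis of a model
# homonuclear diatomic: two identical Slater-type free-atom densities
# placed 1.4 bohr apart (hypothetical setup for illustration).
R_A, R_B = np.array([0.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.4])

def rho_free(points, center, zeta=1.0):   # normalized free-atom density
    d = np.linalg.norm(points - center, axis=-1)
    return zeta**3 * np.exp(-2.0 * zeta * d) / np.pi

z = np.linspace(-1.0, 2.4, 7)             # sample points along the bond axis
pts = np.column_stack([np.zeros_like(z), np.zeros_like(z), z])
pro_A, pro_B = rho_free(pts, R_A), rho_free(pts, R_B)
omega_A = pro_A / (pro_A + pro_B)         # stockholder weight of atom A
# The Hirshfeld atomic density would then be rho_A = omega_A * rho_molecule.
for zi, wi in zip(z, omega_A):
    print(f"z = {zi:5.2f}  omega_A = {wi:.3f}")
```

The weight of atom A is near 1 on its own side of the bond, exactly 1/2 at the midpoint (z = 0.7), and decays toward 0 on atom B's side.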

| Name | Symbol | Description |
|---|---|---|
| Shannon entropy | \(S_S\) | Spatial delocalization of the electron density |
| Fisher information | \(I_F\) | Sharpness (localization) of the electron density |
| Ghosh–Berkowitz–Parr entropy | \(S_{GBP}\) | Entropy functional of the electron density and the kinetic-energy density |
| Onicescu information energy of order \(n\) | \(E_n\) | Density moment of order \(n\), eqn (4) |
| Relative Rényi entropy of order \(n\) | \(R_n^r\) | Rényi-type relative entropy with respect to a reference density, eqn (5) |
| Information gain | \(I_G\) | Kullback–Leibler divergence (relative Shannon entropy), eqn (6) |
| \(G_1\), \(G_2\), \(G_3\) functions | \(G_1\), \(G_2\), \(G_3\) | Relative information measures defined in eqns (7)–(9) |
