## Saturday, July 5, 2014

### Thermodynamic Context for Fisher Information Matrices

The following is from Wikipedia on the reparametrization of a Fisher Information matrix:
The Fisher information depends on the parametrization of the problem. If θ and η are two scalar parametrizations of an estimation problem, and θ is a continuously differentiable function of η, then
${\mathcal I}_\eta(\eta) = {\mathcal I}_\theta(\theta(\eta)) \left( \frac{{\mathrm d} \theta}{{\mathrm d} \eta} \right)^2$
where ${\mathcal I}_\eta$ and ${\mathcal I}_\theta$ are the Fisher information measures of η and θ, respectively.[12]

In the vector case, suppose ${\boldsymbol \theta}$ and ${\boldsymbol \eta}$ are k-vectors which parametrize an estimation problem, and suppose that ${\boldsymbol \theta}$ is a continuously differentiable function of ${\boldsymbol \eta}$; then[13]

${\mathcal I}_{\boldsymbol \eta}({\boldsymbol \eta}) = {\boldsymbol J}^{\mathrm T} {\mathcal I}_{\boldsymbol \theta} ({\boldsymbol \theta}({\boldsymbol \eta})) {\boldsymbol J}$
where the (ij)th element of the k × k Jacobian matrix $\boldsymbol J$ is defined by
$J_{ij} = \frac{\partial \theta_i}{\partial \eta_j}\,,$
and where ${\boldsymbol J}^{\mathrm T}$ is the matrix transpose of ${\boldsymbol J}$.
In information geometry, this is seen as a change of coordinates on a Riemannian manifold, and the intrinsic properties of curvature are unchanged under different parametrizations. In general, the Fisher information matrix provides a Riemannian metric (more precisely, the Fisher-Rao metric) for the manifold of thermodynamic states, and can be used as an information-geometric complexity measure for a classification of phase transitions; e.g., the scalar curvature of the thermodynamic metric tensor diverges at (and only at) a phase transition point.[14]

In the thermodynamic context, the Fisher information matrix is directly related to the rate of change in the corresponding order parameters [emphasis mine].[15] In particular, such relations identify second-order phase transitions via divergences of individual elements of the Fisher information matrix.
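The transformation rule above is easy to check numerically. A minimal sketch (not from the quoted article): for a Gaussian $N(\mu, \sigma^2)$ with parameters $\theta = (\mu, \sigma)$, the Fisher information matrix is the known closed form $\mathrm{diag}(1/\sigma^2,\ 2/\sigma^2)$. Reparametrizing to $\eta = (\mu, \log\sigma)$, the Jacobian $J_{ij} = \partial\theta_i/\partial\eta_j$ is $\mathrm{diag}(1, \sigma)$, and $J^{\mathrm T} {\mathcal I}_\theta J$ should reproduce the known Fisher information $\mathrm{diag}(1/\sigma^2,\ 2)$ for the $(\mu, \log\sigma)$ parametrization:

```python
import numpy as np

# Gaussian N(mu, sigma^2) with theta = (mu, sigma).
# Closed-form Fisher information: I_theta = diag(1/sigma^2, 2/sigma^2).
sigma = 1.7
I_theta = np.diag([1.0 / sigma**2, 2.0 / sigma**2])

# Reparametrize eta = (mu, log sigma), so theta(eta) = (eta_1, exp(eta_2)).
# Jacobian J_ij = d theta_i / d eta_j:
#   d mu / d mu = 1,  d sigma / d(log sigma) = sigma
J = np.diag([1.0, sigma])

# Transformation rule: I_eta = J^T I_theta J
I_eta = J.T @ I_theta @ J

# Known closed form for the (mu, log sigma) parametrization: diag(1/sigma^2, 2)
expected = np.diag([1.0 / sigma**2, 2.0])
assert np.allclose(I_eta, expected)
print(I_eta)
```

Note that the information in the location parameter μ is unchanged (the map fixes μ), while the $2/\sigma^2$ entry becomes a constant 2 under the log transform, which is why $\log\sigma$ is the "flat" coordinate for scale.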
I wonder how this could be extended to the dispersion of mutations through configuration space.

Some possible sources for further inquiry:

Information Geometry: From Black Holes to Condensed Matter Systems, an editorial by Tapobrata Sarkar, Hernando Quevedo, and Rong-Gen Cai.

And for black hole thermodynamics look at Information geometries for black hole physics by Narit Pidokrajt, as well as Information Geometric Approach to Black Hole Thermodynamics by the same author.

The figure below is from Frederic Barbaresco's "Eidetic Reduction of Information Geometry" in Geometric Theory of Information by Frank Nielsen.