Abstract:
We present a method to generate probability distributions that correspond to metrics obeying partial differential equations obtained by extremizing a functional $J[g^{\mu\nu}(\theta^i)]$, where $g^{\mu\nu}(\theta^i)$ is the Fisher metric. We postulate that this functional of the dynamical variable $g^{\mu\nu}(\theta^i)$ is stationary with respect to small variations of these variables. Our approach enables a dynamical treatment of the Fisher information metric and allows symmetries to be imposed on a statistical system in a systematic way. This work is mainly motivated by the entropy approach to nonmonotonic reasoning.

Abstract:
We show a general relation between the spatially disjoint product of probability density functions and the sum of their Fisher information metric tensors. We then utilise this result to give a method for constructing the probability density functions for an arbitrary Riemannian Fisher information metric tensor. We note further that this construction is extremely unconstrained, depending only on certain continuity properties of the probability density functions and a select symmetry of their domains.
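The central relation above — that the Fisher information of a product of independent densities is the sum of the individual Fisher informations — can be checked numerically. The sketch below is illustrative only and is not taken from the paper; it uses Gaussian location families, for which the exact Fisher information is $1/\sigma^2$, and verifies additivity by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def fisher_info_location(sigma, theta=0.0):
    """Monte Carlo estimate of the Fisher information of the Gaussian
    location family N(theta, sigma^2): I(theta) = E[(d/dtheta log p)^2].
    The exact value is 1/sigma^2."""
    x = rng.normal(theta, sigma, size=n)
    score = (x - theta) / sigma**2   # d/dtheta log p(x | theta)
    return np.mean(score**2)

i1 = fisher_info_location(1.0)   # exact value: 1.0
i2 = fisher_info_location(2.0)   # exact value: 0.25

# For the product density p1(x|theta) * p2(y|theta) the log splits into a
# sum, so the scores add -- and hence the Fisher informations add.
x = rng.normal(0.0, 1.0, size=n)
y = rng.normal(0.0, 2.0, size=n)
score_prod = x / 1.0**2 + y / 2.0**2
i_prod = np.mean(score_prod**2)  # should be close to i1 + i2 = 1.25
```

The same additivity holds componentwise for the full Fisher information metric tensor of a multi-parameter family, which is what the construction in the abstract exploits.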

Abstract:
We study the interrelationships between the Fisher information metric recently introduced, on the basis of maximum entropy considerations, by Brody and Hughston (quant-ph/9906085) and the monotone metrics, as explicated by Petz and Sudar. This new metric turns out to be not strictly monotone in nature, and to yield (via its normalized volume element) a prior probability distribution over the Bloch ball of two-level quantum systems that is less noninformative than those obtained from any of the monotone metrics, even the minimal monotone (Bures) metric. We best approximate the additional information contained in the Brody-Hughston prior over that contained in the Bures prior by constructing a certain Bures posterior probability distribution. This is proportional to the product of the Bures prior and a likelihood function based on four pairs of spin measurements oriented along the diagonal axes of an inscribed cube.

Abstract:
We derive the Einstein tensor from the Fisher information metric that is defined by the probability distribution of a statistical mechanical system. We find that the tensor naturally contains essential information of the energy-momentum tensor of a classical scalar field, when the entropy data or the spectrum data of the system are embedded into the classical field as the field strength. Thus, we can regard the Einstein equation as the equation of coarse-grained states for the original microscopic system behind the classical field theory. We make some remarks on quantization of gravity and various quantum-classical correspondences.

Abstract:
On a closed manifold of dimension greater than one, every smooth weak Riemannian metric on the space of smooth positive probability densities, that is invariant under the action of the diffeomorphism group, is a multiple of the Fisher--Rao metric.

Abstract:
We show that, in finite dimensions, the only monotone metrics for which the (+1) and (-1) affine connections are mutually dual are constant multiples of the Bogoliubov-Kubo-Mori metric.

Abstract:
Chentsov studied Riemannian metrics on the set of probability measures from the point of view of decision theory. He proved that, up to a constant factor, the Fisher information is the only metric which is monotone under stochastic transformations. The present paper deals with monotone metrics on the space of finite density matrices, motivated by quantum mechanics. A characterization of those metrics is given in terms of operator monotone functions. Several concrete metrics are constructed and analyzed; in particular, instead of the uniqueness in the probabilistic case, there is a large class of monotone metrics. Some of these appeared in the physics literature long ago. A limiting procedure to pure states is discussed as well.

Abstract:
The origins of Fisher information are in its use as a performance measure for parametric estimation. We augment this and show that the Fisher information can characterize the performance in several other significant signal processing operations. For processing of a weak signal in additive white noise, we demonstrate that the Fisher information determines (i) the maximum output signal-to-noise ratio for a periodic signal; (ii) the optimum asymptotic efficacy for signal detection; (iii) the best cross-correlation coefficient for signal transmission; and (iv) the minimum mean square error of an unbiased estimator. This unifying picture, via inequalities on the Fisher information, is used to establish conditions where improvement by noise through stochastic resonance is feasible or not.
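Item (iv) above is the Cramér-Rao bound: the minimum mean square error of an unbiased estimator is the reciprocal of the (total) Fisher information. As an illustrative sketch, not drawn from the paper, the snippet below checks this for Gaussian noise, where the sample mean is unbiased and actually attains the bound.

```python
import numpy as np

rng = np.random.default_rng(2)
sigma, n_obs, trials = 1.5, 50, 20_000

# Fisher information per observation for N(theta, sigma^2) is 1/sigma^2,
# so the Cramer-Rao bound for n_obs i.i.d. samples is sigma^2 / n_obs.
crb = sigma**2 / n_obs

# Estimate theta = 0 by the sample mean over many independent trials and
# measure the empirical mean square error.
samples = rng.normal(0.0, sigma, size=(trials, n_obs))
mse = np.mean(samples.mean(axis=1)**2)   # close to crb for Gaussian noise
```

For non-Gaussian noise the sample mean generally does not attain the bound, which is where the inequalities on Fisher information discussed in the abstract become informative.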

Abstract:
For a known weak signal in additive white noise, the asymptotic performance of a locally optimum processor (LOP) is shown to be given by the Fisher information (FI) of a standardized even probability density function (PDF) of noise in three cases: (i) the maximum signal-to-noise ratio (SNR) gain for a periodic signal; (ii) the optimal asymptotic relative efficiency (ARE) for signal detection; (iii) the best cross-correlation gain (CG) for signal transmission. The minimal FI is unity, corresponding to a Gaussian PDF, whereas the FI is strictly larger than unity for any non-Gaussian PDF. In the sense of a realizable LOP, it is found that the dichotomous noise PDF possesses an infinite FI for known weak signals perfectly processed by the corresponding LOP. The significance of the FI is that it provides an upper bound on the performance of locally optimum processing.
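The claim that the Gaussian PDF attains the minimal FI of unity among standardized densities can be illustrated numerically. This sketch is not from the paper; it compares a unit-variance Gaussian (location-score $x$, FI $= 1$) with a unit-variance Laplace density (scale $b = 1/\sqrt{2}$, location-score $\mathrm{sign}(x)/b$, FI $= 1/b^2 = 2$).

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400_000

# Standardized Gaussian noise: the location-score is x itself,
# so FI = E[x^2] = 1 -- the minimum over all unit-variance densities.
xg = rng.normal(0.0, 1.0, size=n)
fi_gauss = np.mean(xg**2)

# Standardized Laplace noise: scale b = 1/sqrt(2) gives unit variance;
# the location-score is sign(x)/b, so FI = 1/b^2 = 2 > 1.
b = 1.0 / np.sqrt(2.0)
xl = rng.laplace(0.0, b, size=n)
fi_laplace = np.mean((np.sign(xl) / b)**2)
```

The gap between `fi_laplace` and `fi_gauss` is exactly the kind of excess FI that, per the abstract, translates into SNR gain, ARE, and CG advantages for a locally optimum processor operating in non-Gaussian noise.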

Abstract:
In biology, information flows from the environment to the genome by the process of natural selection. But it has not been clear precisely what sort of information metric properly describes natural selection. Here, I show that Fisher information arises as the intrinsic metric of natural selection and evolutionary dynamics. Maximizing the amount of Fisher information about the environment captured by the population leads to Fisher's fundamental theorem of natural selection, the most profound statement about how natural selection influences evolutionary dynamics. I also show a relation between Fisher information and Shannon information (entropy) that may help to unify the correspondence between information and dynamics. Finally, I discuss possible connections between the fundamental role of Fisher information in statistics, biology, and other fields of science.