oalib
Search Results: 1 - 10 of 100 matches for " "
All listed articles are free for downloading (OA Articles)
Dynamics of the Fisher Information Metric  [PDF]
Xavier Calmet, Jacques Calmet
Physics, 2004, DOI: 10.1103/PhysRevE.71.056109
Abstract: We present a method to generate probability distributions that correspond to metrics obeying partial differential equations generated by extremizing a functional $J[g^{\mu\nu}(\theta^i)]$, where $g^{\mu\nu}(\theta^i)$ is the Fisher metric. We postulate that this functional of the dynamical variable $g^{\mu\nu}(\theta^i)$ is stationary with respect to small variations of these variables. Our approach provides a dynamical treatment of the Fisher information metric and allows symmetries to be imposed on a statistical system in a systematic way. This work is mainly motivated by the entropy approach to nonmonotonic reasoning.
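For context, the Fisher metric entering the functional $J[g^{\mu\nu}(\theta^i)]$ is the standard one on a parametric family $p(x|\theta)$ (written here with lower indices, as usual):
$$g_{\mu\nu}(\theta) = \int p(x|\theta)\, \frac{\partial \ln p(x|\theta)}{\partial \theta^\mu}\, \frac{\partial \ln p(x|\theta)}{\partial \theta^\nu}\, dx,$$
and the postulated stationarity $\delta J = 0$ under variations of the metric yields the partial differential equations that the admissible probability distributions must satisfy.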
Probability Density Functions from the Fisher Information Metric  [PDF]
T. Clingman, Jeff Murugan, Jonathan P. Shock
Statistics, 2015.
Abstract: We show a general relation between the spatially disjoint product of probability density functions and the sum of their Fisher information metric tensors. We then utilise this result to give a method for constructing the probability density functions for an arbitrary Riemannian Fisher information metric tensor. We note further that this construction is extremely unconstrained, depending only on certain continuity properties of the probability density functions and a select symmetry of their domains.
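The sum rule underlying this construction can be seen in the independent-factor case: if $p(x, y|\theta) = p_1(x|\theta)\, p_2(y|\theta)$ on disjoint domains, then $\ln p = \ln p_1 + \ln p_2$, the cross terms vanish because $\mathbb{E}[\partial_\mu \ln p_i] = 0$, and
$$g_{\mu\nu}[p] = g_{\mu\nu}[p_1] + g_{\mu\nu}[p_2].$$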
The Brody-Hughston Fisher Information Metric  [PDF]
Paul B. Slater
Physics, 2003.
Abstract: We study the interrelationships between the Fisher information metric recently introduced, on the basis of maximum entropy considerations, by Brody and Hughston (quant-ph/9906085) and the monotone metrics, as explicated by Petz and Sudar. This new metric turns out not to be strictly monotone in nature, and to yield (via its normalized volume element) a prior probability distribution over the Bloch ball of two-level quantum systems that is less noninformative (that is, more informative) than those obtained from any of the monotone metrics, even the minimal monotone (Bures) metric. We approximate the additional information contained in the Brody-Hughston prior, over that contained in the Bures prior, by constructing a certain Bures posterior probability distribution. This posterior is proportional to the product of the Bures prior and a likelihood function based on four pairs of spin measurements oriented along the diagonal axes of an inscribed cube.
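For orientation, the minimal monotone (Bures) metric on the Bloch ball mentioned here takes the standard form, in spherical coordinates $(r, \vartheta, \varphi)$,
$$ds^2_{\mathrm{B}} = \frac{1}{4}\left[\frac{dr^2}{1 - r^2} + r^2\left(d\vartheta^2 + \sin^2\vartheta\, d\varphi^2\right)\right],$$
whose normalized volume element yields the Bures prior $p_{\mathrm{B}}(r, \vartheta, \varphi) = r^2 \sin\vartheta / (\pi^2 \sqrt{1 - r^2})$ against which the Brody-Hughston prior is compared.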
Emergent General Relativity from Fisher Information Metric  [PDF]
Hiroaki Matsueda
Physics, 2013.
Abstract: We derive the Einstein tensor from the Fisher information metric that is defined by the probability distribution of a statistical mechanical system. We find that the tensor naturally contains essential information of the energy-momentum tensor of a classical scalar field, when the entropy data or the spectrum data of the system are embedded into the classical field as the field strength. Thus, we can regard the Einstein equation as the equation of coarse-grained states for the original microscopic system behind the classical field theory. We make some remarks on quantization of gravity and various quantum-classical correspondences.
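Schematically (a sketch of the setup described, not the paper's full derivation), the parameters $\theta^\mu$ of the distribution serve as coordinates, the Fisher metric $g_{\mu\nu}(\theta) = \int p(x|\theta)\,\partial_\mu \ln p\,\partial_\nu \ln p\,dx$ defines the geometry, and the Einstein tensor is computed from it in the usual way:
$$G_{\mu\nu} = R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu},$$
so that comparing $G_{\mu\nu}$ with the energy-momentum tensor of a classical scalar field becomes a statement about the information geometry of the underlying ensemble.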
Uniqueness of the Fisher-Rao metric on the space of smooth densities  [PDF]
Martin Bauer, Martins Bruveris, Peter W. Michor
Mathematics, 2014.
Abstract: On a closed manifold of dimension greater than one, every smooth weak Riemannian metric on the space of smooth positive probability densities, that is invariant under the action of the diffeomorphism group, is a multiple of the Fisher--Rao metric.
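In the convention commonly used for the space $\mathrm{Dens}_+(M)$ of smooth positive densities, where tangent vectors at a density $\mu$ are again densities $\alpha, \beta$, the Fisher-Rao metric reads
$$G_\mu(\alpha, \beta) = \int_M \frac{\alpha}{\mu}\, \frac{\beta}{\mu}\, \mu,$$
and the uniqueness statement is that every $\mathrm{Diff}(M)$-invariant smooth weak Riemannian metric equals $C \cdot G$ for some constant $C > 0$ once $\dim M \ge 2$.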
On the Uniqueness of the Chentsov Metric in Quantum Information Geometry  [PDF]
M. R. Grasselli, R. F. Streater
Physics, 2000.
Abstract: We show that, in finite dimensions, the only monotone metrics for which the (+1) and (-1) affine connections are mutually dual are constant multiples of the Bogoliubov-Kubo-Mori metric.
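For reference, the Bogoliubov-Kubo-Mori metric admits the commonly used integral representation (at a density matrix $\rho$, for self-adjoint traceless tangent vectors $A, B$):
$$K_\rho(A, B) = \int_0^\infty \mathrm{Tr}\left[A\, (\rho + t)^{-1}\, B\, (\rho + t)^{-1}\right] dt,$$
which corresponds to the operator monotone function $f(x) = (x - 1)/\ln x$ in the Petz classification of monotone metrics.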
Extending the Fisher metric to density matrices  [PDF]
D. Petz, Cs. Sudar
Physics, 2001.
Abstract: Chentsov studied Riemannian metrics on the set of probability measures from the point of view of decision theory. He proved that, up to a constant factor, the Fisher information is the only metric that is monotone under stochastic transformations. The present paper, motivated by quantum mechanics, deals with monotone metrics on the space of finite density matrices. A characterization of those metrics is given in terms of operator monotone functions. Several concrete metrics are constructed and analyzed; in particular, instead of the uniqueness that holds in the probabilistic case, there is a large class of monotone metrics. Some of these appeared in the physics literature long ago. A limiting procedure to pure states is discussed as well.
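The characterization in question is usually quoted in the following form: writing $L_\rho(X) = \rho X$ and $R_\rho(X) = X\rho$ for left and right multiplication, every monotone metric can be written as
$$K_\rho(A, B) = \mathrm{Tr}\left[A \left(f(L_\rho R_\rho^{-1})\, R_\rho\right)^{-1}(B)\right],$$
with $f$ an operator monotone function satisfying $f(1) = 1$; the choices $f(x) = (1 + x)/2$ and $f(x) = (x - 1)/\ln x$ reproduce the Bures and Bogoliubov-Kubo-Mori metrics, respectively.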
Fisher Information as a Metric of Locally Optimal Processing and Stochastic Resonance  [PDF]
Fabing Duan, François Chapeau-Blondeau, Derek Abbott
PLOS ONE, 2012, DOI: 10.1371/journal.pone.0034282
Abstract: The origins of Fisher information are in its use as a performance measure for parametric estimation. We augment this and show that the Fisher information can characterize the performance in several other significant signal processing operations. For processing of a weak signal in additive white noise, we demonstrate that the Fisher information determines (i) the maximum output signal-to-noise ratio for a periodic signal; (ii) the optimum asymptotic efficacy for signal detection; (iii) the best cross-correlation coefficient for signal transmission; and (iv) the minimum mean square error of an unbiased estimator. This unifying picture, via inequalities on the Fisher information, is used to establish conditions where improvement by noise through stochastic resonance is feasible or not.
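To make the role of the noise Fisher information concrete, here is a minimal numerical sketch (illustrative code, not from the paper; the helper name fisher_information is ours) that evaluates $I(p) = \int [p'(x)]^2/p(x)\,dx$ for two standardized noise densities. The Gaussian attains the minimal value $I = 1$, while the heavier-tailed Laplacian gives $I = 2$, leaving room for the noise-induced improvements the inequalities above quantify.

```python
import numpy as np

def fisher_information(pdf, dpdf, x):
    """Riemann-sum estimate of I(p) = integral of p'(x)^2 / p(x) dx on a uniform grid x."""
    p, dp = pdf(x), dpdf(x)
    dx = x[1] - x[0]
    return np.sum(dp ** 2 / p) * dx

x = np.linspace(-20.0, 20.0, 400001)

# Standard Gaussian noise (zero mean, unit variance): I(p) = 1, the minimum.
gauss = lambda t: np.exp(-t ** 2 / 2) / np.sqrt(2 * np.pi)
dgauss = lambda t: -t * gauss(t)

# Laplacian noise standardized to unit variance (scale b = 1/sqrt(2)): I(p) = 1/b^2 = 2.
b = 1 / np.sqrt(2)
lap = lambda t: np.exp(-np.abs(t) / b) / (2 * b)
dlap = lambda t: -np.sign(t) * lap(t) / b

print(fisher_information(gauss, dgauss, x))  # ~1.0
print(fisher_information(lap, dlap, x))      # ~2.0
```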
Fisher information as a performance metric for locally optimum processing  [PDF]
Fabing Duan, François Chapeau-Blondeau, Derek Abbott
Mathematics, 2011.
Abstract: For a known weak signal in additive white noise, the asymptotic performance of a locally optimum processor (LOP) is shown to be given by the Fisher information (FI) of a standardized even probability density function (PDF) of the noise in three cases: (i) the maximum signal-to-noise ratio (SNR) gain for a periodic signal; (ii) the optimal asymptotic relative efficiency (ARE) for signal detection; (iii) the best cross-correlation gain (CG) for signal transmission. The minimal FI is unity, corresponding to a Gaussian PDF, whereas the FI is strictly larger than unity for any non-Gaussian PDF. In the sense of a realizable LOP, it is found that the dichotomous noise PDF possesses an infinite FI for known weak signals perfectly processed by the corresponding LOP. The significance of the FI lies in the fact that it provides an upper bound on the performance of locally optimum processing.
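The claim that the minimal FI is unity follows from a classical result (Stam's inequality, stated here for a unit-variance PDF $p$):
$$I(p) = \int_{-\infty}^{\infty} \frac{[p'(x)]^2}{p(x)}\, dx \ \ge\ 1,$$
with equality if and only if $p$ is Gaussian.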
Natural selection maximizes Fisher information  [PDF]
Steven A. Frank
Quantitative Biology, 2009, DOI: 10.1111/j.1420-9101.2008.01647.x
Abstract: In biology, information flows from the environment to the genome by the process of natural selection. But it has not been clear precisely what sort of information metric properly describes natural selection. Here, I show that Fisher information arises as the intrinsic metric of natural selection and evolutionary dynamics. Maximizing the amount of Fisher information about the environment captured by the population leads to Fisher's fundamental theorem of natural selection, the most profound statement about how natural selection influences evolutionary dynamics. I also show a relation between Fisher information and Shannon information (entropy) that may help to unify the correspondence between information and dynamics. Finally, I discuss possible connections between the fundamental role of Fisher information in statistics, biology, and other fields of science.
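Fisher's fundamental theorem, as invoked here, is often stated in the following discrete-generation form: for types with frequencies $q_i$ and fitnesses $w_i$, with mean fitness $\bar{w} = \sum_i q_i w_i$, selection updates the frequencies by $q_i' = q_i w_i / \bar{w}$, and the resulting change in mean fitness due to selection alone is
$$\Delta \bar{w}\,\big|_{\mathrm{selection}} = \frac{\mathrm{Var}(w)}{\bar{w}},$$
the fitness variance that, in this framing, is identified with the Fisher information the population captures about its environment.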