%0 Journal Article
%T Jensen divergence based on Fisher's information
%A P. Sánchez-Moreno
%A A. Zarzo
%A J. S. Dehesa
%J Mathematics
%D 2010
%I arXiv
%R 10.1088/1751-8113/45/12/125305
%X The measure of Jensen-Fisher divergence between probability distributions is introduced and its theoretical grounds are set up. In contrast to other Jensen divergences, this quantity is very sensitive to fluctuations of the probability distributions because it is controlled by the (local) Fisher information, a gradient functional of the distribution. It is therefore appropriate and informative for studying the similarity of distributions, particularly those of oscillatory character. The new Jensen-Fisher divergence shares the following properties with the Jensen-Shannon divergence: non-negativity, additivity when applied to an arbitrary number of probability densities, symmetry under exchange of these densities, vanishing if and only if all the densities are equal, and definiteness even when the densities have non-common zeros. Moreover, the Jensen-Fisher divergence can be expressed in terms of the relative Fisher information, just as the Jensen-Shannon divergence is expressed in terms of the Kullback-Leibler divergence (relative Shannon entropy). Finally, the Jensen-Shannon and Jensen-Fisher divergences are compared for three large, non-trivial and qualitatively different families of probability distributions: the sinusoidal, generalized gamma-like, and Rakhmanov-Hermite distributions.
%U http://arxiv.org/abs/1012.5041v1
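
For orientation, a minimal sketch of the quantities the abstract invokes; the equal-weight, two-density forms below are the standard definitions consistent with the abstract (the paper itself treats arbitrary numbers of weighted densities), so take the notation H, F, JSD, JFD as an assumption rather than the authors' exact notation:

\[
H(p) = -\int p(x)\,\ln p(x)\,dx, \qquad
F(p) = \int \frac{[p'(x)]^{2}}{p(x)}\,dx
\]
% Shannon entropy H is concave, so the mixture term dominates and JSD >= 0:
\[
\mathrm{JSD}(p_1, p_2) = H\!\left(\frac{p_1 + p_2}{2}\right) - \frac{H(p_1) + H(p_2)}{2}
\]
% Fisher information F is convex, hence the sign flip relative to JSD, again giving JFD >= 0:
\[
\mathrm{JFD}(p_1, p_2) = \frac{F(p_1) + F(p_2)}{2} - F\!\left(\frac{p_1 + p_2}{2}\right)
\]

Both quantities vanish if and only if the two densities coincide, matching the properties listed in the abstract; the gradient p'(x) inside F is what makes the Jensen-Fisher divergence sensitive to oscillatory behaviour.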