Mathematics  2010 

Jensen divergence based on Fisher's information

DOI: 10.1088/1751-8113/45/12/125305


Abstract:

The measure of Jensen-Fisher divergence between probability distributions is introduced and its theoretical grounds are set up. In contrast to other Jensen divergences, this quantity is very sensitive to fluctuations of the probability distributions because it is controlled by the (local) Fisher information, which is a gradient functional of the distribution. It is therefore appropriate and informative for studying the similarity of distributions, especially those with oscillatory character. The new Jensen-Fisher divergence shares the following properties with the Jensen-Shannon divergence: non-negativity, additivity when applied to an arbitrary number of probability densities, symmetry under exchange of these densities, vanishing if and only if all the densities are equal, and definiteness even when these densities have non-common zeros. Moreover, the Jensen-Fisher divergence can be expressed in terms of the relative Fisher information, just as the Jensen-Shannon divergence can in terms of the Kullback-Leibler or relative Shannon entropy. Finally, the Jensen-Shannon and Jensen-Fisher divergences are compared for three large, non-trivial and qualitatively different families of probability distributions: the sinusoidal, generalized gamma-like and Rakhmanov-Hermite distributions.
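The construction described in the abstract can be illustrated numerically. The sketch below is an assumption based on the standard Jensen-type recipe, not the paper's exact definitions: since Shannon entropy H is concave, JSD(p, q) = H[(p+q)/2] − (H[p] + H[q])/2 ≥ 0, and since Fisher information F is convex, the analogous Jensen-Fisher quantity JFD(p, q) = (F[p] + F[q])/2 − F[(p+q)/2] ≥ 0. Equal mixing weights and a one-dimensional grid discretization are assumed throughout; all function names are illustrative.

```python
import numpy as np

def shannon_entropy(p, dx):
    """Discretized differential Shannon entropy H[p] = -integral p ln p dx."""
    p = np.clip(p, 1e-300, None)  # avoid log(0) at the tails
    return -np.sum(p * np.log(p)) * dx

def fisher_information(p, dx):
    """Discretized Fisher information F[p] = integral (p')^2 / p dx."""
    dp = np.gradient(p, dx)       # finite-difference derivative of the density
    p = np.clip(p, 1e-300, None)
    return np.sum(dp**2 / p) * dx

def jensen_shannon(p, q, dx):
    """JSD(p, q) = H[(p+q)/2] - (H[p] + H[q])/2; concavity of H makes this >= 0."""
    m = 0.5 * (p + q)
    return shannon_entropy(m, dx) - 0.5 * (shannon_entropy(p, dx) + shannon_entropy(q, dx))

def jensen_fisher(p, q, dx):
    """JFD(p, q) = (F[p] + F[q])/2 - F[(p+q)/2]; convexity of F makes this >= 0.
    Hypothetical equal-weight form of the divergence the abstract describes."""
    m = 0.5 * (p + q)
    return 0.5 * (fisher_information(p, dx) + fisher_information(q, dx)) - fisher_information(m, dx)

# Two unit-variance Gaussians on a fine grid.
x = np.linspace(-10.0, 10.0, 4001)
dx = x[1] - x[0]
gauss = lambda mu, s: np.exp(-(x - mu)**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))
p, q = gauss(-1.0, 1.0), gauss(1.0, 1.0)

print(jensen_shannon(p, q, dx))  # positive for distinct densities
print(jensen_fisher(p, q, dx))   # positive for distinct densities
print(jensen_fisher(p, p, dx))   # vanishes when the densities coincide
```

Note how the gradient term dp²/p makes the Jensen-Fisher value depend on where the densities oscillate, not just on how much mass they share, which is the sensitivity the abstract emphasizes.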
