
Approximation of Kullback-Leibler Divergence between Two Gaussian Mixture Distributions

Keywords: K-L divergence (KLD), Gaussian mixture distribution (GMD), relative entropy, upper bound of K-L divergence


Abstract:

Since no closed-form expression exists for the Kullback-Leibler divergence (KLD) between two Gaussian mixture distributions (GMDs), an upper bound on the divergence is used to approximate it. In this paper, an upper bound on the KLD between two GMDs with the same number of components is derived from the chain rule for relative entropy, and a tighter upper bound is then proposed. For the case where the two GMDs have different numbers of components, a method named optimal Gaussian duplication (OGD) is proposed to approximate their KLD. Evaluation experiments are performed on acoustic models of initials and finals, all modeled by GMD-based HMMs in speech recognition. The experimental results show that the tighter upper bound approximates the KLD more accurately than other methods, and that the proposed OGD method can effectively compute the upper bound of the KLD between two GMDs with different numbers of components.
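The kind of component-wise upper bound obtained from the chain rule for relative entropy can be illustrated with a short sketch. This is not the paper's exact construction: the pairing of the i-th components, the diagonal-covariance assumption, and the function names below are illustrative assumptions.

```python
import numpy as np

def kl_gaussian(mu1, var1, mu2, var2):
    """Closed-form KL divergence between two diagonal-covariance Gaussians."""
    mu1, var1, mu2, var2 = map(np.asarray, (mu1, var1, mu2, var2))
    return 0.5 * np.sum(
        np.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0
    )

def kld_upper_bound(weights_f, mus_f, vars_f, weights_g, mus_g, vars_g):
    """Chain-rule (log-sum inequality) upper bound on KL(f || g) for two GMDs
    with the same number of components, pairing the i-th component of f with
    the i-th component of g:
        KL(f || g) <= sum_i w_i * ( log(w_i / v_i) + KL(f_i || g_i) )
    """
    bound = 0.0
    for wf, mf, vf, wg, mg, vg in zip(weights_f, mus_f, vars_f,
                                      weights_g, mus_g, vars_g):
        bound += wf * (np.log(wf / wg) + kl_gaussian(mf, vf, mg, vg))
    return bound
```

Because every valid pairing of components yields an upper bound of this form, taking the minimum over pairings gives a tighter value; how the paper tightens its bound beyond this is not reproduced here.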
