%0 Journal Article %T Approximation of Kullback-Leibler Divergence between Two Gaussian Mixture Distributions
高斯混合分布之间K-L散度的近似计算 %A WANG Huan-Liang %A HAN Ji-Qing %A ZHENG Tie-Ran %A
王欢良 %A 韩纪庆 %A 郑铁然 %J 自动化学报 %D 2008 %I %X Since no closed-form expression is available for the Kullback-Leibler divergence (KLD) between two Gaussian mixture distributions (GMDs), an upper bound is used to approximate it. In this paper, an upper bound on the KLD between two GMDs with the same number of components is derived from the relative-entropy chain rule, and a tighter upper bound is then proposed. For the case where the two GMDs have different numbers of components, a method named optimal Gaussian duplication (OGD) is proposed to approximate their KLD. Evaluation experiments are performed on acoustic models of initials and finals, all modeled by GMD-based HMMs in speech recognition. The experimental results show that the tighter upper bound approximates the KLD more accurately than the other methods, and that the proposed OGD method can effectively compute the upper bound of the KLD between two GMDs with different numbers of components. %K K-L divergence (KLD) %K Gaussian mixture distribution (GMD) %K relative entropy %K upper bound of K-L divergence
K-L散度(KLD) %K 高斯混合分布(GMD) %K 相对熵 %K K-L散度上界 %U http://www.alljournals.cn/get_abstract_url.aspx?pcid=5B3AB970F71A803DEACDC0559115BFCF0A068CD97DD29835&cid=8240383F08CE46C8B05036380D75B607&jid=E76622685B64B2AA896A7F777B64EB3A&aid=29EF1B55B0736D2C674B97601D18DBFF&yid=67289AFF6305E306&vid=339D79302DF62549&iid=94C357A881DFC066&sid=2A2AA8B7E19F0DF7&eid=D45762219109E903&journal_id=0254-4156&journal_name=自动化学报&referenced_num=0&reference_num=12
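The abstract's chain-rule bound is, in standard form, the matched-component bound KL(f||g) <= KL(w||v) + sum_i w_i KL(f_i||g_i) for mixtures with the same number of components. The sketch below is a minimal illustration of that standard bound only; it assumes a fixed one-to-one component pairing, and the function names are illustrative. The paper's tighter bound and the OGD method for mixtures of different sizes are not reproduced here, since their exact form is not given in the abstract.

```python
# Sketch: chain-rule (matched-component) upper bound on KL(f || g) between two
# Gaussian mixtures with the same number of components:
#   KL(f || g) <= KL(w || v) + sum_i w_i * KL(N(mu_i, S_i) || N(nu_i, T_i))
# The i-th component of f is paired with the i-th component of g (an assumption);
# the paper's tighter bound and OGD method are not shown.
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence KL(N(mu0, cov0) || N(mu1, cov1))."""
    k = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    trace_term = np.trace(cov1_inv @ cov0)
    quad_term = diff @ cov1_inv @ diff
    logdet_term = np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    return 0.5 * (trace_term + quad_term - k + logdet_term)

def kl_gmm_upper_bound(weights_f, means_f, covs_f, weights_g, means_g, covs_g):
    """Chain-rule upper bound on KL(f || g) for equal-size Gaussian mixtures."""
    # Discrete KL divergence between the two mixture-weight vectors.
    kl_weights = np.sum(weights_f * np.log(weights_f / weights_g))
    # Weighted sum of component-wise Gaussian KL divergences (fixed pairing).
    kl_components = sum(
        w * kl_gaussian(mf, cf, mg, cg)
        for w, mf, cf, mg, cg in zip(weights_f, means_f, covs_f,
                                     means_g, covs_g)
    )
    return kl_weights + kl_components

# Example: two 2-component, 2-dimensional mixtures (made-up parameters).
w_f = np.array([0.6, 0.4]);           w_g = np.array([0.5, 0.5])
mu_f = [np.zeros(2), np.ones(2)];     mu_g = [np.full(2, 0.1), np.full(2, 0.9)]
cov_f = [np.eye(2), 0.5 * np.eye(2)]; cov_g = [np.eye(2), 0.7 * np.eye(2)]
print(kl_gmm_upper_bound(w_f, mu_f, cov_f, w_g, mu_g, cov_g))
```

The bound is exact only when each paired pair of components coincides and the weight vectors match; in general it over-estimates the true KLD, which is why a tighter bound (and, for unequal component counts, a duplication scheme such as OGD) is of interest.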