Acta Automatica Sinica, 2008
Approximation of Kullback-Leibler Divergence between Two Gaussian Mixture Distributions
Abstract:
Since no closed-form expression is available for the Kullback-Leibler divergence (KLD) between two Gaussian mixture distributions (GMDs), an upper bound is used to approximate it. In this paper, an upper bound on the KLD between two GMDs with the same number of components is derived from the chain rule of relative entropy, and a tighter upper bound is then proposed. For the case where the two GMDs have different numbers of components, a method named optimal Gaussian duplication (OGD) is proposed to approximate their KLD. Evaluation experiments are performed on the acoustic models of Chinese initials and finals, which are modeled by GMD-based HMMs in speech recognition. The experimental results show that the tighter upper bound approximates the KLD more accurately than the other methods, and that the proposed OGD method can effectively compute the upper bound of the KLD between two GMDs with different numbers of components.
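The sketch below illustrates the kind of chain-rule upper bound referred to in the abstract for two GMDs with the same number of components: the KLD between the mixture-weight distributions plus the weighted sum of component-wise Gaussian KLDs, with components paired by index. It is a minimal illustration, not the paper's exact derivation or its tighter bound; the function names and the index-based pairing are assumptions for the example.

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence KL(N(mu0, cov0) || N(mu1, cov1))."""
    d = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (
        np.trace(cov1_inv @ cov0)
        + diff @ cov1_inv @ diff
        - d
        + np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    )

def kl_upper_bound_matched(w_f, mus_f, covs_f, w_g, mus_g, covs_g):
    """Chain-rule upper bound on KL(f || g) for two GMMs with the same number of
    components, pairing the i-th component of f with the i-th component of g:
        KL(f || g) <= KL(w_f || w_g) + sum_i w_f[i] * KL(f_i || g_i)
    """
    w_f = np.asarray(w_f, dtype=float)
    w_g = np.asarray(w_g, dtype=float)
    # KL divergence between the two discrete mixture-weight distributions.
    kl_weights = np.sum(w_f * np.log(w_f / w_g))
    # Weighted sum of component-wise Gaussian KL divergences.
    kl_components = sum(
        w * kl_gaussian(mu_f, cov_f, mu_g, cov_g)
        for w, mu_f, cov_f, mu_g, cov_g in zip(w_f, mus_f, covs_f, mus_g, covs_g)
    )
    return kl_weights + kl_components

# Example: two 2-component, 2-dimensional GMMs paired by index.
mus_f = [np.zeros(2), np.ones(2)]
covs_f = [np.eye(2), np.eye(2)]
mus_g = [np.full(2, 0.5), np.full(2, 1.5)]
covs_g = [2.0 * np.eye(2), np.eye(2)]
bound = kl_upper_bound_matched([0.4, 0.6], mus_f, covs_f, [0.5, 0.5], mus_g, covs_g)
```

The bound is tightest when the pairing of components is chosen well; with mismatched pairings it can be very loose, which motivates the tighter bound and the OGD construction discussed in the paper.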