%0 Journal Article
%T Incorporating Linguistic Structure into Maximum Entropy Language Models
%A Fang GaoLin
%A Gao Wen
%A Wang ZhaoQi
%J Journal of Computer Science and Technology
%D 2003
%X In statistical language models, integrating diverse linguistic knowledge into a general framework that captures long-distance dependencies is a challenging issue. This paper presents an improved language model that incorporates linguistic structure into a maximum entropy framework. The proposed model combines a trigram model with the structural knowledge of base phrases: the trigram captures local relations between words, while the base-phrase structure represents long-distance relations between syntactic constituents. Knowledge of syntax, semantics, and vocabulary is integrated into the maximum entropy framework. Experimental results show that, compared with the trigram model, the proposed model reduces language model perplexity by 24% and increases the sign language recognition rate by about 3%.
%K maximum entropy
%K language model
%K base phrase identification
%K sign language recognition
%K information processing
%U http://www.alljournals.cn/get_abstract_url.aspx?pcid=5B3AB970F71A803DEACDC0559115BFCF0A068CD97DD29835&cid=8240383F08CE46C8B05036380D75B607&jid=F57FEF5FAEE544283F43708D560ABF1B&aid=1C7C847BA2F771DF67047252A426447A&yid=D43C4A19B2EE3C0A&vid=13553B2D12F347E8&iid=CA4FD0336C81A37A&sid=2B25C5E62F83A049&eid=2B25C5E62F83A049&journal_id=1000-9000&journal_name=计算机科学技术学报&referenced_num=0&reference_num=18