OALib Journal
ISSN: 2333-9721

Parameter Learning of Bayesian Networks Based on the Causal Independence Model

DOI: 10.13195/j.kzyjc.2014.0463, PP. 1007-1013

Keywords: Bayesian network, causal independence, sample set, parameter learning


Abstract:

Based on the causal independence model and the context-specific independence relations it induces, a Bayesian network parameter learning algorithm suited to learning from sample sets is proposed. Building on a dimension-reducing decomposition of the local probability model, the algorithm composes the conditional probability distribution of a local structure from the child node's probability distributions conditioned on single parent nodes. The resulting parameterization has low complexity and handles sample sets over sparse structures well. Experimental results show that, compared with the standard maximum likelihood estimation algorithm, the proposed algorithm makes fuller use of sample information and achieves better learning accuracy.
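To make the decomposition idea concrete, the following is a minimal sketch (not the paper's exact algorithm) of the classic causal-independence local model, the noisy-OR: a binary child with n parents is described by n per-parent inhibition parameters estimated from data, and the full local CPD is then composed from these single-parent distributions instead of being learned as a 2^n-row table. The estimation heuristic below (counting only samples where a single parent is active, with pseudo-count smoothing) is an assumption for illustration.

```python
def learn_noisy_or(samples, n_parents, alpha=1.0):
    """Estimate noisy-OR inhibition probabilities q_i from (x, y) samples.

    samples: iterable of (parent_tuple, y) with binary values.
    q_i = P(Y = 0 | only parent i active) is estimated from the samples
    in which parent i is the only active parent (a simple sparse-data
    heuristic, smoothed with pseudo-count alpha).
    """
    fail = [alpha] * n_parents       # counts of Y = 0 with only X_i on
    total = [2 * alpha] * n_parents  # counts of cases with only X_i on
    for x, y in samples:
        if sum(x) == 1:              # exactly one parent active
            i = x.index(1)
            total[i] += 1
            if y == 0:
                fail[i] += 1
    return [fail[i] / total[i] for i in range(n_parents)]

def noisy_or_cpd(q, x):
    """Compose P(Y = 1 | x) from the per-parent parameters q."""
    p_off = 1.0
    for qi, xi in zip(q, x):
        if xi:
            p_off *= qi              # each active parent fails independently
    return 1.0 - p_off
```

Note the parameter count: n numbers instead of 2^n CPT rows, which is why such models remain learnable from sparse sample sets.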

References

[1]  Pearl J. Probabilistic reasoning in intelligent systems: Networks of plausible inference[M]. San Francisco: Morgan Kaufmann, 1988: 383-408.
[2]  Zhang L W, Guo H P. An introduction to Bayesian networks[M]. Beijing: Science Press, 2006: 31-74. (in Chinese)
[3]  Zhang N L, Poole D. Exploiting causal independence in Bayesian network inference[J]. J of Artificial Intelligence Research, 1996, 5(7): 301-328.
[4]  Zhang H Y, Wang L W, Chen Y X. Research progress of probabilistic graphical models: A survey[J]. J of Software, 2013, 24(11): 2476-2497. (in Chinese)
[5]  Heckerman D. Causal independence for knowledge acquisition and inference[C]. Proc of the 9th Conf on Uncertainty in Artificial Intelligence. San Mateo: Morgan Kaufmann, 1993: 122-127.
[6]  Yang S, Natarajan S. Knowledge intensive learning: Combining qualitative constraints with causal independence for parameter learning in probabilistic models[C]. Machine Learning and Knowledge Discovery in Databases. Berlin: Springer, 2013: 580-595.
[7]  Vomlel J, Tichavský P. Computationally efficient probabilistic inference with noisy threshold models based on a CP tensor decomposition[C]. Proc of the 6th European Workshop on Probabilistic Graphical Models (PGM 2012). Granada, 2012: 355-362.
[8]  D'Ambrosio B. Symbolic probabilistic inference in large BN2O networks[C]. Proc of the 10th Conf on Uncertainty in Artificial Intelligence. San Mateo: Morgan Kaufmann, 1994: 128-135.
[9]  Li W, Poupart P, van Beek P. Exploiting structure in weighted model counting approaches to probabilistic inference[J]. J of Artificial Intelligence Research, 2011, 40(1): 729-765.
[10]  Pradhan M, Provan G, Middleton B, et al. Knowledge engineering for large belief networks[C]. Proc of the 10th Conf on Uncertainty in Artificial Intelligence. San Mateo: Morgan Kaufmann, 1994: 484-490.
[11]  Luque M, Díez F J. Variable elimination for influence diagrams with super-value nodes[J]. Int J of Approximate Reasoning, 2010, 51(6): 615-631.
[12]  Díez F J, Galán S F. Efficient computation for the Noisy-MAX[J]. Int J of Intelligent Systems, 2004, 18(2): 165-177.
[13]  Díez F J, Druzdzel M J. Canonical probabilistic models for knowledge engineering[R]. Madrid: UNED, 2007.
[14]  Boutilier C, Friedman N, Goldszmidt M, et al. Context-specific independence in Bayesian networks[C]. Proc of the 12th Conf on Uncertainty in Artificial Intelligence. San Mateo: Morgan Kaufmann, 1996: 115-123.
[15]  Zagorecki A, Druzdzel M J. Knowledge engineering for Bayesian networks: How common are Noisy-MAX distributions in practice?[J]. IEEE Trans on Systems, Man, and Cybernetics: Systems, 2013, 43(1): 186-195.
[16]  Zagorecki A, Voortman M, Druzdzel M J. Decomposing local probability distributions in Bayesian networks for improved inference and parameter learning[C]. Proc of the 19th Int Florida Artificial Intelligence Research Society Conf. Menlo Park: AAAI Press, 2006: 860-865.
