A Feature Selection Algorithm for Incomplete Data Based on Information Entropy

pp. 1131-1137

Keywords: feature selection, incomplete data, incomplete information entropy, incomplete decision table, similarity relation


Abstract:

Building on an analysis of existing incomplete information entropies, this paper proposes an incomplete information entropy based on the similarity relation and proves several of its properties. A feature selection algorithm for incomplete data is then given: it takes the improved incomplete information entropy as the feature selection criterion, performs entropy analysis directly on the features of the incomplete data, and uses sequential forward floating selection to handle correlation among features. Experiments on real-world UCI datasets show that the proposed algorithm achieves higher accuracy and faster feature selection.
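The abstract only outlines the method. The sketch below (Python, written for this summary rather than taken from the paper) illustrates the two ingredients the abstract names: a similarity-relation entropy over an incomplete data table, where a missing value is treated as compatible with any value, and a sequential forward floating selection (SFFS) loop that uses that entropy as its criterion. The entropy form H(B) = -(1/|U|) Σ_x log(|S_B(x)|/|U|) is a common rough-set convention, not necessarily the improved entropy defined in the paper, and all names here (`similar`, `incomplete_entropy`, `sffs`, `MISSING`) are illustrative assumptions.

```python
# Minimal sketch of similarity-relation entropy + SFFS for incomplete data.
# Assumptions: rows are objects, columns are condition features, and a missing
# value is marked by MISSING (None); the paper's exact entropy and the role of
# the decision attribute may differ from this simplified version.
import math
from typing import List, Sequence, Set


MISSING = None  # marker for a missing attribute value


def similar(xi: Sequence, xj: Sequence, feats: Set[int]) -> bool:
    """Two objects are similar on `feats` if every pair of known values agrees."""
    return all(
        xi[f] == MISSING or xj[f] == MISSING or xi[f] == xj[f]
        for f in feats
    )


def incomplete_entropy(data: List[Sequence], feats: Set[int]) -> float:
    """Similarity-relation entropy H(feats) = -(1/n) * sum_x log2(|S(x)|/n)."""
    n = len(data)
    if not feats or n == 0:
        return 0.0
    h = 0.0
    for xi in data:
        class_size = sum(similar(xi, xj, feats) for xj in data)  # |S_feats(xi)|
        h -= math.log2(class_size / n) / n
    return h


def sffs(data: List[Sequence], n_feats: int, score, k: int) -> Set[int]:
    """Sequential forward floating selection: greedily add the best feature,
    then try conditional removals while they improve the criterion."""
    k = min(k, n_feats)
    selected: Set[int] = set()
    while len(selected) < k:
        # Forward step: add the single feature that maximizes the criterion.
        best_f = max(
            (f for f in range(n_feats) if f not in selected),
            key=lambda f: score(data, selected | {f}),
        )
        selected.add(best_f)
        # Floating (backward) step: drop a feature if that improves the score.
        improved = True
        while improved and len(selected) > 2:
            improved = False
            cur = score(data, selected)
            for f in list(selected):
                if f != best_f and score(data, selected - {f}) > cur:
                    selected.remove(f)
                    improved = True
                    break
    return selected
```

As a usage example, `sffs(data, n_feats=4, score=incomplete_entropy, k=2)` returns the indices of two selected features for a table `data` whose missing entries are `None`. The paper's criterion additionally involves the incomplete decision table (the decision attribute), which this sketch omits for brevity.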

