A Classification Ensemble Algorithm Based on Local Random Subspaces

PP. 595-603

Keywords: component classifier, classification ensemble, feature selection, local random subspace


Abstract:

Classifier ensemble learning is currently one of the hot topics in machine learning research. However, the classic approach samples feature subspaces completely at random, which makes it hard to guarantee the performance of the component classifiers on high-dimensional data. To address this, this paper proposes a classification ensemble algorithm based on local random subspaces. The algorithm first applies a feature selection method to obtain an effective feature ranking, then partitions the ranking into several segments and samples randomly within each segment according to a per-segment sampling proportion, thereby improving both the accuracy and the diversity of the component classifiers. Experiments on 5 UCI datasets and 5 gene datasets show that the proposed method outperforms single classifiers and, in most cases, outperforms classic classification ensemble methods.
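To make the segment-wise sampling scheme concrete, here is a minimal Python sketch. The class name LocalRandomSubspaceEnsemble, the use of mutual information as the feature-ranking step, the segment proportions (0.5, 0.3, 0.2), the subspace size, and the decision-tree base learner are all illustrative assumptions; the abstract does not specify these choices.

import numpy as np
from sklearn.feature_selection import mutual_info_classif
from sklearn.tree import DecisionTreeClassifier


class LocalRandomSubspaceEnsemble:
    """Ensemble whose feature subspaces are drawn segment-wise
    from a relevance-ranked feature list (illustrative sketch)."""

    def __init__(self, n_estimators=20, proportions=(0.5, 0.3, 0.2),
                 subspace_size=10, seed=0):
        self.n_estimators = n_estimators
        self.proportions = proportions  # share of each subspace drawn from each segment
        self.subspace_size = subspace_size
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        # Step 1: rank features by relevance. Mutual information stands in
        # for the paper's feature-selection method, which is not specified here.
        order = np.argsort(mutual_info_classif(X, y))[::-1]
        # Step 2: partition the ranking into as many segments as proportions.
        segments = np.array_split(order, len(self.proportions))
        self.members_ = []
        for _ in range(self.n_estimators):
            # Step 3: draw more features from high-ranked segments and fewer
            # from low-ranked ones, sampling randomly within each segment.
            subspace = np.concatenate([
                self.rng.choice(
                    seg,
                    size=min(len(seg), max(1, round(p * self.subspace_size))),
                    replace=False)
                for seg, p in zip(segments, self.proportions)])
            clf = DecisionTreeClassifier(random_state=0).fit(X[:, subspace], y)
            self.members_.append((subspace, clf))
        return self

    def predict(self, X):
        # Step 4: combine the component classifiers by majority vote
        # (assumes non-negative integer class labels).
        votes = np.stack([clf.predict(X[:, s]) for s, clf in self.members_])
        return np.apply_along_axis(
            lambda col: np.bincount(col.astype(int)).argmax(), 0, votes)

Calling fit(X, y) on a NumPy feature matrix with integer labels and then predict(X_test) returns majority-vote predictions. The per-segment proportions bias each subspace toward informative features (improving component accuracy) while the random draw within each segment preserves diversity across the ensemble.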

