[1] Fukunaga K. Introduction to Statistical Pattern Recognition. 2nd Edition. London, UK: Academic Press, 1991
[2] Dietterich T G. Machine Learning Research: Four Current Directions. AI Magazine, 1997, 18(4): 97-136
[3] Hansen L K, Salamon P. Neural Network Ensembles. IEEE Trans on Pattern Analysis and Machine Intelligence, 1990, 12(10): 993-1001
[4] Krogh A, Vedelsby J. Neural Network Ensembles, Cross Validation, and Active Learning // Advances in Neural Information Processing Systems 7. Cambridge, USA: MIT Press, 1995
[5] Dietterich T G. An Experimental Comparison of Three Methods for Constructing Ensembles of Decision Trees: Bagging, Boosting and Randomization. Machine Learning, 2000, 40(2): 139-158
[6] Freund Y, Schapire R E. Experiments with a New Boosting Algorithm // Proc of the 13th International Conference on Machine Learning. Bari, Italy, 1996: 148-156
[7] Opitz D. Feature Selection for Ensembles // Proc of the 16th National Conference on Artificial Intelligence. Orlando, USA, 1999: 379-384
[9] Dietterich T G, Bakiri G. Error-Correcting Output Codes: A General Method for Improving Multiclass Inductive Learning Programs // Proc of the 9th National Conference on Artificial Intelligence. Anaheim, USA, 1991: 572-577
[10] Dietterich T G, Bakiri G. Solving Multiclass Learning Problems via Error-Correcting Output Codes. Journal of Artificial Intelligence Research, 1995, 2(1): 263-286
[11] Zhou Zhihua, Wu Jianxin, Tang Wei. Ensembling Neural Networks: Many Could Be Better Than All. Artificial Intelligence, 2002, 137(1/2): 239-263
[12] Li Nan, Zhou Zhihua. Selective Ensemble under Regularization Framework // Proc of the 8th International Workshop on Multiple Classifier Systems. Reykjavik, Iceland, 2009: 293-303
[13] Tumer K, Ghosh J. Classifier Combining: Analytical Results and Implications // Proc of the AAAI Workshop on Integrating Multiple Learned Models for Improving and Scaling Machine Learning Algorithms. Portland, USA, 1996: 126-132
[14] Ho T K. The Random Subspace Method for Constructing Decision Forests. IEEE Trans on Pattern Analysis and Machine Intelligence, 1998, 20(8): 832-844
[15] Wang Xiaogang, Tang Xiaoou. Using Random Subspace to Combine Multiple Features for Face Recognition // Proc of the 6th IEEE International Conference on Automatic Face and Gesture Recognition. Seoul, Republic of Korea, 2004: 284-289
[16] Bay S D. Combining Nearest Neighbor Classifiers through Multiple Feature Subsets // Proc of the 15th International Conference on Machine Learning. Madison, USA, 1998: 37-45
[17] Yang Ming. A Novel Algorithm for Attribute Reduction Based on Consistent Criterion. Chinese Journal of Computers, 2010, 33(2): 231-239 (in Chinese) (杨明. 一种基于一致性准则的属性约简算法. 计算机学报, 2010, 33(2): 231-239)
[18] Yang Ming, Yang Ping. A Novel Condensing Tree Structure for Rough Set Feature Selection. Neurocomputing, 2008, 71(4/5/6): 1092-1100
[19] Yang Ming. An Incremental Updating Algorithm for Attribute Reduction Based on Improved Discernibility Matrix. Chinese Journal of Computers, 2007, 30(5): 815-822
[20] Crammer K, Gilad-Bachrach R, Navot A, et al. Margin Analysis of the LVQ Algorithm // Proc of the 16th Conference on Neural Information Processing Systems. Vancouver, Canada, 2002: 492-496
[21] Gilad-Bachrach R, Navot A, Tishby N. Margin Based Feature Selection - Theory and Algorithms // Proc of the 21st International Conference on Machine Learning. Banff, Canada, 2004: 43-50
[22] Vapnik V. The Nature of Statistical Learning Theory. New York, USA: Springer-Verlag, 1995