
Acta Automatica Sinica (自动化学报), 2011
Cost-sensitive AdaBoost Algorithm for Multi-class Classification Problems

Abstract:
To solve the cost-merging problem that arises when multi-class cost-sensitive classification is reduced to two-class cost-sensitive classification, a cost-sensitive AdaBoost algorithm that can be applied directly to multi-class classification is constructed. The proposed algorithm is similar to the real AdaBoost algorithm in both algorithm flow and error-estimation formula. When all costs are equal, it reduces to a new real AdaBoost algorithm for multi-class classification that guarantees the training error of the combined classifier decreases as the number of trained classifiers increases. This new real AdaBoost algorithm does not require the weak classifiers to be independent; rather, the independence condition can be derived from the new algorithm itself, whereas it is a prerequisite for existing real AdaBoost algorithms for multi-class classification. Experimental results show that the new algorithm always biases the classification result toward the class with the smallest cost, while existing multi-class cost-sensitive learning algorithms may fail when the costs of misclassification into the other classes are imbalanced even though the average cost of every class is equal. The research method above provides a new way to construct ensemble learning algorithms, and an AdaBoost algorithm for multi-label classification is derived that is easy to implement and approximately achieves the minimum classification error rate.
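A minimal sketch of the decision rule the abstract alludes to (this is an illustration, not the paper's exact algorithm): rather than predicting the most probable class, a cost-sensitive classifier predicts the class that minimizes the expected misclassification cost under a cost matrix `C`, where `C[k, j]` is the (assumed, illustrative) cost of deciding class `j` when the true class is `k`. With equal costs this reduces to the ordinary argmax rule; with imbalanced costs the decision shifts toward the low-risk class, as the abstract describes.

```python
import numpy as np

def min_cost_class(prob, C):
    """Return the class index minimizing the expected cost.

    prob : (K,) posterior estimates P(k | x), e.g. from a boosted ensemble
    C    : (K, K) cost matrix, C[k, j] = cost of predicting j when truth is k
    """
    expected_cost = prob @ C          # (K,) expected cost of each decision j
    return int(np.argmin(expected_cost))

# Toy 3-class example (probabilities and costs are illustrative):
prob = np.array([0.5, 0.3, 0.2])      # class 0 is the most probable

# Uniform 0/1 loss: the rule reduces to plain argmax over probabilities.
C_equal = np.ones((3, 3)) - np.eye(3)
assert min_cost_class(prob, C_equal) == 0

# Imbalanced costs: misclassifying true class 2 is very expensive,
# so the minimum-expected-cost decision shifts toward class 2.
C_imbal = np.array([[0., 1., 1.],
                    [1., 0., 1.],
                    [9., 9., 0.]])
assert min_cost_class(prob, C_imbal) == 2
```

In a boosted ensemble, `prob` would come from the combined classifier's class-probability estimates; the cost matrix then only affects which class is reported, which is the behavior the experiments in the paper probe.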