Abstract

The classical AdaBoost algorithm combines an ensemble of weak learners to construct a strong classifier. A weak learner is incorporated into the ensemble at each step, and the classification performance of the derived ensemble is improved by properly adjusting the weight of each weak learner. The classical AdaBoost algorithm has some limitations; for example, it is sensitive to noisy data, which may impede the generalization capability of the derived classifier and lead to overfitting. A selective boosting (sBoost) technique is proposed in this paper to tackle these problems. The proposed sBoost classifier focuses on generalized classification performance rather than on hard-to-learn samples, and the penalties on hard-to-learn samples are mitigated to a degree associated with their noise level. An error correction method is suggested to detect potentially clean samples and prevent them from being misclassified, further alleviating the overfitting problem. The effectiveness of the developed sBoost technique is evaluated through a series of simulation tests. Test results show that the developed sBoost technique improves classification accuracy and effectively prevents overfitting.
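For reference, the sketch below illustrates the classical AdaBoost loop the abstract refers to: decision stumps as weak learners, a weighted vote controlled by each learner's coefficient, and exponential re-weighting of training samples. It is a minimal illustration only; the sBoost-specific noise-level mitigation and error correction are not implemented here because the abstract does not specify them.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_fit(X, y, n_rounds=50):
    """Classical AdaBoost for labels in {-1, +1} with decision stumps as weak learners."""
    n = len(y)
    w = np.full(n, 1.0 / n)                  # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)    # weighted training error
        if err >= 0.5:                               # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))   # weak-learner weight
        # Exponential re-weighting: misclassified samples gain weight.
        # Per the abstract, sBoost would damp this increase for samples judged
        # noisy; that mitigation factor is an unspecified detail of the paper.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, np.array(alphas)

def adaboost_predict(stumps, alphas, X):
    """Sign of the alpha-weighted vote of the weak learners."""
    agg = sum(a * s.predict(X) for a, s in zip(alphas, stumps))
    return np.sign(agg)
```

Because misclassified samples are re-weighted exponentially, noisy or mislabeled points accumulate ever larger weights across rounds, which is the sensitivity the proposed sBoost technique is designed to mitigate.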

  • Publication date: 2015-5-25