Abstract

Traditional cost-sensitive machine learning techniques build a classification model using a fixed misclassification cost across the entire training set, because the real misclassification costs are usually unknown or unobtainable. In this paper we propose a novel cost-sensitive learning algorithm, named Adaptive Cost Optimization (ACO), which uses resampling and optimization techniques to build an ensemble classification model. We empirically evaluate ACO against the naive Bayes (NB) classifier and MetaCost. Experimental results show that ACO is effective, stable, and robust, and that it outperforms both NB and MetaCost.