Abstract

This paper investigates how well the Rotation Forest ensemble method improves the generalization ability of a base predictor on regression problems. Experiments are conducted on several benchmark regression data sets, and Rotation Forest is compared with Bagging, Random Forest, AdaBoost.R2, and a single regression tree. The sensitivity of Rotation Forest to the choice of its parameters is also studied. On the considered data sets, AdaBoost.R2 generally outperforms Rotation Forest, and both are better than Random Forest and a single tree. Between Bagging and Rotation Forest, there is no clear winner. Furthermore, pruning the base trees appears to degrade the performance of all the considered methods.
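
The following sketch illustrates (but is not the paper's experimental code) how such a comparison can be set up with scikit-learn, whose AdaBoostRegressor implements the AdaBoost.R2 algorithm. The data set, number of estimators, and 10-fold cross-validation are arbitrary choices for illustration; Rotation Forest has no built-in scikit-learn implementation and is therefore omitted from the sketch.

```python
# Illustrative comparison of a single regression tree, Bagging, Random Forest,
# and AdaBoost.R2 on a benchmark regression data set (assumed setup, not the
# paper's actual protocol).
from sklearn.datasets import load_diabetes
from sklearn.ensemble import AdaBoostRegressor, BaggingRegressor, RandomForestRegressor
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeRegressor

X, y = load_diabetes(return_X_y=True)

models = {
    "single tree": DecisionTreeRegressor(random_state=0),
    "Bagging": BaggingRegressor(DecisionTreeRegressor(), n_estimators=50, random_state=0),
    "Random Forest": RandomForestRegressor(n_estimators=50, random_state=0),
    "AdaBoost.R2": AdaBoostRegressor(DecisionTreeRegressor(), n_estimators=50, random_state=0),
}

for name, model in models.items():
    # 10-fold cross-validated mean squared error (lower is better)
    scores = cross_val_score(model, X, y, cv=10, scoring="neg_mean_squared_error")
    print(f"{name:>13s}: MSE = {-scores.mean():.1f} (+/- {scores.std():.1f})")
```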