Abstract

Twin support vector regression (TSVR), as an effective regression machine, solves a pair of smaller-sized quadratic programming problems (QPPs) rather than a single large one as in classical support vector regression (SVR), which makes the learning speed of TSVR approximately four times faster than that of conventional SVR. However, TSVR implements the empirical risk minimization principle, which reduces its generalization ability to a certain extent. To improve the prediction accuracy and stability of the algorithm, we propose a novel TSVR for the regression problem by introducing a regularization term into the objective function, which ensures that the new algorithm implements the structural risk minimization principle instead of the empirical risk minimization principle. Moreover, the up- and down-bound functions obtained by our algorithm are as parallel as possible, so that, in theory, our proposed algorithm yields lower prediction error and lower standard deviation. Experimental results on one artificial dataset and six benchmark datasets demonstrate the feasibility and validity of our novel TSVR.
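As a rough illustration of the idea described above, a regularized TSVR down-bound problem typically takes the following form; the notation here (training inputs $A$, responses $Y$, ones vector $e$, slack $\xi$, parameters $\varepsilon_1$, $c_1$, $c_3$) is assumed for exposition and may differ from the paper's own formulation:

```latex
\begin{align*}
\min_{w_1,\,b_1,\,\xi}\quad
  & \underbrace{\tfrac{1}{2}\,c_3\left(\lVert w_1\rVert^2 + b_1^2\right)}_{\text{added regularization term}}
    + \tfrac{1}{2}\,\bigl\lVert Y - e\varepsilon_1 - (A w_1 + e b_1)\bigr\rVert^2
    + c_1\, e^{\top}\xi \\
\text{s.t.}\quad
  & Y - (A w_1 + e b_1) \;\ge\; e\varepsilon_1 - \xi, \qquad \xi \ge 0.
\end{align*}
```

Without the first term, only the empirical risk (the squared-error and slack terms) is minimized, as in the original TSVR; adding the regularization term bounds the norm of the regressor and thereby realizes the structural risk minimization principle. A symmetric QPP with parameters $\varepsilon_2$, $c_2$ yields the up-bound function.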