Abstract

In this paper, a novel regression algorithm, coined flexible support vector regression, is proposed. We first model the insensitive zone in classic support vector regression by its up- and down-bound functions, and then introduce a generalized parametric insensitive loss function (GPILF). Subsequently, based on the GPILF, we propose an optimization criterion such that the unknown regressor and its up- and down-bound functions can be found simultaneously by solving a single quadratic programming problem. Experimental results on several publicly available benchmark data sets and on time series prediction show the feasibility and effectiveness of the proposed method.
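For concreteness, one plausible form of such a parametric insensitive loss is sketched below; this is an illustrative assumption only, the exact GPILF is defined in the body of the paper, and the bound functions $g_u$ and $g_d$ are hypothetical names introduced here for illustration.

% Illustrative sketch, not the paper's definition: a parametric insensitive
% loss built from an assumed up-bound g_u(x) and down-bound g_d(x) around
% the regressor f(x). The classic \varepsilon-insensitive loss is recovered
% when g_u(x) = g_d(x) = \varepsilon.
\[
L\bigl(y, f(x)\bigr) = \max\bigl\{\, 0,\; y - f(x) - g_u(x),\; f(x) - y - g_d(x) \,\bigr\}
\]

Under this sketch, the loss is zero whenever the observation $y$ lies inside the zone $[f(x) - g_d(x),\, f(x) + g_u(x)]$, which is how a flexible, input-dependent insensitive zone generalizes the fixed-width tube of classic support vector regression.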