Abstract

This paper considers the problem of automatically optimizing support vector regression (SVR) for time series forecasting. The approach introduces auto-adaptive parameters $C_i$ and $\varepsilon_i$ to capture the non-uniform distribution of the information carried by the training data, develops a multiple kernel function $K_\sigma$ that rescales the different attributes of the input space, optimizes all of the parameters involved simultaneously with a genetic algorithm, and performs feature selection to reduce redundant information. Experimental results verify the feasibility of the proposed approach, called Model-optimizing SVR (MO-SVR), and demonstrate that it is a promising alternative for time series forecasting.
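As a rough illustration of the kind of pipeline the abstract describes, the sketch below tunes SVR hyperparameters for one-step-ahead forecasting with a simple genetic algorithm. It is a minimal sketch only: it assumes scikit-learn's standard `SVR` with an RBF kernel, a toy sine series, a sliding-window embedding, and a global $(C, \varepsilon, \gamma)$ search; the per-sample parameters $C_i$, $\varepsilon_i$, the multiple kernel $K_\sigma$, and the feature-selection step of MO-SVR are not reproduced here, since they are not exposed by off-the-shelf SVR implementations.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)

# Toy time series (illustrative stand-in for real forecasting data).
t = np.arange(400)
series = np.sin(0.1 * t) + 0.1 * rng.standard_normal(len(t))

def embed(series, lags=8):
    """Turn a 1-D series into (X, y) pairs of lagged inputs and next value."""
    X = np.column_stack([series[i:len(series) - lags + i] for i in range(lags)])
    y = series[lags:]
    return X, y

X, y = embed(series)
split = int(0.8 * len(y))
X_tr, y_tr, X_va, y_va = X[:split], y[:split], X[split:], y[split:]

# Assumed log10 search ranges for (C, epsilon, gamma).
BOUNDS = np.array([[-2.0, 4.0], [-4.0, 0.0], [-4.0, 1.0]])

def fitness(ind):
    """Negative validation MSE of an SVR with the encoded hyperparameters."""
    C, eps, gamma = 10.0 ** ind
    model = SVR(C=C, epsilon=eps, gamma=gamma).fit(X_tr, y_tr)
    return -mean_squared_error(y_va, model.predict(X_va))

def ga(pop_size=20, generations=15, mut_sigma=0.3):
    """Simple GA: truncation selection, uniform crossover, Gaussian mutation."""
    pop = rng.uniform(BOUNDS[:, 0], BOUNDS[:, 1], size=(pop_size, 3))
    for _ in range(generations):
        scores = np.array([fitness(ind) for ind in pop])
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = parents[rng.integers(len(parents), size=2)]
            child = np.where(rng.random(3) < 0.5, a, b)   # uniform crossover
            child += mut_sigma * rng.standard_normal(3)   # Gaussian mutation
            children.append(np.clip(child, BOUNDS[:, 0], BOUNDS[:, 1]))
        pop = np.vstack([parents, children])
    scores = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(scores)]

best = ga()
print("best (C, epsilon, gamma):", 10.0 ** best)
```

In this simplified setting the GA searches the hyperparameter space in log scale, which keeps the search well behaved across several orders of magnitude; the paper's method extends the same idea to a much larger joint search over per-sample and kernel parameters plus feature subsets.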