Abstract

In this paper, we present an Online Sequential Reduced Kernel Extreme Learning Machine (OS-RKELM). In OS-RKELM, only a small subset of the original training samples is employed to construct the kernel neurons, while the output weights are obtained analytically. Like the Online Sequential Extreme Learning Machine (OS-ELM), OS-RKELM learns data samples in a chunk-by-chunk or one-by-one mode and does not require archiving of a data sample once it has been learned. OS-RKELM also has few control parameters, thus avoiding cumbersome fine-tuning of the algorithm. OS-RKELM supports a wide range of kernel types as hidden neurons and is capable of addressing the singularity problem that arises when the initial training set is smaller than the number of neurons. A comprehensive performance evaluation of OS-RKELM against other state-of-the-art sequential learning algorithms, including OS-ELM, the Large-scale Active Support Vector Machine (LASVM) and the Budgeted Stochastic Gradient Descent Support Vector Machine (BSGD), on popular time series, regression and classification benchmarks has been conducted. The experimental results indicate that the proposed OS-RKELM achieves improved prediction accuracy and efficiency over OS-ELM, LASVM and BSGD in many cases.
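To make the core idea concrete, the following is a minimal sketch of the *batch* reduced-kernel ELM step that OS-RKELM builds upon: a small random subset of the training instances serves as kernel anchors (the "kernel neurons"), and the output weights are solved analytically in regularized least-squares form. This is an illustrative sketch only; the function names (`rkelm_fit`, `rkelm_predict`), the RBF kernel choice, and the hyperparameters `m`, `gamma`, and `C` are assumptions for illustration, not the paper's exact formulation, and the sequential (chunk-by-chunk) update of OS-RKELM is not shown here.

```python
import numpy as np

def rkelm_fit(X, T, m=20, gamma=1.0, C=1.0, seed=None):
    """Illustrative batch Reduced Kernel ELM (not the paper's exact method).

    X : (n, d) training inputs;  T : (n, o) training targets.
    m anchors are drawn at random from X to act as kernel neurons,
    and the output weights are obtained analytically via ridge regression.
    """
    rng = np.random.default_rng(seed)
    anchors = X[rng.choice(len(X), size=m, replace=False)]
    # RBF kernel matrix between all samples and the m anchors: shape (n, m)
    K = np.exp(-gamma * ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1))
    # Analytical output weights: beta = (K'K + I/C)^{-1} K'T
    beta = np.linalg.solve(K.T @ K + np.eye(m) / C, K.T @ T)
    return anchors, beta

def rkelm_predict(X, anchors, beta, gamma=1.0):
    """Evaluate the reduced-kernel model on new inputs."""
    K = np.exp(-gamma * ((X[:, None, :] - anchors[None, :, :]) ** 2).sum(-1))
    return K @ beta
```

Because only `m` anchors (rather than all `n` samples) define the kernel expansion, the linear system solved for the output weights is `m × m`, which is what keeps training and sequential updates tractable.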