Abstract

This paper investigates the design of a linear-in-the-parameters (LITP) regression classifier for two-class problems. Most existing algorithms learn a classifier (model) from the available training data based on some stopping criterion, such as Akaike's final prediction error (FPE). The drawback is that the resulting classifier is not constructed directly on the basis of its generalization capability. The main objective of this paper is to improve the sparsity and generalization capability of a classifier while reducing the computational expense of producing it. This is achieved by proposing an automatic two-stage locally regularized classifier construction (TSLRCC) method using the extreme learning machine (ELM). In the new algorithm, the nonlinear parameters of each term, such as the width of a Gaussian function or the power of a polynomial term, are first determined by the ELM. In the first stage, an initial classifier is then generated by directly evaluating these candidate models according to the leave-one-out (LOO) misclassification rate. In the second stage, the significance of each selected regressor term is checked and insignificant ones are replaced. To reduce the computational complexity, a proper regression context is defined that allows fast implementation of the proposed method. Simulation results confirm the effectiveness of the proposed technique.
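To make the two-stage idea concrete, the following is a minimal sketch of the workflow described above, not the authors' exact TSLRCC implementation: ELM-style random assignment of the Gaussian nonlinear parameters, a first stage of forward selection driven by the closed-form LOO misclassification rate, and a second stage that re-checks and replaces insignificant terms. All function names, the ridge parameter `lam`, and the width range are illustrative assumptions.

```python
# Hedged sketch of a two-stage, LOO-driven LITP classifier construction.
# Not the paper's exact algorithm; parameter choices are assumptions.
import numpy as np

def gaussian_candidates(X, n_candidates, rng):
    """ELM-style candidate regressors: centres drawn from the training data,
    widths drawn at random (the nonlinear parameters are not optimised)."""
    centres = X[rng.integers(0, len(X), n_candidates)]
    widths = rng.uniform(0.5, 2.0, n_candidates)            # assumed range
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * widths ** 2))                 # n_samples x n_candidates

def loo_misclassification(P, y, lam=1e-3):
    """LOO misclassification rate of a regularised LITP model, using the
    closed-form LOO residual e_i / (1 - h_ii); labels y are in {-1, +1}."""
    A = P.T @ P + lam * np.eye(P.shape[1])
    theta = np.linalg.solve(A, P.T @ y)
    H_diag = np.einsum('ij,ji->i', P, np.linalg.solve(A, P.T))
    loo_pred = y - (y - P @ theta) / (1.0 - H_diag)
    return np.mean(y * loo_pred <= 0)

def tslrcc_sketch(X, y, n_candidates=50, seed=0):
    rng = np.random.default_rng(seed)
    Phi = gaussian_candidates(X, n_candidates, rng)

    # Stage 1: forward selection driven directly by the LOO error rate.
    selected, best = [], np.inf
    while True:
        scores = [(loo_misclassification(Phi[:, selected + [j]], y), j)
                  for j in range(n_candidates) if j not in selected]
        if not scores:
            break
        score, j = min(scores)
        if score >= best:           # stop when the LOO rate no longer improves
            break
        best, selected = score, selected + [j]

    # Stage 2: re-check each selected term and swap in a better candidate.
    for pos in range(len(selected)):
        for j in range(n_candidates):
            if j in selected:
                continue
            trial = selected.copy()
            trial[pos] = j
            score = loo_misclassification(Phi[:, trial], y)
            if score < best:
                best, selected = score, trial
    return selected, best
```

The key computational point mirrored here is that the LOO criterion is evaluated in closed form from the hat-matrix diagonal, so no explicit n-fold refitting is needed when scoring each candidate model.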