Abstract

Support vector machines (SVMs) are among the most powerful classifiers available today, but this power comes at a substantial computational cost. Nearest neighbor (NN) classifiers, in contrast, offer a simple yet robust approach that is guaranteed to yield a result. In this paper, we present a technique that combines the two by adopting an NN rule-based structural risk minimization (NNSRM) classifier. On synthetic and real data, the proposed technique is shown to be more robust to kernel conditions and to incur a significantly lower computational cost than conventional SVMs. It therefore offers a practical alternative to SVMs in applications where both computation time and accuracy are of prime importance. Experimental results indicate that the NNSRM formulation is not only computationally less expensive but also far more robust to varying data representations than SVMs.
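
The abstract does not spell out the NNSRM selection rule, but the core idea of replacing the SVM's quadratic program with an NN rule over a small reference set can be sketched as follows. This is a minimal illustration, assuming (hypothetically) that the reference set is grown greedily from the closest opposite-class training pairs until the NN rule reproduces all training labels; the names `nnsrm_fit` and `nn_predict` and the use of NumPy are illustrative choices, not the paper's implementation.

```python
# Hedged sketch of an NN rule-based structural risk minimization classifier.
# Assumption: the reference set is built from the closest opposite-class
# pairs (loosely analogous to an SVM's support vectors) until the 1-NN rule
# classifies all training points correctly. The paper's exact rule and risk
# bound are not given in the abstract.
import numpy as np


def nn_predict(refs_X, refs_y, X):
    """Classify each row of X by its single nearest reference point."""
    # Pairwise squared Euclidean distances, shape (n_test, n_refs).
    d2 = ((X[:, None, :] - refs_X[None, :, :]) ** 2).sum(axis=2)
    return refs_y[d2.argmin(axis=1)]


def nnsrm_fit(X, y):
    """Greedily select a small reference set from opposite-class pairs."""
    pos, neg = np.where(y == 1)[0], np.where(y == -1)[0]
    # All cross-class index pairs, sorted by distance (closest first).
    pairs = [(np.linalg.norm(X[i] - X[j]), i, j) for i in pos for j in neg]
    pairs.sort(key=lambda t: t[0])

    chosen = []
    for _, i, j in pairs:
        for k in (i, j):
            if k not in chosen:
                chosen.append(k)
        refs = np.array(chosen)
        # Stop once the NN rule over the reference set fits the training data.
        if np.all(nn_predict(X[refs], y[refs], X) == y):
            break
    refs = np.array(chosen)
    return X[refs], y[refs]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
    y = np.array([-1] * 50 + [1] * 50)
    refs_X, refs_y = nnsrm_fit(X, y)
    acc = (nn_predict(refs_X, refs_y, X) == y).mean()
    print(f"kept {len(refs_y)} of {len(y)} points, training accuracy {acc:.2f}")
```

The sketch keeps only the boundary-adjacent points, which is where the claimed computational savings over a full SVM training pass would come from; the actual selection criterion in the paper may differ.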

  • Publication date: 2004-1-5