Abstract

Margin-based feature extraction has attracted considerable attention in machine learning and pattern recognition. In this paper, we present a novel feature extraction method called Adaptive Margin Maximization (AMM), in which a margin is defined to measure the discrimination ability of the features. The motivation comes principally from the iterative weight-modification mechanism of the powerful boosting algorithms. In AMM, the samples are dynamically weighted and features are learned sequentially. After each new feature is learned by maximizing the weighted total margin of the data, the weights are updated so that samples with smaller margins receive larger weights. The feature learned in the next round thus adaptively concentrates more on these "hard" samples. We show that when the data are projected onto the feature space learned by AMM, most examples have large margins, and therefore the nearest neighbor classifier yields a small generalization error. This is in contrast to existing margin-maximization-based feature extraction approaches, whose goal is only to maximize the total margin. Extensive experimental results on benchmark datasets demonstrate the effectiveness of our method.
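The boosting-style loop described above (weight samples, learn one feature, up-weight small-margin samples, repeat) can be sketched as follows. This is a minimal illustration only: the hypothesis-margin here is the standard nearest-miss minus nearest-hit distance, and the per-round "weighted margin maximization" step is replaced by a simple weighted between-class mean-difference surrogate, since the paper's actual optimizer is not given in the abstract. The function names `nn_margin` and `amm_sketch` and the parameter `beta` are hypothetical.

```python
import numpy as np

def nn_margin(Z, y, i):
    """Hypothesis-margin of sample i in the projected space Z:
    distance to its nearest miss minus distance to its nearest hit."""
    d = np.linalg.norm(Z - Z[i], axis=1)
    d[i] = np.inf  # exclude the sample itself
    hit = d[y == y[i]].min()   # nearest same-class neighbor
    miss = d[y != y[i]].min()  # nearest different-class neighbor
    return miss - hit

def amm_sketch(X, y, n_features=2, beta=1.0):
    """Boosting-style AMM loop (illustrative sketch, two-class case).

    Each round learns one direction from the weighted data, projects,
    computes per-sample margins, and up-weights small-margin samples.
    """
    n, _ = X.shape
    w = np.ones(n) / n          # uniform initial sample weights
    directions = []
    classes = np.unique(y)
    for _ in range(n_features):
        # Toy surrogate for "maximize the weighted total margin":
        # weighted difference of the two class means (an assumption,
        # not the paper's actual objective).
        mu = [np.average(X[y == c], axis=0, weights=w[y == c])
              for c in classes]
        v = mu[0] - mu[1]
        v /= np.linalg.norm(v) + 1e-12
        directions.append(v)
        # Project onto all features learned so far, then re-weight:
        # samples with smaller margins receive larger weights.
        Z = X @ np.array(directions).T
        margins = np.array([nn_margin(Z, y, i) for i in range(n)])
        w = np.exp(-beta * margins)
        w /= w.sum()
    return np.array(directions).T  # shape (d, n_features)
```

A usage note: on well-separated two-class data, the learned projection keeps the classes apart, so most projected samples have positive (large) hypothesis-margins, which is the property the abstract argues for.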

Full Text