Abstract

A sparse and low-rank near-isometric linear embedding (SLRNILE) method has been proposed to reduce dimensionality and extract suitable features for hyperspectral imagery (HSI) classification. SLRNILE builds on the Johnson-Lindenstrauss lemma and estimates a sparse and low-rank projection matrix that satisfies the restricted isometry property (RIP) condition on all secants of the HSI data. The RIP condition guarantees that the desired linear mapping near-isometrically preserves the nearest neighbors of all HSI pixels. Seeking the desired mapping is then cast as minimizing a Lagrange multiplier formulation. The alternating direction method of multipliers framework is used to solve this convex program, and column generation techniques are adopted to alleviate the memory burden of the optimization procedure. Five experiments on three widely used HSI data sets are designed to thoroughly evaluate the performance of SLRNILE, and the results are compared against those of six state-of-the-art feature extraction methods: principal component analysis, Laplacian eigenmaps, locality preserving projections, neighborhood preserving embedding, sparse nonnegative matrix underapproximation, and random projections. The results show that SLRNILE performs best among the seven methods; its computational time is the longest of all but remains acceptable for regular users. SLRNILE is therefore a good choice for feature extraction in HSI classification.
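For clarity, the near-isometry requirement on secants referred to above can be stated in its standard form; the notation here is generic rather than taken from the paper, with $P$ denoting the learned projection matrix, $x_i$ and $x_j$ denoting HSI pixels, and $\delta$ the isometry constant:

$$(1-\delta)\,\|x_i - x_j\|_2^2 \;\le\; \|P(x_i - x_j)\|_2^2 \;\le\; (1+\delta)\,\|x_i - x_j\|_2^2 \qquad \text{for all } i \ne j .$$

In words, the projection is allowed to shrink or stretch the distance between any two pixels only by a factor controlled by $\delta$, which is what preserves nearest-neighbor relationships after dimensionality reduction.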