Abstract

Recently, compressed sensing has been widely applied to various areas such as signal processing, machine learning, and pattern recognition. To find the sparse representation of a vector with respect to a dictionary, a convex $\ell_1$ minimization problem is usually solved in order to avoid the computational intractability of directly minimizing the number of nonzero entries. However, to guarantee that the $\ell_1$ minimizer is close to the sparsest solution, strong incoherence conditions must be imposed on the dictionary. In comparison, nonconvex minimization problems such as those with $\ell_p$ ($0 < p < 1$) penalties require much weaker incoherence conditions and a smaller signal-to-noise ratio to guarantee successful recovery. Hence $\ell_p$ ($0 < p < 1$) regularization serves as a better alternative to the popular $\ell_1$ one. In this paper, we review several typical algorithms for $\ell_p$ minimization, namely Iteratively Reweighted $\ell_1$ minimization (IRL1), Iteratively Reweighted Least Squares (IRLS) together with its general form, General Iteratively Reweighted Least Squares (GIRLS), and the Iterative Thresholding Method (ITM), and conduct a comprehensive comparison among them, in which IRLS is identified as having the best recovery performance while also being the fastest.
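
For reference, the $\ell_p$ regularized problem discussed above can be written in the following standard form (the notation is not taken from the abstract itself: $A$ denotes the dictionary or measurement matrix, $b$ the observed vector, and $\lambda > 0$ a regularization parameter):

\[
\min_{x \in \mathbb{R}^n} \ \frac{1}{2}\|Ax - b\|_2^2 + \lambda \|x\|_p^p,
\qquad \|x\|_p^p = \sum_{i=1}^{n} |x_i|^p, \quad 0 < p < 1.
\]

As a rough sketch of the IRLS idea, one common convention is that each iteration solves a weighted least-squares surrogate of the above problem, with a smoothing parameter $\epsilon_k > 0$ whose update schedule, and the constant factors in the weights, vary from algorithm to algorithm:

\[
x^{(k+1)} = \operatorname*{arg\,min}_{x} \ \frac{1}{2}\|Ax - b\|_2^2 + \lambda \sum_{i=1}^{n} w_i^{(k)} x_i^2,
\qquad w_i^{(k)} = \frac{p}{2}\Bigl(\bigl(x_i^{(k)}\bigr)^2 + \epsilon_k\Bigr)^{\frac{p}{2}-1}.
\]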