Abstract

In many practical situations, it is highly desirable to estimate an accurate mathematical model of a real system using as few parameters as possible. At the same time, the need for an accurate description of the system behavior without knowing its complete dynamical structure often leads to model parameterizations describing a rich set of possible hypotheses; this unavoidable choice suggests that the desired parameter estimate should be sparse. An elegant way to impose this expectation of sparsity is to estimate the parameters by penalizing the criterion with the ℓ0 "norm" of the parameter vector. Because the ℓ0 "norm" is non-convex, this penalization is often implemented by solving an optimization program based on a convex relaxation (e.g., the ℓ1 norm as in the LASSO, or the nuclear norm). Two difficulties arise when applying these methods: (1) cross-validation, or some related technique, is needed to choose the values of the regularization parameters associated with the ℓ1 penalty; and (2) the (unpenalized) cost function must be convex. To address the first issue, we propose a new technique for sparse linear regression, called SPARSEVA, with close ties to the LASSO (least absolute shrinkage and selection operator), which tunes the amount of regularization automatically. The second difficulty, which imposes a severe constraint on the types of model structures and estimation methods to which the ℓ1 relaxation can be applied, is addressed by combining SPARSEVA with the Steiglitz-McBride method. To demonstrate the advantages of the proposed approach, a solid theoretical analysis and an extensive simulation study are provided.
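To make the contrast concrete, the sketch below (not the authors' code) compares the two ideas in the abstract: a LASSO fit, whose penalty weight lambda ordinarily requires cross-validation, and a SPARSEVA-style constrained program that minimizes the ℓ1 norm subject to the fit being no worse than the least-squares fit by a factor (1 + eps). The automatic choice eps = dim(theta)/N is an AIC-like rule reported in the SPARSEVA literature; treat the exact formulation and tolerance as assumptions rather than the paper's precise method. The data-generation setup (N, p, noise level) is purely illustrative.

```python
import numpy as np
import cvxpy as cp

# Illustrative sparse linear-regression problem: y = Phi @ theta_true + noise,
# where only a few entries of theta_true are nonzero.
rng = np.random.default_rng(0)
N, p = 200, 20
Phi = rng.standard_normal((N, p))
theta_true = np.zeros(p)
theta_true[:3] = [1.5, -2.0, 0.8]           # sparse ground truth
y = Phi @ theta_true + 0.1 * rng.standard_normal(N)

# --- LASSO: min ||y - Phi t||^2 + lam * ||t||_1 -------------------------
# The weight lam must be tuned, typically by cross-validation.
lam = 1.0                                    # placeholder value, not tuned
t = cp.Variable(p)
cp.Problem(cp.Minimize(cp.sum_squares(y - Phi @ t) + lam * cp.norm1(t))).solve()
theta_lasso = t.value

# --- SPARSEVA-style: min ||t||_1  s.t.  V(t) <= (1 + eps) * V(theta_LS) --
# The tolerance eps is set automatically from the data size, removing the
# need for cross-validation (assumed AIC-like rule: eps = p / N).
theta_ls, *_ = np.linalg.lstsq(Phi, y, rcond=None)
V_ls = np.sum((y - Phi @ theta_ls) ** 2)     # least-squares cost
eps = p / N                                  # automatic, data-driven tolerance
s = cp.Variable(p)
cp.Problem(cp.Minimize(cp.norm1(s)),
           [cp.sum_squares(y - Phi @ s) <= (1 + eps) * V_ls]).solve()
theta_sparseva = s.value
```

Both programs are convex, which is why the abstract's second difficulty arises: for model structures whose prediction-error cost is non-convex, this constrained formulation cannot be applied directly, motivating the combination with the Steiglitz-McBride method.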

  • Publication date: 2014-11