Abstract

This paper first defines a new cost function that introduces a minimum-disturbance (MD) constraint into the conventional recursive least squares (RLS) algorithm together with a sparsity-promoting penalty. A variable regularization factor is then employed to control the contributions of both the MD constraint and the sparsity-promoting penalty to the new cost function. Analysis shows that this regularization factor effectively controls the forgetting factor of the RLS, so that low misalignment and fast tracking can both be achieved by adjusting the regularization factor, much as in a variable forgetting factor RLS (VFF-RLS). It is further demonstrated that the regularization factor can accelerate the convergence of the RLS, especially for sparse filtering, which makes tuning the regularization factor more effective than tuning the forgetting factor in such cases. Finally, a method for automatically determining the regularization factor, requiring no prior knowledge, is presented. To verify the analytical results via simulations, a benchmark VFF-RLS algorithm with a sparsity-promoting penalty is conceived, in which the forgetting factor is adjusted manually using foreknowledge of the system changes to obtain the best performance. The proposed algorithm exhibits lower misalignment and greater robustness than the benchmark algorithm when a sparse system is identified, whereas the two algorithms perform almost equivalently when the system is non-sparse.
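For reference, the conventional exponentially weighted RLS that the proposed method builds on can be sketched as below. This is a minimal baseline implementation for FIR system identification, not the proposed MD-constrained, sparsity-penalized variant; the filter length, forgetting factor `lam`, and initialization constant `delta` are illustrative choices.

```python
import numpy as np

def rls_identify(x, d, n_taps, lam=0.99, delta=1e2):
    """Conventional exponentially weighted RLS for FIR system identification.

    lam is the forgetting factor; delta initializes the inverse
    correlation matrix P = delta * I. (Baseline sketch only, not the
    MD-constrained sparsity-penalized algorithm proposed in the paper.)
    """
    w = np.zeros(n_taps)
    P = delta * np.eye(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]     # regressor, most recent sample first
        k = P @ u / (lam + u @ P @ u)         # gain vector
        e = d[n] - w @ u                      # a priori estimation error
        w = w + k * e                         # coefficient update
        P = (P - np.outer(k, u @ P)) / lam    # inverse-correlation update
    return w

# Identify a short sparse FIR system from noisy observations.
rng = np.random.default_rng(0)
h = np.zeros(8); h[2] = 1.0; h[5] = -0.5      # sparse "true" system (hypothetical)
x = rng.standard_normal(4000)
d = np.convolve(x, h)[:len(x)] + 1e-3 * rng.standard_normal(len(x))
w = rls_identify(x, d, n_taps=8)
print(np.allclose(w, h, atol=1e-2))           # prints True
```

A smaller `lam` tracks system changes faster but raises steady-state misalignment; the paper's point is that, for sparse systems, adapting the regularization factor is a more effective handle on this trade-off than adapting `lam` itself.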