Abstract

This paper is concerned with the minimal norm least squares solution of general linear matrix equations, which include the well-known Lyapunov and Sylvester matrix equations as special cases. Two iterative algorithms are proposed to solve this problem. The first is based on the gradient search principle for solving optimization problems, and the second can be regarded as its dual. For both algorithms, necessary and sufficient conditions guaranteeing convergence are presented, and the step sizes that maximize the convergence rates are characterized in terms of the singular values of an associated coefficient matrix. The proposed methods are expected to play an important role in many analysis and design problems in systems theory.
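
To illustrate the kind of gradient-based iteration the abstract describes, the following is a minimal sketch, not the paper's exact algorithm, using the Sylvester equation AX + XB = C as a representative special case of a general linear matrix equation. The function name, the zero initial guess, and the step-size formula based on the extreme nonzero singular values of the Kronecker coefficient matrix are illustrative assumptions.

```python
# Minimal sketch (assumptions noted above): gradient iteration for the
# least squares problem min_X ||A X + X B - C||_F, with the Sylvester
# equation A X + X B = C as a representative special case.
import numpy as np

def gradient_iteration(A, B, C, mu=None, iters=500):
    """Gradient descent on f(X) = 0.5 * ||A X + X B - C||_F^2.

    If mu is None, a step size is taken from the singular values of the
    Kronecker coefficient matrix M = I (x) A + B^T (x) I; for plain
    gradient descent on the vectorized least squares problem,
    mu = 2 / (sigma_max(M)^2 + sigma_min(M)^2), with sigma_min the
    smallest nonzero singular value, maximizes the convergence rate.
    """
    n, m = C.shape
    if mu is None:
        M = np.kron(np.eye(m), A) + np.kron(B.T, np.eye(n))
        s = np.linalg.svd(M, compute_uv=False)
        s = s[s > 1e-12 * s.max()]          # keep only nonzero singular values
        mu = 2.0 / (s.max() ** 2 + s.min() ** 2)
    X = np.zeros((n, m))    # starting from zero, the iterates converge to the
                            # minimal norm least squares solution
    for _ in range(iters):
        R = A @ X + X @ B - C                 # residual of the matrix equation
        X = X - mu * (A.T @ R + R @ B.T)      # gradient of f at X is A^T R + R B^T
    return X

# Usage: a small random example with a known exact solution.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((3, 3))
X_true = rng.standard_normal((4, 3))
C = A @ X_true + X_true @ B
X = gradient_iteration(A, B, C)
print(np.linalg.norm(A @ X + X @ B - C))      # residual should be near zero
```

The sketch mirrors the structure suggested by the abstract: a descent step driven by the residual of the matrix equation, with the step size tied to the singular values of the coefficient matrix; the paper's second (dual) algorithm is not sketched here.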