Abstract

Wang proposed a gradient-based neural network (GNN) for online matrix inversion. Global asymptotic convergence was shown for this neural network when applied to inverting nonsingular matrices. Going beyond the previously presented asymptotic convergence, this paper investigates more desirable properties of the gradient-based neural network, e.g., global exponential convergence for nonsingular matrix inversion, and global stability even in the singular-matrix case. Illustrative simulation results further substantiate the theoretical analysis of the gradient-based neural network for online matrix inversion.
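The gradient-based dynamics discussed above can be illustrated with a minimal numerical sketch. The sketch assumes the standard GNN model, which descends the energy function E(X) = ||AX − I||²_F / 2 via dX/dt = −γ Aᵀ(AX − I); the function name, step size, and gain γ below are illustrative choices, not values from the paper.

```python
import numpy as np

def gnn_invert(A, gamma=10.0, dt=1e-3, steps=20000):
    """Euler-integrate the gradient dynamics dX/dt = -gamma * A.T (A X - I),
    which descend the energy E(X) = ||A X - I||_F^2 / 2.
    For nonsingular A, the state X(t) converges to inv(A)."""
    n = A.shape[0]
    I = np.eye(n)
    X = np.zeros((n, n))          # illustrative zero initial state
    for _ in range(steps):
        X = X - dt * gamma * (A.T @ (A @ X - I))
    return X

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])        # nonsingular test matrix
X = gnn_invert(A)
print(np.allclose(X, np.linalg.inv(A), atol=1e-4))  # → True
```

For this well-conditioned example the state settles near A⁻¹; the exponential convergence rate discussed in the paper scales with the gain γ and the smallest eigenvalue of AᵀA, so ill-conditioned matrices converge more slowly.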