Abstract

This paper considers the global output convergence of Cohen-Grossberg neural networks with both time-varying and distributed delays. The inputs of the neural networks are allowed to be time-varying, and the activation functions are required to be globally Lipschitz continuous and monotonically nondecreasing. Based on M-matrix theory, several sufficient conditions are established to guarantee the global output convergence of this class of neural networks. Neither symmetry of the connection weight matrices nor boundedness of the activation functions is assumed in this paper. The convergence results are useful in solving some optimization problems and in designing Cohen-Grossberg neural networks with both time-varying and distributed delays. Two examples are given to illustrate the effectiveness of our results.