Abstract

In this paper, we present a general analysis of the global convergence of recurrent neural networks (RNNs) with projection mappings in the critical case, where M(L, Gamma), a matrix determined by the weight matrices and the activation mappings of the networks, is nonnegative definite for some positive diagonal matrix Gamma. Many stability results have been obtained for RNNs in the noncritical case, where M(L, Gamma) is positive definite; in contrast, only a few results have been established under the critical condition. Compared with the existing critical studies, the critical stability results presented in this paper require no additional assumptions on the weight matrices, apply to RNNs with general projection mappings rather than only nearest-point projection mappings, and cover both of the two fundamental RNN models. The results established for several typical RNN models unify, sharpen, or generalize most of the existing stability assertions. Two examples are given to demonstrate both the theoretical importance and the practical feasibility of the critical results obtained.
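To make the distinction between the critical and noncritical cases concrete, the sketch below classifies a given matrix by the smallest eigenvalue of its symmetric part: positive definiteness corresponds to the noncritical case, while mere nonnegative definiteness corresponds to the critical case studied here. The toy 2x2 matrices are illustrative stand-ins only; the actual M(L, Gamma) is built from the weight matrices and the activation mappings as defined in the body of the paper, and the choice of Gamma is likewise problem-specific.

```python
import numpy as np

def classify_definiteness(A, tol=1e-10):
    """Classify a (not necessarily symmetric) matrix via its symmetric part:
    positive definite -> noncritical case, nonnegative definite -> critical case."""
    eig_min = np.min(np.linalg.eigvalsh((A + A.T) / 2.0))
    if eig_min > tol:
        return "positive definite (noncritical case)"
    if eig_min >= -tol:
        return "nonnegative definite (critical case)"
    return "indefinite (neither condition holds)"

# Hypothetical stand-ins for M(L, Gamma); not taken from the paper.
M_critical = np.array([[1.0, -1.0], [-1.0, 1.0]])     # eigenvalues {0, 2}: critical case
M_noncritical = np.array([[2.0, -1.0], [-1.0, 2.0]])  # eigenvalues {1, 3}: noncritical case
print(classify_definiteness(M_critical))
print(classify_definiteness(M_noncritical))
```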