Abstract

This paper shows that globally exponentially stable neural networks with time-varying delay (DNNs) subject to bounded noise may converge faster than their noise-free counterparts. The influence of noise on the global exponential stability of DNNs is analyzed quantitatively. By comparing the upper bounds of the noise intensity with the coefficients of global exponential stability, we deduce that noise can further accelerate the exponential decay of DNNs. The upper bounds of the noise intensity are characterized by solving transcendental equations containing adjustable parameters. In addition, a numerical example is provided to illustrate the theoretical results.