Abstract

After the sigmoid activation function is replaced with a piecewise linear activation function, the search space of the adding decaying self-feedback continuous Hopfield neural network (ADSCHNN) becomes a hyper-cube, yielding the simplified ADSCHNN. A convergence analysis is then given for the simplified ADSCHNN in the hyper-cube space. This analysis proves that the ADSCHNN outperforms the continuous Hopfield neural network (CHNN) when both are applied to solving optimization problems. It is also proved that, when the ADSCHNN is applied to the traveling salesman problem (TSP), a negative extra self-feedback is more effective than a positive one.