Abstract

Associative memory (AM) is an important part of neural network theory. Although the Hebbian learning rule is commonly used to model associative memory, its linear outer-product form easily produces spurious states. In this work, a nonlinear function constitution and dynamic synapses are proposed to suppress spurious states in associative memory neural networks. The model of the dynamic connection weights and the updating scheme for the neuron states are presented. The nonlinear function constitution extends the conventional Hebbian learning rule to a nonlinear outer-product method. Simulation results show that both the nonlinear function constitution and the dynamic synapses effectively enlarge the attractive basins of the stored patterns. Compared with existing memory models, an associative memory network with the nonlinear function constitution both enlarges the attractive basins and increases the storage capacity. With dynamic synapses, the attractive basins of the stored patterns are further enlarged while the attractive basins of the spurious states are diminished, although the storage capacity is reduced.
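
The abstract does not specify the exact nonlinear function or the dynamic-synapse rule, so the following is only a minimal Hopfield-style sketch of the contrast it describes: a linear Hebbian outer-product rule versus a nonlinear variant. The choice of tanh as the nonlinearity, the pattern sizes, and all function names are illustrative assumptions, not the authors' method.

```python
import numpy as np

def hebbian_weights(patterns):
    """Conventional (linear) Hebbian rule: W = (1/N) * sum_mu xi^mu (xi^mu)^T."""
    N = patterns.shape[1]
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)          # no self-connections
    return W

def nonlinear_hebbian_weights(patterns, f=np.tanh):
    """Hypothetical nonlinear outer-product rule: apply a nonlinearity f
    (tanh here is a placeholder assumption) to the accumulated Hebbian sum."""
    N = patterns.shape[1]
    W = f(patterns.T @ patterns / N)
    np.fill_diagonal(W, 0.0)
    return W

def recall(W, probe, steps=20):
    """Synchronous update of bipolar (+1/-1) neuron states until a fixed point."""
    s = probe.copy()
    for _ in range(steps):
        s_new = np.sign(W @ s)
        s_new[s_new == 0] = 1         # break ties toward +1
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Toy usage: store 3 random bipolar patterns of 100 neurons, recall from a noisy probe.
rng = np.random.default_rng(0)
patterns = rng.choice([-1, 1], size=(3, 100))
W = nonlinear_hebbian_weights(patterns)
probe = patterns[0].copy()
flip = rng.choice(100, size=10, replace=False)
probe[flip] *= -1                     # corrupt 10% of the bits
recovered = recall(W, probe)
print("overlap with stored pattern:", recovered @ patterns[0] / 100)
```

The size of the attractive basin can then be probed empirically, e.g. by increasing the number of flipped bits and measuring how often recall still converges to the stored pattern.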