Abstract

Almost all existing continuous neural networks for associative memory are based on optimizing a quadratic function. The drawback is that their structure is complicated and the circuit implementation does not coincide with the theoretical analysis. In this paper, a new linear neural network for associative memory is presented. Simple in nature, the proposed network is proved to be an ideal associative memory. The main difference from other networks is that each pattern to be recognized is used as a parameter of the network rather than as an initial point. The main advantage is that the theoretical analysis coincides precisely with the straightforward implementation, and the network can clearly explain the meaning of each rejected pattern.
The model considered herein is intrinsically a neural network for solving linear optimization problems with hybrid constraints, but its state is allowed to lie in a general closed convex set spanned by the prototype patterns. The performance of the proposed network is demonstrated by simulation of a numerical example.
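To illustrate the general idea of storing prototype patterns in a linear map, the following is a minimal sketch of a *classical* pseudoinverse linear associative memory. This is a textbook construction, not the specific network proposed in the paper (whose dynamics and constraint handling are not detailed in the abstract); it only shows how a linear operator can recall patterns lying in the span of the stored prototypes.

```python
import numpy as np

def build_memory(patterns):
    """Return a weight matrix W with W @ p == p for each stored pattern p.

    patterns: (d, m) array whose columns are the m prototype patterns.
    W = P P^+ is the orthogonal projector onto the span of the columns of P.
    """
    P = np.asarray(patterns, dtype=float)
    return P @ np.linalg.pinv(P)

def recall(W, x):
    """Project a (possibly noisy) probe x onto the stored pattern space."""
    return W @ np.asarray(x, dtype=float)

# Store two 4-dimensional prototype patterns as columns of P.
P = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0],
              [0.0, 0.0]])
W = build_memory(P)

# Any stored pattern is recovered exactly, since it lies in span(P).
print(np.allclose(recall(W, P[:, 0]), P[:, 0]))  # True
```

Unlike this static projection, the network described above treats each pattern to be recognized as a parameter of the network, which is what allows it to account for rejected patterns.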

  • Publication date: 2005-12-15