Abstract

Sparse connectivity is ubiquitous in real networked complex systems. Beyond being a consequence of economic constraints, does sparseness play any further role in the functioning of networked systems? Here we study how network connection density affects the difference in computational performance between Hopfield attractor networks built on random and scale-free topologies. Using computer simulations, we find that the difference in pattern stability between random networks and scale-free networks is maximized at a specific sparse connection density. We also examine the stability of partial patterns encoded by the highest-degree nodes; the advantage of the scale-free network in partial-pattern stability is likewise maximal on sparse networks. Through a signal-to-noise-ratio analysis, we show that the non-monotonic behavior of the difference arises from competition between the distinctiveness of the degree distributions and the signal strength, and that the density at which the difference in computational performance is maximal lies in the sparse regime.
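As an illustration of the kind of simulation described above (not the authors' actual code), the following is a minimal Python sketch: Hebbian weights are restricted to the edges of a sparse Erdős-Rényi or Barabási-Albert graph, and pattern stability is measured as the overlap with a stored pattern after synchronous sign updates. All parameter values (`N`, `P`, `MEAN_DEGREE`, `STEPS`) and the choice of update rule are illustrative assumptions.

```python
import numpy as np
import networkx as nx

# Illustrative parameters; the paper's actual settings are not given in the abstract.
N = 1000          # number of neurons (nodes)
P = 3             # number of stored random patterns
MEAN_DEGREE = 10  # sparse density: mean degree / (N - 1) << 1
STEPS = 20        # synchronous update sweeps
rng = np.random.default_rng(0)


def pattern_stability(graph, patterns, steps=STEPS):
    """Average overlap with a stored pattern after relaxing the network from it."""
    A = nx.to_numpy_array(graph)                   # 0/1 adjacency mask
    W = (patterns.T @ patterns) / patterns.shape[1]
    W *= A                                         # keep Hebbian weights only on existing edges
    np.fill_diagonal(W, 0.0)
    overlaps = []
    for xi in patterns:
        s = xi.astype(float).copy()
        for _ in range(steps):
            s = np.sign(W @ s)
            s[s == 0] = 1.0                        # break zero-field ties
        overlaps.append(np.mean(xi * s))           # overlap m = (1/N) * sum_i xi_i s_i
    return float(np.mean(overlaps))


# Random +/-1 patterns, one per row.
patterns = rng.choice([-1, 1], size=(P, N))

# Sparse random (Erdős-Rényi) vs. scale-free (Barabási-Albert) graphs with similar mean degree.
er = nx.gnp_random_graph(N, MEAN_DEGREE / (N - 1), seed=1)
ba = nx.barabasi_albert_graph(N, MEAN_DEGREE // 2, seed=1)

print("ER overlap:", pattern_stability(er, patterns))
print("BA overlap:", pattern_stability(ba, patterns))
```

Sweeping `MEAN_DEGREE` and comparing the two overlap curves would give a rough picture of how the performance gap between the topologies depends on connection density.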