Abstract

The spatial architecture neural network (SANN), inspired by the connection patterns of excitatory pyramidal neurons and inhibitory interneurons in the neocortex, is a multilayer artificial neural network that achieves good learning accuracy and generalization in real applications. However, its backpropagation-based learning algorithm (BP-SANN) can suffer from high training time and slow convergence. In this paper, a fast and accurate two-phase sequential learning scheme for SANN is introduced to guarantee network performance. With this new learning approach (SFSL-SANN), only the weights connecting to the output neurons are trained. In the first phase, a least-squares method estimates the span-output-weights based on fixed, randomly generated initial weight values. An improved iterative learning algorithm then learns the feedforward-output-weights in the second phase. SFSL-SANN is compared in detail with BP-SANN and other popular neural network approaches on benchmark classification, regression, and time-series prediction problems. The results demonstrate that SFSL-SANN converges faster and requires less training time than BP-SANN, and yields better learning accuracy and generalization performance than the other approaches.
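To make the two-phase idea concrete, the following is a minimal sketch, not the paper's actual algorithm: it assumes a generic single-hidden-layer network in which the hidden weights stay fixed at random initial values, one set of output-side weights (standing in for the span-output-weights) is solved by least squares, and a second set (standing in for the feedforward-output-weights) is then refined iteratively by gradient descent. All names, dimensions, and the loss are illustrative assumptions; the true SANN topology and update rules are defined in the body of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: N samples, d inputs, 1 target (illustrative only).
N, d, h = 200, 5, 30
X = rng.normal(size=(N, d))
y = np.sin(X.sum(axis=1, keepdims=True))

# Fixed, randomly generated input-to-hidden weights (never updated).
W_in = rng.normal(size=(d, h))
b_in = rng.normal(size=h)
H = np.tanh(X @ W_in + b_in)          # hidden activations

# Phase 1: least-squares estimate of one set of output weights
# (a stand-in for the paper's span-output-weights).
W_span, *_ = np.linalg.lstsq(H, y, rcond=None)

# Phase 2: iterative refinement of a second set of output weights
# (a stand-in for the feedforward-output-weights), with the
# phase-1 weights held fixed.
W_ff = np.zeros((d, 1))
lr = 0.01
for _ in range(500):
    pred = H @ W_span + X @ W_ff      # combined network output
    grad = X.T @ (pred - y) / N       # MSE gradient w.r.t. W_ff only
    W_ff -= lr * grad

print("training MSE:", float(np.mean((H @ W_span + X @ W_ff - y) ** 2)))
```

Because only the output-side weights are adjusted and the first phase reduces to a linear least-squares problem, training in this style is typically much cheaper than full backpropagation through all layers, which is the source of the speed advantage claimed over BP-SANN.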