Abstract

The Elman Neural Network (ENN) has found numerous applications in areas such as time series prediction, system identification, and adaptive control because of its powerful dynamic memory. However, a problem often associated with this type of network is the local minima problem, which usually arises during learning. To address this problem and to speed up convergence, we propose an improved learning algorithm that adds a term to the error function related to the saturation of the hidden-layer neurons. The activation functions are adapted so that hidden-layer neurons do not get stuck in the deeply saturated region. We apply the proposed algorithm to the Boolean Series Prediction Question (BSPQ) problem and the Amplitude Detection (AD) problem to demonstrate its efficiency. The simulation results show that, by avoiding the local minima problem, the proposed algorithm converges more efficiently and achieves better generalization than other algorithms.
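As a rough illustration of the idea described above, the sketch below shows one step of an Elman-style hidden update together with a squared-error loss augmented by a penalty on hidden-neuron saturation. The abstract does not give the exact form of the added term, so the penalty mean(h**2), the weight names, and the coefficient lam are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def elman_step(x, h_prev, W_in, W_rec, W_out, b_h, b_y):
    """One time step of an Elman network with tanh hidden units.

    h_prev plays the role of the context (copy of the previous hidden state)
    that gives the Elman network its dynamic memory.
    """
    h = np.tanh(W_in @ x + W_rec @ h_prev + b_h)   # hidden state, fed back next step
    y = W_out @ h + b_y                            # linear output layer
    return h, y

def loss_with_saturation_term(y, target, h, lam=0.01):
    """Squared error plus an assumed penalty on hidden-neuron saturation.

    mean(h**2) is largest when tanh units sit near +/-1 (deep saturation),
    so minimizing the combined loss discourages the hidden layer from
    drifting into the saturated region. lam is a hypothetical weight.
    """
    mse = 0.5 * np.sum((y - target) ** 2)
    saturation = np.mean(h ** 2)
    return mse + lam * saturation
```

In a training loop, the gradient of this combined loss with respect to the weights would replace the gradient of the plain squared error, which is one simple way such a saturation-related term could steer learning away from flat, saturated regions.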

  • Publication date: 2009-10