Abstract

In this paper, a novel training model for multilayer backpropagation (BP) neural networks is proposed, based on the conjugate gradient (CG) method with a generalized Armijo search. The presented algorithm requires little memory and converges quickly in practical applications. One reason is that the constructed conjugate direction guarantees sufficient descent when minimizing the given objective function. The other is that the generalized Armijo method automatically determines a suitable learning rate in each training epoch. As a theoretical contribution, two deterministic convergence results, weak and strong convergence, are rigorously proved under more relaxed assumptions. Weak convergence means that the norm of the gradient of the objective function tends to zero, while strong convergence means that the sequence of weight vectors approaches a fixed point. To support the theoretical results, illustrative simulations have been carried out on several benchmark datasets.
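To make the training scheme described above concrete, the following is a minimal sketch, not the authors' exact algorithm: a one-hidden-layer network trained with a conjugate-gradient direction and a backtracking Armijo line search. The toy data, the network sizes, the Polak-Ribière choice of beta, and the Armijo constants are all illustrative assumptions; the paper's generalized Armijo condition and sufficient-descent construction may differ in detail.

```python
import numpy as np

# Sketch only: CG-style direction with an Armijo backtracking step for BP training.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 4))              # toy inputs (assumed data)
y = (X[:, :1] * X[:, 1:2] > 0).astype(float)   # toy binary targets (assumed)

def unpack(w, n_in=4, n_hid=8):
    """Split the flat weight vector into the two layer matrices."""
    W1 = w[:n_in * n_hid].reshape(n_in, n_hid)
    W2 = w[n_in * n_hid:].reshape(n_hid, 1)
    return W1, W2

def loss_and_grad(w):
    """Mean-squared error of a tanh hidden layer with linear output, plus its gradient."""
    W1, W2 = unpack(w)
    H = np.tanh(X @ W1)
    out = H @ W2
    err = out - y
    loss = 0.5 * np.mean(err ** 2)
    dout = err / len(X)
    gW2 = H.T @ dout
    gW1 = X.T @ ((dout @ W2.T) * (1.0 - H ** 2))
    return loss, np.concatenate([gW1.ravel(), gW2.ravel()])

w = 0.1 * rng.standard_normal(4 * 8 + 8)
loss, g = loss_and_grad(w)
d = -g                                         # first direction: steepest descent

for epoch in range(200):
    # Armijo backtracking: shrink the step until sufficient decrease holds.
    eta, c, gd = 1.0, 1e-4, g @ d
    while loss_and_grad(w + eta * d)[0] > loss + c * eta * gd and eta > 1e-8:
        eta *= 0.5
    w = w + eta * d
    new_loss, new_g = loss_and_grad(w)
    # Polak-Ribiere beta (an assumed choice) with a restart safeguard so that
    # the next direction remains a descent direction.
    beta = max(0.0, new_g @ (new_g - g) / (g @ g + 1e-12))
    d = -new_g + beta * d
    if new_g @ d >= 0:
        d = -new_g
    loss, g = new_loss, new_g

print(f"final loss: {loss:.4f}, gradient norm: {np.linalg.norm(g):.4e}")
```

The printed gradient norm relates to the weak-convergence statement (gradient norm tending to zero), while tracking the iterates `w` themselves would correspond to the strong-convergence statement about the weight sequence.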