Abstract

In this research, we propose a novel fractional gradient descent-based learning algorithm (FGD) for radial basis function neural networks (RBF-NN). The proposed FGD is a convex combination of the conventional gradient descent and the modified Riemann-Liouville derivative-based fractional gradient descent methods. The proposed FGD method is analyzed for an optimal solution in a system identification problem, and a closed-form Wiener solution of a least squares problem is obtained. Using the FGD, the weight update rule for the proposed fractional RBF-NN (FRBF-NN) is derived. The proposed FRBF-NN method is shown to outperform the conventional RBF-NN on four major estimation problems, namely nonlinear system identification, pattern classification, time series prediction and function approximation.
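To make the idea concrete, the following is a minimal sketch (not the authors' reference implementation) of such a convex-combination FGD update for the linear output weights of a Gaussian RBF network. It assumes a quadratic loss, fixed centers and widths, and the power-rule form of the modified Riemann-Liouville derivative for the fractional term; the names `fgd_step`, `lam`, `alpha`, and `eta` are illustrative choices, not taken from the paper.

```python
import numpy as np
from scipy.special import gamma


def rbf_features(X, centers, sigma):
    """Gaussian RBF hidden-layer activations."""
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))


def fgd_step(w, Phi, y, eta=0.01, alpha=0.9, lam=0.5, eps=1e-8):
    """One FGD update of the output weights w under squared-error loss.

    The step blends the conventional gradient with a fractional-gradient
    term built from the assumed power-rule factor |w|^(1-alpha)/Gamma(2-alpha).
    """
    err = Phi @ w - y                        # prediction error
    grad = Phi.T @ err / len(y)              # conventional gradient
    frac = np.abs(w) ** (1.0 - alpha) / gamma(2.0 - alpha)
    frac_grad = grad * (frac + eps)          # assumed fractional-gradient term
    return w - eta * (lam * grad + (1.0 - lam) * frac_grad)


if __name__ == "__main__":
    # Toy function-approximation example: fit sin(3x) with 10 Gaussian RBFs.
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 1))
    y = np.sin(3 * X[:, 0])
    centers = np.linspace(-1, 1, 10)[:, None]
    Phi = rbf_features(X, centers, sigma=0.3)
    w = rng.normal(scale=0.1, size=10)
    for _ in range(2000):
        w = fgd_step(w, Phi, y)
    print("training MSE:", np.mean((Phi @ w - y) ** 2))
```

Setting `lam=1` recovers the conventional gradient descent update, while `lam=0` uses only the fractional term; intermediate values give the convex combination described in the abstract.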

  • Publication date: 2018-12