Abstract

Fuzzy neural networks (FNNs), with suitable structures, have been demonstrated to be an effective tool for approximating the nonlinearity between input and output variables. However, it is time-consuming to construct an FNN with an appropriate number of fuzzy rules that ensures its generalization ability. To solve this problem, an efficient optimization technique is introduced in this paper. First, a self-adaptive structural optimal algorithm (SASOA) is developed to minimize the structural risk of an FNN, leading to improved generalization performance. Second, with the proposed SASOA, the fuzzy rules of the SASOA-based FNN (SASOA-FNN) are generated or pruned systematically, so that the SASOA-FNN organizes its structure and adjusts its parameters simultaneously during the learning process. Third, the convergence of SASOA-FNN is proved for both fixed and updated structures, and guidelines for selecting the parameters are given. Finally, experimental studies of the proposed SASOA-FNN are performed on several nonlinear systems to verify its effectiveness. Comparisons with other existing methods demonstrate that the proposed SASOA-FNN achieves better performance.
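To make the idea of structure learning under a complexity-penalized objective concrete, the following is a minimal, illustrative Python sketch (not the paper's SASOA): a zero-order TSK fuzzy network with Gaussian memberships whose rule base is grown or pruned only when doing so lowers a penalized loss (empirical error plus a term proportional to the number of rules), loosely mirroring the structural-risk idea. The network form, the penalty form, and all names (SimpleFNN, structure_learning, lam, and so on) are assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch only: complexity-penalized rule growth/pruning for a small
# fuzzy neural network. All design choices here are assumptions, not the paper's method.
import numpy as np


class SimpleFNN:
    def __init__(self, n_inputs, lam=0.01, lr=0.05):
        self.n_inputs = n_inputs
        self.lam = lam          # weight of the complexity (structural) penalty
        self.lr = lr            # gradient step size for consequent weights
        self.centers = np.empty((0, n_inputs))   # rule centers
        self.sigmas = np.empty((0, n_inputs))    # rule widths
        self.weights = np.empty(0)               # rule consequents (zero-order TSK)

    def _firing(self, x):
        # Product of Gaussian memberships across inputs, one value per rule
        if len(self.weights) == 0:
            return np.empty(0)
        d = (x - self.centers) / self.sigmas
        return np.exp(-0.5 * np.sum(d * d, axis=1))

    def predict(self, x):
        phi = self._firing(x)
        s = phi.sum()
        return 0.0 if len(phi) == 0 or s == 0 else float(phi @ self.weights / s)

    def penalized_loss(self, X, y):
        # Empirical error plus lambda * (number of rules): a simple stand-in
        # for a structural-risk-style objective
        err = np.mean([(self.predict(x) - t) ** 2 for x, t in zip(X, y)])
        return err + self.lam * len(self.weights)

    def add_rule(self, x, t, sigma=0.3):
        self.centers = np.vstack([self.centers, x])
        self.sigmas = np.vstack([self.sigmas, np.full(self.n_inputs, sigma)])
        self.weights = np.append(self.weights, t)

    def prune_rule(self, idx):
        self.centers = np.delete(self.centers, idx, axis=0)
        self.sigmas = np.delete(self.sigmas, idx, axis=0)
        self.weights = np.delete(self.weights, idx)

    def train_step(self, x, t):
        # One gradient step on the consequent weights for a single sample
        phi = self._firing(x)
        s = phi.sum()
        if len(phi) == 0 or s == 0:
            return
        yhat = phi @ self.weights / s
        self.weights -= self.lr * (yhat - t) * phi / s


def structure_learning(X, y, epochs=5):
    fnn = SimpleFNN(n_inputs=X.shape[1])
    fnn.add_rule(X[0], y[0])                      # seed with one rule
    for _ in range(epochs):
        for x, t in zip(X, y):
            fnn.train_step(x, t)
            # Grow: keep a tentative new rule only if it lowers the penalized loss
            before = fnn.penalized_loss(X, y)
            fnn.add_rule(x, t)
            if fnn.penalized_loss(X, y) >= before:
                fnn.prune_rule(len(fnn.weights) - 1)
        # Prune: remove any rule whose deletion lowers the penalized loss
        i = 0
        while i < len(fnn.weights) and len(fnn.weights) > 1:
            before = fnn.penalized_loss(X, y)
            c, s_, w = fnn.centers[i].copy(), fnn.sigmas[i].copy(), fnn.weights[i]
            fnn.prune_rule(i)
            if fnn.penalized_loss(X, y) >= before:
                fnn.centers = np.insert(fnn.centers, i, c, axis=0)
                fnn.sigmas = np.insert(fnn.sigmas, i, s_, axis=0)
                fnn.weights = np.insert(fnn.weights, i, w)
                i += 1
    return fnn


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(60, 1))
    y = np.sin(np.pi * X[:, 0])
    model = structure_learning(X, y)
    print("rules:", len(model.weights))
    print("penalized loss:", model.penalized_loss(X, y))
```

The penalty term makes each accepted rule pay for its added complexity, so the rule count settles where further growth no longer reduces the combined objective; the actual SASOA criterion, update rules, and convergence analysis are given in the paper itself.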