Abstract

Feedforward neural networks (FNNs) with a single hidden layer have been widely applied in data modeling owing to their universal approximation capability for nonlinear maps. However, this theoretical result offers no practical guideline for determining the architecture of the model. Research on the self-organization of FNNs is therefore useful and critical for effective data modeling. This paper proposes a hybrid constructing and pruning strategy (HCPS) to address this problem, in which mutual information (MI) and sensitivity analysis (SA) are employed to measure, respectively, the information shared among hidden-layer neurons and the contribution rate of each hidden neuron. HCPS merges hidden neurons when their MI value becomes too high, deletes hidden neurons when their contribution rates are sufficiently small, and splits hidden neurons when their contribution rates are very large. For each instant pattern fed into the model as a training sample, the network weights are updated so that the model's output remains unchanged during structural adjustment. HCPS aims to obtain a compact model by eliminating redundant neurons without degrading the instant modeling performance, which is closely related to the model's generalization ability. The proposed algorithm is evaluated on several benchmark data sets, including classification problems, a nonlinear system identification problem, a time-series prediction problem, and a real-world application to PM2.5 prediction. Simulation results and comparisons demonstrate that the proposed method performs favorably and improves upon existing work in terms of modeling performance.
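To make the merge/delete/split logic concrete, the following is a minimal, self-contained Python sketch of one HCPS-style structural-adjustment pass. It is not the paper's implementation: the function name hcps_adjust, the three thresholds, the correlation-based stand-in for MI, and the variance-weighted contribution proxy for sensitivity analysis are all illustrative assumptions, since the abstract does not specify the exact estimators.

import numpy as np

def hcps_adjust(W_in, w_out, H, mi_thresh=0.9, del_thresh=0.01, split_thresh=0.5):
    """One illustrative HCPS-style structural-adjustment pass.

    W_in  : (n_hidden, n_in)      input-to-hidden weights
    w_out : (n_hidden,)           hidden-to-output weights
    H     : (n_samples, n_hidden) recorded hidden activations
    All thresholds and both proxies below are assumptions, not the
    paper's exact MI and SA estimators.
    """
    n = len(w_out)
    # Merge: pairs of hidden neurons carrying near-identical information,
    # approximated here by |correlation| as a stand-in for MI.
    corr = np.corrcoef(H.T)
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        for j in range(i + 1, n):
            if keep[i] and keep[j] and abs(corr[i, j]) > mi_thresh:
                # Fold neuron j into i; summing output weights keeps the
                # model's output approximately unchanged for redundant pairs.
                w_out[i] += w_out[j]
                keep[j] = False

    # Delete: crude contribution-rate proxy (output-weight magnitude
    # scaled by activation variance, normalized to sum to one).
    contrib = np.abs(w_out) * H.var(axis=0)
    contrib = contrib / (contrib.sum() + 1e-12)
    keep &= contrib > del_thresh
    W_in, w_out = W_in[keep], w_out[keep]

    # Split: duplicate overloaded neurons into two half-weight copies with a
    # tiny input perturbation, so the summed output is unchanged at the split.
    for idx in np.where(contrib[keep] > split_thresh)[0]:
        w_out = np.append(w_out, w_out[idx] / 2.0)
        w_out[idx] /= 2.0
        W_in = np.vstack([W_in, W_in[idx] + 1e-3 * np.random.randn(W_in.shape[1])])
    return W_in, w_out

In the full algorithm such adjustments would be interleaved with the per-pattern weight updates described above, so that each structural change preserves the model's instant output.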