Abstract

This paper addresses nonlinear sensitivity analysis for dimensionality reduction in hydrologic model calibration and presents a novel method that quantifies the sensitivity of each parameter while accounting for the nonlinear relationships within the model. The method is based on computing the absolute variation of the nonlinear function represented by the model over its parameter space. The paper discusses the theoretical background of the method and presents the algorithm, which employs a neural network as a pseudo-simulator to reduce the computational burden of the analysis. The proposed sensitivity analysis approach is illustrated through a case study on a physically based distributed hydrologic model. The results indicate that the method ranks the parameters effectively, and that the ranking can be interpreted in the context of the physical processes represented by the model.
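The abstract does not give the paper's exact formulation, so the sketch below is only a minimal illustration of the described workflow: sample the parameter space, run the (expensive) model to build training data, fit a neural network as a pseudo-simulator, and estimate each parameter's sensitivity as the mean absolute variation of the surrogate response under small perturbations. The function `hydro_model`, the parameter names, and their bounds are hypothetical placeholders, and the finite-difference estimate stands in for whatever absolute-variation measure the paper actually defines.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical stand-in for the hydrologic model: maps a parameter
# vector to a scalar response (e.g., an error metric against observed
# streamflow). The real model would be far more expensive to run.
def hydro_model(theta):
    k, s, r = theta
    return np.sin(3.0 * k) * s**2 + 0.1 * r

# Illustrative parameter bounds (names and ranges are not from the paper).
bounds = np.array([[0.0, 1.0],   # k: recession coefficient
                   [0.0, 1.0],   # s: storage capacity
                   [0.0, 1.0]])  # r: channel roughness
n_params = len(bounds)

# 1) Sample the parameter space and run the true model to obtain
#    training data for the surrogate.
n_train = 500
X = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_train, n_params))
y = np.array([hydro_model(x) for x in X])

# 2) Fit a neural network as a pseudo-simulator of the model response.
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000,
                         random_state=0).fit(X, y)

# 3) Sensitivity of each parameter: mean absolute variation of the
#    surrogate response to a small perturbation of that parameter,
#    averaged over many points in the parameter space.
n_eval, h = 1000, 1e-3
P = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_eval, n_params))
sensitivity = np.zeros(n_params)
for j in range(n_params):
    P_plus = P.copy()
    P_plus[:, j] += h
    dy = surrogate.predict(P_plus) - surrogate.predict(P)
    sensitivity[j] = np.mean(np.abs(dy)) / h

# 4) Rank parameters by sensitivity (largest variation first); the
#    least sensitive parameters are candidates for being fixed,
#    reducing the dimensionality of the calibration problem.
ranking = np.argsort(sensitivity)[::-1]
print("sensitivities:", sensitivity)
print("ranking (most to least sensitive):", ranking)
```

Because every sensitivity evaluation queries the cheap surrogate rather than the distributed model itself, the averaging step can use many more points than direct perturbation of the full model would allow.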

  • Publication date: 2011-2