Abstract

This paper presents an efficient model order reduction algorithm for simulating large interconnect networks containing many nonlinear elements. The proposed methodology is based on a multidimensional subspace method and uses constraint equations to link the nonlinear elements and biasing sources to the reduced-order model. Because no additional ports are needed to attach the nonlinear elements, the approach yields appreciable savings in the size of the reduced-order model and, consequently, in the simulation time of distributed nonlinear systems. Numerical examples are provided to illustrate the validity of the proposed algorithm.

  • Published: August 2010
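To give a concrete sense of the projection idea underlying reduced-order modeling, the sketch below shows a standard one-sided Krylov-subspace reduction of a linear state-space model. This is a generic moment-matching illustration, not the paper's multidimensional subspace method or its constraint-equation coupling; the matrix sizes and the `krylov_basis` helper are hypothetical.

```python
import numpy as np

def krylov_basis(A, b, q):
    # Hypothetical helper: orthonormal basis of the Krylov subspace
    # span{A^{-1}b, A^{-2}b, ..., A^{-q}b}, built with Gram-Schmidt.
    n = len(b)
    V = np.zeros((n, q))
    v = np.linalg.solve(A, b)
    V[:, 0] = v / np.linalg.norm(v)
    for k in range(1, q):
        w = np.linalg.solve(A, V[:, k - 1])
        for j in range(k):                      # orthogonalize against earlier vectors
            w -= (V[:, j] @ w) * V[:, j]
        V[:, k] = w / np.linalg.norm(w)
    return V

# Illustrative full-order model x' = A x + b u, y = c x (sizes are arbitrary).
rng = np.random.default_rng(0)
n, q = 200, 8
A = -np.eye(n) + 0.1 * rng.standard_normal((n, n))
b = rng.standard_normal(n)
c = rng.standard_normal(n)

# Project onto the q-dimensional subspace: a 200-state model becomes 8 states.
V = krylov_basis(A, b, q)
Ar, br, cr = V.T @ A @ V, V.T @ b, V.T @ c

# DC transfer-function values H(0) = -c A^{-1} b of the full and reduced models;
# one-sided Krylov projection matches the leading moments at s = 0.
H0 = -c @ np.linalg.solve(A, b)
Hr0 = -cr @ np.linalg.solve(Ar, br)
```

In this generic setting the reduced model reproduces the full model's low-frequency behavior with a fraction of the states; the paper's contribution lies in coupling nonlinear elements to such a reduced model via constraint equations rather than extra ports.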