Abstract

The Method of Moments (MoM) is a numerical technique for approximately solving linear operator equations, such as differential equations or integral equations. The unknown function is approximated by a finite series of known expansion functions with unknown expansion coefficients. This approximation is substituted into the original operator equation, and the resulting approximate equation is tested so that the weighted residual is zero. This yields a set of simultaneous algebraic equations for the unknown coefficients, which are then solved by matrix methods. MoM has been used to solve a vast number of electromagnetic problems over the last five decades. In addition to the basic theory of MoM, some simple examples are given. To demonstrate the concept of minimizing weighted error, the Fourier series is also reviewed.
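The procedure in the abstract can be sketched numerically. Below is a minimal Galerkin-type MoM example on an illustrative problem that is not taken from the paper itself: the operator equation $Lf = g$ with $Lf = -d^2f/dx^2$, $g(x) = 1 + 4x^2$, and $f(0) = f(1) = 0$. The expansion functions $f_n(x) = x - x^{n+1}$ (also used as testing functions) and the analytically evaluated inner products are assumptions chosen for simplicity.

```python
import numpy as np

# Hypothetical illustrative problem (not from the paper):
#   L f = -d^2 f / dx^2 = g,  g(x) = 1 + 4 x^2,  f(0) = f(1) = 0.
# Expansion functions f_n(x) = x - x^(n+1), n = 1..N, satisfy the
# boundary conditions; the same functions are used for testing (Galerkin).

N = 3  # number of expansion functions
m = np.arange(1, N + 1)

# Matrix elements <f_m, L f_n> = m*n / (m + n + 1)  (integrals done analytically)
L = np.outer(m, m).astype(float) / (m[:, None] + m[None, :] + 1)

# Excitation <f_m, g> = 3/2 - 1/(m+2) - 4/(m+4)  (integrals done analytically)
g = 1.5 - 1.0 / (m + 2) - 4.0 / (m + 4)

# Solve the simultaneous algebraic equations for the expansion coefficients.
alpha = np.linalg.solve(L, g)

def f_approx(x):
    """Approximate solution: finite series of expansion functions."""
    return sum(a * (x - x ** (n + 1)) for n, a in zip(m, alpha))

# For N = 3 the basis happens to span the exact solution
# f(x) = 5x/6 - x^2/2 - x^4/3, so the result is exact to rounding.
print(f_approx(0.5))  # ≈ 13/48 ≈ 0.2708
```

Enforcing a zero weighted residual against each testing function is what converts the continuous operator equation into the finite matrix equation solved above.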

  • Publication date: 2012-6