Abstract

In this paper, based on the generalized Taylor expansion and the iteration matrix G of an iterative method, we introduce a new method for computing a series solution of linear systems. This method can be used to accelerate the convergence of basic iterative methods. In addition, we show that, by applying the new method to a divergent iterative scheme, it is possible to construct a convergent series solution and, in special cases, to find the convergence intervals of the control parameter. Numerical experiments are given to show the efficiency of the new method.
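The abstract does not spell out the construction, but the role of the iteration matrix G and of a control parameter can be illustrated with a minimal sketch. The sketch below is an assumption for illustration only, not the paper's exact expansion: it takes a basic splitting iteration x_{k+1} = G x_k + c (here a Jacobi splitting), forms a parameterized iteration matrix G(h) = (1 - h)I + hG, and sums the resulting geometric-type series. For the small test system chosen here the basic Jacobi scheme diverges (spectral radius of G above 1), while a suitable value of h brings the spectral radius of G(h) below 1, so the summed series converges to the solution of Ax = b.

import numpy as np

def series_solution(A, b, h, n_terms=300):
    """Sum a parameterized splitting iteration as a truncated series for A x = b.

    Illustrative sketch: Jacobi splitting A = D - (D - A), basic iteration
    matrix G = D^{-1}(D - A), parameterized matrix G(h) = (1 - h) I + h G.
    Starting from x_0 = 0, the iterates are partial sums of the series
    h c + G(h) h c + G(h)^2 h c + ...  with c = D^{-1} b.
    """
    D = np.diag(np.diag(A))                    # diagonal part of A
    G = np.linalg.solve(D, D - A)              # basic (Jacobi) iteration matrix
    c = np.linalg.solve(D, b)
    Gh = (1.0 - h) * np.eye(len(b)) + h * G    # parameterized iteration matrix
    term = h * c                               # first series term
    x = np.zeros_like(b, dtype=float)
    for _ in range(n_terms):                   # accumulate the truncated series
        x += term
        term = Gh @ term
    rho_G = max(abs(np.linalg.eigvals(G)))     # spectral radius of basic scheme
    rho_Gh = max(abs(np.linalg.eigvals(Gh)))   # spectral radius of parameterized scheme
    return x, rho_G, rho_Gh

# Illustrative system (not from the paper): Jacobi diverges, the
# parameterized series with h = 0.2 converges.
A = np.array([[1.0, 2.0], [-3.0, 1.0]])
b = np.array([3.0, 4.0])
x, rho_G, rho_Gh = series_solution(A, b, h=0.2)
print(f"rho(G) = {rho_G:.3f}  (basic scheme divergent)")
print(f"rho(G(h)) = {rho_Gh:.3f}  (parameterized scheme convergent)")
print("series solution:", x, " exact:", np.linalg.solve(A, b))

In this sketch the admissible values of h are those for which the spectral radius of G(h) stays below 1, which is the same kind of convergence interval for the control parameter that the abstract refers to.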

  • Publication date: 2014-12-1