Abstract

We study a classical iterative algorithm for balancing matrices in the $L_\infty$ norm via a scaling transformation. This algorithm, which goes back to Osborne and Parlett & Reinsch in the 1960s, is implemented as a standard preconditioner in many numerical linear algebra packages. Surprisingly, despite its widespread use over several decades, no bounds were known on its rate of convergence. In this article, we prove that, for any irreducible $n \times n$ (real or complex) input matrix $A$, a natural variant of the algorithm converges in $O(n^3 \log(n\rho/\varepsilon))$ elementary balancing operations, where $\rho$ measures the initial imbalance of $A$ and $\varepsilon$ is the target imbalance of the output matrix. (The imbalance of $A$ is $\max_i |\log(a_i^{\mathrm{out}}/a_i^{\mathrm{in}})|$, where $a_i^{\mathrm{out}}, a_i^{\mathrm{in}}$ are the maximum entries in magnitude in the $i$th row and column, respectively.) This bound is tight up to the $\log n$ factor. A balancing operation scales the $i$th row and column so that their maximum entries are equal, and requires $O(m/n)$ arithmetic operations on average, where $m$ is the number of nonzero elements in $A$. Thus, the running time of the iterative algorithm is $O(n^2 m)$. This is the first time bound of any kind on any variant of the Osborne-Parlett-Reinsch algorithm. We also prove a conjecture of Chen that characterizes those matrices for which the limit of the balancing process is independent of the order in which balancing operations are performed.
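
Since the abstract describes the elementary balancing operation concretely, the following is a minimal sketch of how such an iteration might look. The round-robin sweep order, the stopping rule, the function names, and the restriction to real matrices are assumptions made for illustration; the variant analysed in the paper may order operations differently.

```python
# Illustrative sketch of L-infinity balancing via Osborne-Parlett-Reinsch-style
# scaling operations (not the paper's exact variant). Assumes a real, irreducible
# n x n matrix with n >= 2; sweep order and stopping rule are assumptions.
import numpy as np

def imbalance(A):
    """max_i |log(a_i_out / a_i_in)|, using off-diagonal row/column maxima."""
    n = A.shape[0]
    worst = 0.0
    for i in range(n):
        row = np.abs(np.delete(A[i, :], i)).max()  # a_i_out
        col = np.abs(np.delete(A[:, i], i)).max()  # a_i_in
        if row > 0 and col > 0:
            worst = max(worst, abs(np.log(row / col)))
    return worst

def balance(A, eps=1e-8, max_sweeps=1000):
    """Repeat balancing operations until the imbalance drops below eps."""
    A = A.astype(float).copy()
    n = A.shape[0]
    for _ in range(max_sweeps):
        if imbalance(A) <= eps:
            break
        for i in range(n):
            row = np.abs(np.delete(A[i, :], i)).max()
            col = np.abs(np.delete(A[:, i], i)).max()
            if row == 0 or col == 0:
                continue  # cannot happen for an irreducible matrix
            # Scale row i by s and column i by 1/s (a similarity transform:
            # the diagonal entry is unchanged); both maxima become sqrt(row*col).
            s = np.sqrt(col / row)
            A[i, :] *= s
            A[:, i] /= s
    return A
```

Note that balancing index $i$ rescales one entry in every other row and column, so earlier indices can fall out of balance again; this is why the process must iterate, and why bounding the number of operations is the crux of the analysis.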

  • Publication date: June 2017