Abstract

Recently, the alternating direction method of multipliers (ADMM) and its variants have gained great popularity in large-scale optimization. This paper is concerned with the solution of the tensor equation $\mathcal{A}x^{m-1}=b$, where $\mathcal{A}$ is an $m$th-order, $n$-dimensional real tensor and $b$ is an $n$-dimensional real vector. By introducing certain auxiliary variables, we equivalently transform this tensor equation into a consensus-constrained optimization problem and propose an ADMM-type method for solving it. It turns out that every limit point of the sequence generated by this method satisfies the Karush-Kuhn-Tucker conditions. However, in terms of computational complexity, the proposed method may suffer from the curse of dimensionality when the tensor equation is large; we therefore present a modified variant that exploits the tensor-train decomposition of $\mathcal{A}$ and is free from this curse. As an application, we establish the associated inverse iteration methods for solving tensor eigenvalue problems. Numerical examples illustrate that our methods are feasible and efficient.
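As a point of reference for the notation, the product $\mathcal{A}x^{m-1}$ denotes contraction of the tensor $\mathcal{A}$ with the vector $x$ along all modes but the first. The sketch below (an illustration with hypothetical names, not code from the paper) computes this product with NumPy and checks it against an explicit `einsum` for a third-order example:

```python
import numpy as np

def tensor_apply(A, x):
    """Compute A x^{m-1}: contract an m-th order tensor A with the
    vector x along every mode except the first, yielding a vector
    with entries sum_{i_2,...,i_m} A[i, i_2, ..., i_m] x[i_2]...x[i_m].
    (Hypothetical helper name, for illustration only.)"""
    out = A
    for _ in range(A.ndim - 1):
        out = out @ x  # each matmul contracts the last axis with x
    return out

# Sanity check for m = 3: (A x^2)_i = sum_{j,k} A[i,j,k] x[j] x[k].
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4, 4))
x = rng.standard_normal(4)
assert np.allclose(tensor_apply(A, x), np.einsum('ijk,j,k->i', A, x, x))
```

For an $m$th-order, $n$-dimensional tensor this dense contraction costs $O(n^m)$, which is the curse of dimensionality the tensor-train variant is designed to avoid.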