Abstract

The aggregate function is a useful smooth approximation to the max-function of finitely many smooth functions and has been used to solve minimax problems, linear and nonlinear programming, generalized complementarity problems, and other problems. Although the aggregate function is a single smooth function, it is complex, and computing its gradient and Hessian is time-consuming. In this paper, a truncated aggregate smoothing stabilized Newton method for solving minimax problems is presented. At each iteration, only a small subset of the components of the max-function is aggregated, so the number of gradient and Hessian evaluations is reduced dramatically. The subset is adaptively updated by truncating criteria that involve only function values, not gradients or Hessians, so as to guarantee global convergence and, for the inner iteration, locally quadratic convergence at as little computational cost as possible. Numerical results demonstrate the efficiency of the proposed algorithm.
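
To make the smoothing and truncation ideas concrete, the following is a minimal Python sketch of the standard aggregate (exponential penalty) function and of a truncated variant that aggregates only near-maximal components. The function names, the smoothing parameter p, and the truncation tolerance eps are illustrative assumptions, not the paper's exact scheme.

```python
# A minimal sketch, assuming the standard aggregate (exponential penalty)
# function from the literature; parameter names are illustrative.
import numpy as np

def aggregate(fvals, p):
    """Smooth approximation f_p = (1/p) * log(sum_i exp(p * f_i)).

    Shifting by the maximum keeps exp() from overflowing; for m
    components, max_i f_i <= f_p <= max_i f_i + log(m)/p.
    """
    fvals = np.asarray(fvals, dtype=float)
    fmax = fvals.max()
    return fmax + np.log(np.exp(p * (fvals - fmax)).sum()) / p

def truncated_aggregate(fvals, p, eps):
    """Aggregate only the components within eps of the maximum.

    Each dropped component f_i < max_j f_j - eps contributes less than
    exp(-p * eps) to the shifted sum, so the truncation error is tiny
    while the gradient/Hessian work for those components is skipped.
    The criterion uses only function values, never derivatives.
    """
    fvals = np.asarray(fvals, dtype=float)
    fmax = fvals.max()
    active = fvals >= fmax - eps  # truncating criterion on values only
    return fmax + np.log(np.exp(p * (fvals[active] - fmax)).sum()) / p

if __name__ == "__main__":
    f = [1.0, 0.2, 0.99, -3.0]
    print(aggregate(f, p=100.0))                       # close to max(f) = 1.0
    print(truncated_aggregate(f, p=100.0, eps=0.05))   # nearly identical value
```

In this sketch, only the components near the current maximum enter the log-sum-exp, which mirrors the abstract's point that most gradient and Hessian evaluations can be avoided when the aggregated subset is small.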