Abstract

In this paper, we consider comparison-based adaptive stochastic algorithms for solving numerical optimization problems. We consider a specific subclass of algorithms that we call comparison-based step-size adaptive randomized search (CB-SARS), where the state variables at a given iteration are a vector of the search space and a positive parameter, the step-size, that typically controls the overall standard deviation of the underlying search distribution. We investigate the linear convergence of CB-SARS on scaling-invariant objective functions. Scaling-invariant functions preserve the ordering of points w.r.t. their function value when the points are scaled with the same positive parameter (the scaling is done w.r.t. a fixed reference point). This class of functions includes norms composed with strictly increasing functions as well as many non-quasi-convex and non-continuous functions. On scaling-invariant functions, we show the existence of a homogeneous Markov chain, as a consequence of natural invariance properties of CB-SARS (essentially scale-invariance and invariance to strictly increasing transformations of the objective function). We then derive sufficient conditions for global linear convergence of CB-SARS, expressed in terms of different stability conditions of the normalized homogeneous Markov chain (irreducibility, positivity, Harris recurrence, geometric ergodicity), and thus define a general methodology for proving global linear convergence of CB-SARS algorithms on scaling-invariant functions. As a by-product we provide a connection between comparison-based adaptive stochastic algorithms and Markov chain Monte Carlo algorithms.
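
The scaling-invariance property described verbally above can be written symbolically as follows (with $x^\star$ the fixed reference point); this is a restatement of the abstract's definition, not an addition to it:

$$\forall \rho > 0,\ \forall x, y: \quad f\big(x^\star + \rho (x - x^\star)\big) \le f\big(x^\star + \rho (y - x^\star)\big) \iff f(x) \le f(y).$$

For illustration only, the sketch below shows one well-known algorithm of the CB-SARS type, a (1+1)-ES with a one-fifth-success-rule style step-size adaptation. It is a minimal hypothetical instance meant to make the state variables (search point and step-size) and the comparison-based update concrete; it is not the general framework analyzed in the paper, and the particular constants are illustrative assumptions.

```python
import numpy as np

def one_plus_one_es(f, x0, sigma0, iterations=1000):
    """Minimal (1+1)-ES with a one-fifth-success-rule style step-size update.

    State: current point x and step-size sigma. The update uses only the
    comparison f(y) <= f(x), so it is invariant to strictly increasing
    transformations of the objective function.
    """
    x, sigma = np.asarray(x0, dtype=float), float(sigma0)
    for _ in range(iterations):
        y = x + sigma * np.random.randn(len(x))   # sample one candidate
        if f(y) <= f(x):                          # comparison-based acceptance
            x = y
            sigma *= np.exp(1.0 / 3.0)            # increase step-size on success
        else:
            sigma *= np.exp(-1.0 / 12.0)          # decrease on failure
    return x, sigma

# Example on a scaling-invariant function (a norm): both the distance to the
# optimum and the step-size are expected to decrease geometrically.
x_final, sigma_final = one_plus_one_es(lambda x: np.linalg.norm(x), np.ones(10), 1.0)
```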

  • Publication date: 2016
  • Institution: INRIA