Abstract

Recently, many studies have focused on differential evolution (DE), which is arguably one of the most powerful stochastic real-parameter optimization algorithms. Prominent variants have improved the basic DE algorithm considerably; however, almost all DE variants still suffer from problems such as difficult parameter tuning, slow convergence, and premature convergence. This paper presents a novel adaptive DE algorithm that constructs a trial vector generation pool and dynamically sets the control parameters according to current fitness information. The proposed algorithm adopts a distributed topology: the population is divided into three subgroups, each using different mutation and crossover operations, while a uniform selection strategy is applied to all subgroups. To improve convergence speed, a directional strategy based on a greedy strategy is introduced, so that individuals with good performance evolve rapidly along the optimal evolution direction. It is well known, however, that the faster an algorithm converges, the greater the risk of premature convergence. To mitigate entrapment in local optima, the proposed algorithm introduces a mathematical tool, the membership cloud model, into the selection process; in essence, the cloud model improves population diversity by randomly generating cloud droplets. Experimental results on typical benchmark functions show that the proposed algorithm performs well in terms of convergence, stability, and precision. They also indicate that the improved DE algorithm overcomes the low efficiency of conventional DE algorithms while effectively avoiding convergence to local optima.
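To make the overall scheme concrete, the sketch below shows a minimal multi-strategy DE loop in the spirit of this abstract: three subgroups with different mutation strategies, fitness-based adaptive control parameters, and a uniform greedy selection step. All function names, the subgroup assignment, the parameter-adaptation rule, and the chosen mutation operators are illustrative assumptions rather than the paper's exact formulation, and the cloud-model step is omitted.

```python
# Minimal sketch of a multi-strategy adaptive DE loop (illustrative assumptions only;
# not the paper's exact method, and the membership cloud model is omitted).
import numpy as np

def sphere(x):
    """Benchmark objective: minimize the sum of squares."""
    return float(np.sum(x ** 2))

def de_multi_strategy(obj, dim=10, pop_size=30, gens=200, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([obj(x) for x in pop])

    for _ in range(gens):
        best = pop[np.argmin(fit)].copy()
        # Fitness-based adaptive parameters (assumed rule): better-ranked individuals
        # get a smaller F (exploitation), worse-ranked ones a larger F (exploration).
        ranks = np.argsort(np.argsort(fit)) / (pop_size - 1)
        F = 0.4 + 0.5 * ranks          # per-individual scale factor in [0.4, 0.9]
        CR = 0.9 - 0.5 * ranks         # per-individual crossover rate in [0.4, 0.9]

        for i in range(pop_size):
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            group = i % 3              # static split into three subgroups (assumed)
            if group == 0:             # DE/rand/1
                mutant = a + F[i] * (b - c)
            elif group == 1:           # DE/best/1
                mutant = best + F[i] * (a - b)
            else:                      # DE/current-to-best/1
                mutant = pop[i] + F[i] * (best - pop[i]) + F[i] * (a - b)
            mutant = np.clip(mutant, lo, hi)

            # Binomial crossover, then the uniform greedy selection shared by all subgroups.
            cross = rng.random(dim) < CR[i]
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            f_trial = obj(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial

    return pop[np.argmin(fit)], fit.min()

if __name__ == "__main__":
    x_best, f_best = de_multi_strategy(sphere)
    print("best fitness:", f_best)
```

In this sketch the subgroups differ only in their mutation operator, while crossover and the greedy replacement rule are shared, mirroring the "different mutation and crossover operations, uniform selection" structure described above at a coarse level.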