A Differential Evolution Algorithm in the Optimization Task with a Lipschitz Continuous Cost Function

Abstract
Differential evolution algorithms now provide an efficient framework for complex optimization tasks with many variables and involved constraints. Classic differential evolution algorithms, however, do not guarantee global convergence to the minimum of the cost function. The author therefore designed a modification of these algorithms that guarantees asymptotic global convergence in the probabilistic sense. The article shows that Lipschitz continuity of the cost function is a reasonable assumption that makes quantitative considerations and estimates possible. The next part describes how the domain of the cost function is explored by random individuals: more random individuals mean a finer sampling of the domain. This fact is the basis for the asymptotic convergence of the modified differential evolution algorithm.
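The idea of injecting random individuals into a classic differential evolution loop can be sketched as follows. This is a minimal illustration, not the author's exact method: the DE/rand/1/bin scheme, the parameter names, and the injection probability `p_random` are all assumptions made for the sake of the example.

```python
import random

def differential_evolution(cost, bounds, pop_size=30, F=0.8, CR=0.9,
                           p_random=0.1, generations=200, seed=0):
    """DE/rand/1/bin with random-individual injection (illustrative sketch).

    With probability p_random a trial vector is drawn uniformly from the
    search domain instead of being produced by mutation and crossover.
    These uniform samples explore the domain ever more densely as the
    number of generations grows, which is the kind of mechanism behind
    asymptotic global convergence in the probabilistic sense.
    """
    rng = random.Random(seed)
    dim = len(bounds)

    def random_individual():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    pop = [random_individual() for _ in range(pop_size)]
    costs = [cost(x) for x in pop]

    for _ in range(generations):
        for i in range(pop_size):
            if rng.random() < p_random:
                # Random individual: uniform sample of the cost function domain.
                trial = random_individual()
            else:
                # Classic DE/rand/1/bin mutation and binomial crossover.
                a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
                j_rand = rng.randrange(dim)
                trial = []
                for j in range(dim):
                    if rng.random() < CR or j == j_rand:
                        v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                        lo, hi = bounds[j]
                        trial.append(min(max(v, lo), hi))  # clip to bounds
                    else:
                        trial.append(pop[i][j])
            # Greedy selection: keep the trial if it is no worse.
            c_trial = cost(trial)
            if c_trial <= costs[i]:
                pop[i], costs[i] = trial, c_trial

    best = min(range(pop_size), key=costs.__getitem__)
    return pop[best], costs[best]
```

For example, on a 2-D sphere function with box constraints, `differential_evolution(lambda v: sum(t * t for t in v), [(-5, 5), (-5, 5)])` drives the best cost close to zero within the default budget.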