I've already spoken of differential evolution in previous articles (1, 2). It is an extremely versatile optimization algorithm thanks to its lack of requirements on the function to optimize, other than operating on real values (no need for differentiability or even continuity, for example). Its algorithm is as follows:
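As a reminder, the classic DE/rand/1/bin scheme could be sketched like this (an illustrative sketch, not the implementation from the previous articles; the function and parameter names are my own):

```python
import random

def differential_evolution(fitness, bounds, pop_size=20, coeff_mutation=0.8,
                           coeff_crossover=0.9, generations=100):
    """Minimize `fitness` over a box defined by `bounds` (list of (lo, hi))."""
    dim = len(bounds)
    # Random initial population inside the bounds
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [fitness(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct candidates, all different from i
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            # Mutation + binomial crossover; j_rand guarantees at least
            # one component comes from the mutant vector
            j_rand = random.randrange(dim)
            trial = list(pop[i])
            for j in range(dim):
                if j == j_rand or random.random() < coeff_crossover:
                    v = pop[a][j] + coeff_mutation * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    trial[j] = min(max(v, lo), hi)  # clamp to the bounds
            # Greedy selection: keep the trial if it is at least as good
            s = fitness(trial)
            if s <= scores[i]:
                pop[i], scores[i] = trial, s
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]
```

Note the selection step: a trial candidate replaces its parent only if its fitness is at least as good, which is the step the constrained variant below modifies.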
Its simplicity and versatility make it a great tool to have in one's toolbox. However, it doesn't handle constrained optimization. Modifying the differential evolution algorithm for that kind of problem has been studied in the thesis of Zaakirah Kajee-Bagdadi. Before reading that thesis I came up with a solution, which I'm introducing below. My idea is similar to the FDEC algorithm from the thesis (minimizing both fitness and constraints simultaneously, and treating feasible and infeasible candidates separately), but simpler as it doesn't use a penalty function.
The constraints are supposed to be given as a function Constraint(X) such that Constraint(X) >= 0.0 if X satisfies the constraint and Constraint(X) < 0.0 otherwise.
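For example, a constraint like x[0] + x[1] <= 1 could be encoded this way (my own illustrative example, not one from the problems mentioned below):

```python
def Constraint(X):
    # Encodes x[0] + x[1] <= 1.0: the returned value is non-negative
    # when the constraint is satisfied, negative otherwise, and its
    # magnitude tells how far X is from the constraint border.
    return 1.0 - (X[0] + X[1])
```

Several constraints can be folded into a single such function, for instance by returning the minimum of the individual constraint values.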
In my implementation I also randomise coeff_mutation (Zaakirah Kajee-Bagdadi does the same), and if there is no feasible candidate at the end of the optimization I return the best of the infeasible ones. The constraint value of the returned candidate lets the user know whether it is feasible, and if it is, whether relaxing the constraint is likely to improve the fitness (i.e. whether the candidate lies on the constraint border).
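The modified selection step could then look like the sketch below: one possible reading of the idea of treating feasible and infeasible candidates separately, not necessarily the exact rule of my implementation. Two feasible candidates compete on fitness, two infeasible ones compete on constraint value (less negative means closer to feasibility), and a feasible candidate always beats an infeasible one:

```python
def better(fitness_a, constraint_a, fitness_b, constraint_b):
    """Return True if candidate A should be preferred over candidate B."""
    a_feasible = constraint_a >= 0.0
    b_feasible = constraint_b >= 0.0
    if a_feasible and b_feasible:
        # Both feasible: compare on fitness (minimization)
        return fitness_a <= fitness_b
    if not a_feasible and not b_feasible:
        # Both infeasible: prefer the smaller constraint violation
        return constraint_a >= constraint_b
    # Mixed case: the feasible candidate wins
    return a_feasible
```

Plugged into the DE selection step in place of the plain fitness comparison, this drives infeasible candidates toward the feasible region while the feasible ones keep minimizing the fitness, with no penalty coefficient to tune.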
My motivation for adding support for constrained problems to differential evolution comes from my implementation of support-vector machines, for which a constrained optimization problem needs to be solved. It's still a work in progress and I've only tried it on two datasets, one with a feasible solution and one without. It worked as expected on both, so I'm looking forward to other occasions to try this enhanced differential evolution algorithm and see how it performs.