Differential Evolution

The recent Wired article on the use of evolutionary computation to optimize Formula 1 cars has been getting a bit of press lately, and finally made it to Slashdot. Buried in the comments I found a link to some work of which I was previously unaware: Differential Evolution. You can also read a matching tech report. The nifty thing is that DE works on real-valued functions rather than discrete ones, which may make it more easily adaptable to optimizing functions that are more naturally represented in floating point. I haven't done more than skim the paper, but I suspect this evening I'll be coding up my own version in Python to try to sort out the details.

Addendum: I had the chance to read the paper at lunch. The idea is really very simple and straightforward to implement. You begin by generating a pool of individuals, each of which contains a vector of floating point values. You can use these values to generate a fitness value for the individual by passing it through a fitness function. Initially, the individuals are assigned random floating point values uniformly generated over sensible domains. In each time step, two individuals are chosen from the population, and a difference vector is computed between them. A third individual is then chosen, and a subset of its parameters is modified by adding a scaled version of the difference vector to the original values. This new individual is evaluated, and if its fitness is greater than that of the original, it replaces the original in the gene pool.
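
Here's a rough Python sketch of the scheme as I've just described it, mostly to make the moving parts concrete. The parameter names F and CR are the conventional scale factor and crossover rate from the DE literature; the function name, the default values, and the toy fitness function at the bottom are just my own placeholders, not anything taken from the paper.

    import random

    def differential_evolution(fitness, bounds, pop_size=30, F=0.8, CR=0.9, steps=10000):
        # fitness: maps a list of floats to a score (higher is better).
        # bounds:  one (low, high) pair per parameter, used to seed the pool.
        # F and CR are the conventional scale factor and crossover rate.
        dim = len(bounds)
        # Start with a pool of individuals drawn uniformly over sensible domains.
        pool = [[random.uniform(lo, hi) for (lo, hi) in bounds]
                for _ in range(pop_size)]
        scores = [fitness(ind) for ind in pool]
        for _ in range(steps):
            # Pick three distinct individuals: two to form the difference
            # vector, and a third whose parameters get perturbed.
            a, b, target = random.sample(range(pop_size), 3)
            trial = pool[target][:]
            forced = random.randrange(dim)  # ensure at least one parameter changes
            for j in range(dim):
                if j == forced or random.random() < CR:
                    trial[j] += F * (pool[a][j] - pool[b][j])
            # Greedy replacement: the new individual takes the old one's
            # slot only if its fitness is better.
            score = fitness(trial)
            if score > scores[target]:
                pool[target], scores[target] = trial, score
        best = max(range(pop_size), key=lambda i: scores[i])
        return pool[best], scores[best]

    # Toy example: maximize -|x|^2, which peaks at the origin.
    best, score = differential_evolution(lambda v: -sum(x * x for x in v),
                                         bounds=[(-5.0, 5.0)] * 3)
    print(best, score)

Note that this sketch uses the bounds only to seed the initial pool; trial vectors are free to wander outside them afterward, which may or may not matter depending on the fitness function.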

It seems almost too simple to be useful, but it's dead easy to code. I'll have to write up some proper demo code and give it a try in the not-too-distant future.