Evolution of Still Lifes

In the context of cellular automata, a still life is a stable pattern that does not change from one iteration to the next. Such patterns have mostly received attention with regard to Conway’s Game of Life. This conspicuous cellular automaton is characterized by the so-called 23/3 rule: a cell remains alive only if it is surrounded by 2 or 3 live cells, and a new cell is born in any empty slot with exactly 3 live neighbors. Therefore, any pattern intended to be a still life has to fulfill these constraints for each of its cells (alive or not). Although it is relatively easy to design by hand some small configurations verifying this stability property, coming up with a large still life is a very complex exercise.
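The stability conditions above are easy to check mechanically. The following is a minimal sketch (function names and the set-of-coordinates representation are my own choices, not from any of the cited works) that tests whether a pattern is a still life under the 23/3 rule:

```python
def neighbors(cell):
    """The 8 cells surrounding `cell` (Moore neighborhood)."""
    r, c = cell
    return [(r + dr, c + dc)
            for dr in (-1, 0, 1) for dc in (-1, 0, 1)
            if (dr, dc) != (0, 0)]

def is_still_life(alive):
    """True iff no live cell dies and no dead cell is born.
    `alive` is a set of (row, col) coordinates of live cells."""
    # Survival: every live cell must have 2 or 3 live neighbors.
    for cell in alive:
        if sum(n in alive for n in neighbors(cell)) not in (2, 3):
            return False
    # No births: no dead cell adjacent to the pattern may have
    # exactly 3 live neighbors.
    frontier = {n for cell in alive for n in neighbors(cell)} - alive
    for cell in frontier:
        if sum(n in alive for n in neighbors(cell)) == 3:
            return False
    return True

# The 2x2 block is the smallest still life; the blinker oscillates.
block = {(0, 0), (0, 1), (1, 0), (1, 1)}
blinker = {(0, 0), (0, 1), (0, 2)}
print(is_still_life(block))    # True
print(is_still_life(blinker))  # False
```

Note that both conditions must be checked: a pattern in which every live cell survives can still fail to be a still life if a birth occurs in an adjacent empty cell.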

Precisely due to the difficulty of designing large still lifes, this problem has become a very good benchmark to assess the effectiveness of constraint satisfaction algorithms. Actually, the problem is more than a mere CSP, since it also incorporates an optimality criterion: maximizing the number of live cells. This leads to the Maximum Density Still Life Problem (MDSLP).

The MDSLP has traditionally been tackled in the literature by different exact methods, in particular with a technique termed Bucket Elimination (BE). BE is a technique related to dynamic programming, based on the successive elimination of variables from the problem (this involves maintaining auxiliary memory structures to keep track of the optimal value of each eliminated variable for every combination of values of the variables it is connected to via constraints). In the case of the MDSLP, this technique benefits from the relatively simple structure of the constraint graph, but still has exponential complexity in time and space. It has nevertheless provided the best results in exactly solving the MDSLP [1], providing optimal solutions up to 20×20 patterns.
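To illustrate the idea of variable elimination (on a problem far simpler than the MDSLP's constraint graph), the sketch below maximizes a sum of pairwise functions over a chain of binary variables, eliminating variables from right to left. The table `h` is the kind of auxiliary memory structure mentioned above: for each value of the current variable, it caches the best value achievable by the already-eliminated suffix. All names here are illustrative, not taken from [1]:

```python
def eliminate_chain(n, f, domain=(0, 1)):
    """Maximize sum_{i=0}^{n-2} f(x_i, x_{i+1}) over all assignments
    of x_0, ..., x_{n-1}, by successive variable elimination."""
    # h[v] = best value of the eliminated suffix given the current
    # variable takes value v; initially there is no suffix.
    h = {v: 0 for v in domain}
    for _ in range(n - 1):          # eliminate x_{n-1}, ..., x_1 in turn
        h = {v: max(f(v, w) + h[w] for w in domain) for v in domain}
    return max(h.values())          # finally choose the best x_0

# Toy objective: adjacent variables should disagree.
best = eliminate_chain(4, lambda a, b: 1 if a != b else 0)
print(best)  # 3, attained e.g. by 0,1,0,1
```

On a chain each elimination step is cheap, but when a variable is constrained to many others (as happens for whole rows in the MDSLP), the table must be indexed by all combinations of the neighboring variables' values, which is the source of BE's exponential cost.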

A completely different approach for finding MDSLs is using metaheuristic techniques. The first such approach was due to Gallardo et al. [2], using memetic algorithms (MAs). This MA featured a dynastically optimal recombination operator based on BE, and a local searcher based on tabu search (TS). The former is a restricted version of BE that considers the set of rows contained in the parents to be recombined, and returns the best possible offspring that can be constructed from these rows. As to TS, it was used to improve every generated solution in short stints of n² iterations. It was shown that an evolutionary algorithm using neither TS nor BE was incapable of providing a single feasible solution in most runs for sizes 12 and above. In contrast, an MA featuring just TS consistently provided feasible solutions in all cases, and optimal ones for n < 15. Adding BE resulted in optimal solutions for n < 17.

A further step was taken in [3], considering a multilevel hybrid algorithm that incorporated beam search and the aforementioned MA. The use of superbuckets (a variable clustering technique) to reduce the cost of BE-based recombination was considered. It did not ultimately lead to better results, but helped in showing the usefulness of multiparent recombination.

A full-fledged approach incorporating multiparent BE-based recombination, and mini-buckets to compute lower bounds of incomplete solutions in the node-selection phase of beam search, was finally reported in [4] (see also [5] for additional details). This algorithm was capable of obtaining optimal solutions up to size 20 (the largest size solved exactly). For larger instances, only optimal symmetric solutions are known, up to size 28. Compared to these, the hybrid MA was capable of finding equal or better solutions; indeed, it provided new best-known solutions for sizes 24 and 26. If symmetry constraints were enforced, the algorithm was capable of finding optimal solutions up to size 28.

References

[1] J. Larrosa, E. Morancho, D. Niso, On the practical use of variable elimination in constraint optimization problems: ‘still life’ as a case study. Journal of Artificial Intelligence Research 23:421–440, 2005

[2] J.E. Gallardo, C. Cotta, A.J. Fernández, A Memetic Algorithm with Bucket Elimination for the Still Life Problem, Evolutionary Computation in Combinatorial Optimization, J. Gottlieb, G. Raidl (eds.), pp. 73-85, Lecture Notes in Computer Science 3906, Springer-Verlag, Berlin Heidelberg, 2006

[3] J.E. Gallardo, C. Cotta, A.J. Fernández, A Multi-Level Memetic/Exact Hybrid Algorithm for the Still Life Problem, Parallel Problem Solving from Nature IX, T. Runarsson et al. (eds.), pp. 212-221, Lecture Notes in Computer Science 4193, Springer-Verlag, Berlin Heidelberg, 2006

[4] J.E. Gallardo, C. Cotta, A.J. Fernández, Solving Weighted Constraint Satisfaction Problems with Memetic/Exact Hybrid Algorithms, Journal of Artificial Intelligence Research 35:533-555, 2009

[5] J.E. Gallardo, C. Cotta, A.J. Fernández, Finding Still Lifes with Memetic/Exact Hybrid Algorithms, arXiv:0812.4170, 2008

Differential Evolution

In the last seminar, we discussed Differential Evolution (DE), a population-based search algorithm for global optimization proposed by Price and Storn in 1995 [1]. DE is a type of Evolutionary Algorithm that efficiently solves optimization problems with real-valued parameters, although there are also proposals for integer and binary optimization. The algorithm is similar to an Evolution Strategy (ES). However, and this appears to be DE’s main strength, the algorithm overcomes some of the difficulties that a standard ES is not able to deal with. This is due to its specific mutation operator, which is not only remarkably simple, but also very effective.

As stated in [2], an efficient mutation scheme for real parameter optimization should:

–  Use a zero-mean distribution for generating mutation vectors.

–  Dynamically scale the distribution to suit each parameter.

–  Correlate mutations to ensure rotational invariance.

While ES tries to satisfy these criteria using self-adaptation (and its solution to rotational invariance creates scalability problems), DE’s core is a simple rule that uses three solutions to generate a new one, performing a kind of combined recombination and mutation. Through this peculiar linear combination of three vectors, DE’s mutation has a zero-mean distribution, dynamically scales to each parameter and, above all, is rotationally invariant: its performance is not affected when the principal axes of a separable function are arbitrarily rotated away from the coordinate axes, which means that it performs better than a standard ES on non-separable functions (i.e., those with epistasis). Furthermore, DE is resilient to noise, is suited for the optimization of non-stationary functions, and is inherently parallel. On the other hand, and as far as I know, there is still no convergence proof for DE. However, DE works, and it seems to work quite well.
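The three-vector rule is compact enough to show in full. Below is a minimal sketch of the classic "DE/rand/1/bin" variant on the sphere function; the parameter values (F, CR, population size) are commonly cited defaults, and all function names are my own for the example, so treat this as an illustration rather than a reference implementation:

```python
import random

def de(objective, dim, bounds, pop_size=20, F=0.5, CR=0.9, gens=200, seed=1):
    """Minimize `objective` with DE/rand/1/bin. F is the differential
    weight, CR the crossover rate."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [objective(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: linear combination of three distinct random vectors.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][k] + F * (pop[b][k] - pop[c][k])
                      for k in range(dim)]
            # Binomial crossover: mix mutant and target, forcing at least
            # one coordinate (jrand) to come from the mutant.
            jrand = rng.randrange(dim)
            trial = [mutant[k] if (rng.random() < CR or k == jrand)
                     else pop[i][k] for k in range(dim)]
            # Greedy selection: trial replaces the target only if no worse.
            f_trial = objective(trial)
            if f_trial <= fit[i]:
                pop[i], fit[i] = trial, f_trial
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]

sphere = lambda x: sum(v * v for v in x)
x, fx = de(sphere, dim=5, bounds=(-5.0, 5.0))
print(fx)  # close to 0
```

Note how the difference vector pop[b] - pop[c] is what gives DE its self-scaling, rotationally invariant behavior: the step distribution is derived from the population itself rather than from externally tuned, axis-aligned parameters.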

On this page, you may find information about DE, such as C and Matlab code, demos and other interesting material.

[1] R. Storn, K. Price, “Differential Evolution – A Simple and Efficient Adaptive Scheme for Global Optimization over Continuous Spaces”, ICSI Technical Report TR-95-012, 1995.

[2] D. Corne, M. Dorigo, F. Glover (eds.), “New Ideas in Optimization”, McGraw-Hill, 1999.