Up to now, we have (mostly) used algorithms inspired by Darwinian evolution. However, there are other evolutionary theories. In this lesson, we will discuss Lamarckian and Baldwinian evolution and try to use them in the framework of evolutionary algorithms.

Lamarckism

Lamarckism is in many aspects similar to Darwinian evolution. It states that individuals inherit their characteristics from their parents and that useful characteristics are developed during the life of an individual. However, it also states that the characteristics are inherited in the form they have at the time the offspring is born. According to Lamarck, a blacksmith develops strong muscles in his arms due to his work, and therefore his sons will also have strong arms. Similarly, giraffes stretch their necks to reach leaves on trees, and therefore their offspring have even longer necks. An interesting situation arises when a person has two children: one born during a period when the parent is overweight, the other after the parent takes up sports. According to the theory, the first child should also be overweight, while the second should be athletic.

In evolutionary algorithms, we can use the part of the theory which states that improved characteristics are inherited. We let the parents improve during their “life”. This can be implemented as a clever mutation which improves the individual; the offspring then inherit the improved genes, as in the sketch below.
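A minimal sketch of such a Lamarckian mutation, assuming real-valued genomes represented as lists of floats and a fitness function to be minimized (all names here are hypothetical, not part of the course library). The local search here is a simple hill climber; the key point is that its result replaces the original genome:

```python
import random

def hill_climb(individual, fitness, steps=10, sigma=0.1):
    # A few steps of stochastic hill climbing: the "life" of the individual.
    best = list(individual)
    best_fit = fitness(best)
    for _ in range(steps):
        candidate = [x + random.gauss(0, sigma) for x in best]
        cand_fit = fitness(candidate)
        if cand_fit < best_fit:  # minimization
            best, best_fit = candidate, cand_fit
    return best

def lamarckian_mutation(individual, fitness):
    # The improved genes are written back into the genome, so the
    # offspring inherit the characteristics acquired during the "life".
    return hill_climb(individual, fitness)
```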

Baldwinism

Baldwin states that evolution does not select individuals based on their performance, but rather favors individuals who can quickly learn good behaviors. He states that when the environment changes, the individuals who can quickly adapt are favored by natural selection, and thanks to this selection, the next generation can adapt even faster. Thus, after a few generations, the adapted behavior may seem instinctive. Imagine, for example, that a new predator appears in the environment. The individuals who can quickly learn a behavior which makes it hard for the predator to catch them have an advantage and multiply in the population. Their offspring will also be able to learn the required behavior quickly. Over the next generations, thanks to (Darwinian) natural selection, the individuals learn the behavior so quickly that it looks instinctive.

For evolutionary algorithms, we again take what we like from the theory. Instead of evaluating the quality of the individual directly, we first let it learn for a while and then select it based on what it was able to learn. We can do this by running a local search during the fitness assignment and assigning the fitness based on the improved individual. The genome of the individual does not change in this case, as the sketch below illustrates.
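A minimal sketch of Baldwinian fitness assignment, under the same assumptions as above (hypothetical names, list-of-floats genomes, minimization). Note that only the fitness value reflects the learning; the caller keeps the original genome:

```python
def baldwinian_fitness(individual, fitness, local_search):
    # Let the individual learn first (e.g. via the hill_climb above) ...
    improved = local_search(individual, fitness)
    # ... and select it by what it was able to learn. The genome of
    # `individual` is not modified; only the returned fitness changes.
    return fitness(improved)
```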

Lamarck and Baldwin for continuous optimization

There is a natural way to use Baldwinian and Lamarckian ideas in evolutionary algorithms: we can use a gradient-based method as the learning during the “life” of the individual (i.e. during the mutation in Lamarckian evolution and during the fitness assignment in Baldwinian evolution). Our fitness function can numerically compute its own gradient at any point (Java: the numericalDerivative method of RealFunction, Python: the numericalDerivative function in co_functions.py). We can multiply the gradient by a suitable step size and subtract it from the individual's vector (we are minimizing). It is better to make a number of shorter steps instead of one longer step, as the sketch below does.
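A sketch of this gradient-based “life”. To keep the example self-contained, a simple central-difference gradient stands in for the library's numericalDerivative (whose exact signature is not shown here); in your solution, you would call the library function instead:

```python
def numerical_gradient(f, x, h=1e-6):
    # Central-difference approximation of the gradient of f at point x.
    grad = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad

def gradient_descent(x, f, steps=20, step_size=0.01):
    # Several shorter steps instead of one longer one, as suggested above.
    for _ in range(steps):
        g = numerical_gradient(f, x)
        # Subtract the scaled gradient: we are minimizing.
        x = [xi - step_size * gi for xi, gi in zip(x, g)]
    return x
```

In the Lamarckian variant, the vector returned by gradient_descent replaces the genome; in the Baldwinian variant, it is only evaluated and the fitness is assigned to the unchanged original.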

We can of course use any other local search method instead of the gradient-based one, for example simulated annealing or even another evolutionary algorithm.

Be careful: the library does not count fitness evaluations directly; instead it uses the population size and the number of generations to compute their number (when the graphs are created). Therefore, it does not in any way account for the evaluations you perform in the local search. The numerical computation of the gradient, for example, needs at least as many evaluations as the dimension of the problem. Pay attention to this when comparing the algorithm to the one from the last lesson.
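A rough back-of-the-envelope accounting of the hidden cost, under assumed settings (dimension 10, 20 gradient steps per individual, a forward-difference gradient costing dimension + 1 evaluations; the exact cost of the library's numericalDerivative may differ):

```python
dim, steps = 10, 20
# Each gradient costs dim + 1 evaluations; add 1 for the final fitness value.
evals_per_individual = 1 + steps * (dim + 1)
print(evals_per_individual)  # 221 evaluations, vs. the 1 the library counts
```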
