The Simple Genetic Algorithm (SGA) is one of the first evolutionary algorithms and is therefore (as its name suggests) rather simple. In SGA, the individuals are encoded as binary strings. The algorithm uses roulette wheel selection, a simple one-point crossover, and a bit-flip mutation.
Roman described during the lecture how the algorithm works. In the seminar, we will go through the implementation of the algorithm and use the opportunity to describe the library we will use during the rest of the term.
Evolutionary algorithms are implemented so as to maximize the fitness function. On the other hand, many real-life problems call for the minimization of an objective function. Therefore, in the library, each individual (evolution.individuals.Individual) has two different values: fitnessValue and objectiveValue. The fitness value is the one maximized by the algorithm. The objective value is the value of the objective function; the library uses it only for logging. The advantage of keeping the two values separate is that the objective value does not change when you change the fitness function (e.g. by scaling). Today, both values are the same, but we will soon see examples where they differ.
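To see why the two values are kept apart, consider the following sketch. The class below is hypothetical (it is not the library's Individual class); it only illustrates that a transformation of the fitness, such as scaling, leaves the logged objective value untouched:

```java
// Hypothetical sketch, NOT the library's Individual class: it only
// demonstrates why fitness and objective values are stored separately.
public class FitnessVsObjective {
    public double objectiveValue; // value of the objective function, used for logging
    public double fitnessValue;   // value the algorithm maximizes

    public FitnessVsObjective(double objective) {
        this.objectiveValue = objective;
        this.fitnessValue = objective; // today the two values coincide
    }

    // Example transformation: scaling the fitness does not change
    // the objective value, so the logs stay comparable.
    public void scaleFitness(double factor) {
        this.fitnessValue *= factor;
    }

    public static void main(String[] args) {
        FitnessVsObjective ind = new FitnessVsObjective(5.0);
        ind.scaleFitness(2.0);
        System.out.println(ind.fitnessValue + " / " + ind.objectiveValue); // prints 10.0 / 5.0
    }
}
```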
Today’s goal is rather simple: we want to evolve an individual with all 1s in its encoding. This problem is called OneMAX and is often used for theoretical analysis of evolutionary algorithms.
In the simple genetic algorithm, the encoding is simple: it is a binary string. In our library, such an individual is implemented as evolution.individuals.BooleanIndividual. The class supports the basic operations on individuals, i.e. reading and changing the value at any position in the encoding. It can also randomly initialize itself.
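The OneMAX fitness itself is trivial: it counts the 1s in the genotype. The class below is a hypothetical sketch of the idea, not the library's fitness implementation:

```java
// Hypothetical sketch of the OneMAX fitness on a binary genotype:
// the fitness of an individual is simply the number of 1s (true values).
public class OneMax {
    // Returns the number of true values in the genotype.
    public static int fitness(boolean[] genotype) {
        int ones = 0;
        for (boolean bit : genotype) {
            if (bit) {
                ones++;
            }
        }
        return ones;
    }

    public static void main(String[] args) {
        boolean[] g = {true, false, true, true};
        System.out.println(OneMax.fitness(g)); // prints 3
    }
}
```

The optimum is the all-ones string, whose fitness equals the length of the individual.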
Before the population is manipulated by the genetic operators, it is necessary to select which of the individuals will mate. To this end, the so-called mating selection is used. The simple genetic algorithm uses roulette wheel selection: the probability of choosing an individual is proportional to its fitness. The selection is implemented in evolution.selectors.RouletteWheelSelector.
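The principle behind the roulette wheel can be sketched as follows. This is a hypothetical illustration of the idea, not the code of RouletteWheelSelector: an individual with fitness f_i is selected with probability f_i divided by the sum of all fitness values.

```java
import java.util.Random;

// Hypothetical sketch of roulette wheel selection (not the library's
// RouletteWheelSelector): each individual owns a segment of the wheel
// proportional to its fitness; a random spin picks one segment.
public class RouletteWheel {
    // Returns the index of the selected individual.
    public static int select(double[] fitness, Random rng) {
        double total = 0.0;
        for (double f : fitness) {
            total += f;
        }
        // Spin the wheel: pick a point in [0, total) and find the
        // individual whose fitness segment contains it.
        double point = rng.nextDouble() * total;
        double cumulative = 0.0;
        for (int i = 0; i < fitness.length; i++) {
            cumulative += fitness[i];
            if (point < cumulative) {
                return i;
            }
        }
        return fitness.length - 1; // safety net against rounding errors
    }

    public static void main(String[] args) {
        double[] fitness = {1.0, 3.0, 6.0};
        int[] counts = new int[fitness.length];
        Random rng = new Random(42);
        for (int i = 0; i < 10000; i++) {
            counts[select(fitness, rng)]++;
        }
        // Individual 2 owns 60% of the wheel, so it should dominate the counts.
        System.out.println(counts[0] + " " + counts[1] + " " + counts[2]);
    }
}
```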
The genetic operators are also rather simple. The simple genetic algorithm uses one-point crossover: two individuals are selected, an index is chosen randomly, and all variables beyond this index are swapped between the two individuals. The crossover is implemented in evolution.operators.OnePtXOver. Notice that the method operate() receives the whole population as its parameter and it crosses consecutive pairs of individuals.
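The core of one-point crossover on a pair of binary strings can be sketched like this (a hypothetical illustration, not the code of OnePtXOver; in the algorithm the crossover point is chosen randomly):

```java
import java.util.Random;

// Hypothetical sketch of one-point crossover on binary strings
// (not the library's OnePtXOver): everything from the crossover
// point onward is swapped between the two parents.
public class OnePointCrossover {
    // Returns two new offspring; the parents stay untouched.
    public static boolean[][] cross(boolean[] p1, boolean[] p2, int point) {
        boolean[] o1 = p1.clone();
        boolean[] o2 = p2.clone();
        for (int i = point; i < p1.length; i++) {
            o1[i] = p2[i];
            o2[i] = p1[i];
        }
        return new boolean[][]{o1, o2};
    }

    public static void main(String[] args) {
        Random rng = new Random();
        boolean[] p1 = {true, true, true, true};
        boolean[] p2 = {false, false, false, false};
        // In the algorithm the crossover point is chosen randomly:
        boolean[][] kids = cross(p1, p2, rng.nextInt(p1.length));
        System.out.println(java.util.Arrays.toString(kids[0]));
    }
}
```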
The mutation operator is also simple. It goes through the whole population and mutates each individual with a given probability (mutationProbability). During the mutation, the operator goes through the individual and flips the value of each bit with the probability bitFlipProbability.
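The two-level structure of the mutation (a per-individual probability, then a per-bit probability) can be sketched as follows; again, this is a hypothetical illustration of the idea, not the library's BitFlipMutation:

```java
import java.util.Random;

// Hypothetical sketch of bit-flip mutation (not the library's
// BitFlipMutation): the individual as a whole is mutated with
// mutationProbability, and within a mutated individual each bit
// is flipped independently with bitFlipProbability.
public class BitFlipSketch {
    public static boolean[] mutate(boolean[] genotype, double mutationProbability,
                                   double bitFlipProbability, Random rng) {
        boolean[] result = genotype.clone();
        if (rng.nextDouble() < mutationProbability) {
            for (int i = 0; i < result.length; i++) {
                if (rng.nextDouble() < bitFlipProbability) {
                    result[i] = !result[i]; // flip this bit
                }
            }
        }
        return result;
    }

    public static void main(String[] args) {
        Random rng = new Random();
        boolean[] g = {true, false, true, false};
        System.out.println(java.util.Arrays.toString(mutate(g, 0.9, 0.25, rng)));
    }
}
```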
In both cases (crossover and mutation) the newly created individuals are added to the offspring population.
In some more general evolutionary algorithms, it is possible to let some of the parents survive to the next generation (for example, if we do not want to lose the best solution). The library uses a replacement operator to this end. In SGA, the replacement is simple: only the offspring survive and the parents are dropped (see evolution.SGAReplacement).
Sometimes, an environmental selection is also used. It is executed after the genetic operators and reduces the population back to its original size, e.g. in cases when the genetic operators create more offspring than there are parents, or when the recombination operator is used to preserve some of the parents. Its use can also be specified when using the library; there is no difference between the implementation of mating selection and environmental selection.
All the parameters of the algorithm (population size, operator probabilities, length of the individual, …) are set in the properties/ga-sga.properties file.
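For illustration, a Java properties file of this kind typically looks like the fragment below. The key names here are made up for the example; the real ones are the keys already present in properties/ga-sga.properties.

```properties
# Illustrative fragment only -- the actual key names are defined by
# the library; check properties/ga-sga.properties for the real ones.
popSize = 50
individualLength = 25
xoverProb = 0.8
mutProb = 0.05
mutProbPerBit = 0.04
```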
The main method in evolution.sga.Main processes these parameters and repeatedly calls the run(int n) method, which starts the evolutionary algorithm with the given parameters. The parameter n is the number of the run and is also used as the seed for the random number generator.
Now we have all the parts of the evolutionary algorithm and we need to put them together. The library contains the class evolution.EvolutionaryAlgorithm, whose method evolve() performs one generation of the evolutionary algorithm.
The algorithm has to be configured first. We can see an example configuration in the run method in evolution.sga.Main. It contains the following code:
//Create new population
Population pop = new Population();
//Set the options for the evolutionary algorithm
EvolutionaryAlgorithm ea = new EvolutionaryAlgorithm();
ea.addOperator(new BitFlipMutation(mutProb, mutProbPerBit));
The first lines set up the initial population of the algorithm: its size and a sample individual. This individual is then cloned to fill the whole population, and its randomInitialization() method is called to initialize all the individuals in the population (in the createRandomInitialPopulation() method).
The next lines set up the parameters of the evolutionary algorithm: the fitness function, the mating selection, and the genetic operators.
There is a loop a few lines later that calls the ea.evolve(pop) method repeatedly. This method performs one generation of the evolutionary algorithm. Its implementation is in the evolution.EvolutionaryAlgorithm class.
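To summarize the whole loop, the sketch below strings the pieces together on the OneMAX problem. It is a hedged, self-contained illustration, not the library's evolve() implementation, but the sequence of steps (mating selection, crossover on consecutive pairs, mutation, replacement by the offspring) is the same:

```java
import java.util.Random;

// Hedged sketch of one SGA generation on OneMAX; the library's
// EvolutionaryAlgorithm.evolve() is structured differently, but it
// performs the same sequence of steps.
public class SgaGeneration {
    static int fitness(boolean[] g) {
        int s = 0;
        for (boolean b : g) if (b) s++;
        return s;
    }

    static int rouletteSelect(boolean[][] pop, Random rng) {
        double total = 0;
        for (boolean[] g : pop) total += fitness(g);
        double point = rng.nextDouble() * total, cum = 0;
        for (int i = 0; i < pop.length; i++) {
            cum += fitness(pop[i]);
            if (point < cum) return i;
        }
        return pop.length - 1;
    }

    // One generation: mating selection, one-point crossover on
    // consecutive pairs, two-level bit-flip mutation; the offspring
    // replace the parents completely (SGA replacement).
    static boolean[][] evolve(boolean[][] pop, double xoverProb,
                              double mutProb, double bitFlipProb, Random rng) {
        boolean[][] offspring = new boolean[pop.length][];
        for (int i = 0; i < pop.length; i++)
            offspring[i] = pop[rouletteSelect(pop, rng)].clone();
        for (int i = 0; i + 1 < offspring.length; i += 2) {
            if (rng.nextDouble() < xoverProb) {
                int point = rng.nextInt(offspring[i].length);
                for (int j = point; j < offspring[i].length; j++) {
                    boolean tmp = offspring[i][j];
                    offspring[i][j] = offspring[i + 1][j];
                    offspring[i + 1][j] = tmp;
                }
            }
        }
        for (boolean[] g : offspring)
            if (rng.nextDouble() < mutProb)
                for (int j = 0; j < g.length; j++)
                    if (rng.nextDouble() < bitFlipProb) g[j] = !g[j];
        return offspring; // parents are dropped
    }

    public static void main(String[] args) {
        Random rng = new Random(1); // the seed plays the role of the run number n
        boolean[][] pop = new boolean[20][10];
        for (boolean[] g : pop)
            for (int j = 0; j < g.length; j++) g[j] = rng.nextBoolean();
        for (int gen = 0; gen < 50; gen++)
            pop = evolve(pop, 0.8, 0.9, 0.05, rng);
        int best = 0;
        for (boolean[] g : pop) best = Math.max(best, fitness(g));
        System.out.println("best fitness after 50 generations: " + best);
    }
}
```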
And that’s it - we have a working evolutionary algorithm. The description was rather involved, but we will use the same library during the following lessons and everything will be very similar to this example.
After you run the algorithm, it produces several output files. They can be found in the sga directory; the name of the output directory can also be changed in the properties/ga-sga.properties file.
It is not easy to read through all these files, but they can be used as input for the script that creates the plots. Its description is on a separate page.