[EN] Description of the Java library

Simple Genetic Algorithm (SGA) is one of the first evolutionary algorithms and is therefore, as its name suggests, rather simple. In SGA the individuals are encoded as binary strings. It uses roulette wheel selection, a simple one-point crossover, and bit-flip mutation.

Roman described how the algorithm works during the lecture. In the seminar, we will go through the implementation of the algorithm and use the opportunity to describe the library we will use during the rest of the term.

Fitness and objective function

Evolutionary algorithms are implemented in a way to maximize the fitness function. On the other hand, many real-life problems call for the minimization of an objective function. Therefore, in the library, each individual (evolution.individuals.Individual) has two different values - fitnessValue and objectiveValue. The fitness value is the one maximized by the algorithm. The objective value is the value of the objective function; the library uses it only for logging. The advantage of keeping the values separate is that the objective value does not change when you change the fitness function (e.g. by scaling). Today, both values are the same, but we will soon see examples where they differ.

Today’s goal is rather simple - we want to evolve an individual with all 1s in its encoding. This problem is called OneMAX and is often used for theoretical analysis of evolutionary algorithms.
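The OneMAX fitness is simply the number of ones in the bit string, and for this problem the objective value can be the same number (this is the case mentioned above where fitness and objective coincide). A minimal sketch over a plain boolean array - an illustration only, not the library's ExampleFitnessFunction:

```java
// OneMAX sketch: the fitness of a bit string is the number of ones in it.
// Works on a plain boolean[]; the library evaluates BooleanIndividual instead.
public class OneMaxSketch {
    static int oneMax(boolean[] bits) {
        int ones = 0;
        for (boolean b : bits) {
            if (b) ones++; // count set bits
        }
        return ones;
    }

    public static void main(String[] args) {
        System.out.println(oneMax(new boolean[]{true, false, true, true})); // prints 3
    }
}
```

For OneMAX the fitness is already non-negative and grows with solution quality, so no scaling is needed before roulette wheel selection.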

Individual encoding

In the simple genetic algorithm, the encoding is simple - it is a binary string. In our library, such an individual is implemented as evolution.individuals.BooleanIndividual. The class supports the basic operations with individuals, i.e. reading and changing the value at any position in its encoding. It can also randomly initialize itself.
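Conceptually, random initialization of a binary encoding just fills the array with independent coin flips. A sketch with a hypothetical class (not the actual BooleanIndividual implementation, whose details may differ):

```java
import java.util.Arrays;
import java.util.Random;

// Sketch of a binary-encoded individual. The real BooleanIndividual offers
// similar operations (read/write a position, random initialization), but this
// class is only an illustration of the idea.
public class BitStringSketch {
    final boolean[] genes;
    final Random rng;

    BitStringSketch(int length, Random rng) {
        this.genes = new boolean[length];
        this.rng = rng;
    }

    void randomInitialization() {
        for (int i = 0; i < genes.length; i++) {
            genes[i] = rng.nextBoolean(); // fair coin flip per bit
        }
    }

    public static void main(String[] args) {
        BitStringSketch ind = new BitStringSketch(10, new Random(42));
        ind.randomInitialization();
        System.out.println(Arrays.toString(ind.genes));
    }
}
```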


Mating selection

Before the population is manipulated by the genetic operators, it is necessary to select which individuals will mate. To this end, so-called mating selection is used. The simple genetic algorithm uses roulette wheel selection: the probability of choosing an individual is proportional to its fitness. The selection is implemented in evolution.selectors.RouletteWheelSelector.
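The "wheel" can be implemented by drawing a random number from [0, total fitness) and walking through the cumulative fitness sums. A sketch over an array of fitness values (an illustration, not the library's RouletteWheelSelector; it assumes non-negative fitnesses):

```java
import java.util.Random;

// Roulette wheel selection sketch: individual i is chosen with probability
// fitness[i] / (sum of all fitnesses). Assumes non-negative fitness values.
public class RouletteSketch {
    static int select(double[] fitness, Random rng) {
        double total = 0.0;
        for (double f : fitness) total += f;
        double ball = rng.nextDouble() * total; // spin the wheel
        double acc = 0.0;
        for (int i = 0; i < fitness.length; i++) {
            acc += fitness[i];
            if (ball < acc) return i;
        }
        return fitness.length - 1; // fallback for rounding errors
    }

    public static void main(String[] args) {
        double[] fitness = {1.0, 3.0, 6.0};
        int[] counts = new int[3];
        Random rng = new Random(0);
        for (int i = 0; i < 10_000; i++) counts[select(fitness, rng)]++;
        // individual 2 should be picked in roughly 60% of the spins
        System.out.println(counts[0] + " " + counts[1] + " " + counts[2]);
    }
}
```

Note that the assumption of non-negative fitness is one reason for keeping fitness separate from the objective value: a minimization objective must be transformed before a roulette wheel can use it.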

Genetic operators

The genetic operators are also rather simple. The simple genetic algorithm uses the one-point crossover: two individuals are selected, an index is chosen randomly, and all variables beyond this index are swapped between the two individuals. The crossover is implemented in evolution.operators.OnePtXOver. Notice that the method operate() receives the whole population as its parameter and crosses consecutive pairs of individuals.
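A minimal sketch of one-point crossover on two boolean arrays (an illustration of the operator itself; the library's OnePtXOver applies it to consecutive pairs of a whole population):

```java
import java.util.Arrays;
import java.util.Random;

// One-point crossover sketch: pick a random cut point and swap everything
// from that point on between the two parents. Works in place on two arrays
// of equal length.
public class OnePtXOverSketch {
    static void cross(boolean[] a, boolean[] b, Random rng) {
        int point = rng.nextInt(a.length); // random cut point
        for (int i = point; i < a.length; i++) {
            boolean tmp = a[i];
            a[i] = b[i];
            b[i] = tmp;
        }
    }

    public static void main(String[] args) {
        boolean[] p1 = {true, true, true, true};
        boolean[] p2 = {false, false, false, false};
        cross(p1, p2, new Random(7));
        System.out.println(Arrays.toString(p1));
        System.out.println(Arrays.toString(p2));
    }
}
```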

The mutation operator is also simple. It goes through the whole population and mutates each individual with a given probability (mutationProbability). During the mutation, the operator goes through the individual and flips the value of each bit with the probability bitFlipProbability.
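The inner, per-bit part of this operator can be sketched as follows (an illustration with the same two roles of probabilities as described above, not the library's BitFlipMutation code):

```java
import java.util.Random;

// Bit-flip mutation sketch: inside a mutated individual, every bit is flipped
// independently with probability bitFlipProbability. In the full operator this
// is only applied to individuals chosen with probability mutationProbability.
public class BitFlipSketch {
    static void mutate(boolean[] genes, double bitFlipProbability, Random rng) {
        for (int i = 0; i < genes.length; i++) {
            if (rng.nextDouble() < bitFlipProbability) {
                genes[i] = !genes[i]; // flip this bit
            }
        }
    }

    public static void main(String[] args) {
        boolean[] genes = new boolean[8];
        // a common choice: flip on average one bit per individual
        mutate(genes, 1.0 / genes.length, new Random(1));
    }
}
```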

In both cases (crossover and mutation) the newly created individuals are added to the offspring population.

Replacement and environmental selection

In some more general evolutionary algorithms, some of the parents may survive to the next generation (for example, if we do not want to lose the best solution found so far). The library uses a replacement operator to this end. In SGA the replacement is simple - only the offspring survive, so the parents are dropped (see evolution.SGAReplacement).
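This generational replacement amounts to returning the offspring and forgetting the parents. A trivial sketch on plain lists (an illustration of the idea, not the SGAReplacement source):

```java
import java.util.ArrayList;
import java.util.List;

// Generational (SGA-style) replacement sketch: the parents are discarded and
// only the offspring survive. Shown on plain lists of a generic type T.
public class GenerationalReplacementSketch {
    static <T> List<T> replace(List<T> parents, List<T> offspring) {
        return new ArrayList<>(offspring); // parents are simply dropped
    }

    public static void main(String[] args) {
        List<String> parents = List.of("p1", "p2");
        List<String> offspring = List.of("o1", "o2");
        System.out.println(replace(parents, offspring)); // prints [o1, o2]
    }
}
```

An elitist variant would instead copy the best parent into the returned list before filling it with offspring; that is the kind of change the replacement operator exists to make easy.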

Sometimes, an environmental selection is also used. It is executed after the genetic operators and reduces the population back to its original size, e.g. in cases when the genetic operators create more offspring than parents, or when the replacement operator preserves some of the parents. Its use can also be specified when using the library, and there is no difference between the implementation of mating selection and environmental selection.

Algorithm settings

All the parameters of the algorithm (population size, operator probabilities, length of the individual, …) are set in the properties/ga-sga.properties file.
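Such files can be read with the standard java.util.Properties API. A sketch of the pattern - note that the key names here ("popSize", "xoverProb") are hypothetical placeholders, not necessarily the keys used in properties/ga-sga.properties:

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

// Sketch of reading algorithm parameters with java.util.Properties.
// The key names are hypothetical placeholders; check the actual
// properties/ga-sga.properties file for the real ones.
public class PropertiesSketch {
    public static void main(String[] args) throws IOException {
        // In the real program the file would be loaded from disk
        // (e.g. with a FileInputStream); an inline string is parsed here
        // so that the sketch runs anywhere.
        Properties props = new Properties();
        props.load(new StringReader("popSize=100\nxoverProb=0.8"));
        int popSize = Integer.parseInt(props.getProperty("popSize"));
        double xoverProb = Double.parseDouble(props.getProperty("xoverProb"));
        System.out.println(popSize + " " + xoverProb); // prints 100 0.8
    }
}
```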

The main method of evolution.sga.Main processes these parameters and repeatedly calls the run(int n) method, which starts the evolutionary algorithm with the given parameters. The parameter n of this method is the number of the run and is also used as the seed for the random number generator.
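Seeding the generator with the run number makes every run reproducible: repeating run n replays exactly the same random decisions. A generic illustration of the principle (not the library's Main):

```java
import java.util.Random;

// Using the run number as the random seed makes each run reproducible:
// the same n always yields the same random sequence.
public class SeededRunsSketch {
    static double run(int n) {
        Random rng = new Random(n); // seed = run number
        return rng.nextDouble();    // stands in for a whole evolutionary run
    }

    public static void main(String[] args) {
        for (int n = 0; n < 3; n++) {
            System.out.println("run " + n + ": " + run(n));
        }
        // calling run(1) twice produces identical results
    }
}
```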

One generation of the algorithm

Now we have all the parts of the evolutionary algorithm and we need to put them together. The library contains the class evolution.EvolutionaryAlgorithm, whose method evolve() performs one generation of the evolutionary algorithm.

The algorithm has to be configured first. The run method in evolution.sga.Main contains an example configuration with the following code:

//Create new population
Population pop = new Population();
pop.setSampleIndividual(new BooleanIndividual(dimension)); 

//Set the options for the evolutionary algorithm 
EvolutionaryAlgorithm ea = new EvolutionaryAlgorithm(); 
ea.setFitnessFunction(new ExampleFitnessFunction()); 
ea.addMatingSelector(new RouletteWheelSelector()); 
ea.addOperator(new OnePtXOver(xoverProb)); 
ea.addOperator(new BitFlipMutation(mutProb, mutProbPerBit));

The first lines set up the initial population of the algorithm and its sample individual. This individual is then cloned to fill the whole population, and its randomInitialization() method is called to initialize all the individuals in the population (in the createRandomInitialPopulation() method).

The next lines set up the parameters of the evolutionary algorithm - the fitness function, the mating selection, and the genetic operators.

There is a loop a few lines later that calls the ea.evolve(pop) method repeatedly. This method performs one generation of the evolutionary algorithm. Its implementation is in the evolution.EvolutionaryAlgorithm class.
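Conceptually, one such generation chains together the pieces described above: evaluation, mating selection, crossover, mutation, and replacement. The following is a schematic, self-contained sketch of that loop for OneMAX over boolean arrays - a conceptual illustration, not the library's actual evolve() implementation:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Random;

// Schematic of one SGA generation over boolean[] individuals:
// evaluate -> roulette wheel selection -> one-point crossover ->
// bit-flip mutation -> generational replacement.
// A conceptual sketch, not the library's EvolutionaryAlgorithm.evolve().
public class OneGenerationSketch {
    static List<boolean[]> evolve(List<boolean[]> pop, Random rng) {
        // 1. evaluate: OneMAX fitness = number of ones
        double[] fit = new double[pop.size()];
        for (int i = 0; i < pop.size(); i++) {
            for (boolean b : pop.get(i)) if (b) fit[i]++;
        }
        // 2. mating selection: roulette wheel into a parent pool
        double total = 0;
        for (double f : fit) total += f;
        List<boolean[]> offspring = new ArrayList<>();
        for (int k = 0; k < pop.size(); k++) {
            double ball = rng.nextDouble() * total, acc = 0;
            int sel = pop.size() - 1; // fallback (e.g. all-zero population)
            for (int i = 0; i < pop.size(); i++) {
                acc += fit[i];
                if (ball < acc) { sel = i; break; }
            }
            offspring.add(pop.get(sel).clone());
        }
        // 3. one-point crossover on consecutive pairs
        for (int i = 0; i + 1 < offspring.size(); i += 2) {
            boolean[] a = offspring.get(i), b = offspring.get(i + 1);
            int point = rng.nextInt(a.length);
            for (int j = point; j < a.length; j++) {
                boolean t = a[j]; a[j] = b[j]; b[j] = t;
            }
        }
        // 4. bit-flip mutation: flip each bit with probability 1/length
        for (boolean[] ind : offspring) {
            for (int j = 0; j < ind.length; j++) {
                if (rng.nextDouble() < 1.0 / ind.length) ind[j] = !ind[j];
            }
        }
        // 5. generational replacement: only the offspring survive
        return offspring;
    }

    public static void main(String[] args) {
        Random rng = new Random(0);
        List<boolean[]> pop = new ArrayList<>();
        for (int i = 0; i < 10; i++) {
            boolean[] ind = new boolean[16];
            for (int j = 0; j < ind.length; j++) ind[j] = rng.nextBoolean();
            pop.add(ind);
        }
        for (int gen = 0; gen < 50; gen++) pop = evolve(pop, rng);
        int best = 0;
        for (boolean[] ind : pop) {
            int ones = 0;
            for (boolean b : ind) if (b) ones++;
            best = Math.max(best, ones);
        }
        System.out.println("best OneMAX value after 50 generations: " + best);
    }
}
```

In the library the same steps are split among the selector, operator, and replacement objects registered in the configuration code above, which is what makes the pieces easy to swap in later lessons.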

And that’s it - we have a working evolutionary algorithm. The description was rather involved, but we will use the same library during the following lessons, and everything will be very similar to this.


After you run the algorithm, it produces several output files. They can be found in the sga directory; the name of the output directory can also be changed in the properties/ga-sga.properties file.

  • files sgaLog.fitness.* - contain the individual runs of the algorithm. There are always two columns: the first contains the fitness of the best individual in each generation, the second the average fitness in the population.
  • file sgaLog.fitness_stats.log - contains statistics over all the runs from the sgaLog.fitness.* files. There are six columns: the first contains the number of fitness evaluations, the second the fitness of the best individual in the worst run, the third the first quartile of the fitnesses, the fourth the average fitness, the fifth the third quartile, and the last the fitness of the best individual in the best run.
  • files sgaLog.objective.* and sgaLog.objective_stats - are like sgaLog.fitness.* and sgaLog.fitness_stats, but they contain the objective value instead of the fitness value.
  • files sgaLog.details.*.xml.zip - contain detailed logs of each run, including all the individuals in the population before and after each call to a genetic operator. They are best used by unzipping them and copying the resources/eva.xsl file to the same folder; the log can then be opened in any browser (except Google Chrome, which blocks local xml+xsl).

It is not easy to read through all these files, but they can be used as input for the script that creates the plots. Its description is on a separate page.

Last modified: Monday, 7 October 2019, 3:28 PM