
While studying genetic algorithms, I've come across different crossover operations used for binary chromosomes, such as the 1-point crossover, the uniform crossover, etc. These methods usually don't use any "intelligence".

I found methods like the fitness-based crossover and the Boltzmann crossover, which use the fitness values so that a child is more likely to be created from fitter parents.

Is there any other similar method that uses fitness or any other way for an intelligent crossover for binary chromosomes?

nbro
Pablo

3 Answers


It's not obvious what you mean by "intelligent crossover".

However, it is common to use fitness-based selection of parents: individuals in the current population who have higher fitness are assigned a higher probability of being selected to mate and produce offspring. This will increase the likelihood that "good" combinations of genes in members of the current population will be passed along to the next generation, and that independent "good" combinations will be combined in some members of the next generation.
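The fitness-proportional ("roulette wheel") selection described above can be sketched as follows; the function name and the assumption of non-negative fitness values are mine, not part of the answer:

```python
import random

def roulette_select(population, fitnesses):
    """Pick one individual with probability proportional to its fitness.

    Assumes all fitness values are non-negative; population and
    fitnesses are parallel lists (illustrative names).
    """
    total = sum(fitnesses)
    if total == 0:
        return random.choice(population)  # degenerate case: uniform pick
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if pick <= running:
            return individual
    return population[-1]  # guard against floating-point drift
```

In practice, selection pressure is often softened (e.g. rank-based or tournament selection) so that a single very fit individual does not dominate the mating pool.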

The "best" crossover operator depends dramatically on the structure of the problem being solved, and on the mapping of gene "vectors" to the salient features of a solution.

Edit #1: In some cases it is important to increase diversity in order to avoid convergence to a local optimum. In that case, an "intelligent" GA might for example select a first parent for its high fitness, and a second parent at random. In "Generator", a GA I sold for a while about 25 years ago, mate selection worked that way, and it was often very effective. Generator also replaced any duplicate individuals in the population with entirely random individuals. I have also structured genetic algorithms specifically to evolve multiple separate populations of individuals, with minimum gene flow between the populations, in order to evolve multiple solutions corresponding to "regional" fitness optima.
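A minimal sketch of the two diversity-preserving mechanisms described above (a fitness-proportional first parent paired with a uniformly random mate, plus duplicate replacement); this is illustrative only, not Generator's actual code:

```python
import random

def select_mates(population, fitnesses):
    """First parent chosen fitness-proportionally, second uniformly at
    random, per the mate-selection scheme described above (sketch)."""
    total = sum(fitnesses)
    r = random.uniform(0, total)
    running = 0.0
    first = population[-1]
    for ind, fit in zip(population, fitnesses):
        running += fit
        if r <= running:
            first = ind
            break
    second = random.choice(population)  # random mate preserves diversity
    return first, second

def replace_duplicates(population, chromosome_length):
    """Replace any duplicate bit-string individuals with random ones."""
    seen, result = set(), []
    for ind in population:
        key = tuple(ind)
        if key in seen:
            ind = [random.randint(0, 1) for _ in range(chromosome_length)]
        else:
            seen.add(key)
        result.append(ind)
    return result
```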

Edit #2: In genetic algorithms it is not common to directly seek the best combination of genes from both parents. The assumption is that higher-fitness parents are more likely to produce even higher-fitness offspring than lower-fitness parents. Sometimes there is a local search (hill climbing) operation where the offspring of two parents is mutated in various ways and the best of the mutants is put in the next generation.
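The local-search (hill-climbing) step described above might look like this; `fitness_fn` and the single-bit-flip mutation are assumptions made for the sketch:

```python
import random

def best_of_mutants(offspring, fitness_fn, num_mutants=10):
    """Hill-climb step: mutate the offspring several times and keep the
    best variant seen (including the unmutated original).

    fitness_fn is assumed to map a bit list to a number; names are
    illustrative, not from the answer.
    """
    best, best_fit = offspring, fitness_fn(offspring)
    for _ in range(num_mutants):
        mutant = offspring[:]
        i = random.randrange(len(mutant))
        mutant[i] ^= 1  # flip one random bit
        f = fitness_fn(mutant)
        if f > best_fit:
            best, best_fit = mutant, f
    return best
```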

And, sometimes a crossover operation involves producing a larger number of offspring than the parent generation, followed by culling low-fitness individuals to keep the population size constant. This is vaguely analogous to a local search via mutation, but uses random crossover instead of random mutation for its search.
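The overproduce-and-cull scheme could be sketched like this, assuming bit-list chromosomes and one-point crossover; function and parameter names are illustrative:

```python
import random

def crossover_and_cull(population, fitness_fn, offspring_per_pair=4):
    """Produce more offspring than the population size via random
    one-point crossover, then cull to keep the size constant (sketch)."""
    n = len(population)
    offspring = []
    while len(offspring) < n * offspring_per_pair:
        p1, p2 = random.sample(population, 2)
        cut = random.randrange(1, len(p1))  # one-point crossover
        offspring.append(p1[:cut] + p2[cut:])
    # Cull: keep only the n fittest offspring.
    offspring.sort(key=fitness_fn, reverse=True)
    return offspring[:n]
```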

S. McGrew

The idea behind it

I'll use an analogy: while classical GAs resemble how humanity has reproduced until now, "intelligent crossover" looks more like designer babies. You would first need to identify which genes are responsible for certain behaviors, and only then can you pass them on to the new generation.

That said, you will understand why there aren't many such algorithms around: it is a very case-tailored approach. Each problem may well require a different method, and it will certainly increase the complexity of the crossover algorithm considerably.


How to make your own

If you want to create one of these algorithms, you might want to add a step to your GA cycle: Evaluation, Identification, Selection, Crossover, Mutation, Replacement.

On the Identification step, you can run a probabilistic search on your population to discover which genes might be responsible for higher fitness results. Then you can:

  • proceed normally: Select two parents based on their fitness and perform crossover giving more importance to the "good" genes. This is the "safest" way.

  • fuse Selection and Crossover: use the data gathered during the Identification step to create good individuals directly from the population pool. This approach is risky, as it can reduce population diversity at an alarming rate, leading to stagnation. This is how fitness-based crossovers are generally built [paper]
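One crude way to implement the hypothetical Identification step is to score each gene position by the fitness gap between individuals carrying a 1 there and those carrying a 0; this is an illustrative sketch, not the method of the linked paper:

```python
def identify_good_genes(population, fitnesses):
    """Crude Identification step: for each gene position, compare the mean
    fitness of individuals carrying a 1 vs. a 0 at that position.

    A positive score suggests the 1-allele is associated with higher
    fitness. Ignores gene interactions (epistasis), so treat the scores
    as heuristic hints only.
    """
    length = len(population[0])
    scores = []
    for pos in range(length):
        ones = [f for ind, f in zip(population, fitnesses) if ind[pos] == 1]
        zeros = [f for ind, f in zip(population, fitnesses) if ind[pos] == 0]
        mean_one = sum(ones) / len(ones) if ones else 0.0
        mean_zero = sum(zeros) / len(zeros) if zeros else 0.0
        scores.append(mean_one - mean_zero)
    return scores
```

These scores could then bias the crossover (the "safest" option above) or seed new individuals directly (the riskier fused option).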

0xSwego

Method Overview:

  • Parent Selection: Select two parents from the population, with selection probability proportional to their fitness.
  • Crossover Probability: Determine the probability for each gene to be inherited from the first parent based on the fitness of both parents. This increases the likelihood of inheriting genes from fitter parents.
  • Offspring Generation: Generate offspring by choosing genes from either parent based on this probability, resulting in offspring biased toward fitter parent genes.

Pseudo-code (written here as runnable Python):

    import random

    def fitness_based_uniform_crossover(parents, fitnesses, num_offspring):
        # Normalize fitnesses into selection probabilities.
        total = sum(fitnesses)
        fitness_probs = [f / total for f in fitnesses]
        offspring_list = []
        for _ in range(num_offspring):
            # Select two distinct parents, with probability proportional to fitness.
            i = random.choices(range(len(parents)), weights=fitness_probs)[0]
            j = i
            while j == i:
                j = random.choices(range(len(parents)), weights=fitness_probs)[0]
            # Probability of inheriting each gene from parent i, sharpened
            # slightly (exponent 1.2) toward the fitter parent.
            prob_parent1 = (fitness_probs[i] / (fitness_probs[i] + fitness_probs[j])) ** 1.2
            offspring = [g1 if random.random() < prob_parent1 else g2
                         for g1, g2 in zip(parents[i], parents[j])]
            offspring_list.append(offspring)
        return offspring_list

Key Advantages:

  • Effectively leverages parent fitness to guide genetic inheritance.
  • Maintains diversity while biasing offspring toward fitter solutions.
  • Adaptable to various problem domains through fitness function design.