Like stochastic beam search, genetic algorithms combine an uphill tendency with random exploration and exchange of information among parallel search threads.

The primary advantage, if any, of GAs comes from the crossover operation.

A GA is a variant of stochastic beam search in which successor states are generated by combining two parent states rather than by modifying a single state. The analogy to natural selection is the same (offspring, new generations of organisms, a fitness value as the selection criterion), except that now we are dealing with sexual rather than asexual reproduction.

GAs begin with a set of k randomly generated states, called the population. Each state, or individual, is represented as a string over a finite alphabet, most commonly a string of 0s and 1s.
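A minimal sketch of generating such a population in Python (the population size and string length here are illustrative, not prescribed by the algorithm):

```python
import random

def random_population(k, n):
    """Generate k random individuals, each a string of n bits."""
    return ["".join(random.choice("01") for _ in range(n)) for _ in range(k)]

pop = random_population(4, 8)  # e.g. ['10110010', '01101110', ...]
```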

Each state is rated by the objective function, or, in GA terminology, the fitness function. The probability of being chosen for reproduction is directly proportional to the fitness score, and in the figure the percentages are shown next to the raw scores.
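Fitness-proportional selection can be sketched with `random.choices`, which accepts per-item weights. The fitness function below (counting 1 bits) is a toy stand-in for the real objective:

```python
import random

def select(population, fitness_fn):
    """Choose one individual with probability proportional to its fitness."""
    weights = [fitness_fn(ind) for ind in population]
    return random.choices(population, weights=weights, k=1)[0]

# Toy fitness: number of 1 bits in the string.
fitness = lambda s: s.count("1")
pop = ["1111", "1100", "1000", "0001"]
parent = select(pop, fitness)
```

An individual with higher fitness is drawn more often, but even weak individuals retain some chance, which preserves diversity.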

Partners are selected at random for reproduction, in accordance with the probabilities in fig. b (the fitness function values).

In the example, one individual is selected twice and one is not selected at all.

For each pair to be mated, a crossover point is chosen randomly from the positions in the string.
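A single-point crossover can be sketched as follows (the cut point is drawn uniformly from the interior positions of the string):

```python
import random

def crossover(parent_a, parent_b):
    """Single-point crossover: pick a random cut point and swap the tails."""
    point = random.randrange(1, len(parent_a))  # cut strictly inside the string
    child1 = parent_a[:point] + parent_b[point:]
    child2 = parent_b[:point] + parent_a[point:]
    return child1, child2

a, b = crossover("11111111", "00000000")
```

Note that the two children together contain exactly the same characters as the two parents, just recombined around the cut point.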

Strings (individuals/states) are usually quite diverse early in the algorithm, so crossover has dramatic effects at first. As in simulated annealing, the algorithm takes large steps at first and then gradually slows down, taking smaller steps as it continues and the individuals/states in the population become much more similar.

Finally, offspring are subject to random mutations, which may alter characters at random positions in their strings.
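For bit strings, mutation is typically an independent flip of each bit with some small probability. A sketch (the mutation rate here is an illustrative choice, not a fixed part of the algorithm):

```python
import random

def mutate(individual, rate=0.05):
    """Flip each bit independently with probability `rate`."""
    return "".join(
        ("1" if bit == "0" else "0") if random.random() < rate else bit
        for bit in individual
    )

child = mutate("10110010")
```

With a small rate, most offspring pass through unchanged, but occasional flips keep the population from stagnating once crossover alone stops producing novelty.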