Wednesday, July 8, 2009

Selection

During each successive epoch, a proportion of the existing population is selected to breed a new generation. Individual solutions are selected through a fitness-based process, where fitter solutions (as measured by a fitness function) are typically more likely to be selected.

Certain selection methods rate the fitness of each solution and preferentially select the best solutions. Other methods rate only a random sample of the population, since evaluating every individual may be very time-consuming. There are many different techniques a genetic algorithm can use to select the individuals to be copied over into the next generation.
Some of these methods are as follows:
Elitist selection
The fittest members of each generation are guaranteed to be selected. Most GAs do not use pure elitism, but instead use a modified form where the single best individual, or a few of the best individuals, from each generation are copied into the next generation in case nothing better turns up.
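As a minimal sketch of this modified elitism (the function and parameter names here are illustrative, not from any particular library), the following Python function copies the n_elite fittest individuals unchanged and fills the rest of the new generation from a caller-supplied breed() function:

```python
def next_generation_with_elitism(population, fitnesses, breed, n_elite=1):
    """Copy the n_elite fittest individuals unchanged into the next
    generation; fill the remaining slots with newly bred offspring."""
    # Indices sorted from fittest to least fit.
    order = sorted(range(len(population)),
                   key=lambda i: fitnesses[i], reverse=True)
    elites = [population[i] for i in order[:n_elite]]
    offspring = [breed() for _ in range(len(population) - n_elite)]
    return elites + offspring
```

However the offspring are produced, the elites survive verbatim, so the best fitness seen so far can never decrease between generations.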
Roulette-wheel selection
Also referred to as fitness-proportionate selection: each individual is assigned a probability of being selected in proportion to its fitness. The probability that individual i is selected is given by:

p(i) = f(i) / (f(1) + f(2) + … + f(P))

where f(i) is the fitness of individual i and P is the population size.


In this type of selection the fitness of each individual is represented as a proportionate slice of a wheel, similar to the roulette wheel game. The main disadvantages of this method are that it cannot be applied directly to minimization problems, and that it can lose search direction as the population converges, i.e., premature convergence.
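The wheel can be sketched in Python as a cumulative sum over the fitness values (this assumes all fitnesses are non-negative, which is exactly why the method does not apply directly to minimization problems):

```python
import random

def roulette_wheel_select(population, fitnesses):
    """Pick one individual with probability proportional to its fitness.
    Assumes all fitness values are non-negative."""
    total = sum(fitnesses)
    pick = random.uniform(0, total)   # a random point on the wheel
    running = 0.0
    for individual, fitness in zip(population, fitnesses):
        running += fitness            # walk slice by slice
        if running >= pick:
            return individual
    return population[-1]             # guard against floating-point round-off
```

Repeated draws return fitter individuals more often, but every individual with nonzero fitness retains some chance of selection.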
Linear Rank selection
Each individual in the population is assigned a numerical rank based on fitness, and a subjective fitness for each one is computed from that rank as given below. One disadvantage of linear rank selection is that the population must be sorted on each cycle.
Ej = EMin + (EMax − EMin) × (Rj − 1) / (P − 1)

where
P – population size
Rj – rank of individual j (1 = worst, P = best)
EMax – fitness (elapsed time) of the best individual
EMin – fitness (elapsed time) of the worst individual
The advantage of this method is that it can prevent very fit individuals from gaining dominance early at the expense of less fit ones, which would reduce the population's genetic diversity and might hinder attempts to find an acceptable solution.
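The rank-to-fitness mapping above can be sketched as follows (a minimal illustration, assuming rank 1 is the worst individual and rank P the best, per the formula):

```python
def linear_rank_fitness(raw_fitnesses, e_min=1.0, e_max=2.0):
    """Assign each individual a subjective fitness interpolated linearly
    between e_min (worst rank) and e_max (best rank)."""
    p = len(raw_fitnesses)
    # Sort indices so order[0] is the worst individual, order[-1] the best.
    order = sorted(range(p), key=lambda i: raw_fitnesses[i])
    subjective = [0.0] * p
    for rank, idx in enumerate(order, start=1):
        subjective[idx] = e_min + (e_max - e_min) * (rank - 1) / (p - 1)
    return subjective
```

Because only ranks matter, an individual with an enormous raw fitness gets the same subjective fitness as one that is best by a hair, which is what prevents early dominance.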
Tournament selection
This can be viewed as a noisy version of rank selection. The selection process is as follows: pick a random group of N (N ≥ 2) members, then select the fittest member of this group and discard the rest. Tournament selection inherits the advantages of rank selection but does not require a global reordering of the population, and it is more closely inspired by nature.
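The process above is short enough to sketch directly (function and parameter names are illustrative):

```python
import random

def tournament_select(population, fitnesses, k=2):
    """Pick k distinct members at random and return the fittest of them."""
    contenders = random.sample(range(len(population)), k)
    best = max(contenders, key=lambda i: fitnesses[i])
    return population[best]
```

The tournament size k controls selection pressure: k = 2 is a gentle, nearly rank-like pressure, while larger k makes it ever harder for weak individuals to win a tournament.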
Scaling selection
As the average fitness of the population increases, the strength of the selective pressure (search direction) also increases and the fitness function becomes more discriminating. This method can be helpful in making the best selection later on when all individuals have relatively high fitness and only small differences in fitness distinguish one from another.
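One common way to realize scaling selection is sigma scaling, which rescales each fitness by its deviation from the population mean; this is a sketch of that particular technique, not the only possible scaling scheme:

```python
import statistics

def sigma_scale(fitnesses, c=2.0):
    """Sigma scaling: expected selection weight based on how many
    standard deviations an individual sits above or below the mean."""
    mean = statistics.mean(fitnesses)
    sd = statistics.pstdev(fitnesses)
    if sd == 0:
        # All individuals equally fit: give everyone the same weight.
        return [1.0] * len(fitnesses)
    return [max(0.0, 1.0 + (f - mean) / (c * sd)) for f in fitnesses]
```

Late in a run, when absolute fitness differences are small, the division by the (now small) standard deviation magnifies those differences, keeping the selection discriminating.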
Generational selection
The offspring of the individuals selected from each generation become the entire next generation. No individuals are retained between generations.
Steady-state selection
In a steady-state genetic algorithm only one member of the population is changed at a time. To perform mutation, a member of the population is selected according to its fitness, copied, and the copy mutated; a second member of the population is then selected and replaced by the mutated string. In crossover, two members of the population are chosen as parents and a single child is created, which replaces a member of the population. Any selection method can be used to choose the parents or the individual for mutation. There are a number of different replacement strategies:
  • replace the worst member (often gives too-rapid convergence)
  • replace a randomly chosen member
  • select the member to replace in inverse proportion to its fitness.
The major difference between a steady-state and a generational GA is that for every P members of the population generated, the steady-state GA makes 2P selections (each update selects both a parent and a member to replace). Consequently the selection strength and genetic drift of a steady-state GA are twice those of the generational GA. The steady-state GA therefore appears twice as fast, although it can lose out in the long term because it does not explore the landscape as well as the generational GA.
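A single steady-state update can be sketched as follows; here a size-2 tournament picks the parent and the replaced member is chosen at random, but as noted above any selection and replacement strategies could be substituted (the helper names are illustrative):

```python
import random

def steady_state_step(population, fitness_fn, mutate):
    """One steady-state update: select a parent by fitness (size-2
    tournament), mutate a copy of it, and overwrite a randomly chosen
    member of the population with the mutant. Mutates `population` in place."""
    i, j = random.sample(range(len(population)), 2)
    if fitness_fn(population[i]) >= fitness_fn(population[j]):
        parent = population[i]
    else:
        parent = population[j]
    victim = random.randrange(len(population))   # "replace random member"
    population[victim] = mutate(parent)
    return population
```

Note the two selections per update (parent and victim), which is the source of the 2P selection count mentioned above.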
Hierarchical selection
Individuals go through multiple rounds of selection each generation. Lower-level evaluations are faster and less discriminating, while those that survive to higher levels are evaluated more rigorously. The advantage of this method is that it reduces overall computation time by using faster, less selective evaluation to weed out the majority of individuals that show little or no promise, and only subjecting those that survive this initial test to more rigorous, and more computationally expensive, evaluation.
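A two-level version of this idea can be sketched as a cheap filter followed by an expensive ranking (the function names and the 25% survivor fraction are illustrative choices, not fixed by the method):

```python
def hierarchical_select(population, cheap_fitness, expensive_fitness,
                        keep_fraction=0.25, n_survivors=2):
    """Two-round selection: a cheap evaluation prunes most of the
    population, then an expensive one ranks only the survivors."""
    # Round 1: fast, coarse evaluation of everyone.
    pool = sorted(population, key=cheap_fitness, reverse=True)
    shortlist = pool[:max(1, int(len(pool) * keep_fraction))]
    # Round 2: expensive evaluation of the shortlist only.
    shortlist.sort(key=expensive_fitness, reverse=True)
    return shortlist[:n_survivors]
```

The expensive function is called on only a quarter of the population, which is where the computational saving comes from.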
References:
Genetic algorithm – Wikipedia, the free encyclopedia, last modified 00:36, 15 February 2006, http://en.wikipedia.org/wiki/Genetic_algorithm
Genetic Algorithms, http://www.tjhsst.edu/~ai/AI2001/GA.HTM
Adam Marczyk (April 23, 2004), Genetic Algorithms and Evolutionary Computation, http://www.talkorigins.org/faqs/genalg/genalg.html
Book reference: David Edward Goldberg (1989), Genetic Algorithms in Search, Optimization and Machine Learning, Addison-Wesley.
© 2006 Kumaravel & Project Team

