Showing posts with label Optimisation Methods. Show all posts

Wednesday, July 8, 2009

Termination

This generational process is repeated until a termination condition is reached. Common termination conditions are:
  • A solution is found that satisfies minimum criteria
  • Fixed number of generations reached

Genetic Operators

Genetic variation is a necessity for the process of evolution. Genetic operators used in genetic algorithms are analogous to those which occur in the natural world: survival of the fittest, or selection; asexual or sexual reproduction (crossover, or recombination); and mutation.
Crossover
In genetic algorithms, crossover is a genetic operator used to combine the genetic information of two parents to generate new offspring.
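As an illustrative sketch (the function name and bit-string encoding are my own choices, not a fixed standard), single-point crossover on binary strings looks like this:

```python
import random

def single_point_crossover(parent1, parent2, point=None):
    """Swap the tails of two equal-length bit strings at a single cut point."""
    assert len(parent1) == len(parent2)
    if point is None:
        point = random.randint(1, len(parent1) - 1)  # never cut at the very ends
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

print(single_point_crossover("11111111", "00000000", point=3))
# ('11100000', '00011111')
```

Each child inherits a prefix from one parent and a suffix from the other, so material from both parents survives into the next generation.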

Fitness

Fitness is used to determine which chromosomes will be used for the next generation. Mathematically, the fitness of chromosome Sk is f(Sk), where k = 1, ..., n and n is the population size. The function used to calculate fitness obviously depends on how you decide to encode the string that represents your problem.

Selection

During each successive epoch, a proportion of the existing population is selected to breed a new generation. Individual solutions are selected through a fitness-based process, where fitter solutions (as measured by a fitness function) are typically more likely to be selected.
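One common fitness-based scheme is fitness-proportionate (roulette-wheel) selection. The sketch below is illustrative and assumes non-negative fitness values:

```python
import random

def roulette_select(population, fitnesses):
    """Pick one individual; selection probability is proportional to fitness."""
    pick = random.uniform(0, sum(fitnesses))
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]  # guard against floating-point rounding
```

Fitter individuals occupy a larger slice of the wheel, so they are chosen more often, yet weaker ones keep a non-zero chance of breeding, which preserves diversity in the population.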

Sunday, July 5, 2009

Operational Parameters of Genetic Algorithm

As noted earlier, the following are genetic parameters whose roles parallel their biological counterparts in nature:
  • Chromosomes
  • Fitness

BIOLOGICAL BACKGROUND - GA

All living organisms consist of cells, and each cell contains the same set of chromosomes. Chromosomes are strings of DNA and serve as a blueprint for the whole organism. A chromosome's characteristics are determined by its genes. Each gene has several forms or alternatives, called alleles, producing differences in the set of characteristics associated with that gene. The set of chromosomes is called the genotype, which defines a phenotype (the individual) with a certain fitness.

Saturday, April 11, 2009

Genetic Algorithm – Short note on Implementation

Genetic algorithms are typically implemented as a computer simulation in which a population of abstract representations (called chromosomes) of candidate solutions (called individuals) to an optimization problem evolves toward better solutions. Traditionally, solutions are represented in binary as strings of 0s and 1s, but other encodings are also possible. The evolution starts from a population of completely random individuals and proceeds in generations. In each generation, the fitness of the whole population is evaluated; multiple individuals are stochastically selected from the current population (based on their fitness) and modified (mutated or recombined) to form a new population, which becomes the current population in the next iteration of the algorithm.
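The whole loop can be sketched in a few lines. The example below solves the toy OneMax problem (maximise the number of 1s in a bit string); the parameter values and the truncation-selection scheme are illustrative choices, not the only options:

```python
import random

def one_max_ga(length=20, pop_size=30, generations=100, mutation_rate=0.01):
    """Minimal GA: evolve a bit string towards all 1s (the OneMax problem)."""
    fitness = sum  # fitness of a chromosome = number of 1s
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) == length:        # termination: minimum criterion met
            break
        next_gen = pop[:2]                   # elitism: carry over the two fittest
        while len(next_gen) < pop_size:
            p1, p2 = random.sample(pop[:pop_size // 2], 2)  # select from fitter half
            cut = random.randint(1, length - 1)
            child = p1[:cut] + p2[cut:]                     # crossover
            child = [1 - g if random.random() < mutation_rate else g
                     for g in child]                        # mutation
            next_gen.append(child)
        pop = next_gen
    return max(pop, key=fitness)

print(sum(one_max_ga()))
```

Note how both termination conditions from the earlier post appear: the loop stops early when a perfect solution is found, or after a fixed number of generations.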

References

†Genetic algorithm - Wikipedia, the free encyclopedia, last modified 00:36, 15 February 2006, http://en.wikipedia.org/wiki/Genetic_algorithm

Related Posts:

Genetic Algorithm

Table of Contents

© 2006 Kumaravel & Project Team

Genetic Algorithm

A Genetic Algorithm (GA) is a programming technique that mimics Darwin's theory of evolution by natural selection as a problem-solving approach. It is part of evolutionary computing, a rapidly growing area of artificial intelligence. Because a GA is a heuristic, it does not guarantee an exact solution; for most real-world problems, though, an exact solution cannot be computed anyway, so a good approximation is enough.


Local Search

Local search is a meta-heuristic for solving computationally hard optimization problems. Local search can be used on problems that can be formulated as finding a solution maximizing a criterion among a number of candidate solutions. Local search algorithms move from solution to solution in the space of candidate solutions (the search space) until a solution deemed optimal is found or a time bound is elapsed.
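A generic sketch of that move-between-neighbours loop (the helper names and the toy problem below are my own, for illustration; real neighbourhoods are problem-specific):

```python
def local_search(initial, neighbors, score, max_steps=1000):
    """Repeatedly move to the best neighbour until none improves the score."""
    current = initial
    for _ in range(max_steps):               # time bound
        best_neighbor = max(neighbors(current), key=score)
        if score(best_neighbor) <= score(current):
            break                            # local optimum reached
        current = best_neighbor
    return current

# Toy example: maximise f(x) = -(x - 3)^2 over the integers, stepping by 1.
result = local_search(0, lambda x: [x - 1, x + 1], lambda x: -(x - 3) ** 2)
print(result)  # 3
```

Both stopping conditions from the paragraph above appear here: a solution no neighbour can improve on, or an elapsed step budget.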

References:

†Local search (optimization) - Wikipedia, the free encyclopedia, last modified 01:12, 1 March 2006, http://en.wikipedia.org/wiki/Local_search_%28optimization%29


Simulated annealing

Simulated annealing (SA) is a generic probabilistic meta-algorithm for the global optimization problem, namely locating a good approximation to the global optimum of a given function in a large search space. It was independently invented by S. Kirkpatrick, C. D. Gelatt and M. P. Vecchi in 1983, and by V. Cerny in 1985. The name and inspiration come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one.
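A minimal sketch for a one-dimensional function (the cooling schedule, step size and test function are arbitrary illustrative choices):

```python
import math
import random

def simulated_annealing(f, x0, step=1.0, t0=10.0, cooling=0.99, iters=5000):
    """Minimise f from x0: always accept improvements, accept worse moves
    with probability exp(-delta / T), and cool T geometrically."""
    x, fx, t = x0, f(x0), t0
    best, fbest = x, fx
    for _ in range(iters):
        cand = x + random.uniform(-step, step)
        fc = f(cand)
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc                 # move (possibly uphill)
            if fx < fbest:
                best, fbest = x, fx          # remember the best state seen
        t *= cooling
    return best

# A bumpy function whose global minimum is at x = 0.
bumpy = lambda x: x * x + 3 * abs(math.sin(5 * x))
print(simulated_annealing(bumpy, x0=8.0))
```

At high temperature the walker "wanders randomly through states of higher energy", exactly as in the metallurgical analogy; as T shrinks, uphill moves become rare and the search settles into a low-energy basin.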

References:

†Simulated annealing - Wikipedia, the free encyclopedia, last modified 21:53, 15 April 2006, http://en.wikipedia.org/wiki/Simulated_annealing


Greedy algorithm

Greedy algorithms are algorithms which follow the problem-solving meta-heuristic of making the locally optimal choice at each stage in the hope of finding the global optimum. For instance, applying the greedy strategy to the travelling salesman problem yields the following algorithm: "At each stage, visit the unvisited city nearest to the current city".
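That nearest-city rule can be sketched directly on 2-D coordinates (the city data below is made up for illustration; `math.dist` needs Python 3.8+):

```python
import math

def nearest_neighbour_tour(cities, start=0):
    """Greedy TSP heuristic: always visit the nearest unvisited city next."""
    unvisited = set(range(len(cities)))
    unvisited.remove(start)
    tour = [start]
    while unvisited:
        last = cities[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, cities[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

cities = [(0, 0), (5, 5), (1, 0), (2, 0)]
print(nearest_neighbour_tour(cities))  # [0, 2, 3, 1]
```

Note that the greedy tour is usually not optimal: a locally best step gives no global guarantee, which is exactly the trade-off described above.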

References:

†Greedy algorithm - Wikipedia, the free encyclopedia, last modified 01:22, 2 March 2006, http://en.wikipedia.org/wiki/Greedy_algorithm


Tabu Search

Tabu search is a mathematical optimization method belonging to the class of local search techniques. It enhances the performance of a local search method by using memory structures, and is generally attributed to Fred Glover. Tabu search uses a local or neighbourhood search procedure to iteratively move from a solution x to a solution x' in the neighbourhood of x, until some stopping criterion is satisfied. To explore regions of the search space that plain local search would leave unexplored, and thereby escape local optima, tabu search modifies the neighbourhood structure of each solution as the search progresses. The solutions admitted to N*(x), the new neighbourhood, are determined through the use of special memory structures; the search then progresses by iteratively moving from a solution x to a solution x' in N*(x).
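The sketch below keeps a short tabu list of recently visited solutions (a simple stand-in for Glover's richer memory structures) and shows the search escaping a local maximum that plain hill climbing would get stuck on; the toy landscape is made up for illustration:

```python
from collections import deque

def tabu_search(initial, neighbors, score, tenure=5, max_iters=100):
    """Best-improvement search that forbids recently visited solutions."""
    current = best = initial
    tabu = deque([initial], maxlen=tenure)    # short-term memory
    for _ in range(max_iters):
        candidates = [n for n in neighbors(current) if n not in tabu]
        if not candidates:
            break
        current = max(candidates, key=score)  # may be worse than before
        tabu.append(current)
        if score(current) > score(best):
            best = current
    return best

# Landscape with a local maximum at x = 2 and the global maximum at x = 6.
height = {0: 0, 1: 1, 2: 3, 3: 2, 4: 1, 5: 4, 6: 9}
result = tabu_search(0, lambda x: [max(x - 1, 0), min(x + 1, 6)],
                     lambda x: height[x])
print(result)  # 6
```

Because recently visited points are tabu, the search is forced downhill past x = 2 instead of cycling back to it, and eventually climbs to the global maximum.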

References:

†Tabu search - Wikipedia, the free encyclopedia, last modified 16:41, 9 March 2006, http://en.wikipedia.org/wiki/Tabu_search


Simplex algorithm

The simplex algorithm of George Dantzig is a popular technique for the numerical solution of the linear programming problem. An unrelated but similarly named method is the Nelder-Mead method, or downhill simplex method, due to Nelder & Mead (1965), a numerical method for optimising many-dimensional unconstrained problems that belongs to the more general class of search algorithms. In both cases, the method uses the concept of a simplex, which is a polytope of N + 1 vertices in N dimensions: a line segment on a line, a triangle on a plane, a tetrahedron in three-dimensional space, and so forth.

References:

†Simplex algorithm - Wikipedia, the free encyclopedia, last modified 19:54, 6 April 2006, http://en.wikipedia.org/wiki/Simplex_algorithm


Hill climbing

Hill climbing is a graph search algorithm in which the current path is extended with a successor node that is closer to the solution than the end of the current path. In simple hill climbing, the first closer node found is chosen, whereas in steepest ascent hill climbing all successors are compared and the one closest to the solution is chosen. Both forms fail if there is no closer node, which may happen if there are local maxima in the search space that are not solutions. Steepest ascent hill climbing is similar to best-first search, but the latter tries all possible extensions of the current path in order, whereas steepest ascent tries only one.
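The two variants described above can be sketched side by side (the function names and the single-peak toy objective are mine, for illustration):

```python
def simple_hill_climb(x, neighbors, score):
    """First-improvement: take the first neighbour that beats x."""
    improved = True
    while improved:
        improved = False
        for n in neighbors(x):
            if score(n) > score(x):
                x, improved = n, True
                break
    return x

def steepest_ascent(x, neighbors, score):
    """Best-improvement: compare all neighbours, move to the best one."""
    while True:
        best = max(neighbors(x), key=score)
        if score(best) <= score(x):
            return x          # no closer node: stop (possibly a local maximum)
        x = best

score = lambda x: -(x - 7) ** 2      # single peak at x = 7
neighbors = lambda x: [x - 1, x + 1]
print(simple_hill_climb(0, neighbors, score), steepest_ascent(0, neighbors, score))
```

On this single-peak objective both variants reach the optimum; on a landscape with several peaks, both would stop at whichever local maximum they climb first.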

References:

†Hill climbing - Wikipedia, the free encyclopedia, last modified 01:34, 16 March 2006, http://en.wikipedia.org/wiki/Hill_climbing


Critical Path Method

The Critical Path Method (CPM) was first developed by DuPont several decades ago for construction projects. The essential technique is to construct a model listing all activities required to complete the project, the time each will take, and the order in which interdependent activities must be performed. The method then works out which activities are critical to the completion of the project (the critical path) and which have some "float time" (are less critical). This allows management to prioritise the critical activities. The schedule is usually reviewed on a regular basis, to monitor the critical activities and to ensure that non-critical activities do not interfere with the critical ones.
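The forward/backward-pass calculation behind the method can be sketched as below. The five-task project is invented for illustration, and tasks are assumed to be listed so that every predecessor appears before its successors:

```python
def critical_path(tasks):
    """tasks: {name: (duration, [predecessors])}, predecessors listed first.
    Returns (project duration, tasks with zero float, i.e. the critical path)."""
    earliest = {}                                   # forward pass: earliest finish
    for name, (dur, preds) in tasks.items():
        earliest[name] = dur + max((earliest[p] for p in preds), default=0)
    project_end = max(earliest.values())

    latest = {name: project_end for name in tasks}  # backward pass: latest finish
    for name, (dur, preds) in reversed(list(tasks.items())):
        for p in preds:
            latest[p] = min(latest[p], latest[name] - dur)

    critical = [n for n in tasks if latest[n] == earliest[n]]  # zero float
    return project_end, critical

tasks = {
    "A": (3, []),          # e.g. site preparation
    "B": (2, []),          # order materials
    "C": (4, ["A"]),       # foundations
    "D": (1, ["B"]),       # delivery
    "E": (2, ["C", "D"]),  # build
}
print(critical_path(tasks))  # (9, ['A', 'C', 'E'])
```

Here tasks B and D each have 4 units of float, so they can slip without delaying the project, while A, C and E cannot: they form the critical path.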

References:

†Critical Path Method of Scheduling - Wikipedia, the free encyclopedia, last modified 22:30, 7 April 2006, http://en.wikipedia.org/wiki/Critical_Path_Method_of_Scheduling


Branch and bound Method

Branch and bound is a general algorithmic method for finding optimal solutions to various optimization problems, especially in discrete and combinatorial optimization. It belongs to the class of implicit enumeration methods. The method was first proposed by A. H. Land and A. G. Doig in 1960 for discrete programming.
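As a concrete sketch of the idea, the 0/1 knapsack solver below branches on taking or skipping each item, and prunes any subtree whose optimistic (fractional-relaxation) bound cannot beat the best solution found so far. The helper names and example data are mine, for illustration:

```python
def knapsack_bb(values, weights, capacity):
    """Exact 0/1 knapsack via branch and bound with a fractional upper bound."""
    order = sorted(range(len(values)),
                   key=lambda i: values[i] / weights[i], reverse=True)
    best = 0

    def bound(i, value, room):
        # Optimistic estimate: greedily fill the remaining room, fractionally
        # if needed (a relaxation, so it never underestimates the true optimum).
        for j in order[i:]:
            if weights[j] <= room:
                room -= weights[j]
                value += values[j]
            else:
                return value + values[j] * room / weights[j]
        return value

    def branch(i, value, room):
        nonlocal best
        if i == len(order):
            best = max(best, value)
            return
        if bound(i, value, room) <= best:
            return                              # prune: cannot beat the incumbent
        j = order[i]
        if weights[j] <= room:
            branch(i + 1, value + values[j], room - weights[j])  # take item j
        branch(i + 1, value, room)                               # skip item j

    branch(0, 0, capacity)
    return best

print(knapsack_bb(values=[60, 100, 120], weights=[10, 20, 30], capacity=50))  # 220
```

The pruning step is what makes the enumeration "implicit": whole subtrees are discarded without being visited, because their bound proves they cannot contain a better solution.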

References:

†Branch and bound - Wikipedia, the free encyclopedia, last modified 19:36, 14 December 2005, http://en.wikipedia.org/wiki/Branch_and_bound
