
Thursday, July 9, 2009

Results and Conclusions: Future Scope

Initially we tried a population size of twice the number of machines, which resulted in premature convergence; because of this, the search could get stuck at a local optimum. So we fixed the population size at 30, which gives better results than the former setting.
For further improvement, we gave attention to the selection parameter.

How our team approached Job Shop Scheduling using a Genetic Algorithm

As said in the previous post, below is our approach to Job Shop Scheduling using a Genetic Algorithm.
Project Team: Kumaravel.S, Mohammed Ishaq.I, Sankaralingam.B, Venkatesh.G
First, to generate the chromosomes, we generate random numbers from

Wednesday, July 8, 2009

IMPLEMENTATION OF GA TO JSSP

From the operational parameters discussed earlier, we took the real-value encoding, in which the job numbers are encoded to form a string that gives the sequence of the machining order. That is, each chromosome represents a legal solution to the problem and is composed of a string of genes. In our application, each solution is encoded as a string giving a job sequence. For example, the string “235641” represents a job sequence in which job 2 is processed before job 3, job 3 before job 5, and so on. The requirement is that no job may be repeated in a sequence and every job must appear exactly once.
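As a minimal sketch of this encoding (the helper names and the six-job instance are illustrative, not taken from the project code), a chromosome can be built and validated like this:

```python
import random

JOBS = [1, 2, 3, 4, 5, 6]  # job numbers to be sequenced

def random_chromosome():
    """A chromosome is a permutation of the job numbers, e.g. "235641"."""
    seq = JOBS[:]
    random.shuffle(seq)
    return "".join(str(j) for j in seq)

def is_legal(chromosome):
    """Legal means every job appears exactly once: no repeats, no omissions."""
    return sorted(chromosome) == sorted(str(j) for j in JOBS)
```

The legality check mirrors the requirement stated above: a string is a valid solution only if it is a permutation of all the job numbers.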

For crossover we preferred the one-point method, since it is simple, easy to implement, and robust. For mutation, swap mutation is used, which plays the major role in avoiding premature convergence. The termination condition used is fixed-generation termination. Roulette-wheel selection is generally used for maximisation problems, so here we used rank selection with the fitness function described above.
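A sketch of the swap mutation and linear rank selection described above (the function names and defaults are ours, not from the project code; note that swap mutation conveniently preserves the no-repeats property of the encoding):

```python
import random

def swap_mutation(chromosome, p_mutation=0.2):
    """With probability p_mutation, swap two randomly chosen genes.
    Swapping keeps the string a permutation, so the child stays legal."""
    genes = list(chromosome)
    if random.random() < p_mutation:
        i, j = random.sample(range(len(genes)), 2)
        genes[i], genes[j] = genes[j], genes[i]
    return "".join(genes)

def rank_select(population, fitness, k=2):
    """Linear ranking: sort by fitness so the worst individual gets rank 1
    and the best rank N; selection probability is proportional to rank."""
    ranked = sorted(population, key=fitness)        # ascending fitness
    weights = range(1, len(ranked) + 1)             # rank 1 .. N
    return random.choices(ranked, weights=weights, k=k)
```

Ranking decouples selection pressure from the raw fitness values, which is why it is often preferred over roulette-wheel selection for minimisation objectives such as makespan.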

Configuration of operational parameters with which the application runs

Population Size

Number of chromosomes in the population is 30

Crossover

Probability of applying crossover (a biased coin toss coming up heads) is 0.6

Mutation

Probability of applying mutation (a biased coin toss coming up heads) is 0.2

Fitness

Subjective fitness based on the makespan

Selection

Linear Ranking selection

Termination Condition

After 100 Generations
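Collected in one place, the configuration above might look like this (the key names are ours; the values are those stated in this post):

```python
# Operational parameters of the GA, as a plain dict.
GA_CONFIG = {
    "population_size": 30,            # number of chromosomes
    "crossover_probability": 0.6,     # chance of applying one-point crossover
    "mutation_probability": 0.2,      # chance of applying swap mutation
    "fitness": "makespan",            # schedule quality criterion
    "selection": "linear_ranking",
    "max_generations": 100,           # fixed-generation termination
}
```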

References:

Book Reference: David Edward Goldberg (1989), “Genetic Algorithms in Search, Optimization and Machine Learning” Addison-Wesley.

Citation Needed: Scheduling Problem for a Single Flexible Manufacturing Machine, Nguyen Van Hop, Industrial Engineering Program, Sirindhorn International Institute of Technology, Thammasat University

Related Posts:

Table of Contents

© 2006 Kumaravel & Project Team


If a reference link is found broken, see the e-printed version of the webpage below.

Note: the e-printed versions of the webpages are for reference only and are not owned by the blog author. They were created using the open-source PDFCreator, which helps save paper and is environmentally friendly.


Genetic Operators

Genetic variation is a necessity for the process of evolution. Genetic operators used in genetic algorithms are analogous to those which occur in the natural world: survival of the fittest, or selection; asexual or sexual reproduction (crossover, or recombination); and mutation.
Crossover
In genetic algorithms, crossover is a genetic operator used to combine the genetic information of two parents to produce new offspring.

Selection

During each successive epoch, a proportion of the existing population is selected to breed a new generation. Individual solutions are selected through a fitness-based process, where fitter solutions (as measured by a fitness function) are typically more likely to be selected.
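A minimal sketch of such fitness-based selection, using the classic roulette-wheel scheme (the names are illustrative; fitness values are assumed non-negative):

```python
import random

def roulette_select(population, fitness_of, k=2):
    """Fitness-proportionate (roulette-wheel) selection: each individual
    is chosen with probability proportional to its fitness."""
    weights = [fitness_of(ind) for ind in population]
    return random.choices(population, weights=weights, k=k)
```

Fitter individuals occupy a larger slice of the "wheel" and are therefore typically selected more often, but less fit individuals still have a chance, which preserves diversity.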

Monday, July 6, 2009

Chromosomes

In genetic algorithms, a chromosome is a set of parameters which define a proposed solution to the problem that the genetic algorithm is trying to solve. The chromosome is often represented as a simple string; although a wide variety of other data structures are also in use as chromosomes.
A genetic algorithm creates many chromosomes, either randomly or heuristically, and evaluates each one against the fitness function.

Sunday, June 21, 2009

How GA differs from other optimization techniques?

  1. They work with a coding of the parameter set, not the parameters themselves.
  2. They search from a population of points in the problem domain, not a single point.
  3. They use payoff information as the objective function rather than derivatives of the problem or auxiliary knowledge.

Applications of Genetic Algorithms

  • Automated design, including research on composite material design and multi-objective design of automotive components for crashworthiness, weight savings, and other characteristics.
  • Automated design of mechatronic systems using bond graphs and genetic programming (NSF).
  • Calculation of Bound States and Local Density Approximations.

Wednesday, June 10, 2009

Weakness of GA

Although genetic algorithms have proven to be an efficient and powerful problem-solving strategy, they are not a universal remedy. The following are some limitations of GA.
  • The language used to specify candidate solutions must be robust; i.e., it must be able to tolerate random changes such that fatal errors or nonsense do not consistently result.

Strengths of GA

  • The genetic algorithm is robust and intrinsically parallel.
  • They can explore the solution space in multiple directions at once. If one path turns out to be a dead end, they can easily eliminate it and continue work on more promising avenues, giving them a greater chance each run of finding the optimal solution.

Sunday, April 12, 2009

Mechanism of GA

A GA is an algorithm that operates on the principle that a combination of natural selection and procreation permits the development of living species that are highly adapted to their environments.

When applied to a problem the standard genetic algorithm proceeds as follows: an initial population of individuals i.e., a set of solutions for the given problem represented by chromosomes, is generated at random or heuristically.

At every evolutionary step, known as a generation, the individuals in the current population are decoded and evaluated according to some predefined quality criterion, referred to as the fitness function.

To form the next generation, individuals are selected according to their fitness. Then some or all of the existing individuals of the current population are replaced with the newly created individuals.

Creation of new individuals is done by crossover and mutation operations, motivated by the hope that the new population will be better than the old one. Many selection procedures are currently in use, one of the simplest being Holland's original fitness-proportionate selection, where individuals are selected with a probability proportional to their relative fitness.

This ensures that the expected number of times an individual is chosen is approximately proportional to its relative performance in the population. Thus, high-fitness individuals stand a better chance of “reproducing”, while low-fitness ones are more likely to disappear.

This is repeated until some condition, such as a fixed number of generations or stagnation of the best solution, is satisfied. If the GA has been designed well, the population will converge to an optimal solution to the problem.

Convergence is the progression towards increasing uniformity. However, it is important not to forget that GAs are stochastic iterative processes and are not guaranteed to converge; hence, the termination condition may be specified as some fixed, maximal number of generations or as the attainment of an acceptable fitness level.
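The generational loop described above can be sketched on a toy maximisation problem (one-max: maximise the number of 1-bits in a binary string; all names and parameter values here are illustrative, not from the project):

```python
import random

def evolve(length=20, pop_size=30, p_cross=0.6, p_mut=0.2, generations=100):
    """Standard generational GA: evaluate, select, recombine, mutate, replace."""
    fitness = lambda ind: sum(ind)                  # predefined quality criterion
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]                # random initial population
    for _ in range(generations):                    # one step = one generation
        new_pop = []
        while len(new_pop) < pop_size:
            # fitness-proportionate selection of two parents (+1 keeps weights > 0)
            a, b = random.choices(pop, weights=[fitness(i) + 1 for i in pop], k=2)
            child = a[:]
            if random.random() < p_cross:           # one-point crossover
                cut = random.randrange(1, length)
                child = a[:cut] + b[cut:]
            if random.random() < p_mut:             # bit-flip mutation
                i = random.randrange(length)
                child[i] = 1 - child[i]
            new_pop.append(child)
        pop = new_pop                               # replace the old generation
    return max(pop, key=fitness)
```

Being stochastic, separate runs return different strings; termination here is a fixed generation count, as discussed above.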

References

†“Genetic Algorithms”, Burhaneddin SANDIKCI, http://www.ie.bilkent.edu.tr/~lors/ie572/burhaneddin_html/IE572_GA.html

http://www.ie.bilkent.edu.tr/~lors/ie572/burhaneddin.pdf



an example - Darwin's Theory

As an example Darwin noted that the ptarmigan turns white in winter. This colour change, he inferred, helped protect it from predators, which would have a hard time spotting the bird in snow. Ptarmigans that didn't change colour in winter would be spotted easily and eaten. In this way, Darwin implied, ptarmigans that turned white in winter would be more likely to survive, reproduce, and pass this adaptation to future generations.

References

†“Natural Selection”, http://bioweb.cs.earlham.edu/9-12/evolution/HTML/natural.html



Darwin’s Theory of Natural Selection

  • Individuals within a species vary and these variations are inherited (at least in part) by their offspring.
  • Organisms produce more offspring than can possibly survive.
  • On average, offspring that vary most strongly in directions favoured by the environment will survive and reproduce. Favourable variation will therefore accumulate in populations by natural selection. Unfavourable variations will be “weeded out” by natural selection.

Darwin's theory of natural selection built on the work of many scientists before him, but it was revolutionary because he was the first to assemble a coherent theory that included a mechanism for how evolution occurred (natural selection), and because his conclusions directly challenged the orthodox religious thinking of the time. This was because:

  • Darwin argued that evolution has no purpose. Individuals "struggle" to increase the representation of their genes in future generations, and that is all.
  • Darwin maintained that evolution has no direction. It does not lead inevitably to higher things or to man. Organisms become better adapted to their environments, and that is all.

References

†“Darwin's Theory of Natural Selection”, Gould, http://www.chss.iup.edu/anthropology/courses/TC110/Darwin.htm.



Saturday, April 11, 2009

Genetic Algorithm – Short note on Implementation

Genetic algorithms are typically implemented as a computer simulation in which a population of abstract representations (called chromosomes) of candidate solutions (called individuals) to an optimization problem evolves toward better solutions. Traditionally, solutions are represented in binary as strings of 0s and 1s, but other encodings are also possible. The evolution starts from a population of completely random individuals and proceeds in generations. In each generation, the fitness of the whole population is evaluated; multiple individuals are stochastically selected from the current population (based on their fitness) and modified (mutated or recombined) to form a new population, which becomes the current population in the next iteration of the algorithm.

References

†Genetic algorithm - Wikipedia, the free encyclopedia, last modified 00:36, 15 February 2006, http://en.wikipedia.org/wiki/Genetic_algorithm


Genetic Algorithm

A Genetic Algorithm (GA) is a programming technique that mimics Darwin's theory of evolution, or natural selection, as a problem-solving approach. It is part of evolutionary computing, a rapidly growing area of artificial intelligence. Since it is a heuristic, it does not guarantee an exact solution; for most real-world problems, an exact solution cannot be computed in reasonable time anyway.


Local Search

Local search is a meta-heuristic for solving computationally hard optimization problems. Local search can be used on problems that can be formulated as finding a solution maximizing a criterion among a number of candidate solutions. Local search algorithms move from solution to solution in the space of candidate solutions (the search space) until a solution deemed optimal is found or a time bound is elapsed.
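As an illustration, the simplest form of local search is hill climbing: move to a better neighbouring solution until no neighbour improves (a local optimum) or the step budget runs out. The toy problem and names below are ours: maximise f(x) = -(x - 3)^2 over the integers.

```python
def hill_climb(start=0, steps=100):
    """Move from solution to solution in the search space, always taking
    the best neighbour, and stop at a local optimum or on the step bound."""
    f = lambda x: -(x - 3) ** 2
    x = start
    for _ in range(steps):
        neighbours = [x - 1, x + 1]          # candidate solutions near x
        best = max(neighbours, key=f)
        if f(best) <= f(x):                  # no improving neighbour:
            break                            # a local optimum is reached
        x = best
    return x
```

On this single-peaked toy function the local optimum is also the global one; on rugged landscapes, hill climbing can stop far from the global optimum, which motivates the meta-heuristics discussed in the following posts.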

References:

†Local search (optimization) - Wikipedia, the free encyclopedia, last modified 01:12, 1 March 2006, http://en.wikipedia.org/wiki/Local_search_%28optimization%29


Simulated annealing

Simulated annealing (SA) is a generic probabilistic meta-algorithm for the global optimization problem, namely locating a good approximation to the global optimum of a given function in a large search space. It was independently invented by S. Kirkpatrick, C. D. Gelatt and M. P. Vecchi in 1983, and by V. Cerny in 1985. The name and inspiration come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one.
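An illustrative sketch of the accept-and-cool loop (the objective, step size, and geometric cooling schedule are our choices, not from any particular SA reference): minimise f(x) = (x - 3)^2 by random perturbation, accepting worse moves with probability exp(-delta / T) while the temperature T slowly decreases.

```python
import math
import random

def anneal(start=0.0, t0=10.0, cooling=0.95, steps=500):
    f = lambda x: (x - 3.0) ** 2
    x, t = start, t0
    for _ in range(steps):
        candidate = x + random.uniform(-1, 1)    # wander to a nearby state
        delta = f(candidate) - f(x)
        # Always accept improvements; accept worse moves with a
        # temperature-dependent probability (high when "hot").
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = candidate
        t *= cooling                             # slow cooling
    return x
```

Early on, the high temperature lets the state escape local minima (the atoms "becoming unstuck"); as T falls, the search settles into a low-energy configuration.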

References:

†Simulated annealing - Wikipedia, the free encyclopedia, last modified 21:53, 15 April 2006, http://en.wikipedia.org/wiki/Simulated_annealing


Greedy algorithm

Greedy algorithms are algorithms that follow the problem-solving meta-heuristic of making the locally optimal choice at each stage in the hope of finding the global optimum. For instance, applying the greedy strategy to the travelling salesman problem yields the following algorithm: "At each stage, visit the unvisited city nearest to the current city".
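The nearest-city rule can be sketched as follows (the coordinates and city names are made up for illustration):

```python
def nearest_neighbour_tour(cities, start):
    """Greedy TSP heuristic: from the current city, always visit the
    nearest unvisited city until every city has been visited."""
    dist = lambda a, b: ((cities[a][0] - cities[b][0]) ** 2 +
                         (cities[a][1] - cities[b][1]) ** 2) ** 0.5
    tour, unvisited = [start], set(cities) - {start}
    while unvisited:
        nxt = min(unvisited, key=lambda c: dist(tour[-1], c))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

cities = {"A": (0, 0), "B": (1, 0), "C": (5, 0), "D": (2, 0)}
```

Starting from A, the greedy choice is B (distance 1), then D, then C; the locally best moves do not necessarily add up to the globally shortest tour.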

References:

†Greedy algorithm - Wikipedia, the free encyclopedia, last modified 01:22, 2 March 2006, http://en.wikipedia.org/wiki/Greedy_algorithm


Tabu Search

Tabu search is a mathematical optimization method, belonging to the class of local search techniques. Tabu search enhances the performance of a local search method by using memory structures. Tabu search is generally attributed to Fred Glover. Tabu search uses a local or neighbourhood search procedure to iteratively move from a solution x to a solution x' in the neighbourhood of x, until some stopping criterion has been satisfied. To explore regions of the search space that would be left unexplored by the local search procedure and by doing this escape local optimality, Tabu search modifies the neighbourhood structure of each solution as the search progresses. The solutions admitted to N * (x), the new neighbourhood, are determined through the use of special memory structures. The search now progresses by iteratively moving from a solution x to a solution x' in N * (x).
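A skeletal sketch of the idea (the tabu tenure, objective, and names are ours): a hill climber whose recently visited solutions are held in a short-term memory structure, so the search must keep moving, even downhill past a local optimum, instead of cycling back.

```python
from collections import deque

def tabu_search(start=0, tenure=5, iterations=50):
    """Maximise f over the integers; neighbours on the tabu list are
    excluded from the admissible neighbourhood N*(x)."""
    f = lambda x: -(x - 7) ** 2
    x, best = start, start
    tabu = deque(maxlen=tenure)                  # short-term memory structure
    for _ in range(iterations):
        neighbours = [n for n in (x - 1, x + 1) if n not in tabu]
        if not neighbours:
            break
        x = max(neighbours, key=f)               # best admissible neighbour,
        tabu.append(x)                           # even if it is worse than x
        if f(x) > f(best):
            best = x
    return best
```

Note that the search keeps moving after reaching the optimum at 7 (the tabu list forbids an immediate return), but the best solution found so far is remembered and returned.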

References:

†Tabu search - Wikipedia, the free encyclopedia, last modified 16:41, 9 March 2006, http://en.wikipedia.org/wiki/Tabu_search


Simplex algorithm

The simplex algorithm of George Dantzig is a popular technique for the numerical solution of the linear programming problem. An unrelated, but similarly named, method is the Nelder-Mead method or downhill simplex method, due to Nelder & Mead (1965), a numerical method for optimising many-dimensional unconstrained problems, belonging to the more general class of search algorithms. In both cases, the method uses the concept of a simplex, which is a polytope of N + 1 vertices in N dimensions: a line segment on a line, a triangle on a plane, a tetrahedron in three-dimensional space, and so forth.

References:

†Simplex algorithm - Wikipedia, the free encyclopedia, last modified 19:54, 6 April 2006, http://en.wikipedia.org/wiki/Simplex_algorithm
