Evolutionary algorithm

From Wikipedia, the free encyclopedia


In artificial intelligence, an evolutionary algorithm (EA) is a subset of evolutionary computation, a generic population-based metaheuristic optimization algorithm. An EA uses mechanisms inspired by biological evolution: reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the environment within which the solutions "live" (see also cost function). Evolution of the population then takes place through the repeated application of the above operators. Artificial evolution (AE) describes a process involving individual evolutionary algorithms; EAs are individual components that participate in an AE.

Evolutionary algorithms often perform well at approximating solutions to many types of problems because, ideally, they make no assumptions about the underlying fitness landscape; this generality is shown by successes in fields as diverse as engineering, art, biology, economics, marketing, genetics, operations research, robotics, social sciences, physics, politics and chemistry[citation needed].

Apart from their use as mathematical optimizers, evolutionary computation and algorithms have also been used as an experimental framework within which to validate theories about biological evolution and natural selection, particularly through work in the field of artificial life. Techniques from evolutionary algorithms applied to the modelling of biological evolution are generally limited to explorations of microevolutionary processes; however, some computer simulations, such as Tierra and Avida, attempt to model macroevolutionary dynamics.

A possible limitation of many evolutionary algorithms is their lack of a clear genotype-phenotype distinction. In nature, the fertilized egg cell undergoes a complex process known as embryogenesis to become a mature phenotype. This indirect encoding is believed[citation needed] to make the genetic search more robust (i.e. reduce the probability of fatal mutations), and also may improve[citation needed] the evolvability of the organism. Recent work in the field of artificial embryogeny, or artificial developmental systems, seeks to address these concerns.


Implementation of biological processes

Usually, an initial population of randomly generated candidate solutions comprises the first generation. The fitness function is applied to the candidate solutions and to any subsequent offspring. Two main classes of fitness functions exist: one where the fitness function does not change, as in optimizing a fixed function or testing with a fixed set of test cases; and one where the fitness function is mutable, as in niche differentiation or co-evolving the set of test cases.
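For instance, this first step might be sketched in Python as follows, assuming a simple bit-string representation and a fixed fitness function that merely counts 1-bits; the names, the population size and the toy objective are illustrative choices, not part of any particular EA:

    import random

    GENOME_LENGTH = 20    # length of each bit-string candidate (illustrative)
    POPULATION_SIZE = 30  # number of candidates per generation (illustrative)

    def random_candidate():
        """A randomly generated candidate solution, encoded as a bit string."""
        return [random.randint(0, 1) for _ in range(GENOME_LENGTH)]

    def fitness(candidate):
        """A fixed (non-changing) fitness function: here, simply the number of 1-bits."""
        return sum(candidate)

    # The first generation: an initial population of random candidates, each evaluated.
    population = [random_candidate() for _ in range(POPULATION_SIZE)]
    scores = [fitness(c) for c in population]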

In selection, parents for the next generation are chosen with a bias towards higher fitness. The parents reproduce by copying with recombination and/or mutation. Recombination acts on the two selected parents (candidates) and results in one or two children (new candidates). Mutation acts on one candidate and results in a new candidate. These operators create the offspring (a set of new candidates). These new candidates compete with old candidates for their place in the next generation (survival of the fittest).
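Continuing the bit-string sketch above, the operators could look as follows; tournament selection, one-point crossover and bit-flip mutation are only one possible choice among many:

    def select_parent(population, scores, tournament_size=3):
        """Selection biased towards higher fitness: return the best of a random tournament."""
        contenders = random.sample(range(len(population)), tournament_size)
        best = max(contenders, key=lambda i: scores[i])
        return population[best]

    def recombine(parent_a, parent_b):
        """Recombination (one-point crossover): two selected parents yield two children."""
        point = random.randrange(1, len(parent_a))
        return (parent_a[:point] + parent_b[point:],
                parent_b[:point] + parent_a[point:])

    def mutate(candidate, rate=0.05):
        """Mutation: one candidate yields one new candidate with a few bits flipped."""
        return [1 - gene if random.random() < rate else gene for gene in candidate]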

This process can be repeated until a candidate with sufficient quality (a solution) is found or a previously defined computational limit is reached.
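Putting the pieces of the sketch together, a complete generational loop might then read as follows (reusing random_candidate, fitness, select_parent, recombine and mutate from above); the termination criteria here are a target fitness and a generation limit:

    def evolve(max_generations=100, target=GENOME_LENGTH):
        population = [random_candidate() for _ in range(POPULATION_SIZE)]
        for _ in range(max_generations):
            scores = [fitness(c) for c in population]
            # Stop when a candidate of sufficient quality (a solution) is found.
            if max(scores) >= target:
                break
            # Otherwise create offspring that replace the old generation
            # (a simple generational replacement scheme; many alternatives exist).
            offspring = []
            while len(offspring) < POPULATION_SIZE:
                a = select_parent(population, scores)
                b = select_parent(population, scores)
                for child in recombine(a, b):
                    offspring.append(mutate(child))
            population = offspring[:POPULATION_SIZE]
        return max(population, key=fitness)

On this toy objective, evolve() will usually reach the all-ones string well within the generation limit; real applications differ mainly in the representation, the operators and the fitness function, not in the shape of this loop.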

Evolutionary algorithm techniques

Similar techniques differ in implementation details and in the nature of the particular problem to which they are applied.

  • Genetic algorithm - This is the most popular type of EA. One seeks the solution of a problem in the form of strings of numbers (traditionally binary, although the best representations are usually those that reflect something about the problem being solved), by applying operators such as recombination and mutation (sometimes one, sometimes both). This type of EA is often used in optimization problems.
  • Genetic programming - Here the solutions are in the form of computer programs, and their fitness is determined by their ability to solve a computational problem.
  • Evolutionary programming - Like genetic programming, except that the structure of the program is fixed and only its numerical parameters are allowed to evolve.
  • Evolution strategy - Works with vectors of real numbers as representations of solutions, and typically uses self-adaptive mutation rates (a minimal sketch of this idea follows the list).
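As an illustration of the last point, here is a minimal (μ, λ) evolution strategy sketch in Python with self-adaptive mutation step sizes, minimizing the sphere function f(x) = Σ xᵢ²; all parameter choices are illustrative assumptions rather than standard settings:

    import math
    import random

    def evolution_strategy(dim=5, mu=5, lam=20, generations=200):
        """A (mu, lambda) evolution strategy with self-adaptive step sizes (a sketch)."""
        tau = 1.0 / math.sqrt(dim)  # learning rate for the step-size self-adaptation
        # Each individual is a (real-valued vector, step size) pair.
        parents = [([random.uniform(-5.0, 5.0) for _ in range(dim)], 1.0)
                   for _ in range(mu)]
        for _ in range(generations):
            offspring = []
            for _ in range(lam):
                x, sigma = random.choice(parents)
                # Self-adaptation: the mutation strength itself is mutated ...
                new_sigma = sigma * math.exp(tau * random.gauss(0.0, 1.0))
                # ... and then used to perturb the real-valued solution vector.
                child = [xi + new_sigma * random.gauss(0.0, 1.0) for xi in x]
                offspring.append((child, new_sigma))
            # (mu, lambda) survivor selection: only the best offspring survive.
            offspring.sort(key=lambda ind: sum(v * v for v in ind[0]))
            parents = offspring[:mu]
        return parents[0]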

Related techniques


See also

External links

Implementations:

  • ECJ (Java), a popular evolutionary computation toolkit.
  • EO (C++), another popular EC toolkit.
  • EvA2 (Java), a rich open-source EA and heuristic optimization framework with a GUI.
  • JAGA (Java), an open-source API for implementing genetic algorithm and genetic programming applications.
  • OpenBeagle (C++), yet another popular EC toolkit.
  • An implementation of particle swarm optimization (PSO) in MATLAB.