Genetic Algorithm

This document provides an overview of genetic algorithms. It begins by defining genetic algorithms as metaheuristic optimization algorithms inspired by biological evolution. It then describes the key components of genetic algorithms, including populations of individuals with chromosomes and genes, fitness functions, selection, crossover, and mutation. The document outlines the typical phases of a genetic algorithm and provides an example application to the knapsack problem. It concludes by discussing variants, applications, benefits, and comparing genetic algorithms to traditional algorithms.

Introduction

A metaheuristic is a higher-level procedure or heuristic designed to find, generate, or select a heuristic (partial search algorithm) that may provide a sufficiently good solution to an optimization problem, especially with incomplete or imperfect information or limited computation capacity.

An Evolutionary Algorithm (EA) is one that uses mechanisms inspired by biological evolution, such as reproduction, mutation, recombination, and selection. Candidate solutions to the optimization problem play the role of individuals in a population, and the fitness function determines the quality of the solutions.
Genetic Algorithm

A genetic algorithm (GA) is a metaheuristic inspired by the process of natural selection that belongs to the larger class of evolutionary algorithms (EA).

Genetic algorithms are commonly used to generate high-quality solutions to optimization and search problems by relying on biologically inspired operators such as mutation, crossover, and selection.

Some examples of GA applications include optimizing decision trees for better performance, solving Sudoku puzzles, hyperparameter optimization, etc.
Historical Background

● Developed by John Holland, University of Michigan (1970s)
● To understand the adaptive processes of natural systems
● To design artificial systems software that retains the robustness of natural systems
● Inspired by the biological evolution process
● Uses the concepts of “Natural Selection” and “Genetic Inheritance” (Darwin, 1859)
Key Terms

Individual - Any possible solution
Population - Group of all individuals
Search Space - All possible solutions to the problem
Chromosome - Blueprint for an individual
Trait - Possible aspect (feature) of an individual
Allele - Possible settings of a trait (black, blond, etc.)
Locus - The position of a gene on the chromosome
Genome - Collection of all chromosomes for an individual
Process of Natural Selection

● The process of natural selection starts with the selection of the fittest individuals from a population. They produce offspring which inherit the characteristics of the parents and are added to the next generation. If the parents have good fitness, their offspring are likely to be fitter than the parents and have a better chance of surviving. This process keeps iterating, and at the end, a generation with the fittest individuals will be found.
● This notion can be applied to a search problem: we consider a set of candidate solutions for a problem and select the best ones out of them.
Phases in a Genetic Algorithm

Five phases of a genetic algorithm:

1. Initial Population
2. Fitness Function
3. Selection
4. Crossover
5. Mutation
Initial Population

● The process begins with a set of individuals called a Population. Each individual is a candidate solution to the problem you want to solve.
● An individual is characterized by a set of parameters (variables) known as Genes. Genes are joined into a string to form a Chromosome (solution).
● In a genetic algorithm, the set of genes of an individual is represented as a string over some alphabet. Usually, binary values are used (a string of 1s and 0s). We say that we encode the genes in a chromosome.
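
As a concrete illustration, here is a minimal sketch in Python of how such a binary-encoded population could be initialized at random; the population size and chromosome length below are arbitrary assumptions, not values from the slides.

import random

def random_chromosome(length):
    # A chromosome is a list of 0/1 genes.
    return [random.randint(0, 1) for _ in range(length)]

def initial_population(pop_size, chromosome_length):
    # The population is simply a collection of randomly generated individuals.
    return [random_chromosome(chromosome_length) for _ in range(pop_size)]

population = initial_population(pop_size=10, chromosome_length=8)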
Fitness Function

The fitness function determines how fit an individual is (the ability of an individual to compete with other individuals). It gives a fitness score to each individual. The probability that an individual will be selected for reproduction is based on its fitness score.
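
As an example, a fitness function for the toy OneMax problem (maximize the number of 1s in the chromosome) could look like the sketch below; a real application would substitute its own scoring logic.

def fitness(chromosome):
    # OneMax: the score is simply the number of 1-bits in the chromosome.
    # A real problem would compute a domain-specific measure of quality here.
    return sum(chromosome)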
Selection

The idea of the selection phase is to select the fittest individuals and let them pass their genes to the next generation.

Two pairs of individuals (parents) are selected based on their fitness scores. Individuals with high fitness have a higher chance of being selected for reproduction.
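
One common way to implement this is fitness-proportionate (roulette-wheel) selection. The sketch below assumes the list-based chromosomes and fitness function from the earlier sketches.

import random

def select_parent(population, fitnesses):
    # Roulette-wheel selection: an individual is picked with probability
    # proportional to its fitness score.
    total = sum(fitnesses)
    pick = random.uniform(0, total)
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]  # fallback for floating-point edge cases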
Crossover

Crossover is the most significant phase in a genetic algorithm. For each pair of parents to be mated, a crossover point is chosen at random from within the genes.

Offspring are created by exchanging the genes of the parents among themselves until the crossover point is reached.

The new offspring are added to the population.
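
A minimal sketch of single-point crossover, assuming two list-based parent chromosomes of equal length (at least two genes):

import random

def crossover(parent1, parent2):
    # Choose a random crossover point, then swap the tails of the parents.
    point = random.randint(1, len(parent1) - 1)
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2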
Mutation
In certain new offspring formed, some of their genes can be subjected to a mutation with a
low random probability. This implies that some of the bits in the bit string can be flipped.

Mutation occurs to maintain diversity within the population and prevent premature
convergence.
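
A bit-flip mutation could be sketched as follows; the default mutation_rate here is an illustrative assumption and would normally be tuned per problem.

import random

def mutate(chromosome, mutation_rate=0.01):
    # Each gene is flipped independently with a small probability.
    return [1 - gene if random.random() < mutation_rate else gene
            for gene in chromosome]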
Termination

The algorithm terminates if the population has converged (does not produce offspring which are significantly different from the previous generation). Then it is said that the genetic algorithm has provided a set of solutions to our problem.
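
Putting the phases together, a minimal generational loop might look like the sketch below. It reuses the helper functions from the earlier sketches, and the termination test (stop after a fixed number of generations or when the best fitness has not improved for a while) is one common convention rather than the only option.

def genetic_algorithm(pop_size=20, length=16, generations=100, patience=10):
    population = initial_population(pop_size, length)
    best = max(population, key=fitness)
    stagnant = 0
    for _ in range(generations):
        fitnesses = [fitness(ind) for ind in population]
        next_generation = []
        while len(next_generation) < pop_size:
            parent1 = select_parent(population, fitnesses)
            parent2 = select_parent(population, fitnesses)
            child1, child2 = crossover(parent1, parent2)
            next_generation += [mutate(child1), mutate(child2)]
        population = next_generation[:pop_size]
        generation_best = max(population, key=fitness)
        if fitness(generation_best) > fitness(best):
            best, stagnant = generation_best, 0
        else:
            stagnant += 1
        if stagnant >= patience:  # treated as "converged"
            break
    return best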
Flowchart for a Genetic Algorithm
Example: Knapsack Problem
Now that we have our individual representation done, we pick a random set of individuals that forms our initial population. Every single individual is a potential solution to the problem. Hence, for the knapsack problem, from a search space of 2^n, we pick a few individuals randomly.

The idea here is to evolve the population and make it fitter over time. Given that the search space is exponential, we use evolution to quickly converge to a high-quality (not necessarily optimal) solution.

Randomly Selected Population

Fitness Coefficient
● We randomly pick two individuals from the initial population and run a tournament between them. The one who wins becomes the first parent. We repeat the same procedure and get the second parent for the next step. The two parents are then passed on to the next steps of evolution (a sketch of this selection follows this list).
● We keep it simple and create two children from two fit parents such that each child gets one half of its genes from one parent and the other half from the other parent. This essentially means that we combine half of the elements from both individuals to form the two children.
● We define a parameter called the Mutation Rate, which is kept very low, mirroring the low mutation rate found in nature. The rate is typically in the range of 0.01 to 0.02. A mutation changes an individual, and with this change, it can have a higher or lower fitness coefficient, just as happens in nature.
● We repeat the entire process of Selection, Reproduction, Crossover, and Mutation until we get the same number of children as the initial population, and we call this the first generation. We repeat the same process again to create subsequent generations.
● When we continue to create such generations, we will observe that the Average Fitness Coefficient of the population increases and converges to a steady range.
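
The sketch below shows one way the knapsack fitness and the tournament selection described above could be coded; the item weights, values, and capacity are made-up illustrative data, not figures from the slides.

import random

# Hypothetical knapsack instance: (weight, value) pairs and a capacity limit.
ITEMS = [(12, 4), (2, 2), (1, 2), (1, 1), (4, 10)]
CAPACITY = 15

def knapsack_fitness(chromosome):
    # Gene i is 1 if item i is packed; overweight solutions score 0.
    weight = sum(w for gene, (w, v) in zip(chromosome, ITEMS) if gene)
    value = sum(v for gene, (w, v) in zip(chromosome, ITEMS) if gene)
    return value if weight <= CAPACITY else 0

def tournament_select(population):
    # Pick two individuals at random; the fitter one becomes a parent.
    a, b = random.sample(population, 2)
    return a if knapsack_fitness(a) >= knapsack_fitness(b) else b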
Genetic vs. Traditional Algorithms
Applications
● Optimization − Genetic Algorithms are most commonly used in optimization problems wherein we have to maximize or minimize a given objective function value under a given set of constraints. The approach to solving optimization problems has been highlighted throughout this presentation.
● Economics − GAs are also used to characterize various economic models like the cobweb
model, game theory equilibrium resolution, asset pricing, etc.
● Neural Networks − GAs are also used to train neural networks, particularly recurrent neural
networks.
● Parallelization − GAs also have very good parallel capabilities, and prove to be very effective
means in solving certain problems, and also provide a good area for research.
● Vehicle routing problems − With multiple soft time windows, multiple depots and a
heterogeneous fleet.
● Scheduling applications − GAs are used to solve various scheduling problems as well,
particularly the timetable problem.
● Robot Trajectory Generation − GAs have been used to plan the path which a robot arm takes
by moving from one point to another.
● Parametric Design of Aircraft − GAs have been used to design aircraft by varying the parameters and evolving better solutions.
● DNA Analysis − GAs have been used to determine the structure of DNA using spectrometric
data about the sample.
● Multimodal Optimization − GAs are well suited to multimodal optimization, in which we have to find multiple optimum solutions.
Benefits of Genetic Algorithms

● Concept is easy to understand

● Modular, separate from application

● Supports multi-objective optimization

● Good for “noisy” environments

● Always an answer; answer gets better with time

● Inherently parallel; easily distributed


Benefits of Genetic Algorithms (Contd.)

● Many ways to speed up and improve a GA-based application as

knowledge about problem domain is gained

● Easy to exploit previous or alternate solutions

● Flexible building blocks for hybrid applications

● Substantial history and range of use


Variants
Elitism
A practical variant of the general process of constructing a new population is to allow the best
organism(s) from the current generation to carry over to the next, unaltered. This strategy is known
as elitist selection and guarantees that the solution quality obtained by the GA will not decrease
from one generation to the next.
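
A minimal sketch of elitist replacement, assuming a fitness function like the ones in the earlier sketches; the elite_count default is an arbitrary assumption.

def apply_elitism(old_population, new_population, fitness, elite_count=2):
    # Carry the best individuals of the old generation over unchanged,
    # replacing the weakest members of the new generation.
    elites = sorted(old_population, key=fitness, reverse=True)[:elite_count]
    survivors = sorted(new_population, key=fitness, reverse=True)
    return elites + survivors[:len(new_population) - elite_count]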

Adaptive GAs
Genetic algorithms with adaptive parameters (adaptive genetic algorithms, AGAs) are another significant and promising variant of genetic algorithms. The probabilities of crossover (pc) and mutation (pm) greatly determine the degree of solution accuracy and the convergence speed that genetic algorithms can obtain. Instead of using fixed values of pc and pm, AGAs utilize the population information in each generation and adaptively adjust pc and pm in order to maintain population diversity as well as to sustain the convergence capacity.
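
As an illustration only, one simple heuristic for adapting pc and pm might raise both rates as the population converges (average fitness approaching the best fitness) and keep them low while the population is still diverse; the base rates and scaling factors below are arbitrary assumptions, not a published AGA scheme.

def adapt_rates(fitnesses, pc_base=0.6, pm_base=0.01):
    # Measure convergence as how close the average fitness is to the best.
    best = max(fitnesses)
    avg = sum(fitnesses) / len(fitnesses)
    convergence = 1.0 - ((best - avg) / best if best else 0.0)
    # Raise crossover and mutation probabilities as diversity is lost.
    pc = min(1.0, pc_base * (1.0 + convergence))
    pm = min(0.5, pm_base * (1.0 + 4.0 * convergence))
    return pc, pm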
Issues
● Basic implementation choices:
○ Representation
○ Population size, mutation rate
○ Selection and deletion policies
○ Crossover and mutation operators
● Repeated fitness function evaluation for complex problems is often the most prohibitive and limiting segment of artificial evolutionary algorithms. Finding the optimal solution to complex high-dimensional, multimodal problems often requires very expensive fitness function evaluations. In real-world problems such as structural optimization, a single function evaluation may require several hours to several days of complete simulation.
● Genetic algorithms do not scale well with complexity. That is, where the
number of elements which are exposed to mutation is large there is often an
exponential increase in search space size. This makes it extremely difficult to
use the technique on problems such as designing an engine, a house or a
plane.
● In many problems, GAs have a tendency to converge towards local optima or even arbitrary points rather than the global optimum of the problem. This means that the algorithm does not "know how" to sacrifice short-term fitness to gain longer-term fitness. The likelihood of this occurring depends on the shape of the fitness landscape: certain problems may provide an easy ascent towards a global optimum, while others may make it easier for the search to get stuck in local optima.
● Operating on dynamic data sets is difficult, as genomes begin to converge
early on towards solutions which may no longer be valid for later data.
Real World Implications
Faster-Growing Trees

Demand for wood can be met by trees that grow faster than average. Genetic engineering has produced trees that
can ward off biological attacks, grow more quickly and strongly, and create better wood than trees that are not
genetically modified.

Golden Rice

Genetic modification is often used to make healthier foods, such as golden rice, which contains beta-carotene, the same pigment that makes carrots orange and that the body converts into vitamin A. The result is that people without access to many vitamins will get a healthy dose of vitamin A when the rice is consumed.

Salmon That Grow Faster

Salmon do not produce growth hormones year-round, so scientists have looked toward genetic engineering and found a solution: a modification that allows salmon to grow twice as fast as those that are not engineered.
