Memetic Algorithms for the Traveling Salesman Problem
Peter Merz
Department of Computer Science,
University of Tübingen,
Sand 1, D-72076 Tübingen, Germany

Bernd Freisleben
Department of Mathematics and Computer Science,
University of Marburg,
Hans-Meerwein-Straße,
D-35032 Marburg, Germany
1. Introduction
Complex Systems, 13 (2001) 297–345; © 2001 Complex Systems Publications, Inc.
For decades, the TSP has served as an initial proving ground for new
ideas to solve combinatorial optimization problems. Besides the fast
development in solving TSP instances to optimality, enormous progress
has been made in the field of heuristics.
Most of the earliest algorithms belong to the class of construction
heuristics. Examples of this class are nearest neighbor heuristics and in-
sertion heuristics, for which a detailed description and comparison can
be found in [53, 97]. Another intuitive approach is the greedy heuristic,
also known as the multi-fragment heuristic [5, 53]. Furthermore, there
are more elaborate tour construction heuristics, such as the Christofides
algorithm [16] which is based on spanning trees, or the savings heuristic
(also known as the Clarke and Wright algorithm) originally developed
for the vehicle routing problem [17]. However, these heuristics perform
poorly in comparison to local search heuristics which belong to the class
of improvement heuristics. But, instead of applying a local search to
randomly generated solutions, a local search can be applied to solutions
generated by a (randomized) tour construction heuristic. Surprisingly,
the best performing construction heuristic is not the best suited for com-
bining with local search, as shown by several researchers independently
[7, 53, 97]. For example, in [53] it is shown that although the savings
heuristics performs better than the greedy and nearest neighbor heuris-
tics, in combination with 2-opt or 3-opt local search it performs the
worst (even worse than the local search applied to random solutions).
In fact, the best suited construction heuristic is the greedy heuristic, as
shown in [7, 53]. It appears that the starting tour for a local optimiza-
tion must contain a number of exploitable defects, that is, some rather
long edges, and if a tour is too good it may not have these.
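Since the greedy (multi-fragment) heuristic plays this special role as a starting-tour generator, a compact sketch may be helpful. The following Python code is our illustration, not one of the tuned implementations benchmarked in [7, 53]: it repeatedly adds the globally shortest edge that keeps every node degree at most two and closes no premature subtour, then joins the two remaining path ends into a cycle.

```python
import math, random

def greedy_tour(pts):
    """Greedy (multi-fragment) heuristic: repeatedly add the globally
    shortest edge that keeps every node degree <= 2 and closes no
    premature subtour, then join the two remaining path ends."""
    n = len(pts)
    dist = lambda i, j: math.dist(pts[i], pts[j])
    edges = sorted((dist(i, j), i, j) for i in range(n) for j in range(i + 1, n))
    parent = list(range(n))                     # union-find for subtour test
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    deg = [0] * n
    adj = [[] for _ in range(n)]
    added = 0
    for d, i, j in edges:
        if added == n - 1:
            break
        if deg[i] < 2 and deg[j] < 2 and find(i) != find(j):
            parent[find(i)] = find(j)
            deg[i] += 1; deg[j] += 1
            adj[i].append(j); adj[j].append(i)
            added += 1
    ends = [v for v in range(n) if deg[v] < 2]  # the single path's endpoints
    adj[ends[0]].append(ends[1]); adj[ends[1]].append(ends[0])
    tour, prev, cur = [0], None, 0              # walk the closed cycle
    while len(tour) < n:
        nxt = adj[cur][0] if adj[cur][0] != prev else adj[cur][1]
        tour.append(nxt)
        prev, cur = cur, nxt
    return tour

# usage: a 30-city random Euclidean instance
random.seed(1)
pts = [(random.random(), random.random()) for _ in range(30)]
tour = greedy_tour(pts)
```

Sorting all O(n²) edges keeps the sketch short; production implementations restrict the candidate edges to nearest neighbor lists.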
Since greedy and local search heuristics are among the most efficient
algorithms for the TSP with short running times and thus the most
interesting for incorporation into MAs, these two types of algorithms
are described in the following paragraphs in more detail. Many other
heuristics have been proposed for the TSP, including simulated anneal-
ing [50, 111], tabu search [28], ant colonies [22, 36, 106], artificial
neural networks [23, 76, 92], search space smoothing [43], perturba-
tion [18], and evolutionary algorithms [30, 32, 49, 85, 108, 110, 112,
114].
Local search algorithms for the TSP are based on simple tour modi-
fications. A local search algorithm is specified in terms of a class of
operations called moves that can be used to transform one tour to an-
other. We can view local search as a neighborhood search process where
each tour has an associated neighborhood of tours, that is, those that
can be reached by a single move. The search algorithm repeatedly moves
to a better neighbor until no better neighbors exist. Moves proposed
for the TSP can be divided into node exchange operators, node insertion
operators, and edge exchange operators.
Viewing a TSP tour as a sequence of cities which defines the order in
which to visit the cities, the node exchange operator simply exchanges
two nodes in the sequence.
Node re-insertion operators work by deleting a node from a tour and
inserting it at another position in the tour. Variations of this scheme
exist in which two nodes are re-inserted (edge insertion) [97] or up to
three nodes are re-inserted (Or-opt) [90].
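Using the common list-of-cities representation, the three move classes can be stated in a few lines of Python (an illustrative sketch; the function names are ours):

```python
def node_exchange(tour, i, j):
    """Swap the cities at positions i and j of the sequence."""
    t = tour[:]
    t[i], t[j] = t[j], t[i]
    return t

def node_insertion(tour, i, j):
    """Remove the city at position i and re-insert it at position j."""
    t = tour[:]
    t.insert(j, t.pop(i))
    return t

def edge_exchange(tour, i, j):
    """2-opt move for i < j: replace edges (t[i], t[i+1]) and
    (t[j], t[j+1]) by reversing the segment t[i+1 .. j]."""
    return tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]
```

For example, `edge_exchange([0, 1, 2, 3, 4], 1, 3)` yields `[0, 1, 3, 2, 4]`, removing edges (1, 2) and (3, 4) and introducing (1, 3) and (2, 4).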
[Figure: tours on nodes 0-9 before and after a move, and the submove
sequence u1, ..., u7 of the Lin-Kernighan edge exchange.]
A detailed description of the algorithm would go beyond the scope of
this paper and can be found in the original paper by Lin and Kernighan
[66].
A major drawback of the LK heuristic besides the high effort needed
for its implementation is its rather long running time. Therefore, sev-
eral improvements to the original algorithm have been made, such as
candidate lists based on nearest neighbors and don't look bits [53].
Furthermore, efficient data structures have been proposed to perform the
submoves since they consume most of the running time of the algorithm,
especially for large TSP instances (N > 1000) [33, 68].
procedure EA;
begin
  t := 0;
  initializePopulation(P(0));
  evaluate(P(0));
  repeat
    P' := selectForVariation(P(t));
    recombine(P');
    mutate(P');
    evaluate(P');
    P(t + 1) := selectForSurvival(P(t), P');
    t := t + 1;
  until terminate = true;
end;
Partially-mapped crossover

Parent A:     0 9 6 | 5 3 7 8 | 1 4 2
Parent B:     0 5 3 | 7 4 1 8 | 2 6 9
Offspring A': 0 9 6 | 7 4 1 8 | 5 3 2
Offspring B': 0 1 4 | 5 3 7 8 | 2 6 9
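The example above can be reproduced with a direct PMX implementation; in this instance the cut points lie after the third and the seventh city. A sketch (our code, following the standard PMX mapping rule):

```python
def pmx(parent_a, parent_b, cut1, cut2):
    """Partially-mapped crossover: the offspring takes the segment
    [cut1:cut2] from the second parent; all other positions are filled
    from the first parent, resolving conflicts through the mapping
    induced by the two segments."""
    n = len(parent_a)
    child = [None] * n
    child[cut1:cut2] = parent_b[cut1:cut2]
    # value-to-value mapping between the two copied segments
    mapping = dict(zip(parent_b[cut1:cut2], parent_a[cut1:cut2]))
    for k in list(range(cut1)) + list(range(cut2, n)):
        v = parent_a[k]
        while v in child[cut1:cut2]:   # conflict: follow the mapping chain
            v = mapping[v]
        child[k] = v
    return child

# the parents from the table above
A = [0, 9, 6, 5, 3, 7, 8, 1, 4, 2]
B = [0, 5, 3, 7, 4, 1, 8, 2, 6, 9]
```

With cut points 3 and 7, `pmx(A, B, 3, 7)` and `pmx(B, A, 3, 7)` reproduce offspring A' and B' of the table.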
the objective value. In other words, the objective values of parents and
offspring may not be highly correlated.
The phenomenon of implicit mutation during recombination can be
observed for almost all recombination operators for the TSP. In [114],
Whitley et al. argue that it is essential to focus attention on edges rather
than preserving the positions of nodes. They developed the edge re-
combination operator which is aimed at preserving as many edges from
the parents as possible while keeping the recombination process simple.
Several variants of the edge recombination operator have been proposed
[24, 69, 105], none of which guarantees that no implicit mutation oc-
curs.
Grefenstette concludes from his studies [42]:
Finally, it is widely recognized that GAs are not well suited to
performing finely tuned local search... Once the high performance
regions of the search space are identified by the GA, it may be
useful to invoke a local search routine to optimize the members of
the final population.
Assorting
The offspring contain only alleles from either one of the parents, that is,
all edges in the child tour are found in at least one of the parent tours,
thus no implicit mutation occurs.
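Whether an operator is assorting in this sense can be checked mechanically by comparing undirected edge sets; the following helper (our illustration, not from the paper) returns the child's foreign edges, which must be empty for an assorting operator:

```python
def edge_set(tour):
    """Undirected edge set of a cyclic tour given as a city sequence."""
    n = len(tour)
    return {frozenset((tour[k], tour[(k + 1) % n])) for k in range(n)}

def implicit_mutations(child, parent_a, parent_b):
    """Edges of the child contained in neither parent (foreign edges);
    the result is empty for an assorting recombination operator."""
    return edge_set(child) - (edge_set(parent_a) | edge_set(parent_b))
```

For instance, with parents [0, 1, 2, 3, 4] and [0, 1, 3, 2, 4], the child [0, 2, 1, 3, 4] contains the foreign edge (0, 2), so it could not have been produced by an assorting operator.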
are called memetic algorithms (MAs) [79, 84]. They differ from other
hybrid evolutionary approaches in that all individuals in the popula-
tion are local optima, since after each mutation or recombination, a
local search is applied. In the following, some of these approaches are
briefly described. They have been shown to be among the best heuristic
techniques for the TSP developed so far.
Brady's crossover

Parent A:     4 2 0 | 8 9 5 3 7 | 6 1
Parent B:     2 | 9 8 3 7 5 | 0 1 6 4
Offspring A': 4 2 0 | 9 8 3 7 5 | 6 1
In the above example, the sum d89 + d95 + d53 + d37 is greater than
the sum d98 + d83 + d37 + d75, hence the subpath 9,8,3,7,5 from B is
copied over to A (see A') overwriting path 8,9,5,3,7. The path in parent
B remains unchanged. Brady reports in [13] that for a 64-city problem
it was best to search for subpaths of length between 8 and 32. A
disadvantage of this approach is that it is quite expensive to search for
possible crossover points.
With this scheme, only up to two foreign edges are copied to the
parents. In the above example, the edges are (0,9) and (5,6).
Brady's algorithm can be regarded as the first MA proposed for the
TSP.
The asynchronous parallel genetic optimization strategy (ASPARA-
GOS) [38, 39] has been the best evolutionary strategy for the TSP for
years. In this approach, offspring are generated using maximum preser-
vative crossover (MPX). Mutation is applied afterwards followed by a
2-repair, a variant of 2-opt local search focusing on newly introduced
edges.
The MPX proposed in [39, 85] has similarities with the traditional
two-point crossover. To construct an offspring tour, a subpath between
two randomly chosen crossover points is copied from the first parent to
the offspring. The partial tour is extended by copying edges from the
second parent afterwards. If the subpath cannot be extended in this way
to retain a feasible solution, the edges from the first parent are checked.
If there is no such edge from the first parent that can be used to extend
the tour, a previously unvisited city is added from the second parent
which comes next after the end point in the string. The table below
shows an example.
MPX crossover

Parent A:    4 2 0 8 9 5 3 7 6 1
Parent B:    2 9 8 3 7 5 0 1 6 4
Offspring C: 0 8 9 5 7 3 1 6 4 2
Fine-grained PGAs for the TSP have also been studied in [52], and a
variant of ASPARAGOS has been proposed in [14] called the insular ge-
netic algorithm. A modified version of ASPARAGOS has been proposed
in [40] called ASPARAGOS96 with a hierarchical population structure
and slightly modified MPX and mutation.
The strategic edge crossover (SEX) introduced in [80] by Moscato and
Norman is similar to the edge recombination operator [114] in that an
edge list is utilized to find tour segments consisting only of parent edges.
These tour segments are closed up to a subtour eventually introducing
foreign edges and finally, the subtours are combined into a single tour by
Karp's patching heuristic [55]. Later, Holstein and Moscato developed
in [48] another recombination operator, which first copies all common
edges to the offspring and is therefore respectful. Secondly, from the
parent tours, edges are chosen in order of increasing length ensuring
that the TSP constraints are obeyed. Finally, the resulting tour segments
are connected with a nearest neighbor heuristic.
The name genetic local search (GLS) was first used by Ulder et al. in
[109] to describe an EA with recombination and consequently applied
local search. Within this scheme, all individuals of the population rep-
resent local optima with respect to the chosen local search. In [109],
the population model of a GA has been used instead of a model with
a structured population and asynchronous application of the variation
operators. The recombination operator used was MPX, and instead of
2-repair, 2-opt and the LK local search were incorporated.
In [15], Bui and Moon also propose a GLS algorithm with LK as the
local search procedure. They developed a k-point crossover operator
with an additional repair mechanism to produce feasible offspring.
The approach described in the present paper as published in [34,
35, 71] is also a GLS and uses LK local search and a new recombi-
nation operator called the distance preserving crossover (DPX). This
algorithm has won the first international contest on evolutionary opti-
mization (ICEO) at the IEEE International Conference on Evolutionary
Optimization [8, 34].
In [112] Walters developed a two-point crossover for a nearest neigh-
bor representation and a repair mechanism called directed edge repair
(DER) to achieve feasibility of the solutions. He uses 3-opt local search
to improve the solutions further. Brood selection is incorporated to
select the best of the children produced by crossover.
In [56] Katayama and Narihisa proposed an EA with LK and small
populations (just two individuals) and a heuristic recombination scheme.
Their approach is similar to the iterated LK heuristic but additional di-
versification is achieved by the recombination of the current solution and
the best solution found. The results presented for large TSP instances
are quite impressive.
of a time series (f(x_t)) defines the correlation of two points s steps away
along a random walk of length m through the fitness landscape (σ²(f)
denotes the variance of the fitness values).
Based on this correlation function, the correlation length ℓ [103] of
the landscape is defined as

    ℓ = −1 / ln(|r(1)|)    (3)

for r(1) ≠ 0. The correlation length directly reflects the ruggedness of a
landscape. The lower the value for ℓ, the more rugged the landscape.
If the landscape is statistically isotropic, that is, the time series (f(x_t))
forms a stationary random process, then a single random walk is suffi-
cient to obtain r(s). If a time series is isotropic, Gaussian, and Markovian,
then the corresponding landscape is called an AR(1) landscape and the
random walk correlation function is of the form r(s) = r(1)^s = e^(−s/ℓ) with
ℓ being the correlation length of the landscape. For example, AR(1)
landscapes are found in the NK-model and the TSP [113].
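Equation (3) can be estimated numerically by sampling a random walk. The sketch below uses our own setup (random Euclidean points and node exchange moves), which is not necessarily the walk used in [103, 113]: it records the tour length after each step, estimates the one-step autocorrelation r(1), and returns ℓ.

```python
import math, random

def tour_length(tour, pts):
    n = len(tour)
    return sum(math.dist(pts[tour[k]], pts[tour[(k + 1) % n]]) for k in range(n))

def correlation_length(pts, steps=2000, seed=0):
    """Random walk estimate of the correlation length of equation (3):
    record the tour length after each random node exchange, estimate the
    one-step autocorrelation r(1), and return -1 / ln(|r(1)|)."""
    rng = random.Random(seed)
    n = len(pts)
    tour = list(range(n))
    f = []
    for _ in range(steps):
        f.append(tour_length(tour, pts))
        i, j = rng.randrange(n), rng.randrange(n)
        tour[i], tour[j] = tour[j], tour[i]       # one node exchange move
    mean = sum(f) / len(f)
    var = sum((x - mean) ** 2 for x in f) / len(f)
    r1 = sum((f[k] - mean) * (f[k + 1] - mean)
             for k in range(len(f) - 1)) / (len(f) * var)
    return -1.0 / math.log(abs(r1))
```

Because a node exchange changes only four of the n tour edges, consecutive tour lengths are strongly correlated and ℓ grows with n, reflecting a smoother landscape under this move.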
A ruggedness measure similar to the correlation length ℓ, called the
autocorrelation coefficient, has been proposed in [1]; it has approxi-
mately the same value.
Kauffman has shown for NK-landscapes that the number of local op-
tima increases with the ruggedness of a landscape. Thus, the higher the
correlation length, the smaller the number of local optima. Krakhofer
and Stadler have shown in [63] that for random graph bipartitioning
problems there is one local optimum on the average in a ball of radius
R(ℓ), where R(s) denotes the average distance of two points s steps away
on a random walk.
A further important measure is the fitness distance correlation (FDC)
coefficient, proposed in [54] as a measure of problem difficulty for GAs.
The FDC coefficient ϱ is defined as

    ϱ(f, d_opt) = Cov(f, d_opt) / (σ(f) σ(d_opt)),    (4)
and determines how closely fitness and distance to the nearest optimum
in the search space denoted by dopt are related. (Cov(x, y) denotes the
covariance of x and y, and σ(x) denotes the standard deviation of x.) If
fitness increases when the distance to the optimum becomes smaller, then
search is expected to be easy for selection-based algorithms, since there
is a path to the optimum through solutions with increasing fitness. A
value of ϱ = −1.0 (ϱ = 1.0) for a maximization (minimization) problem
indicates that fitness and distance to the optimum are perfectly related
and that search promises to be easy. A value of ϱ = 1.0 (ϱ = −1.0) means
that with increasing fitness the distance to the optimum increases, too.
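Equation (4) is straightforward to compute from sampled local optima; a minimal sketch in plain Python (population standard deviations, our helper):

```python
def fdc(fitness, dist_to_opt):
    """Fitness distance correlation, equation (4):
    rho(f, d_opt) = Cov(f, d_opt) / (sigma(f) * sigma(d_opt))."""
    m = len(fitness)
    mf = sum(fitness) / m
    md = sum(dist_to_opt) / m
    cov = sum((f - mf) * (d - md) for f, d in zip(fitness, dist_to_opt)) / m
    sf = (sum((f - mf) ** 2 for f in fitness) / m) ** 0.5
    sd = (sum((d - md) ** 2 for d in dist_to_opt) / m) ** 0.5
    return cov / (sf * sd)
```

Perfectly aligned samples give ϱ = 1.0, perfectly opposed samples ϱ = −1.0, matching the interpretation above.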
To gain insight into the global structure of the landscape, a fitness distance
analysis (FDA) can be performed for locally optimum solutions of a
given problem instance, since the correlation of local optima has a large
influence on population-based search, as is the case for MAs [79].
Thus, it can be determined whether there is a structure in the distri-
bution of locally optimum solutions which can be exploited by a meta-
heuristic based on local search. The local optima may be contained in
a small fraction of the search space or there may be a correlation be-
tween the fitness of the local optima with their distance to an optimum
solution.
Fitness distance plots are well suited to visualize the results obtained
from FDA. Several researchers have used FDA to analyze fitness land-
scapes, including Kauffman [57] for NK-landscapes, Boese [10] for the
TSP, Reeves for a flow-shop scheduling problem [95], and Merz and
Freisleben for the graph bipartitioning problem [74].
Additionally, when performing FDA, it is useful to calculate other
properties such as the number of distinct local optima found, and the
average distance between the local optima.
Several researchers have studied the fitness landscape of the TSP to find
more effective search techniques. Even a theoretical analysis exists that
coincides with conclusions drawn from experiments.
with T denoting the set of all tours of a given TSP instance. Note that
the node exchange neighborhood is a small subset of the 4-opt neighbor-
hood, and the node (re)insertion neighborhood is a subset of the 3-opt
neighborhood since 4 edges and 3 edges are exchanged, respectively.
3.2.3 Fitness distance correlation analysis for the traveling salesman problem
The correlation of fitness of local optima and distance to the optimum
solution has already been studied by Boese in [9, 10] in order to derive
a suitable search strategy for the TSP. However, he concentrated in his
studies [10] on a single TSP instance contained in TSPLIB [96], a publicly
accessible library of TSP instances.
To obtain more general information, additional instances have been
analyzed for which the results are presented in the following. The
instances are selected to cover different problem sizes as well as prob-
lem characteristics. The first three instances denoted mpeano7, mn-
peano7, and David5 are fractal instances based on L-systems (such as
the Koch curve) with known optimum tours described in [81, 82, 88].
The number in the name denotes the order of the fractal.
The other nine instances are chosen from TSPLIB. The first instance
denoted ts225 is known to be hard to solve exactly by branch and
cut algorithms [2] although it has a small number of cities. Instance
pcb442 is a printed circuit board production instance with a regular
location of the nodes. The instances att532, pr1002, and pr2392 are
instances derived from real city locations. rat783 is an instance with a
random distribution of the cities in a rectangular area. dsj1000 denotes
an instance with clustered cities. And finally, the instances fl1400 and
fl1577 are printed circuit board drilling problems. The latter of the
two has been the smallest unsolved problem in TSPLIB for a long time.
Recently, however, it has been solved to optimality. In Figure 5, some
characteristic instances are displayed.
To obtain insight into the structure of the fitness landscapes of these
instances, experiments have been conducted in which the (cor-)relation
of fitness and distance to the optimum of locally optimum solutions has
been investigated. For instances with more than one known optimum
solution, the distance to the nearest optimum was considered. For
example, the number of optima found in experiments for the instances
ts225, rat783, and fl1400, is 147, 17, and 7, respectively. For the first
two instances, the average distance between the optima is 25.8 and 9.5,
respectively. The optima found for instance fl1400 have an average
distance of 336.6. It is assumed that all fl instances have a high number
of global optima. Since just one global optimum was known to the
authors at the beginning of the experiments, no other global optima
have been considered in the analysis.
In a first series of experiments, the local optima were produced by
a fast 3-opt local search applied to randomly generated solutions. The
results are presented in Table 1. In the first column, the name of the
instance is displayed, and in the second column the problem size n is
given. In columns three through seven, the minimum distance of the
local optima to a global optimum (min d_opt), the average distance of the
local optima to the global optimum (d_opt), the average distance between
the local optima (d_loc), the number of distinct local optima (N_3-opt) out
of 2500, and the fitness distance correlation coefficient (ϱ) are provided,
respectively. Additionally, the normalized average distance, that is, the
average distance of the local optima to the global optimum divided by
Table 1. Results of the fitness distance analysis for 3-opt solutions of the TSP.
Table 2. Results of the fitness distance analysis for LK solutions of the TSP.
solutions and the fitness of the global optimum (Δf = c(loc) − c(opt)).
The instance mpeano7 shows perfect correlation between the fitness
difference and the distance to the optimum. The local optima form a
straight line originating from the optimum. The plot for ts225 looks
quite different: for some fitness values, there are several local optima
while for most fitness values there is not even a single one, leading to
large gaps in fitness of the local optima. Problems att532, rat783, and
pr2392 exhibit a cloud of local optima in their scatter plots. The
means of the points are oriented along a straight line. The clustered
instance dsj1000 is similar but there is no orientation towards the op-
timum. This phenomenon becomes more apparent in the problems
fl1400 and fl1577. The means of the points are distributed parallel to
the Δf-axis.
The analysis has shown that local optima in the TSP are found in
a small region of the search space: on the average, more than 3/4 of
the edges are common to all local optima with one exception, fl1400.
Furthermore, fitness and distance to the optimum are correlated for most
instances, and the average distance between the local optima is similar
to the distance to the optimum. Thus, the global optimum appears to be
more or less central among the local optima. Boese calls the structure
of the TSP landscape the big valley structure, since local optima are
closer together if they are closer to the optimum, and the smaller the
tour length (cost), the closer they are to the optimum. However, the
analysis has also shown that not all instances exhibit this structure as,
for example, ts225. Furthermore, the analysis indicates that problems
from application domains such as the drilling problems are harder to
solve than randomly generated instances with uniform distribution.

[Figure: fitness distance plots (fitness difference Δf versus distance to
optimum d_opt) for the instances ts225, mpeano7, att532, rat783,
dsj1000, pr2392, fl1400, and fl1577.]

The
fractal instances on the other hand are very easy to solve. They are not
well suited as benchmark problems for highly effective heuristics since
they do not have the same characteristics as the instances arising in TSP
applications. The big valley structure can be well exploited by an MA
with recombination since good solutions are more likely to be found
near other local optima, and most recombination operators produce
solutions that lie between other solutions (respectful recombination).
The proposed MAs for the TSP are similar to the EA outlined above:
A population of locally optimum solutions is evolved over time by ap-
plying evolutionary variation operators (mutation and recombination
operators). To ensure that the individuals in the population are local
optima, after each application of an evolutionary variation operator, local
search is applied. This includes the initialization phase of the popu-
lation in which solutions are constructed from scratch: A local search
procedure is applied to these solutions so that even the first generation
consists exclusively of local optima.
The problem-specific parts of the algorithm comprise initialization,
local search, and the evolutionary variation operators.
Mutation operators used in simple EAs are not suited for use in MAs,
since subsequently applied local search procedures will usually revert
the changes made. For example, the inversion operator randomly ex-
changing two edges is ineffective when 2-opt, 3-opt, or LK local search
is used. Therefore, in MAs alternative mutation operators are required.
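A standard choice for such an MA mutation is the nonsequential four change, commonly implemented as the double bridge move; a sketch (our implementation) that cuts the tour into four segments A|B|C|D and reconnects them as A|C|B|D, so that the four new edges cannot be removed by a single sequential 2-opt or 3-opt move:

```python
import random

def double_bridge(tour, rng=random):
    """Nonsequential four change ("double bridge"): cut the tour into
    four segments A|B|C|D and reconnect them as A|C|B|D. A subsequent
    2-opt or 3-opt local search cannot simply undo this move."""
    n = len(tour)
    i, j, k = sorted(rng.sample(range(1, n), 3))  # three distinct cut points
    return tour[:i] + tour[j:k] + tour[i:j] + tour[k:]
```

Because all four segments are nonempty, the mutated tour is always a valid permutation that differs from its parent.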
Parent 1: 5 3 9 1 2 8 0 6 7 4
Parent 2: 1 2 5 3 9 4 8 6 0 7
Fragments: 5 3 9 1 2 8 0 6 7 4
Offspring: 6 0 5 3 9 8 7 2 1 4
procedure GX(a, b);
begin
  let x = ();
  let rem = n;
  /* Copy common edges */
  foreach edge e in a do
    if (e in b and cRate < random[0,1)) then
      add e to x;
      rem := rem - 1;
    end;
  end;
  /* Insert new edges */
  for k := 1 to (rem * nRate) do
    i := n * random[0,1);
    j := select from (the 5 nearest neighbors of i)
         with (i, j) feasible and (i, j) not in a or b;
    add edge (i, j) to x;
    rem := rem - 1;
  end;
  /* Inherit edges from parents */
  for k := 1 to (rem * iRate) do
    parent := select randomly from (a, b);
    if (parent has candidate edges) then
      e := select from (2 shortest candidates);
      add e to x;
      rem := rem - 1;
    end;
  end;
  /* Greedy completion */
  while (rem > 0) do
    e := select from (2 shortest candidates);
    add e to x;
    rem := rem - 1;
  end;
  return x;
end;
In the implementation of the algorithms for the TSP described in this pa-
per, a nearest neighbor list of size m = 100 for each node is maintained,
which is initialized by nearest neighbor queries on a two-dimensional
binary search tree [6]. In the local search procedures, a data structure
for maintaining don't look bits is incorporated, with the local search for
the initial population starting with all don't look bits set to zero. After
recombination has been performed, only the dont look bits of the nodes
that are incident to the edges not shared by both parents are cleared.
Similarly, after mutation, only nodes incident to the edges newly in-
cluded in the tour have their don't look flags set to zero. This focuses
the search of the hill-climber to the promising regions of the search space
and also reduces the time for checking the interesting members of the
neighborhood.
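The don't look bit scheme just described can be sketched for a plain 2-opt local search. This is a simplified illustration, not the paper's implementation: it scans the full neighborhood instead of nearest neighbor lists and assumes cities labeled 0..n−1.

```python
import math, random

def two_opt_dlb(tour, dist, dont_look):
    """2-opt local search driven by don't look bits: only cities whose bit
    is cleared are examined; a city's bit is set once no improving move
    starts at it, and the bits of the four cities incident to an exchanged
    edge pair are cleared again."""
    n = len(tour)
    pos = {c: k for k, c in enumerate(tour)}      # city -> position
    improved = True
    while improved:
        improved = False
        for a in range(n):                        # a is a city label
            if dont_look[a]:
                continue
            found = False
            i = pos[a]
            for j in range(n):
                i1, j1 = (i + 1) % n, (j + 1) % n
                if j == i or j == i1 or j1 == i:  # adjacent edges: no move
                    continue
                ci, ci1, cj, cj1 = tour[i], tour[i1], tour[j], tour[j1]
                gain = (dist(ci, ci1) + dist(cj, cj1)
                        - dist(ci, cj) - dist(ci1, cj1))
                if gain > 1e-12:                  # improving edge exchange
                    lo, hi = sorted((i, j))
                    tour[lo + 1:hi + 1] = tour[lo + 1:hi + 1][::-1]
                    pos = {c: k for k, c in enumerate(tour)}
                    for c in (ci, ci1, cj, cj1):
                        dont_look[c] = False      # re-examine touched cities
                    found = improved = True
                    break
            if not found:
                dont_look[a] = True               # nothing gained from city a
    return tour

# usage on a small random Euclidean instance
random.seed(3)
pts = [(random.random(), random.random()) for _ in range(20)]
result = two_opt_dlb(list(range(20)), lambda u, v: math.dist(pts[u], pts[v]),
                     [False] * 20)
```

After recombination or mutation, clearing only the bits of nodes incident to changed edges restricts this scan to the perturbed part of the tour, exactly as described above.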
Additionally, in the algorithm for the TSP, data structures have been
incorporated to deal with large instances of up to 100,000 cities. Since
for large instances it is not possible to store the entire distance matrix
in main memory, the Euclidean distances are computed online. This is a
rather expensive operation, so a distance cache of size 3n is maintained,
where the first n entries are used to cache the distances of the edges in the
current tour and the remaining 2n entries are organized as described
in [6]. The average hit rate of the cache varies between 80% and 95%.
Another target for optimizations is the LK heuristic itself. Most of
the computation time is spent in submoves that will be reversed later in
the algorithm. Hence, it is profitable to distinguish between tentative
and permanent moves. Applegate and Cook have proposed a segment
tree data structure for efficiently managing tentative moves, as described
in [33]. Instead of using a segment tree, the algorithms described here
operate on a segment list that represents a tentative tour. Operations
performing a flip on this tentative tour are highly optimized, such that a
high performance gain compared to the simple array representation can
be achieved. The running times for all operations are in O(1), since the
data structure is limited to perform 20 flips only. In practice, this has
proven to be sufficient.
significantly worse than the other algorithms. For fl1577, the MAs with
DPX and GX outperform all other competitors, with the MA using
DPX being the best. For pr2392, all recombination-based algorithms
perform similarly, but the MAs with mutation and ILK perform signif-
icantly worse. In the case of pcb3038, the largest instance considered,
all results lie close together. The MAs with DPX and MPX outperform
ILK and the MA with NS4. Among the greedy recombination MAs, large
differences can be observed. The best results are obtained with a new edge
introduction rate of 0.25. The results show no clear tendency and often
the values lie too close together to be significantly different. However,
in none of the cases did ILK or the MA with mutation outperform the
MA using DPX or the best greedy recombination. The performance dif-
ferences between mutation and recombination operators have become
more apparent using 2-opt local search. For larger instances, this may
also be observed for MAs with the LK heuristic.
In an additional experiment, the combination of recombination and
mutation operators in MAs has been investigated. In the same exper-
imental setup as before, the MAs with DPX and MPX recombination
have been run with the nonsequential four change mutation operator.
The results are provided in Table 5. The table contains the results
achieved with DPX and MPX without mutation as well as the results
for a mutation operator application rate of m = 0.1 and m = 0.5. The
number of offspring per generation produced by mutation is m · P. The
results show a clear tendency: in the majority of runs, additional mu-
tation improves the results. Furthermore, it is shown that the mutation
application rate of m = 0.1 is preferable.
The results show that a running time smaller than an hour is sufficient
to reach a quality of less than 1% for all problems except the largest one.
For the latter, the running time increases to 12,000 seconds. Increasing
the population size increases the final solution quality, but running times
increase drastically. In the extreme case of the largest problem, the
running times grow 4.2 times from 12,314 to 52,180 seconds. In most
other cases the running time grows less than 3 times. It can be observed
that the pla-problems are better solved than the other instances with
respect to the solution quality.
5. Conclusions
References
[15] T. G. Bui and B. R. Moon, A New Genetic Approach for the Traveling
Salesman Problem, in Proceedings of the First IEEE Conference on
Evolutionary Computation (IEEE Press, 1994).
[35] B. Freisleben and P. Merz, New Genetic Local Search Operators for the
Traveling Salesman Problem, in Proceedings of the Fourth International
Conference on Parallel Problem Solving from Nature – PPSN IV, edited
by H.-M. Voigt, W. Ebeling, I. Rechenberg, and H.-P. Schwefel (Springer,
Berlin, 1996).
[37] D. E. Goldberg and J. R. Lingle, Alleles, Loci, and the Traveling Sales-
man Problem, in Proceedings of an International Conference on Ge-
netic Algorithms and their Applications (Carnegie Mellon Publishers,
1985).
[43] J. Gu and X. Huang, Efficient Local Search With Search Space Smooth-
ing: A Case Study of the Traveling Salesman Problem (TSP), IEEE
Transactions on Systems, Man, and Cybernetics, 24 (1994) 728–735.
[50] C.-S. Jeong and M.-H. Kim, Fast Parallel Simulated Annealing for
Traveling Salesman Problem on SIMD Machines with Linear Intercon-
nections, Parallel Computing, 17(2–3) (1991) 221–228.
[68] F. Margot, Quick Updates for p-opt TSP Heuristics, Operations Re-
search Letters, 11 (1992) 45–46.
[71] P. Merz and B. Freisleben, Genetic Local Search for the TSP: New
Results, in Proceedings of the 1997 IEEE International Conference
on Evolutionary Computation, edited by T. Bäck, Z. Michalewicz, and
X. Yao (IEEE Press, Piscataway, NJ, 1997).
[108] G. Tao and Z. Michalewicz, Inver-over Operator for the TSP, in Pro-
ceedings of the Fifth International Conference on Parallel Problem Solv-
ing from Nature – PPSN V, edited by A.-E. Eiben, T. Bäck, M. Schoe-
nauer, and H.-P. Schwefel (Springer, 1998).
[110] C. L. Valenzuela, Evolutionary Divide and Conquer (II) for the TSP, in
Proceedings of the Genetic and Evolutionary Computation Conference,
edited by W. Banzhaf, J. Daida, A. E. Eiben, M. H. Garzon, V. Honavar,
M. Jakiela, and R. E. Smith (Morgan Kaufmann, 1999).
[112] T. Walters, Repair and Brood Selection in the Traveling Salesman Prob-
lem, in Proceedings of the Fifth International Conference on Parallel
Problem Solving from Nature – PPSN V, edited by A.-E. Eiben, T. Bäck,
M. Schoenauer, and H.-P. Schwefel (Springer, 1998).