(2) Maximum sum of the objective values: MaxSum
The maximum value of the sum of the k objective values was calculated in each generation as follows:

$$\mathrm{MaxSum}(\Psi) = \max_{x \in \Psi} \sum_{i=1}^{k} f_i(x), \qquad (4)$$

where Ψ denotes the current population, and k is the number of the given objectives. This MaxSum measure evaluates the convergence of solutions toward the Pareto front around its center region.
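For concreteness, the MaxSum measure in (4) is straightforward to compute from the objective vectors of the current population. The following minimal sketch is not code from the paper; it assumes each solution is represented by its objective vector and that all objectives are maximized:

```python
def max_sum(objective_vectors):
    """MaxSum measure of Eq. (4): the largest sum of the k objective values
    over all solutions in the current population (maximization assumed)."""
    return max(sum(f) for f in objective_vectors)

# Example with a 2-objective population of three solutions.
population = [(3950.0, 4100.0), (4200.0, 3800.0), (4050.0, 4050.0)]
print(max_sum(population))  # 8100.0
```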
Experimental results are summarized in Fig. 2. We normalized all the experimental results for each test problem so that the average result over ten initial populations becomes 100. From Fig. 2, we can see that [...]

If one carefully compares Fig. 3 with Fig. 1, one may notice that the diversity started to increase in Fig. 3 just after the number of non-dominated solutions exceeded 200 in Fig. 1 (i.e., just after all the solutions in the current population became non-dominated).

[Figure: Normalized Range for the 2-500, 4-500, 6-500, 8-500 and 10-500 test problems.]
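The normalization behind Fig. 2 is a one-line rescaling. The sketch below is an assumption about how it can be implemented (the paper shows no code); the raw results of the ten runs on one test problem are assumed to be collected in a list:

```python
def normalize_to_average_100(results):
    """Rescale results so that their average becomes 100, as done for each
    test problem in Fig. 2 (raw results of the ten runs assumed as input)."""
    average = sum(results) / len(results)
    return [100.0 * r / average for r in results]

print(normalize_to_average_100([40.0, 50.0, 60.0]))  # [80.0, 100.0, 120.0]
```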
$$w_1 + w_2 + \cdots + w_k = d, \qquad (8)$$
$$w_i \in \{0, 1, 2, \ldots, d\}, \quad i = 1, 2, \ldots, k, \qquad (9)$$

where d is a user-definable positive integer. Almost the same idea was also used in [23]. In our computational experiments, we examined various values of d.
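As a concrete illustration (a sketch, not code from the paper), the weight vector set in (8)-(9) can be enumerated recursively. For k = 6 objectives this yields 126 vectors for d = 4 and 1287 vectors for d = 8, the two specifications examined later:

```python
def weight_vectors(k, d):
    """All integer weight vectors (w_1, ..., w_k) with w_i in {0, 1, ..., d}
    and w_1 + w_2 + ... + w_k = d, i.e., the set defined by (8)-(9)."""
    if k == 1:
        return [(d,)]
    vectors = []
    for w1 in range(d + 1):
        for rest in weight_vectors(k - 1, d - w1):
            vectors.append((w1,) + rest)
    return vectors

print(len(weight_vectors(6, 4)))  # 126 weight vectors (the d = 4 case)
print(len(weight_vectors(6, 8)))  # 1287 weight vectors (the d = 8 case)
```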
NSGA-II is the framework of our hybrid algorithm. When a pair of solutions are to be selected as parents from the current population, the weighted sum scheme is used with a probability PPS in our hybrid algorithm. In the weighted sum scheme for parent selection, a weight vector is randomly selected from the set of uniformly distributed weight vectors in (8)-(9). The weighted sum scheme is also used in the same manner with a probability PGU for generation update in our hybrid algorithm. Our hybrid algorithm is the same as NSGA-II except for the use of the weighted sum scheme for parent selection and generation update. It should be noted that our hybrid algorithm is exactly the same as NSGA-II when PPS = PGU = 0.0 (i.e., when the weighted sum scheme is never invoked).
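A minimal sketch of how the weighted sum scheme can be woven into parent selection is given below. It is not the authors' code: the binary tournament, the dictionary representation of a solution (with its objective vector stored under "objectives"), and the pareto_based_selection callback standing in for NSGA-II's usual rank-and-crowding selection are all assumptions made for illustration, and maximization of the weighted sum is assumed.

```python
import random

def weighted_sum(objectives, weights):
    """Weighted sum fitness w_1*f_1(x) + ... + w_k*f_k(x)."""
    return sum(w * f for w, f in zip(weights, objectives))

def select_parent(population, weight_set, p_ps, pareto_based_selection):
    """Parent selection in the hybrid algorithm (sketch).

    With probability PPS, a weight vector is drawn at random from the set
    defined by (8)-(9) and the better of two randomly picked solutions with
    respect to the weighted sum is returned (binary tournament assumed).
    Otherwise the original Pareto-dominance-based selection of NSGA-II is
    used, represented here by the pareto_based_selection callback.
    """
    if random.random() < p_ps:
        weights = random.choice(weight_set)
        a, b = random.sample(population, 2)
        if weighted_sum(a["objectives"], weights) >= weighted_sum(b["objectives"], weights):
            return a
        return b
    return pareto_based_selection(population)
```

Generation update can be handled analogously: with probability PGU a randomly drawn weight vector decides which solutions survive, and otherwise the usual non-dominated sorting and crowding distance of NSGA-II are applied.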
4. Results of our hybrid algorithm

We performed computational experiments under the following conditions:

Stopping conditions: 2,000 generations.

Before examining the effect of using the weighted sum fitness function, we first examined the sensitivity of the performance of NSGA-II to the specification of the crossover and mutation probabilities. We examined all the 5×5 combinations of the following probabilities:

Crossover probability: 0.2, 0.4, 0.6, 0.8, 1.0,
Mutation probability: 0.001, 0.002, 0.004, 0.008, 0.016.
For each combination, we calculated the average value of the hypervolume measure over ten runs for each of the 2-500, 4-500 and 6-500 test problems. Since the hypervolume calculation takes a long time for many-objective problems, we did not use it for the other test problems: 8-500 and 10-500. Due to the page limitation, we only show experimental results on the 6-500 problem in Fig. 4. We can see that the choice of a mutation probability has a large effect on the average value of the hypervolume measure. We chose the combination of the crossover probability 0.6 and the mutation probability 0.004 based on the results for the three test problems (e.g., Fig. 4). Hereafter, we use this combination in our computational experiments.
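The hypervolume measure used above is expensive to compute exactly as the number of objectives grows, which is why it was not applied to the 8-500 and 10-500 problems. Purely as a hedged aside (not the paper's method), a Monte Carlo estimate is one common way to approximate it; the sketch below assumes maximization, a reference point, and an upper-bound point that encloses the front.

```python
import random

def hypervolume_estimate(front, ref_point, upper_bound, n_samples=100_000):
    """Monte Carlo estimate of the hypervolume of a non-dominated front
    (maximization assumed): the fraction of the box between ref_point and
    upper_bound dominated by at least one solution, times the box volume."""
    k = len(ref_point)
    box_volume = 1.0
    for lo, hi in zip(ref_point, upper_bound):
        box_volume *= hi - lo
    hits = 0
    for _ in range(n_samples):
        p = [random.uniform(lo, hi) for lo, hi in zip(ref_point, upper_bound)]
        if any(all(f[i] >= p[i] for i in range(k)) for f in front):
            hits += 1
    return box_volume * hits / n_samples
```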
Then we applied our hybrid algorithm to the 6-500 problem. We examined all the 6×6 combinations of the following probabilities of the weighted sum scheme for parent selection (PPS) and generation update (PGU):

PPS = 0.0, 0.2, 0.4, 0.6, 0.8, 1.0,
PGU = 0.0, 0.2, 0.4, 0.6, 0.8, 1.0.

We also examined several specifications of d in (8). Experimental results are shown in Fig. 5 for d = 4 (126 weight vectors) and Fig. 6 for d = 8 (1287).

[Figure 4. NSGA-II on 6-500 (Hypervolume): average hypervolume (×10^25) over the 5×5 combinations of the crossover probability PC and the mutation probability PM.]
[Figure 5. Hybrid on 6-500 (d = 4: 126 vectors): average hypervolume (×10^25) over the 6×6 combinations of PPS and PGU.]
[Figure 6. Hybrid on 6-500 (d = 8: 1287 vectors): average hypervolume (×10^25) over the 6×6 combinations of PPS and PGU.]
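The 6×6 grid of (PPS, PGU) combinations described above amounts to a double loop over the six probability values. The sketch below is hypothetical scaffolding rather than the authors' experimental code: run_hybrid and hypervolume stand in for one run of the hybrid algorithm and for the performance measure, and averaging over ten runs is assumed as in the NSGA-II sensitivity experiment.

```python
# Hypothetical sweep over the 6x6 combinations of PPS and PGU on 6-500.
P_VALUES = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]

def sweep(run_hybrid, hypervolume, n_runs=10):
    results = {}
    for p_ps in P_VALUES:
        for p_gu in P_VALUES:
            values = [hypervolume(run_hybrid(p_ps=p_ps, p_gu=p_gu, seed=run))
                      for run in range(n_runs)]
            results[(p_ps, p_gu)] = sum(values) / n_runs  # average over runs
    return results
```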
[Figure 7. MaxSum by Hybrid (6-500 with d = 4): MaxSum over the 6×6 combinations of PPS and PGU.]
[Figure 8. Range by Hybrid (6-500 with d = 4): Range over the 6×6 combinations of PPS and PGU.]

From the comparison of Fig. 4 by NSGA-II with Fig. 5 and Fig. 6, we can see that the probabilistic use of the weighted sum fitness function clearly improved the performance of NSGA-II. It should be noted that the right bottom corner with PPS = PGU = 0.0 of Fig. 5 and Fig. 6 is the same as NSGA-II. The performance of NSGA-II at the right bottom corner of each figure was improved by probabilistically using the weighted sum fitness function especially for generation update [...]

[...] fitness function. Whereas the Range measure in Fig. 10 was somewhat decreased, Fig. 11 does not suggest any severe negative effect on the diversity of solutions.

[Figure 9. Hybrid on 10-500 (MaxSum): MaxSum over the combinations of PPS and PGU.]
[Figure 10. Hybrid on 10-500 (Range): Range over the combinations of PPS and PGU.]
We also showed that its use in the generation update phase has a dominant effect. Future research issues include the use of other scalarizing fitness functions (e.g., scalarizing achievement function), computational experiments on other test problems, and performance comparison with other EMO algorithms such as IBEAs [18]-[21], MSOPS-II [22] and MOEA/D [23].

This work was supported in part by Grant-in-Aid for Scientific Research (B) (20300084).

References

[1] K. Deb, Multi-Objective Optimization Using Evolutionary Algorithms, John Wiley & Sons, Chichester, 2001.
[2] C. A. C. Coello, D. A. van Veldhuizen, and G. B. Lamont, Evolutionary Algorithms for Solving Multi-Objective Problems, Kluwer Academic Publishers, Boston, 2002.
[3] C. A. C. Coello and G. B. Lamont, Applications of Multi-Objective Evolutionary Algorithms, World Scientific, Singapore, 2004.
[4] A. Abraham, L. C. Jain, and R. Goldberg (eds.), Evolutionary Multiobjective Optimization: Theoretical Advances and Applications, Springer, Berlin, 2005.
[5] K. C. Tan, E. F. Khor, and T. H. Lee, Multiobjective Evolutionary Algorithms and Applications, Springer, Berlin, 2005.
[6] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, "A fast and elitist multiobjective genetic algorithm: NSGA-II," IEEE Trans. on Evolutionary Computation 6 (2), pp. 182-197, April 2002.
[7] E. Zitzler and L. Thiele, "Multiobjective evolutionary algorithms: A comparative case study and the strength Pareto approach," IEEE Trans. on Evolutionary Computation 3 (4), pp. 257-271, November 1999.
[8] V. Khare, X. Yao, and K. Deb, "Performance scaling of multi-objective evolutionary algorithms," Lecture Notes in Computer Science 2632: Evolutionary Multi-Criterion Optimization - EMO 2003, pp. 367-390, Springer, Berlin, April 2003.
[9] R. C. Purshouse and P. J. Fleming, "Evolutionary many-objective optimization: An exploratory analysis," Proc. of 2003 IEEE Congress on Evolutionary Computation, pp. 2066-2073, Canberra, December 8-12, 2003.
[10] A. Jaszkiewicz, "On the computational efficiency of multiple objective metaheuristics: The knapsack problem case study," European Journal of Operational Research 158 (2), pp. 418-433, October 2004.
[11] E. J. Hughes, "Evolutionary many-objective optimization: Many once or one many?," Proc. of 2005 IEEE Congress on Evolutionary Computation, pp. 222-227, Edinburgh, September 2-5, 2005.
[12] H. Ishibuchi, Y. Nojima, and T. Doi, "Comparison between single-objective and multi-objective genetic algorithms: Performance comparison and performance measures," Proc. of 2006 IEEE Congress on Evolutionary Computation, pp. 3959-3966, Vancouver, July 16-21, 2006.
[13] H. Ishibuchi, N. Tsukamoto, and Y. Nojima, "Evolutionary many-objective optimization: A short review," Proc. of 2008 IEEE Congress on Evolutionary Computation, pp. 2424-2431, Hong Kong, June 1-6, 2008.
[14] H. Ishibuchi, N. Tsukamoto, and Y. Nojima, "Behavior of evolutionary many-objective optimization," Proc. of 10th International Conference on Computer Modeling and Simulation, pp. 266-271, Cambridge, April 1-3, 2008.
[15] H. Ishibuchi, N. Tsukamoto, Y. Hitotsuyanagi, and Y. Nojima, "Effectiveness of scalability improvement attempts on the performance of NSGA-II for many-objective problems," Proc. of 2008 Genetic and Evolutionary Computation Conference, pp. 649-656, Atlanta, July 12-16, 2008.
[16] H. Ishibuchi, T. Doi, and Y. Nojima, "Incorporation of scalarizing fitness functions into evolutionary multiobjective optimization algorithms," Lecture Notes in Computer Science 4193: Parallel Problem Solving from Nature - PPSN IX, pp. 493-502, Springer, Berlin, September 2006.
[17] E. Zitzler and L. Thiele, "Multiobjective optimization using evolutionary algorithms - A comparative case study," Lecture Notes in Computer Science 1498, pp. 292-301, Springer, Berlin, September 1998.
[18] E. Zitzler and S. Künzli, "Indicator-based selection in multiobjective search," Lecture Notes in Computer Science 3242: Parallel Problem Solving from Nature - PPSN VIII, pp. 832-842, Springer, Berlin, 2004.
[19] M. Emmerich, N. Beume, and B. Naujoks, "An EMO algorithm using the hypervolume measure as selection criterion," Lecture Notes in Computer Science 3410: Evolutionary Multi-Criterion Optimization - EMO 2005, pp. 62-76, Springer, Berlin, 2005.
[20] N. Beume, B. Naujoks, and M. Emmerich, "SMS-EMOA: Multiobjective selection based on dominated hypervolume," European Journal of Operational Research 180 (3), pp. 1653-1669, September 2007.
[21] T. Wagner, N. Beume, and B. Naujoks, "Pareto-, aggregation-, and indicator-based methods in many-objective optimization," Lecture Notes in Computer Science 4403: Evolutionary Multi-Criterion Optimization - EMO 2007, pp. 742-756, Springer, Berlin, March 2007.
[22] E. J. Hughes, "MSOPS-II: A general-purpose many-objective optimiser," Proc. of 2007 IEEE Congress on Evolutionary Computation, pp. 3944-3951, Singapore, September 25-28, 2007.
[23] Q. Zhang and H. Li, "MOEA/D: A multiobjective evolutionary algorithm based on decomposition," IEEE Trans. on Evolutionary Computation 11 (6), pp. 712-731, December 2007.
[24] T. Murata, H. Ishibuchi, and M. Gen, "Specification of genetic search directions in cellular multi-objective genetic algorithm," Lecture Notes in Computer Science 1993: Evolutionary Multi-Criterion Optimization - EMO 2001, pp. 82-95, Springer, Berlin, March 2001.