Optimization Algorithm

This article summarizes the pymoo framework, a Python library for multi-objective optimization. Pymoo provides implementations of optimization algorithms, test problems, performance metrics, and tools for visualization and parallelization. The framework is modular and extensible, allowing users to customize algorithms by supplying custom operators or submodules. Pymoo aims to support the full multi-objective optimization process within a single Python package.


This article has been accepted for publication in a future issue of this journal, but has not been fully edited. Content may change prior to final publication. Citation information: DOI 10.1109/ACCESS.2020.2990567, IEEE Access

Date of publication xxxx 00, 0000, date of current version xxxx 00, 0000.
Digital Object Identifier 10.1109/ACCESS.2017.DOI

pymoo: Multi-objective Optimization in Python
JULIAN BLANK and KALYANMOY DEB (FELLOW, IEEE)
Michigan State University, East Lansing, MI 48824, USA
Corresponding Author: Julian Blank (blankjul@egr.msu.edu)

ABSTRACT Python has become the programming language of choice for research and industry projects
related to data science, machine learning, and deep learning. Since optimization is an inherent part of these research fields, more optimization-related frameworks have arisen in the past few years. Only a few of them support the optimization of multiple conflicting objectives at a time, and even those do not provide comprehensive tools for a complete multi-objective optimization task. To address this issue, we have developed pymoo, a multi-objective optimization framework in Python. We provide a guide to getting started with our framework
by demonstrating the implementation of an exemplary constrained multi-objective optimization scenario.
Moreover, we give a high-level overview of the architecture of pymoo to show its capabilities followed by
an explanation of each module and its corresponding sub-modules. The implementations in our framework
are customizable and algorithms can be modified/extended by supplying custom operators. Moreover, a
variety of single, multi- and many-objective test problems are provided and gradients can be retrieved by
automatic differentiation out of the box. Also, pymoo addresses practical needs, such as the parallelization
of function evaluations, methods to visualize low and high-dimensional spaces, and tools for multi-criteria
decision making. For more information about pymoo, readers are encouraged to visit: https://pymoo.org.

INDEX TERMS Customization, Genetic Algorithm, Multi-objective Optimization, Python

I. INTRODUCTION

Optimization plays an essential role in many scientific areas, such as engineering, data analytics, and deep learning. These fields are fast-growing and their concepts are employed for various purposes, for instance gaining insights from large data sets or fitting accurate prediction models. Whenever an algorithm has to handle a significantly large amount of data, an efficient implementation in a suitable programming language is important. Python [1] has become the programming language of choice for the above-mentioned research areas over the last few years, not only because it is easy to use but also because good community support exists. Python is a high-level, cross-platform, and interpreted programming language that focuses on code readability. A large number of high-quality libraries are available, and support for any kind of scientific computation is ensured. These characteristics make Python an appropriate tool for many research and industry projects where the investigations can be rather complex.

A fundamental principle of research is to ensure reproducibility of studies and to provide access to materials used in the research, whenever possible. In computer science, this translates to a sketch of an algorithm and the implementation itself. However, the implementation of optimization algorithms can be challenging, and benchmarking in particular is time-consuming. Having access to either a good collection of different source codes or a comprehensive library is time-saving and avoids an error-prone implementation from scratch.

To address this need for multi-objective optimization in Python, we introduce pymoo. The goal of our framework is not only to provide state-of-the-art optimization algorithms, but also to cover different aspects related to the optimization process itself. We have implemented single-, multi-, and many-objective test problems which can be used as a test-bed for algorithms. In addition to the objective and constraint values of test problems, gradient information can be retrieved through automatic differentiation [2]. Moreover, a parallelized evaluation of solutions can be implemented through vectorized computations, multi-threaded execution, and distributed computing. Further, pymoo provides implementations of performance indicators to measure the quality of results obtained by a multi-objective optimization algorithm. Tools for an explorative analysis through visualization

VOLUME 4, 2016 1

This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.

of lower and higher-dimensional data are available, and multi-criteria decision making methods guide the selection of a single solution from a solution set based on preferences.

Our framework is designed to be extendable through its modular implementation. For instance, a genetic algorithm is assembled in a plug-and-play manner by making use of specific sub-modules, such as initial sampling, mating selection, crossover, mutation and survival selection. Each sub-module takes care of an aspect independently and, therefore, variants of algorithms can be initiated by passing different combinations of sub-modules. This concept allows end-users to incorporate domain knowledge through custom implementations. For example, in an evolutionary algorithm a biased initial sampling module created with the knowledge of domain experts can guide the initial search.

Furthermore, we would like to mention that our framework is well-documented with a large number of available code snippets. We created a starter's guide for users to become familiar with our framework and to demonstrate its capabilities. As an example, it shows the optimization results of a bi-objective optimization problem with two constraints. An extract from the guide is presented in this paper. Moreover, we provide an explanation of each algorithm and source code to run it on a suitable optimization problem in our software documentation. Additionally, we show a definition of test problems and provide a plot of their fitness landscapes. The framework documentation is built using Sphinx [3] and the correctness of modules is ensured by automatic unit testing [4]. Most algorithms have been developed in collaboration with the second author and have been benchmarked extensively against the original implementations.

In the remainder of this paper, we first present related existing optimization frameworks in Python and in other programming languages. Then, we provide a guide to getting started with pymoo in Section III which covers the most important steps of our proposed framework. In Section IV we illustrate the framework architecture and the corresponding modules, such as problems, algorithms and related analytics. Each of the modules is then discussed separately in Sections V to VII. Finally, concluding remarks are presented in Section VIII.

II. RELATED WORKS

In the last decades, various optimization frameworks in diverse programming languages were developed. However, some of them only partially cover multi-objective optimization. In general, the choice of a suitable framework for an optimization task is a multi-objective problem itself. Moreover, some criteria are rather subjective, for instance, the usability and extendibility of a framework, and, therefore, the assessment regarding such criteria as well as the decision making process differ from user to user. For example, one might have decided on a programming language first, either because of personal preference or a project constraint, and then search for a suitable framework. One might give more importance to the overall features of a framework, for example parallelization or visualization, than to the programming language itself. An overview of some existing multi-objective optimization frameworks in Python is listed in Table 1, each of which is described in the following.

Recently, the well-known multi-objective optimization framework jMetal [5], developed in Java [6], has been ported to a Python version, namely jMetalPy [7]. The authors aim to further extend it and to make use of the full feature set of Python, for instance, data analysis and data visualization. In addition to traditional optimization algorithms, jMetalPy also offers methods for dynamic optimization. Moreover, the post analysis of performance metrics of an experiment with several independent runs is automated.

Parallel Global Multiobjective Optimizer, PyGMO [8], is an optimization library for the easy distribution of massive optimization tasks over multiple CPUs. It uses the generalized island-model paradigm for the coarse-grained parallelization of optimization algorithms and, therefore, allows users to develop asynchronous and distributed algorithms.

Platypus [9] is a multi-objective optimization framework that offers implementations of state-of-the-art algorithms. It enables users to create an experiment with various algorithms and provides post-analysis methods based on metrics and visualization.

Distributed Evolutionary Algorithms in Python (DEAP) [10] is a novel evolutionary computation framework for rapid prototyping and testing of ideas. Even though DEAP does not focus on multi-objective optimization, multi-objective algorithms can be developed due to the modularity and extendibility of the framework. Moreover, parallelization and load-balancing tasks are supported out of the box.

Inspyred [11] is a framework for creating bio-inspired computational intelligence algorithms in Python which is not focused on multi-objective algorithms directly, but on evolutionary computation in general. However, an example for NSGA-II [12] is provided and other multi-objective algorithms can be implemented through the modular implementation of the framework.

If the search for frameworks is not limited to Python, other popular frameworks should be considered: PlatEMO [13] in Matlab, MOEA [14] and jMetal [5] in Java, jMetalCpp [15] and PaGMO [16] in C++. Of course, this is not an exhaustive list and readers may search for other available options.

III. GETTING STARTED¹

In the following, we provide a starter's guide for pymoo. It covers the most important steps in an optimization scenario, starting with the installation of the framework, defining an optimization problem, and the optimization procedure itself.

¹ All source codes in this paper are related to pymoo version 0.4.0. A getting started guide for upcoming versions can be found at https://pymoo.org.


TABLE 1: Multi-objective Optimization Frameworks in Python.

Name       License     Focus on multi-objective   Pure Python   Visualization   Decision Making
jMetalPy   MIT         yes                        yes           yes             no
PyGMO      GPL-3.0     yes                        no            no              no
Platypus   GPL-3.0     yes                        yes           no              no
DEAP       LGPL-3.0    no                         yes           no              no
Inspyred   MIT         no                         yes           no              no
pymoo      Apache 2.0  yes                        yes           yes             yes
A. INSTALLATION

Our framework pymoo is available on PyPI [17], a central repository that makes Python software packages easily accessible. The framework can be installed by using the package manager:

$ pip install -U pymoo

Some components are available in Python and additionally in Cython [18]. Cython allows developers to annotate existing Python code which is translated to the C or C++ programming languages. The translated files are compiled to a binary executable and can be used to speed up computations. During the installation of pymoo, attempts are made for compilation; however, if unsuccessful due to the lack of a suitable compiler or other reasons, the pure Python version is installed. We would like to emphasize that the compilation is optional and all features are available without it. More detail about the compilation and troubleshooting can be found in our installation guide online.

B. PROBLEM DEFINITION

In general, multi-objective optimization involves several objective functions subject to inequality and equality constraints [19]. The goal is to find a set of solutions (variable vectors) that satisfy all constraints and are as good as possible regarding all objective values. The problem definition in its general form is given by:

    min   f_m(x),                  m = 1, .., M,
    s.t.  g_j(x) ≤ 0,              j = 1, .., J,        (1)
          h_k(x) = 0,              k = 1, .., K,
          x_i^L ≤ x_i ≤ x_i^U,     i = 1, .., N.

The formulation above defines a multi-objective optimization problem with N variables, M objectives, J inequality, and K equality constraints. Moreover, for each variable x_i, lower and upper variable boundaries (x_i^L and x_i^U) are also defined.

In the following, we illustrate a bi-objective optimization problem with two constraints.

    min   f1(x) = x1^2 + x2^2,
    max   f2(x) = -(x1 - 1)^2 - x2^2,
    s.t.  g1(x) = 2 (x1 - 0.1) (x1 - 0.9) ≤ 0,          (2)
          g2(x) = 20 (x1 - 0.4) (x1 - 0.6) ≥ 0,
          -2 ≤ x1 ≤ 2,
          -2 ≤ x2 ≤ 2.

It consists of two objectives (M = 2) where f1(x) is minimized and f2(x) is maximized. The optimization is subject to two inequality constraints (J = 2), where g1(x) is formulated as a less-than-equal-to and g2(x) as a greater-than-equal-to constraint. The problem is defined with respect to two variables (N = 2), x1 and x2, which both are in the range [-2, 2]. The problem does not contain any equality constraints (K = 0).

FIGURE 1: Contour plot of the test problem (Equation 2).

Contour plots of the objective functions are shown in Figure 1. The contours of the objective function f1(x) are represented by solid lines and those of f2(x) by dashed lines. Constraints g1(x) and g2(x) are parabolas which intersect the x1-axis at (0.1, 0.9) and (0.4, 0.6), respectively. The Pareto-optimal set is marked by a thick orange line. Through the combination of both constraints, the Pareto-set is split into two parts. Analytically, the Pareto-optimal set is given by PS = {(x1, x2) | (0.1 ≤ x1 ≤ 0.4) ∨ (0.6 ≤ x1 ≤ 0.9), x2 = 0} and the efficient front by f2 = (√f1 - 1)^2, where f1 is defined in [0.01, 0.16] and [0.36, 0.81].

In the following, we provide an example implementation of the problem formulation above using pymoo. We assume the reader is familiar with Python and has fundamental knowledge of NumPy [20], which is utilized to deal with vector and matrix computations.

In pymoo, we consider pure minimization problems for optimization in all our modules. However, without loss of generality, an objective which is supposed to be maximized can be multiplied by -1 and be minimized [19]. Therefore,
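The test problem of Equation 2 can be checked numerically without any framework. The following standalone sketch (plain Python, no pymoo required; the helper names f1, f2 and feasible are ours) evaluates the problem at a point from the analytic Pareto-optimal set:

```python
import math

# Original min/max formulation of Equation 2 (our helper functions).
def f1(x1, x2):
    return x1**2 + x2**2             # to be minimized

def f2(x1, x2):
    return -(x1 - 1)**2 - x2**2      # to be maximized

def feasible(x1, x2):
    g1 = 2 * (x1 - 0.1) * (x1 - 0.9)    # required: g1 <= 0
    g2 = 20 * (x1 - 0.4) * (x1 - 0.6)   # required: g2 >= 0
    return g1 <= 0 and g2 >= 0 and -2 <= x1 <= 2 and -2 <= x2 <= 2

# A point from the analytic Pareto-optimal set: x2 = 0, 0.1 <= x1 <= 0.4.
x1, x2 = 0.25, 0.0
assert feasible(x1, x2)
# On the efficient front, the minimized -f2 equals (sqrt(f1) - 1)^2.
assert abs(-f2(x1, x2) - (math.sqrt(f1(x1, x2)) - 1)**2) < 1e-12
```

Points with 0.4 < x1 < 0.6 violate g2, which is exactly what splits the Pareto-optimal set into the two parts mentioned above.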

we minimize -f2(x) instead of maximizing f2(x) in our optimization problem. Furthermore, all constraint functions need to be formulated as less-than-equal-to constraints. For this reason, g2(x) needs to be multiplied by -1 to flip the ≥ to a ≤ relation. We recommend the normalization of constraints to give equal importance to each of them. For g1(x), the constant 'resource' value of the constraint is 2 · (-0.1) · (-0.9) = 0.18 and for g2(x) it is 20 · (-0.4) · (-0.6) = 4.8, respectively. We achieve normalization of constraints by dividing g1(x) and g2(x) by the corresponding constant [21].

Finally, the optimization problem to be solved using pymoo is defined by:

    min   f1(x) = x1^2 + x2^2,
    min   f2(x) = (x1 - 1)^2 + x2^2,
    s.t.  g1(x) = 2 (x1 - 0.1) (x1 - 0.9) / 0.18 ≤ 0,     (3)
          g2(x) = -20 (x1 - 0.4) (x1 - 0.6) / 4.8 ≤ 0,
          -2 ≤ x1 ≤ 2,
          -2 ≤ x2 ≤ 2.

Next, the derived problem formulation is implemented in Python. Each optimization problem in pymoo has to inherit from the Problem class. First, by calling the super() function, the problem properties such as the number of variables (n_var), objectives (n_obj) and constraints (n_constr) are initialized. Furthermore, lower (xl) and upper variable boundaries (xu) are supplied as a NumPy array. Additionally, the evaluation function _evaluate needs to be overwritten from the superclass. The method takes a two-dimensional NumPy array x with n rows and m columns as an input. Each row represents an individual and each column an optimization variable. After doing the necessary calculations, the objective values are added to the dictionary out with the key F and the constraints with the key G.

import autograd.numpy as anp
from pymoo.model.problem import Problem

class MyProblem(Problem):

    def __init__(self):
        super().__init__(n_var=2,
                         n_obj=2,
                         n_constr=2,
                         xl=anp.array([-2, -2]),
                         xu=anp.array([2, 2]))

    def _evaluate(self, x, out, *args, **kwargs):
        f1 = x[:, 0]**2 + x[:, 1]**2
        f2 = (x[:, 0] - 1)**2 + x[:, 1]**2

        g1 = 2 * (x[:, 0] - 0.1) * (x[:, 0] - 0.9) / 0.18
        g2 = -20 * (x[:, 0] - 0.4) * (x[:, 0] - 0.6) / 4.8

        out["F"] = anp.column_stack([f1, f2])
        out["G"] = anp.column_stack([g1, g2])

As mentioned above, pymoo utilizes NumPy [20] for most of its computations. To be able to retrieve gradients through automatic differentiation, we are using a wrapper around NumPy called Autograd [22]. Note that this is not obligatory for a problem definition.

C. ALGORITHM INITIALIZATION

Next, we need to initialize a method to optimize the problem. In pymoo, an algorithm object needs to be created for optimization. For each of the algorithms an API documentation is available, and through supplying different parameters, algorithms can be customized in a plug-and-play manner. In general, the choice of a suitable algorithm for an optimization problem is a challenge itself. Whenever problem characteristics are known beforehand, we recommend exploiting them through customized operators. However, in our case the optimization problem is rather simple, but the aspect of having two objectives and two constraints should be considered. For this reason, we decided to use NSGA-II [12] in its default configuration with minor modifications. We chose a population size of 40, but instead of generating the same number of offsprings, we create only 10 in each generation. This is a steady-state variant of NSGA-II and it is likely to improve the convergence property for rather simple optimization problems without many difficulties, such as the existence of local Pareto-fronts. Moreover, we enable a duplicate check which makes sure that the mating produces offsprings which are different with respect to themselves and also from the existing population regarding their variable vectors. To illustrate the customization aspect, we list the other unmodified default operators in the code snippet below. The constructor of NSGA2 is called with the supplied parameters and returns an initialized algorithm object.

from pymoo.algorithms.nsga2 import NSGA2
from pymoo.factory import get_sampling, get_crossover, get_mutation

algorithm = NSGA2(
    pop_size=40,
    n_offsprings=10,
    sampling=get_sampling("real_random"),
    crossover=get_crossover("real_sbx", prob=0.9, eta=15),
    mutation=get_mutation("real_pm", eta=20),
    eliminate_duplicates=True
)

D. OPTIMIZATION

Next, we use the initialized algorithm object to optimize the defined problem. Therefore, the minimize function is called with both instances, problem and algorithm, as parameters. Moreover, we supply the termination criterion of running the algorithm for 40 generations, which will result in 40 + 40 × 10 = 440 function evaluations. In addition, we define a random seed to ensure reproducibility and enable the verbose flag to see printouts for each generation.

from pymoo.optimize import minimize

res = minimize(MyProblem(),
               algorithm,
               ('n_gen', 40),
               seed=1,
               verbose=True)


FIGURE 2: Result of the Getting Started optimization: (a) design space, (b) objective space.

The method returns a Result object which contains the non-dominated set of solutions found by the algorithm. The optimization results are illustrated in Figure 2, where the design space is shown in Figure 2a and the objective space in Figure 2b. The solid line represents the analytically derived Pareto set and front in the corresponding space, and the circles mark solutions found by the algorithm. It can be observed that the algorithm was able to converge and a set of nearly-optimal solutions was obtained. Some additional post-processing steps and more details about other aspects of the optimization procedure can be found in the remainder of this paper and in our software documentation.

The starter's guide showed the steps starting from the installation up to solving an optimization problem. The investigation of a constrained bi-objective problem demonstrated the basic procedure in an optimization scenario.

IV. ARCHITECTURE

Software architecture is fundamentally important to keep source code organized. On the one hand, it helps developers and users to get an overview of existing classes, and on the other hand, it allows flexibility and extendibility by adding new modules. Figure 3 visualizes the architecture of pymoo. The first level of abstraction consists of the optimization problems, algorithms and analytics. Each of the modules can be categorized in more detail and consists of multiple sub-modules.

(i) Problems: Optimization problems in our framework are categorized into single-, multi-, and many-objective test problems. Gradients are available through automatic differentiation, and parallelization can be implemented by using a variety of techniques.

(ii) Optimization: Since most of the algorithms are based on evolutionary computation, operators such as sampling, mating selection, crossover and mutation have to be chosen or implemented. Furthermore, because many problems in practice have one or more constraints, a methodology for handling those must be incorporated. Some algorithms are based on decomposition, which splits the multi-objective problem into many single-objective problems. Moreover, when the algorithm is used to solve the problem, a termination criterion must be defined either explicitly or implicitly by the implementation of the algorithm.

(iii) Analytics: During and after an optimization run, analytics support the understanding of data. First, intuitively, the design space, objective space, or other metrics can be explored through visualization. Moreover, to measure the convergence and/or diversity of a Pareto-optimal set, performance indicators can be used. For real-parameter problems, the recently proposed theoretical KKT proximity metric [23] computation procedure is included in pymoo to compute the proximity of a solution to the true Pareto-optimal front, despite not knowing its exact location. Furthermore, pymoo supports the decision making process, either through finding points close to the area of interest in the objective space or through high trade-off solutions. This can be applied either during an optimization run to mimic interactive optimization or as a post analysis.

In the remainder of the paper, we will discuss each of the modules mentioned in more detail.

V. PROBLEMS

It is common practice for researchers to evaluate the performance of algorithms on a variety of test problems. Since no single best algorithm for all arbitrary optimization problems exists [24], this helps to identify problem classes where an algorithm is suitable. Therefore, a collection of test problems with different numbers of variables, objectives or constraints and varying complexity becomes handy for algorithm development. Moreover, in a multi-objective context, test problems with different Pareto-front shapes or varying variable density close to the optimal region are of interest.

A. IMPLEMENTATIONS

In our framework, we categorize test problems regarding the number of objectives: single-objective (1 objective), multi-objective (2 or 3 objectives) and many-objective (more than 3 objectives). Test problems implemented in pymoo are listed in Table 2. For each problem, the number of variables, objectives, and constraints is indicated. If the test problem is scalable in any of these parameters, we label the problem with (s). If the problem is scalable but a default number was originally proposed, we indicate that with surrounding brackets.
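As an illustration of a scalable test problem from Table 2, the well-known ZDT1 problem can be written down directly from its standard definition in the literature (this is a standalone sketch, not pymoo's implementation): n real variables in [0, 1], two objectives, no constraints.

```python
import math

# Standard ZDT1 definition: f1 = x1 and f2 = g * (1 - sqrt(f1 / g)),
# where g = 1 + 9 * sum(x2..xn) / (n - 1); scalable in n (default n = 30).
def zdt1(x):
    n = len(x)
    f1 = x[0]
    g = 1 + 9 * sum(x[1:]) / (n - 1)
    f2 = g * (1 - math.sqrt(f1 / g))
    return f1, f2

# On the Pareto-optimal front, x2..xn are all 0, hence g = 1 and
# f2 = 1 - sqrt(f1).
f1, f2 = zdt1([0.25] + [0.0] * 29)
assert abs(f2 - (1 - math.sqrt(f1))) < 1e-12
```

Because the front is known in closed form, such problems make convenient test-beds for measuring convergence.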

FIGURE 3: Architecture of pymoo. The first level consists of three modules: Problems (single-, multi-, and many-objective, Gradients, Parallelization), Optimization (Sampling, Crossover, Mutation, Mating Selection, Survival, Repair, Decomposition, Constraint Handling, Termination Criterion), and Analytics (Performance Indicator, Visualization, Decision Making).

In case the category does not apply, for example because we refer to a test problem family with several functions, we use (·).

TABLE 2: Multi-objective Optimization Test Problems.

Problem              Variables   Objectives   Constraints

Single-objective
Ackley               (s)         1            -
Cantilevered Beams   4           1            2
Griewank             (s)         1            -
Himmelblau           2           1            -
Knapsack             (s)         1            1
Pressure Vessel      4           1            4
Rastrigin            (s)         1            -
Rosenbrock           (s)         1            -
Schwefel             (s)         1            -
Sphere               (s)         1            -
Zakharov             (s)         1            -
G1-9                 (·)         (·)          (·)

Multi-objective
BNH                  2           2            2
Carside              7           3            10
Kursawe              3           2            -
OSY                  6           2            6
TNK                  2           2            2
Truss2D              3           2            1
Welded Beam          4           2            4
CTP1-8               (s)         2            (s)
ZDT1-3               (30)        2            -
ZDT4                 (10)        2            -
ZDT5                 (80)        2            -
ZDT6                 (10)        2            -

Many-objective
DTLZ 1-7             (s)         (s)          -
CDTLZ                (s)         (s)          -
DTLZ1^-1             (s)         (s)          -
SDTLZ                (s)         (s)          -
WFG                  (s)         (s)          -

The implementations in pymoo let end-users define which values of the corresponding problem should be returned. On an implementation level, the evaluate function of a Problem instance takes a list return_values_of which contains the types of values being returned. By default, the objective values "F" and, if the problem has constraints, the constraint violation "CV" are included. The constraint function values can be returned independently by adding "G". This gives developers the flexibility to receive the values that are needed for their methods.

B. GRADIENTS

All our test problems are implemented using Autograd [22]. Therefore, automatic differentiation is supported out of the box. We have shown in Section III how a new optimization problem is defined. If gradients are desired to be calculated, the prefix "d" needs to be added to the corresponding value of the return_values_of list. For instance, to ask for the objective values and their gradients: return_values_of = ["F", "dF"].

Let us consider the problem we have implemented, shown in Equation 3. The derivative of the objective functions F with respect to each variable is given by:

    ∇F = [ 2 x1          2 x2
           2 (x1 - 1)    2 x2 ]                          (4)

where each row holds the gradient of one objective. The gradients at the point [0.1, 0.2] are calculated by:

F, dF = problem.evaluate(np.array([0.1, 0.2]),
                         return_values_of=["F", "dF"])

which returns the following output

F  <- [0.05, 0.85]
dF <- [[ 0.2,  0.4],
       [-1.8,  0.4]]

It can easily be verified that these values match the analytic gradient derivation. The gradients for the constraint functions can be calculated accordingly by adding "dG" to the return_values_of list.

C. PARALLELIZATION

If evaluation functions are computationally expensive, a serialized evaluation of a set of solutions can become the bottleneck of the overall optimization procedure. For this reason, parallelization is desired in order to use existing computational resources more efficiently and to distribute long-running calculations. In pymoo, the evaluation function receives a set of solutions if the algorithm is utilizing a population. This empowers the user to implement any kind of parallelization, as long as the objective values for all solutions are written as an output when the evaluation function terminates. In our framework, a couple of possibilities to implement parallelization exist:

(i) Vectorized Evaluation: A common technique to parallelize evaluations is to use matrices where each row represents a solution. Therefore, a vectorized evaluation refers to a column which includes the variables of all solutions. By using vectors, the objective values of all solutions are calculated at once. The code snippet of the example problem in Section III shows such an implementation using NumPy [20]. To run calculations on a GPU, implementing support for PyTorch [25] tensors can be done with little overhead given suitable hardware and correctly installed drivers.

(ii) Threaded Loop-wise Evaluation: If the function evaluations should occur independently, a for loop can be used to set the values. By default, the evaluation is serialized and no calculations occur in parallel. By providing a keyword to the evaluation function, pymoo spawns a thread for each evaluation and manages those by using the default thread pool implementation in Python. This behaviour can be implemented out of the box and the

TABLE 3: Multi-objective Optimization Algorithms.

Algorithm      Reference
GA             -
BRKGA          [30]
DE             [31]
Nelder-Mead    [32]
CMA-ES         [33]
NSGA-II        [12]
RNSGA-II       [34]
NSGA-III       [35, 36, 37]
UNSGA-III      [38]
RNSGA-III      [39]
MOEAD          [40]

rion, and others are more related to evolutionary computing. By assembling those modules together, algorithms are built.

A. ALGORITHMS

Available algorithm implementations in pymoo are listed in Table 3. Compared to other optimization frameworks, the list of algorithms may look rather short; however, each algorithm is customizable and variants can be initialized with different parameters. For instance, a steady-state NSGA-II [27] can be initialized by setting the number of offsprings to 1. This can be achieved by supplying this as a parameter in the initialization method, as shown in Section III. Moreover, it is worth mentioning that many-objective algorithms, such as NSGA-III or MOEAD, require reference directions to be provided. The reference directions are commonly desired to be uniform or to have a bias towards a region of interest. Our framework offers an implementation of the Das and Dennis method [28] for a fixed number of points (fixed with respect to a parameter often referred to as the partition number) and a recently proposed Riesz-Energy based method which creates a well-spaced point set for an arbitrary number of points and is capable of introducing a bias towards preferred regions in the objective space [29].

B. OPERATORS

The following evolutionary operators are available:
number of parallel threads can be modified. (i) Sampling: The initial population is mostly based on
(iii) Distributed Evaluation: If the evaluation should not be sampling. In some cases it is created through domain
limited to a single machine, the evaluation itself can knowledge and/or some solutions are already evaluated,
be distributed to several workers or a whole cluster. they can directly be used as an initial population. Other-
We recommend using Dask [26] which enables dis- wise, it can be sampled randomly for real, integer, or
tributed computations on different levels. For instance, binary variables. Additionally, Latin-Hypercube Sam-
the matrix operation itself can be distributed or a whole pling [41] can be used for real variables.
function can be outsourced. Similar to the loop wise (ii) Crossover: A variety of crossover operators for dif-
evaluation each individual can be evaluate element-wise ferent type of variables are implemented. In Figure 4
by sending it to a worker. some of them are presented. Figures 4a to 4d help to
visualize the information exchange in a crossover with
VI. OPTIMIZATION MODULE two parents being involved. Each row represents an
The optimization module provides different kinds of sub- offspring and each column a variable. The correspond-
modules to be used in algorithms. Some of them are more of a ing boxes indicate whether the values of the offspring
generic nature, such as decomposition and termination crite- are inherited from the first or from the second parent.
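This exchange pattern can be reproduced in a few lines of NumPy. The following helper is an illustrative sketch, not pymoo's internal operator implementation; it cuts the variable sequence at randomly chosen positions and alternates which parent donates each segment:

```python
import numpy as np

def point_crossover(parent1, parent2, n_points, seed=None):
    """Illustrative n-point crossover: cut the variable sequence at
    `n_points` random positions and alternate the donating parent."""
    rng = np.random.default_rng(seed)
    p1, p2 = np.asarray(parent1), np.asarray(parent2)
    n_var = p1.shape[0]
    # distinct interior cut positions, sorted along the variable sequence
    cuts = np.sort(rng.choice(np.arange(1, n_var), size=n_points, replace=False))
    take_second = np.zeros(n_var, dtype=bool)
    for cut in cuts:
        take_second[cut:] = ~take_second[cut:]  # switch donor at every cut
    off1 = np.where(take_second, p2, p1)
    off2 = np.where(take_second, p1, p2)        # complementary offspring
    return off1, off2

# Two parents that differ in every position make the exchange visible.
p1, p2 = np.zeros(10, dtype=int), np.ones(10, dtype=int)
off1, off2 = point_crossover(p1, p2, n_points=2, seed=42)
```

With n_points=1, each offspring is a prefix of one parent followed by a suffix of the other; with n_points=2, a middle segment is exchanged, matching the one- and two-cut patterns of Figures 4a and 4b.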
For one- and two-point crossovers it can be observed that either one or two cuts in the variable sequence exist. Contrarily, the Uniform Crossover (UX) does not have any clear pattern, because each variable is chosen randomly either from the first or from the second parent. For the Half Uniform Crossover (HUX), half of the variables which are different are exchanged. For the purpose of illustration, we have created two parents that have different values in 10 different positions. For real variables, Simulated Binary Crossover [42] is known to be an efficient crossover. It mimics the crossover of binary encoded variables. Figure 4e shows the probability distribution when the parents x_1 = 0.2 and x_2 = 0.8, where x_i ∈ [0, 1], are recombined with η = 0.8. Analogously, in the case of integer variables we subtract 0.5 from the lower and add (0.5 − ε) to the upper bound before applying the crossover, and round to the nearest integer afterwards (see Figure 4f).

FIGURE 4: Illustration of some crossover operators for different variable types: (a) One Point, (b) Two Point, (c) UX, (d) HUX, (e) SBX (real, eta=0.8), (f) SBX (int, eta=3).

(iii) Mutation: For real and integer variables Polynomial Mutation [19, 43] and for binary variables Bitflip Mutation [44] are provided.

Different problems require different types of operators. In practice, if a problem is supposed to be solved repeatedly and routinely, it makes sense to customize the evolutionary operators to improve the convergence of the algorithm. Moreover, for custom variable types, for instance trees or mixed variables [45], custom operators [46] can easily be implemented and called by the algorithm class. Our software documentation contains examples for custom modules, operators and variable types.

C. TERMINATION CRITERION
For every algorithm it must be determined when it should terminate a run. This can simply be based on a predefined number of function evaluations or iterations, or on a more advanced criterion, such as the change of a performance metric over time. For example, we have implemented a termination criterion based on the difference in the variable and objective space between generations. To make the termination criterion more robust, the last k generations are considered. The largest movement from a solution to its closest neighbour is tracked across generations, and whenever it falls below a certain threshold, the algorithm is considered to have converged. Analogously, the movement in the objective space can also be used. In the objective space, however, normalization is more challenging and has to be addressed carefully. The default termination criterion for multi-objective problems in pymoo keeps track of the boundary points in the objective space and uses them, once they have settled down, for normalization. More details about the proposed termination criterion can be found in [47].

D. DECOMPOSITION
Decomposition transforms multi-objective problems into many single-objective optimization problems [48]. Such a technique can either be embedded in a multi-objective algorithm and solved simultaneously, or solved independently using a single-objective optimizer. Some decomposition methods are based on the lp-metrics with different p values. For instance, a naive but frequently used decomposition approach is the Weighted-Sum Method (p = 1), which is known to be unable to converge to the non-convex part of a Pareto-front [19]. Moreover, instead of summing values, the Tchebysheff Method (p = ∞) considers only the maximum value of the difference between the ideal point and a solution. Similarly, the Achievement Scalarization Function (ASF) [49] and a modified version, the Augmented Achievement Scalarization Function (AASF) [50], use the maximum of all differences. Furthermore, Penalty Boundary Intersection (PBI) [40] is calculated by a weighted sum of the norm of the projection of a point onto the reference direction and the perpendicular distance. It is also worth noting that normalization is essential for any kind of decomposition. All decomposition techniques mentioned above are implemented in pymoo.

VII. ANALYTICS
A. PERFORMANCE INDICATORS
For single-objective optimization algorithms the comparison regarding performance is rather simple, because each optimization run results in a single best solution. In multi-objective optimization, however, each run returns a non-dominated set of solutions. To compare sets of solutions,
various performance indicators have been proposed in the past [51]. In pymoo, the most commonly used performance indicators are described:
(i) GD/IGD: Given the Pareto-front PF, the deviation between the non-dominated set S found by the algorithm and the optimum can be measured. Following this principle, the Generational Distance (GD) indicator [52] calculates the average Euclidean distance in the objective space from each solution in S to the closest solution in PF. This measures the convergence of S, but does not indicate whether a good diversity on the Pareto-front has been reached. Similarly, the Inverted Generational Distance (IGD) indicator [52] measures the average Euclidean distance in the objective space from each solution in PF to the closest solution in S. The Pareto-front as a whole needs to be covered by solutions from S to minimize the performance metric. Thus, the lower the GD and IGD values, the better the set. However, IGD is known to be not Pareto compliant [53].
(ii) GD+/IGD+: A variation of GD and IGD has been proposed in [53]. The Euclidean distance is replaced by a distance measure that takes the dominance relation into account. The authors show that IGD+ is weakly Pareto compliant.
(iii) Hypervolume: Moreover, the dominated portion of the objective space can be used to measure the quality of non-dominated solutions [54]. The higher the hypervolume, the better the set. Instead of the Pareto-front, a reference point needs to be provided. It has been shown that Hypervolume is Pareto compliant [55]. Because the performance metric becomes computationally expensive in higher-dimensional spaces, the exact measure becomes intractable. However, we plan to include some proposed approximation methods in the near future.
Performance indicators are used to compare existing algorithms. Moreover, the development of new algorithms can be driven by the goodness of different metrics itself.

B. VISUALIZATION
The visualization of intermediate steps or the final result is inevitable. In multi- and many-objective optimization, visualization of the objective space is of interest so that trade-off information among solutions can easily be experienced from the plots. Depending on the dimension of the objective space, different types of plots are suitable to represent a single solution or a set of solutions. In pymoo the implemented visualizations wrap around the well-known Python plotting library Matplotlib [56]. Keyword arguments provided by Matplotlib itself are still available, which allows one to modify, for instance, the color, thickness, and opacity of lines, points, or other shapes. Therefore, all visualization techniques are customizable and extendable.
For 2 or 3 objectives, scatter plots (see Figures 5a and 5b) can give a good intuition about the solution set. Trade-offs can be observed by considering the distance between two points. It might be desired to normalize each objective to make sure a comparison between values is based on relative and not absolute values. Pairwise Scatter Plots (see Figure 5c) visualize more than 3 objectives by showing each pair of axes independently. The diagonal is used to label the corresponding objectives.
Also, high-dimensional data can be illustrated by Parallel Coordinate Plots (PCP), as shown in Figure 5d. All axes are plotted vertically and represent an objective. Each solution is illustrated by a line from left to right. The intersection of a line and an axis indicates the value of the solution for the corresponding objective. For the purpose of comparison, solution(s) can be highlighted by varying color and opacity.
Moreover, a common practice is to project the higher-dimensional objective values onto the 2D plane using a transformation function. Radviz (Figure 5e) visualizes all points in a circle, with the objective axes uniformly positioned around the perimeter. Considering a minimization problem and a set of non-dominated solutions, an extreme point very close to an axis represents the worst solution for that corresponding objective, but is comparably "good" in one or many other objectives. Similarly, Star Coordinate Plots (Figure 5f) illustrate the objective space, except that the transformation function allows solutions outside of the circle.
Heatmaps (Figure 5g) are used to represent the goodness of solutions through colors. Each row represents a solution and each column a variable. We leave the choice to the end-user of what color map to use and whether light or dark colors illustrate better or worse solutions. Also, solutions can be sorted lexicographically by their corresponding objective values.
Instead of visualizing a set of solutions, one solution can be illustrated at a time. The Petal Diagram (Figure 5h) is a pie diagram where the objective value is represented by each piece's diameter. Colors are used to further distinguish the pieces. Finally, the Spider-Web or Radar Diagram (Figure 5i) shows the objective values as points on axes. The ideal and nadir points [19] are represented by the inner and outer polygon. By definition, the solution lies in between those two extremes. If the objective space ranges are scaled differently, normalization for the purpose of plotting can be enabled and the diagram becomes symmetric. New and emerging methods for visualizing more than three-dimensional efficient solutions, such as 2.5-dimensional PaletteViz plots [57], will be implemented in the future.

C. DECISION MAKING
In practice, after obtaining a set of non-dominated solutions, a single solution has to be chosen for implementation. pymoo provides a few "a posteriori" approaches for decision making [19].
(i) Compromise Programming: One way of making a decision is to compute the value of a scalarized and aggregated function and select one solution based on the minimum or maximum value of the function. In pymoo a number of
FIGURE 5: Different visualization methods coded in pymoo: (a) Scatter 2D, (b) Scatter 3D, (c) Scatter ND [58], (d) PCP [59], (e) Radviz [60], (f) Star Coordinate Graph [61], (g) Heatmap [62], (h) Petal Diagram [63], (i) Spider-Web/Radar [64].
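To make the projection idea behind the Radviz plot (Figure 5e) concrete, the following stand-alone NumPy sketch (an illustration of the transformation, not pymoo's plotting code) computes 2D coordinates by placing one anchor per objective uniformly on the unit circle and locating each solution at the convex combination of the anchors, weighted by its normalized objective values:

```python
import numpy as np

def radviz_project(F):
    """Project an (n_solutions, n_obj) matrix of objective values onto
    the 2D unit disc using the Radviz transformation."""
    F = np.asarray(F, dtype=float)
    # normalize each objective to [0, 1] so the weights are comparable
    F = (F - F.min(axis=0)) / (F.max(axis=0) - F.min(axis=0))
    n_obj = F.shape[1]
    # one anchor per objective, uniformly spaced on the unit circle
    angles = 2 * np.pi * np.arange(n_obj) / n_obj
    anchors = np.column_stack([np.cos(angles), np.sin(angles)])
    # each solution sits at the convex combination of the anchors,
    # weighted by its normalized objective values
    weights = F / F.sum(axis=1, keepdims=True)
    return weights @ anchors

F = np.array([[0.1, 0.9, 0.5],
              [0.8, 0.2, 0.4],
              [0.5, 0.5, 0.5]])
coords = radviz_project(F)  # (3, 2) coordinates inside the unit circle
```

Because each row of weights sums to one, every projected point lies inside the unit circle; a solution with a comparatively large normalized value in one objective is pulled toward that objective's anchor, which is why, for minimization, points near an axis are worst in that objective.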

scalarization functions described in Section VI-D can be used to come to a decision regarding the desired weights of objectives.
(ii) Pseudo-Weights: However, a more intuitive way to choose a solution out of a Pareto-front is the pseudo-weight vector approach proposed in [19]. The pseudo weight w_i for the i-th objective function is calculated by:

w_i = [ (f_i^max − f_i(x)) / (f_i^max − f_i^min) ] / [ Σ_{m=1}^{M} (f_m^max − f_m(x)) / (f_m^max − f_m^min) ].   (5)

The normalized distance to the worst solution regarding each objective i is calculated. It is interesting to note that for non-convex Pareto-fronts, the pseudo weight does not correspond to the result of an optimization using the weighted-sum method. A solution having the closest pseudo-weight to a target preference vector of objectives (f_1 being preferred twice as much as f_2 results in a target preference vector of (0.667, 0.333)) can be chosen as the preferred solution from the efficient set.
(iii) High Trade-Off Solutions: Furthermore, high trade-off solutions are usually of interest, but not straightforward to detect in higher-dimensional objective spaces. We have implemented the procedure proposed in [65]. It was described to be embedded in an algorithm to guide the search; we, however, use it for post-processing. The metric for each solution pair x_i and x_j in a non-dominated set is given by:

T(x_i, x_j) = [ Σ_{m=1}^{M} max(0, f_m(x_j) − f_m(x_i)) ] / [ Σ_{m=1}^{M} max(0, f_m(x_i) − f_m(x_j)) ],   (6)

where the numerator represents the aggregated sacrifice

and the denominator the aggregated gain. The trade-off measure μ(x_i, S) for each solution x_i with respect to a set of neighboring solutions S is obtained by:

μ(x_i, S) = min_{x_j ∈ S} T(x_i, x_j).   (7)

It finds the minimum T(x_i, x_j) from x_i to all other solutions x_j ∈ S. Instead of calculating the metric with respect to all others, we provide the option to consider only the k closest neighbors in the objective space, to reduce the computational complexity. Based on circumstances, the 'min' operator can be replaced with 'average', 'max', or any other suitable operator. Thereafter, the solution having the maximum μ can be chosen as the preferred solution, meaning that this solution causes a maximum sacrifice in one of the objective values for a unit gain in another objective value, making it the most valuable solution for implementation.
The above methods are algorithmic, but they require user interaction to choose a single preferred solution. In real practice, however, a more problem-specific decision-making method must be used, such as an interactive EMO method suggested elsewhere [66]. We emphasize the fact that multi-objective frameworks should include methods for multi-criteria decision making and further support end-users in choosing a solution out of a trade-off solution set.

VIII. CONCLUDING REMARKS
This paper has introduced pymoo, a multi-objective optimization framework in Python. We have walked through our framework, beginning with the installation up to the optimization of a constrained bi-objective optimization problem. Moreover, we have presented the overall architecture of the framework, consisting of three core modules: Problems, Optimization, and Analytics. Each module has been described in depth and illustrative examples have been provided. We have shown that our framework covers various aspects of multi-objective optimization, including the visualization of high-dimensional spaces and multi-criteria decision making to finally select a solution out of the obtained solution set.
One distinguishing feature of our framework compared with existing ones is that we have provided a few options for various key aspects of a multi-objective optimization task: standard evolutionary operators for optimization, standard performance metrics for evaluating a run, standard visualization techniques for showcasing obtained trade-off solutions, and a few approaches for decision-making. Most such implementations were originally suggested and developed by the second author and his collaborators for more than 25 years. Hence, we consider that the implementations of all such ideas are authentic and error-free. Thus, the results from the proposed framework should stay as benchmark results of implemented procedures.
However, the framework can definitely be extended to make it more comprehensive, and we are constantly adding new capabilities based on practicalities learned from our collaboration with industries. In the future, we plan to implement more optimization algorithms and test problems to provide more choices to end-users. Also, we aim to implement some methods from the classical literature on single-objective optimization which can also be used for multi-objective optimization through decomposition or embedded as a local search. So far, we have provided a few basic performance metrics. We plan to extend this by creating a module that runs a list of algorithms on test problems automatically and provides statistics of different performance indicators.
Furthermore, we would like to mention that any kind of contribution is more than welcome. We see our framework as a collaborative collection from and for the multi-objective optimization community. By adding a method or algorithm to pymoo, the community can benefit from a growing comprehensive framework, and it can help researchers to advertise their methods. Interested researchers are welcome to contact the authors. In general, different kinds of contributions are possible and more information can be found online. Moreover, even though we try to keep our framework as bug-free as possible, in case of exceptions during execution or doubt of correctness, please contact us directly or use our issue tracker.

REFERENCES
[1] G. Rossum, "Python reference manual," Amsterdam, The Netherlands, Tech. Rep., 1995.
[2] M. Bücker, G. Corliss, P. Hovland, U. Naumann, and B. Norris, Automatic Differentiation: Applications, Theory, and Implementations (Lecture Notes in Computational Science and Engineering). Berlin, Heidelberg: Springer-Verlag, 2006.
[3] R. Lehmann, "Sphinx documentation," 2019.
[4] A. Pajankar, Python Unit Test Automation: Practical Techniques for Python Developers and Testers, 1st ed. Berkely, CA, USA: Apress, 2017.
[5] J. J. Durillo and A. J. Nebro, "jMetal: a java framework for multi-objective optimization," Advances in Engineering Software, vol. 42, pp. 760–771, 2011. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S0965997811001219
[6] J. Gosling, B. Joy, G. L. Steele, G. Bracha, and A. Buckley, The Java Language Specification, Java SE 8 Edition, 1st ed. Addison-Wesley Professional, 2014.
[7] A. Benítez-Hidalgo, A. J. Nebro, J. García-Nieto, I. Oregi, and J. D. Ser, "jmetalpy: A python framework for multi-objective optimization with metaheuristics," Swarm and Evolutionary Computation, vol. 51, p. 100598, 2019. [Online]. Available: http://www.sciencedirect.com/science/article/pii/S2210650219301397
[8] D. Izzo, "PyGMO and PyKEP: open source tools for massively parallel optimization in astrodynamics (the case of interplanetary trajectory optimization)," in 5th International Conference on Astrodynamics Tools and Techniques (ICATT 2012),
2012. [Online]. Available: http://www.esa.int/gsp/ACT/doc/MAD/pub/ACT-RPR-MAD-2012-(ICATT)PyKEP-PaGMO-SOCIS.pdf
[9] D. Hadka, "Platypus: multiobjective optimization in python," https://platypus.readthedocs.io, accessed: 2019-05-16.
[10] F.-A. Fortin, F.-M. De Rainville, M.-A. Gardner, M. Parizeau, and C. Gagné, "DEAP: Evolutionary algorithms made easy," Journal of Machine Learning Research, vol. 13, pp. 2171–2175, Jul. 2012.
[11] A. Garrett, "inspyred: python library for bio-inspired computational intelligence," https://github.com/aarongarrett/inspyred, accessed: 2019-05-16.
[12] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan, "A fast and elitist multiobjective genetic algorithm: NSGA-II," Trans. Evol. Comp, vol. 6, no. 2, pp. 182–197, Apr. 2002. [Online]. Available: http://dx.doi.org/10.1109/4235.996017
[13] Y. Tian, R. Cheng, X. Zhang, and Y. Jin, "PlatEMO: A MATLAB platform for evolutionary multi-objective optimization," IEEE Computational Intelligence Magazine, vol. 12, no. 4, pp. 73–87, 2017.
[14] D. Hadka, "MOEA Framework: a free and open source Java framework for multiobjective optimization," http://moeaframework.org, accessed: 2019-05-16.
[15] E. López-Camacho, M. J. García-Godoy, A. J. Nebro, and J. F. A. Montes, "jmetalcpp: optimizing molecular docking problems with a C++ metaheuristic framework," Bioinformatics, vol. 30, no. 3, pp. 437–438, 2014. [Online]. Available: https://doi.org/10.1093/bioinformatics/btt679
[16] F. Biscani, D. Izzo, and C. H. Yam, "A global optimisation toolbox for massively parallel engineering optimisation," CoRR, vol. abs/1004.3824, 2010. [Online]. Available: http://arxiv.org/abs/1004.3824
[17] P. S. Foundation, "PyPI: the python package index," https://pypi.org, accessed: 2019-05-20.
[18] S. Behnel, R. Bradshaw, C. Citro, L. Dalcin, D. Seljebotn, and K. Smith, "Cython: The best of both worlds," Computing in Science Engineering, vol. 13, no. 2, pp. 31–39, April 2011.
[19] K. Deb, Multi-Objective Optimization Using Evolutionary Algorithms. New York: Wiley, 2001.
[20] T. Oliphant, "NumPy: A guide to NumPy," USA: Trelgol Publishing, 2006–, accessed May 16, 2019. [Online]. Available: http://www.numpy.org/
[21] K. Deb and R. Datta, "A bi-objective constrained optimization algorithm using a hybrid evolutionary and penalty function approach," Engineering Optimization, vol. 45, no. 5, pp. 503–527, 2012.
[22] D. Maclaurin, D. Duvenaud, and R. P. Adams, "Autograd: Effortless gradients in numpy," in ICML 2015 AutoML Workshop, 2015. [Online]. Available: https://indico.lal.in2p3.fr/event/2914/session/1/contribution/6/3/material/paper/0.pdf, https://github.com/HIPS/autograd
[23] K. Deb and M. Abouhawwash, "An optimality theory based proximity measure for set based multi-objective optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 4, pp. 515–528, 2016.
[24] D. H. Wolpert and W. G. Macready, "No free lunch theorems for optimization," Trans. Evol. Comp, vol. 1, no. 1, pp. 67–82, Apr. 1997. [Online]. Available: https://doi.org/10.1109/4235.585893
[25] A. Paszke, S. Gross, S. Chintala, G. Chanan, E. Yang, Z. DeVito, Z. Lin, A. Desmaison, L. Antiga, and A. Lerer, "Automatic differentiation in pytorch," 2017.
[26] Dask Development Team, Dask: Library for dynamic task scheduling, 2016. [Online]. Available: https://dask.org
[27] S. Mishra, S. Mondal, and S. Saha, "Fast implementation of steady-state nsga-ii," in 2016 IEEE Congress on Evolutionary Computation (CEC), July 2016, pp. 3777–3784.
[28] I. Das and J. Dennis, "Normal-boundary intersection: A new method for generating the Pareto surface in nonlinear multicriteria optimization problems," SIAM J. of Optimization, vol. 8, no. 3, pp. 631–657, 1998.
[29] J. Blank, K. Deb, Y. Dhebar, S. Bandaru, and H. Seada, "Generating well-spaced points on a unit simplex for evolutionary many-objective optimization," IEEE Transactions on Evolutionary Computation, in press.
[30] J. C. Bean, "Genetic algorithms and random keys for sequencing and optimization," ORSA Journal on Computing, vol. 6, no. 2, pp. 154–160, 1994. [Online]. Available: https://doi.org/10.1287/ijoc.6.2.154
[31] K. Price, R. M. Storn, and J. A. Lampinen, Differential Evolution: A Practical Approach to Global Optimization (Natural Computing Series). Berlin, Heidelberg: Springer-Verlag, 2005.
[32] J. A. Nelder and R. Mead, "A Simplex Method for Function Minimization," The Computer Journal, vol. 7, no. 4, pp. 308–313, 1965. [Online]. Available: https://doi.org/10.1093/comjnl/7.4.308
[33] N. Hansen, The CMA Evolution Strategy: A Comparing Review. Berlin, Heidelberg: Springer Berlin Heidelberg, 2006, pp. 75–102. [Online]. Available: https://doi.org/10.1007/3-540-32494-1_4
[34] K. Deb and J. Sundar, "Reference point based multi-objective optimization using evolutionary algorithms," in Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation, ser. GECCO '06. New York, NY, USA: ACM, 2006, pp. 635–642. [Online]. Available: http://doi.acm.org/10.1145/1143997.1144112
[35] K. Deb and H. Jain, "An evolutionary many-objective optimization algorithm using reference-point-based nondominated sorting approach, part I: Solving problems with box constraints," IEEE Transactions on Evolutionary Computation, vol. 18, no. 4, pp. 577–601,
2014.
[36] H. Jain and K. Deb, “An evolutionary many-objective optimization algorithm using reference-point based nondominated sorting approach, part II: Handling constraints and extending to an adaptive approach,” IEEE Transactions on Evolutionary Computation, vol. 18, no. 4, pp. 602–622, Aug 2014.
[37] J. Blank, K. Deb, and P. C. Roy, “Investigating the normalization procedure of NSGA-III,” in Evolutionary Multi-Criterion Optimization, K. Deb, E. Goodman, C. A. Coello Coello, K. Klamroth, K. Miettinen, S. Mostaghim, and P. Reed, Eds. Cham: Springer International Publishing, 2019, pp. 229–240.
[38] H. Seada and K. Deb, “A unified evolutionary optimization procedure for single, multiple, and many objectives,” IEEE Transactions on Evolutionary Computation, vol. 20, no. 3, pp. 358–369, June 2016.
[39] Y. Vesikar, K. Deb, and J. Blank, “Reference point based NSGA-III for preferred solutions,” in 2018 IEEE Symposium Series on Computational Intelligence (SSCI), Nov 2018, pp. 1587–1594.
[40] Q. Zhang and H. Li, “A multi-objective evolutionary algorithm based on decomposition,” IEEE Transactions on Evolutionary Computation, vol. 11, no. 6, pp. 712–731, 2007.
[41] M. D. McKay, R. J. Beckman, and W. J. Conover, “A comparison of three methods for selecting values of input variables in the analysis of output from a computer code,” Technometrics, vol. 42, no. 1, pp. 55–61, Feb. 2000. [Online]. Available: http://dx.doi.org/10.2307/1271432
[42] K. Deb, K. Sindhya, and T. Okabe, “Self-adaptive simulated binary crossover for real-parameter optimization,” in Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation, ser. GECCO ’07. New York, NY, USA: ACM, 2007, pp. 1187–1194. [Online]. Available: http://doi.acm.org/10.1145/1276958.1277190
[43] K. Deb and D. Deb, “Analysing mutation schemes for real-parameter genetic algorithms,” International Journal of Artificial Intelligence and Soft Computing, vol. 4, no. 1, pp. 1–28, 2014.
[44] D. E. Goldberg, Genetic Algorithms for Search, Optimization, and Machine Learning. Reading, MA: Addison-Wesley, 1989.
[45] K. Deb and M. Goyal, “A robust optimization procedure for mechanical component design based on genetic adaptive search,” Transactions of the ASME: Journal of Mechanical Design, vol. 120, no. 2, pp. 162–164, 1998.
[46] K. Deb and C. Myburgh, “A population-based fast algorithm for a billion-dimensional resource allocation problem with integer variables,” European Journal of Operational Research, vol. 261, no. 2, pp. 460–474, 2017.
[47] J. Blank and K. Deb, “A running performance metric and termination criterion for evaluating evolutionary multi- and many-objective optimization algorithms,” in 2020 IEEE Symposium Series on Computational Intelligence (SSCI), in press.
[48] A. Santiago, H. J. F. Huacuja, B. Dorronsoro, J. E. Pecero, C. G. Santillan, J. J. G. Barbosa, and J. C. S. Monterrubio, A Survey of Decomposition Methods for Multi-objective Optimization. Cham: Springer International Publishing, 2014, pp. 453–465. [Online]. Available: https://doi.org/10.1007/978-3-319-05170-3_31
[49] A. P. Wierzbicki, “The use of reference objectives in multiobjective optimization,” in Multiple criteria decision making theory and application. Springer, 1980, pp. 468–486.
[50] ——, “A mathematical basis for satisficing decision making,” Mathematical Modelling, vol. 3, no. 5, pp. 391–405, 1982, special IIASA Issue. [Online]. Available: http://www.sciencedirect.com/science/article/pii/0270025582900380
[51] J. Knowles and D. Corne, “On metrics for comparing non-dominated sets,” in Proceedings of the 2002 Congress on Evolutionary Computation Conference (CEC02). United States: Institute of Electrical and Electronics Engineers, 2002, pp. 711–716.
[52] D. A. Van Veldhuizen, “Multiobjective evolutionary algorithms: Classifications, analyses, and new innovations,” Evolutionary Computation, Tech. Rep., 1999.
[53] H. Ishibuchi, H. Masuda, Y. Tanigaki, and Y. Nojima, “Modified distance calculation in generational distance and inverted generational distance,” in Evolutionary Multi-Criterion Optimization, A. Gaspar-Cunha, C. Henggeler Antunes, and C. C. Coello, Eds. Cham: Springer International Publishing, 2015, pp. 110–125.
[54] E. Zitzler and L. Thiele, “Multiobjective optimization using evolutionary algorithms - a comparative case study,” in Proceedings of the 5th International Conference on Parallel Problem Solving from Nature, ser. PPSN V. London, UK: Springer-Verlag, 1998, pp. 292–304. [Online]. Available: http://dl.acm.org/citation.cfm?id=645824.668610
[55] E. Zitzler, D. Brockhoff, and L. Thiele, “The hypervolume indicator revisited: On the design of Pareto-compliant indicators via weighted integration,” in Proceedings of the 4th International Conference on Evolutionary Multi-criterion Optimization, ser. EMO’07. Berlin, Heidelberg: Springer-Verlag, 2007, pp. 862–876. [Online]. Available: http://dl.acm.org/citation.cfm?id=1762545.1762618
[56] J. D. Hunter, “Matplotlib: A 2D graphics environment,” Computing in Science & Engineering, vol. 9, no. 3, pp. 90–95, 2007.
[57] A. K. A. Talukder and K. Deb, “PaletteViz: A visualization method for functional understanding of high-dimensional Pareto-optimal data-sets to aid multi-criteria decision making,” IEEE Computational Intelligence Magazine, vol. 15, no. 2, pp. 36–48, 2020.
[58] J. M. Chambers and B. Kleiner, “10 graphical techniques for multivariate data and for clustering,” 1982.
[59] E. Wegman, “Hyperdimensional data analysis using parallel coordinates,” Journal of the American Statistical Association, vol. 85, pp. 664–675, Sep. 1990.
[60] P. Hoffman, G. Grinstein, and D. Pinkney, “Dimensional anchors: a graphic primitive for multidimensional multivariate information visualizations,” in Proc. Workshop on New Paradigms in Information Visualization and Manipulation in conjunction with the ACM International Conference on Information and Knowledge Management (NPIVM99). New York, NY, USA: ACM, 1999, pp. 9–16.
[61] E. Kandogan, “Star coordinates: A multi-dimensional visualization technique with uniform treatment of dimensions,” in Proceedings of the IEEE Information Visualization Symposium, Late Breaking Hot Topics, 2000, pp. 9–12.
[62] A. Pryke, S. Mostaghim, and A. Nazemi, “Heatmap visualization of population based multi objective algorithms,” in Evolutionary Multi-Criterion Optimization, S. Obayashi, K. Deb, C. Poloni, T. Hiroyasu, and T. Murata, Eds. Berlin, Heidelberg: Springer Berlin Heidelberg, 2007, pp. 361–375.
[63] Y. S. Tan and N. M. Fraser, “The modified star graph and the petal diagram: two new visual aids for discrete alternative multicriteria decision making,” Journal of Multi-Criteria Decision Analysis, vol. 7, no. 1, pp. 20–33, 1998.
[64] E. Kasanen, R. Östermark, and M. Zeleny, “Gestalt system of holistic graphics: New management support view of MCDM,” Computers & OR, vol. 18, no. 2, pp. 233–239, 1991. [Online]. Available: https://doi.org/10.1016/0305-0548(91)90093-7
[65] L. Rachmawati and D. Srinivasan, “Multiobjective evolutionary algorithm with controllable focus on the knees of the Pareto front,” IEEE Transactions on Evolutionary Computation, vol. 13, pp. 810–824, Sep. 2009.
[66] K. Deb, A. Sinha, P. Korhonen, and J. Wallenius, “An interactive evolutionary multi-objective optimization method based on progressively approximated value functions,” IEEE Transactions on Evolutionary Computation, vol. 14, no. 5, pp. 723–739, 2010.

JULIAN BLANK received his B.Sc. in Business Information Systems from Otto von Guericke University, Germany, in 2010. He was a visiting scholar for six months at Michigan State University, Michigan, USA, in 2015, and finished his M.Sc. in Computer Science at Otto von Guericke University, Germany, in 2016. He is currently a Ph.D. student in Computer Science at Michigan State University, Michigan, USA. His current research interests include multi-objective optimization, evolutionary computation, surrogate-assisted optimization, and machine learning.

KALYANMOY DEB is the Koenig Endowed Chair Professor with the Department of Electrical and Computer Engineering, Michigan State University, East Lansing, Michigan, USA. He received his Bachelor’s degree in Mechanical Engineering from IIT Kharagpur, India, and his Master’s and Ph.D. degrees from the University of Alabama, Tuscaloosa, USA, in 1989 and 1991, respectively. He is largely known for his seminal research in evolutionary multi-criterion optimization. He has authored two textbooks on optimization and published over 535 international journal and conference research papers to date. His current research interests include evolutionary optimization and its application in design, AI, and machine learning. He is one of the most cited EC researchers, with more than 140,000 Google Scholar citations. Dr. Deb is a recipient of the IEEE CIS EC Pioneer Award in 2018, the Lifetime Achievement Award from Clarivate Analytics in 2017, the Infosys Prize in 2012, the TWAS Prize in Engineering Sciences in 2012, the CajAstur Mamdani Prize in 2011, the Distinguished Alumni Award from IIT Kharagpur in 2011, the Edgeworth-Pareto Award in 2008, the Shanti Swarup Bhatnagar Prize in Engineering Sciences in 2005, and the Thomson Citation Laureate Award from Thomson Reuters. He is a fellow of IEEE and ASME. More information about his work can be found at http://www.coin-lab.org.

VOLUME 4, 2016

This work is licensed under a Creative Commons Attribution 4.0 License. For more information, see https://creativecommons.org/licenses/by/4.0/.
