RESEARCH ARTICLE

A color image contrast enhancement method based on improved PSO

* 503212590@qq.com

Abstract
Image contrast enhancement uses an intensity transformation function to maximize the amount of information in the enhanced image. In this paper, image enhancement is treated as an optimization problem, and a particle swarm algorithm is used to obtain the optimal solution. First, an improved particle swarm optimization algorithm is proposed in which the individual optimum, local optimum, and global optimum jointly adjust each particle's flight direction. In the local optimization, a topology is used to induce comparison and communication between particles, and a sparse penalty term added to the velocity update formula adjusts the sparsity of the algorithm and the size of the solution space. Second, the three channels R, G, and B of a color image are represented by a quaternion matrix, and the improved particle swarm algorithm is used to optimize the transformation parameters. Finally, contrast and brightness terms are added to the fitness function, which guides the particle swarm optimization algorithm in optimizing the parameters of the transformation function. This paper carries out two sets of experiments. First, the improved particle swarm algorithm is simulated and tested: comparing the average values of the four algorithms on six test functions of three types, the average value improves by at least 15 times on the two single-peak test functions, while on the four multi-peak and multi-peak fixed-dimension test functions the proposed algorithm always finds the global optimal solution and the average value is either the same or at least 1.3 times better. Second, the proposed algorithm is compared with other evolutionary algorithms for contrast enhancement on images selected from two different data sets, and various evaluation indicators are computed for each algorithm on each image. The proposed algorithm achieves the best values, with performance indicators improved by at least 5% and algorithm running time improved by at least 15%. The final results show that the proposed algorithm has obvious advantages in both subjective and objective aspects.
1 Introduction
In remote sensing [1], biomedical image analysis [2], fault detection, and other fields, strong requirements for high-brightness, high-contrast, and high-detail digital images, whether to produce visually natural images or to transform images (for example, by enhancing internal visual information), have become basic requirements for nearly all image processing tasks [3]. However, lighting effects in optical systems, cameras, and image capture systems cause low contrast in captured images, which cannot meet engineering requirements. Therefore, enhancing the contrast of the image is particularly important. Contrast enhancement techniques can improve the perception of information in images or provide meaningful information for real-time image processing applications.
Owing to lighting or other conditions (e.g., imaging device limitations or inappropriate exposure parameter settings), captured images tend to suffer from low contrast and blurring, among other problems. These issues affect the collection of images in photography, forensics, analysis, surveillance, and other optical imaging systems. Despite the astonishing
advancements in image capture devices, various natural and artificial artifacts that result in
poor quality of the captured images persist. Therefore, quality improvement over the original
captured images is an essential part of image preprocessing. Low-light image enhancement
technology aims to enhance the visual effect of images by highlighting blurred or even hidden
details in the image and increasing the brightness and contrast of the image. This problem is challenging and has become a hot research topic that has been studied extensively in recent years.
Given the importance of image contrast enhancement techniques, researchers have devel-
oped many algorithms. Currently, contrast enhancement methods can be divided into spatial
and transform domain enhancements. Contrast enhancement in the spatial domain is based
on grayscale transformations of non-linear functions, such as logarithmic transformation [4],
gamma function [5], histogram-based technology [6], and non-linear secondary filtering [7].
Methods for enhancing the contrast in the transform domain include filtering methods [8].
Histogram equalization is a widely used spatial image enhancement method. An adaptive his-
togram equalization method is proposed in [9]. This method estimates the probability density
of the original input image, and then the size of the function is scaled and the scale factor
adjusted adaptively according to the average intensity value of the image. In [10], the
researcher proposes a technology on the basis of joint histogram equalization, which uses
information between each pixel and its neighboring pixels to improve the contrast of the
image. The study posits a non-parametric modified histogram equalization that effectively
handles histogram peaks, reduces distortion in smooth regions, and requires no empirical
adjustment of parameters [11, 12]. The intensity value of the image can also be adjusted
according to different contrast and sharpness measurements [13]. The method based on histo-
gram equalization reduces the gray level of the image after the transformation, and certain
details disappear. Some images, such as the histogram, have peaks, and the contrast is unnatu-
rally enhanced after processing.
In most applications, automatic contrast enhancement technology is required without
manual intervention. However, the automation of the algorithm is not a simple task, because it
requires the evaluation of an objective function that measures the quality of the enhanced
image. To this end, an optimization algorithm based on evolutionary calculations [13–15] was
proposed to automate the contrast enhancement task. These techniques are used to determine
the best parameter settings or the best input/output mapping to obtain the best quality images.
One disadvantage of image contrast enhancement algorithms based on evolutionary algorithms is that the iteration time is long, making image processing slow. To address this problem, this paper proposes a color image contrast enhancement algorithm on the
basis of the improved particle swarm algorithm based on the concept of sparse penalty in
machine learning. While enhancing the image contrast, the method shortens the time required
for the evolutionary algorithm to optimize the image. It also largely improves the image qual-
ity. This study provides a new idea for the enhancement of low-contrast images in the future.
The rest of this article is organized as follows. Section 2 sorts out the relevant literature,
makes a theoretical comparison with the existing methods, and introduces the traditional PSO
algorithm. Section 3 proposes an improved version of PSO and enhances image contrast on
the basis of the improved PSO algorithm. Section 4 evaluates the method in extensive experi-
ments. Finally, Section 5 concludes the paper.
2 Background
2.1 Related work
Contrast enhancement algorithms aim to provide an image with additional vivid colors and
higher detail clarity. Contrast enhancement algorithms are closely related to different visual
properties; these attributes include brightness, hue, color, and saturation. Existing image
enhancement techniques are mostly empirical or heuristic methods that are strongly correlated
to specific types of images and usually aim to improve the contrast of degraded images during
acquisition. In fact, finding the optimal grayscale map that adaptively enhances each different
input image can be viewed as an optimization problem.
In recent years, many bionic algorithms for image enhancement have been developed, such as the particle swarm optimization (PSO) algorithm, the artificial bee colony algorithm, the genetic algorithm, and the cuckoo algorithm, among others.
The genetic algorithm [14] can be used as a preprocessing step to enhance the histogram distribution near the bimodal image and improve the effect of downstream image processing technology. Munteanu and Rosa [15] proposed another genetic algorithm-based contrast enhancement method that uses a four-parameter local/global transformation function and optimizes these parameters with a genetic algorithm. The objective function combines the entropy of the enhanced image, the number of edge pixels, and the sum of edge intensities.
In [16], a new enhanced cuckoo search (ECS) algorithm is proposed to realize the automatic
enhancement of image contrast by optimizing the local/ global transformation parameters. In
the ECS image enhancement algorithm, the sum of the entropy, number of edge pixels, and
edge strength are used to construct the objective function. A new local/ global enhancement
transform parameter search range is proposed to overcome the problems of local/ global
enhancement transforms in enhancing edges, and distorted images achieve better enhance-
ment effects. A method for contrast enhancement of fingerprint images based on cuckoo
search (CS) was proposed in [17]. In [17], the fingerprint image is enhanced using grayscale
mapping transformation, and the grayscale between the input image and the enhanced image
is mapped using the CS algorithm. Here, a parameter Pa called abandonment probability is
used to update the solution in each iteration. The improved CS algorithm uses Cauchy muta-
tion to reconstruct Pa worst nests. In [18], an adaptive image enhancement algorithm with
hybrid CS and PSO (CSPSO) search algorithm was proposed. Here, an incomplete beta func-
tion is used as the boosting transform, and the worst CS set is updated using the PSO algo-
rithm instead of reconstructing the worst set Pa such as the CS algorithm. Recently, an
adaptive CS algorithm was proposed for satellite image contrast enhancement in [19], which
has the same local/global enhancement transformation and parameter range as [14], with dis-
covery and mutation randomization instead of fixed Pa, generates a new solution and proposes
a medical image enhancement algorithm based on wavelet masking. In this paper, Pa is not
fixed but adaptive through adaptive crossover and mutation process.
In [20], a new target fitness function is proposed for the indispensable fitness function in
the enhanced image quality evaluation, including four performance indicators, namely, the
sum of edge strengths, the edges of the number of pixels, image entropy, and image contrast.
The fitness function automatically measures the quality of the generated image, and the artifi-
cial bee searches for the optimal conversion function under the guidance of the new cost func-
tion. Second, this paper uses the incomplete beta function as the image transformation
function to guide the search action of the artificial bees. Image enhancement based on the firefly algorithm is proposed in [21], using the same transformation and objective function as [14], and a method for contrast enhancement of grayscale and color images using the artificial bee colony algorithm has also been proposed. Furthermore, in [1], using the same local/global transformation func-
tions and parameter ranges as [14], an improved differential evolution algorithm is proposed
for contrast enhancement of satellite images. However, it uses a different fitness function that
combines entropy, edge information, and standard deviation.
In [22], a traditional particle swarm algorithm is used to optimize an intensity transformation function so as to maximize the amount of information in the enhanced image. A parametric transformation function that utilizes local and global information from the image is used.
image enhancement considering image entropy and edge information is proposed to achieve
the best enhancement effect. Zhang [23] proposed a sonar image enhancement method on the
basis of particle swarm optimization. Sonar images have important characteristics such as low
resolution, strong echo interference, small target area, and blurred target edges. In this case,
obtaining satisfactory results using the global image enhancement algorithm is difficult. In this
paper, an adaptive local enhancement algorithm is proposed, which uses edge count, edge
strength, and entropy to evaluate enhanced images. Particle swarm optimization (PSO) is used
to determine the best enhancement parameters. Kanmani [24] proposed an optimized color
contrast enhancement algorithm to improve the visual perception of information in images
using an adaptive gamma correction factor selected by particle swarm optimization (PSO) to
improve entropy and enhance image detail. Kanmani and Narasimhan [25] proposed a new Eigenface recognition method that uses particle swarm optimization (PSO), self-tuning particle swarm optimization (STPSO), and brainstorm optimization (BSO) to determine the optimal fusion coefficients. To fuse CT and MRI images, they used PSO to optimize the weights of the weighted-average fusion, with an objective function that jointly maximizes the entropy and minimizes the root mean square error to improve image quality [26].
where x represents the position, v represents the velocity, and p represents the optimal solution. The individual optimal solution is pb,j and the global optimal solution is pg,j; i indexes the current particle, j the current dimension, and t the current iteration. c1 and c2 are the acceleration (learning) factors, usually taking values between 0 and 2, and r1,j ~ U(0,1) and r2,j ~ U(0,1) are two independent random numbers.
Determine: if the iteration termination condition is met, terminate the iteration and output
the global optimum. Otherwise, return to the previous step to continue the iterative
calculation.
Fig 1. Schematic diagram of local optimal iteration. (a) Macro topology. (b) Deep structure. (c) Particle distribution.
https://doi.org/10.1371/journal.pone.0274054.g001
Therefore, the global, local, individual optimal values and sparse penalty term are added to
the speed and position update formula. The updated formula is as follows:
$$V_{i,j}(t+1) = k\left[\omega V_{i,j}(t) + c_1 r_1\left(p_{b,j} - x_{i,j}(t)\right) + c_2 r_2\left(p_{g,j} - x_{i,j}(t)\right) + c_3 r_3\left(p_{l,j} - x_{i,j}(t)\right) + L_1\right], \qquad (3)$$
In Eq (3), k is the constriction factor, ω is the inertia weight, pb,j is the best solution of the individual particle, pl,j is the best solution of the particles in the local neighborhood, and pg,j is the best recorded solution of the global swarm. L1 = λw1 is the sparse penalty term, where λ is the penalty coefficient and w1 is the weight of the last solution. c1 is the acceleration coefficient for the individual optimum, c2 the acceleration coefficient for the global optimum, c3 the acceleration coefficient for the local optimum, and r1, r2, r3 ∈ (0, 1) are mutually independent random numbers.
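As a minimal illustration, the following Python sketch implements one step of the update in Eq (3); the array layout, the default constants, and the treatment of w1 (the weight of the previous solution) are assumptions made for the example rather than the authors' reference implementation.

```python
import numpy as np

def pso_step(x, v, p_best, l_best, g_best, w_last,
             k=0.729, w=0.9, c1=1.5, c2=1.5, c3=1.5, lam=0.01):
    """One velocity/position update following Eq (3).

    x, v       : (P, D) current positions and velocities
    p_best     : (P, D) individual best positions
    l_best     : (P, D) local (neighborhood) best positions
    g_best     : (D,)   global best position
    w_last     : (P, D) weights of the previous solution (for the L1 term)
    k, w       : constriction factor and inertia weight
    c1, c2, c3 : individual / global / local acceleration coefficients
    lam        : sparse penalty coefficient (lambda); value is illustrative
    """
    P, D = x.shape
    r1, r2, r3 = (np.random.rand(P, D) for _ in range(3))
    L1 = lam * w_last                      # sparse penalty term L1 = lambda * w1
    v_new = k * (w * v
                 + c1 * r1 * (p_best - x)
                 + c2 * r2 * (g_best - x)
                 + c3 * r3 * (l_best - x)
                 + L1)
    x_new = x + v_new                      # standard PSO position update
    return x_new, v_new
```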
Notably, in the improved particle swarm algorithm, the value obtained may differ from run to run, so deviations are anticipated. Therefore, in the experiments, each particle swarm algorithm is run 30 times on each test function and the results are averaged. When optimizing the four parameters of the transformation function, the averaged values are used, so only one result is reported in the data-set experiment.
The color image is represented as a pure quaternion $f(x, y) = f_R(x, y)\,i + f_G(x, y)\,j + f_B(x, y)\,k$, where fR(x,y), fG(x,y), and fB(x,y) are the R, G, and B components of the color image, respectively, and (x, y) are the coordinates of the pixel in the image matrix. In this way, the color image can be represented by a pure quaternion matrix, and quaternion-based color image processing operates directly on this quaternion matrix. Compared with the traditional per-channel approach or with converting to a grayscale image before processing, the quaternion method better reflects the holism of the color image and provides a new direction for both theoretical innovation and practical application.
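As an illustration of this representation, the sketch below stores an RGB image as an array of pure quaternions with a zero scalar part; the array layout is an assumption made for the example, since the paper does not specify a data structure.

```python
import numpy as np

def to_pure_quaternion(img_rgb):
    """Represent an M x N x 3 RGB image as an M x N x 4 array of
    pure quaternions (0, R, G, B): scalar part zero, R/G/B on i/j/k."""
    M, N, _ = img_rgb.shape
    q = np.zeros((M, N, 4), dtype=np.float64)
    q[..., 1:] = img_rgb            # i <- R, j <- G, k <- B
    return q

def quaternion_modulus(q):
    """Modulus |q| of each pixel quaternion; for a pure quaternion this
    is simply the Euclidean norm of the (R, G, B) vector."""
    return np.sqrt((q ** 2).sum(axis=-1))
```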
According to the introduction in the previous section, two main factors should be consid-
ered when applying the PSO algorithm to image contrast enhancement: (i) Design of a trans-
formation function that will generate a new pixel intensity of the enhanced image from the
original image; and (ii) Design of fitness function that checks the quality of the generated
image. These important factors affect the quality of enhanced images and are thoroughly
described in the following sections.
3.2.1 Transformation function. A contrast function is used to enhance the contrast of
the image in the spatial domain. The transform function generates a new intensity for each
pixel of the original image and generates an enhanced image. In this paper, a low-illumination
image is mapped by a transformation function to obtain a high-contrast image. This article uses the local/global enhancement transformation proposed in [35], which exploits local image statistics such as the mean and variance together with global image information. This function is an extended version of the local enhancement function in [36]. The transform is applied to each pixel at position (x, y) of the image of size M × N, and the old intensity f(x, y) is mapped to the new intensity value g(x, y). The transformation function is
$$g(x, y) = T(f(x, y)) = k\,\frac{G_m}{\sigma(x, y) + b}\,\bigl[f(x, y) - c\, m(x, y)\bigr] + m(x, y)^{a}. \qquad (6)$$
In Eq (6), κ, a, b, and c are four parameters, m (x, y) is the local mean of the (x, y)th pixel of
the input image over a n×n window. Gm is the global mean, and σ (x, y) is the local standard
deviation of (x, y)th pixel of the input image over a n×n window, which are defined as follows:
$$m(x, y) = \frac{1}{n \times n}\sum_{x=0}^{n-1}\sum_{y=0}^{n-1} f(x, y), \qquad (7)$$

$$G_m = \frac{1}{M \times N}\sum_{i=0}^{M-1}\sum_{j=0}^{N-1} f(i, j), \qquad (8)$$

$$\sigma(x, y) = \sqrt{\frac{1}{n \times n}\sum_{x=0}^{n-1}\sum_{y=0}^{n-1}\bigl(f(x, y) - m(x, y)\bigr)^{2}}. \qquad (9)$$
Eq (6) has four unknown parameters, i.e., κ, a, b, and c, and these parameters cause considerable changes in the processed image. It enhances low-contrast images, centering the enhancement on the local mean. The local mean and local standard deviation define the local brightness and contrast of the image, respectively. Parameter a introduces smoothness and brightness effects in the image, while b introduces an offset from the standard deviation in the neighborhood. Parameter c controls how much of the mean value is subtracted from the image f(x, y). Finally, parameter κ controls the global enhancement of the image. For these four optimization parameters, different authors give different search ranges, which affects the quality of the enhanced image. In [36], the parameter b is set to zero, which makes the transform depend directly on the local variance; a zero local variance then causes instability in the conversion. In addition, parameters a, c, and κ are set to 1, which limits the optimal choice of parameters and thus the achievable performance. In [22], the parameters are as follows: a, b, and c can take any positive non-zero real value, and κ remains between 0.5 and 1.5. The purpose of the optimization algorithm is therefore to determine the values of these parameters on the basis of a given objective function, thereby obtaining the best enhanced image.
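The following sketch shows one plausible implementation of the transform of Eqs (6)-(9) for a single intensity channel; the use of SciPy's uniform_filter for the n×n local statistics and the clipping of the output to [0, 255] are assumptions of the example.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def local_global_transform(f, k, a, b, c, n=3):
    """Apply the local/global enhancement transform of Eq (6) to a 2-D
    intensity channel f, using an n x n window for the local statistics."""
    f = f.astype(np.float64)
    m = uniform_filter(f, size=n)                    # local mean, Eq (7)
    var = uniform_filter(f ** 2, size=n) - m ** 2    # local variance
    sigma = np.sqrt(np.maximum(var, 0.0))            # local std dev, Eq (9)
    Gm = f.mean()                                    # global mean, Eq (8)
    g = k * (Gm / (sigma + b)) * (f - c * m) + m ** a
    return np.clip(g, 0, 255)                        # clipping is an assumption
```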
3.2.2 Fitness function. As mentioned above, another factor that affects the enhancement
of image contrast is the fitness value function. The fitness function is an objective evaluation
criterion for automatically measuring the quality of the generated image, and the quality of the enhanced image is the key criterion. Intuitively, compared with the original image, the enhanced image should have more edges, higher edge strength, and higher contrast; for color images, chroma is a further judgment index. On the basis of these facts, a new fitness function is proposed, which contains five performance indicators: (i) the sum of edge strengths, (ii) the number of edge pixels, (iii) color entropy, (iv) image contrast, and (v) chroma. More specifically, given the original image, the method generates an enhanced image guided by the following fitness function, expressed as
$$f(I_e) = \log\bigl(\log(E(I_s))\bigr) \cdot \frac{n\_edgels(I_s)}{M \times N} \cdot H(I_e) \cdot C(I_e) \cdot I(I_e), \qquad (10)$$
where Ie is the enhanced image of the original image, Is is the edge image of the enhanced image, E(Is) is the image edge intensity, H(Ie) is the color entropy, n_edgels(Is) is the number of image edge pixels, C(Ie) is the image contrast, and I(Ie) is the image brightness.
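A hedged sketch of this fitness evaluation is shown below, given an edge-magnitude map of the intensity channel. The edge threshold used to count edge pixels, the small additive constant that keeps the double logarithm defined, and the use of the gray-level standard deviation as a stand-in for the contrast of Eq (11) are simplifying assumptions of the example, not values given in the paper.

```python
import numpy as np

def fitness(enhanced, edge_mag, edge_threshold=20.0):
    """Fitness of Eq (10) for an enhanced image (H x W x 3) given the
    edge-magnitude map edge_mag of its intensity channel."""
    M, N = edge_mag.shape
    E = edge_mag.sum()                                  # sum of edge strengths
    n_edgels = np.count_nonzero(edge_mag > edge_threshold)  # threshold is an assumption

    # color entropy: per-channel entropy, averaged over R, G, B (Eq 12)
    H = 0.0
    for ch in range(3):
        hist, _ = np.histogram(enhanced[..., ch], bins=256, range=(0, 255))
        p = hist / hist.sum()
        p = p[p > 0]
        H += -(p * np.log2(p)).sum()
    H /= 3.0

    gray = enhanced.mean(axis=-1)
    C = gray.std()                                      # proxy for the contrast of Eq (11)
    I = gray.mean()                                     # brightness
    # the +np.e is a numerical safeguard so that log(log(.)) stays defined
    return np.log(np.log(E + np.e)) * (n_edgels / (M * N)) * H * C * I
```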
Image contrast measures the difference in brightness between the brightest white and the darkest black in the light and dark areas of an image, that is, the magnitude of the gray-level contrast of the image. The larger this difference range, the more vivid and rich the colors that can be displayed. The image contrast calculation formula is
$$C = \sum_{\delta} \delta(i, j)^{2}\, P_{\delta}(i, j), \qquad (11)$$
where δ(i, j) = |i − j| is the quaternion difference between adjacent pixels, and Pδ(i, j) is the distribution probability of that difference over adjacent pixel pairs.
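For illustration, the following sketch evaluates Eq (11) using the absolute intensity difference between horizontally adjacent pixels as δ; restricting δ to the horizontal direction and to gray levels (rather than full quaternion differences) is an assumption of the example.

```python
import numpy as np

def contrast(gray):
    """Contrast of Eq (11): sum over delta of delta^2 * P(delta), where delta
    is the absolute difference between horizontally adjacent pixel values."""
    f = np.asarray(gray, dtype=np.int64)
    delta = np.abs(np.diff(f, axis=1)).ravel()   # differences between neighbours
    hist = np.bincount(delta, minlength=256)
    p = hist / hist.sum()                        # probability of each difference
    d = np.arange(p.size)
    return float(np.sum(d ** 2 * p))
```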
Image entropy is the average number of bits needed to represent the gray-level set of the image and describes the average amount of information in the image source. The color entropy of a color image is obtained by first computing the image entropy of each of the three channels and then averaging over the three channels. The image entropy calculation formula is as follows:
$$H = -\sum_{i=0}^{255} p_i \log p_i, \qquad (12)$$
where pi is the probability that a certain gray level appears in the image.
The edge information is obtained by the Sobel [37] detector, and the calculation formula is
as follows:
$$\begin{aligned} G_x &= (-1)f(x-1,y-1) + 0\,f(x,y-1) + 1\,f(x+1,y-1) + (-2)f(x-1,y) + 0\,f(x,y) + 2f(x+1,y) \\ &\quad + (-1)f(x-1,y+1) + 0\,f(x,y+1) + 1\,f(x+1,y+1) \\ &= \bigl[f(x+1,y-1) + 2f(x+1,y) + f(x+1,y+1)\bigr] - \bigl[f(x-1,y-1) + 2f(x-1,y) + f(x-1,y+1)\bigr], \end{aligned} \qquad (13)$$

$$\begin{aligned} G_y &= 1\,f(x-1,y-1) + 2f(x,y-1) + 1\,f(x+1,y-1) + 0\,f(x-1,y) + 0\,f(x,y) + 0\,f(x+1,y) \\ &\quad + (-1)f(x-1,y+1) + (-2)f(x,y+1) + (-1)f(x+1,y+1) \\ &= \bigl[f(x-1,y-1) + 2f(x,y-1) + f(x+1,y-1)\bigr] - \bigl[f(x-1,y+1) + 2f(x,y+1) + f(x+1,y+1)\bigr], \end{aligned} \qquad (14)$$
where f(a, b) represents the quaternion value of the point (a, b). The gradient magnitude of each pixel, combining the horizontal and vertical responses, is calculated as $G(x, y) = \sqrt{G_x^{2} + G_y^{2}}$.
Brightness corresponds to imaging brightness and image gray level and is the lightness of the color. According to the sensitivity of the human eye to the three primary colors R, G, and B, the brightness can be computed as the weighted sum I = 0.299R + 0.587G + 0.114B.
3.2.3 Image contrast enhancement approach. In this work, the input image is first repre-
sented by quaternion, and the image function is used as input. To generate an augmented
image from an input image, a parametric transformation function is defined using Eq (6),
which combines global and local information of the input image. The transformation function
contains four parameters, namely, κ, a, b, and c. These four parameters have their defined
ranges, and different values produce different enhanced images. Our goal is to find the set of
values for these four parameters that produce the best results (according to fitness function val-
ues) using PSO. Through the steps on image representation, image transformation, and fitness
function, this section describes the algorithm steps and advantages for image contrast
enhancement on the bases of the improved particle swarm algorithm.
Initialization. In this step, using particle swarm-related parameters such as population size,
search space range, and velocity, among others, the initial solution is obtained. At the same
time, the parameters of κ, a, b, and c in the transformation function are initialized. First, P par-
ticles are initialized. The position vector of each particle X has four components, namely, k, a,
b, and c. Now, using these parameter values, each particle generates an enhanced image using
the intensity transformation function defined in Eq (6). The transform function is applied to
each pixel in the input image, takes parameter values from each particle, and generates a modi-
fied intensity value for that pixel. Thus, each generation produces P enhanced images, and the
quality of each enhanced image is measured by the objective function (fitness function)
defined in Eq (16).
In the traditional particle swarm algorithm, the fitness values of all enhanced images gener-
ated by all particles are calculated. In PSO, the most attractive property is the direction in
which PBEST and GBEST are responsible for driving each particle (solution) to the best posi-
tion. In each iteration, P new positions are generated, and the PBEST and GBEST of each gen-
eration are found according to their fitness values. With the help of these two optimal values, the new velocity components of each particle are calculated so that the particles move toward the optimal solution. When the process is complete, the enhanced image is created from the GBEST position of the particle, as it provides the maximum fitness value.
When using traditional PSO for image enhancement, only two terms, namely, PBEST and GBEST, are considered. To further increase the diversity of particles, this paper adds a local optimum computed over a topology so that particle diversity is fully utilized. The velocity and position of the particles are therefore updated iteratively by Eq (3), each particle generates an enhanced image of the quaternion-represented input through the transform of Eq (6), and Eq (16) gives the fitness value of the enhanced image generated by each particle. In each iteration, P new positions are generated, and the individual optimum, global optimum, and local optimum of each generation are found according to the fitness values. When the three are balanced, the obtained optimum is the global optimum. Fig 2 shows the flowchart of the algorithm.
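To make the flow concrete, the sketch below strings the earlier pieces together: each particle encodes (κ, a, b, c), is scored by the fitness function, and is moved by the improved update of Eq (3). The search ranges, the ring neighborhood used for the local optimum, and the helper functions local_global_transform, sobel_edges, fitness, and pso_step (the sketches given earlier in this section) are assumptions of the example rather than the authors' exact procedure.

```python
import numpy as np

# Hypothetical search ranges for (k, a, b, c); the paper only states that
# a, b, c are positive and that k lies roughly between 0.5 and 1.5.
LOW  = np.array([0.5, 0.0, 1.0, 0.0])
HIGH = np.array([1.5, 1.5, 10.0, 1.0])

def ring_neighbourhood_best(p_best, p_best_fit):
    """Local best over a ring topology: each particle looks at itself and
    its two ring neighbours (an illustrative choice of topology)."""
    P = len(p_best_fit)
    l_best = np.empty_like(p_best)
    for i in range(P):
        neigh = [(i - 1) % P, i, (i + 1) % P]
        l_best[i] = p_best[neigh[int(np.argmax(p_best_fit[neigh]))]]
    return l_best

def transform_channels(img, k, a, b, c):
    """Apply the Eq (6) transform to each of the R, G, B channels."""
    return np.stack([local_global_transform(img[..., ch], k, a, b, c)
                     for ch in range(3)], axis=-1)

def evaluate_fitness(enhanced):
    """Fitness of Eq (10) using the Sobel edge map of the intensity channel."""
    edge_mag = sobel_edges(enhanced.mean(axis=-1))
    return fitness(enhanced, edge_mag)

def enhance(image, n_particles=50, n_iter=50):
    """Sketch of the overall loop: score each particle, track the individual,
    local, and global optima, then move the swarm with the improved update."""
    x = LOW + np.random.rand(n_particles, 4) * (HIGH - LOW)
    v = np.zeros_like(x)
    p_best, p_best_fit = x.copy(), np.full(n_particles, -np.inf)
    g_best, g_best_fit = x[0].copy(), -np.inf

    for _ in range(n_iter):
        for i in range(n_particles):
            k, a, b, c = x[i]
            candidate = transform_channels(image, k, a, b, c)   # Eq (6) per channel
            fit = evaluate_fitness(candidate)                   # Eq (10)
            if fit > p_best_fit[i]:
                p_best[i], p_best_fit[i] = x[i].copy(), fit
            if fit > g_best_fit:
                g_best, g_best_fit = x[i].copy(), fit
        l_best = ring_neighbourhood_best(p_best, p_best_fit)    # local best via topology
        # w_last=x is a placeholder for the "weight of the last solution" w1
        x, v = pso_step(x, v, p_best, l_best, g_best, w_last=x)
        x = np.clip(x, LOW, HIGH)

    k, a, b, c = g_best
    return transform_channels(image, k, a, b, c)
```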
4 Experimental results
This paper verifies the effectiveness of the algorithm via two experiments. First, it simulates
and tests the improved particle swarm algorithm, and then, it verifies the convergence perfor-
mance of the algorithm by standard test functions. Second, for the image enhancement
method, the images in the test set are selected and evaluated via subjective vision and objective
quality criteria.
F5 Six-hump Camel-Back: $F_5(x) = 4x_1^2 - 2.1x_1^4 + \tfrac{1}{3}x_1^6 + x_1 x_2 - 4x_2^2 + 4x_2^4$, with $-5 \le x_i \le 5$ and minimum $-1.0316$.
F6 Goldstein-Price: $F_6(x) = \bigl[1 + (x_1 + x_2 + 1)^2\bigl(19 - 14x_1 + 3x_1^2 - 14x_2 + 6x_1 x_2 + 3x_2^2\bigr)\bigr]\bigl[30 + (2x_1 - 3x_2)^2\bigl(18 - 32x_1 + 12x_1^2 + 48x_2 - 36x_1 x_2 + 27x_2^2\bigr)\bigr]$, with $-2 \le x_i \le 2$ and minimum $3$.
https://doi.org/10.1371/journal.pone.0274054.t001
times. By comparing the data in Table 2, the improved PSO algorithm has obvious advantages over the other three algorithms (PSO, EDAPSO, and TPSO). Because a local optimal value is added to the algorithm in this paper, comparison of the minimum and optimal values in the table shows that the algorithm can converge to the global optimal value, especially on F3, F5, and F6. For the single-peak functions, comparing the average values of the algorithms shows that the average value is improved by 4–10 times. For the Schwefel problem 2.2 function, the average convergence value improves qualitatively, which means the improved algorithm achieves higher search accuracy than the other three algorithms and the diversity of particles is fully utilized. Multi-peak functions contain a large number of local extrema across the search space, into which the PSO algorithm easily falls. However, the improved PSO uses comparisons within the topological structure to maintain particle diversity, which allows the algorithm to reach a balance in the learning process, escape local optima, and continue searching for the global optimal value. For the multimodal fixed-dimension test functions, the solutions obtained by different algorithms differ little because the solution space has low dimension; although the true optimum was not found, the improved algorithm in this paper produces better search results.
Fig 3 shows the convergence curves of several algorithms applied to the same constraint
function. Evidently, the improved PSO algorithm has better convergence than the other three
algorithms.
Given that this paper uses a deep topology, the time complexity of the entire algorithm is affected; owing to the addition of the sparse penalty term, the final time complexity is at an intermediate level. This section gives a theoretical analysis and comparison of the average computational time complexity of the proposed algorithm and the traditional PSO algorithm.
According to the description of the two algorithms, for the traditional PSO algorithm, the
number of particles in each iteration is unchanged. Suppose the number of particles in the i-th
iteration is Ni, where i = 1, 2,. . .,m, where m represents the maximum number of iterations;
thus, N1 = N2 = . . . = Nm = N. Assuming that the computing time required for each iteration of
each particle is TT, the total running time required by the traditional PSO algorithm for opti-
mization can be concluded to be N × m × TT. For the algorithm in this paper, the number of
particles slowly decreases with the iteration, that is, N1 � N2. . . � Nm. Assuming that the com-
puting time required for each iteration of each particle is TD, the total running time required
by the algorithm in this paper after optimization is $\sum_{i=1}^{m} N_i \cdot T_D$. From the above analysis, the difference in the complexity of the two algorithms is reflected in the number of particles in each iteration and the running time required for each iteration of each particle.
For the traditional PSO algorithm, the operations required for a particle to update are as fol-
lows: 5 multiplications and 5 additions. The times required for multiplication and addition
operations are assumed to be Tm and Ta. In the experiment, the number of particles in the tra-
ditional PSO algorithm is set to 50, and the number of iterations is m. Therefore, the average
time required for the traditional PSO algorithm to complete the optimization is N×m×TT =
m×50×(5Tm+5Ta). For the algorithm in this paper, the operations required for a particle to compare and update are $\frac{5}{4n} T_m$ $(n = 1, 2, \ldots, m)$ multiplications and $\frac{5}{4n} T_a$ $(n = 1, 2, \ldots, m)$ additions. In the experiment, the maximum number of iterations is set to 1,000. Therefore, the average time required for the algorithm in this paper to complete the optimization is $\sum_{i=1}^{m} N_i \cdot T_D = \sum_{i=1}^{m} N_i \left(\frac{5}{4n} T_m + \frac{5}{4n} T_a\right)$, $n = 1, 2, \ldots, m$, $i = 1, 2, \ldots, 1000$, where Ni represents the number of particles in the i-th iteration. As n grows, $\frac{5}{4n}$ continuously diminishes. After adding the sparse penalty coefficient, the sub-optimal solutions are penalized, and the average time of the algorithm in this paper tends to $N \times m \times T_T = m \times 50 \times (T_m + T_a)$. Given that the number of particles in each iteration of the algorithm in this paper is different, multiple experiments are conducted, the number of particles in the corresponding iterations is counted, and the average number of particles per iteration is calculated.
4.2.2 Objective evaluation of image quality. In objective image quality evaluation, the
main evaluation indicators include peak signal-to-noise ratio (PSNR), structural similarity
index measure (SSIM) [41], information fidelity criterion (IFC) [42], visual information fidel-
ity (VIF) [43], feature similarity (FSIM) [44], Information Entropy (Entropy), and Average
Gradient (AG). Several evaluation indicators are thoroughly described below:
(1) PSNR is the engineering term for the ratio between the maximum possible signal power and the power of the background noise. To measure image quality after processing, the PSNR value is generally used to judge whether a processing procedure is satisfactory. It is based on the logarithm of the ratio of (2^n − 1)^2 (the square of the maximum signal value, where n is the number of bits per sample) to the mean square error between the original image and the processed image, and its unit is dB. PSNR is defined via the mean square error (MSE): for two images I and K of size M × N, where one is regarded as a noisy approximation of the other, the mean square error is defined as follows:
$$MSE = \frac{1}{M \times N}\sum_{i=0}^{M-1}\sum_{j=0}^{N-1} \bigl\| I(i, j) - K(i, j) \bigr\|^{2}, \qquad (17)$$
and the PSNR is then
$$PSNR = 10 \cdot \log_{10}\!\left(\frac{MAX_I^{2}}{MSE}\right), \qquad (18)$$
where MAXI represents the maximum value of the image pixel color.
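A small Python sketch of Eqs (17)-(18); the peak value of 255 assumes 8-bit images.

```python
import numpy as np

def psnr(reference, processed, max_i=255.0):
    """PSNR in dB from the MSE of Eq (17); max_i is the peak signal value."""
    mse = np.mean((reference.astype(np.float64) - processed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")            # identical images
    return 10.0 * np.log10(max_i ** 2 / mse)
```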
(2) SSIM. The human visual system adaptively extracts structural information from a scene. By measuring image distortion as a change in structural information, the objective quality index SSIM is obtained. The evaluation model of SSIM is as follows:
$$SSIM(x, y) = \frac{\bigl(2\mu_x \mu_y + c_1\bigr)\bigl(2\sigma_{xy} + c_2\bigr)}{\bigl(\mu_x^{2} + \mu_y^{2} + c_1\bigr)\bigl(\sigma_x^{2} + \sigma_y^{2} + c_2\bigr)}, \qquad (19)$$
where x and y are the reference image and the image to be tested, and μx, μy, σx², σy², and σxy are the means, variances, and covariance of x and y. c1 and c2 are small positive constants that avoid instability when the denominator is close to zero. The closer x is to y, the closer SSIM(x, y) is to 1, and the closer the SSIM is to 1, the better the quality of the image.
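The sketch below evaluates Eq (19) over the whole image as a single window; practical SSIM implementations average this statistic over local windows, and the constants c1 and c2 follow commonly used defaults rather than values stated in the paper.

```python
import numpy as np

def ssim_global(x, y, c1=(0.01 * 255) ** 2, c2=(0.03 * 255) ** 2):
    """Global (single-window) SSIM following Eq (19)."""
    x = x.astype(np.float64)
    y = y.astype(np.float64)
    mu_x, mu_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()
    cov_xy = ((x - mu_x) * (y - mu_y)).mean()
    return ((2 * mu_x * mu_y + c1) * (2 * cov_xy + c2)) / \
           ((mu_x ** 2 + mu_y ** 2 + c1) * (var_x + var_y + c2))
```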
(3) IFC. The IFC criterion assumes that the amount of information a human observer perceives is positively correlated with the mutual information between the observed image and the image content itself. When the image is distorted, the mutual information between the observed image and the original image decreases. If we compute the mutual information between the original reference image and the distorted image as observed by the human eye, and also the mutual information between the original reference image and the reference image as observed by the human eye, their ratio gives the proportion of the reference-image information that survives in the distorted image. This ratio characterizes the quality of the distorted image and is the index ultimately required.
(4) VIF. Starting from information theory, distortion is regarded as a loss of information fidelity: the quality of the test image is related to the information it shares with the reference image, that is, to the mutual information between the two. Mutual information is a statistical measure of information fidelity; although it is not strongly correlated with image quality on its own, it bounds how much cognitive information can be extracted from an image. Information fidelity thus explores the association between visual quality and image information. In the VIF evaluation model, the input image, the image distortion channel, and the distorted image model are all assumed to be known. The VIF indicator can be expressed as
$$VIF = \frac{\sum_{k=1}^{K} I\bigl(C_r^{k}; F^{k} \mid z_r^{k}\bigr)}{\sum_{k=1}^{K} I\bigl(C_r^{k}; E^{k} \mid z_r^{k}\bigr)}, \qquad (20)$$
where $I\bigl(C_r^{k}; E^{k} \mid z_r^{k}\bigr)$ and $I\bigl(C_r^{k}; F^{k} \mid z_r^{k}\bigr)$ are the corresponding mutual information measurements of the k-th sub-band, and K is the number of sub-bands.
(5) FSIM. FSIM is an image evaluation index based on the underlying features. It uses
phase consistency to extract the underlying features of the image, which is closer to the human
visual system. The main calculation methods are
$$FSIM = \frac{\sum_{x, y \in \Omega} S_{PC}(x, y)\, S_G(x, y)\, PC_m(x, y)}{\sum_{x, y \in \Omega} PC_m(x, y)}, \qquad (21)$$

$$S_{PC}(x, y) = \frac{2\,PC(x)\,PC(y) + T_1}{PC^{2}(x) + PC^{2}(y) + T_1}, \qquad (22)$$

$$S_G(x, y) = \frac{2\,G(x)\,G(y) + T_2}{G^{2}(x) + G^{2}(y) + T_2}, \qquad (23)$$
where $PC_m(x, y) = \max\bigl(PC(x), PC(y)\bigr)$ weights the overall similarity of images x and y, SPC(x, y) represents the feature similarity of x and y, SG(x, y) represents the gradient similarity, PC denotes the phase consistency information, and G denotes the gradient amplitude of the image. The constants T1 and T2 are introduced to avoid a zero denominator in the above formulas. The larger the value of FSIM, the more similar the reference image and the image to be tested and the higher the quality of the image to be tested, and vice versa.
(6) AG. The average gradient can reflect the detailed contrast and texture transformation in
the image. It somewhat reflects the clarity of the image. The calculation formula is
$$G = \frac{1}{M \times N}\sum_{i=1}^{M}\sum_{j=1}^{N}\sqrt{\frac{\left(\frac{\partial f}{\partial x}\right)^{2} + \left(\frac{\partial f}{\partial y}\right)^{2}}{2}}, \qquad (24)$$
where M × N is the size of the image, ∂f/∂x is the gradient in the horizontal direction, and ∂f/∂y is the gradient in the vertical direction.
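A short sketch of Eq (24) using simple forward differences for the two gradients; the choice of difference operator is an assumption of the example.

```python
import numpy as np

def average_gradient(gray):
    """Average gradient of Eq (24): mean of sqrt((fx^2 + fy^2) / 2)
    over the image, with forward differences for fx and fy."""
    f = gray.astype(np.float64)
    fx = np.diff(f, axis=1)[:-1, :]     # horizontal gradient
    fy = np.diff(f, axis=0)[:, :-1]     # vertical gradient
    return np.mean(np.sqrt((fx ** 2 + fy ** 2) / 2.0))
```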
4.2.3 Results and analysis. This study selected six images from two test image sets and performed contrast enhancement with different algorithms. To reflect the effectiveness of the algorithm in this paper, the images used are drawn from other data sets and from network images, and multiple low-contrast images from different scenes were collected. The figures show the enhancement effect. This paper selects six images from two data sets as reference images and compares the proposed method with six other evolution-based contrast enhancement methods: the artificial bee colony algorithm (ABC), cuckoo search (Cuckoo), differential evolution (DE), the firefly algorithm (FA), the genetic algorithm (GA), and the basic particle swarm algorithm (PSO). Performance is evaluated using seven image quality criteria, all of which are widely used in the field of image contrast enhancement to evaluate the performance of contrast enhancement methods. For all methods, the maximum number of iterations is set to 50 and the population size is set to 50. All test images use the same parameter settings because the proposed method is quite robust to the choice of parameters.
Fig 4 shows a comparison of the input images and the enhanced images. Tables 5–8 contain the objective index data of the enhanced images obtained for the test images under the different algorithms; each table corresponds to the comparative data of the different performance indexes of one image under the seven algorithms. The values in brackets after Entropy and AG are those of the original image. The figures and tables show that this method is superior to the other optimization-based image enhancement methods and achieves the best visual quality and the best objective performance.
Table 9 shows the calculation time of the seven algorithms for different images. The table
clearly shows that the running time of the proposed algorithm is improved qualitatively com-
pared with other algorithms. The optimal value is shown in bold.
Table 10 shows the MOS scores of images 1–4 under several image enhancement methods. By
comparing the MOS of several algorithms, the results show that our method achieves the high-
est MOS value on all enhanced images, which indicates that the method is visually superior to
other state-of-the-art algorithms.
Another experiment was conducted to compare the proposed approach with a few recent
image contrast enhancement approaches using selected images. Fig 5 presents the input
images and enhancement images. Tables 11–14 are the objective index data of the enhanced
images obtained by the four test images according to different algorithms.
Tables 11–14 correspond to the comparative data of different performance indexes of an
image under the four algorithms, where one can see that the proposed approach can enhance
the image contrast. Table 15 shows the calculation time of the four algorithms for different
Fig 4. The visual quality performance comparison of the enhanced images obtained by various image enhancement approaches. The first column shows the input images; the second to the seventh columns show the results of ABC, Cuckoo, DE, FA, GA, and PSO, respectively. The last column shows the results of the proposed approach. From top to bottom, the test images are images 1–4, respectively.
https://doi.org/10.1371/journal.pone.0274054.g004
Table 5. The performance evaluation of various image contrast enhancement approaches using test images 1.
Image1 method PSNR SSIM IFC VIF FSIM Entropy (6.3489) AG(0.0482)
ABC 39.9824 0.5827 1.0283 0.3018 0.5766 5.8493 0.0885
Cuckoo 38.5637 0.5092 0.0801 0.2603 0.5239 5.4863 0.0772
DE 35.4092 0.4222 0.6748 0.1987 0.4783 5.1192 0.0575
FA 38.8869 0.5181 0.0826 0.2756 0.5187 5.4958 0.0767
GA 39.4758 0.5746 0.0938 0.2981 0.5837 5.6475 0.0866
PSO 40.8482 0.6326 1.7463 0.2983 0.6018 6.2959 0.1022
Proposed 48.0932 0.7827 2.1182 0.3749 0.7633 7.2739 0.1673
https://doi.org/10.1371/journal.pone.0274054.t005
Table 6. The performance evaluation of various image contrast enhancement approaches using test images 2.
Image2 method PSNR SSIM IFC VIF FSIM Entropy(3.4982) AG(0.0139)
ABC 26.4348 0.5374 0.8498 0.4096 0.5557 4.6891 0.0278
Cuckoo 26.4994 0.5371 0.8526 0.4721 0.5623 4.9819 0.0309
DE 23.7586 0.3219 0.6875 0.4759 0.4196 3.5781 0.0176
FA 27.4373 0.5777 0.8421 0.4127 0.5684 5.0215 0.0363
GA 28.3584 0.6365 1.2382 0.4365 0.6232 5.2819 0.0428
PSO 29.8473 0.6565 1.2788 0.4881 0.6758 5.5611 0.0683
Proposed 37.6251 0.8175 2.7586 0.8571 0.8039 6.2967 0.0849
https://doi.org/10.1371/journal.pone.0274054.t006
Table 7. The performance evaluation of various image contrast enhancement approaches using test images 3.
Image3 method PSNR SSIM IFC VIF FSIM Entropy(4.9283) AG(0.0175)
ABC 26.9891 0.4375 0.7463 0.1278 0.3718 3.8758 0.0184
Cuckoo 28.8284 0.5123 1.0632 0.2920 0.4921 4.6274 0.0283
DE 29.3456 0.5351 1.3372 0.3162 0.5291 4.9034 0.0337
FA 28.9483 0.5134 0.9744 0.3093 0.5018 4.7683 0.0237
GA 30.6166 0.5788 1.5937 0.3592 0.5621 5.1627 0.0472
PSO 31.8593 0.6123 1.6281 0.3626 0.5901 5.0192 0.0589
Proposed 35.9381 0.6776 2.6887 0.6471 0.6827 5.6572 0.0773
https://doi.org/10.1371/journal.pone.0274054.t007
Table 8. The performance evaluation of various image contrast enhancement approaches using test images 4.
Image4 method PSNR SSIM IFC VIF FSIM Entropy(4.826) AG(0.0313)
ABC 46.4861 0.6263 2.1232 0.2029 0.8174 5.4322 0.0559
Cuckoo 47.1777 0.6182 2.3693 0.1399 0.7802 5.6528 0.0566
DE 46.2921 0.6233 1.9447 0.3192 0.8600 5.4792 0.0512
FA 45.9491 0.5588 0.8372 0.1017 0.7954 5.6351 0.0571
GA 40.5044 0.4435 0.7879 0.0970 0.7558 5.3736 0.0387
PSO 46.6007 0.5505 1.3989 0.1549 0.8220 5.3837 0.0598
Proposed 49.6988 0.6907 2.9784 0.3922 0.8633 7.5021 0.0858
https://doi.org/10.1371/journal.pone.0274054.t008
Table 9. The execution time performance (in s) evaluation of various image contrast enhancement approaches.
Image ABC Cuckoo DE FA GA PSO Proposed
1 413.4708 152.4987 56.3841 87.1932 446.3842 109.0803 45.8375
2 420.3819 165.2837 60.3716 97.3724 461.8472 116.7566 48.7282
3 427.6582 163.0495 68.0911 96.9382 472.4098 120.8473 50.7364
4 409.4721 150.6126 49.3273 85.0674 434.8549 107.6572 41.0843
https://doi.org/10.1371/journal.pone.0274054.t009
images. The table shows that the running time of the proposed algorithm is improved qualita-
tively compared with other algorithms.
Table 16 shows the MOS scores of images 5–8 under several image enhancement methods.
By comparing the MOS of several algorithms, the results show that our method achieves the
highest MOS value on all enhanced images, which indicates that the method is visually supe-
rior to other state-of-the-art algorithms.
5 Conclusions
With the aim to address the issue of long processing time and poor quality of low contrast
enhancement by evolutionary algorithm optimization, this paper proposes an improved
Fig 5. Visual quality performance comparison of the enhanced images obtained by various image enhancement approaches. The first column shows the input images; the second to the fourth columns show the results of Refs. [45–47], respectively. The last column shows the results of the proposed approach. From top to bottom, the test images are images 5–8.
https://doi.org/10.1371/journal.pone.0274054.g005
Table 11. The performance evaluation of various image contrast enhancement approaches using test images 5.
Image5 method PSNR SSIM IFC VIF FSIM Entropy(5.3818) AG(0.0349)
Ref. [45] 32.0512 0.7785 2.1843 0.3237 0.8689 5.8721 0.0381
Ref. [46] 33.8049 0.8306 3.1103 0.4003 0.8853 5.7192 0.0424
Ref. [47] 31.5446 0.9136 4.0441 0.5123 0.9347 5.1778 0.0368
Proposed 36.2669 0.9470 6.1938 1.0424 0.9572 6.1659 0.0771
https://doi.org/10.1371/journal.pone.0274054.t011
Table 12. The performance evaluation of various image contrast enhancement approaches using test images 6.
Image6 method PSNR SSIM IFC VIF FSIM Entropy(5.2933) AG(0.0165)
Ref. [45] 45.8738 0.4284 0.6855 0.3647 0.6744 6.3175 0.0196
Ref. [46] 46.6472 0.4381 0.7101 0.3811 0.6829 6.3386 0.0187
Ref. [47] 47.1637 0.4309 0.6983 0.3796 0.6799 6.8191 0.0206
Proposed 56.9928 0.5047 1.5363 0.5029 0.7811 7.7366 0.0386
https://doi.org/10.1371/journal.pone.0274054.t012
Table 13. The performance evaluation of various image contrast enhancement approaches using test images 7.
Image7 method PSNR SSIM IFC VIF FSIM Entropy(4.0192) AG(0.0167)
Ref. [45] 38.3788 0.3562 1.5919 0.5723 0.6646 6.2379 0.0326
Ref. [46] 36.3849 0.3274 0.6728 0.4894 0.6372 5.9283 0.0247
Ref. [47] 39.5822 0.3589 1.8473 0.6029 0.6742 6.5674 0.0384
Proposed 46.9409 0.5301 3.0571 1.1733 0.8918 7.4985 0.0476
https://doi.org/10.1371/journal.pone.0274054.t013
Table 14. The performance evaluation of various image contrast enhancement approaches using test images 8.
Image8 method PSNR SSIM IFC VIF FSIM Entropy(6.4021) AG(0.0913)
Ref. [45] 38.5748 0.3562 1.4753 0.1428 0.7636 6.8503 0.1452
Ref. [46] 36.1924 0.3109 0.9856 0.1059 0.7142 6.4211 0.1111
Ref. [47] 37.1627 0.3322 1.1866 0.1321 0.7376 6.4327 0.1287
Proposed 44.4738 0.6274 2.5831 0.3728 0.9103 7.4895 0.1771
https://doi.org/10.1371/journal.pone.0274054.t014
Table 15. The execution time performance (in s) evaluation of various image contrast enhancement approaches.
Image Ref. [45] Ref. [46] Ref. [47] Proposed
1 758.29 753.25 695.33 451.06
2 732.65 740.93 684.32 427.57
3 746.01 748.72 687.29 439.64
4 767.39 773.48 703.66 463.21
https://doi.org/10.1371/journal.pone.0274054.t015
particle swarm algorithm to optimize the low contrast color image enhancement algorithm.
This algorithm used the quaternion matrix to represent the three channels of RGB images. The
matrix can perform function transformation and parameter optimization. In the parameter
optimization stage, an improved particle swarm algorithm is used. In the improved algorithm,
the individual optimization of the particles, local optimization, and global optimization are
used to adjust the particle’s flight direction. Local optimization uses a topological structure,
and the particles communicate with each other. The local optimal solution is obtained, which
not only increases the diversity of particles but also prevents subsequent particles from stop-
ping iteration because they are trapped in local optima. At the same time, a sparse penalty term is added to the velocity update formula; its function is to eliminate the poorest solutions and reduce the iteration time. In the contrast enhancement process, the transformation function and the fitness function are used to control the image quality. The algo-
rithm in this paper adds contrast elements and brightness elements to the fitness function and
uses the fitness function in guiding the particle swarm algorithm to optimize the four parame-
ters in the transformation function. This article compares the proposed algorithm with other
evolutionary algorithms to optimize the contrast enhancement. The images in two different
data sets are selected for subjective and objective evaluation. The results show that the proposed method is superior in both subjective qualitative and objective quantitative aspects.
Supporting information
S1 Table. Relevant data underlying the findings described in Table 2.
(XLSX)
S2 Table. Relevant data underlying the findings described in Tables 10 and 16.
(XLSX)
S3 Table. Relevant data underlying the findings described in Tables 5–9 and 11–15.
(XLSX)
S1 Fig. Relevant data underlying the findings described in Figs 4, 5.
(RAR)
Author Contributions
Data curation: Yongfeng Ren.
Formal analysis: Guoyong Zhen.
Investigation: Yanhu Shan.
Methodology: Chengqun Chu.
Writing – original draft: Xiaowen Zhang.
References
1. Suresh S., Lal S., Modified differential evolution algorithm for contrast and brightness enhancement of
satellite images, Appl. Soft Comput. J. 61 (2017) 622–641. https://doi.org/10.1016/j.asoc.2017.08.019
2. Rundo L., Tangherloni A., Nobile M. S., Militello C., Besozzi D., Mauri G., Cazzaniga P., MedGA: A
novel evolutionary method for image enhancement in medical imaging systems, Expert Syst. Appl. 119
(2019) 387–399. https://doi.org/10.1016/j.eswa.2018.11.013
3. A New Framework for Retinex based Color Image Enhancement using Particle Swarm Optimization, Inderscience Enterprises Ltd. (2009) 1–24.
4. B. Olena, Roman, Vorobel, I. Ivasenko, Color Image Enhancement by Logarithmic Transformation in
Fuzzy Domain, 2019 IEEE 2nd Ukr. Conf. Electr. Comput. Eng. (2019) 1147–1151.
5. Shih-chia H., Fan-chieh C., Yi-sheng C., Efficient Contrast Enhancement Using Adaptive Gamma Cor-
rection With Weighting Distribution, IEEE Trans. Image Process. 22 (2013) 1032–1041. https://doi.org/
10.1109/TIP.2012.2226047 PMID: 23144035
6. Yu W., Qian C., Baomin Z., Image enhancement based on equal area dualistic sub-image histogram
equalization method, IEEE Trans. Consum. Electron. 45 (1999) 68–75.
7. C. Gao, K. Panetta, A New Color Contrast Enhancement Algorithm for Robotic Applications, 2012 IEEE
Int. Conf. Technol. Pract. Robot Appl. (2012) 42–47.
8. Junwon M., Yuneseok J., Yoojun N., Jaeseok K., Edge-enhancing bi-histogram equalisation using
guided image filter, J. Vis. Commun. Image Represent. 58 (2019) 688–700. https://doi.org/10.1016/j.
jvcir.2018.12.037
9. P. M.Narendra, R. C.Fitch, Real-Time Adaptive Contrast Enhancement, IEEE. (1981) 655–661.
10. Sanjay A., Rutuparna P., P.K M., Ajith A., A novel joint histogram equalization based image contrast
enhancement, J. King Saud Univ.—Comput. Inf. Sci. (2019) 1–11. https://doi.org/10.1016/j.jksuci.2019.
05.010
11. Hyoung-joon K., Jong-myung L., Jin-aeon L., Sang-geun O., Whoi-yul K., Contrast Enhancement Using
Adaptively Modified Histogram Equalization, PSIVT 2006. (2006) 1150–1158.
12. Shashi P., Suman T., Deewakar S., Vinod K., Ashish G., Pal S. K., Non-parametric modified histogram
equalisation for contrast enhancement, IET Image Process. (2013) 641–652. https://doi.org/10.1049/
iet-ipr.2012.0507
13. Tian J., Chen L., Lihong M., Weiyu Y., Multi-focus image fusion using a bilateral gradient-based sharp-
ness criterion, Opt. Commun. 284 (2011) 80–87. https://doi.org/10.1016/j.optcom.2010.08.085
14. F. Saitoh, Image Contrast Enhancement Using Genetic Algorithm, IEEE. (1999) 899–904.
15. Munteanu C., Rosa A., Gray-scale image enhancement as an automatic process driven by evolution, IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, 2004.
16. Kamoona A.M., Chandra Patra J., A novel enhanced cuckoo search algorithm for contrast enhance-
ment of gray scale images, Appl. Soft Comput. J. 85 (2019) 1–20. https://doi.org/10.1016/j.asoc.2019.
105749
17. A. Bouaziz, A. Draa, S. Chikhi, A cuckoo search algorithm for fingerprint image contrast enhancement,
in: Second World Conference on Complex Systems (WCCS), IEEE, Agadir, Morocco, 2014, pp. 678–
685, http://dx.doi.org/10.1109/ICoCS.2014.7060930.
18. Ye Z., Wang M., Hu Z., Liu W., An adaptive image enhancement technique by combining cuckoo search and particle swarm optimization algorithm, Comput. Intell. Neurosci. (2015) 1–12, https://doi.org/10.1155/2015/825398 PMID: 25784928
19. Suresh S., Lal S., Reddy C.S., Kiran M.S., A novel adaptive cuckoo search algorithm for contrast enhancement of satellite images, IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 10 (2017) 3665–3676, https://doi.org/10.1109/JSTARS.2017.2699200
20. Jia C., Weiyu Y., Jing T., Li C., Zhili Z., Image contrast enhancement using an artificial bee colony algo-
rithm, Swarm Evol. Comput. 38 (2018) 287–294. https://doi.org/10.1016/j.swevo.2017.09.002
21. T. Hassanzadeh, H. Vojodi, F. Mahmoudi, Non-linear grayscale image enhancement based on firefly
algorithm, in: International Conference on Swarm, Evolutionary, and Memetic Computing, Springer,
Berlin, Heidelberg, 2011, pp. 174–181, http://dx.doi.org/10.1007/978-3-642-27242-4_21.
22. Gorai A., Ghosh A., Gray-level Image Enhancement By Particle Swarm Optimization, 2009 World
Congr. Nat. Biol. Inspired Comput. (2009) 72–77.
23. Zhang T, Lei W, Xu Y, et al. Sonar image enhancement based on particle swarm optimization[C]// IEEE
Conference on Industrial Electronics & Applications. IEEE, 2008.
24. Kanmani M, Narasimhan V. An optimal weighted averaging fusion strategy for thermal and visible
images using dual tree discrete wavelet transform and self tunning particle swarm optimization[J]. Multi-
media Tools & Applications, 2016.
25. Kanmani M, Narasimhan V. Optimal fusion aided face recognition from visible and thermal face images
[J]. Multimedia Tools and Applications, 2020:1–25.
26. Kanmani M, Narasimhan V. Particle swarm optimisation aided weighted averaging fusion strategy for
CT and MRI medical images[J]. International Journal of Biomedical Engineering and Technology, 2019,
31(3):278.
27. J. Kennedy, R. Eberhart, Particle Swarm Optimization, IEEE. (1995) 1942–1948.
28. J. Kennedy, Small Worlds and Mega-Minds: Effects of Neighborhood Topology on Particle Swarm Per-
formance, IEEE. (1999) 1931–1938.
29. M.W. Fakhr, L1-Regularized Least Squares Sparse Extreme Learning Machine for Classification, 2015
Int. Conf. Inf. Commun. Technol. Res. (2015) 222–225.
30. Da Wei W., On the geometric meaning and physical application of quaternions, Sci. Instr. (2015) 164–
165.
31. Lianghai J., Zhiliang Z., Enmin S., Xiangyang X., An effective vector filter for impulse noise reduction
based on adaptive quaternion color distance mechanism, Signal Processing. 155 (2019) 334–345.
https://doi.org/10.1016/j.sigpro.2018.10.007
32. Sveier A., Egeland O., Pose Estimation using Dual Quaternions and Moving Horizon Estimation, IFAC-PapersOnLine. 51 (2018) 186–191. https://doi.org/10.1016/j.ifacol.2018.07.275
33. Jinwei W., Yangyang L., Jian L., Xiangyang L., Yun-qing S., Kr.Jha S., Color image-spliced localization
based on quaternion principal component analysis and quaternion skewness, J. Inf. Secur. Appl. 47
(2019) 353–362. https://doi.org/10.1016/j.jisa.2019.06.004
34. Kun W., Guiju L., Guangliang H., Hang Y., Yuqing W., Color Image Detail Enhancement Based on Quaternion Guided Filter, J. Comput.-Aided Des. Comput. Graph. 29 (2017).
35. Maurya L., Kumar Mahapatra P., Kumar A., A social spider optimized image fusion approach for con-
trast enhancement and brightness preservation, Appl. Soft Comput. 52 (2017) 575–592. https://doi.
org/10.1016/j.asoc.2016.10.012
36. Gonzalez R. C., Woods R. E., Digital Image Processing, fourth ed., 2018.
37. I. Sobel, Camera models and machine perception, 1970. 10.1023/A.
38. Rong Z., Yan Xia J., Hua Wei F., Particle Swarm Optimization Algorithm Combined with the Distribution of Superior Quality Particles, J. Chinese Comput. Syst. 36 (2015) 3–7.
39. Yan-xia J., Xiaowen Z., Li Y., Xin Z., An Improved Particle Swarm Optimization Algorithm for Optimal
Leaf Nodes, Microelectronics&Computer. 33 (2016) 64–69. https://doi.org/10.19304/j.cnki.issn1000-
7180.2016.09.015
40. Kanmani M, Narasimhan V. Swarm intelligent based contrast enhancement algorithm with improved
visual perception for color images[J]. Multimedia Tools and Applications, 2017, 77(10):12701–12724.
41. Zhou W., Alan Conrad B., Hamid Rahim S., Simoncelli E.P., Image Quality Assessment: From Error
Visibility to Structural Similarity, IEEE Trans. Image Process. 13 (2004) 600–612. https://doi.org/10.
1109/tip.2003.819861 PMID: 15376593
42. Sheikh H.R., Member S., Bovik A.C., De G., An Information Fidelity Criterion for Image Quality Assess-
ment Using Natural Scene Statistics, IEEE Trans. Image Process. (2006) 1–22.
43. Hamid Rahim S., Bovik A.C., Image Information and Visual Quality, IEEE Trans. Image Process. 15
(2006) 430–444. https://doi.org/10.1109/tip.2005.859378 PMID: 16479813
44. Lin Z., Lei Z., Xuanqin M., David Z., FSIM: A Feature Similarity Index for Image, IEEE Trans. Image Pro-
cess. 20 (2011) 2378–2386.
45. Tian Q C, Cohen L D. A variational-based fusion model for non-uniform illumination image enhance-
ment via contrast optimization and color correction[J]. Signal Processing, 2018, 153(DEC.):210–220.
46. Huang Z, Wang Z, Zhang J, et al. Image enhancement with the preservation of brightness and struc-
tures by employing contrast limited dynamic quadri-histogram equalization[J]. Optik—International
Journal for Light and Electron Optics, 2021, 226(2):165877.
47. Huang Z, Zhu Z, An Q, et al. Global-local image enhancement with contrast improvement based on
weighted least squares[J]. Optik—International Journal for Light and Electron Optics, 2021(2):167433.