HPSO: A New Version of Particle Swarm Optimization Algorithm
Journal of Artificial Intelligence
Abstract- In this paper, a new version of the Particle Swarm Optimization algorithm has been proposed. This algorithm has been developed by combining two different approaches to PSO, i.e., Standard Particle Swarm Optimization (SPSO) and Mean Particle Swarm Optimization (MPSO). Numerical experiments on several scalable and non-scalable problems have been carried out. The results indicate that the proposed algorithm performs better than the existing ones in terms of efficiency, reliability, accuracy and stability.
Keywords- SPSO, HPSO (Hybrid Particle Swarm Optimization), global optimization, personal best position, global best position, velocity update equation.
Citation: Narinder Singh, Sharandeep Singh and Singh S.B. (2012) HPSO: A New Version of Particle Swarm Optimization Algorithm. Journal of Artificial Intelligence, ISSN: 2229-3965 & E-ISSN: 2229-3973, Volume 3, Issue 3, pp. 123-134.
Copyright: Copyright©2012 Narinder Singh, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution and reproduction in any medium, provided the original author and source are credited.
Introduction
Particle swarm optimization (PSO) is a new global optimization method based on a metaphor of social interaction [1,2]. Since its inception, PSO has been finding applications in all areas of science and engineering [3] and has been successfully applied in many research and application areas. It has been demonstrated that PSO gets better results in a faster, cheaper way compared with other optimization methods.

A PSO algorithm maintains a swarm of individuals (called particles), where each individual (particle) represents a candidate solution. Particles follow a very simple behavior: emulate the success of neighboring particles and their own successes achieved. The position of a particle is therefore influenced by the best particle in its neighborhood, as well as by the best solution found by the particle itself. The neighborhood best position y_i of particle i depends on the neighborhood topology used [4].

The particle position x_i is adjusted using [Equ-1]:

    x_i(k+1) = x_i(k) + v_i(k+1)    (1)

where the velocity component v_i(k) represents the step size. For the basic PSO,

    v_{ij}(k+1) = v_{ij}(k) + c_1 r_{1j} (y_{ij} - x_{ij}) + c_2 r_{2j} (\hat{y}_j - x_{ij})    (2)

The Standard Particle Swarm Optimization algorithm uses values of the inertia weight in the range 0.4 to 1.4 and of the acceleration coefficients in the range 1.5 to 2.0, and suggests the upper and lower limits on these values as shown in [Equ-2].

Shi and Eberhart [11] proposed to use an "inertia weight" parameter [Equ-3]:

    v_{ij}(k+1) = w v_{ij}(k) + c_1 r_{1j} (y_{ij} - x_{ij}) + c_2 r_{2j} (\hat{y}_j - x_{ij})    (3)

Eberhart and Shi suggested using an inertia weight which decreases over time, typically from 0.9 to 0.4. It has the effect of narrowing the search, gradually changing from an exploratory to an exploitative mode.

Clerc and Kennedy [33] suggested a more generalized PSO, where a constriction coefficient \chi is applied to both terms of the velocity formula. The authors showed that the constriction PSO can converge without using V_max [Equ-4]:

    v_{ij}(k+1) = \chi ( v_{ij}(k) + c_1 r_{1j} (y_{ij} - x_{ij}) + c_2 r_{2j} (\hat{y}_j - x_{ij}) )    (4)

where the constriction factor was set to 0.7289 by Clerc and Kennedy [33]. By using the constriction coefficient, the amplitude of the particle's oscillations decreases, resulting in convergence over time.

PSO variants are continually being devised in an attempt to overcome this deficiency (see e.g. [18-26] for a few recent additions). These PSO variants greatly increase the complexity of the original methods. Pedersen and co-workers [27,28] have demonstrated that satisfactory performance can be achieved with the basic PSO if only its parameters are properly tuned.
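For concreteness, a minimal C sketch of one particle's update under [Equ-1] and [Equ-3] is given below. This is an illustration only, not the authors' Borland C implementation; the constant DIM, the rand01() helper and the function name are assumptions made here. For the constriction PSO of [Equ-4], the loop body would instead multiply the whole bracketed velocity expression by chi = 0.7289.

    #include <stdlib.h>

    #define DIM 30   /* number of dimensions per particle (assumed) */

    /* uniform pseudo-random number in [0, 1] */
    double rand01(void)
    {
        return (double)rand() / (double)RAND_MAX;
    }

    /* One inertia-weight PSO step for a single particle ([Equ-1], [Equ-3]):
       x - position, v - velocity, y - personal best, yhat - global best. */
    void spso_update(double x[DIM], double v[DIM],
                     const double y[DIM], const double yhat[DIM],
                     double w, double c1, double c2)
    {
        int j;
        for (j = 0; j < DIM; j++) {
            double r1 = rand01(), r2 = rand01();
            v[j] = w * v[j]
                 + c1 * r1 * (y[j] - x[j])       /* cognitive component */
                 + c2 * r2 * (yhat[j] - x[j]);   /* social component    */
            x[j] += v[j];                        /* [Equ-1]             */
        }
    }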
The pseudo code of the HPSO procedure is as follows [Fig-1]:

Algorithm - HPSO
For each particle
    Initialize the particle
End
Do
    For each particle
        Calculate the fitness value
        If the fitness value is better than the best fitness value pbest (personal best position) in history, set the current value as the new pbest
    End
    Choose the particle with the best fitness value of all the particles as the gbest (global best position)
    For each particle
        Calculate the particle velocity according to the equation

            v_{ij}(k+1) = 2w v_{ij}(k) + c_1 r_{1j} ( (3y_{ij} + \hat{y}_j)/2 - 2x_{ij} ) + c_2 r_{2j} ( (y_{ij} + \hat{y}_j)/2 - 2x_{ij} )

        Update the particle position according to the equation

            x_i(k+1) = x_i(k) + v_i(k+1)
    End
While the maximum number of iterations or the minimum error criterion is not attained
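As a sketch only (reusing the DIM constant and rand01() helper assumed in the earlier SPSO sketch, not the authors' code), the HPSO velocity and position updates can be written in C as follows. Note that the right-hand side of the velocity rule is consistent with summing the SPSO rule [Equ-3] and the mean-PSO (MPSO) rule of Deep and Bansal [35], term by term.

    /* One HPSO step for a single particle; the velocity rule combines the
       SPSO and MPSO contributions (hence the factor 2 on w and on x). */
    void hpso_update(double x[DIM], double v[DIM],
                     const double y[DIM], const double yhat[DIM],
                     double w, double c1, double c2)
    {
        int j;
        for (j = 0; j < DIM; j++) {
            double r1 = rand01(), r2 = rand01();
            v[j] = 2.0 * w * v[j]
                 + c1 * r1 * ((3.0 * y[j] + yhat[j]) / 2.0 - 2.0 * x[j])
                 + c2 * r2 * ((y[j] + yhat[j]) / 2.0 - 2.0 * x[j]);
            x[j] += v[j];   /* position update, [Equ-1] */
        }
    }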
[Fig-2] compares the particle position in the SPSO and HPSO algorithms.

Fig. 2- Comparison of SPSO Particle and HPSO Particle on the Scalable Problems

Parameter Setting
Computational experiments were performed to fine-tune the values of the various parameters for best performance. For that purpose, values of the parameters were tested over their whole ranges, viz. the inertia weight in the range [0.4, 0.9] and the acceleration coefficient in the range [1.5, 2.0].

Test Problems
A proposed algorithm is often evaluated on only a few scalable and non-scalable problems. In this paper, however, we consider a test set of 15 scalable and 13 non-scalable problems with varying difficulty levels and problem sizes. The performance of the Standard Particle Swarm Optimization algorithm and of the newly proposed HPSO has been verified on these two types of problem sets.
Detail of 15 Scalable Problems, Set-I
Problem I (Ackley):

    Min f(x) = -20 \exp( -0.02 \sqrt{ n^{-1} \sum_{i=1}^{n} x_i^2 } ) - \exp( n^{-1} \sum_{i=1}^{n} \cos(2\pi x_i) ) + 20 + e

in which the search space is -30 \le x_i \le 30 and the minimum objective function value is 0.
Problem II (Cosine Mixture):

    Min f(x) = -0.1 \sum_{i=1}^{n} \cos(5\pi x_i) + \sum_{i=1}^{n} x_i^2

in which the search space is -1 \le x_i \le 1 and the minimum objective function value is -0.1 n.
Problem III (Exponential):

    Min f(x) = -\exp( -0.5 \sum_{i=1}^{n} x_i^2 )
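The three problems above translate directly into code. A possible C transcription (the function names are chosen here for illustration, not taken from the authors' implementation) is:

    #include <math.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif
    #ifndef M_E
    #define M_E  2.71828182845904523536
    #endif

    /* Problem I (Ackley); minimum value 0 at x = 0 */
    double ackley(const double *x, int n)
    {
        double s1 = 0.0, s2 = 0.0;
        int i;
        for (i = 0; i < n; i++) {
            s1 += x[i] * x[i];
            s2 += cos(2.0 * M_PI * x[i]);
        }
        return -20.0 * exp(-0.02 * sqrt(s1 / n)) - exp(s2 / n) + 20.0 + M_E;
    }

    /* Problem II (Cosine Mixture); minimum value -0.1n at x = 0 */
    double cosine_mixture(const double *x, int n)
    {
        double s1 = 0.0, s2 = 0.0;
        int i;
        for (i = 0; i < n; i++) {
            s1 += cos(5.0 * M_PI * x[i]);
            s2 += x[i] * x[i];
        }
        return -0.1 * s1 + s2;
    }

    /* Problem III (Exponential); minimum value -1 at x = 0 */
    double exponential_fn(const double *x, int n)
    {
        double s = 0.0;
        int i;
        for (i = 0; i < n; i++)
            s += x[i] * x[i];
        return -exp(-0.5 * s);
    }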
Fig. 1- Flow Chart of HPSO
From the results of [Table-3], [Table-4], [Table-5] and [Table-6], it is concluded that SPSO and HPSO could not solve two scalable and five non-scalable problems with 100% success.
Table 1- Comparison of minimum objective function value of SPSO and HPSO for 15 Scalable Problems (Set-I)
Columns: Problem No.; Minimum Function Value (SPSO, HPSO); Mean Function Value (SPSO, HPSO); Standard Deviation (SPSO, HPSO); Rate of Success (SPSO, HPSO)
1 0.387931 2.213546 0.463183 2.785904 0.028413 0.154321 100% 0.00%
2 1 1 1 1 0 0 0.00% 0.00%
3 0.080679 0.098733 0.127769 0.138034 0.023686 0.026596 100% 100%
4 7.959672 40.31155 22.20173 52.49853 3.815774 5.94983 0.00% 0.00%
5 0.372065 2.083941 0.451844 2.722288 0.035701 0.32848 100% 0.00%
6 0.000011 0.000016 0.045324 0.093635 0.059453 0.116726 100% 100%
7 0.000002 0.000013 0.034576 0.054799 0.061729 0.085849 100% 100%
8 0 0 0.000221 0.000259 0.000494 0.000406 100% 100%
9 0 0.000002 0.00641 0.007523 0.01432 0.011785 100% 100%
10 0.000183 0.000489 0.020887 0.024727 0.021163 0.020645 100% 100%
11 0.204376 4.340918 0.399203 8.632978 0.070762 2.960194 100% 100%
12 0.000092 0.000244 0.010444 0.012363 0.010581 0.010322 100% 100%
13 0.000873 0.158666 0.238075 13.71884 0.155831 10.9795 100% 4.00%
14 0.000357 0.000506 0.023792 0.04026 0.0215 0.036367 100% 100%
15 0 0 0.000052 0.000064 0.000103 0.000127 100% 100%
Parameter setting for this analysis (identical for SPSO and HPSO): swarm size 30 dim, 30,000 function evaluations, inertia weight 0.5, acceleration coefficient 1.3.
Table 2- Comparison of minimum objective function value of SPSO and HPSO for 13 Non-Scalable Problems (Set-II)
Columns: Problem No.; Minimum Function Value (SPSO, HPSO); Mean Function Value (SPSO, HPSO); Standard Deviation (SPSO, HPSO); Rate of Success (SPSO, HPSO)
1 25 25 25 25 0 0 0.00% 0.00%
2 0.000252 0.000485 0.014243 0.018005 0.025041 0.025188 100% 100%
3 0.000274 0.000274 0.014718 0.017547 0.01185 0.017243 100% 100%
4 27.71683 27.71683 27.71683 27.71683 0 0 0.00% 0.00%
5 0.00045 0.003603 0.189215 0.219694 0.130739 0.142171 100% 100%
6 73046.6 85046.6 73046.6 89046.6 0 0 0.00% 0.00%
7 20.70113 217.384 22.42282 339.9513 1.234575 58.71676 0.00% 0.00%
8 0 0.00002 0 0 0 0 100% 100%
9 0.580465 1.380465 2.780465 1.380465 0 0 100% 0.00%
10 0.002558 0.025302 0.224278 0.26239 0.133917 0.136451 100% 100%
11 0.000096 0.000049 0.008199 0.017657 0.007526 0.016862 100% 100%
12 0.000004 0.000014 0.000789 0.010052 0.005954 0.008192 100% 100%
13 0.000004 0.000008 0.001258 0.001605 0.001139 0.001605 100% 100%
Parameter setting for this analysis (identical for SPSO and HPSO): swarm size 30 dim, 30,000 function evaluations, inertia weight 0.6, acceleration coefficient 1.4.
Table 3- Comparison of minimum objective function value of SPSO and HPSO for 15 Scalable Problems (Set-I)
Columns: Problem No.; Minimum Function Value (SPSO, HPSO); Mean Function Value (SPSO, HPSO); Standard Deviation (SPSO, HPSO); Rate of Success (SPSO, HPSO)
1 0.39522 2.82538 0.469694 3.094737 0.027105 0.123042 100% 0.00%
2 1 1 1 1 0 0 0.00% 0.00%
3 0.094588 0.101718 0.148249 0.188355 0.02232 0.034292 100% 100%
4 16.92943 33.20485 25.94843 45.77758 2.620415 5.845835 0.00% 0.00%
5 0.335672 3.011404 0.442806 3.957319 0.045047 0.439678 100% 0.00%
6 0 0.000044 0.06982 0.081741 0.117727 0.10061 100% 100%
7 0 0.000083 0.041217 0.064499 0.057818 0.076679 100% 100%
8 0 0 0.000195 0.000305 0.000274 0.000363 100% 100%
9 0 0.000011 0.005658 0.008854 0.007937 0.010526 100% 100%
10 0.000073 0.001256 0.021234 0.028744 0.018154 0.019877 100% 100%
11 0.259205 6.941149 0.420516 17.56031 0.057058 5.788724 100% 0.00%
12 0.000037 0.000628 0.010617 0.014372 0.009077 0.009938 100% 100%
13 0.004746 0.083234 0.253189 15.18506 0.11678 14.23525 100% 2.00%
14 0.000629 0.000658 0.035198 0.040451 0.03469 0.040122 100% 100%
15 0 0 0.000034 0.000074 0.000049 0.00012 100% 100%
Table 4- Comparison of minimum objective function value of SPSO and HPSO for 13 Non-Scalable Problems (Set-II)
Columns: Problem No.; Minimum Function Value (SPSO, HPSO); Mean Function Value (SPSO, HPSO); Standard Deviation (SPSO, HPSO); Rate of Success (SPSO, HPSO)
1 25 25 25 25 0 0 0.00% 0.00%
2 0.000185 0.000211 0.010405 0.013367 0.021601 0.025188 100% 100%
3 0.000274 0.000274 0.014355 0.016635 0.010956 0.012941 100% 100%
4 27.71683 27.71683 27.71683 27.71683 0 0 0.00% 0.00%
5 0.003603 0.006597 0.197137 0.214412 0.14323 0.145243 100% 100%
6 73046.6 73046.6 73046.6 73046.6 0 0.000001 0.00% 0.00%
7 22.41474 356.7507 23.55341 506.359 0.342964 93.16889 0.00% 0.00%
8 0 0 0 0 0 0 100% 100%
9 1.380465 1.380465 1.380465 1.380465 0 0 0.00% 0.00%
10 0.010489 0.005826 0.242355 0.287753 0.12514 0.139614 100% 100%
11 0.000388 0.000186 0.011989 0.022206 0.010398 0.021582 100% 100%
12 0.000019 0.00002 0.007385 0.012785 0.00758 0.014302 100% 100%
13 0.000016 0.000016 0.001705 0.002644 0.001511 0.002192 100% 100%
Parameter setting for this analysis (identical for SPSO and HPSO): swarm size 30 dim, 30,000 function evaluations, inertia weight 0.8, acceleration coefficient 1.6.
Table 5- Comparison of minimum objective function value of SPSO and HPSO for 15 Scalable Problems (Set-I)
Columns: Problem No.; Minimum Function Value (SPSO, HPSO); Mean Function Value (SPSO, HPSO); Standard Deviation (SPSO, HPSO); Rate of Success (SPSO, HPSO)
1 0.302491 2.908172 0.691205 3.262553 0.32316 0.087067 62.00% 0.00%
2 1 1 1 1 0 0 0.00% 0.00%
3 0.059008 0.192762 0.158936 0.27673 0.026272 0.03494 100% 100%
4 25.93121 29.30517 27.37303 37.7589 0.966569 4.39901 0.00% 0.00%
5 0.410389 4.720973 0.473893 6.463293 0.02743 0.708598 100% 0.00%
6 0.000008 0.000017 0.071594 0.109527 0.107192 0.117171 100% 100%
7 0.000031 0.000448 0.042033 0.097578 0.062633 0.12661 100% 100%
8 0 0 0.000199 0.000814 0.000296 0.001687 100% 100%
9 0.000004 0.000005 0.00577 0.023604 0.008598 0.048929 100% 100%
10 0.000763 0.000807 0.021675 0.040952 0.018058 0.039732 100% 100%
11 0.320697 24.19404 0.467348 44.75876 0.045955 9.381633 100% 0.00%
12 0.000382 0.000404 0.010838 0.020476 0.009029 0.019866 100% 100%
13 0.009936 0.061379 0.275779 28.77909 0.132544 29.07476 100% 2.00%
14 0.000407 0.000658 0.031531 0.041339 0.031074 0.034328 100% 100%
15 0 0 0.000039 0.000157 0.000052 0.000319 100% 100%
Table 6- Comparison of minimum objective function value of SPSO and HPSO for 13 Non-Scalable Problems (Set-II)
Columns: Problem No.; Minimum Function Value (SPSO, HPSO); Mean Function Value (SPSO, HPSO); Standard Deviation (SPSO, HPSO); Rate of Success (SPSO, HPSO)
1 25 25 25 25 0 0 0.00% 0.00%
2 0.000485 0.000485 0.012163 0.021492 0.009654 0.033967 100% 100%
3 0.000274 0.000274 0.012683 0.016529 0.008866 0.013177 100% 100%
4 27.71683 27.71683 27.71683 27.71683 0 0 0.00% 0.00%
5 0.000973 0.000788 0.134761 0.204412 0.124735 0.13335 100% 100%
6 73046.6 73046.6 73046.6 73046.6 0 0.000001 0.00% 0.00%
7 28.35727 358.851 37.83285 760.6364 11.33987 136.7744 0.00% 0.00%
8 0 0 0 0 0 0 100% 100%
9 1.380465 1.380465 1.380465 1.380465 0 0 0.00% 0.00%
10 0.013064 0.051677 0.239829 0.358124 0.131305 0.152438 100% 90.00%
11 0.000025 0.000348 0.012279 0.031316 0.011362 0.028257 100% 100%
12 0.000028 0.000971 0.008485 0.015242 0.01002 0.013412 100% 100%
13 0.000016 0.000016 0.00171 0.003068 0.001651 0.002486 100% 100%
Parameter setting for this analysis (identical for SPSO and HPSO): swarm size 30 dim, 30,000 function evaluations, inertia weight 0.9, acceleration coefficient 1.7.
Thirdly, we test SPSO and HPSO with the parameter setting of swarm size 30 dim, 30,000 function evaluations, inertia weight 0.9 and acceleration coefficient 1.7. From the results of [Table-7] and [Table-8], it is concluded that SPSO and HPSO could not solve four scalable and four non-scalable problems.
Table 7- Comparison of minimum objective function value of SPSO and HPSO for 15 Scalable Problems (Set-I)
Columns: Problem No.; Minimum Function Value (SPSO, HPSO); Mean Function Value (SPSO, HPSO); Standard Deviation (SPSO, HPSO); Rate of Success (SPSO, HPSO)
1 2.09691 3.148713 2.709167 3.307034 0.209419 0.05205 62.00% 0.00%
2 1 1 1 1 0 0 0.00% 0.00%
3 0.063837 0.192762 0.160985 0.291546 0.029135 0.034786 100% 100%
4 27.00595 27.79408 27.78788 35.05702 0.609905 4.018404 0.00% 0.00%
5 1.536115 4.563332 2.630977 6.749619 0.568664 0.719269 0.00% 0.00%
6 0 0.000016 0.066672 0.113697 0.092314 0.113536 100% 100%
7 0.00002 0 0.039573 0.077499 0.048887 0.115932 100% 100%
8 0 0 0.000187 0.000877 0.000231 0.001838 100% 100%
9 0.000003 0 0.005432 0.025444 0.006711 0.053295 100% 100%
10 0.000619 0.00004 0.022017 0.04167 0.016266 0.042109 100% 100%
11 2.089925 22.46909 6.810866 46.77206 2.541166 8.735867 0.00% 0.00%
12 0.000309 0.00002 0.011008 0.020835 0.008133 0.021055 100% 100%
13 0.004454 0.884075 0.215886 39.08755 0.134588 36.85742 100% 0.00%
14 0.000382 0.000658 0.033447 0.043317 0.028908 0.037955 100% 100%
15 0 0 0.000047 0.000186 0.000069 0.000393 100% 100%
Table 8- Comparison of minimum objective function value of SPSO and HPSO for 13 Non-Scalable Problems (Set-II)
Columns: Problem No.; Minimum Function Value (SPSO, HPSO); Mean Function Value (SPSO, HPSO); Standard Deviation (SPSO, HPSO); Rate of Success (SPSO, HPSO)
1 25 25 25 25 0 0 0.00% 0.00%
2 0.000088 0.000485 0.011417 0.023255 0.008292 0.027897 100% 100%
3 0.000274 0.000274 0.012265 0.022954 0.006836 0.038757 100% 100%
4 27.71683 27.71683 27.71683 27.71683 0 0 0.00% 0.00%
5 0.002251 0.002542 0.1857 0.236147 0.146323 0.148277 100% 100%
6 73046.6 73046.6 73046.6 73046.6 0 0.000002 0.00% 0.00%
7 0.14845 499.5557 227.9079 784.9403 139.8914 129.3896 8.00% 0.00%
8 0 0 0 0 0 0 100% 100%
9 1.380465 1.380465 1.380465 1.380465 0 0 0.00% 0.00%
10 0.030838 0.011159 0.251399 0.223675 0.12742 0.376073 100% 72.00%
11 0.000352 0.001827 0.014684 0.037675 0.014275 0.03201 100% 100%
12 0.00008 0.000006 0.009637 0.015294 0.010168 0.014321 100% 100%
13 0.000016 0.000016 0.001704 0.002428 0.001381 0.001928 100% 100%
Continuing in the same manner, the authors concluded that the parameter setting of the three weight factors w, c_1 and c_2 at 0.7, 1.5 and 1.5 respectively, with swarm size = 30 and function evaluations = 30,000, provides the best convergence rate for the scalable and non-scalable problems considered. Other combinations of parameter values usually lead to much slower convergence or sometimes to no convergence at all.

Experiments and Discussion on the Results
Performance of the algorithm was tested on a set of 28 benchmark problems (15 scalable and 13 non-scalable). The scalable and non-scalable problems were chosen as the test problems. The Standard Particle Swarm Optimization implementation was written in C and compiled using the Borland C++ Version 4.5 compiler. For the purpose of comparison, all the simulations use the parameter setting of the SPSO implementation except the inertia weight w, acceleration coefficient, swarm size and maximum velocity allowed. The swarm size (number of particles) is 30. The dynamic range of each element of a particle is defined as (-100, 100); that is, a particle cannot move out of this range in any dimension, and thus Xmax = 100. The maximum number of iterations allowed is 30,000. If the SPSO or HPSO implementation cannot find an acceptable solution within 30,000 iterations, it is ruled that it fails to find the global optimum in that run.

As stated earlier in section 7, the parameter setting of the three weight factors w, c_1 and c_2 at 0.7, 1.5 and 1.5 respectively, with swarm size = 30 and function evaluations = 30,000, provides the best convergence rate for the scalable and non-scalable problems considered. Other combinations of parameter values usually lead to much slower convergence or sometimes to no convergence at all.

In [Table-9], the quality of the solution obtained is measured by the minimum function value, mean function value, standard deviation and rate of success of the objective function values over 30 runs. It can be seen that HPSO gives a better quality of solutions as compared to SPSO. Thus, for the scalable problems, HPSO outperforms SPSO with respect to efficiency, reliability, cost and robustness. From [Table-1], it can be seen that the new algorithm HPSO solves all the scalable problems with 100% success, while SPSO cannot solve all the scalable problems 100% successfully.
Table 9- Comparison of minimum objective function value of SPSO and HPSO for 15 Scalable Problems (Set-I)
Columns: Problem No.; Minimum Function Value (SPSO, HPSO); Mean Function Value (SPSO, HPSO); Standard Deviation (SPSO, HPSO); Rate of Success (SPSO, HPSO)
1 0.667619 0.438935 16485.6 2016 0.142795 0.115137 98.00% 100%
2 0.644392 0.403938 1708.2 174.6 0.053545 0.133075 100% 100%
3 0 0 60 60 0.000207 0.000282 100% 100%
4 0.777974 0.356199 14364.6 3393.6 0.026005 0.12876 100% 100%
5 27.12782 0.244824 30000 15957 29.80959 14.36265 0.00% 100%
6 0.000061 0.000037 166.2 141.6 0.200616 0.260962 100% 100%
7 0.000274 0.000253 72 73.8 0.22966 0.197222 100% 100%
8 0.685057 0.291755 6096 569.4 0.054336 0.174963 100% 100%
9 0.000002 0.000001 60.6 64.6 0.179978 0.206901 100% 100%
10 0.001109 0.0011 60.6 60.6 0.161759 0.177124 100% 100%
11 0.60187 0.077945 11341.8 3139.8 0.067786 0.245377 100% 100%
12 0.022248 0.002793 78 87 0.243564 0.23989 100% 100%
13 0.001848 0.001648 1767 1767 0.253535 0.263535 100% 100%
14 0.000126 0.000109 60 60 0.048579 0.055405 100% 100%
15 0.000009 0.000003 60 60 0.005729 0.003796 100% 100%
Table 10- Comparison of minimum objective function value of SPSO and HPSO for 13 Non-Scalable Problems (Set-II)
Columns: Problem No.; Minimum Function Value (SPSO, HPSO); Mean Function Value (SPSO, HPSO); Standard Deviation (SPSO, HPSO); Rate of Success (SPSO, HPSO)
1 0.5 0.5 60 60 0.042453 0.042463 100% 100%
2 0.017193 0.002786 64.2 63 0.258362 0.299918 100% 100%
3 0.001029 0.001027 66.6 74.4 0.224219 0.236823 100% 100%
4 0.3986 0.390856 128.4 181.8 0.13771 0.148293 100% 100%
5 0.018613 0.01239 72 66 0.240972 0.232866 100% 100%
6 0.4986 0.4786 128.4 128.4 0.16771 0.13771 100% 100%
7 0.027193 0.012786 64.2 63 0.358362 0.309918 100% 100%
8 0.015341 0.014276 82.2 85.2 0.281294 0.257859 100% 100%
9 0.480507 0.48047 60 60 0.026709 0.023939 100% 100%
10 0.067997 0.037472 840.6 517.2 0.215576 0.251745 100% 100%
11 0.003378 0.003178 60.6 61.6 0.207517 0.227517 100% 100%
12 0.005549 0.00536 63.6 65.4 0.270722 0.275501 100% 100%
13 0.002655 0.002378 65.4 62 0.229666 0.21365 100% 100%
The proposed algorithm with the parameter values inertia weight 0.7 and acceleration coefficient 1.5 gives the best convergence. Other combinations of parameters may in some cases lead to non-convergence. On the basis of the results obtained, it may be concluded that the newly proposed HPSO algorithm outperforms the classical SPSO algorithm in terms of convergence, speed and quality of the solution.

Nomenclature
c_1 - Self Confidence Factor
c_2 - Swarm Confidence Factor
(The parameters c_1 and c_2 in equation (2) are not critical for PSO's convergence and alleviation of local minima; a higher self-confidence parameter c_1 than the social parameter c_2 is suggested, but with c_1 + c_2 = 4.)
f - Fitness Function
y_{ij} - Personal best position of the i-th particle in the j-th dimension
\hat{y}_{ij} - Global best position of the i-th particle in the j-th dimension

References
[2] Kennedy J. and Eberhart R.C. (1995) IEEE International Joint Conference on Neural Networks, IEEE Press, 1942-1948.
[3] Chatterjee A., Pulasinghe K., Watanabe K., et al. (2005) IEEE Trans. on Industrial Electronics, 52, 1478-1489.
[4] Kennedy J. and Mendes R. (2002) IEEE Congress on Evolutionary Computation, IEEE Press, 1671-1676.
[5] Peer E.S., Van den Bergh F. and Engelbrecht A.P. (2003) IEEE Swarm Intelligence Symposium, IEEE Press, 235-242.
[6] Engelbrecht A.P. (2005) Fundamentals of Computational Swarm Intelligence, Wiley & Sons.
[7] Kennedy J., Eberhart R.C. and Shi Y. (2001) Swarm Intelligence, Morgan Kaufmann.
[8] Van den Bergh F. (2002) An Analysis of Particle Swarm Optimizers, Ph.D. thesis, Department of Computer Science, University of Pretoria, Pretoria, South Africa.
[9] Van den Bergh F. and Engelbrecht A.P. (2006) Information Sciences, 176(8), 937-971.
[25] Liu H., Cai Z. and Wang Y. (2010) Applied Soft Computing, 10(2), 629-640.
[26] Mahadevan K. and Kannan P.S. (2010) Applied Soft Computing, 10(2), 641-652.
[27] Pedersen M.E.H. (2010) Tuning & Simplifying Heuristical Optimization, Ph.D. thesis, School of Engineering Sciences, University of Southampton, England.
[28] Pedersen M.E.H. and Chipperfield A.J. (2010) Applied Soft Computing, 10(2), 618-628.
[29] Montes de Oca M.A. (2007) Institut de Recherches Interdisciplinaires et de Developpements en Intelligence Artificielle.
[30] Schutte J.F. and Groenwold A.A. (2005) Journal of Global Optimization, 31(1), 93-108.
[31] Pedersen M.E.H. and Chipperfield A.J. (2010) Applied Soft Computing, 10(2), 618-628.
[32] Sedighizadeh D. and Masehian E. (2009) International Journal of Computer Theory and Engineering, 1(5), 1793-8201.
[33] Clerc M. and Kennedy J. (2002) IEEE Trans. Evolutionary Computation, 6(1), 58-73.
[34] Eberhart R.C. and Shi Y. (2000) IEEE Congress on Evolutionary Computation, San Diego, CA, 84-88.
[35] Deep K. and Bansal J.C. (2009) Int. J. Computational Intelligence Studies, 1(1).