International Journal of Electrical and Computer Engineering (IJECE)
Vol. 10, No. 3, June 2020, pp. 3261~3274
ISSN: 2088-8708, DOI: 10.11591/ijece.v10i3.pp3261-3274
Journal homepage: http://ijece.iaescore.com/index.php/IJECE
Population based optimization algorithms improvement using
the predictive particles
M. M. H. Elroby¹, S. F. Mekhamer², H. E. A. Talaat³, and M. A. Moustafa Hassan⁴
¹Electrical Engineering Department, Faculty of Engineering, Ain Shams University, Egypt
²,³Electrical Engineering Department, Future University, Egypt
⁴Electrical Engineering Department, Cairo University, Egypt
Article Info

Article history:
Received Jun 12, 2019
Revised Dec 2, 2019
Accepted Dec 11, 2019

ABSTRACT

A new efficient improvement, called Predictive Particle Modification (PPM), is proposed in this paper. This modification makes the particle look at its nearby area before moving toward the best solution of the group. The modification can be applied to any population algorithm. The basic philosophy of PPM is explained in detail. To evaluate the performance of PPM, it is applied to the Particle Swarm Optimization (PSO) algorithm and the Teaching Learning Based Optimization (TLBO) algorithm and then tested using 23 standard benchmark functions. The effectiveness of these modifications is compared with that of the unmodified population optimization algorithms based on the best solution, average solution, and convergence rate.

Keywords:
Optimization
Particle Swarm Optimization
Population optimization
Predictive particle
Teaching Learning Based Optimization

Copyright © 2020 Institute of Advanced Engineering and Science. All rights reserved.
Corresponding Author:
M. M. H. Elroby,
Electrical Engineering Department,
Faculty of Engineering,
Ain Shams University, Egypt.
Email: mousaelroby@yahoo.com
1. INTRODUCTION
Recently, many metaheuristic optimization algorithms have been developed. These include Particle
Swarm Optimization (PSO) [1-5], Genetic Algorithm (GA) [6-9], Differential Evolution (DE) [10],
Ant Colony (AC) [11], Gravitational Search Algorithm (GSA) [12], Sine Cosine Algorithm (SCA) [13-15],
the hybrid PSOGSA algorithm [16], adaptive SCA integrated with particle swarm [17], and Teaching Learning
Based Optimization (TLBO) [18-20]. All of them share the same goal: finding the global optimum. To do
this, a heuristic algorithm should be equipped with two main characteristics, exploration and
exploitation. Exploration is the ability to search the whole space, whereas exploitation is the ability to
converge to the best solution. The goal of all metaheuristic optimization algorithms is to balance
exploitation and exploration in order to find the global optimum. According to [21], exploitation and
exploration in evolutionary computing are not clearly delineated due to the lack of a generally accepted
definition. On the other hand, strengthening one ability weakens the other, and vice versa. Because of
these points, each existing metaheuristic optimization algorithm is capable of solving only a finite set of
problems; it has been proved that no algorithm is general enough to solve all optimization
problems [22]. Many hybrid optimization algorithms therefore aim to balance the overall exploration and
exploitation abilities.
In this study, the proposed modification increases exploration by making the particle examine
the surrounding space before it is affected by the best solution. The proposed modification can be applied to any
population optimization algorithm. PSO is one of the most widely used population algorithms due to its
simplicity, convergence speed, and ability to search for the global optimum. TLBO is a recent, efficient
optimization method that combines teaching and learning phases. For these reasons, this
modification has been applied to PSO and TLBO. The organization of this paper is as follows: Section 2
describes the standard PSO and its exploration problem. Section 3 describes the standard TLBO.
The proposed modification is presented in Section 4. Section 5 describes the results of the proposed
modification. Section 6 concludes this research.
2. THE STANDARD PARTICLE SWARM OPTIMIZATION
2.1. Particle Swarm Optimization Algorithm
PSO is a population-based computation algorithm proposed by Kennedy and Eberhart [1].
PSO was inspired by the social behavior of bird flocking. It uses a number of particles that fly around
the search space. All particles try to find the best solution. Meanwhile, they all look at the best particle in their
paths; in other words, particles consider both their own best solutions and the best solution found so far.
To modify its position, each particle in PSO should consider its current position, its current velocity,
the distance to its personal best (pbest), and the distance to the global best (gbest). PSO is modeled as follows [1]:
$v_i^{t+1} = w v_i^t + c_1 \times rand \times (pbest_i^t - x_i^t) + c_2 \times rand \times (gbest^t - x_i^t)$  (1)

$x_i^{t+1} = x_i^t + v_i^{t+1}$  (2)
where $v_i^{t+1}$ is the velocity of particle $i$ at iteration $t+1$, $w$ is a weighting (inertia) function, $c_j$ is a weighting factor, $rand$ is a random number between 0 and 1, $x_i^t$ is the current position of particle $i$ at iteration $t$, $pbest_i^t$ is the pbest of agent $i$ at iteration $t$, and $gbest^t$ is the best solution found so far.
The first part of (1), $w v_i^t$, provides the exploration ability of PSO. The second and third parts,
$c_1 \times rand \times (pbest_i^t - x_i^t)$ and $c_2 \times rand \times (gbest^t - x_i^t)$, represent the private thinking and the collaboration of
particles, respectively [23, 24]. PSO is initialized by randomly placing the particles in the problem space.
In each iteration, the particle velocities are calculated using (1). After the velocities are calculated, the particle
positions are updated using (2). This process continues until an end criterion is met.
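For concreteness, a minimal Python sketch of the update in (1) and (2) is given below; the function name pso_step and the parameter defaults are illustrative assumptions, not values taken from the paper.

    import numpy as np

    def pso_step(x, v, pbest, gbest, w=0.7, c1=2.0, c2=2.0):
        # One PSO iteration per (1) and (2).
        # x, v, pbest: arrays of shape (n_particles, dim); gbest: shape (dim,)
        n, dim = x.shape
        r1 = np.random.rand(n, dim)
        r2 = np.random.rand(n, dim)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # (1)
        x = x + v                                                  # (2)
        return x, v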
2.1.1. PSO Exploration Problem
The first part of (1), $w v_i^t$, provides the PSO exploration ability. When the algorithm starts,
the velocity is initialized to zero. Thus, from (1), the Global Best Particle (GBP)
(i.e., P1 in Figure 1 (a)) remains in its place until the global best solution is changed by another particle.
This means the global best particle cannot explore its nearby area because it is not excited by any particle.
In addition, particles that arrive from other places (P2 to P5) at the location of the global best solution with
a certain velocity after a number of iterations may be damped before reaching the optimal solution, as shown in
Figure 1 (b). This phenomenon is treated using PPM in Section 4.

Figure 1. Particles at the initial and final iterations: (a) initial iteration, (b) final iteration
3. THE STANDARD TEACHING LEARNING BASED OPTIMIZATION
The TLBO method is based on the effect of a teacher on learners. The teacher is considered
the best learned person (the global best, $gbest^t$), who shares his knowledge with the learners. The process of TLBO is
divided into two phases: the 'Teacher Phase' and the 'Learner Phase'. The 'Teacher Phase' means learning from
the teacher, and the 'Learner Phase' means learning through interaction between learners. TLBO is modeled as follows [18]:

3.1. Teacher Phase
A learner learns from the teacher by moving its mean toward the teacher's value. The learner
modification is expressed as:

$TF = round[1 + rand(0, 1)]$
$mean\ difference^t = r\,(gbest^t - TF\, M^t)$

For i = 1 : number of learners
    X_i^{t+1} = X_i^t + mean difference^t
    Accept X_i^{t+1} if it gives a better function value.
End

where $M^t$ is the mean of the learners and $gbest^t$ is the global best (the teacher) at iteration $t$.

3.2. Learner Phase
A learner learns something new if another learner has better knowledge. The learner
modification is expressed as:

For i = 1 : number of learners
    Randomly select two learners X_i^t and X_j^t, where i ≠ j
    If f(X_i^t) < f(X_j^t)
        X_i^{t+1} = X_i^t + rand(X_i^t − X_j^t)
    Else
        X_i^{t+1} = X_i^t + rand(X_j^t − X_i^t)
    End
    Accept X_i^{t+1} if it gives a better function value.
End
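To complement the pseudocode above, the following is a minimal sketch of one TLBO iteration for minimization; the name tlbo_step and the bookkeeping details are illustrative assumptions.

    import numpy as np

    def tlbo_step(X, f):
        # One TLBO iteration (teacher phase then learner phase), minimization.
        # X: population of shape (n_learners, dim); f: fitness function.
        n, dim = X.shape
        fit = np.array([f(p) for p in X])
        gbest = X[np.argmin(fit)].copy()            # the teacher
        # Teacher phase: move the population mean toward the teacher
        TF = round(1 + np.random.rand())            # TF = round[1 + rand(0, 1)]
        mean_diff = np.random.rand(dim) * (gbest - TF * X.mean(axis=0))
        for i in range(n):
            x_new = X[i] + mean_diff
            f_new = f(x_new)
            if f_new < fit[i]:                      # accept only if better
                X[i], fit[i] = x_new, f_new
        # Learner phase: learn from a randomly chosen other learner
        for i in range(n):
            j = np.random.randint(n - 1)
            if j >= i:
                j += 1                              # ensure j != i
            r = np.random.rand(dim)
            step = (X[i] - X[j]) if fit[i] < fit[j] else (X[j] - X[i])
            x_new = X[i] + r * step
            f_new = f(x_new)
            if f_new < fit[i]:
                X[i], fit[i] = x_new, f_new
        return X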
4. PREDICTIVE PARTICLE
The main idea of PPM is that, in each iteration, the particle should look at its nearby area and
check whether it contains a solution better than the GBP. If it finds a better value, that point becomes the GBP. PPM
can thus remedy the non-excited GBP (P1 in Figure 1 (a)) without waiting for excitation from another particle.
In addition, it improves the vision of the particle before it moves toward the GBP, and it overcomes jumping
over a narrow region and leaving the global solution behind.
Consider the initial values of the particles P1 to P5 shown in Figure 2. In the next
iteration, these particles move toward P1 (as it is the GBP at this moment) and take the positions P1' to
P5'. Particle P3 may jump to P3' without converging to gbest, especially when the fitness function has a
narrow region with a very deep value. In addition, P1 stays in its position because it is the GBP. These phenomena can
be treated if each particle tries to find a better solution (target) in its nearby area before moving toward the GBP, as shown in
Figure 3. This can be done using the numerical gradient with a definite target. Assume the fitness function
(F) is a linear function near the particle position, in matrix form:
Figure 2. Particle movement
Figure 3. Initial position and target of the particle
$F = AX + b$  (3)

Using the numerical gradient method:

$X_{new} = X_{old} - R \cdot \frac{dF}{dX}$  (4)
where

$\frac{dF}{dX} = \begin{bmatrix} \Delta F/\Delta x_1 \\ \vdots \\ \Delta F/\Delta x_n \end{bmatrix} = A'$

$X_{new}$ is the new position of the particle in column form, $X_{old}$ is the current position of the particle, $R$ is the step size, and $\Delta F/\Delta x_i$ is calculated numerically near $X_{old}$ by changing only $x_i$.
From (3):

$F_{old} = A X_{old} + b$  (5)

$F_{new} = A X_{new} + b$  (6)

Subtracting (6) from (5) and solving for $X_{new}$:

$X_{new} = -\frac{F_{old} - F_{new}}{A} + X_{old}$  (7)
where $F_{old}$ is the current fitness value and $F_{new}$ is the new fitness value.
From (4) and (7):

$R = \frac{F_{old} - F_{new}}{A \cdot \frac{dF}{dX}} = \frac{F_{old} - F_{new}}{\left(\frac{dF}{dX}\right)' \cdot \frac{dF}{dX}}$  (8)

$X_{new} = X_{old} - \frac{F_{old} - F_{new}}{\left(\frac{dF}{dX}\right)' \cdot \frac{dF}{dX}} \cdot \frac{dF}{dX}$  (9)
Let $F_i$ be the current fitness value of the particle and $F_t$ the target fitness of the particle (less than the gbest
value). It is convenient to divide the search into $N$ steps as follows:

$dist = F_i - F_t$  (10)

For each step:

$\Delta X = \frac{dist/N}{\left(\frac{dF}{dX}\right)' \cdot \frac{dF}{dX}} \cdot \frac{dF}{dX}$  (11)

$X_{new} = X_{old} - \Delta X$  (12)
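Equations (10) to (12) map directly onto a short numerical-gradient routine. The sketch below is one possible reading; the perturbation size eps and the guard against a zero gradient are added assumptions.

    import numpy as np

    def predictive_step(f, x_old, f_target, N=5, eps=1e-6):
        # One predictive move per (10)-(12): approach a target fitness using a
        # finite-difference gradient, covering 1/N of the remaining distance.
        f_old = f(x_old)
        dist = f_old - f_target                        # (10)
        dim = len(x_old)
        # dF/dX estimated numerically near x_old, changing one x_i at a time
        g = np.array([(f(x_old + eps * np.eye(dim)[i]) - f_old) / eps
                      for i in range(dim)])
        dX = (dist / N) / (g @ g + 1e-12) * g          # (11); guard avoids 0/0
        return x_old - dX                              # (12)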
The complete PPM algorithm, executed before a particle moves toward the GBP, is shown in Table 1. In addition, the Modified PSO (MPSO) and Modified TLBO (MTLBO) are shown in Table 2 and Table 3, respectively.
Table 1. Gradient algorithm
Set particle gradient parameters:
    F_t < gbest
    X_old = current position of the particle
    dist = F_i − F_t
    Initialize Vtemp = 0
Execute gradient algorithm:
    For N steps
        X_new = X_old − ΔX according to (12)
        X_new = max(X_new, xmin); X_new = min(X_new, xmax)
        If F(X_new) < F(X_old)
            Xtemp = X_new
            Vtemp = ΔX
        Else
            Xtemp = Xtemp − 2 * Vtemp
            Xtemp = max(Xtemp, xmin); Xtemp = min(Xtemp, xmax)
        End
    End
Update particle position:
    If F(Xtemp) < gbest
        x_i^{t+1} = Xtemp
        v_i^{t+1} = Vtemp
    End
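Read as code, the Table 1 procedure might look as follows, reusing predictive_step from the sketch above. Table 1 does not state whether the walk continues from an accepted point, so setting x_old = x_new below is an assumption.

    import numpy as np

    def ppm_gradient(f, x, f_target, gbest_val, xmin, xmax, N=5):
        # Sketch of the Table 1 gradient algorithm (one PPM look-around).
        x_old = x.copy()
        x_temp = x.copy()
        v_temp = np.zeros_like(x)
        for _ in range(N):
            x_new = np.clip(predictive_step(f, x_old, f_target, N), xmin, xmax)
            if f(x_new) < f(x_old):
                v_temp = x_old - x_new       # Vtemp = delta_X of this step
                x_temp = x_new
                x_old = x_new                # assumed: continue from new point
            else:
                x_temp = np.clip(x_temp - 2 * v_temp, xmin, xmax)  # back off
        if f(x_temp) < gbest_val:            # adopt the predicted point
            return x_temp, v_temp
        return x, np.zeros_like(x)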
Table 2. Modified PSO
For each particle
    Initialize particle
End
Choose the particle with the best fitness value of all the particles as the gbest
Do
    For each particle
        Update particle position according to
            v_i^{t+1} = w v_i^t + c2 × rand × (gbest − x_i^t)
            x_i^{t+1} = x_i^t + v_i^{t+1}
        Gradient algorithm as shown in Table 1
    End
    For each particle
        Calculate fitness value
        If the fitness value is better than the best fitness value (pbest) in history, set the current value as the new pbest
    End
    Choose the particle with the best fitness value of all the particles as the gbest
While maximum iterations or minimum error criterion is not attained
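A sketch of the Table 2 loop is given below; it omits the pbest bookkeeping for brevity, and the target fitness gbest_val − 0.001 is an arbitrary illustrative choice (the paper only requires F_t < gbest).

    import numpy as np

    def mpso(f, n_particles, dim, xmin, xmax, w=0.9, c2=2.0, iters=500, N=5):
        # Sketch of the Table 2 MPSO loop: gbest-only velocity update followed
        # by the Table 1 gradient step (ppm_gradient above) for each particle.
        X = xmin + np.random.rand(n_particles, dim) * (xmax - xmin)
        V = np.zeros_like(X)
        fit = np.array([f(p) for p in X])
        g = int(np.argmin(fit))
        gbest, gbest_val = X[g].copy(), fit[g]
        for _ in range(iters):
            for i in range(n_particles):
                V[i] = w * V[i] + c2 * np.random.rand(dim) * (gbest - X[i])
                X[i] = X[i] + V[i]
                # predictive look-around before the next gbest update
                x_p, v_p = ppm_gradient(f, X[i], gbest_val - 1e-3, gbest_val,
                                        xmin, xmax, N)
                if f(x_p) < gbest_val:
                    X[i], V[i] = x_p, v_p
            fit = np.array([f(p) for p in X])
            g = int(np.argmin(fit))
            if fit[g] < gbest_val:
                gbest, gbest_val = X[g].copy(), fit[g]
        return gbest, gbest_val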
Table 3. Modified TLBO
For each particle
    Initialize particle
End
Choose the particle with the best fitness value of all the particles as the gbest
Do
    1) Teacher phase
        TF = round[1 + rand(0, 1)]
        mean difference^t = r (gbest^t − TF M^t)
        For i = 1 : number of learners
            X_i^{t+1} = X_i^t + mean difference^t
            Gradient algorithm as shown in Table 1 for i
            Accept X_i^{t+1} if it gives a better function value.
        End
    2) Learner phase
        For i = 1 : number of learners
            Randomly select two learners X_i^t and X_j^t, where i ≠ j
            Gradient algorithm as shown in Table 1 for i and j
            If f(X_i^t) < f(X_j^t)
                X_i^{t+1} = X_i^t + rand(X_i^t − X_j^t)
            Else
                X_i^{t+1} = X_i^t + rand(X_j^t − X_i^t)
            End
            Gradient algorithm as shown in Table 1 for i
            Accept X_i^{t+1} if it gives a better function value.
        End
    Choose the particle with the best fitness value of all the particles as the gbest
While maximum iterations or minimum error criterion is not attained
5. EXPERIMENTAL RESULTS AND DISCUSSION
The standard PSO, PSOGSA, SCA, and TLBO, together with MPSO and MTLBO, with the parameters in
Table 4 [25-28], were executed for 30 independent runs over each benchmark function for statistical analysis.
As shown in Table 5, MPSO and MTLBO outperformed all of the other algorithms with regard to the quality
of the solutions for all functions. In contrast, the other algorithms produced poor results on certain functions
and accurate results on others. This finding reflects the efficient performance of MPSO and MTLBO in
comparison with the unmodified algorithms. In addition, Figure 4 to Figure 11 compare MPSO and MTLBO
against all the other algorithms in terms of the convergence rate of the fitness versus the iterations.
These figures show that MPSO and MTLBO outperform all the other unmodified algorithms
in terms of convergence speed while reaching an accurate solution.
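The per-function statistics can be reproduced with a few lines; run_statistics below is an illustrative helper, with the optimizer signature assumed to match the mpso sketch above.

    import numpy as np

    def run_statistics(optimizer, f, runs=30, **kwargs):
        # Best and average final fitness over independent runs, as in Table 5.
        # 'optimizer' is any routine returning (best_x, best_value).
        finals = np.array([optimizer(f, **kwargs)[1] for _ in range(runs)])
        return finals.min(), finals.mean()

    # Example (hypothetical sphere function):
    # run_statistics(mpso, lambda x: (x * x).sum(),
    #                n_particles=30, dim=30, xmin=-100, xmax=100)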
Table 4. Algorithm parameters
Algorithm   Parameters
PSO         C1 = C2 = 2, wdamp = 0.9
PSOGSA      G0 = 1, C1 = 0.5, C2 = 1.5
SCA         a = 2, r2 = (2*pi)*rand, r3 = 2*rand, r4 = rand
TLBO        TF = randi([1 2])
MPSO        C1 = C2 = 2, wdamp = 0.9, N = 5
MTLBO       G0 = 1, C1 = 0.5, C2 = 1.5, N = 5
MaxVelocity = 0.2*(VarMax − VarMin), MinVelocity = −MaxVelocity
Table 5. Benchmark functions (columns: Function, n, Range, PSO, PSOGSA, SCA, TLBO, MPSO, MTLBO)
Figure 4. Convergence rate curves for F1 to F3
Figure 5. Convergence rate curves for F4 to F6
Figure 6. Convergence rate curves for F7 to F9
Figure 7. Convergence rate curves for F10 to F12
Figure 8. Convergence rate curves for F13 to F15
Figure 9. Convergence rate curves for F16 to F18
Figure 10. Convergence rate curves for F19 to F21
Figure 11. Convergence rate curves for F22 to F23
6. CONCLUSION
In this paper, PPM was shown to provide powerful exploration. It is therefore natural to enhance
population-based algorithms, which have the advantage of powerful exploitation, by merging them with PPM.
Hence, the proposed modification improves exploration quality while maintaining fast convergence.
PPM was tested on standard mathematical benchmark functions, and the results
demonstrated improvement in both solution quality and convergence rate.
REFERENCES
[1] J. Kennedy and R. Eberhart, "Particle Swarm Optimization," IEEE Int. Conf. Neural Networks, vol. 4, pp. 1942-1948,
1995.
[2] I. C. Trelea, “The particle swarm optimization algorithm: Convergence analysis and parameter selection,” Inf.
Process. Lett., vol. 85, no. 6, pp. 317-325, 2003.
[3] Q. Bai, "Analysis of Particle Swarm Optimization Algorithm," Comput. Inf. Sci., vol. 3, no. 1, pp. 180-184, 2010.
[4] R. Poli, J. Kennedy, and T. Blackwell, “Particle swarm optimization: An overview,” Swarm Intell., vol. 1, no. 1,
pp. 33–57, 2007.
[5] F. Van Den Bergh and A. P. Engelbrecht, “A study of particle swarm optimization particle trajectories,” Inf. Sci.
(Ny)., vol. 176, no. 8, pp. 937–971, 2006.
[6] D. E. Goldberg, Genetic Algorithms in Search, Optimization, and Machine Learning, New York: Addison-Wesley, 1989.
[7] K. Deb and S. Agrawal, “Understanding interactions among genetic algorithm parameters,” Found. Genet.
Algorithms V, San Mateo, CA Morgan Kauffman, pp. 265–286, 1999.
[8] P. Pongcharoen, C. Hicks, P. M. Braiden, and D. J. Stewardson, “Determining optimum Genetic Algorithm
parameters for scheduling the manufacturing and assembly of complex products,” Int. J. Prod. Econ., vol. 78, no. 3,
pp. 311–322, 2002.
[9] A. H. Wright, “Genetic Algorithms for Real Parameter Optimization,” Foundations of Genetic Algorithms, vol 1,
pp. 205–218, 1991.
[10] R. Storn and K. Price, "Differential Evolution - A Simple and Efficient Heuristic for Global Optimization Over
Continuous Spaces," Journal of Global Optimization, vol. 11, no. 4, pp. 341-359, 1997.
[11] M. Dorigo, "Ant colony optimization," Scholarpedia, vol. 2, no. 3, p. 1461, 2007.
[12] E. Rashedi, H. Nezamabadi-Pour, and S. Saryazdi, “GSA: A Gravitational Search Algorithm,” Information
Sciences, vol. 179, no. 13, pp. 2232-2248, 2009.
[13] S. Mirjalili, “SCA: A Sine Cosine Algorithm for Solving Optimization Problems,” Knowledge-Based Syst., vol. 96,
pp. 120–133, 2016.
[14] A. I. Hafez, H. M. Zawbaa, E. Emary, and A. E. Hassanien, “Sine cosine optimization algorithm for feature
selection,” Proc. 2016 Int. Symp. Innov. Intell. Syst. Appl. INISTA 2016, 2016.
[15] M. Abd Elaziz, D. Oliva, and S. Xiong, “An improved Opposition-Based Sine Cosine Algorithm for global
optimization,” Expert Syst. Appl., vol. 90, pp. 484–500, 2017.
[16] S. Mirjalili and S. Z. M. Hashim, “A New Hybrid PSOGSA Algorithm for Function Optimization,” Proc. ICCIA
2010, Int. Conf. Comput. Inf. Appl, no. 1, pp. 374–377, 2010.
[17] M. Issa, A. E. Hassanien, D. Oliva, A. Helmi, I. Ziedan, and A. Alzohairy, “ASCA-PSO: Adaptive Sine Cosine
Optimization Algorithm Integrated with Particle Swarm for Pairwise Local Sequence Alignment,” Expert Syst.
Appl, vol. 99, pp. 56–70, 2018.
[18] R. V. Rao, V. J. Savsani, and D.P Vakharia, “Teaching–Learning-Based Optimization: A Novel Method for
Constrained Mechanical Design Optimization Problems,” Comput. Des., vol. 43, no. 3, pp. 303-315, 2011.
[19] R. V. Rao and V. Patel, “An improved teaching-learning-based optimization algorithm for solving unconstrained
optimization problems,” Sci. Iran., vol. 20, no. 3, pp. 710–720, 2013.
[20] K. Yu, X. Wang, and Z. Wang, “An improved teaching-learning-based optimization algorithm for numerical and
engineering optimization problems,” J. Intell. Manuf., vol. 27, no. 4, pp. 831–843, 2016.
[21] A. E. Eiben and C. A. Schippers, "On Evolutionary Exploration and Exploitation," Journal Fundamenta
Informaticae, vol. 35, no. 1-4, pp. 35–50, 1998.
[22] N. Benfenatki, “La Tuberculose multirésistante,” Rev. Med. Interne, vol. 30, pp. 268-272, 2009.
[23] G. A. F. Alfarisy, W. F. Mahmudy, and M. H. Natsir, “Good parameters for PSO in optimizing laying hen diet,”
International Journal of Electrical and Computer Engineering (IJECE), vol. 8, no. 4, pp. 2419–2432, 2018.
[24] W. R. Abdul-Adheem, “An enhanced Particle Swarm Optimization algorithm,” International Journal of Electrical
and Computer Engineering (IJECE), vol. 9, no. 6, pp. 4904–4907, 2019.
[25] D. B. Chen and C. X. Zhao, “Particle swarm optimization with adaptive population size and its application,” Appl.
Soft Comput. J., vol. 9, no. 1, pp. 39–48, 2009.
[26] G. S. Basheer, M. S. Ahmad, and A. Y. C. Tang, “Intelligent Information and Database Systems - Part II,” 7th
Asian Conf., vol. 7803, pp. 549–558, 2013.
[27] J. Gardezi, Handbook of Research on Machine Learning Innovations and Trends, 2017.
[28] A. Kaveh and T. Bakhshpoori, Metaheuristics: Outlines, MATLAB Codes and Examples, 2019.