Senior Learning JAYA With Powell’s Method and Incremental Population Strategy

The JAYA algorithm is a recently developed meta-heuristic algorithm that does not require algorithm-specific parameters. It is based on the idea that solutions should always move toward the best solution and away from the worst solution during the search. This paper proposes a JAYA variant (JAYA-SIP) with three improvements to the Original JAYA algorithm: it incorporates the senior learning strategy, the incremental population strategy, and Powell's local search method into JAYA. The improvements were tested with the IEEE Congress on Evolutionary Computation (CEC) benchmark set for 30 and 50 dimensions, and with the benchmark function set from a special issue of the Soft Computing journal (SOCO) for 500 and 1000 dimensions. In addition to the benchmark sets, the performance of JAYA-SIP was evaluated with nine CEC 2011 real-world test functions. The results of the proposed algorithm are compared with JAYA variants and some meta-heuristic algorithms. According to the experimental results and analysis, the proposed improvements increased the performance of the JAYA algorithm, and JAYA-SIP achieved better results than the other algorithms it was compared with.


I. INTRODUCTION
Meta-heuristic algorithms are the leading methods used in solving real-world and engineering problems. Meta-heuristic algorithms, i.e., high-level methodologies, are not dependent on the problem being solved, making them applicable to a large number of problem types [1], [2]. The main reason for this is that when traditional methods are employed to solve a problem, the problem becomes more complex as its scale increases, necessitating a considerable amount of processing power and time [3], [4], [5], [6], [7], [8]. Meta-heuristic algorithms reduce the time it takes to identify the best solution or the most suitable one. This has led to an increased interest in these algorithms and the development of a significant number of meta-heuristic algorithms over the past 10 years. Examples of the meta-heuristic algorithms that researchers are deeply interested in are Particle Swarm Optimization (PSO) [9], Ant Colony Optimization (ACO) [10], and Artificial Bee Colony (ABC) [11].
The JAYA algorithm has attracted considerable attention due to its simple structure and the absence of algorithm-dependent parameters. JAYA, by its very nature, has the ability to converge quickly. It does, however, have the problem of getting stuck in local optima. To eliminate this shortcoming, researchers have made a number of changes to the algorithm.
In this study, a JAYA variant algorithm called JAYA-SIP is proposed with three improvements in total, namely senior learning, incremental social learning, and Powell's local search method. These three improvements were implemented in order to prevent issues such as getting stuck in local optima and the early convergence of the JAYA algorithm. The behavior of this algorithm on the CEC 2014 and large-scale test functions was investigated and compared with JAYA variants and current meta-heuristic algorithms. JAYA-SIP includes Powell's local search method, which is often used in combination with various meta-heuristic algorithms. For example, [23] tried to improve the exploitation behavior of the ABC algorithm by combining it with Powell's method and used it to solve 22 unconstrained and 13 constrained functions. In another study, the Grey Wolf Optimizer algorithm was extended with Powell's local search method, and the resulting variant, named PGWO, was used for data clustering [24]. [25] solved the minimum energy broadcast (MEB) problem in wireless sensor networks with a variant of the Flower Pollination Algorithm based on Powell's method.
JAYA-SIP uses the Incremental Social Learning (ISL) mechanism as its incremental population strategy [26]. A number of other meta-heuristic algorithms in the literature also use this method. De Oca et al. included the ISL method in the PSO algorithm [27]. Liao et al. brought together the ACO algorithm and ISL [28]. Özyön, Yaşar, and Temurtaş included the ISL method in the gravitational search algorithm (GSA) and used it to solve high-dimensional problems [29]. Özyön and Aydin used IABC, a variant of the Artificial Bee Colony (ABC) algorithm with an ISL mechanism, to solve the economic power dispatch problem with prohibited operating zones [30]. Aydın, Yavuz, and Stützle proposed a generalized, configurable ABC framework with an ISL mechanism [31]. In addition, two other studies proposed an ABC variant by including ISL in the ABC algorithm [32], [33].
The contributions of this work are summarized as follows:
• A JAYA variant known as JAYA-SIP is proposed.
• The senior learning mechanism is incorporated into the proposed algorithm.
• CEC 2014, SOCO, and Real World Problems are solved with JAYA-SIP.
• The ISL strategy and Powell's Method have been applied to the JAYA algorithm.
This study focuses on three proposed improvements for the Original JAYA algorithm. Section II briefly mentions the previous studies in the literature. Section III presents the general structure of the Original JAYA algorithm. In Section IV, the proposed improvements to the JAYA algorithm and the details of the proposed JAYA variant algorithm are given. Section V deals with the experimental environment, the results obtained by the algorithms included in the experiments, and their comparisons. Finally, Section VI summarizes the study and outlines the results.

II. RELATED WORKS
Many improvements have been made to the JAYA algorithm in the literature. These can be categorized as follows:
• Using learning methods to increase population diversity
• Improvement of the solution update equation
• Hybridization of the original JAYA algorithm with other techniques
In the first category, researchers used a variety of learning methods to diversify the population and prevent the algorithm from getting stuck in local optima. To determine photovoltaic model parameters, Yu et al. used a JAYA variant with experience-based and chaotic elite learning methods [34]. Rao and Rai proposed a JAYA variant using the quasi-oppositional-based learning method to increase the JAYA algorithm's population diversity [35]. Wang and Huang [36] integrated the elite opposition-based learning mechanism into the JAYA algorithm. This mechanism is based on identifying effective solutions close to the global optimum [36]. A study by X. Yang and Gong improved upon their proposed Enhanced JAYA (EJAYA) algorithm with a strategy called generalized opposition-based learning [37]. Alawad and Abed-alguni proposed a variant using three mutation methods for position updating and Refraction Learning as an initialization method for solving discrete real-world problems [38].
The solution update equation is the most important factor influencing the power of the JAYA algorithm. Various alternatives to the JAYA solution update equation have been proposed in the second category. Ingle and Jatoth proposed a position update equation based on Levy Flight to prevent the JAYA algorithm from remaining stuck in local optima [39]. Leghari et al. added a weight parameter to the position update equation and sought to determine its value adaptively [40]. Luu and Nguyen combined the JAYA algorithm's solution update step with the DE operator [41]. Jian and Weng used a JAYA variant with chaotic map-based, multiple solution update equations to determine the parameters of photovoltaic cells [42]. X. Yang and Gong added the improved solution update equation to the EJAYA algorithm alongside the learning method [37]. Rao and Keesari created the MTPG-Jaya algorithm, which uses multiple teams to search the solution space [43]. These teams use various equations to search the same population. Farah and Belazi, meanwhile, incorporated three new mutation strategies based on chaotic maps into the algorithm to improve JAYA's performance and tested them on 16 benchmark functions [44].
Aside from the improvements made in the first two categories, the third category of JAYA algorithm improvement is the combination of the JAYA algorithm with other techniques. For example, Gholami, Olfat, and Gholami combined the crow search algorithm, which excels at global search, and the JAYA algorithm, which excels at local search, and tested it on 20 benchmark functions [45]. Alotaibi combined the firefly and JAYA algorithms to prevent getting stuck in local optima [46]. Goudos et al. combined the Grey Wolf and JAYA algorithms and applied them to two different antenna designs [47]. Xiong et al. used the Differential Evolution algorithm in combination with JAYA to determine the parameters of the solid oxide fuel cell model [48]. Kaur, Sharma, and Mishra proposed the JAYA-Bat algorithm as a solution for reducing cognitive radio network power consumption [49]. Kumar and Yadav combined the Teaching Learning Based Optimization algorithm with JAYA [50], and Azizi et al. brought together the ant lion optimizer algorithm with JAYA [51]. Tefek and Beşkirli combined the JAYA algorithm with a method known as Elite Local Search, without affecting the JAYA algorithm's general structure, and used it to solve optimization problems [52]. Gupta, Kumar, and Srivastava combined JAYA with Powell's method and proposed three JAYA variants for solving the optimal power flow problem with a distributed generating unit [53]. Aslan, Gunduz, and Kiran hybridized their JAYA variant algorithm with a local search module and used it to solve binary optimization problems [54].
Although the JAYA algorithm was introduced as recently as 2016, it has already been used extensively and applied to various types of problems [55], [56], [57]. Premkumar et al. developed a variant of JAYA that uses a chaotic mapping to determine photovoltaic cell parameters [58]. Luu and Nguyen proposed a new variant called Modified JAYA to identify solar cell parameters [41]. Gunduz and Aslan created a variation dubbed DJAYA by making two adjustments to JAYA and utilized it to solve the traveling salesman problem, a discrete problem type [3]. Aslan, Gunduz, and Kiran developed a JAYA variant for binary optimization by combining the JAYA algorithm with the logic exclusive-or operator [54]. Liu et al. proposed a JAYA variant for short-term forecasting of wind speed with the support vector machine (SVM) based JAYA-SVM and tested the success of their algorithm against seven different methods [59]. Warid developed the adaptive multiple teams perturbation-guiding Jaya (AMTPG-Jaya) for single-objective optimal power flow (OPF) forms [60]. Degertekin, Lamberti, and Ugur proposed the Discrete Advanced JAYA (DAJA) algorithm for the optimization of truss structures under stress and displacement constraints, a discrete optimization problem that is difficult to solve [61]. Rao and Keesari optimized wind farm layout problems with the JAYA algorithm variant they developed [43]. Rao and Saroj minimized the total annual cost in shell-and-tube heat exchanger design by using a variant referred to as Elitist-JAYA [62]. Thirumoorthy and Muneeswaran used a hybrid JAYA variant to solve the text clustering problem and compared it with some known meta-heuristic algorithms [63]. Chaudhuri and Sahu proposed a hybrid filter-wrapper approach based on JAYA and tested it on 10 micro-array datasets [64].

III. THE ORIGINAL JAYA ALGORITHM
The Original JAYA is a population-based meta-heuristic algorithm created by Venkata Rao in 2016 [22]. It takes its name from the Sanskrit word JAYA, which means victory. The algorithm has a straightforward structure and only requires general control parameters (the number of function calls (FES) and the population size (NP)), rather than algorithm-specific control parameters.
Like other population-based algorithms, the Original JAYA has a population of solutions randomly distributed in the search space. The basic logic of the algorithm is based on moving a solution in the population away from the worst solution and toward the best solution in the population. The pseudo-code of the Original JAYA is given in Algorithm 1.
The Original JAYA algorithm begins its execution by generating random solutions. The positions of all solutions in the population are updated by Equation 1 until the execution budget is exhausted. Thus, it is ensured that all solutions avoid the worst solution ($X^t_{wo,j}$) and approach the current best solution ($X^t_{best,j}$).

$X^{t+1}_{i,j} = X^t_{i,j} + r1 \, (X^t_{best,j} - |X^t_{i,j}|) - r2 \, (X^t_{wo,j} - |X^t_{i,j}|)$    (1)

In the equation, $X^t_{i,j}$ represents the j-th dimension of the i-th solution at iteration t. $X^t_{best,j}$ and $X^t_{wo,j}$ denote the j-th dimension of the best and the worst solution at iteration t, respectively. Finally, r1 and r2 are numbers randomly selected from the range [0, 1].

Algorithm 1 The Original JAYA Algorithm
1: Determine population size (NP)
2: Populate the population P(i = 1, 2, · · · , n) with random solutions
3: t ← 1 // initialize iteration
4: while the termination criteria are not met do
5:   Find the current best and worst solutions of the population
6:   for i = 1 to NP do
7:     Update the position of the current solution with Equation 1 using the information of the best and worst solutions
8:     Keep the better of the old and new solutions (greedy selection)
9:   t ← t + 1
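To make the loop concrete, the following is a minimal Python sketch of the Original JAYA iteration under Equation 1. The bound clipping and greedy selection follow common practice for JAYA implementations; the function and parameter names are illustrative, not the paper's own code.

```python
import numpy as np

def jaya(f, lb, ub, pop_size=30, max_fes=30000, seed=0):
    """Minimal sketch of the Original JAYA loop (Equation 1)."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    pop = rng.uniform(lb, ub, size=(pop_size, lb.size))  # random initial population
    fit = np.apply_along_axis(f, 1, pop)
    fes = pop_size
    while fes < max_fes:
        best = pop[np.argmin(fit)]    # X_best of this iteration
        worst = pop[np.argmax(fit)]   # X_wo of this iteration
        for i in range(pop_size):
            r1 = rng.random(lb.size)
            r2 = rng.random(lb.size)
            # Equation 1: move toward the best, away from the worst
            cand = pop[i] + r1 * (best - np.abs(pop[i])) \
                          - r2 * (worst - np.abs(pop[i]))
            cand = np.clip(cand, lb, ub)  # keep within bounds (assumption)
            f_cand = f(cand)
            fes += 1
            if f_cand < fit[i]:           # greedy selection
                pop[i], fit[i] = cand, f_cand
    i_best = np.argmin(fit)
    return pop[i_best], fit[i_best]

# Example: minimize the sphere function in 10 dimensions
if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x * x))
    x, fx = jaya(sphere, lb=[-100] * 10, ub=[100] * 10)
    print(fx)
```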

IV. SENIOR LEARNING JAYA WITH POWELL's METHOD AND INCREMENTAL POPULATION STRATEGY
The proposed algorithm and its components will be described in detail in this section.

A. SENIOR LEARNING STRATEGY
The exploration and exploitation capabilities of the Original JAYA algorithm are based on its position update Equation (Equation 1) which is described in Section III. Although JAYA has a powerful position update equation, it can at times be insufficient in complex problem types [52], [65].
In order to eliminate this deficiency, the JAYA-SIP algorithm incorporates a learning method called senior learning into the JAYA algorithm. In this approach, Equation 2 replaces JAYA's position update equation, and an algorithmic component called the Senior Pool (SE) is added to the algorithm as a result of this equation.
In the equation, $X^t_{i,j}$ represents the j-th dimension of the i-th solution at iteration t. $X^t_{best,j}$ and $SE_{ra,j}$ denote the j-th dimension of the best solution and of a randomly selected solution from the SE pool at iteration t, respectively. Finally, r1 and r2 are numbers randomly selected from the range [0, 1].
In JAYA, the positions of the solutions are updated with Equation 1 during the iterations. After each update operation, if the objective function value of the new solution is better than that of the existing solution, the current solution is replaced with the new solution, and the information of the old solution is discarded. In the original JAYA, this can lead to the loss of previous knowledge and experience about the scanned region of the search space, and thus to these regions being searched again in subsequent iterations. In the proposed algorithm, after the current solution's position update, if the objective value of the new solution is better than that of the old solution, the old solution is replaced with the new solution, but the information of the old solution is not discarded. Instead, this solution is labeled a ''Senior'' and saved in a ''Senior Pool (SE)''. Thus, this pool preserves the previous search space experience.
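Since the extracted text does not show Equation 2 itself, the sketch below assumes the natural form implied by the surrounding definitions: the worst-solution term of Equation 1 is replaced by a randomly selected Senior $SE_{ra}$. Treat that equation, and all helper names, as illustrative assumptions rather than the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng()

def senior_update(x, best, seniors, lb, ub):
    """One senior-learning position update.

    Assumed form of Equation 2 (not shown in the extracted text):
        X_new = X + r1 * (X_best - |X|) - r2 * (SE_ra - |X|)
    where SE_ra is a randomly chosen member of the Senior pool.
    """
    se_ra = seniors[rng.integers(len(seniors))]
    r1, r2 = rng.random(x.size), rng.random(x.size)
    cand = x + r1 * (best - np.abs(x)) - r2 * (se_ra - np.abs(x))
    return np.clip(cand, lb, ub)

def select_and_archive(x, f_x, cand, f_cand, pool):
    """Greedy selection that archives the replaced solution as a Senior."""
    if f_cand < f_x:
        pool.append((f_x, x.copy()))  # the old solution joins the SE pool
        return cand, f_cand
    return x, f_x
```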
Solutions are added to the Senior pool throughout the iterations, and as a result, the pool can grow very large. This causes bad solutions to fill the pool and the algorithm to waste execution time on them. To avoid this, the SE pool size is reduced at a certain rate in each iteration. For this purpose, a parameter called ratio (ratio = 0.10) is used, and the new pool size is determined using this parameter. After the new size is determined, the solutions in the pool are ranked according to their objective function values, and the solutions with the worst objective function values are discarded from the pool until the size of the SE reaches the newly calculated pool size. This prevents the SE from growing too large and from keeping bad solutions. In addition, a minimum value (Minimum Pool Size, MPS = 3) is defined for the pool size; the size of the SE pool cannot fall below this value. Thus, it is ensured that the algorithm always works with at least a small number of Senior solutions.
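A small sketch of the trimming rule just described, with ratio = 0.10 and MPS = 3; the tuple-based pool layout matches the `select_and_archive` helper above and is an assumption.

```python
def trim_senior_pool(pool, pop_size, ratio=0.10, mps=3):
    """Shrink the Senior pool to NP * ratio entries, but never below MPS.

    pool is a list of (objective_value, solution) tuples.
    """
    target = max(int(pop_size * ratio), mps)
    pool.sort(key=lambda entry: entry[0])  # best objective values first
    del pool[target:]                      # discard the worst seniors
```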

B. POWELL's METHOD
In order to strengthen the exploration ability of the JAYA-SIP algorithm, this study adds a local search method to the JAYA algorithm. For this purpose, Powell's conjugate directions method [67] was preferred, a well-known technique that has been hybridized with many meta-heuristic algorithms [23], [24], [25] and that uses Brent's technique [66] for its line minimizations. The key reasons for this choice are that it provides good performance and that it is easy to hybridize with meta-heuristic algorithms.
In the proposed algorithm, Powell's method is not called in every iteration; the aim is to keep the execution budget from being spent unnecessarily. According to Algorithm 2, JAYA-SIP calls Powell's method when it determines that the search process has stagnated. This is tracked by a counter denoted sg, whose value is initially set to 0. When the position of a solution is updated (Equation 2) and the objective function value of the new solution is worse than that of the current solution, the value of sg is increased by one. When sg reaches the ''stagnation factor'' value specified at the beginning of the algorithm, JAYA-SIP calls Powell's local search method and sg is reset. The solution with the current best objective value is used as the starting point for Powell's local search method.
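The following sketch shows one way to wire the sg counter to a Powell call. SciPy's Powell implementation is used here as a stand-in for the Brent-based routine the paper describes, and the maxfev cap protecting the FES budget, like the helper names, is an assumption.

```python
from scipy.optimize import minimize

def maybe_run_powell(f, best_x, bounds, sg, stagnation, fes_left):
    """Run Powell's local search only when the sg counter fires.

    Returns the (possibly improved) solution and the reset sg counter.
    """
    if sg < stagnation:
        return best_x, sg
    res = minimize(f, best_x, method="Powell", bounds=bounds,
                   options={"maxfev": min(1000, fes_left)})
    return res.x, 0  # sg is reset after the local search
```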

C. INCREMENTAL SOCIAL LEARNING STRATEGY
The final improvement of the proposed algorithm is the incremental population method. For this method, the Incremental Social Learning (ISL) mechanism has been selected [27]. ISL tries to increase the diversity of the population by adding new solutions to the existing population and thereby to avoid getting stuck in local optima. A growth period, selected at the start of the algorithm and called growthperiod, determines how often a new solution is added: after every growthperiod iterations, a new solution joins the population. Unlike the random generation method used during the initialization step, the new solution is created using Equation 3 and is produced close to the best solution in the population. The logic here is that there could be superior solutions around the best solution. The ISL process continues until the population reaches its maximum size (NP_max).

$X^{t+1}_{i,j} = X^t_{best,j} + ra(0, 1) \times (X^t_{best,j} - ra(X^l, X^u))$    (3)

In the equation, $X^t_{best,j}$ represents the j-th dimension of the best solution of the current population, and $X^l$ and $X^u$ represent the minimum and maximum bounds of the solved problem. $ra(0, 1)$ represents a random number generated between 0 and 1, and $X^{t+1}_{i,j}$ is the j-th dimension of the newly generated solution.
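A sketch of the Equation 3 generation step follows. The printed equation is ambiguous about $ra(X^l, X^u)$; it is read here as a uniformly random point between the problem bounds, and that reading, like the helper name, is an assumption.

```python
import numpy as np

rng = np.random.default_rng()

def isl_new_solution(best, lb, ub):
    """Generate one ISL solution near the current best (Equation 3)."""
    ra_bounds = rng.uniform(lb, ub)                   # ra(X^l, X^u), assumed reading
    cand = best + rng.random(best.size) * (best - ra_bounds)
    return np.clip(cand, lb, ub)

# Growth schedule: one new solution every `growthperiod` iterations,
# until the population reaches NP_max.
```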

V. EXPERIMENTS
The experimental environment and conditions designed to test the performance of the proposed JAYA-SIP algorithm will be explained in this section, followed by the outcomes of the algorithms in the experiments.
A. EXPERIMENTAL SETUP
The first experiments conducted are the CEC 2014 benchmark set experiments. CEC 2014 is a benchmark set of 30 test functions comprising unimodal, multimodal, hybrid, and composition functions. The mean error value (f(x) − f(x*), where f(x) is the objective function value obtained by the algorithm and f(x*) is the global optimum value of the test function) is used in the comparison. The error values obtained by following the rules of [68] were accepted as 10^−8 when they were less than 10^−8.

Algorithm 2 Pseudo-Code of JAYA-SIP
1: Determine population size (NP)
2: Set the problem's dimensions (D)
3: t ← 1, flag_i ← 0
4: stagnation, ratio ← 0.10, MPS ← 3
5: Populate the population P(i = 1, 2, · · · , n) with random solutions
6: SE ← ∅ // Create the Senior pool (SE)
7: while FES ≤ MaxFES do
8:   for i = 1 to NP do
9:     for j = 1 to D do
10:      Update X^t_{i,j} solution with Equation 2
11:    if the new solution is better than the old solution then
12:      Put the old solution in the Senior pool
13:      flag_i ← 0
14:      sg ← 0
15:    else
16:      flag_i ← flag_i + 1
17:      sg ← sg + 1
18:    i ← i + 1
19:  if sg ≥ stagnation then
20:    Determine the current best solution of the population
21:    Use the current best solution of the population as the starting point and apply Powell's method
22:    sg ← 0
23:  Apply the increasing population method (ISL):
24:  if t > 0 and t % growthperiod == 0 and NP < NP_max then
25:    P ← P ∪ {X^{t+1}} (the new solution generated with Equation 3)
26:  poolsize ← NP × ratio // set pool size
27:  Sort solutions in the Senior Pool (SE) by objective function values
28:  Discard the worst solutions from the pool until the Senior pool size equals poolsize
The second part of the experiments was carried out on large-scale test functions. For this purpose, the SOCO benchmark set prepared for the Large Scale Optimization special issue of the Soft Computing journal was used. This benchmark set, shown in Table 3, includes 19 functions, 4 separable and 15 non-separable. The SOCO experiments were carried out following the operating rules specified in [69]. Accordingly, the algorithms were run independently 25 times for each test function, and the execution budget is equal to D × 5000 function calls. The algorithms were compared by taking the median of the obtained error values (f(x) − f(x*)). The error values obtained by following the rules of [69] were accepted as 10^−14 when they were less than 10^−14.

The performance of the JAYA-SIP algorithm was tested using real-world engineering problems in the third part of the experiments. Nine test functions were selected from the CEC 2011 benchmark set. The experiments were carried out in accordance with the conditions specified in [70]. The JAYA-SIP results were compared with the JAYA variants.
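The reporting rule used for both benchmark sets amounts to clipping each error value at the set's floor, as the small sketch below illustrates; the floor values come from the rules cited above, while the function itself is illustrative.

```python
def reported_error(f_x, f_star, floor):
    """Clip an error value at the benchmark floor before reporting.

    floor is 1e-8 for CEC 2014 (rules of [68]) and 1e-14 for SOCO
    (rules of [69]); values below the floor count as the floor.
    """
    return max(f_x - f_star, floor)
```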
All of the CEC 2014, SOCO, and real-world problem experiments were conducted on an Ubuntu Linux computer with an AMD Ryzen 5 1600 processor and 16 GB of RAM.
In order to conduct the experiments fairly, the control parameters of JAYA-SIP and of the algorithms included in the experiments were configured using the irace tool [74], a parameter configuration tool. For parameter configuration, the 10-dimensional CEC 2014 functions were used as the training set, and the execution budget was selected as D × 5000. The parameters of the algorithms, their range values, and the parameter values configured with irace are listed in Table 1.

B. EXAMINING THE EFFECTS OF THE PROPOSED MODIFICATIONS
This section examines the contributions of the proposed improvements to the Original JAYA. For this, each improvement was added to the JAYA algorithm separately and run on the CEC 2014 benchmark set for 30 and 50 dimensions. The proposed algorithm (JAYA-SIP) was compared with JAYA+ISL, which is the ISL-added version of JAYA; JAYA+Powell, which is Powell's method added to JAYA; and JAYA+SL, which is JAYA with the Senior learning method included. The mean error values obtained by the algorithms are presented in Table 4 for 30 dimensions and in Table 5 for 50 dimensions. At the bottom of the tables, the results of the pairwise comparisons between the improvements and JAYA-SIP are given: ''Win'' means the JAYA-SIP algorithm wins, ''Lost'' means the compared improvement wins, and ''Draw'' means the comparison is a draw. The Rank row shows the average ranking value obtained by the algorithms over the 30 problems.
First, JAYA-SIP and the improvements on JAYA were compared for 30 dimensions in Table 4. Accordingly, it appears that JAYA-SIP, which contains all the improvements in its structure, showed great success. While JAYA-SIP achieved lower error values than JAYA+ISL in 28 of the 30 problems, JAYA+ISL achieved smaller values in 2 of them. In addition, the suggested method surpassed JAYA+SL in 25 problems while falling short in 5 others. When compared with the remaining improvement, the local search integration JAYA+Powell, JAYA-SIP also held the overall advantage.

Afterwards, the effects of the proposed improvements on the JAYA algorithm when the dimension is increased were examined. Accordingly, when Table 5, which contains the 50 dimension results, was analyzed, it was discovered that the ordering of the 30 dimension results remained constant as the dimension was increased. It can be said that JAYA-SIP achieved better results than the other algorithms. The proposed algorithm had 30 wins against JAYA+ISL; 26 wins and 4 losses against JAYA+SL; and finally, 20 wins, 8 losses, and 2 draws against JAYA+Powell.
When the suggested improvements are examined among themselves, leaving the JAYA-SIP results aside, it can be said that Powell's local search method outperformed the others. However, no single improvement achieved better results than JAYA-SIP.

C. CEC 2014 TEST RESULTS
CEC 2014 experiments were conducted in two phases. The performance of the JAYA-SIP algorithm was first compared with the Original JAYA algorithm and its improved current variants. After that, it was compared with some meta-heuristic algorithms from the literature. Experiments were conducted for 30 and 50 dimensions.
The proposed algorithm and the algorithms included in the experiments were compared in pairs with Wilcoxon's rank-sum test at the 0.05 significance level. The Wilcoxon rank-sum test results of the algorithms compared with JAYA-SIP are given in Tables 7 and 9. The ≈ sign is used when the difference in the results is not significant, the + sign is used if JAYA-SIP is significantly better, and the − sign if JAYA-SIP is significantly worse.
In addition, summary information of statistical test results is given in Draw, Win, and Lost lines below the tables. Moreover, the average ranking value obtained by the algorithms is also included in the rank row.
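The paper does not state which software computed these tests; as an illustration, the +/−/≈ labels can be reproduced with SciPy's rank-sum test as in the sketch below, where orienting the sign by comparing medians is an assumption.

```python
import numpy as np
from scipy.stats import ranksums

def significance_sign(errs_sip, errs_other, alpha=0.05):
    """Label a pairwise comparison as '+', '-', or '≈'.

    '+' : JAYA-SIP significantly better, '-' : significantly worse,
    '≈' : no significant difference at the given alpha.
    """
    _, p = ranksums(errs_sip, errs_other)
    if p >= alpha:
        return "≈"
    return "+" if np.median(errs_sip) < np.median(errs_other) else "-"
```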

1) EXPERIMENTS WITH JAYA VARIANTS
The 30 and 50 dimension results of JAYA-SIP and the JAYA variants are presented in Table 6 and Table 8. According to the 30 dimension results in Table 6, the proposed algorithm obtained smaller objective function values than JAYA in 24 test functions, than CL-JAYA in 20, than LJA and MJA in 27, and than LW-IJAYA in 26.
According to the 50 dimensions results of JAYA variants in Table 8, JAYA-SIP obtained the smallest error value in all unimodal functions. In multi-modal functions, the proposed algorithm was the best algorithm in 10 of the test functions. On the other hand, in hybrids, JAYA-SIP took first place in 4 functions. Finally, while it was the first algorithm in three of the composite functions, it came in second place in two others, resulting in competitive outcomes.
In addition to the 30 and 50 dimension results, the JAYA-SIP algorithm's convergence speed was also investigated. From the CEC 2014 benchmark set, one unimodal, one multimodal, one hybrid, and one composite test function were selected, and plots for 30 and 50 dimensions were created. The convergence curves obtained are shown in Figure 1. When the figures are compared, it is seen that the JAYA-SIP algorithm converges faster than the other JAYA variants.

2) EXPERIMENTS WITH OTHER META-HEURISTICS
The results of JAYA-SIP are compared with some current meta-heuristic algorithms in this section. Controlled restart in differential evolution (b6e6rl) [75] (the results of the b6e6rl algorithm are taken from the original article), TSA, GWO, and SCA were the meta-heuristic algorithms employed in the comparison. Table 10 shows the 30 dimension results, while the 50 dimension results are presented in Table 12.
Based on Table 10, it can be said that JAYA-SIP achieved better results than the other algorithms. While it obtained better results than TSA, one of the recent meta-heuristic algorithms, in 19 of the 30 functions, TSA achieved better results in 9 of them, and the algorithms had similar results in 2 of them. JAYA-SIP had 19 wins and 8 losses against GWO, one of the popular meta-heuristic algorithms, while the algorithms achieved the same results in 3 functions. While JAYA-SIP was ahead of SCA in 20 functions, it was behind in 9 of them; in one function, the result was a draw. The JAYA-SIP algorithm achieved 22 wins, 5 losses, and 3 draws against b6e6rl. Based on the 30 and 50 dimension results in the tables, JAYA-SIP was the algorithm with the lowest average rank over the 30 functions compared to the other meta-heuristic algorithms. It was able to examine the search spaces of the functions in the benchmark set more effectively. This is due to the improvements in JAYA-SIP discussed earlier, which strengthened the algorithm's exploration and exploitation abilities.

D. SOCO LARGE SCALE EXPERIMENTS
In this section, the scalability behavior of the JAYA-SIP algorithm in large scale problems is examined. For this purpose, the SOCO large scale benchmark set was solved for 500 dimensions and 1000 dimensions. The median error values of the algorithms for 500 dimensions are presented in Table 14 and for 1000 dimensions in Table 15.
Wilcoxon's rank-sum test was used to compare the proposed algorithm and the other algorithms in pairs at the 0.05 significance level. The statistical results are given in the p-value line at the bottom of the tables. The ≈ sign is used if the difference in the results is not significant, the + sign if the results of JAYA-SIP are significantly better, and the − sign if the results of JAYA-SIP are worse. Furthermore, the numbers of wins, losses, and draws against the algorithm with which JAYA-SIP was compared are given in the Draw, Win, and Lost lines at the bottom of the tables.
When the 500 dimension results in Table 14 are examined, it is seen that the JAYA-SIP algorithm showed superior performance against the other algorithms it was compared with. JAYA-SIP won in all 19 functions of the SOCO benchmark set against CLJAYA, LJA, LW-IJAYA, MJA, SCA, and TSA. It lost in only one test function against each of GWO and JAYA. Considering the statistical tests performed, it can be said that the results of the proposed algorithm were significantly better.
When the problem size was increased to 1000, JAYA-SIP once again outperformed the other algorithms, as seen in Table 15. In the 1000 dimension median results, it was ahead of the CLJAYA, GWO, JAYA, LW-IJAYA, and SCA algorithms in 18 test functions and obtained worse results than these algorithms in one function. The proposed algorithm had 19 wins against each of LJA, MJA, and TSA.
SOCO test results showed that JAYA-SIP maintains its performance even as the problem size increases. This shows that the scalability aspect of the algorithm is strong.

E. REAL WORLD EXPERIMENTS
The proposed algorithm's performance has also been evaluated using real-world problems. Real-World Optimization Problems from the CEC 2011 Competition were preferred for this [70]. This benchmark set includes problems from Communication, Chemistry, Economics, and Astronomy. In the experiments, nine test functions from this benchmark set were used, and Table 16 provides a summary of their information. More detailed information about the problems can be found in [70].
Experiments with the real-world problems were conducted by following the rules stated in [70]. Accordingly, each algorithm was independently run 25 times for every test function, up to 150,000 function evaluations (FES) per run, using the parameters listed in Table 1. Table 17 displays the results, including the mean error values and standard deviations.
The real-world problem results of the JAYA-SIP algorithm are compared with those of the JAYA variant algorithms. The proposed algorithm performed better than the other JAYA variants and had an average ranking value of 2.44. Thus, the JAYA-SIP algorithm showed the same performance on the real-world benchmark set as on CEC 2014 and SOCO.

F. ALGORITHM COMPLEXITY
The computational complexity analysis of the JAYA-SIP algorithm is performed in this section, and the obtained results are compared with the Original JAYA and the JAYA variants. For comparison, the method in the problem definition of the CEC 2014 benchmark set was used [68]. First, the running time T0 of a standard test program is measured. The T1 value is then calculated using 200000 function calls of the 18th test function from the CEC 2014 benchmark set. The algorithms then solve the 18th function with 200000 function evaluations to determine the T2 time value. Finally, the T̂2 variable is determined by averaging the T2 times over five runs of the algorithms. The values obtained as a result of these processes are provided in Table 18 for 30 dimensions and Table 19 for 50 dimensions.
A smaller T̂2 value in Tables 18 and 19 indicates that the algorithm needs less computational time. In addition, the (T̂2 − T1)/T0 ratio is calculated in these tables; this ratio is also used to express the complexity of the algorithms, and a smaller value indicates a better algorithm. The JAYA-SIP algorithm has less computational complexity than the algorithms with which it is compared, with ratios of 9.7853 for 30 dimensions and 13.4093 for 50 dimensions.
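Under the CEC 2014 procedure described above, the reported ratio can be computed as in this sketch; the timing helper and parameter names are illustrative.

```python
import time

def timed(run):
    """Wall-clock time of one call to a zero-argument runner."""
    start = time.perf_counter()
    run()
    return time.perf_counter() - start

def complexity_ratio(t0, t1, run_algorithm, runs=5):
    """(T2_hat - T1) / T0 as defined by the CEC 2014 procedure.

    t0: time of the standard benchmark loop; t1: time of 200000
    evaluations of f18; run_algorithm: one complete run of the
    algorithm on f18 with a 200000-FES budget.
    """
    t2_hat = sum(timed(run_algorithm) for _ in range(runs)) / runs
    return (t2_hat - t1) / t0
```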

VI. CONCLUSION
In this study, three improvements were proposed for the Original JAYA algorithm. The algorithm was enhanced with senior learning, an incremental population strategy, and Powell's method, resulting in a powerful JAYA variant (JAYA-SIP). The performance of the proposed JAYA variant was tested with the CEC 2014 benchmark set for low dimensions, the SOCO set for large-scale problems, and nine CEC 2011 problems for real-world problems. The results of the algorithm were compared with JAYA variants and various recent meta-heuristic algorithms. According to the experimental results, JAYA-SIP obtained better objective function values than the algorithms with which it was compared.
A future study consideration is the application of the proposed algorithm to binary optimization problems and expensive problems.