A Self-Adaption Butterfly Optimization Algorithm for Numerical Optimization Problems

To address the poor exploration and parameter complexity of the butterfly optimization algorithm (BOA), an improved butterfly optimization algorithm based on a self-adaption method (SABOA) is proposed to substantially enhance searching accuracy and iteration capability. SABOA has the advantages of fewer parameters, a simple algorithm structure, and strong precision. First, a new fragrance coefficient is added to the basic butterfly optimization algorithm. Then, new iteration and updating strategies are introduced in the global and local searching phases. Finally, this paper tests different optimization problems, including low- and high-dimensional functions and constrained problems, and compares the obtained results with those of other well-known algorithms; various statistical figures are also drawn to comprehensively analyze the searching performance of the proposed algorithm. The experimental results show that SABOA requires fewer function evaluations than the other considered algorithms, which illustrates that SABOA has great searching balance, large exploration, and high iterative speed.

The butterfly optimization algorithm (BOA), inspired by the swarming and mating behaviors of butterflies, was proposed by Arora and Singh [34], [35]. BOA mainly consists of two basic parts: the global search phase and the local search phase [36]. In the global search phase, disparate solutions are obtained by exploring the search space. In the local search phase, more detailed solutions are obtained by exploiting information about the current optimum solution. Because of this unique optimization mechanism, many scholars are interested in BOA, and it has been widely applied in science and engineering fields [37], [38]. There are two initial parameters: the initial sensory modality coefficient and the initial power exponent coefficient. These initial parameters not only influence iteration time but can also cause premature convergence. If unsuitable parameters are set before the optimization run, BOA will exhibit unsteady local exploitation and slow convergence. However, the initial sensory modality coefficient and the initial power exponent coefficient in BOA are chosen from engineers' experience, which introduces considerable fuzziness and randomness into BOA. To improve the performance of BOA and overcome these deficiencies, this paper proposes an improved butterfly optimization algorithm based on a self-adaption method (SABOA). Combining the self-adaption method with the basic BOA ensures that the calculated solution can reach the search target while avoiding the local defects caused by parameter randomization. To achieve a more comprehensive analysis, this paper selects different testing functions and constrained problems to verify the effectiveness of the proposed algorithm.
The remainder of this paper is organized as follows. Section 2 introduces the basic BOA. The proposed algorithm SABOA is presented in Section 3. In Section 4, this paper uses SABOA and comparison algorithms to calculate low dimension functions and high dimension functions. Constrained problems are discussed in Section 5. The conclusions are presented in Section 6.

II. THE PROPOSED HEURISTIC OPTIMIZATION ALGORITHM
The butterfly optimization algorithm (BOA) is a metaheuristic algorithm inspired by the living habits of butterflies. When a butterfly moves from one position to another in the search space, it generates a fragrance whose intensity varies accordingly. Butterflies sense this fragrance and are attracted toward the butterfly that generates the maximum fragrance concentration; the propagated fragrance is sensed by all butterflies, forming a collective fragrance network. When butterflies are within the fragrance network, they fly toward the butterfly with the maximum fragrance concentration, and this phase is termed the global search phase. When butterflies cannot sense the fragrance network, they fly randomly, and this phase is called the local search phase. In other words, all butterflies in the search space fly to new positions in each iteration, and their fitness values are then calculated by the objective function. The butterfly optimization algorithm therefore mainly consists of the global search phase and the local search phase; butterflies search for food and mating partners in both phases.
Set the position vector of each butterfly in a T × N dimensional space, where T is the number of iterations and N is the population size. Hence, the positions of all individuals can be represented by the matrix in (1):

X = [x_i^t],  i = 1, 2, ..., N,  t = 1, 2, ..., T   (1)

where t indicates the current iteration and x_i^t is the position of the i-th butterfly.
In the global search phase, the next position of each butterfly is determined by (2):

x_i^{t+1} = x_i^t + (r^2 × g* − x_i^t) × f_i   (2)

where g* is the best position in the search space and r is a random number in the range [0, 1]. f is the fragrance coefficient, i.e., the perceived magnitude of the fragrance; every butterfly emits fragrance that attracts the others, and f is updated by (3):

f = c I^a   (3)
where c is the sensory modality coefficient, I is the stimulus intensity, and a is the power exponent coefficient. The parameters a and c crucially determine the behavior and convergence speed of the BOA algorithm, and both are taken in the range [0, 1]. In BOA, I is taken as the fitness value of the solution.
In the local search phase, the next position of a butterfly is determined by (4):

x_i^{t+1} = x_i^t + (r^2 × x_j^t − x_k^t) × f_i   (4)

where t indicates the current iteration, x_i^t is the position of the i-th butterfly, and x_j^t and x_k^t are the positions of the j-th and k-th butterflies, chosen randomly from the search space.
Butterflies can search for food at both local and global scales. Considering physical proximity and various other factors such as rain and wind, searching activity occurs with a significant fraction p. A switch probability p, in the range [0, 1], is therefore applied in BOA to switch between the common global search phase and the intensive local search phase.
The iterative process of the butterfly optimization algorithm can be presented as follows. Step 1: In the initialization phase, define the objective function and its solution space. Set the maximum number of iterations T and set t = 1. Randomly generate N butterfly positions x_i^t (i = 1, 2, ..., N; t = 1, 2, ..., T). Determine the search range and all initial parameters, including a and c. Define the switch probability p.
Step 2: Calculate the fragrance coefficient of each butterfly using (3). Record the best butterfly position whose concentration of the fragrance is maximum.
Step 3: Randomly generate r in the range [0, 1]. Compare r with the switch probability p: if r is less than p, update the next position of each butterfly using (2); if r is larger than or equal to p, update the next position of each butterfly using (4).
Step 4: Update the power exponent coefficient a.
Step 5: Calculate each fitness value and compare all fitness values. Record and replace the best solution and the best fitness value if a better solution is found.
Step 6: Set t = t + 1. Judge whether t equals T. If not, return to Step 2 and continue; otherwise, stop the iterative loop.
The main steps of BOA can be summarized in the pseudo-code shown below:

while (t < T)
    for each butterfly i = 1, ..., N
        compute the fragrance f_i = c * I_i^a using (3)
    end for
    record the best position g*
    for each butterfly i = 1, ..., N
        draw a random number r in [0, 1]
        if r < p
            update x_i toward g* using (2)            // global search phase
        else
            update x_i with random butterflies j, k using (4)   // local search phase
        end if
    end for
    update the power exponent coefficient a
    evaluate fitness values and update the best solution
    t = t + 1
end while
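The steps above can be sketched in Python. This is a minimal illustration, not the authors' implementation: the search bounds, the sphere objective, and the linear schedule raising a from 0.1 to 0.3 (the setting reported for the experiments later in the paper) are assumptions made for the sketch.

```python
import numpy as np

def sphere(x):
    # Example objective (assumed for illustration): global minimum 0 at the origin
    return float(np.sum(x ** 2))

def boa(obj, dim=5, n=30, t_max=200, p=0.8, c=0.01, a=0.1, lo=-10.0, hi=10.0, seed=0):
    """Minimal BOA sketch following Steps 1-6 above."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n, dim))            # Step 1: random butterfly positions
    fit = np.array([obj(xi) for xi in x])
    g = x[np.argmin(fit)].copy()                 # best position g*
    for t in range(t_max):
        f = c * np.abs(fit) ** a                 # Step 2: fragrance f = c * I^a, I = fitness
        for i in range(n):
            r = rng.random()                     # Step 3: switch between the two phases
            if r < p:                            # global search phase, eq. (2)
                x[i] = x[i] + (r ** 2 * g - x[i]) * f[i]
            else:                                # local search phase, eq. (4)
                j, k = rng.integers(0, n, size=2)
                x[i] = x[i] + (r ** 2 * x[j] - x[k]) * f[i]
        a += 0.2 / t_max                         # Step 4: raise a from 0.1 toward 0.3 (assumed schedule)
        fit = np.array([obj(xi) for xi in x])    # Step 5: re-evaluate and keep the best
        if fit.min() < obj(g):
            g = x[np.argmin(fit)].copy()
    return g, obj(g)                             # Step 6: stop after t_max iterations

best_pos, best_val = boa(sphere)
```

Because the recorded best position g* is only replaced when a better solution appears, the returned fitness never worsens across iterations.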

III. SELF-ADAPTION BUTTERFLY OPTIMIZATION ALGORITHM
In BOA, the initial sensory modality coefficient c and the initial power exponent coefficient a of the fragrance coefficient are selected according to artificial experience. Many experiments are needed before suitable initial values of c and a are found. However, many industrial experiments cannot be repeated often because of material limitations and harsh experimental environments. In the search phase, the random number r greatly affects searching efficiency: when r is too large, BOA becomes highly randomized and easily jumps from one region to another, which causes low searching accuracy and low searching efficiency.
To obtain a better optimization result in BOA, we introduce an improved butterfly optimization algorithm based on a self-adaption method, called SABOA. To eliminate the stochastic behavior and blindness of the fragrance coefficient in BOA, the self-adaption method is built into a new fragrance coefficient. It is designed to make the fragrance distribution range smaller and smaller so that the algorithm searches more and more carefully. When the searching process begins, the initial fragrance distribution range is large; it then gradually decreases as the iterations proceed, which ensures both convergence speed and the ability to jump out of local extreme points.
The new fragrance coefficient can be updated in (5).
where t indicates the current iteration and T is the maximum number of iterations. To create a more balanced distribution of the fragrance, u is drawn from the standard normal distribution. An ideal optimization process combines high searching ability in the early phase with great precision in the later phase. For the global search phase, SABOA works by adjusting the difference between the best solution and the position of each butterfly in the current iteration.
Thus, SABOA gives each individual a more careful and thorough searching mechanism around the best solution of the current iteration. The new position updating formula in the global search phase can be expressed as follows.
where g * means the best position in the searching space. For the local search phase, SABOA uses the average value of the best position and the worst position in each iteration to update the next position of each butterfly, which can reduce the blindness of randomly selecting two butterflies. The new position updating formula in the local search phase can be expressed as follows.
where w * means the worst position in the searching space.
Step 1: In the initialization phase, define the objective function and its solution space. Set the maximum number of iterations T and set t = 1. Randomly generate N butterfly positions x_i^t (i = 1, 2, ..., N; t = 1, 2, ..., T). Determine the search range. Define the switch probability p. Calculate each fitness value and compare all fitness values. Record the best solution g* and the worst solution w*.
Step 2: Draw u from the standard normal distribution. Calculate the fragrance coefficient of each butterfly using (5).
Step 3: Randomly generate r in the range [0, 1]. Compare r with the switch probability p: if r is less than p, update the next position of each butterfly using (6); otherwise, update it using (7).
Step 4: Calculate each fitness value. Record the best solution, the best fitness value, the worst solution, and the worst fitness value in the current generation. Replace the best solution g * and the best fitness value if there is a better solution. Replace the worst solution w * and the worst fitness value if there is a worse solution.
Step 5: Set t = t + 1. Judge whether t equals T. If not, return to Step 2 and continue; otherwise, stop the iterative loop.
The flow chart of the SABOA implementation is shown in Figure 1.
The main steps of SABOA can be summarized in the pseudo-code shown below. SABOA uses the self-adaption method to obtain a better optimization result than the basic BOA, and the advantages of the proposed algorithm mainly include three aspects. First, the self-adaption method makes the fragrance distribution range smaller and smaller, so that SABOA can locate the extremum more and more carefully. Second, the self-adaption method preserves the convergence ability to jump out of local extreme points. Third, the proposed algorithm works by adjusting the difference between the best solution and each solution, so that individuals wander around the best solution of the current iteration. However, SABOA has higher computational complexity than the basic BOA because it must calculate the best solution and the worst solution in each iteration to enhance searching precision, which increases the calculation amount of the proposed algorithm.
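The update scheme described above can be sketched in Python. The exact forms of eqs. (5)-(7) are not reproduced here; the shrinking factor (1 − t/T) combined with a standard-normal u, the pull toward g*, and the move toward the mean of g* and w* are plausible stand-ins inferred from the textual description, and the bounds and objective are assumptions.

```python
import numpy as np

def sphere(x):
    # Example objective (assumed for illustration): global minimum 0 at the origin
    return float(np.sum(x ** 2))

def saboa(obj, dim=5, n=30, t_max=200, p=0.8, lo=-10.0, hi=10.0, seed=0):
    """Sketch of the SABOA scheme; eqs. (5)-(7) are inferred stand-ins,
    not the paper's exact formulas."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, (n, dim))
    fit = np.array([obj(xi) for xi in x])
    g = x[np.argmin(fit)].copy()                 # best solution g*
    w = x[np.argmax(fit)].copy()                 # worst solution w*
    for t in range(1, t_max + 1):
        u = rng.standard_normal()                # u ~ N(0, 1), as described for eq. (5)
        f = u * (1.0 - t / t_max)                # assumed shrinking fragrance coefficient
        for i in range(n):
            r = rng.random()
            if r < p:                            # global phase: adjust toward g*, cf. eq. (6)
                x[i] = x[i] + r ** 2 * (g - x[i]) * f
            else:                                # local phase: mean of g* and w*, cf. eq. (7)
                x[i] = x[i] + r ** 2 * ((g + w) / 2.0 - x[i]) * f
        fit = np.array([obj(xi) for xi in x])
        if fit.min() < obj(g):                   # keep the best solution ever found
            g = x[np.argmin(fit)].copy()
        w = x[np.argmax(fit)].copy()             # worst of the current generation
    return g, obj(g)

best_pos, best_val = saboa(sphere)
```

Note how the sketch mirrors the three advantages listed above: the fragrance range shrinks with t/T, the global move tracks g*, and the local move uses the best/worst average instead of two random butterflies.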

IV. FUNCTION EXPERIMENTS
A. TESTING ENVIRONMENTS
To reflect the performance of the proposed algorithm more accurately and completely, different testing functions are calculated in this paper. The testing functions, listed in Table 1, mainly include seven low-dimensional functions (f1-f7) and seven high-dimensional functions (f8-f14). In Table 1, D is the function dimension and f_min is the ideal value of each function.
High-dimensional functions have many local optima, which increases the deceptiveness of problem solving. When high-dimensional functions are calculated, the solution space and complexity increase exponentially with the dimension, so high-dimensional functions test the capability to solve complex functions. A low-dimensional function usually has a single optimum, and it is easy to find the best target. In the original BOA articles [34], [35], BOA was compared with seven algorithms and showed the best searching ability among all of them. To analyze the proposed algorithm comprehensively, this paper compares the proposed method with algorithms different from those seven. The testing algorithms in this paper include the flower pollination algorithm (FPA) [33], the salp swarm algorithm (SSA) [26], the sine cosine algorithm (SCA) [30], and the lévy-flight-trajectory-based whale optimization algorithm (LWOA) [39]. All initial parameters of the testing algorithms were chosen according to the relevant algorithm literature.
FPA was proposed by Xin-She Yang. There are two searching phases in FPA: global pollination (biotic cross-pollination, in which pollen-carrying pollinators perform lévy flights) and local pollination. In FPA, the parameter p was set to 0.8 and the parameter β to 1.5. In BOA, the sensory modality c was 0.01, the power exponent a was increased from 0.1 to 0.3, and the switch probability p was 0.8.
In SABOA, the switch probability p was set to 0.8. The initial parameters of the comparison algorithms were the optimum parameters given in the original algorithm literature, where their operational details can also be found. The population size and the maximum number of iterations were 50 and 500, respectively. Each algorithm was independently run ten times in MATLAB.
To make a fair comparison, all algorithms were programmed, and all data and figures were produced, in MATLAB (R2014b, The MathWorks, Inc., Natick, MA, USA). All experiments were conducted on a laptop with an Intel (R) Core (TM) i5-4210U CPU, 2.30 GHz, and 4 GB RAM.

B. DATA DISCUSSIONS
In Table 2 and Table 3, Best, Worst, Mean, and Std respectively represent the optimal fitness value, the worst fitness value, the average fitness value, and the standard deviation over ten runs of each function. Table 2 and Table 3 show the results of the algorithms in different dimensions; with these statistics we can make sure that the results are not generated by chance. From the results presented in Table 2 and Table 3, SABOA obtains the best values on the different testing functions in comparison with BOA, FPA, LWOA, SCA, and SSA, and its best value is closest to the f_min of Table 1 for each function. Comparing BOA and SABOA, most function solutions of BOA are unsatisfactory, while SABOA reaches the ideal accuracy for almost all functions and achieves a high precision level on f3, f6-f7, f9, and f11-f14. As the search dimension increases, algorithms tend to suffer from low solution accuracy and slower solving speed; the proposed SABOA can still calculate each fitness function effectively with the best function value.

The standard deviation reflects the dispersion of a data set: a large standard deviation indicates a large difference between most calculated values and their average, and a small standard deviation indicates that the calculated values are close to the average. To discuss the efficiency and robustness of the proposed algorithm, this paper computes the mean and standard deviation to indicate how reliable SABOA is. SABOA has the smallest standard deviation on each function, which shows that SABOA has a steadier searching ability and stronger robustness than the other algorithms on low- and high-dimensional problems. The numerical results in Table 2 and Table 3 demonstrate that SABOA performs consistently and reliably in reaching the global minimum and is clearly better than the original BOA.
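As a concrete illustration of how these four statistics are obtained, the following Python fragment computes Best, Worst, Mean, and Std from ten run results; the numbers themselves are invented for illustration and are not values from Table 2 or Table 3.

```python
import numpy as np

# Ten hypothetical final fitness values from independent runs (invented data)
runs = np.array([1.2e-8, 3.4e-8, 2.2e-8, 9.1e-9, 4.0e-8,
                 1.8e-8, 2.9e-8, 1.1e-8, 3.3e-8, 2.5e-8])

best = runs.min()        # Best: optimal fitness value over the ten runs
worst = runs.max()       # Worst: poorest fitness value over the ten runs
mean = runs.mean()       # Mean: average fitness value
std = runs.std(ddof=1)   # Std: sample standard deviation, a robustness indicator
```

A smaller Std here means the ten runs landed close together, which is the robustness property the tables are used to argue.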

C. ITERATIONS DISCUSSIONS
Iterations, whose aim is to approach the desired result, are repeated feedback actions in which the result of each iteration becomes the initial value of the next. Figure 2 shows the average convergence curves of all algorithms on different functions over 10 independent runs. In Figure 2, the proposed SABOA has the fastest iteration speed of all curves, which shows that SABOA can enhance the diversity of achievable solutions within shorter computation times than the other algorithms.
For BOA, the iteration speed in low dimensions is higher than in high dimensions: it is rapid and intense in low dimensions but smooth and slow in high dimensions. BOA has the worst searching result on function 5. FPA shows a good convergence curve and iteration performance only on function 5; its other curves are all premature and fall into local optima, especially on functions 8 to 12. LWOA has a good convergence effect and can jump out of local optima except on functions 1 and 8. SCA iterates slowly in the early searching phase, especially on functions 1 to 4; it has the worst searching result on function 3, so SCA cannot meet the convergence requirements in fast-convergence environments. SSA cannot carefully search the optimal solution on any function. SABOA overcomes the BOA shortcomings of premature convergence and poor local searching, having the fastest convergence speed in the early phase and the best searching accuracy in the later optimization phase. These results demonstrate that SABOA has an outstanding ability to explore feasible regions and escape from local optima in the searching scope, and the self-adaption mechanism makes SABOA effectively avoid local minima while searching extreme points. The fastest iteration speed reflects that SABOA quickly leads each butterfly to the nearby optimal point. Considering all the advantages discussed above, the optimal solution can be obtained successfully by applying SABOA, which outperforms the basic BOA on all testing functions.

D. BOX-PLOT CHARTS DISCUSSIONS
The box-plot chart is a statistical chart that shows the dispersion of a data set; it can be applied to several samples to analyze their symmetry. A box-plot chart has five indexes: the maximum value, the minimum value, the median, the upper quartile, and the lower quartile. Algorithm stability and robustness can be confirmed by evaluating these five indexes. The box-plot charts of the different algorithms after 10 independent runs are shown in Figure 3. Figure 3 clearly shows that the box-plot charts of SABOA have the narrowest form, the fewest outliers, and the weakest dispersion, which indicates that SABOA has a balanced convergence ability and approaching effect when calculating function problems. The median, upper quartile, and lower quartile of the proposed algorithm are the lowest in Figure 3, which also shows that SABOA has an outstanding searching capability.

E. SEARCH PATH DISCUSSIONS
Figure 4 shows the three-dimensional images of the low-dimensional functions together with the optimal BOA and SABOA search paths for each. In a two-dimensional plane, the optimal BOA search path, the optimal SABOA search path, and the contour plot of each function are shown separately. As can be seen from Figure 4, the optimal search path of SABOA is much shorter than the path taken by BOA, whose path contains numerous short-distance, repeated, invalid, and occasional long-distance moves. Owing to the group-optimal constraint, SABOA uses a searching strategy that tightens from the periphery toward the middle.
Compared with the basic BOA, the main reason that the SABOA search path is obviously improved as the iterations increase is that SABOA has a strong searching mechanism around the best solution. Because SABOA works by adjusting the difference between the best solution and each solution, a lot of calculating time is saved. Therefore, for the proposed algorithm, a few iterations can obtain a better path without too much computation, which also shows that SABOA has a considerably higher positioning accuracy. The two sets of search path experiments show that, compared with BOA, SABOA not only gives a diversity of solutions but also has more flexibility.

F. THE WILCOXON RANK SUM TEST
The rank sum test is a nonparametric test that does not depend on the specific form of the overall distribution, which makes it highly practical. In the rank sum test, all observations are arranged in order from small to large, and the position of each observation is called its rank. Non-parametric statistical tests can be applied to check whether the proposed algorithm is statistically significant or not [5], [40]. However, the basic rank sum test considers only the sign of the differences and ignores their absolute values, which yields rough results and loses some of the tested information. To enhance the testing performance of the basic rank sum test, Wilcoxon proposed an improved strategy called the Wilcoxon rank sum test [41]. The Wilcoxon rank sum test considers both the direction and the size of the differences and can be applied to test any difference in the distribution of the data, which makes it more effective than the basic rank sum test. The result of the Wilcoxon rank sum test is called the p-value, and a p-value less than 0.05 indicates a significant difference at the 0.05 level.
For a more comprehensive discussion, this paper used the Wilcoxon rank sum test for data analysis. The test compared the SABOA results against those of the other comparison algorithms at the 0.05 significance level. All p-values are listed in Table 4. From Table 4, the result of LWOA on function 8 is larger than 0.05, while all other results are less than 0.05. The Wilcoxon rank sum test results show that SABOA has the highest searching accuracy, the greatest searching efficiency, and the strongest searching mechanism around the best solution among all contrast algorithms, which further demonstrates the excellent performance of SABOA.
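The test itself is available off the shelf. The following sketch applies SciPy's `ranksums` to two invented samples of ten run results each (the data are hypothetical, not values from Table 4):

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(1)
# Invented final fitness values of two algorithms over ten runs each
alg_a = rng.normal(1e-6, 1e-7, 10)   # hypothetical SABOA-like results
alg_b = rng.normal(1e-2, 1e-3, 10)   # hypothetical comparison results

# Wilcoxon rank sum test: ranks the pooled samples and compares rank sums
stat, p_value = ranksums(alg_a, alg_b)
significant = p_value < 0.05  # significant difference at the 0.05 level
```

With samples this well separated, the p-value falls far below 0.05, which is the criterion used to read Table 4.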

V. CONSTRAINED PROBLEM EXPERIMENTS
A constrained optimization problem mainly consists of two parts: the objective function and the constraints on its variables. The optimization process seeks the optimal solution under the constraints. In many engineering design fields, diverse conditions bring great challenges to the solving processes of algorithms. In this section, the excellence and competitiveness of the proposed algorithm are examined by solving several constrained optimization problems that are commonly used in the literature, and the results are compared with other well-known algorithms. The initial parameters of SABOA are the same as in the last section, and the proposed algorithm was coded in MATLAB.

A. CONSTRAINED PROBLEM 1
This constrained minimization problem was originally proposed by Bracken and McCormick. The optimum solution is obtained at X = (0.8229, 0.9114) with an objective function value of f(X) = 1.393454. The problem can be expressed as:

min f(X) = (x1 − 2)^2 + (x2 − 1)^2
subject to x1 = 2 x2 − 1,  x1^2 / 4 + x2^2 − 1 ≤ 0

Abdollah Homaifar et al. solved this problem using GA [42]. David B. Fogel compared the results of evolutionary programming (EP) with those obtained by GA [43]. Kang Seok Lee and Zong Woo Geem used the harmony search (HS) method and found their best solution to this problem [44].
In GA and EP, the population size was 400 and the number of iterations was 40,000. HS found its best solution after approximately 40,000 searches. In SABOA, the population size and the number of iterations were set to 100 and 2,000, respectively. Table 5 compares the results and the corresponding calculated variables among the five algorithms. As can be seen from Table 5, the result obtained by SABOA is the minimum, and SABOA exceeds the other contrasted algorithms in final value and constraint accuracy. Although EP and HS can find smaller solutions, they retain only two and three decimals, respectively, in the constraints; their searching accuracy is lower, and they require more iterations than SABOA.
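As an independent cross-check of the reported optimum, the Bracken-McCormick problem (minimize (x1 − 2)² + (x2 − 1)² subject to x1 = 2x2 − 1 and x1²/4 + x2² ≤ 1) can be handed to a standard constrained solver. SciPy's SLSQP is used here purely for verification; it is not one of the algorithms compared in Table 5, and the starting point is an arbitrary choice.

```python
import numpy as np
from scipy.optimize import minimize

def objective(x):
    # Bracken-McCormick objective
    return (x[0] - 2.0) ** 2 + (x[1] - 1.0) ** 2

constraints = [
    # Equality constraint: x1 = 2*x2 - 1, written as x1 - 2*x2 + 1 = 0
    {"type": "eq", "fun": lambda x: x[0] - 2.0 * x[1] + 1.0},
    # Inequality constraint in SciPy's fun(x) >= 0 form: 1 - x1^2/4 - x2^2 >= 0
    {"type": "ineq", "fun": lambda x: 1.0 - x[0] ** 2 / 4.0 - x[1] ** 2},
]

res = minimize(objective, x0=np.array([0.5, 0.5]), method="SLSQP",
               constraints=constraints)
# res.x is close to (0.8229, 0.9114) and res.fun close to 1.393454
```

Recovering the same optimum with a gradient-based solver confirms the target values that Table 5 measures the metaheuristics against.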
B. CONSTRAINED PROBLEM 2
Details of all algorithms can be found in the original algorithm literature. The number of function evaluations (NFEs) [57], which can be regarded as the count needed to find the best optimal solution, is calculated as the product of the population size (the number of shrapnel pieces in [57]) and the number of iterations. In SABOA, the population size and the number of iterations were set to 50 and 2,000, respectively, and the algorithm was independently run ten times in MATLAB. Table 6 compares the results for constrained problem 2; in Table 6, NOT means that the parameter value is not given in the original literature. From Table 6, most of the considered algorithms can find the best solution; however, SABOA obtains the best result among most of the considered algorithms in terms of NFEs. PSO and PSO-DE have fewer NFEs, but SABOA has higher accuracy. Table 6 shows that the proposed algorithm can reach the near-optimal value in the early iterations.
C. CONSTRAINED PROBLEM 3
In SABOA, the population size and the number of iterations were set to 50 and 2,000, respectively, and the algorithm was independently run ten times in MATLAB. All results are shown in Table 7. DE and DELC have fewer NFEs, but SABOA has higher accuracy. Almost all algorithms find the minimum; however, the proposed algorithm attains the best solution with the fastest speed and fewer NFEs, which shows that SABOA can handle constraints effectively.

VI. CONCLUSION
In the present article, a new SABOA algorithm, which has fewer parameters and strong exploitation, is proposed to solve global optimization problems. The self-adaption method is introduced into the proposed algorithm to approach the closest candidate solution and to exploit the promising search space. The simulation results show that SABOA performs better than BOA in optimizing 14 benchmark functions, including seven low-dimensional functions and seven high-dimensional functions, and 3 constrained problems, which shows that SABOA can maintain a fair balance between exploration and exploitation.
For the benchmark functions, we drew iteration graphs, box plots, and search paths to statistically test the performance of the proposed algorithm, and the results demonstrate that the proposed algorithm clearly differs from the compared algorithms on the benchmark functions.
For the three constrained problems, we compared the search results reported in different literatures, and the computational results of this comprehensive comparative investigation exhibit the attractiveness of SABOA for handling numerous constraints. All calculated results illustrate that SABOA gives better solutions than the other compared algorithms in terms of function values, the number of function evaluations, and iterations. In general, the experimental results show that SABOA offers excellent and competitive abilities compared with the original BOA.
In the future study, this proposed algorithm will be applied in engineering problems to find appropriate industry parameters.
JUNPENG SHAO received the B.S., M.S., and Ph.D. degrees in hydraulic drive from the Harbin Institute of Technology, in 1982, 1987, and 1991, respectively. He has been a Professor and a Ph.D. Supervisor with the Harbin University of Science and Technology. His current research interests include hydraulic drive, intelligent control, and hydraulic quadruped robots.
GUITAO SUN received the B.S., M.S., and Ph.D. degrees in mechatronics engineering from the Harbin University of Science and Technology, in 2009, 2012, and 2015, respectively. He has been a Lecturer with the Harbin University of Science and Technology. His current research interests include hydraulic drive, intelligent control, and hydraulic quadruped robots.
XUAN SHAO received the B.S. degree from the Beijing University of Aeronautics and Astronautics, in 2008, the M.S. degree from The University of Manchester, in 2010, and the Ph.D. degree from the Harbin University of Science and Technology, in 2018. She has been a Lecturer with the Harbin University of Science and Technology. Her main research interests include hydraulic drive, intelligent control, and hydraulic quadruped robots.