An Improved Crow Search Algorithm Based on Spiral Search Mechanism for Solving Numerical and Engineering Optimization Problems

The crow search algorithm (CSA) is a recent intelligent optimization algorithm based on the behavior of crow populations, and it has been shown to perform well. However, its simple search mechanism also leads to slow convergence and a tendency to fall into local optima when solving complex optimization problems. To overcome these problems, this paper proposes an improved CSA (ISCSA) based on a spiral search mechanism. By introducing a weight coefficient, an optimal guidance position and a spiral search mechanism, the position update equation was modified to accelerate convergence and to better balance the exploration and exploitation of CSA. Meanwhile, Gaussian mutation and a random perturbation strategy were added to make it difficult for the algorithm to become trapped in local optima. The advantages of the proposed ISCSA were evaluated on 23 benchmark functions and four classical engineering design problems. The experimental and statistical results on the 23 test functions showed that the proposed ISCSA could escape from local optima with higher accuracy and faster convergence than both the CSA and other meta-heuristic optimization algorithms. The results of the four engineering optimization problems showed that, compared with other algorithms, ISCSA solves practical optimization problems well and is strongly competitive.


I. INTRODUCTION
Over the past few decades, the complexity of numerical and engineering optimization problems has grown rapidly, prompting researchers to continuously propose and improve optimization methods to solve the increasingly intractable problems encountered [1], [2]. Generally speaking, optimization algorithms can be divided into two types: deterministic techniques and meta-heuristics [3]. Research shows that traditional optimization algorithms encounter various
problems when solving complex optimization problems. For example, they easily fall into local optima, and the accuracy of the result depends strongly on the choice of the algorithm's initial point [4]. In contrast, meta-heuristic optimization algorithms are simple and flexible and can more easily avoid local optima. They have been proven able to solve many practical problems [5], [6]. Therefore, meta-heuristic optimization algorithms have become an interesting topic for scientists.
To date, a series of meta-heuristic search algorithms have been proposed to solve various nonlinear and high-dimensional complex optimization problems. The proposed meta-heuristic algorithms can be roughly divided into three categories: those based on evolution, those based on individual and swarm intelligence, and those based on physical phenomena and scientific laws. Evolutionary algorithms are based on Darwin's theory of evolution and solve problems by simulating biological evolutionary processes and mechanisms, mainly through three operations: selection, recombination, and mutation. Examples include the genetic algorithm (GA) [5], the evolution strategy (ES) [7] and the biogeography-based optimizer (BBO) [8]. Individual and swarm intelligence optimization algorithms are mathematical models built on animal behaviors, such as survival and predation, and on swarm behaviors such as information exchange, collaboration, and competition among swarms. Examples include the artificial bee colony (ABC) optimization algorithm [9], the cuckoo search (CS) algorithm [10], the whale optimization algorithm (WOA) [11], particle swarm optimization (PSO) [12], the grey wolf optimizer (GWO) [13], the salp swarm algorithm (SSA) [14] and Harris hawks optimization (HHO) [15]. Other search algorithms are based on natural phenomena, physical phenomena, and other scientific laws discovered in nature. Examples include simulated annealing (SA) [16], the gravitational search algorithm (GSA) [17], harmony search (HS) [18], wind driven optimization (WDO) [19] and the state transition simulated annealing algorithm (STASA) [20].
The above algorithms have been proven to solve problems on test functions and in practical scientific and industrial applications. However, according to the no free lunch (NFL) theorem [21], no algorithm can be considered a general solution to all optimization problems. Every optimizer suffers from drawbacks to a greater or lesser extent, such as slow convergence or a weak ability to escape local extreme points. The NFL theorem therefore encourages scientists to develop more efficient optimizers, both for problems already encountered and for optimization problems that may arise in the future.
In this paper, we focus on the crow search algorithm (CSA), which was developed by Askarzadeh [22] and imitates crow foraging behavior. CSA has been tested on benchmark functions and is competitive with other algorithms. However, the optimization strategy of CSA is relatively simple and random, which leads to premature convergence and poor robustness when it is applied to complex high-dimensional optimization problems, making it difficult to strike a balance between exploration and exploitation [23]-[29]. Therefore, avoiding local optima and accelerating convergence have become two important directions for CSA research, and improved versions of the CSA and their applications in a variety of fields have attracted increasing attention [24]-[28], [30]-[39].
To overcome these shortcomings, this paper presents a new improved version of CSA called ISCSA. In the proposed ISCSA, a weight coefficient and optimal guidance were introduced to modify the crow's position update equation, producing more efficient information and better candidate solutions [26], [40]-[42]. A spiral position update mechanism was adopted to maximize the use of known information [11], [43]. During each iteration, the positions of one fifth of the individuals were randomly regenerated in the search space, which ensures the dispersion of the group in the space and reduces the probability of the group falling into a local optimum. Meanwhile, Gaussian mutation and random perturbation strategies were introduced to give the algorithm the ability to mutate and migrate when it stagnates; this makes the exploration and exploitation abilities of the algorithm more balanced [44].
The rest of this paper is structured as follows. In Section 2, the basic principles of the standard CSA are reviewed. In Section 3, the proposed ISCSA is described in detail. In Section 4, the effectiveness of the algorithm is tested and analyzed in three different dimensions of the 13 benchmark functions and 10 fixed-dimensional benchmark functions, and compared with some classical and advanced algorithms. In Section 5, four classic practical engineering applications are selected to verify the practicability of ISCSA. In Section 6, the study is summarized and possible future work is described.

II. CROW SEARCH ALGORITHM
The crow search algorithm (CSA) is a recently proposed evolutionary algorithm based on crow foraging behavior [22]. The main idea is that when a crow looks for its own hidden food, the other crows will follow and steal the food with a certain probability [22], [24]- [27].
In this algorithm, the candidate solutions of the optimization problem are the positions at which the crows hide their food, stored in their memories. The trajectories of the crows, which constantly track one another through the space to find a better food source, constitute the optimization process. Suppose N is the population size and iter_max is the maximum number of iterations. x^(i,iter) denotes the position of crow i in the d-dimensional search space at iteration iter, where i = 1, 2, ..., N and iter = 1, 2, ..., iter_max. The hiding place of crow i is represented by m^(i,iter), which is the best position this crow has found so far. In each iteration, assume crow i follows crow j to find crow j's hiding place; the problem then divides into two cases: 1) crow j does not realize that crow i is tracking it, so crow i can follow crow j to the hidden food; 2) crow j realizes that crow i is tracking it and, to prevent its food from being stolen, flies to a random position in the space to confuse crow i. The mathematical model of Cases 1 and 2 can be expressed as:

x^(i,iter+1) = x^(i,iter) + r_i × fl^(i,iter) × (m^(j,iter) − x^(i,iter)),   if r_j ≥ AP^(j,iter)
x^(i,iter+1) = a random position in the search space,   otherwise

where r_i and r_j are random values uniformly distributed between 0 and 1, fl^(i,iter) represents the flight length of crow i, and AP^(j,iter) denotes the awareness probability of crow j at iteration iter. The effect of the AP value on CSA is shown in Fig. 1. The value of fl (fl < 1 or fl > 1) affects the local or global search ability of the crow, and the value of AP determines the proportion of the two cases over the whole process. At the same time, the memory of each crow is updated as follows:

m^(i,iter+1) = x^(i,iter+1),   if f(x^(i,iter+1)) is better than f(m^(i,iter))
m^(i,iter+1) = m^(i,iter),   otherwise

The CSA optimization flow chart is shown in Fig. 2.
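The two-case position update and the memory update described above can be sketched in a few lines of Python. This is an illustrative minimal implementation (variable names, bounds handling and the clipping strategy are simplifications, not part of the original CSA specification):

```python
import numpy as np

def csa_step(x, m, fitness, fl=2.0, ap=0.1, lb=-100.0, ub=100.0):
    """One CSA iteration. x: positions (N, d); m: memories (N, d)."""
    n, d = x.shape
    for i in range(n):
        j = np.random.randint(n)           # crow i picks a random crow j to follow
        if np.random.rand() >= ap:         # case 1: j is unaware, follow j's memory
            x[i] = x[i] + np.random.rand() * fl * (m[j] - x[i])
        else:                              # case 2: j is aware, fly to a random position
            x[i] = lb + np.random.rand(d) * (ub - lb)
    x = np.clip(x, lb, ub)
    # memory update: each crow keeps the best position it has ever found
    for i in range(n):
        if fitness(x[i]) < fitness(m[i]):
            m[i] = x[i].copy()
    return x, m
```

Because the memory update only ever replaces a memory with a better position, the best memorized fitness is non-increasing over iterations, which is the property the algorithm's convergence relies on.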

III. PROPOSED APPROACH
Metaheuristic algorithms vary in the search methods they use to solve optimization problems; in all of them, however, the search process can be divided into two phases: exploration and exploitation [41], [44], [45]. Although CSA has the advantages of simplicity and few parameters, its ability to balance the two phases is limited by its simple search mechanism and limited optimization strategies. The algorithm also converges slowly, which was the main motivation for improving CSA [24]-[29]. In this section, we introduce the ISCSA in detail; it improves on the performance of the traditional CSA. The proposed algorithm inherits the configuration of CSA, but overcomes its disadvantages by modifying the original position update mechanism and by adding Gaussian mutation and random perturbations when the algorithm stagnates. These improvements are described in detail below.

A. WEIGHT COEFFICIENT AND OPTIMAL GUIDANCE
To improve the performance of CSA, we introduced a weight coefficient (w), with reference to the PSO algorithm, and changed the randomly chosen crow that is followed into the best individual in the swarm. The modified position update equation for this state is:

x_i^(t+1) = w(t) × x_i^t + r × fl × (gbest − x_i^t)

where gbest is the best solution found up to the current iteration and r is a uniform random number in [0,1]. The weight w(t) decreases adaptively from its maximum initial value w_max over the course of the iterations: when w is larger in the early stage it benefits exploration, and when it is smaller in the later stage it benefits exploitation. The schematic diagram of this stage is shown in Fig. 3.
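The weighted, gbest-guided move can be sketched as follows. Note that the exact decay schedule of w(t) was lost in the extracted text; the linear decay from w_max used here is an assumption, consistent only with the stated behavior (large early, small late):

```python
import numpy as np

def guided_update(x_i, gbest, t, iter_max, w_max=1.5, fl=2.0):
    """First-part ISCSA update (sketch): inertia-weighted move toward gbest.
    The linear decay of w(t) from w_max to 0 is an ASSUMPTION; the paper
    only states that w starts at w_max and shrinks over the iterations."""
    w = w_max * (1.0 - t / iter_max)   # large early (exploration), small late (exploitation)
    return w * x_i + np.random.rand() * fl * (gbest - x_i)
```

With this schedule, a crow already sitting at gbest in the final iteration stays put, since both the inertia term and the attraction term vanish.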

B. SPIRAL SEARCH MECHANISM
The second state of CSA randomly generates a position in the space. This blind operation slows down the convergence of the algorithm. To solve this problem, we introduced a spiral search mechanism, which lets the crows make full use of the best known position and thus increases the ability to find global solutions during the optimization process. The spiral trajectory is built from the distance between a crow's previous position and the best position found so far [11], [43], [46]. The schematic diagram of the model is shown in Fig. 4; mathematically it is expressed as:

x_i^(t+1) = D × e^(b·l) × cos(2πl) + gbest

where D = |gbest − x_i^t| denotes the distance of the i-th crow from the current best position, b denotes a constant defining the shape of the logarithmic spiral, and l represents a random number in [−1, 1].
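The logarithmic-spiral move, in the same form popularized by WOA [11], can be sketched directly from the equation above (element-wise over the position vector):

```python
import numpy as np

def spiral_update(x_i, gbest, b=1.0):
    """Logarithmic-spiral move around the best known position (WOA-style).
    D is the element-wise distance to gbest; l is drawn uniformly from [-1, 1]."""
    l = np.random.uniform(-1.0, 1.0)
    dist = np.abs(gbest - x_i)                       # D = |gbest - x_i|
    return dist * np.exp(b * l) * np.cos(2.0 * np.pi * l) + gbest
```

One useful sanity check: a crow that already sits at gbest has D = 0, so the spiral collapses onto gbest and the best solution is never perturbed by this operator.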

C. GAUSSIAN MUTATION AND RANDOM PERTURBATION
In the process of iteratively optimizing complex problems, the algorithm may become trapped in a local optimum. In this case, the solution value remains unchanged during the iterations. To make up for this shortcoming and increase the probability of the algorithm jumping out of a local optimum, we added a Gaussian mutation or a random perturbation whenever stagnation was detected (the best value remained unchanged for 200 consecutive runs), and then continued to run the algorithm from that point. The choice between the two operations is made by comparing a random number rand in [0,1] with r_j, the probability of performing the Gaussian mutation rather than the random perturbation. The Gaussian (normal) probability density used for the mutation is:

f(x) = (1 / (σ√(2π))) × exp(−(x − µ)² / (2σ²))

where µ is the mean value and σ² is the variance. The schematic diagram of this stage is shown in Fig. 5.
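A sketch of the escape operator follows. The threshold r_j, the mutation scale sigma, and the box bounds are illustrative placeholders; the paper specifies only that the operator fires after 200 stagnant evaluations and chooses between the two moves at random:

```python
import numpy as np

def escape_local_optimum(x_i, r_j=0.5, sigma=1.0, lb=-100.0, ub=100.0):
    """Stagnation handler (sketch). With probability r_j, apply a Gaussian
    mutation around the current position; otherwise jump to a uniformly
    random point in the box [lb, ub]. r_j and sigma are ASSUMED values."""
    if np.random.rand() < r_j:
        x_new = x_i + np.random.normal(0.0, sigma, size=x_i.shape)   # Gaussian mutation
    else:
        x_new = lb + np.random.rand(*x_i.shape) * (ub - lb)          # random perturbation
    return np.clip(x_new, lb, ub)
```

Either branch moves the crow off its stagnant position, while the clip keeps the result inside the search space.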

D. COMPARISON BETWEEN ISCSA AND CSA
The performance of a meta-heuristic optimization algorithm can be analyzed from two points of view: exploitation and exploration. Only by achieving a proper balance between these two aspects can the algorithm perform well. Compared with CSA, ISCSA has the following advantages: 1. In the first stage of the algorithm, the original position update of the crow was modified and divided into two parts. In the first part, the following-flight formula was modified to move in the direction of the global best value, which accelerates the convergence of the algorithm. Multiplying by a weight coefficient within a certain range greatly widens the crows' search space, which is more conducive to finding the global optimum. These two strategies together serve both the exploration and the exploitation capabilities of the algorithm. In the second part, the spiral tracking flight strategy was introduced: a crow can follow the best position in a spiral search, so that in a high-dimensional space the position of the crow can vary across dimensions and the information in the best known position is used more efficiently. At the same time, one fifth of the individual positions were randomly regenerated during the iterations to ensure the dispersion of the population, thus reducing the probability of falling into a local extremum.
2. In the second stage of the algorithm, when stagnation of the position update is detected, the algorithm may have converged prematurely or fallen into a local optimum. At this point, Gaussian mutation and random perturbation are introduced, which improve the probability of the algorithm jumping out of its current state. This gives a crow the chance to escape from its current location and search a better solution space for the global optimum.
In summary, we combined the above strategies to propose the ISCSA. The pseudocode for the ISCSA is shown in Fig. 6.

IV. RESULTS AND DISCUSSION
To evaluate the validity and stability of the proposed ISCSA, we used different benchmark functions to test the optimization ability of the algorithm. We also compared the algorithm with common heuristic algorithms to prove its superiority.

A. BENCHMARK FUNCTIONS
Many benchmark functions are characterized by high complexity, many peaks, nonlinearity, multiple modes, and ambiguous search directions; the optimization ability of an algorithm can therefore be evaluated fully and accurately on such functions. In this test, we selected 23 typical benchmark functions [47], including seven unimodal benchmark functions, six multimodal benchmark functions, and ten fixed-dimensional benchmark functions. A unimodal function has only one optimum and is used to evaluate the exploitation capability of an optimization algorithm. Multimodal functions and fixed-dimensional benchmark functions have multiple local optima and are suitable for studying the exploration capabilities of optimization algorithms. These functions have been widely used in the related literature, so they are suitable for this study. The selected benchmark functions are shown in Tables 1, 2 and 3, where "Theoretical best" is the global optimal value. These are the classical functions used by many researchers when studying optimization algorithms [47]-[51].
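Two standard members of this suite illustrate the unimodal/multimodal distinction (the sphere function is the usual F1; Rastrigin is a typical multimodal case — shown here as representative examples, not as a reproduction of the paper's tables):

```python
import numpy as np

def sphere(x):
    """Unimodal: a single global minimum f(0) = 0; probes exploitation."""
    return float(np.sum(np.asarray(x, dtype=float) ** 2))

def rastrigin(x):
    """Multimodal: many regularly spaced local minima, global minimum f(0) = 0;
    probes an algorithm's ability to explore past local optima."""
    x = np.asarray(x, dtype=float)
    return float(10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))
```

On the sphere function any downhill step helps, so fast exploiters win; on Rastrigin a greedy optimizer stalls in the nearest basin, which is exactly what the exploration-oriented tests are designed to expose.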

B. PARAMETER SETTINGS AND COMPARISONS OF OPERATING RESULTS TO VARIOUS ALGORITHMS
To test the efficiency of the proposed improved algorithm, we compared ISCSA with WOA [11], PSO [12], GWO [13], SSA [14], HHO [15], GSA [17], CSA [22], ICSA [27] and BOA [52]. To ensure a fair comparison, all algorithms were executed in the MATLAB R2017a computing environment on a computer with an i5-4210H 2.9 GHz processor and 4 GB of RAM. The unimodal and multimodal benchmark functions were run in dimensions 30, 50 and 100. In addition, we compared the performance of ISCSA and ICSA in 200 dimensions. Common parameters of the various algorithms, along with their detailed settings, are shown in Table 4.
The following settings can be seen in Table 4. In ISCSA, the weight coefficient w_max = 1.5, flight length fl = 2, awareness probability AP = 0.8, and the constant b of the logarithmic spiral shape is 1. In ICSA, experience factor ef = 0.5, flight length fl = 2, and awareness probability AP = 0.1. In CSA, flight length fl = 2 and awareness probability AP = 0.1. In HHO, the initial energy E_0 is between 0 and 1 and the jump strength J is between 0 and 2. In BOA, the power exponent is 0.1 and the sensory modality is 0.01. In SSA, the controlling parameter P_3 = 0.5. In WOA, a is linearly decreased from 2 to 0 and b is a constant defining the shape of the logarithmic spiral. In GWO, a is linearly decreased from 2 to 0. In GSA, G_0 is set to 100 and α is set to 20; in the last iteration, only final_per = 2 percent of agents apply force to the others. In PSO, c1 = c2 = 1.45 and the inertia factor w decreases linearly from 0.9 to 0.4.
Since the algorithm in this paper was improved based on the crow search algorithm, in order to reflect the fairness of the algorithm, the population size was set to 20 and the maximum number of iterations was 2,000 during the algorithm run by referring to CSA and ICSA. To eliminate contingency, each function ran 30 times independently. The mean, standard deviation and average running time provided by the algorithm were recorded in each test function and are given in Tables 5, 6, 7, 8 and 9.
First of all, the comparison algorithms selected in this paper include not only the classical meta-heuristic algorithms such as PSO and GSA, but also the newly proposed algorithms such as HHO, BOA, SSA, WOA and GWO, as well as the improved CSA algorithms such as ISCSA. The overall analysis of the data in the table showed that no algorithm can solve all the test function problems. When an algorithm performs well on some functions, it may perform poorly on others. This result is also consistent with the NFL theorem [21]. The purpose of our improved algorithm was to make it perform well in most functions, so that it could solve both numerical and practical engineering optimization problems.
Second, the performance of ISCSA on the 23 test functions must be analyzed carefully. From the results in Tables 5, 6, 7 and 8 for the unimodal and multimodal benchmark functions, ISCSA performed poorly only on F5 and F13 and achieved the best mean and standard deviation on the remaining 11 functions. The results in the three different dimensions showed that the mean and standard deviation remained optimal, indicating that our algorithm is robust and that ISCSA's performance remained strong as the dimension increased. For the 10 fixed-dimensional functions, ISCSA performed poorly only on F15 and F20; it performed best on the remaining 8 functions and found the best solutions. On function F15, ISCSA was second only to BOA. The comparison of ISCSA and ICSA in 200 dimensions in Table 9 shows that, across the 13 unimodal and multimodal functions, the two algorithms both found the optimum on F6, F9, F10 and F11, while on the remaining 9 functions ISCSA performed better than ICSA and was more stable. In terms of average running time, ISCSA ranked highly; although slower than CSA, it was faster than most algorithms.
Finally, we compared the convergence of the various algorithms. We selected eight representative functions (F1, F4, F7, F10, F11, F13, F15 and F21) of the 23 benchmark functions and plotted the convergence curves of their runs, as shown in Fig. 7, 8 and 9. From the results for the unimodal functions in Fig. 7, ISCSA and HHO performed best in the three different dimensions of F1 and found the optimal solutions; in terms of convergence speed, ISCSA was the fastest to reach the optimum. On F4, only ISCSA could find the optimal solution in all three dimensions, and its convergence rate was also the fastest. F7 is a difficult function, but ISCSA not only found a better solution than the other algorithms but also converged faster. From the results for the multimodal functions in Fig. 8, on F10, ISCSA and HHO performed best in the three dimensions and found the optimal solutions, with ISCSA converging fastest. On F11, ISCSA, ICSA, HHO and BOA found the best solution in all three dimensions; ordered from fast to slow, their convergence rates were ISCSA, ICSA, HHO and BOA. On F13, whose optimal value is difficult to find, ISCSA's performance in 30 dimensions was second only to GSA and HHO, and in 50 and 100 dimensions its convergence was superior to most algorithms, second only to HHO. As can be seen from the results for the fixed-dimensional functions in Fig. 9, ISCSA performed well on F15, second only to BOA. On F21, ISCSA found the best solution among all the algorithms and converged quickly.
We also analyzed the results across the 23 test functions: the 7 unimodal functions test the exploitation ability of an algorithm, while the 6 multimodal functions and 10 fixed-dimensional functions test its exploration ability. The experimental results showed that ISCSA performed best on six of the seven unimodal functions, which demonstrates good exploitation ability. It was superior to the other algorithms on thirteen of the sixteen multimodal and fixed-dimensional functions, demonstrating good exploration ability. Overall, the algorithm ranked first, which shows that the position update strategy of ISCSA balances its exploitation and exploration capabilities well.
In summary, the above experimental and analytical results show that ISCSA was significantly better than most existing algorithms on most test functions, which demonstrates the superiority of the improved algorithm and the necessity of improving the CSA.

C. STATISTICAL ANALYSES
In this section we aimed to arrive at an accurate conclusion and to make the advantage of the proposed improved algorithm over prior algorithms visible. Statistical tests were performed to show the significance of the results. This section uses two non-parametric tests: the Friedman test [53] and the Wilcoxon rank-sum test [51]. The Friedman test is a nonparametric test that uses ranks to detect significant differences among multiple population distributions. The average rank of each algorithm over the entire test set, obtained by the Friedman test, was used to indicate the validity and significance of the proposed ISCSA; according to the Friedman rankings, the lower the rank, the more efficient the algorithm. Tables 10, 11 and 12 show the average ranks obtained by the various algorithms using this test at a 95% confidence level. The Wilcoxon test was performed at a significance level of 5%, and the p-values are listed in Tables 13, 14 and 15. In the Wilcoxon test, the best algorithm was selected and compared with every other competitor. p-values less than 0.05 together with h values equal to 1 indicate the statistical superiority of ISCSA. Note that if the results of two algorithms were identical, the parameter p is reported as N/A; this does not, however, mean that the algorithms have exactly the same performance [51].
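Both tests are available off the shelf; the sketch below shows how they would be computed with SciPy on synthetic stand-in arrays (the data here are random placeholders, not the paper's actual results):

```python
import numpy as np
from scipy import stats

# 30 independent final-error values per algorithm (illustrative synthetic data)
rng = np.random.default_rng(1)
iscsa = rng.normal(0.001, 0.0005, 30)
csa   = rng.normal(0.010, 0.0030, 30)
other = rng.normal(0.008, 0.0020, 30)

# Wilcoxon rank-sum test: best algorithm vs. one competitor
stat, p = stats.ranksums(iscsa, csa)
h = int(p < 0.05)   # h = 1 means the difference is significant at the 5% level

# Friedman test across three algorithms, treating each run as a block
chi2, p_friedman = stats.friedmanchisquare(iscsa, csa, other)
```

The rank-sum test compares two independent samples, while the Friedman test ranks all algorithms within each block (run or function) and checks whether the average ranks differ, which is why lower average rank indicates a better algorithm.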
Analysis of Tables 10, 11 and 12 shows that in the Friedman test the mean and variance of ISCSA's rank were the smallest in every dimension and on every set of test functions. This indicates that ISCSA was the best solver, superior to the other optimization algorithms on most test problems. Fig. 10, 11 and 12 show schematics of the Friedman test results.
We see in Fig. 10, 11 and 12 that ISCSA was the best solution compared to other comparison algorithms, and the best value, mean and median were the smallest; indicating that the ISCSA had a superior performance. Compared with the standard CSA, this also proved the necessity of algorithm improvement.
Based on the analysis of Tables 13, 14 and 15, Wilcoxon rank-sum tests were conducted between ISCSA and the 9 other algorithms in the different dimensions of the 23 test functions. Compared with the other algorithms, most p-values of ISCSA were less than 0.05, with h equal to 1, showing that ISCSA is statistically significantly better than the comparison algorithms. This supports the superiority and necessity of the ISCSA algorithm.

V. ENGINEERING OPTIMIZATION PROBLEMS
In the previous section we demonstrated the superiority of our algorithm on test functions. However, the ultimate goal of an optimization algorithm is to solve real problems encountered in engineering; only by solving practical problems can the feasibility and efficiency of the algorithm be demonstrated. In this section we evaluate it on several classic engineering problems. We selected four common engineering optimization problems: the tension/compression spring design problem, the I-beam design, the gear train design and the welded beam design.

A. TENSION/COMPRESSION SPRING DESIGN PROBLEM
The purpose of the tension/compression spring design problem is to minimize the weight of the spring under one linear and three nonlinear inequality constraints. The structure of the spring is shown in Fig. 13. The problem has three continuous decision variables: the wire diameter (d or x_1), the mean coil diameter (D or x_2), and the number of active coils (P or x_3); g1, g2, g3 and g4 are four inequality constraints on these variables that must be satisfied. The mathematical formulation of the objective function and constraints is:

Minimize f(x) = (x_3 + 2) x_2 x_1²
subject to:
g1(x) = 1 − (x_2³ x_3) / (71785 x_1⁴) ≤ 0
g2(x) = (4x_2² − x_1 x_2) / (12566 (x_2 x_1³ − x_1⁴)) + 1 / (5108 x_1²) − 1 ≤ 0
g3(x) = 1 − (140.45 x_1) / (x_2² x_3) ≤ 0
g4(x) = (x_1 + x_2) / 1.5 − 1 ≤ 0
with 0.05 ≤ x_1 ≤ 2, 0.25 ≤ x_2 ≤ 1.3, 2 ≤ x_3 ≤ 15.

We applied several optimization algorithms to this design problem, including WOA [11], PSO [12], GWO [13], HHO [15], GSA [17], CSA [22], BOA [52], SA-MFO [54], MFO [55], improved HS [56], GA [57], ES [58], DE [59] and RO [60]. The best results of these methods are shown in Table 16. The results showed that the solution obtained by ISCSA was superior to those obtained by all the other methods except BOA.
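Assuming the standard tension/compression spring formulation from the literature, the objective and constraints translate directly into code, with a simple penalty wrapper of the kind typically used to feed constrained problems to a metaheuristic (the penalty weight is an illustrative choice):

```python
import numpy as np

def spring_weight(x):
    """Objective: spring weight f = (N + 2) * D * d^2."""
    d, D, N = x            # wire diameter, mean coil diameter, active coils
    return (N + 2) * D * d**2

def spring_constraints(x):
    """Standard inequality constraints; feasible when every g_i <= 0."""
    d, D, N = x
    g1 = 1 - (D**3 * N) / (71785 * d**4)
    g2 = (4*D**2 - d*D) / (12566 * (D * d**3 - d**4)) + 1 / (5108 * d**2) - 1
    g3 = 1 - 140.45 * d / (D**2 * N)
    g4 = (d + D) / 1.5 - 1
    return np.array([g1, g2, g3, g4])

def penalized(x, penalty=1e6):
    """Static-penalty fitness: objective plus penalized constraint violation."""
    g = spring_constraints(x)
    return spring_weight(x) + penalty * float(np.sum(np.maximum(g, 0.0)**2))
```

Any of the compared optimizers can then minimize `penalized` directly over the box bounds, since infeasible points are pushed above every feasible weight.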

B. I-BEAM DESIGN PROBLEM
The I-beam design problem is another difficult engineering structure optimization problem (Fig. 14). The objective is to minimize the vertical deflection of the I-beam by finding the optimal geometric parameters of the cross section. The design parameters are the length (b or x_1), the height (h or x_2), and the two thicknesses (t_w or x_3 and t_f or x_4); g1 is an inequality constraint on the four variables that must be satisfied. The mathematical formulation of the objective function and constraint is:

Minimize f(x) = 5000 / ( t_w (h − 2t_f)³ / 12 + b t_f³ / 6 + 2 b t_f ((h − t_f) / 2)² )
subject to:
g1(x) = 2 b t_f + t_w (h − 2t_f) ≤ 300
with 10 ≤ b ≤ 50, 10 ≤ h ≤ 80, 0.9 ≤ t_w ≤ 5, 0.9 ≤ t_f ≤ 5.

The results of this design problem were obtained with ISCSA and other optimization algorithms such as CS [10], CSA [22], MFO [55], ARSM and IARSM [61], and SOS [62], and are shown in Table 17. According to the results in the table, ISCSA found the optimal solution to the design problem, yielding a smaller vertical deflection of the I-beam than the other algorithms. ISCSA can therefore solve this engineering application problem well.

C. GEAR TRAIN DESIGN PROBLEM
The gear train design problem is a well-known unconstrained optimization problem in mechanical engineering. A schematic diagram of this problem is shown in Fig. 15.
The main purpose is to minimize the squared difference between the gear ratio of the four-gear train and a required ratio of 1/6.931. The design variables of this mechanical optimization problem are the tooth numbers of the gears, n_A (x_1), n_B (x_2), n_C (x_3), and n_D (x_4) [59], [60]. The mathematical formulation of the objective function is:

Minimize f(x) = (1/6.931 − (x_3 x_2) / (x_1 x_4))²
with 12 ≤ x_i ≤ 60 (integer), i = 1, ..., 4.

Optimizing the tooth numbers is a discrete problem: because the four variables must be integers, the result of each calculation was rounded. The best results of ISCSA and the other algorithms on this problem are presented in Table 18, compared with ABC [9], CS [10], CSA [22], BOA [52], MFO [55], ALO [63], MVO [64], ISA [65], MBA [66], GA [67], [68] and ALM [69]. The experimental data showed that the optimal gear ratio obtained by ISCSA was the same as that obtained by CS, MFO, MVO, ISA, and MBA, and the tooth numbers obtained by ISCSA were the same as those of CSA, BOA and ALO. These results prove that ISCSA is effective for the gear ratio problem and can handle discrete problems well.
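The gear train objective is simple enough to evaluate directly; the sketch below uses the standard formulation with the integer solution most often reported as the best known in the comparison literature:

```python
def gear_ratio_error(x):
    """Squared deviation of the achieved ratio n_B*n_C / (n_A*n_D)
    from the required ratio 1/6.931."""
    nA, nB, nC, nD = x
    return (1.0 / 6.931 - (nB * nC) / (nA * nD)) ** 2

# tooth counts (n_A, n_B, n_C, n_D) commonly reported as the best integer design
best_teeth = (49, 19, 16, 43)
```

Even though the search space is only 49^4 integer combinations, the objective is extremely flat near the optimum, so the problem is a good stress test of an algorithm's behavior after rounding continuous positions to integers.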

D. WELDED BEAM DESIGN PROBLEM
The purpose of the welded beam design problem is to minimize the construction cost of the welded beam. Based on the shear stress (τ), the bending stress in the beam (σ), the buckling load (P_c) and the end deflection of the beam (δ), the problem has four continuous design variables and two linear and five nonlinear inequality constraints. Fig. 16 shows a schematic diagram of the welded beam design problem. The four decision variables are h (x_1), L (x_2), T (x_3), and b (x_4), and g1 through g7 are seven inequality constraints on the four variables that must be satisfied. The objective function is:

Minimize f(x) = 1.10471 x_1² x_2 + 0.04811 x_3 x_4 (14 + x_2)

subject to the seven constraints g1–g7, which bound the shear stress τ(x), the bending stress σ(x), the end deflection δ(x), the buckling load P_c(x), the weld cost, and the geometry (x_1 ≤ x_4 and x_1 ≥ 0.125). This design problem was solved with different optimization algorithms, such as WOA [11], GWO [13], HHO [15], GSA [17], CSA [22], BOA [52], SA-MFO [54], MFO [55], improved HS [56], RO [60], MVO [64], CBO [70], GA [71]-[73], and CPSO [74]; their results, together with the ISCSA results, are recorded in Table 19. The results in the table show that, compared with the other numerical optimization methods, ISCSA found a welded beam design with a lower manufacturing cost than all methods except BOA.
In conclusion, based on the experimental analysis of the four engineering application problems, the following conclusions can be drawn: whether compared with newer or older algorithms, ISCSA solved the engineering optimization problems well and obtained the best solution reported so far for the I-beam design problem. It can be seen that ISCSA has strong competitiveness and good performance. This result further proved the efficiency and necessity of the proposed algorithm.

VI. CONCLUSION
This paper proposed an improved crow search algorithm based on a spiral search mechanism. The ISCSA combined a weight coefficient, optimal-position guidance, and the spiral search mechanism to accelerate convergence and balance the exploration and exploitation abilities of the algorithm. Gaussian variation and a random perturbation strategy were introduced, which increased the probability of the algorithm jumping out of local optima. The performance of the algorithm was verified on 23 typical benchmark functions. The experimental results showed that, compared with ICSA, CSA, HHO, BOA, SSA, WOA, GWO, GSA, and PSO, ISCSA greatly improved search accuracy and convergence speed by combining the various improvement strategies. We also applied the algorithm to four classic engineering optimization problems. The results showed that the improved algorithm solved these practical engineering problems better, demonstrating both the necessity of improving CSA and the superiority of ISCSA. In future work, we will study the application of ISCSA to image threshold segmentation, parameter optimization of machine learning models, feature selection, and multi-objective optimization problems, and apply it to more complex real-world engineering optimization problems.
XIAOXIA HAN received the M.A.Sc. degree in control theory and engineering and the Ph.D. degree in circuits and systems from the Taiyuan University of Technology, Shanxi, China, in 2005 and 2010, respectively. From 2015 to 2016, she was a Visiting Scholar with the University of Saskatchewan, Canada. She has been a Professor with the Taiyuan University of Technology, since 2019. Her research interests include modeling and optimal control of complex industrial processes, machine learning, and intelligent computing.
QUANXI XU received the B.Eng. degree in measurement and control technology and instrumentation from the North China University of Science and Technology, Hebei, China, in 2017. He is currently pursuing the M.A.Eng. degree in control engineering with the Taiyuan University of Technology, Shanxi, China. His research interests include modeling, optimization, and control of complex industrial processes, and data mining.
LIN YUE received the B.Eng. degree in automation from the Xi'an University of Science and Technology, Shanxi, China, in 2017. He is currently pursuing the M.A.Sc. degree in control science and control engineering with the Taiyuan University of Technology, Shanxi. His research interests include modeling, optimization, and control of complex industrial processes, and data mining.
YINGCHAO DONG received the B.Eng. degree in electrical engineering from Xinjiang University, Urumqi, China, in 2017. He is currently pursuing the M.A.Sc. degree in control science and engineering with the Taiyuan University of Technology, Shanxi, China. His research interests include modeling, optimization, and control of complex industrial processes, and data mining.
GANG XIE received the B.S. degree in control theory and the Ph.D. degree in circuits and systems from the Taiyuan University of Technology, China, in 1994 and 2006, respectively. He has been a Professor with the Taiyuan University of Technology, since 2008. He has attained six provincial science and technology awards, authored more than 100 articles, and held five invention patents. His main research interests include intelligent information processing, computer vision, and big data.
XINYING XU received the M.A.Sc. degree in control theory and engineering and the Ph.D. degree in circuits and systems from the Taiyuan University of Technology, Taiyuan, China, in 2005 and 2009, respectively. Since 2018, he has been a Professor with the Taiyuan University of Technology. His research interests include intelligent information processing and fault diagnosis.