
1 Introduction

Optimization aims at finding the best possible solution(s) to a given problem. In the real world, many problems can be cast as optimization problems, and as their scale and complexity grow, new optimization techniques are needed more than ever. Over the past few decades, many meta-heuristic techniques have been proposed to solve such problems and have become very popular. These meta-heuristics treat the problem as a black box, requiring only its inputs and outputs. In recent years, several well-known meta-heuristic algorithms have been proposed in this field, such as Differential Evolution [1], Particle Swarm Optimization (PSO) [2], the Bat Algorithm (BA) [3], Moth-Flame Optimization (MFO) [4], Grey Wolf Optimization (GWO) [5], and Cuckoo Search (CS) [6]. Most of these algorithms are inspired by various natural phenomena, and they are widely used in a variety of scientific and industrial fields.

The salp swarm algorithm (SSA) was proposed by Mirjalili et al. [7]. It has shown powerful results when compared with other state-of-the-art meta-heuristic optimization algorithms, and its authors applied it to engineering design problems, such as the welded beam design problem, with good results. Although the SSA performs well compared with some traditional algorithms, it still has drawbacks: it spends too much time in the search phase, and its convergence speed and calculation accuracy need improvement. To overcome these problems, a simplex-method-based salp swarm algorithm is proposed. The simplex method [8] has a strong ability to escape local optima and to improve the search for the global optimum. In this work, an improved version of the SSA based on the simplex method, named SMSSA, is proposed with the aim of enhancing the convergence precision of the basic SSA.

The rest of the paper is organized as follows: Sect. 2 briefly introduces the original SSA algorithm and the simplex method. Sect. 3 describes the proposed SMSSA algorithm in detail. In Sect. 4, the performance of SMSSA is demonstrated on four benchmark functions and compared with five well-known meta-heuristic algorithms, including the original SSA. In Sect. 5, SMSSA is employed to solve an engineering design problem, and the results are analyzed and discussed. Sect. 6 concludes the work.

2 Related Works

This section provides brief background information about the salp swarm algorithm. The salp swarm algorithm [9] is a recent meta-heuristic optimization algorithm proposed by Seyedali Mirjalili. The SSA is inspired by the foraging and navigating behavior of salps in the ocean. Salps belong to the family Salpidae and have transparent, barrel-shaped bodies; their tissues are very similar to those of jellyfish. Salps move forward in the water by pumping water through their bodies for propulsion [10].

In the SSA, the salp chain is mathematically modeled as two groups: leaders and followers. The leader salp updates its position as follows:

$$ X_{j}^{i} = \left\{ {\begin{array}{*{20}l} {F_{j} + c_{1} ((ub_{j} - lb_{j} )c_{2} + lb_{j} )} \hfill & {c_{3} \ge 0.5} \hfill \\ {F_{j} - c_{1} ((ub_{j} - lb_{j} )c_{2} + lb_{j} )} \hfill & {c_{3} < 0.5} \hfill \\ \end{array} } \right. $$
(1)

where \( X_{j}^{i} \) indicates the position of the leader salp in the jth dimension, \( F_{j} \) indicates the position of the food source in the jth dimension, \( ub_{j} \) and \( lb_{j} \) denote the upper and lower bounds of the salps in the jth dimension, and \( c_{1} ,c_{2} ,c_{3} \) are three random coefficients.

The follower salps update their positions according to:

$$ X_{\text{j}}^{i} = \frac{1}{2}(X_{j}^{i} + X_{j}^{i - 1} ) $$
(2)

where \( i \ge \frac{N}{2} \) and \( X_{j}^{i} \) denotes the position of the ith follower salp in the jth dimension. Equations (1) and (2) together simulate the movement of the salp chain. The steps of the salp swarm algorithm (SSA) are described by the pseudo code in Algorithm 1:

Algorithm 1. Pseudo code of the salp swarm algorithm (SSA)
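
For readers who prefer code to pseudo code, the following is a minimal Python sketch of the SSA described by Eqs. (1) and (2). The function name, the use of NumPy, and the time-decreasing schedule for \( c_{1} \) (taken from the original SSA paper, with \( c_{2} ,c_{3} \) drawn uniformly at random) are illustrative assumptions rather than the exact implementation used in this work.

```python
import numpy as np

def ssa(obj, lb, ub, n_salps=30, n_iter=1000, seed=0):
    """Minimal sketch of the salp swarm algorithm (Eqs. 1-2).

    obj: objective function mapping a 1-D position vector to a scalar.
    lb, ub: 1-D arrays of lower / upper bounds for each dimension.
    """
    rng = np.random.default_rng(seed)
    dim = len(lb)
    X = rng.uniform(lb, ub, size=(n_salps, dim))      # initial salp positions
    fit = np.apply_along_axis(obj, 1, X)
    best = X[fit.argmin()].copy()                     # food source F
    best_fit = fit.min()

    for t in range(1, n_iter + 1):
        # c1 decreases over iterations, as in the original SSA paper
        c1 = 2 * np.exp(-(4 * t / n_iter) ** 2)
        for i in range(n_salps):
            if i < n_salps // 2:                      # leader half, Eq. (1)
                c2 = rng.random(dim)
                c3 = rng.random(dim)
                step = c1 * ((ub - lb) * c2 + lb)
                X[i] = np.where(c3 >= 0.5, best + step, best - step)
            else:                                     # follower half, Eq. (2)
                X[i] = 0.5 * (X[i] + X[i - 1])
        X = np.clip(X, lb, ub)                        # keep salps inside the bounds
        fit = np.apply_along_axis(obj, 1, X)
        if fit.min() < best_fit:                      # update the food source
            best_fit = fit.min()
            best = X[fit.argmin()].copy()
    return best, best_fit
```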

3 The Proposed SMSSA Approach

The simplex-method-based salp swarm algorithm (SMSSA) proposed in this paper is designed to improve population diversity and speed up convergence. The simplex method helps the algorithm jump out of local optima and increases the diversity of the population, which allows SMSSA to balance the exploration and exploitation abilities of the SSA. Therefore, the position of the worst salp is updated with the simplex method after each iteration. The modified procedure is given in Algorithm 2:

Algorithm 2. Pseudo code of the proposed SMSSA
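
The paper does not spell out the exact simplex operations, so the sketch below is one plausible reading: after each iteration, the worst salp is replaced through Nelder-Mead-style reflection, expansion, and contraction about the centroid of the best salps. The coefficient values and the choice of the two best salps for the centroid are assumptions.

```python
import numpy as np

def simplex_update_worst(X, fit, obj, lb, ub, alpha=1.0, gamma=2.0, beta=0.5):
    """Replace the worst salp with a simplex (Nelder-Mead style) move.

    X: (n, dim) positions; fit: (n,) fitness values; obj: objective to minimize.
    alpha / gamma / beta: reflection / expansion / contraction coefficients (assumed).
    """
    worst = fit.argmax()
    order = fit.argsort()
    centroid = 0.5 * (X[order[0]] + X[order[1]])   # centroid of the two best salps

    # Reflect the worst salp through the centroid
    xr = np.clip(centroid + alpha * (centroid - X[worst]), lb, ub)
    fr = obj(xr)
    if fr < fit[order[0]]:
        # Expansion: reflection beat the current best, so push further
        xe = np.clip(centroid + gamma * (centroid - X[worst]), lb, ub)
        fe = obj(xe)
        xr, fr = (xe, fe) if fe < fr else (xr, fr)
    elif fr >= fit[worst]:
        # Contraction: reflection did not improve the worst salp
        xc = np.clip(centroid - beta * (centroid - X[worst]), lb, ub)
        fc = obj(xc)
        xr, fr = (xc, fc) if fc < fit[worst] else (X[worst], fit[worst])

    if fr < fit[worst]:                            # accept only improving moves
        X[worst], fit[worst] = xr, fr
    return X, fit
```

In SMSSA, this update would be invoked once per iteration, right after the leader and follower updates of Eqs. (1) and (2).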

4 Simulation Experiments

4.1 Simulation Platform

The algorithms are tested in MATLAB R2016a on a Windows 10 computer with an Intel Core (TM) i5-4590 processor at 3.30 GHz and 4 GB RAM.

4.2 Benchmark Functions

Benchmark functions are widely used in this field to assess the performance of an algorithm on a set of representative mathematical functions whose global optima are known. Following the same procedure, four standard benchmark functions from the literature [8, 9] are used as a comparative test bed. Tables 1 and 2 give the mathematical formulations of the benchmark functions. In these tables, Range denotes the boundary of the search space, Dim denotes the dimension of the function, and \( f_{\hbox{min} } \) represents the theoretical minimum (optimal value). Heuristic algorithms are stochastic optimization techniques, so they must be run many times to produce meaningful statistical results; the result of the last iteration of each run is recorded as its best solution. Following this practice, results are generated and reported over 30 independent runs.

Table 1. Unimodal benchmark functions
Table 2. Multimodal benchmark functions

To evaluate the performance of the proposed SMSSA algorithm, we compare it with several recent and well-known algorithms: CS [6], MFO [4], PSO [2], BA [3], and SSA [7]. Every algorithm uses a population of 30 individuals and runs for 1000 iterations.
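
Assuming the sketches above, a single run under the same settings (30 salps, 1000 iterations) could be wired up as follows; the sphere objective and the bounds are purely illustrative stand-ins, not the benchmark functions of Tables 1 and 2.

```python
import numpy as np

# Illustrative 30-dimensional sphere function used only as a placeholder objective
sphere = lambda x: float(np.sum(x ** 2))
lb = np.full(30, -100.0)
ub = np.full(30, 100.0)

# Same settings as in the experiments: 30 salps, 1000 iterations
best, best_fit = ssa(sphere, lb, ub, n_salps=30, n_iter=1000)
print(best_fit)
```

Inserting the simplex update of Sect. 3 after each iteration of this loop yields the SMSSA variant evaluated below.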

In this work, Best, Mean, Worst, and Std denote the best fitness value, the average fitness value, the worst fitness value, and the standard deviation, respectively. The experimental results are shown in Tables 3 and 4, with the best results in bold type. In addition, because the algorithms are stochastic, statistical tests should be conducted to confirm the significance of the results [10]. To determine whether the SMSSA results differ statistically from the results of CS, MFO, PSO, BA, and SSA, the non-parametric Wilcoxon rank-sum test [11] is performed at the 5% significance level. Tables 5 and 6 list the pairwise comparisons of the best values generated by the Wilcoxon test for five groups: CS versus SMSSA, MFO versus SMSSA, PSO versus SMSSA, BA versus SMSSA, and SSA versus SMSSA. Generally, p values < 0.05 are regarded as strong evidence against the null hypothesis. Through this statistical test, we can confirm that the results are not produced by chance.

Table 3. The results of unimodal benchmark functions

In addition, the p values computed by the non-parametric Wilcoxon test are reported as a measure of significance. Tables 5 and 6 show the results of the rank-sum test.
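
As an illustration of how such p values can be obtained, the sketch below applies SciPy's rank-sum test to two hypothetical arrays of per-run best fitness values; the data are placeholders, not the values reported in Tables 5 and 6.

```python
import numpy as np
from scipy.stats import ranksums

rng = np.random.default_rng(1)
# Placeholder per-run best fitness values from 30 independent runs of each algorithm
smssa_runs = rng.normal(loc=1e-8, scale=1e-9, size=30)
ssa_runs = rng.normal(loc=1e-5, scale=1e-6, size=30)

stat, p_value = ranksums(smssa_runs, ssa_runs)
significant = p_value < 0.05   # 5% significance level
print(p_value, significant)
```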

4.3 Unimodal Benchmark Functions

The unimodal benchmark functions have only one global optimum and no local optima, so they are well suited to benchmarking the convergence of an algorithm. The results in Table 3 show that SMSSA is highly competitive in searching for the global optimum: on \( f_{1} \sim f_{3} \), SMSSA is superior to most of the other algorithms. Therefore, SMSSA performs better at finding the global minimum of unimodal benchmark functions. As shown in Table 5, the p values for \( f_{1} \sim f_{3} \) indicate that SMSSA achieves a significant improvement over the other algorithms on these functions. The unimodal benchmark functions thus confirm that SMSSA has better performance in searching for the global optimum. Figures 1, 2 and 3 show the average convergence curves of all algorithms on the unimodal benchmark functions, obtained from 30 independent runs.

Table 4. The results of multimodal benchmark functions.
Table 5. p-values rank-sum test on unimodal benchmark functions
Fig. 1. The convergence curves for \( f_{1} \)

Fig. 2. The convergence curves for \( f_{2} \)

Fig. 3. The convergence curves for \( f_{3} \)

Fig. 4. The standard deviation for \( f_{1} \)

4.4 Multimodal Benchmark Functions

Compared with the unimodal benchmark functions, the multimodal benchmark functions have many local optima, and their number increases exponentially with the dimension. This makes them well suited to benchmarking the exploration ability of an algorithm: the results reflect how well an algorithm avoids local minima and ultimately reaches the global minimum. Table 4 lists the results of the algorithms on the multimodal benchmark functions. Considering the Best, Worst, Mean, and Std values, SMSSA provides more competitive results on these functions, indicating an advantage in exploration. Most of the p values for \( f_{4} \) in Table 6 are less than 0.05, which supports rejecting the null hypothesis. This evidence shows that the results of SMSSA do not occur by accident in the statistical sense.

Table 6. p-values rank-sum test on multimodal benchmark functions

As Table 4 and Fig. 5 show, SMSSA converges faster than the other algorithms on the multimodal benchmark functions. In summary, this evidence shows that the algorithm is more stable and robust than the other algorithms (Figs. 4 and 6).

Fig. 5. The convergence curves for \( f_{4} \)

Fig. 6. The standard deviation for \( f_{4} \)

5 SMSSA for Engineering Optimization Problems

In this section, an engineering problem (the spring design problem) is solved with the SMSSA algorithm to demonstrate its performance. Real problems of this kind involve inequality constraints, and several methods have been employed in the literature to handle constraints: special operators, penalty functions, repair algorithms, and hybrid methods [12]. In this work, a penalty method is applied to handle the constraints of the spring design problem. The spring design problem [13] is a classic engineering design problem whose main purpose is to minimize the weight of the spring illustrated in Fig. 7. The model of this problem is described as follows:

Fig. 7. The spring design problem.

$$ \begin{array}{*{20}l} {\text{Consider}} \hfill & {\vec{x} = [x_{1} \,x_{2} \,x_{3} ] = [R\,D\,N],} \hfill \\ {\text{Minimize}} \hfill & {f(\vec{x}) = (x_{3} + 2)x_{2} x_{1}^{2} ,} \hfill \\ {\text{Subject to}} \hfill & {g_{1} (\vec{x}) = 1 - \frac{{x_{2}^{3} x_{3} }}{{71785x_{1}^{4} }} \le 0,\,g_{2} (\vec{x}) = \frac{{4x_{2}^{2} - x_{1} x_{2} }}{{12566(x_{2} x_{1}^{3} - x_{1}^{4} )}} + \frac{1}{{5108x_{1}^{2} }} - 1 \le 0,} \hfill \\ {} \hfill & {g_{3} (\vec{x}) = 1 - \frac{{140.45x_{1} }}{{x_{2}^{2} x_{3} }} \le 0,\,g_{4} (\vec{x}) = \frac{{x_{1} + x_{2} }}{1.5} - 1 \le 0,} \hfill \\ {\text{Variable range}} \hfill & {0.05 \le x_{1} \le 2.00,\,0.25 \le x_{2} \le 1.30,\,2.00 \le x_{3} \le 15.0} \hfill \\ \end{array} $$
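
As a concrete illustration, the constrained model above can be folded into a single penalized objective that any of the compared algorithms can minimize. The quadratic penalty form and the penalty weight below are assumptions, since the paper only states that a penalty method is used.

```python
import numpy as np

def spring_weight_penalised(x, penalty=1e6):
    """Penalized spring-design objective (penalty weight is an assumed value)."""
    x1, x2, x3 = x
    f = (x3 + 2.0) * x2 * x1 ** 2                  # spring weight to minimize
    g = np.array([                                 # inequality constraints g_i(x) <= 0
        1.0 - (x2 ** 3 * x3) / (71785.0 * x1 ** 4),
        (4.0 * x2 ** 2 - x1 * x2) / (12566.0 * (x2 * x1 ** 3 - x1 ** 4))
            + 1.0 / (5108.0 * x1 ** 2) - 1.0,
        1.0 - 140.45 * x1 / (x2 ** 2 * x3),
        (x1 + x2) / 1.5 - 1.0,
    ])
    # Quadratic penalty for constraint violations only
    return f + penalty * np.sum(np.maximum(g, 0.0) ** 2)
```

SMSSA (or the SSA sketch above) can then minimize spring_weight_penalised over the stated variable ranges.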

As shown in Table 7, the spring design problem has been solved by several different approaches, including meta-heuristic algorithms such as GSA [14], PSO [15], and evolutionary genetic algorithms (GA) [16]. The statistics lead to the conclusion that SMSSA outperforms the other algorithms on this problem.

Table 7. Comparison results for spring design problem

6 Conclusion

This work proposed an improved algorithm named SMSSA, which is based on the simplex method and aims at improving the performance of the original SSA. The experimental results in Sect. 4 show that SMSSA achieves both faster convergence and better solutions than the compared algorithms on four benchmark functions. In addition, SMSSA was applied to an engineering problem; the results in Sect. 5 show that it performs well on engineering constraint problems. By combining the advantages of both techniques, SMSSA strikes a balance between exploitation and exploration when dealing with classical engineering problems.