1 Introduction

Evolutionary algorithms (EAs) [1] are relatively recent unconstrained heuristic search techniques that require little problem-specific knowledge. They include nature-inspired stochastic optimization techniques that mimic the social behavior of animals or natural phenomena, known collectively as Swarm Intelligence techniques [2]. EAs are more robust than techniques such as Hill-Climbing algorithms [3], which are easily deceived in multimodal problem spaces and often get stuck in local optima.

Many optimization problems possess intrinsically binary search spaces, such as feature selection [4] and dimensionality reduction [5]. In this paper, we compare the binary versions of three evolutionary algorithms with other high-performance algorithms on unimodal and multimodal single-objective optimization problems. We also investigate the effect of two families of Transfer Functions (TFs), which map continuous variables into binary search spaces and thus make binary evolutionary algorithms (BEAs) applicable. The main contributions of our work can be summarized as:

  1. We present a comparative study between three BEAs.

  2. Our work focuses on the global minimization of four standard benchmark functions, evaluated with four metrics.

  3. We study the variations in performance due to different TFs in BEAs.

The rest of the paper is organized as follows: Sects. 2 and 3 describe the BEAs under study and their transfer functions. Section 4 describes the experimental setup and presents and discusses the results. Section 5 summarizes the ideas presented in this paper and outlines directions for future work.

2 Binary Evolutionary Algorithms

Binary versions of EAs are adapted from their regular versions, which operate in continuous search spaces. BEAs operate on search spaces that can be considered hypercubes. In addition, problems with continuous real search spaces can be converted into binary problems by mapping continuous variables to vectors of binary variables. The algorithms used for single-objective optimization in this paper are briefly presented in the following subsections.

2.1 Gravitational Search Algorithm

The Gravitational Search Algorithm (GSA), proposed by Rashedi et al. [6], is a nature-inspired heuristic optimization algorithm based on the law of gravity and mass interactions. The position of each agent represents a solution to the problem. In GSA, gravitational force causes a global movement in which objects with lighter masses move towards objects with heavier masses. The inertial mass of an object is determined by the fitness function, and the position of the heaviest mass represents the optimal solution. The binary version of GSA, known as BGSA and proposed by Rashedi et al. [7], is an efficient adaptation of GSA to binary search spaces.
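As an illustrative sketch of the mechanism described above, the mass assignment and the decaying gravitational constant of GSA can be written as follows; the helper names, the default values of \(G_0\) and \(\alpha\), and the exponential decay form are assumptions that follow the usual formulation of [6] rather than code from the cited work.

```python
import numpy as np

def gsa_masses(fitness, minimize=True):
    """Map raw fitness values to normalized masses; better agents get heavier masses."""
    fitness = np.asarray(fitness, dtype=float)
    best = fitness.min() if minimize else fitness.max()
    worst = fitness.max() if minimize else fitness.min()
    if best == worst:                              # degenerate case: all agents equal
        return np.full(fitness.shape, 1.0 / fitness.size)
    m = (fitness - worst) / (best - worst)         # in [0, 1], 1 for the best agent
    return m / m.sum()

def gravitational_constant(t, max_iter, g0=100.0, alpha=20.0):
    """Gravitational 'constant' decaying over iterations (exploration to exploitation)."""
    return g0 * np.exp(-alpha * t / max_iter)
```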

2.2 Bat Algorithm

The Bat Algorithm (BA) [8] is a meta-heuristic optimization algorithm based on the echolocation behavior of bats. BA models a colony of bats that track their prey and food using their echolocation capabilities. The search process is intensified by a local random walk. BA can be viewed as a modification of Particle Swarm Optimization in which the position and velocity of virtual microbats are updated based on the frequency of their emitted pulses and their loudness. The binary version of BA (BBA) [9] models the movement of bats (agents) across a hypercube.
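The frequency-driven velocity update described above can be sketched as follows; the frequency bounds f_min and f_max and the function name are illustrative assumptions following the common formulation of [8]. In BBA, the resulting velocities are passed through a transfer function (Sect. 3) rather than being added directly to the binary positions.

```python
import numpy as np

def bat_velocity_update(positions, velocities, best_position, f_min=0.0, f_max=2.0):
    """One BA velocity step: each bat draws a random pulse frequency and moves
    relative to the current best solution (one row per bat)."""
    beta = np.random.rand(positions.shape[0], 1)          # random scaling per bat
    frequencies = f_min + (f_max - f_min) * beta          # emitted pulse frequency
    return velocities + (positions - best_position) * frequencies
```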

2.3 Dragonfly Algorithm

The Dragonfly Algorithm (DA), proposed by Mirjalili [10], is a swarm optimization technique inspired by the swarming behavior of dragonflies in nature. DA consists of the two essential phases of optimization, exploration and exploitation, which are designed by modeling the social interaction of dragonflies in navigating, searching for food, and avoiding enemies while swarming. The binary version of DA, known as BDA, maps the five behavioral parameters (separation, alignment, cohesion, attraction towards food, and escaping from enemies) so that the positions of dragonflies (agents) can be computed in a binary search space.
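A minimal sketch of how the five behavioral terms are combined into a single step vector, following the usual DA formulation; the weight names and default values below are illustrative assumptions, and in BDA this step vector is passed through a transfer function to decide bit flips.

```python
def dragonfly_step(sep, ali, coh, food, enemy, prev_step,
                   s=0.1, a=0.1, c=0.7, f=1.0, e=1.0, w=0.9):
    """Weighted combination of separation, alignment, cohesion, food attraction and
    enemy distraction, plus inertia on the previous step (assumed default weights)."""
    return s * sep + a * ali + c * coh + f * food + e * enemy + w * prev_step
```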

3 Transfer Functions for Binary Evolutionary Algorithms

The continuous and binary versions of evolutionary algorithms differ in two components: a transfer function and a different position updating procedure. The transfer function maps a continuous search space to a binary one, and the updating procedure uses the value of the transfer function to switch the position bits of particles between 0 and 1 in the hypercube. The agents of a binary optimization problem can thus move to nearer or farther corners of the hypercube by flipping bits according to the transfer function and the position updating procedure. Various transfer functions have been studied to transform the real values of velocities into probability values in the interval [0, 1].

In this paper, we investigate two families of transfer functions first presented in [11], namely s-shaped transfer functions and v-shaped transfer functions. Table 1 presents the s-shaped and v-shaped transfer functions that are in accordance with the concepts presented by Rashedi et al. [7].

Table 1. Transfer functions for s-shaped and v-shaped families.
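As a hedged illustration, one commonly used member of each family (following the forms discussed in [11]) can be written as follows; these two functions are representative examples rather than the full set listed in Table 1.

```python
import numpy as np

def s_shaped(v):
    """S-shaped (sigmoid) transfer function: maps a velocity to a probability in [0, 1]."""
    return 1.0 / (1.0 + np.exp(-v))

def v_shaped(v):
    """V-shaped transfer function: symmetric about zero, so a large |v| gives a high flip probability."""
    return np.abs(np.tanh(v))
```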

The names s-shaped and v-shaped arise from the shapes of the curves, depicted in Fig. 1 [11]; we study the variations in performance due to these two families.

Fig. 1. (a) s-shaped and (b) v-shaped families of transfer functions

Due to the drastic difference between s-shaped and v-shaped transfer functions, different position updating rules are required. For s-shaped transfer functions, Formula (1) is employed to update positions based on velocities.

$$x^k_i(t+1)= \begin{cases} 0 & \text{if } rand < T(v_i^k(t+1)) \\ 1 & \text{if } rand \ge T(v_i^k(t+1)) \end{cases}$$
(1)

Formula (2) [11] is used to update positions for v-shaped transfer functions.

$$x^k_i(t+1)= \begin{cases} (x^k_i(t))^{-1} & \text{if } rand < T(v_i^k(t+1)) \\ x^k_i(t) & \text{if } rand \ge T(v_i^k(t+1)) \end{cases}$$
(2)

Formula (2) differs significantly from Formula (1): it flips the value of a particle's bit only when the velocity is high, instead of simply forcing a value of 0 or 1. Algorithm 1 presents the basic steps of using either s-shaped or v-shaped functions to update the positions of particles and find the optimum; the procedure is generic and can be used with any Binary Evolutionary Algorithm.

Algorithm 1. Position update for BEAs using s-shaped and v-shaped transfer functions
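The following sketch illustrates how Formulas (1) and (2) can be applied inside a generic BEA iteration; the function names and array layout (one row per agent, one column per bit) are our own assumptions, and `tf` can be any transfer function such as those sketched after Table 1.

```python
import numpy as np

def update_positions_s(positions, velocities, tf):
    """Formula (1): with an s-shaped transfer function, a bit is set to 0 when
    rand < T(v) and to 1 otherwise."""
    r = np.random.rand(*positions.shape)
    return np.where(r < tf(velocities), 0, 1)

def update_positions_v(positions, velocities, tf):
    """Formula (2): with a v-shaped transfer function, a bit is complemented when
    rand < T(v) and kept otherwise."""
    r = np.random.rand(*positions.shape)
    return np.where(r < tf(velocities), 1 - positions, positions)
```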

4 Experiment Settings and Results

4.1 Benchmark Functions and Evaluation Metrics

In order to compare the performance of the BEAs with both s-shaped and v-shaped transfer functions, four benchmark functions [12] are employed. The objective of the algorithms is to find the global minimum of each of these functions. Both unimodal (\(F_1,F_2\)) and multimodal (\(F_3,F_4\)) functions are used for performance evaluation; they are shown in Table 2. The Range column gives the boundary of each function's search space. The global minimum value of each function is 0. The two-dimensional versions of the functions are shown in Fig. 2. We use a 15-bit vector to map each continuous variable to the binary search space, where one bit is reserved for the sign of the variable; since the dimension of the benchmark functions is 5, the dimension of each agent is 75 (a decoding sketch is given after the list of metrics below). We use the following four metrics for evaluation and comparison.

  1. Average Best So Far (ABSF) solution over 40 runs in the last iteration.

  2. Median Best So Far (MBSF) solution over 40 runs in the last iteration.

  3. Standard Deviation (STDV) of the best-so-far solution over 40 runs.

  4. Best indicates the best solution over 40 runs across all iterations.
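Since the exact bit-to-real decoding is not spelled out above, the following is only a plausible sketch under the stated assumptions (one sign bit plus 14 magnitude bits per variable, magnitudes scaled linearly into the function range); the helper name and scaling scheme are hypothetical.

```python
import numpy as np

def decode_agent(bits, n_vars=5, bits_per_var=15, upper_bound=100.0):
    """Hypothetical decoding of a 75-bit agent into 5 real variables: the first bit of
    each 15-bit group is a sign bit, the remaining 14 bits encode the magnitude,
    scaled linearly into [0, upper_bound]."""
    bits = np.asarray(bits).reshape(n_vars, bits_per_var)
    signs = np.where(bits[:, 0] == 1, -1.0, 1.0)
    weights = 2 ** np.arange(bits_per_var - 2, -1, -1)    # MSB first for magnitude bits
    magnitudes = bits[:, 1:] @ weights                    # integer in [0, 2**14 - 1]
    return signs * magnitudes / (2 ** (bits_per_var - 1) - 1) * upper_bound
```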

Table 2. Benchmark functions and their input ranges
Table 3. Parameters for experiments
Fig. 2. Performance comparison on evaluation metrics

4.2 Experiment Setting

The algorithms are run 40 times with random seeds on an Intel Core 2 Duo machine with a 3.06 GHz CPU and 4 GB of RAM. Metrics are reported over multiple runs to reduce the effect of random variation, given the stochastic nature of the algorithms. Table 3 shows the parameters used in the simulations.

4.3 Results and Discussion

As may be seen from the results presented in Tables 4 and 5, both BBA and BDA perform better than BGSA. We also observe that the v-shaped family of transfer functions outperforms the s-shaped family and can significantly improve the ability of BEAs to avoid local minima on these benchmark functions. It is also evident from the results that BBA with v-shaped transfer functions performs better than BDA on most metrics. Both BBA and BDA with v-shaped transfer functions significantly outperform high-performance algorithms such as CLPSO [13], FIPS [14], and DMS-PSO [15]. Despite relatively accurate results, a high standard deviation is observed, which is attributed to the stochastic behavior of swarm optimization algorithms. The number of iterations strongly affects the ability of all three algorithms to converge to the global minimum of the unimodal functions, particularly \(F_2\), whose extremely deep valley makes convergence to the minimum very slow. Similarly, for the multimodal functions \(F_3\) and \(F_4\), increasing the number of iterations, combined with the use of v-shaped transfer functions, yields a significant improvement in both accuracy and convergence rate compared to s-shaped transfer functions.

Table 4. Minimization results of unimodal benchmark functions over 40 runs
Table 5. Minimization results of multimodal benchmark functions over 40 runs

5 Conclusion

In this paper, a comparative study of binary evolutionary algorithms is performed and the effect of s-shaped and v-shaped transfer functions on single-objective optimization performance is explored. BBA was the highest-performing algorithm when compared with BGSA, BDA, and other evolutionary algorithms in terms of avoiding local minima in both unimodal and multimodal functions. The results show a drastic improvement in performance when the v-shaped family of transfer functions is used to update the positions of agents. Our work demonstrates the merit of v-shaped transfer functions and of BBA for binary optimization. In future work, we aim to compare the performance of these transfer functions on other evolutionary and heuristic algorithms.