Cat swarm optimization for solving the open shop scheduling problem

This paper aims to demonstrate the efficiency of a computational-intelligence algorithm inspired by the behavior of cats, the cat swarm optimization algorithm, adapted to solve the open shop scheduling problem, an NP-hard problem whose importance appears in several industrial and manufacturing applications. The cat swarm optimization algorithm was applied to solve benchmark instances from the literature. The computational results, and the comparison of the relative percentage deviation of the proposed metaheuristic with others existing in the literature, show that the cat swarm optimization algorithm yields good results in reasonable execution time.


Introduction
To solve real optimization problems such as those arising in industrial and manufacturing applications, the problem should first be formulated as a theoretical one. Gonzalez and Sahni (1976) introduced one of the best-known complex combinatorial problems, the open shop scheduling problem (OSSP). There are several real-world applications of the OSSP, such as system-on-a-chip testing (Iyengar and Chakrabarty 2002), satellite-switched time-division multiple access (Dell'Amico and Martello 1996), routing packets (Suel 1995), the scheduling and wavelength assignment problem in optical networks based on wavelength-division-multiplexing technology (Bampis and Rouskas 2002), routing in optical transpose interconnect systems (Lucas 2010), and modeling communication schedules for routing in heterogeneous networks (Bhat et al. 2000).
The OSSP is classified as NP-hard (Gonzalez and Sahni 1976), which is why researchers have tried to solve it with various methods, including exact methods such as the polynomial-time algorithm proposed by Gonzalez and Sahni (1976) and the branch and bound developed by Brucker et al. (1997). In general, exact methods can only solve small instances within reasonable time. Metaheuristics have proven their efficiency at reaching high-quality, often globally optimal, solutions for problems such as the OSSP. Several metaheuristics have been used to solve the OSSP, such as simulated annealing (Liaw 1999a), the tabu search algorithm of Liaw (1999b), the genetic algorithm of Prins (2000), the extended genetic algorithm of Rahmani Hosseinabadi et al. (2018), the hybrid ant colony optimization of Blum (2005), the bee colony optimization of Huang and Lin (2011), and the particle swarm optimization of Sha and Hsu (2008).
This paper presents a new approach for solving the open shop scheduling problem, based on cat swarm optimization. To show that the proposed method is efficient, the results obtained by applying it to benchmark instances are compared with those existing in the literature. The rest of the paper is organized as follows: Section two describes and formulates the open shop scheduling problem, with an example. Section three presents the cat swarm optimization algorithm, its parameters, and its process. Section four describes the proposed adaptation of cat swarm optimization to the open shop scheduling problem. Section five shows the results obtained by applying the adapted cat swarm optimization to benchmark instances, together with a discussion. Finally, a conclusion is presented.

Presentation of the problem
The OSSP (Gonzalez and Sahni 1976) consists of n jobs J that should be processed on at most m machines M; each job consists of one operation per machine, and the order in which the operations of a job are processed is free. One performance measure considered, to be minimized, is the total execution time of the whole process, called the makespan.

Problematic assumptions
• All operations should be processed.
• Each machine can process at most one operation at a time.
• Operation processing times are deterministic and known in advance.
• Two operations of the same job cannot be processed at the same time.

Formulation of the problem
M_oi : The machine on which the operation o_i should be processed.
T_oi : The processing time of operation o_i.
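To make the formulation concrete, the notation above can be sketched in code. The representation below is illustrative, not the paper's implementation: each operation carries its job, its machine M_oi, and its processing time T_oi, and a schedule (a sequence of all operations) is decoded greedily, starting each operation as soon as both its job and its machine are free.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Operation:
    job: int      # job the operation belongs to
    machine: int  # M_oi: machine where o_i must be processed
    time: int     # T_oi: processing time of o_i

def makespan(sequence):
    """Decode an operation sequence into a non-delay schedule; return C_max."""
    job_free = {}   # earliest time each job is available again
    mach_free = {}  # earliest time each machine is available again
    c_max = 0
    for op in sequence:
        start = max(job_free.get(op.job, 0), mach_free.get(op.machine, 0))
        end = start + op.time
        job_free[op.job] = end
        mach_free[op.machine] = end
        c_max = max(c_max, end)
    return c_max

# Toy 2-jobs x 2-machines instance
ops = [Operation(0, 0, 3), Operation(0, 1, 2),
       Operation(1, 0, 2), Operation(1, 1, 4)]
print(makespan([ops[0], ops[3], ops[1], ops[2]]))  # -> 6
```

The decoder illustrates why the sequence of operations alone determines the schedule: job and machine availability times are implied by the order.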

Cat swarm optimization algorithm
Cat swarm optimization (CSO) is a computational intelligence algorithm inspired by the behavior of cats. The CSO algorithm was introduced by Chu and Tsai (2006). The algorithm is divided into two modes, the seeking mode and the tracing mode. The seeking mode represents the resting state of a cat in real life, where it spends the majority of its lifetime, and the tracing mode represents the cat hunting a prey or any moving object. Every cat is characterized by its own position, its velocity, and a flag identifying whether the cat is in the seeking mode or the tracing mode.
The CSO algorithm proposed by Chu and Tsai (2006) was improved by several researchers to increase its efficiency: using an average-inertia weight (Orouskhani et al. 2011), introducing adaptive parameter control (Wang et al. 2015), parallel cat swarm optimization (Tsai et al. 2008), solving combinatorial optimization problems (Bouzidi and Riffi 2013), solving the clustering problem (Razzaq et al. 2016), and enhanced parallel cat swarm optimization based on the Taguchi method (Tsai et al. 2012). It was also extended to solve multi-objective problems by Pradhan and Panda (2012). These variants were applied to solve difficult application problems, such as IIR system identification (Panda et al. 2011b), optimizing least-significant-bit substitution (Wang et al. 2012), optimal placement of multiple UPFCs for voltage stability enhancement under contingency (Kumar and Kalavathi 2014), direct and inverse modeling of plants (Panda et al. 2011a), single bitmap block truncation coding of color images (Cui et al. 2013), linear antenna array synthesis (Pappula and Ghosh 2014), and improved metaheuristic techniques for simultaneous capacitor and DG allocation in radial distribution networks (Kawtar et al. 2015).

Fig. 1 Information matrix

Fig. 2 The schedule information matrix
This paper presents an adapted CSO algorithm for the OSSP and, to prove its efficiency, applies it to a set of benchmark problems.

CSO to OSSP
To solve the OSSP with CSO, its operators and operations (elementary and global) were adapted. The adaptations are described as follows:

Cat's parameters
In the CSO algorithm, the solution is the global best solution (Gbest) found by the cats in the swarm. For each cat, the position represents a solution, which in the OSSP is a schedule. The velocity is used to move from one position to another; in the OSSP, a new solution is obtained by applying a set of swaps to the current solution, so the velocity is represented by this set of swaps, and the flag indicates the cat's mode. To sum up, the operators of each cat in a swarm are:
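The cat state described above can be grouped in a small data structure; this is a minimal illustrative sketch, with names of our own choosing, not the paper's code.

```python
import random

# Illustrative sketch of a cat's state: the position is a schedule (a
# permutation of operation indices), the velocity is a list of swap pairs,
# and the flag records the current mode.
class Cat:
    def __init__(self, n_ops, seeking=True, rng=random):
        self.position = list(range(n_ops))
        rng.shuffle(self.position)   # start from a random schedule
        self.velocity = []           # swap pairs (i, j) to apply later
        self.seeking = seeking       # True: seeking mode; False: tracing mode

cat = Cat(6)
print(sorted(cat.position))  # -> [0, 1, 2, 3, 4, 5]: still a permutation
```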

Mode       Parameter  Role
SM and TM  Position   The schedule solution, represented by a sequence of all operations
SM and TM  Flag       The cat mode (seeking or tracing)
TM         Velocity   Set of swap pairs (i, j) that will be applied to a position, where i and j are the ranks of two operations in the selected sequence
SM         SMP        Number of copies of the cat made in the SM
SM         CDC        Percentage length of the mutation
SM         SRD        First rank in the selected solution vector
SM         SPC        A Boolean value indicating whether the current position is retained as a candidate

Cat's process
A metaheuristic is known by its intelligent combination of two principal concepts, exploration and exploitation. In the CSO metaheuristic, both concepts appear in each mode. The two modes are combined through the mixture ratio (MR). The proposed CSO process respects the definition given by Chu and Tsai (2006), but some adaptations are made to solve the open shop scheduling problem; the adapted modes are described as follows:
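The role of MR can be sketched as follows: each iteration, a fraction MR of the swarm is placed in tracing mode and the rest in seeking mode. The flag strings and the rounding rule below are illustrative assumptions.

```python
import random

# Assign flags: a fraction MR of the cats enter tracing mode (TM), the rest
# enter seeking mode (SM); which cats get which flag is random.
def assign_flags(n_cats, mr, rng=random):
    n_tracing = round(n_cats * mr)
    flags = ["TM"] * n_tracing + ["SM"] * (n_cats - n_tracing)
    rng.shuffle(flags)
    return flags

flags = assign_flags(20, 0.3)
print(flags.count("TM"), flags.count("SM"))  # -> 6 14
```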

Seeking mode
The seeking mode (SM) represents the cat at rest, yet alert, looking around its environment for its next move. The position is represented by a schedule vector, and the four parameters of this mode were adapted accordingly. The seeking mode steps can be described as follows:

1. Make j copies of the present position of cat k, with j = SMP. If the value of SPC is true, let j = SMP − 1 and retain the present position as one of the candidates.
2. Generate a random value of SRD.
3. If the fitness values (FS) are not all equal, calculate the probability of each candidate by Eq. (a); otherwise, the probability of each candidate is set to 1:

P_i = |FS_i − FS_b| / (FS_max − FS_min), 0 < i < j   (a)

where FS_i is the fitness of cat i, FS_max is the maximum fitness in the swarm, FS_min is the minimal fitness in the swarm, and FS_b = FS_max for a minimization problem.
4. Perform the mutation and replace the current position by a selected candidate.

As an example, with SMP = 5 and SPC = true:

Step 1: The program makes four copies of the selected cat position and considers the selected position itself as a candidate, because SPC = true.
Step 2: For each copy, according to the CDC, randomly increase or decrease the current SRD percentage value and replace the old values by applying a swap between the SRD position and a second position (CDC + SRD) if this sum is less than 12 (the total size of the problem); otherwise the second position is (CDC + SRD) − (12 + 1).
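The seeking-mode steps can be sketched as below. This is a simplified, hedged version: the single random swap stands in for the CDC/SRD mutation, and the roulette-wheel selection over the candidate probabilities is an assumption.

```python
import random

def seeking_mode(position, fitness, smp=5, spc=True, rng=random):
    """One seeking-mode step for a cat holding `position` (a permutation)."""
    # Step 1: make SMP copies (SMP - 1 if the present position is a candidate).
    n_copies = smp - 1 if spc else smp
    candidates = [position[:] for _ in range(n_copies)]
    # Step 2 (simplified): mutate each copy with one random swap.
    n = len(position)
    for cand in candidates:
        i, j = rng.sample(range(n), 2)
        cand[i], cand[j] = cand[j], cand[i]
    if spc:
        candidates.append(position[:])  # retain the present position unchanged
    # Step 3: selection probabilities; FS_b = FS_max for minimization.
    fs = [fitness(c) for c in candidates]
    fs_min, fs_max = min(fs), max(fs)
    if fs_max == fs_min:
        probs = [1.0] * len(candidates)
    else:
        probs = [abs(f - fs_max) / (fs_max - fs_min) for f in fs]
    # Step 4: pick the next position proportionally to the probabilities.
    return rng.choices(candidates, weights=[p + 1e-9 for p in probs])[0]

rng = random.Random(1)
best = seeking_mode([3, 1, 0, 2],
                    fitness=lambda p: sum(abs(v - i) for i, v in enumerate(p)),
                    rng=rng)
print(sorted(best))  # the result is still a permutation of 0..3
```

Note that with minimization, a smaller fitness gives a larger |FS_i − FS_max|, so better candidates are more likely to be selected.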

The tracing mode
The tracing mode (TM) represents the cat in quick movement, according to its own velocity, while chasing a prey or any moving object. This section describes the parameters, the process, and the elementary operations used in this mode.

Tracing mode parameters. The parameters used in this mode are:

X_best  The position of the cat that has the best fitness value
X_k     The actual position of cat k
X′_k    The new position of cat k
V_k     The old (current) velocity of cat k
V′_k    The new velocity of cat k
w       The inertia weight parameter

Tracing mode process. The process of the TM is given as:

1. Update the velocity of each cat k according to the equation:

V′_k = w · V_k + r · c · (X_best − X_k)

where r is a random value and c is a constant.
2. Check whether the velocity exceeds its maximum allowed value.
3. Update the position of cat k according to the equation:

X′_k = X_k + V′_k

Elementary operations:
The elementary operations (addition, subtraction and multiplication) used in the tracing mode to solve the OSSP are not the same as those defined by Chu and Tsai (2006) for continuous optimization problems. The operations used are like the elementary operations defined for the PSO algorithm by Clerc (2004) to solve combinatorial optimization problems. These operations are performed on the velocity and the position of each cat in the TM mode.

Let x and x′ be two positions (schedules), and let a velocity v represent a list of permutations (swaps) to perform.

Addition (position + velocity): the addition, which translates the movement, applies the set of swaps of v to the position x, giving a new position x′.
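A sketch of the addition, assuming a velocity is stored as a list of index pairs:

```python
# position + velocity: apply each swap (i, j) of v to x, in order.
def add(x, v):
    x = x[:]  # work on a copy so the original position is kept intact
    for i, j in v:
        x[i], x[j] = x[j], x[i]
    return x

print(add([0, 1, 2, 3], [(0, 2), (1, 3)]))  # -> [2, 3, 0, 1]
```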

Subtraction (position − position): this operation is performed between two positions to get a velocity. The subtraction is the opposite of the addition operation (x′ − x = v ⇔ x + v = x′): given two positions x and x′, the result is the list of swaps that, performed on x, yields x′; this list of swaps is the velocity v.

Multiplication (real × velocity): this operation is performed between a real r and a velocity v = (i_k, j_k), k: 0 → |v|; the result is a velocity. The possible cases, according to the real r, are:
• If r = 0: the result is the empty velocity.
• If 0 < r ≤ 1: keep only the first r × |v| swaps of v.
• If r > 1: write r = n + x, where n is the integer part of r and x its decimal part, and reduce each part to one of the previous cases.
• If r < 0: r × v = (−r) × ¬v, where ¬v is the reversed velocity. Since (−r) > 0, one of the previous cases applies.
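The remaining two operators can be sketched similarly; the swap-recovery loop and the rounding rule for the fractional part are assumptions consistent with the description above, not the paper's exact implementation.

```python
def subtract(x_to, x_from):
    """position - position: a velocity v such that x_from + v == x_to."""
    x = x_from[:]
    v = []
    for k in range(len(x)):
        if x[k] != x_to[k]:
            j = x.index(x_to[k])  # locate the wanted value and swap it in
            v.append((k, j))
            x[k], x[j] = x[j], x[k]
    return v

def multiply(r, v):
    """real x velocity: repeat for the integer part, truncate for the
    decimal part, and reverse/negate the velocity for a negative r."""
    if r < 0:
        return multiply(-r, [(j, i) for i, j in reversed(v)])
    n = int(r)
    frac = r - n
    return v * n + v[: round(frac * len(v))]

x, xp = [0, 1, 2, 3], [2, 3, 0, 1]
v = subtract(xp, x)
print(v)                 # -> [(0, 2), (1, 3)]
print(multiply(0.5, v))  # -> [(0, 2)]: keep the first half of the swaps
```

With these operators, the tracing-mode velocity update could be written as multiply(w, v_k) + multiply(r * c, subtract(x_best, x_k)), followed by the position update x_k = x_k + v_k (applying the swaps).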
The complete process: the flowchart in Fig. 4 describes the complete CSO process.

Computational results and discussion
This section presents the results obtained by the proposed CSO on benchmark instances of Taillard (1993) and Guéret and Prins (1999), together with a discussion. The proposed adaptation is coded in Visual C++ and runs on an Ultrabook with a 2.1 GHz (up to 2.59 GHz) Intel Core i7 processor and 8 GB of RAM. Each instance is run 10 times with a maximum time of 1 h per run.

Parameter tuning
The parameter values used in the CSO process are the SMP and CDC used in the seeking mode, the parameters w, r and c used in the tracing mode, and the MR used in the general process.

To respect the real-life behavior of cats, this paper does not change the SMP, CDC, MR and c values, nor the range of variation of the random value r.

For the inertia weight parameter w, this paper uses the fixed value considered in the proposal of CSO for combinatorial problems (Bouzidi and Riffi 2014b); that value was analyzed and discussed in the application to the TSP problem (Bouzidi and Riffi 2013) and was afterwards used to solve other combinatorial problems, such as the QAP (Riffi and Bouzidi 2014), the JSSP (Bouzidi and Riffi 2014a) and the FSSP (Bouzidi and Riffi 2015).

To summarize, the parameter values used are shown in Table 1.

Evaluation of the proposed algorithm
This section presents two tables showing the results collected by applying the adapted CSO algorithm to the benchmark instances of Taillard (1993) in Table 2 and of Guéret and Prins (1999) in Table 3. For each instance, the number of jobs J and machines M is given (J × M); the best-known solution (BKS) is the best fitness found by existing methods for the selected instance; the best solution (BS) is the best found by the CSO algorithm in ten runs of the program, and a BS lower than the BKS is marked by * after the instance name. To assess the efficiency of the CSO, the relative percentage deviation (RPD) is calculated, and the average execution time (Aver_T) is given in seconds.

The RPD is calculated by:

RPD = 100 × (BS − BKS) / BKS

Table 2 shows the results obtained by applying the proposed CSO to the Taillard (1993) benchmark instances.
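The RPD is commonly computed as 100 × (BS − BKS) / BKS; a minimal sketch, assuming that convention:

```python
# Relative percentage deviation of the best found solution BS from the
# best-known solution BKS (0.0 means the best-known value was matched).
def rpd(bs, bks):
    return 100.0 * (bs - bks) / bks

print(rpd(1040, 1000))  # -> 4.0
print(rpd(1000, 1000))  # -> 0.0
```

A negative RPD would indicate a new best solution, i.e. a BS strictly better than the BKS.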
Table 2 shows that applying CSO to all the selected Taillard instances yields low-to-negligible RPD values, and each instance is solved in a reasonable execution time.
Table 3 shows the obtained results by the application of CSO to solve the OSSP benchmark instances proposed by Guéret and Prins (1999).
The CSO also finds the optimal solution of many benchmark instances of Guéret and Prins, with low RPD values, and the solutions are obtained in a reasonable execution time. Moreover, when the application runs longer, the CSO algorithm gives better solutions on the selected benchmark instances. Table 4 shows that, with more time (more than 900 s), the CSO application can reach the best-known solutions. Table 4 lists some benchmark instances with their size (number of jobs J and number of machines M), the best-known solution (BKS), and the best solution (BS) found by the CSO method; RPD1 is the relative percentage deviation when executing the application for a maximal time of 900 s, RPD2 is the relative percentage deviation when executing the application for more than 900 s, until it obtains the best solution, and Aver_T is the average execution time over ten runs, in seconds.

Comparison with other metaheuristics and discussion
This part compares the average RPD of seven methods, including the one obtained by applying CSO to the OSSP. The other methods are two hybrid variable neighborhood search (VNS) methods, a VNS based on curtailed local search (CLS) and a VNS based on greedy local search (GLS), the two-phase solution method (TPSM), the genetic algorithm (GA), the genetic algorithm incorporating mutation (MGA) (Naderi and Zandieh 2014), and the electromagnetism algorithm (EA). All these methods were run on a PC with a 2.0 GHz Intel Core 2 Duo and 2 GB of RAM. The average RPD obtained by each metaheuristic for the different problem sizes of the two well-known benchmarks (Taillard 1993; Guéret and Prins 1999) is presented in Table 5. For the comparative study, a graphical representation provides a visual display that makes it easier to assess the collected average RPD results for the different benchmark instances; for this reason, the results in Table 5 were translated into two graphs: Fig. 5 represents the variation of the RPD of the seven methods for the different sizes of the Taillard benchmark instances, and Fig. 6 does the same for the benchmark instances of Guéret and Prins.
Figure 5 shows that the CSO algorithm has the lowest RPD for the different problem sizes, which means that it is more efficient than the others.
Likewise, Fig. 6 shows that the CSO algorithm has the lowest RPD for the different problem sizes, which means that the CSO is more efficient than the other methods.

Conclusion
To conclude, this paper demonstrated the efficiency of the cat swarm optimization algorithm in solving the open shop scheduling problem, through its ability to find the best-known solutions for some benchmark instances and new best solutions for others. The comparison of the CSO results with those of some recent methods for the OSSP also confirmed the efficiency of the CSO algorithm. Future work aims to extend the proposed metaheuristic to various real applications based on the OSSP.

Compliance with ethical standards
Conflict of interest To the authors' knowledge, there is no conflict of interest with this research.
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Fig. 5 Mean plots for metaheuristic versus the problem size of Taillard benchmark instances

Table 5
The average RPD obtained by each of the methods on two well-known benchmarks