Abstract
In this paper, the water cycle algorithm (WCA), a recently developed metaheuristic method, is proposed for solving multiobjective optimization problems (MOPs). The fundamental concept of the WCA is inspired by the water cycle process and the movement of rivers and streams to the sea in the real world. Several benchmark functions have been used to evaluate the performance of the WCA optimizer for MOPs. The optimization results obtained on the considered test functions, and comparisons with other well-known methods, illustrate the robustness and efficiency of the WCA and its exploratory capability for solving MOPs.
References
Atashpaz-Gargari E, Lucas C (2007) Imperialist competitive algorithm: an algorithm for optimization inspired by imperialistic competition. In: IEEE congress on evolutionary computation, Singapore, pp 4661–4667
Blum C, Roli A (2003) Metaheuristics in combinatorial optimization: overview and conceptual comparison. ACM Comput Surv 35(3):268–308
Coello CAC, Lechuga MS (2002) MOPSO: a proposal for multiple objective particle swarm optimization. In: Proceedings of the congress on evolutionary computation (CEC’2002), Honolulu, vol 1, pp 1051–1056
Coello CAC (2000) An updated survey of GA-based multiobjective optimization techniques. ACM Comput Surv 32(2):109–143
Coello CAC, Veldhuizen DAV, Lamont G (2002) Evolutionary algorithms for solving multiobjective problems. Genetic algorithms and evolutionary computation. Kluwer, Dordrecht
Coello CAC (2004) Handling multiple objectives with particle swarm optimization. IEEE Trans Evolut Comput 8(3):256–279
Coello CAC, Cruz Cortés N (2005) Solving multiobjective optimization problems using an artificial immune system. Genet Program Evolvable Mach 6:163–190
Deb K (2001) Multiobjective optimization using evolutionary algorithms. Wiley, New York
Deb K, Pratap A, Agarwal S, Meyarivan T (2002a) A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Trans Evolut Comput 6(2):182–197
Deb K (2002) Multiobjective genetic algorithms: problem difficulties and construction of test problems. Evol Comput 7:205–230
Deb K, Thiele L, Laumanns M, Zitzler E (2002b) Scalable multiobjective optimization test problems. In: Proceedings of the IEEE congress on evolutionary computation, pp 825–830
Eskandar H, Sadollah A, Bahreininejad A, Hamdi M (2012) Water cycle algorithm—a novel metaheuristic optimization method for solving constrained engineering optimization problems. Comput Struct 110–111:151–166
Fonseca CM, Fleming PJ (1993) Genetic algorithms for multiobjective optimization: formulation, discussion and generalization. In: Forrest S (ed) Proceedings of the fifth international conference on genetic algorithms. Morgan Kauffman, San Mateo, pp 416–423
Freschi F, Repetto M (2006) VIS: an artificial immune network for multiobjective optimization. Eng Optim 38(8):975–996
Gao J, Wang J (2010) WBMOAIS: a novel artificial immune system for multiobjective optimization. Comput Oper Res 37:50–61
Glover FW, Kochenberger GA (2003) Handbook of metaheuristics. Kluwer, Dordrecht
Haupt RL, Haupt SE (2004) Practical genetic algorithms, 2nd edn. John Wiley, New York
Holland J (1975) Adaptation in natural and artificial systems. University of Michigan Press, Ann Arbor
Kaveh A, Laknejadi K (2011) A novel hybrid charge system search and particle swarm optimization method for multiobjective optimization. Expert Syst Appl 38(12):15475–15488
Kennedy J, Eberhart R (1995) Particle swarm optimization. In: Proceedings of the IEEE international conference on neural networks. Perth, Australia, pp 1942–1948
Knowles JD, Corne DW (2000) Approximating the nondominated front using the Pareto archived evolution strategy. Evol Comput 8(2):149–172
Kursawe F (1991) A variant of evolution strategies for vector optimization. In: Proceedings of parallel problem solving from nature (PPSN I), Lecture notes in computer science, vol 496. Springer, pp 193–197
Lin Q, Chen J (2013) A novel micropopulation immune multiobjective optimization algorithm. Comput Oper Res 40(6):1590–1601
Mahmoodabadi MJ, Adljooy Safaie A (2013) A novel combination of particle swarm optimization and genetic algorithm for Pareto optimal design of a five-degree-of-freedom vehicle vibration model. Appl Soft Comput 13:2577–2591
Mostaghim S, Teich J (2003) Strategies for finding good local guides in multiobjective particle swarm optimization (MOPSO). In: Proceedings of the IEEE swarm intelligence symposium, pp 26–33
Osman IH, Laporte G (1996) Metaheuristics: a bibliography. Ann Oper Res 63:513–623
Poloni C (1997) Hybrid GA for multiobjective aerodynamic shape optimization. In: Genetic algorithms in engineering and computer science. Wiley, New York
Pradhan PM, Panda G (2012) Solving multiobjective problems using cat swarm optimization. Expert Syst Appl 39:2956–2964
Sierra MR, Coello CAC (2005) Improving PSO-based multiobjective optimization using crowding, mutation and ε-dominance. In: Proceedings of the evolutionary multi-criterion optimization conference, Guanajuato, Mexico, pp 505–519
Srinivas N, Deb K (1995) Multiobjective optimization using nondominated sorting in genetic algorithms. Evol Comput 2(3):221–248
Viennet R, Fontiex C, Marc I (1995) New multicriteria optimization method based on the use of a diploid genetic algorithm: example of an industrial problem. In: Proceedings of Artificial Evolution. Brest, France, pp 120–127
Wang L, Zhong X, Liu M (2012) A novel group search optimizer for multiobjective optimization. Expert Syst Appl 39(3):2939–2946
Wang Y, Zeng J (2013) A multiobjective artificial physics optimization algorithm based on ranks of individuals. Soft Comput 17:939–952
Zhang B, Ren W, Zhao L, Deng X (2009) Immune system multiobjective optimization algorithm for DTLZ problems. In: Fifth international conference on natural computation, pp 603–609
Zitzler E, Thiele L (1999) Multiobjective evolutionary algorithms: a comparative case study and the strength Pareto approach. IEEE Trans Evolut Comput 3(4):257–271
Zitzler E, Deb K, Thiele L (2000) Comparison of multiobjective evolutionary algorithms: empirical results. Evol Comput 8(2):173–195
Zitzler E, Laumanns M, Thiele L (2001) SPEA2: improving the strength Pareto evolutionary algorithm. TIK-Report 103, Swiss Federal Institute of Technology (ETH), Zurich, pp 1–21
Acknowledgments
This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIP) (NRF-2013R1A2A1A01013886).
Communicated by V. Loia.
Appendix A: Mathematical formulation of studied MOPs
This appendix presents the MOPs used in this paper for a qualitative assessment of the performance and efficiency of the MOWCA.

Test problem 1–DEB: Deb’s function is a problem with two design variables. This problem is defined as follows (Deb 2002):
$$\begin{aligned} \mathrm{DEB}:\min \left\{ {\begin{array}{l} f_1 (X)=x_1 \\ f_2 (X)=g(X)\times h(X) \\ \end{array}} \right. , \end{aligned}$$(22)where$$\begin{aligned} g(X)=11+x_2^2 -10\cos (2\pi x_2 ), \end{aligned}$$(23)$$\begin{aligned}&h(X)=\left\{ {\begin{array}{ll} 1-\sqrt{f_1 /g} &{}\quad \mathrm{if}\;\;f_1 (X)\le g(X) \\ 0 &{}\quad \mathrm{otherwise} \\ \end{array}} \right. \nonumber \\&\mathrm{where}\;\;0\le x_1 \le 1,\;-30\le x_2 \le 30. \end{aligned}$$(24)The Pareto optimal front for this problem is convex and defined as \(x_{1}\in [0,1]\), \(x_{2}=0\).
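For illustration, the DEB objectives can be evaluated with the short Python sketch below (the function name `deb` is ours, not from the paper; the signs follow the standard form of Deb's function):

```python
import math

def deb(x1, x2):
    """Evaluate the two DEB objectives at one design point (x1, x2)."""
    f1 = x1
    g = 11 + x2 ** 2 - 10 * math.cos(2 * math.pi * x2)  # g(X), Eq. (23)
    # h(X), Eq. (24): note g >= 1 on the feasible domain, so the sqrt is safe
    h = 1 - math.sqrt(f1 / g) if f1 <= g else 0.0
    f2 = g * h
    return f1, f2
```

On the Pareto-optimal set (x2 = 0, where g = 1) this reduces to f2 = 1 − √x1.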

Test problem 2–FON: Fonseca and Fleming’s function (FON) is an eight-design-variable function defined as follows (Fonseca and Fleming 1993):
$$\begin{aligned}&\mathrm{FON}:\min \left\{ {\begin{array}{l} f_1 (X)=1-\exp \left( {-\sum \limits _{i=1}^8 {\left( {x_i -\frac{1}{\sqrt{8} }}\right) ^2} }\right) \\ f_2 (X)=1-\exp \left( {-\sum \limits _{i=1}^8 {\left( {x_i +\frac{1}{\sqrt{8} }}\right) ^2} }\right) \\ \end{array}} \right. \nonumber \\&\quad \mathrm{where}\;-2<x_i <2,\;i=1,2,3,\ldots ,8. \end{aligned}$$(25)The optimal Pareto front for this biobjective problem is \(x_i^*\in [-1/\sqrt{8} ,1/\sqrt{8} ]\) for \(i = 1,2,3,{\ldots },8\).
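A minimal Python sketch of the FON objectives follows (the function name `fon` is ours; the signs follow the standard Fonseca–Fleming form):

```python
import math

def fon(x):
    """Evaluate the two FON objectives for an 8-variable design vector x."""
    a = 1 / math.sqrt(8)
    f1 = 1 - math.exp(-sum((xi - a) ** 2 for xi in x))
    f2 = 1 - math.exp(-sum((xi + a) ** 2 for xi in x))
    return f1, f2
```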

Test problem 3–POL: This function was first introduced by Poloni (1997) and has been widely analyzed in the literature (Deb et al. 2002a; Kaveh and Laknejadi 2011). The mathematical formulation proposed by Poloni (1997) is given as follows:
$$\begin{aligned}&\mathrm{POL}:\min \nonumber \\&\quad \left\{ {\begin{array}{l} f_1 (X)=[1+(A_1 -B_1 )^2+(A_2 -B_2 )^2] \\ f_2 (X)=[(x_1 +3)^2+(x_2 +1)^2] \\ A_1 =0.5\sin 1-2\cos 1+\sin 2-1.5\cos 2 \\ A_2 =1.5\sin 1-\cos 1+2\sin 2-0.5\cos 2 \\ B_1 =0.5\sin (x_1 )-2\cos (x_1 )+\sin (x_2 )-1.5\cos (x_2 ) \\ B_2 =1.5\sin (x_1 )-\cos (x_1 )+2\sin (x_2 )-0.5\cos (x_2 ) \\ \end{array}} \right. \nonumber \\&\quad \mathrm{where}\;-\pi<x_1 ,x_2 <\pi . \end{aligned}$$(26)The Pareto optimal front for the POL function is nonconvex and discontinuous.
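The POL objectives can be sketched in Python as follows (the function name `pol` is ours; the signs follow the standard Poloni formulation):

```python
import math

def pol(x1, x2):
    """Evaluate the two POL objectives at one design point (x1, x2)."""
    # A1, A2 are constants; B1, B2 depend on the design variables
    a1 = 0.5 * math.sin(1) - 2 * math.cos(1) + math.sin(2) - 1.5 * math.cos(2)
    a2 = 1.5 * math.sin(1) - math.cos(1) + 2 * math.sin(2) - 0.5 * math.cos(2)
    b1 = 0.5 * math.sin(x1) - 2 * math.cos(x1) + math.sin(x2) - 1.5 * math.cos(x2)
    b2 = 1.5 * math.sin(x1) - math.cos(x1) + 2 * math.sin(x2) - 0.5 * math.cos(x2)
    f1 = 1 + (a1 - b1) ** 2 + (a2 - b2) ** 2
    f2 = (x1 + 3) ** 2 + (x2 + 1) ** 2
    return f1, f2
```

At (x1, x2) = (1, 2), B1 = A1 and B2 = A2, so f1 attains its minimum value of 1.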

Test problem 4–KUR: This problem, presented by Kursawe (1991), has three design variables and a nonconvex, discontinuous Pareto optimal front. The KUR mathematical formulation is as follows (Kursawe 1991):
$$\begin{aligned}&\mathrm{KUR}:\min \left\{ {\begin{array}{l} f_1 (X)=\sum \limits _{i=1}^{n-1} {\left( {-10\exp \left( -0.2\sqrt{x_i^2 +x_{i+1}^2 } \right) }\right) } \\ f_2 (X)=\sum \limits _{i=1}^n {\left( {\left| {x_i } \right| ^{0.8}+0.5\sin x_i^3 }\right) } \\ \end{array}} \right. \nonumber \\&\quad \mathrm{where}\;-5<x_i <5,\;i=1,2,3. \end{aligned}$$(27)
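A Python sketch of the KUR objectives, with the coefficients as printed in Eq. (27) (the function name `kur` is ours):

```python
import math

def kur(x):
    """Evaluate the two KUR objectives for a 3-variable design vector x."""
    f1 = sum(-10 * math.exp(-0.2 * math.sqrt(x[i] ** 2 + x[i + 1] ** 2))
             for i in range(len(x) - 1))
    f2 = sum(abs(xi) ** 0.8 + 0.5 * math.sin(xi ** 3) for xi in x)
    return f1, f2
```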
Test problem 5–VNT: This problem is a three-dimensional problem in objective space suggested by Viennet et al. (1995). It has previously been investigated by many researchers (Freschi and Repetto 2006; Gao and Wang 2010) and is formulated as follows (Viennet et al. 1995):
$$\begin{aligned}&\mathrm{VNT}:\min \nonumber \\&\quad \left\{ {\begin{array}{l} f_1 (X)=0.5(x_1^2 +x_2^2 )+\sin (x_1^2 +x_2^2 ) \\ f_2 (X)=(3x_1 -2x_2 +4)^2/8+(x_1 -x_2 +1)^2/27+15 \\ f_3 (X)=(x_1^2 +x_2^2 +1)^{-1}-1.1\exp (-x_1^2 -x_2^2 ) \\ \end{array}} \right. \nonumber \\&\quad \mathrm{where}\;-3\le x_1 ,x_2 \le 3. \end{aligned}$$(28)It is worth mentioning that the discontinuous Pareto optimal set and the presence of several local Pareto fronts are considered to be challenging features of the VNT problem (Gao and Wang 2010).
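The three VNT objectives can be sketched in Python as follows (the function name `vnt` is ours; the signs follow the standard Viennet formulation):

```python
import math

def vnt(x1, x2):
    """Evaluate the three VNT objectives at one design point (x1, x2)."""
    r2 = x1 ** 2 + x2 ** 2  # shared squared radius term
    f1 = 0.5 * r2 + math.sin(r2)
    f2 = (3 * x1 - 2 * x2 + 4) ** 2 / 8 + (x1 - x2 + 1) ** 2 / 27 + 15
    f3 = 1 / (r2 + 1) - 1.1 * math.exp(-r2)
    return f1, f2, f3
```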

Test problem 6–ZDT1: The ZDT1 function was suggested by Zitzler et al. (2000) and has been extensively investigated (Deb et al. 2002a; Kaveh and Laknejadi 2011). This problem is described as follows:
$$\begin{aligned}&\mathrm{ZDT1}:\min \left\{ {\begin{array}{l} f_1 (X)=x_1 \\ f_2 (X)=g(X)[1-\sqrt{x_1 /g(X)} ] \\ g(X)=1+9\left( {\sum \limits _{i=2}^n {x_i }}\right) /(n-1) \\ \end{array}} \right. \nonumber \\&\quad \mathrm{where}\;0<x_i <1,\;i=1,2,3,\ldots ,30. \end{aligned}$$(29)The ZDT1 problem has 30 design variables and its Pareto optimal front is convex and defined as \(x_{1}\in [0,1]\), \(x_{i}=0\) for \(i = 2,{\ldots }, 30\).
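A Python sketch of the ZDT1 objectives (the function name `zdt1` is ours; the signs follow the standard Zitzler et al. formulation):

```python
import math

def zdt1(x):
    """Evaluate the two ZDT1 objectives for a 30-variable design vector x."""
    n = len(x)
    f1 = x[0]
    g = 1 + 9 * sum(x[1:]) / (n - 1)
    f2 = g * (1 - math.sqrt(f1 / g))
    return f1, f2
```

On the Pareto-optimal set (x_i = 0 for i ≥ 2, so g = 1) this reduces to f2 = 1 − √x1.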

Test problem 7–ZDT3: The ZDT3 problem, suggested by Zitzler et al. (2000), has 30 design variables with a nonconvex and discontinuous Pareto optimal front. The mathematical formulation of the ZDT3 problem is as follows (Zitzler et al. 2000):
$$\begin{aligned}&\mathrm{ZDT3}:\min \nonumber \\&\quad \left\{ {\begin{array}{l} f_1 (X)=x_1 \\ f_2 (X)=g(X)\left[ 1-\sqrt{x_1 /g(X)} -\frac{x_1 }{g(X)}\sin (10\pi x_1 )\right] \\ g(X)=1+9\left( {\sum \limits _{i=2}^n {x_i } }\right) /(n-1) \\ \end{array}} \right. \nonumber \\&\quad \mathrm{where}\;0<x_i <1,\;i=1,2,3,\ldots ,30. \end{aligned}$$(30)The Pareto optimal front is defined as \(x_{1}\in [0,1]\), \(x_{i}=0\) for \(i = 2, 3,4, {\ldots }, 30\).
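ZDT3 differs from ZDT1 only in the extra sine term of f2, which breaks the front into disconnected pieces; a sketch (the function name `zdt3` is ours):

```python
import math

def zdt3(x):
    """Evaluate the two ZDT3 objectives for a 30-variable design vector x."""
    n = len(x)
    f1 = x[0]
    g = 1 + 9 * sum(x[1:]) / (n - 1)
    # the sine term makes the Pareto front discontinuous
    f2 = g * (1 - math.sqrt(f1 / g) - (f1 / g) * math.sin(10 * math.pi * f1))
    return f1, f2
```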

Test problem 8–ZDT4: The ZDT4 problem, proposed by Zitzler et al. (2000), has 10 design variables and several local Pareto fronts. This problem is given as follows:
$$\begin{aligned}&\mathrm{ZDT4}:\min \nonumber \\&\quad \left\{ {\begin{array}{l} f_1 (X)=x_1 \\ f_2 (X)=g(X)[1-\sqrt{x_1 /g(X)} ] \\ g(X)=1+10(n-1)+\sum \limits _{i=2}^n {[x_i^2 -10\cos (4\pi x_i )]} \\ \end{array}}\right. ,\quad \end{aligned}$$(31)where \(x_{1}\in [0,1]\) and \(x_{i}\in [-5,5]\) for \(i = 2, 3,\dots , 10\). In addition, the optimal Pareto front is defined as \(x_{1}\in [0,1]\), \(x_{i} = 0\) for \(i = 2, 3, 4,\dots , 10\).
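The multimodal g function of ZDT4, a Rastrigin-like term, is what creates the many local Pareto fronts; a sketch (the function name `zdt4` is ours):

```python
import math

def zdt4(x):
    """Evaluate the two ZDT4 objectives for a 10-variable design vector x."""
    n = len(x)
    f1 = x[0]
    # Rastrigin-like g(X): highly multimodal in x2..xn
    g = 1 + 10 * (n - 1) + sum(xi ** 2 - 10 * math.cos(4 * math.pi * xi)
                               for xi in x[1:])
    f2 = g * (1 - math.sqrt(f1 / g))
    return f1, f2
```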

Test problem 9–ZDT6: The ZDT6 problem, introduced by Zitzler et al. (2000), has 10 design variables with a nonconvex Pareto optimal front. The mathematical formulation of this problem is given as follows:
$$\begin{aligned} \mathrm{ZDT6}:\min \left\{ {\begin{array}{l} f_1 (X)=1-\exp (-4x_1 )\sin ^6(6\pi x_1 ) \\ f_2 (X)=g(X)[1-( {f_1 (X)/g(X)})^2] \\ g(X)=1+9\left[ {( {\sum \limits _{i=2}^n {x_i } })/(n-1)} \right] ^{0.25} \\ \end{array}} \right. \nonumber \\ \;\mathrm{where}\;0<x_i <1,\;i=1,2,3,\ldots ,10.\nonumber \\ \end{aligned}$$(32)The Pareto optimal front is characterized as \(x_{1}\in [0,1]\), \(x_{i}=0\) for \(i\) from 2 to 10 for test problem 9.
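A Python sketch of the ZDT6 objectives (the function name `zdt6` is ours; the signs follow the standard Zitzler et al. formulation):

```python
import math

def zdt6(x):
    """Evaluate the two ZDT6 objectives for a 10-variable design vector x."""
    n = len(x)
    # f1 is nonuniformly distributed along the front
    f1 = 1 - math.exp(-4 * x[0]) * math.sin(6 * math.pi * x[0]) ** 6
    g = 1 + 9 * (sum(x[1:]) / (n - 1)) ** 0.25
    f2 = g * (1 - (f1 / g) ** 2)
    return f1, f2
```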

Test problem 10–DTLZ series
The DTLZ series, proposed by Deb et al. (2002b) and known as scalable problems, are considered in this paper. The mathematical formulations of the DTLZ problems are given in Table 9. The Pareto optimal front for all DTLZ problems in this paper is defined as \(x_{i} = 0.5\) for all \(x_i \in x_{M}\). The design space for the considered DTLZ problems is between zero and one.
Two and three objective functions (\(M =2\) and 3) are considered for the DTLZ series given in Table 9. The DTLZ series are minimization problems, and the number of design variables for these problems is calculated as follows:
$$\begin{aligned} n=M+\left| {x_M } \right| -1, \end{aligned}$$
where \(n\) and \(M\) are the number of design variables and the number of objective functions, respectively. Also, \(\left| {x_M } \right| \) is set to 10 for all considered problems in this paper.
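Assuming the standard DTLZ convention \(n = M + |x_M| - 1\) from Deb et al. (2002b), the variable count can be computed as below (the function name `dtlz_num_vars` is ours):

```python
def dtlz_num_vars(M, k=10):
    """Number of DTLZ design variables: n = M + |x_M| - 1,
    where k = |x_M| is set to 10 for all problems in this paper."""
    return M + k - 1
```

For the two cases considered here this gives n = 11 for M = 2 and n = 12 for M = 3.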
Cite this article
Sadollah, A., Eskandar, H., Bahreininejad, A. et al. Water cycle algorithm for solving multiobjective optimization problems. Soft Comput 19, 2587–2603 (2015). https://doi.org/10.1007/s00500-014-1424-4
Keywords
 Multiobjective optimization
 Water cycle algorithm
 Pareto-optimal solutions
 Benchmark function
 Metaheuristics