Abstract
In recent years, many researchers have proposed a wide variety of meta-heuristic optimization algorithms. Most achieve remarkable performance by infusing natural phenomena or biological behaviors into the search logic, as in Particle Swarm Optimization (PSO) and Cuckoo Search. Despite their promising performance, these algorithms share a drawback: it is hard for traditional swarm optimization algorithms to strike a perfect balance between global exploration and local exploitation. Like an either-or problem, algorithms with stronger global exploration capability tend to have weaker local exploitation capability, and vice versa. To address this problem, in this paper we propose a novel Dynamic Group Search Algorithm (DGSA) with enhanced intra-group and inter-group communication mechanisms. In particular, we devise a formless "group" concept in which solution vectors can move between groups dynamically according to the fitness of each group's best solution, so that better groups attract more vectors. Vectors inside a group focus mainly on local exploitation to strengthen the local search, while inter-group communication ensures strong global exploration capability. To avoid becoming stuck at local optima, we introduce two types of crossover operators and an inter-group mutation. Experiments on benchmark test functions comparing DGSA with other well-known optimization algorithms are reported; DGSA outperforms the other algorithms in most cases. DGSA is also applied to the welded beam design problem, and the promising results on this real-world problem demonstrate the applicability of DGSA to engineering design.
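The dynamic grouping mechanism described in the abstract can be illustrated with a minimal sketch. Everything here is a hypothetical reading of the abstract, not the authors' exact formulation: the function name, the rank-proportional allocation rule, and the assumption of a minimization problem are all illustrative choices.

```python
def reassign_groups(population, fitness, n_groups):
    """Hypothetical sketch of DGSA-style dynamic grouping: groups whose
    seeds are fitter attract more solution vectors. The allocation rule
    below (sizes proportional to inverse group rank) is an assumption."""
    # Rank solution indices by fitness (minimization assumed: lower is better)
    order = sorted(range(len(population)), key=lambda i: fitness[i])
    # Group sizes proportional to inverse rank: group 0 (best) gets the
    # largest share, the last group the smallest, e.g. weights [3, 2, 1]
    weights = [n_groups - g for g in range(n_groups)]
    total = sum(weights)
    sizes = [max(1, round(w / total * len(population))) for w in weights]
    # Adjust rounding so the sizes sum exactly to the population size
    while sum(sizes) > len(population):
        sizes[sizes.index(max(sizes))] -= 1
    while sum(sizes) < len(population):
        sizes[0] += 1
    # Assign the best-ranked vectors to the best group, and so on down
    groups, start = [], 0
    for s in sizes:
        groups.append([population[i] for i in order[start:start + s]])
        start += s
    return groups
```

In this sketch, a reassignment step after each generation would let vectors migrate toward currently successful groups, which matches the abstract's "the better the group, the more vectors" behaviour at a high level.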
Acknowledgement
The authors are thankful for the financial support from the Research Grant 'Temporal Data Stream Mining by Using Incrementally Optimized Very Fast Decision Forest (iOVFDF)', Grant No. MYRG2015-00128-FST, and the Research Grant 'Nature-Inspired Computing and Metaheuristics Algorithms for Optimizing Data Mining Performance', Grant No. MYRG2016-00069-FST, both offered by the University of Macau, FST, and RDAO.
Appendix: Benchmark functions
This suite of benchmark functions is widely used in optimization studies and can be divided into two groups: the first comprises seven unimodal functions f1–f7, and the second contains six multimodal functions f8–f14. Unimodal functions have a single optimum and are suitable for benchmarking an algorithm's exploitation capability and convergence speed. For multimodal functions, the final results matter much more, because these functions have many local minima and reflect an algorithm's ability to escape from poor local optima.
Function | Dim | Range | \(f_{\min }\) |
---|---|---|---|
\(f1\left( x \right) = \mathop \sum \limits_{i = 1}^{n} x_{i}^{2}\) | 30 | [−100, 100] | 0 |
\(f2\left( x \right) = \mathop \sum \limits_{i = 1}^{n} |x_{i} | + \mathop \prod \limits_{i = 1}^{n} |x_{i} |\) | 30 | [−10, 10] | 0 |
\(f3\left( x \right) = \mathop \sum \limits_{i = 1}^{n} \left( {\mathop \sum \limits_{j = 1}^{i} x_{j} } \right)^{2}\) | 30 | [−100, 100] | 0 |
\(f4\left( x \right) = { \hbox{max} }_{i} \{ |x_{i} |,1 \le i \le n\}\) | 30 | [−100, 100] | 0 |
\(f5\left( x \right) = \mathop \sum \limits_{i = 1}^{n - 1} \left[ {100\left( {x_{i + 1} - x_{i}^{2} } \right)^{2} + \left( {x_{i} - 1} \right)^{2} } \right]\) | 30 | [−30, 30] | 0 |
\(f6\left( x \right) = \mathop \sum \limits_{i = 1}^{n} \left( {x_{i} + 0.5} \right)^{2}\) | 30 | [−100, 100] | 0 |
\(f7\left( x \right) = \mathop \sum \limits_{i = 1}^{n} ix_{i}^{4} + random\left[ {0,1} \right)\) | 30 | [−1.28, 1.28] | 0 |
\(f8\left( x \right) = \mathop \sum \limits_{i = 1}^{n} - x_{i} \sin \left( {\sqrt {\left| {x_{i} } \right|} } \right)\) | 30 | [−500, 500] | −418.9829 × 30 |
\(f9\left( x \right) = \mathop \sum \limits_{i = 1}^{n} [x_{i}^{2} - 10cos\left( {2\pi x_{i} } \right) + 10]\) | 30 | [−5.12, 5.12] | 0 |
\(f10\left( x \right) = - 20\exp \left( { - 0.2\sqrt {\frac{1}{n}\mathop \sum \limits_{i = 1}^{n} x_{i}^{2} } } \right) - \exp \left( {\frac{1}{n}\mathop \sum \limits_{i = 1}^{n} \cos 2\pi x_{i} } \right) + 20 + e\) | 30 | [−32, 32] | 0 |
\(f11\left( x \right) = \frac{1}{4000}\mathop \sum \limits_{i = 1}^{n} x_{i}^{2} - \mathop \prod \limits_{i = 1}^{n} cos\left( {\frac{{x_{i} }}{\sqrt i }} \right) + 1\) | 30 | [−600, 600] | 0 |
\(\begin{aligned} f12\left( x \right) & = \frac{\pi }{n}\left\{ {10\sin^{2} \left( {\pi y_{1} } \right) + \mathop \sum \limits_{i = 1}^{n - 1} \left( {y_{i} - 1} \right)^{2} \left[ {1 + 10\sin^{2} \left( {\pi y_{i + 1} } \right)} \right] + \left( {y_{n} - 1} \right)^{2} } \right\} + \mathop \sum \limits_{i = 1}^{n} u\left( {x_{i} ,10,100,4} \right), \\ y_{i} & = 1 + \frac{{x_{i} + 1}}{4}, \\ u\left( {x_{i} ,a,k,m} \right) & = \left\{ {\begin{array}{*{20}l} {k\left( {x_{i} - a} \right)^{m} ,} & {x_{i} > a} \\ {0,} & { - a \le x_{i} \le a} \\ {k\left( { - x_{i} - a} \right)^{m} ,} & {x_{i} < - a} \\ \end{array} } \right. \\ \end{aligned}\) | 30 | [−50, 50] | 0 |
\(\begin{aligned} f13\left( x \right) & = 0.1\left\{ {\sin^{2} \left( {3\pi x_{1} } \right) + \mathop \sum \limits_{i = 1}^{n - 1} \left( {x_{i} - 1} \right)^{2} \left[ {1 + \sin^{2} \left( {3\pi x_{i + 1} } \right)} \right] + \left( {x_{n} - 1} \right)^{2} \left[ {1 + \sin^{2} \left( {2\pi x_{n} } \right)} \right]} \right\} \\ & \quad + \mathop \sum \limits_{i = 1}^{n} u\left( {x_{i} ,10,100,4} \right) \\ \end{aligned}\) | 30 | [−50, 50] | 0 |
\(f14\left( x \right) = - \mathop \sum \limits_{i = 1}^{n} \sin \left( {x_{i} } \right) \cdot \left( {\sin \left( {\frac{{ix_{i}^{2} }}{\pi }} \right)} \right)^{2m} ,m = 10\) | 30 | [0, π] | −4.687 |
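As a sanity check on the table, several of these functions are straightforward to implement directly. The sketch below codes the sphere (f1), Rastrigin (f9), and Griewank (f11) functions from the table; each attains its listed minimum of 0 at the origin. The function names are taken from the table; the implementations themselves are a routine transcription, not code from the paper.

```python
import math

def f1(x):
    """Sphere function: unimodal, global minimum 0 at the origin."""
    return sum(xi ** 2 for xi in x)

def f9(x):
    """Rastrigin function: highly multimodal, minimum 0 at the origin."""
    return sum(xi ** 2 - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

def f11(x):
    """Griewank function: multimodal, minimum 0 at the origin."""
    s = sum(xi ** 2 for xi in x) / 4000
    p = math.prod(math.cos(xi / math.sqrt(i + 1)) for i, xi in enumerate(x))
    return s - p + 1
```

Evaluating each function at the 30-dimensional zero vector returns 0 (to within floating-point error), matching the \(f_{\min }\) column of the table.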
Cite this article
Tang, R., Fong, S., Deb, S. et al. Dynamic group search algorithm for solving an engineering problem. Oper Res Int J 18, 781–799 (2018). https://doi.org/10.1007/s12351-017-0317-6
Keywords
- Group search algorithm
- Optimization