Using Hybrid Dependency Identification with a Memetic Algorithm for Large Scale Optimization Problems
Decomposing a large scale problem into smaller subproblems is one approach to overcoming the performance deterioration that evolutionary algorithms (EAs) typically suffer as dimensionality grows. For a decomposition approach to perform well, interdependent variables must be grouped into the same subproblem. In this paper, the Hybrid Dependency Identification with Memetic Algorithm (HDIMA) model is proposed for large scale optimization problems. The Dependency Identification (DI) technique identifies the variables that must be grouped together to form the subproblems, and these subproblems are then evolved using a Memetic Algorithm (MA). Before the end of the evolution process, the subproblems are aggregated and optimized as a complete large scale problem. A newly designed test suite has been used to evaluate the performance of HDIMA over different problem dimensions. The evaluation shows that HDIMA is competitive with other models in the literature, consuming fewer computational resources while achieving better performance.
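The decompose-then-aggregate flow described above can be illustrated with a minimal sketch. Note that this is not the paper's actual DI technique or MA: the grouping step below uses a simple perturbation-based interaction test, the "memetic" step is replaced by a plain random local search, and the objective `f` is a hypothetical toy function with two independent variable groups, chosen only to make the grouping visible.

```python
import random

# Hypothetical toy objective: variables {0,1} interact, and {2,3} interact,
# but the two groups are independent of each other.
def f(x):
    return (x[0] + x[1] - 1) ** 2 + (x[2] * x[3] - 2) ** 2

def interacts(f, i, j, dim, eps=1e-9):
    """Crude interaction test (a stand-in for the paper's DI technique):
    if the effect of perturbing variable i changes when variable j is
    perturbed, the two variables are treated as dependent."""
    base = [0.0] * dim

    def delta(k, at):
        moved = list(at)
        moved[k] += 1.0
        return f(moved) - f(at)

    shifted = list(base)
    shifted[j] += 1.0
    return abs(delta(i, base) - delta(i, shifted)) > eps

def group_variables(f, dim):
    """Place each variable into the first group it interacts with."""
    groups = []
    for i in range(dim):
        for g in groups:
            if any(interacts(f, i, j, dim) for j in g):
                g.append(i)
                break
        else:
            groups.append([i])
    return groups

def local_search(f, x, idxs, iters=200, step=0.1, seed=0):
    """Random-perturbation local search restricted to the variables in
    idxs (a stand-in for evolving one subproblem with an MA)."""
    rng = random.Random(seed)
    x, best = list(x), f(x)
    for _ in range(iters):
        cand = list(x)
        for k in idxs:
            cand[k] += rng.uniform(-step, step)
        fc = f(cand)
        if fc < best:
            x, best = cand, fc
    return x, best

dim = 4
x = [0.0] * dim
# Stage 1: optimize each subproblem separately.
for g in group_variables(f, dim):
    x, _ = local_search(f, x, g)
# Stage 2: aggregate and refine the complete problem jointly.
x, best = local_search(f, x, list(range(dim)))
```

On this toy function the grouping step recovers the two dependent pairs, and the two-stage search improves on the starting point; the sketch is only meant to show the overall pipeline shape, not the paper's operators or results.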
Keywords: Large Scale Problems · Optimization · Evolutionary Algorithms · Memetic Algorithms · Problem Decomposition · Dependency Identification