
Swarm Intelligence, Volume 8, Issue 4, pp 275–302

Characterising the searchability of continuous optimisation problems for PSO

  • K. M. Malan
  • A. P. Engelbrecht
Article

Abstract

The focus of research in swarm intelligence has been largely on the algorithmic side, with relatively little attention paid to the study of problems and the behaviour of algorithms in relation to problems. When a new algorithm, or a variation on an existing algorithm, is proposed in the literature, there is seldom any discussion or analysis of the algorithm's weaknesses or of the kinds of problems on which the algorithm is expected to fail. Fitness landscape analysis is an approach that can be used to analyse optimisation problems. By characterising problems in terms of fitness landscape features, the link between problem types and algorithm performance can be studied. This article investigates a number of measures for analysing the ability of a search process to improve fitness on a particular problem (called evolvability in the literature, but referred to as searchability in this study to broaden the scope to non-evolutionary search techniques). A number of existing fitness landscape analysis techniques originally proposed for discrete problems are adapted to work in continuous search spaces. For a range of benchmark problems, the proposed searchability measures are viewed alongside performance measures for a traditional global best particle swarm optimisation (PSO) algorithm. Empirical results show that no single measure can be used as a predictor of PSO performance, but that multiple measures of different fitness landscape features can be used together to predict PSO failure.
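The "traditional global best PSO" mentioned in the abstract can be illustrated with a minimal sketch. This is a hedged, generic implementation, not the authors' experimental code: the function names, swarm size, iteration budget, and bounds below are illustrative choices, and the inertia weight w = 0.729 with acceleration coefficients c1 = c2 = 1.494 are the commonly used constriction-derived settings, assumed here rather than taken from the article.

```python
import random

def sphere(x):
    # Simple continuous benchmark: global minimum 0 at the origin.
    return sum(xi * xi for xi in x)

def gbest_pso(f, dim=5, swarm_size=20, iters=200,
              w=0.729, c1=1.494, c2=1.494, bounds=(-5.0, 5.0), seed=0):
    """Minimal global-best PSO sketch for minimisation (illustrative only)."""
    rng = random.Random(seed)
    lo, hi = bounds
    # Random positions, zero initial velocities.
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm_size)]
    vel = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]              # personal best positions
    pbest_f = [f(p) for p in pos]            # personal best fitnesses
    g = min(range(swarm_size), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]  # global best
    for _ in range(iters):
        for i in range(swarm_size):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive + social components.
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f
```

In this setup, one could compare such a PSO run's success against searchability measures computed by sampling the same landscape, which is the broad methodology the abstract describes.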

Keywords

Fitness landscape analysis · Evolvability · Particle swarm optimisation


Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Department of Computer Science, University of Pretoria, Pretoria, South Africa
