Artificial Life and Robotics, Volume 23, Issue 4, pp 618–627

The velocity updating rule according to an oblique coordinate system with mutation and dynamic scaling for particle swarm optimization

  • Tetsuyuki Takahama
  • Setsuko Sakai
Original Article


Abstract

Particle swarm optimization (PSO) has shown strong search performance, especially on separable and unimodal problems. However, its performance deteriorates on non-separable problems, such as rotated problems. In this study, a new velocity updating rule based on an oblique coordinate system, instead of an orthogonal coordinate system, is proposed to solve non-separable problems. Two mutation operations, one for the best particle and one for the worst particle, are proposed to improve the diversity of particles and to limit the loss of particle moving speed. In addition, the vectors generated according to the oblique coordinate system are dynamically scaled to improve the robustness and efficiency of the search. The advantage of the proposed method is demonstrated by solving various problems, including separable, non-separable, unimodal, and multimodal problems and their rotated versions, and by comparing the results of the proposed method with those of standard PSO.
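The axis bias the abstract alludes to can be illustrated with a minimal sketch. In the standard velocity rule, each dimension receives its own random coefficient, which ties the search to the orthogonal coordinate axes; a rotation-invariant alternative, shown here with a single scalar coefficient per difference vector, can be read as expressing the update in the oblique basis spanned by the difference vectors toward the personal and global bests. This is only an illustration of the underlying idea, not the paper's actual rule: the constants `w`, `c1`, `c2` are conventional PSO defaults, and the paper's mutation operations and dynamic scaling are omitted.

```python
import numpy as np

def pso_velocity(v, x, pbest, gbest, w=0.729, c1=1.49445, c2=1.49445,
                 oblique=False, rng=None):
    """Sketch of a PSO velocity update.

    oblique=False: standard rule. Every dimension gets an independent
    random coefficient, so the update is defined with respect to the
    orthogonal coordinate axes and behaves differently when the problem
    is rotated (a known bias on non-separable problems).

    oblique=True: illustrative rotation-invariant variant. One scalar
    coefficient scales each whole difference vector, i.e. the update is
    a combination of the (generally non-orthogonal) basis vectors
    pbest - x and gbest - x.
    """
    rng = np.random.default_rng() if rng is None else rng
    if oblique:
        r1, r2 = rng.random(), rng.random()               # scalar per vector
    else:
        r1, r2 = rng.random(x.shape), rng.random(x.shape)  # per dimension
    return w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
```

With `oblique=True` (and the same random draws), rotating the particle, its velocity, and both attractors by an orthogonal matrix `R` rotates the resulting velocity by `R` as well, which is exactly the invariance that the standard per-dimension rule lacks.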


Keywords: Particle swarm optimization · Oblique coordinate system · Velocity updating rule · Mutation



Copyright information

© ISAROB 2018

Authors and Affiliations

  1. Hiroshima City University, Hiroshima, Japan
  2. Hiroshima Shudo University, Hiroshima, Japan
