
Swarm Intelligence, Volume 10, Issue 3, pp. 161–192

A new particle swarm optimization algorithm for noisy optimization problems

  • Sajjad Taghiyeh
  • Jie Xu
Article

Abstract

We propose a new particle swarm optimization (PSO) algorithm for problems whose objective functions are subject to zero-mean, independent, and identically distributed stochastic noise. Although PSO has been successfully applied to many complex deterministic nonlinear optimization problems, a straightforward application of PSO to noisy optimization problems is prone to failure, because noise in the objective function values can lead the algorithm to incorrectly identify positions as the global/personal best positions. Instead of having the entire swarm follow a single global best position based on the sample average of objective function values, the proposed algorithm works with a set of statistically global best positions: one or more positions whose objective function values are statistically equivalent, identified using a combination of statistical subset selection and clustering analysis. The new PSO algorithm can be seamlessly integrated with adaptive resampling procedures to further enhance its ability to cope with noisy objective functions. Numerical experiments demonstrate that, across different resampling procedures, the new algorithm consistently finds better solutions than the canonical PSO algorithm in the presence of stochastic noise in the objective function values.
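As a rough illustration of the idea described in the abstract, the sketch below implements a minimal noise-aware PSO in which each particle follows a randomly chosen member of a set of statistically equivalent best positions. The equivalence set here is selected by simple confidence-interval overlap, a stand-in for the paper's combination of statistical subset selection and clustering; the noisy sphere objective, function names, and all parameter values are illustrative assumptions, not the authors' implementation.

```python
import math
import random
import statistics

def noisy_sphere(x, rng, sigma=0.1):
    """Sphere function corrupted by zero-mean i.i.d. Gaussian noise (stand-in objective)."""
    return sum(v * v for v in x) + rng.gauss(0.0, sigma)

def estimate(x, rng, n):
    """Resample the noisy objective n times; return (sample mean, 95% CI half-width)."""
    samples = [noisy_sphere(x, rng) for _ in range(n)]
    m = statistics.mean(samples)
    hw = 1.96 * statistics.stdev(samples) / math.sqrt(n)
    return m, hw

def statistical_best_set(stats):
    """Indices whose confidence intervals overlap that of the lowest sample mean."""
    b = min(range(len(stats)), key=lambda i: stats[i][0])
    bm, bh = stats[b]
    return [i for i, (m, h) in enumerate(stats) if m - h <= bm + bh]

def noisy_pso(dim=2, n=20, iters=80, resamples=10, seed=1):
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5                       # standard inertia/acceleration weights
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pb = [p[:] for p in pos]                        # personal best positions
    pb_stats = [estimate(p, rng, resamples) for p in pos]
    for _ in range(iters):
        gset = statistical_best_set(pb_stats)       # statistically equivalent global bests
        for i in range(n):
            g = pb[rng.choice(gset)]                # follow a random member of the set
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pb[i][d] - pos[i][d])
                             + c2 * rng.random() * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            m, h = estimate(pos[i], rng, resamples)
            # update personal best only when statistically better (CIs do not overlap)
            if m + h < pb_stats[i][0] - pb_stats[i][1]:
                pb[i], pb_stats[i] = pos[i][:], (m, h)
    best = min(range(n), key=lambda i: pb_stats[i][0])
    return pb[best], pb_stats[best][0]
```

Following a randomly chosen member of the equivalence set, rather than the single lowest sample mean, keeps the swarm from committing to a position whose apparent superiority is a noise artifact; the resampling count `resamples` is where an adaptive resampling procedure would plug in.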

Keywords

Noisy optimization · Particle swarm optimization · Subset selection · Clustering · Optimality gap


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Department of Systems Engineering and Operations Research, George Mason University, Fairfax, USA
