Analysis of Noisy Evolutionary Optimization When Sampling Fails

Abstract

In noisy evolutionary optimization, sampling is a common strategy for dealing with noise. Under the sampling strategy, the fitness of a solution is evaluated independently multiple times (the number of evaluations is called the sample size), and its true fitness is then estimated by the average of these evaluations. Most previous studies on sampling are empirical, and the few theoretical studies mainly showed the effectiveness of sampling with a sufficiently large sample size. In this paper, we theoretically examine which strategies can work when sampling with any fixed sample size fails. By constructing a family of artificial noisy examples, we prove that sampling is always ineffective on them, while using parent or offspring populations can be helpful on some of these examples. We also construct an artificial noisy example showing that when neither sampling nor populations are effective, a tailored adaptive sampling strategy (i.e., sampling with an adaptive sample size) can work. These findings may enhance our understanding of sampling to some extent, but future work is required to validate them in natural situations.
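To make the sampling strategy concrete, the following is a minimal illustrative sketch: a (1+1)-EA on OneMax under additive Gaussian noise, where each fitness comparison averages `m` independent noisy evaluations. The noise model, parameter values, and the particular (1+1)-EA variant here are our own illustrative assumptions, not the exact algorithms or noise models analyzed in the paper.

```python
import random

def onemax(x):
    # True fitness: number of 1-bits.
    return sum(x)

def noisy_eval(x, sigma=1.0):
    # One noisy evaluation: true fitness plus additive Gaussian noise
    # (an assumed noise model, for illustration only).
    return onemax(x) + random.gauss(0.0, sigma)

def sampled_fitness(x, m, sigma=1.0):
    # Sampling strategy: estimate the true fitness by averaging
    # m independent noisy evaluations.
    return sum(noisy_eval(x, sigma) for _ in range(m)) / m

def one_plus_one_ea(n=20, m=5, sigma=1.0, max_evals=20000):
    # (1+1)-EA with sampling: flip each bit independently with
    # probability 1/n, and accept the offspring if its sampled
    # fitness is at least as good as the parent's. Re-evaluating
    # the parent in every generation is one common convention.
    x = [random.randint(0, 1) for _ in range(n)]
    evals = 0
    # The true-fitness stopping test is a demo convenience; a real
    # noisy optimizer cannot observe the true fitness directly.
    while evals < max_evals and onemax(x) < n:
        y = [b ^ (random.random() < 1.0 / n) for b in x]
        if sampled_fitness(y, m, sigma) >= sampled_fitness(x, m, sigma):
            x = y
        evals += 2 * m  # m evaluations each for parent and offspring
    return x
```

An adaptive sampling strategy, in contrast, would choose `m` at runtime (e.g., increasing it when the two averages are close), rather than fixing it in advance.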


Fig. 1

References

  1. Akimoto, Y., Astete-Morales, S., Teytaud, O.: Analysis of runtime of optimization algorithms for noisy functions over discrete codomains. Theoret. Comput. Sci. 605, 42–50 (2015)

  2. Auger, A., Doerr, B.: Theory of Randomized Search Heuristics: Foundations and Recent Developments. World Scientific, Singapore (2011)

  3. Bian, C., Qian, C., Tang, K.: Towards a running time analysis of the (1+1)-EA for OneMax and LeadingOnes under general bit-wise noise. In: Proceedings of the 15th International Conference on Parallel Problem Solving from Nature (PPSN’18), pp. 165–177. Coimbra, Portugal (2018)

  4. Branke, J., Schmidt, C.: Sequential sampling in noisy environments. In: Proceedings of the 8th International Conference on Parallel Problem Solving from Nature (PPSN’04), pp. 202–211. Birmingham, UK (2004)

  5. Cantú-Paz, E.: Adaptive sampling for noisy problems. In: Proceedings of the 6th ACM Conference on Genetic and Evolutionary Computation (GECCO’04), pp. 947–958. Seattle, WA (2004)

  6. Dang, D.C., Lehre, P.K.: Efficient optimisation of noisy fitness functions with population-based evolutionary algorithms. In: Proceedings of the 13th ACM Conference on Foundations of Genetic Algorithms (FOGA’15), pp. 62–68. Aberystwyth, UK (2015)

  7. Dang-Nhu, R., Dardinier, T., Doerr, B., Izacard, G., Nogneng, D.: A new analysis method for evolutionary optimization of dynamic and noisy objective functions. In: Proceedings of the 20th ACM Conference on Genetic and Evolutionary Computation (GECCO’18), pp. 1467–1474. Kyoto, Japan (2018)

  8. Devroye, L., Lugosi, G.: Combinatorial Methods in Density Estimation. Springer, New York (2001)

  9. Doerr, B., Hota, A., Kötzing, T.: Ants easily solve stochastic shortest path problems. In: Proceedings of the 14th ACM Conference on Genetic and Evolutionary Computation (GECCO’12), pp. 17–24. Philadelphia, PA (2012)

  10. Doerr, B., Johannsen, D., Winzen, C.: Multiplicative drift analysis. Algorithmica 64(4), 673–697 (2012)

  11. Droste, S.: Analysis of the (1+1) EA for a noisy OneMax. In: Proceedings of the 6th ACM Conference on Genetic and Evolutionary Computation (GECCO’04), pp. 1088–1099. Seattle, WA (2004)

  12. Feldmann, M., Kötzing, T.: Optimizing expected path lengths with ant colony optimization using fitness proportional update. In: Proceedings of the 12th ACM Conference on Foundations of Genetic Algorithms (FOGA’13), pp. 65–74. Adelaide, Australia (2013)

  13. Friedrich, T., Kötzing, T., Krejca, M., Sutton, A.: Robustness of ant colony optimization to noise. Evol. Comput. 24(2), 237–254 (2016)

  14. Friedrich, T., Kötzing, T., Krejca, M., Sutton, A.: The compact genetic algorithm is efficient under extreme Gaussian noise. IEEE Trans. Evol. Comput. 21(3), 477–490 (2017)

  15. Gießen, C., Kötzing, T.: Robustness of populations in stochastic environments. Algorithmica 75(3), 462–489 (2016)

  16. Hajek, B.: Hitting-time and occupation-time bounds implied by drift analysis with applications. Adv. Appl. Probab. 14(3), 502–525 (1982)

  17. He, J., Yao, X.: Drift analysis and average time complexity of evolutionary algorithms. Artif. Intell. 127(1), 57–85 (2001)

  18. Li, G., Chou, W.: Path planning for mobile robot using self-adaptive learning particle swarm optimization. Sci. China Inf. Sci. 61(5), 052204 (2018)

  19. Mukhopadhyay, A., Maulik, U., Bandyopadhyay, S., Coello Coello, C.A.: A survey of multiobjective evolutionary algorithms for data mining: Part I. IEEE Trans. Evol. Comput. 18(1), 4–19 (2013)

  20. Neumann, F., Witt, C.: Bioinspired Computation in Combinatorial Optimization: Algorithms and Their Computational Complexity. Springer, Berlin (2010)

  21. Oliveto, P., Witt, C.: Simplified drift analysis for proving lower bounds in evolutionary computation. Algorithmica 59(3), 369–386 (2011)

  22. Oliveto, P., Witt, C.: Erratum: Simplified drift analysis for proving lower bounds in evolutionary computation. arXiv:1211.7184 (2012)

  23. Oliveto, P., Witt, C.: On the runtime analysis of the simple genetic algorithm. Theoret. Comput. Sci. 545, 2–19 (2014)

  24. Prügel-Bennett, A., Rowe, J., Shapiro, J.: Run-time analysis of population-based evolutionary algorithm in noisy environments. In: Proceedings of the 13th ACM Conference on Foundations of Genetic Algorithms (FOGA’15), pp. 69–75. Aberystwyth, UK (2015)

  25. Qian, C.: Distributed Pareto optimization for large-scale noisy subset selection. IEEE Trans. Evol. Comput. (2020)

  26. Qian, C., Bian, C., Jiang, W., Tang, K.: Running time analysis of the (1+1)-EA for OneMax and LeadingOnes under bit-wise noise. Algorithmica 81(2), 749–795 (2019)

  27. Qian, C., Bian, C., Yu, Y., Tang, K., Yao, X.: Analysis of noisy evolutionary optimization when sampling fails. In: Proceedings of the 20th ACM Conference on Genetic and Evolutionary Computation (GECCO’18), pp. 1507–1514. Kyoto, Japan (2018)

  28. Qian, C., Shi, J.C., Yu, Y., Tang, K., Zhou, Z.H.: Subset selection under noise. In: Advances in Neural Information Processing Systems 30 (NIPS’17), pp. 3562–3572. Long Beach, CA (2017)

  29. Qian, C., Yu, Y., Tang, K., Jin, Y., Yao, X., Zhou, Z.H.: On the effectiveness of sampling for evolutionary optimization in noisy environments. Evol. Comput. 26(2), 237–267 (2018)

  30. Qian, C., Yu, Y., Zhou, Z.H.: Analyzing evolutionary optimization in noisy environments. Evol. Comput. 26(1), 1–41 (2018)

  31. Sudholt, D.: On the robustness of evolutionary algorithms to noise: Refined results and an example where noise helps. In: Proceedings of the 20th ACM Conference on Genetic and Evolutionary Computation (GECCO’18), pp. 1523–1530. Kyoto, Japan (2018)

  32. Sudholt, D., Thyssen, C.: A simple ant colony optimizer for stochastic shortest path problems. Algorithmica 64(4), 643–672 (2012)

  33. Syberfeldt, A., Ng, A., John, R., Moore, P.: Evolutionary optimisation of noisy multi-objective problems using confidence-based dynamic resampling. Eur. J. Oper. Res. 204(3), 533–544 (2010)

  34. Tyurin, I.S.: An improvement of upper estimates of the constants in the Lyapunov theorem. Russ. Math. Surv. 65(3), 201–202 (2010)

  35. Witt, C.: Runtime analysis of the (μ+1) EA on simple pseudo-Boolean functions. Evol. Comput. 14(1), 65–86 (2006)

  36. Xu, P., Liu, X., Cao, H., Zhang, Z.: An efficient energy aware virtual network migration based on genetic algorithm. Front. Comput. Sci. 13(2), 440–442 (2019)

  37. Yu, Y., Qian, C., Zhou, Z.H.: Switch analysis for running time analysis of evolutionary algorithms. IEEE Trans. Evol. Comput. 19(6), 777–792 (2015)

  38. Zhang, Z., Xin, T.: Immune algorithm with adaptive sampling in noisy environments and its application to stochastic optimization problems. IEEE Comput. Intell. Mag. 2(4), 29–40 (2007)

  39. Zhou, Z.H., Yu, Y., Qian, C.: Evolutionary Learning: Advances in Theories and Algorithms. Springer, Singapore (2019)


Acknowledgements

We want to thank the anonymous reviewers of GECCO’18, TEvC and Algorithmica for their valuable comments and thank Per Kristian Lehre for helpful discussions. This work was supported by the National Key Research and Development Program of China (2017YFB1003102), the NSFC (61672478, 61876077), the Shenzhen Peacock Plan (KQTD2016112514355531), and the Fundamental Research Funds for the Central Universities.

Author information

Corresponding author

Correspondence to Chao Qian.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

A preliminary version of this paper has appeared at GECCO’18 [27].


Cite this article

Qian, C., Bian, C., Yu, Y. et al. Analysis of Noisy Evolutionary Optimization When Sampling Fails. Algorithmica (2020). https://doi.org/10.1007/s00453-019-00666-6


Keywords

  • Noisy optimization
  • Evolutionary algorithms
  • Sampling
  • Population
  • Adaptive sampling
  • Running time analysis