
Filter-based stochastic algorithm for global optimization

Journal of Global Optimization

Abstract

We propose the general Filter-based Stochastic Algorithm (FbSA) for the global optimization of nonconvex and nonsmooth constrained problems. Almost sure convergence is proved under certain conditions on the probability distributions that generate the sample points. To address problems with computationally expensive black-box objective functions, we develop the FbSA-RBF algorithm, which is based on the general FbSA and assisted by Radial Basis Function (RBF) surrogate models that approximate the objective function. At each iteration, the algorithm constructs or updates a surrogate model of the objective function and generates trial points using a dynamic coordinate search strategy similar to the one used in the Dynamically Dimensioned Search method. To select the most promising trial point, a non-dominance concept based on the values of the surrogate model and of the constraint violation at the trial points is used. Theoretical results concerning sufficient conditions for the almost sure convergence of the algorithm are presented. Preliminary numerical experiments show that FbSA-RBF is competitive with other methods known in the literature.
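
A minimal, self-contained sketch of one iteration of this kind of loop is given below. It assumes a cubic RBF surrogate fitted with scipy.interpolate.RBFInterpolator, a DDS-like perturbation of the current best point, and a simple non-dominance rule on the pair (surrogate value, constraint violation); the toy problem, the perturbation scaling, and the tie-breaking rule are illustrative assumptions, not the authors' exact choices.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

rng = np.random.default_rng(0)

# Toy problem (hypothetical instance): minimize f subject to g(x) <= 0 on a box.
def f(x):                # "expensive" black-box objective (cheap here, for illustration)
    return (x[0] - 1.0) ** 2 + (x[1] + 0.5) ** 2

def g(x):                # inequality constraints, feasible iff all components <= 0
    return np.array([x[0] + x[1] - 1.0])

def violation(x):        # aggregate constraint violation: sum of positive parts of g
    return float(np.maximum(g(x), 0.0).sum())

lb, ub = np.array([-2.0, -2.0]), np.array([2.0, 2.0])
dim = lb.size

# Initial design of evaluated points (a symmetric Latin hypercube would be typical).
X = rng.uniform(lb, ub, size=(6, dim))
F = np.array([f(x) for x in X])

def one_iteration(X, F, k, max_iter=100, n_trials=20, sigma_frac=0.2):
    # 1) Fit/refresh the RBF surrogate of the objective on all evaluated points.
    surrogate = RBFInterpolator(X, F, kernel="cubic")

    # 2) Current incumbent: least violated point, ties broken by objective value.
    idx = min(range(len(X)), key=lambda i: (violation(X[i]), F[i]))
    x_best = X[idx]

    # 3) DDS-like trial generation: perturb each coordinate with a probability
    #    that decreases with the iteration counter k, then clip to the box.
    p = max(1.0 - np.log(k + 1) / np.log(max_iter), 1.0 / dim)
    trials = []
    for _ in range(n_trials):
        mask = rng.random(dim) < p
        if not mask.any():                       # perturb at least one coordinate
            mask[rng.integers(dim)] = True
        step = mask * rng.normal(0.0, sigma_frac * (ub - lb))
        trials.append(np.clip(x_best + step, lb, ub))
    trials = np.array(trials)

    # 4) Non-dominance on (surrogate value, constraint violation): keep the trial
    #    points not dominated by any other trial, then pick the least violated one.
    s = surrogate(trials)
    h = np.array([violation(t) for t in trials])
    nondom = [i for i in range(n_trials)
              if not any(s[j] <= s[i] and h[j] <= h[i] and (s[j] < s[i] or h[j] < h[i])
                         for j in range(n_trials))]
    pick = min(nondom, key=lambda i: (h[i], s[i]))

    # 5) Evaluate the true (expensive) objective only at the selected trial point.
    x_new = trials[pick]
    if np.min(np.linalg.norm(X - x_new, axis=1)) < 1e-12:    # skip duplicate points
        return X, F
    return np.vstack([X, x_new]), np.append(F, f(x_new))

for k in range(1, 30):
    X, F = one_iteration(X, F, k)
```

In this sketch the expensive objective is evaluated only at the single trial point selected by the non-dominance rule; all other comparisons use the surrogate and the (assumed cheap) constraint violation, which is what makes surrogate-assisted methods of this type attractive when objective evaluations dominate the computational cost.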

Acknowledgements

The authors are grateful to the anonymous referees for their fruitful comments and suggestions. The first and second authors were partially supported by Brazilian funds through CAPES and CNPq, under Grants PDSE 99999.009400/2014-01 and 309303/2017-6. The research of the third and fourth authors was partially financed by Portuguese funds through FCT (Fundação para a Ciência e a Tecnologia) within the Projects UIDB/00013/2020 and UIDP/00013/2020 of CMAT-UM and UIDB/00319/2020.

Author information

Corresponding author

Correspondence to M. Joseane F. G. Macêdo.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Macêdo, M.J.F.G., Karas, E.W., Costa, M.F.P. et al. Filter-based stochastic algorithm for global optimization. J Glob Optim 77, 777–805 (2020). https://doi.org/10.1007/s10898-020-00917-9
