
CARTopt: a random search method for nonsmooth unconstrained optimization

Abstract

A random search algorithm for unconstrained local nonsmooth optimization is described. The algorithm forms a partition on \(\mathbb{R}^{n}\) using classification and regression trees (CART) from statistical pattern recognition. The CART partition defines desirable subsets where the objective function f is relatively low, based on previous sampling, from which further samples are drawn directly. Alternating between partition and sampling phases provides an effective method for nonsmooth optimization. The sequence of iterates \(\{z_{k}\}\) is shown to converge to an essential local minimizer of f with probability one under mild conditions. Numerical results are presented to show that the method is effective and competitive in practice.
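
The partition-and-sample alternation described in the abstract can be illustrated with a short sketch. This is not the authors' CARTopt implementation: scikit-learn's DecisionTreeClassifier stands in for the CART partition, the "lower half of the sample" labelling rule and the contraction factor 0.9 are arbitrary illustrative choices, and points in low regions are obtained by rejection sampling around the current low points rather than by sampling the low subsets directly as the paper does.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def cartopt_sketch(f, x0, n_iter=50, batch=40, sigma=1.0, seed=None):
    # Alternate between (i) fitting a CART partition that separates "low" from
    # "high" sample points and (ii) drawing new points that the partition
    # classifies as low.  Returns the best point found and its objective value.
    rng = np.random.default_rng(seed)
    x0 = np.asarray(x0, dtype=float)
    n = x0.size
    X = x0 + sigma * rng.standard_normal((batch, n))      # initial sample
    y = np.array([f(x) for x in X])
    for _ in range(n_iter):
        low = (y <= np.median(y)).astype(int)             # label the lower half "low"
        tree = DecisionTreeClassifier(max_depth=4).fit(X, low)
        centres = X[low == 1]
        cand = (centres[rng.integers(len(centres), size=4 * batch)]
                + sigma * rng.standard_normal((4 * batch, n)))
        cand = cand[tree.predict(cand) == 1][:batch]      # keep points in low leaves
        if len(cand) == 0:                                # fall back to plain perturbation
            cand = centres + sigma * rng.standard_normal(centres.shape)
        y_new = np.array([f(x) for x in cand])
        X, y = np.vstack([X, cand]), np.concatenate([y, y_new])
        sigma *= 0.9                                      # shrink the search scale
    best = int(np.argmin(y))
    return X[best], y[best]

Calling, for example, cartopt_sketch(lambda x: -np.exp(-0.5 * np.sum(np.abs(x))), np.ones(6)) exercises the loop on the nonsmooth Exponential objective from the Appendix; the routine is a reading aid only, not a substitute for the analysed algorithm.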

References

  1. Appel, M.J., Labarre, R., Radulovic, D.: On accelerated random search. SIAM J. Optim. 14, 708–731 (2003)

  2. Audet, C., Dennis, J.E.: Mesh adaptive direct search algorithms for constrained optimization. SIAM J. Optim. 17, 188–217 (2006)

  3. Brachetti, P., Ciccoli, M.D.F., Di Pillo, G., Lucidi, S.: A new version of the Price’s algorithm for global optimization. J. Glob. Optim. 10, 165–184 (1997)

  4. Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Wadsworth, Belmont (1984)

5. Burmen, Á., Puhan, J., Tuma, T.: Grid restrained Nelder–Mead algorithm. Comput. Optim. Appl. 34, 359–375 (2006)

6. Clarke, F.H.: Optimization and Nonsmooth Analysis. Classics in Applied Mathematics. SIAM, Philadelphia (1990)

  7. Dorea, C.C.Y.: Stopping rules for a random optimization method. SIAM J. Control Optim. 28, 841–850 (1990)

  8. Duda, R.O., Hart, P.E., Stork, D.G.: Pattern Classification. Wiley-Interscience, New York (2001)

  9. Hart, W.E.: Sequential stopping rules for random optimization methods with applications to multistart local search. SIAM J. Optim. 9, 270–290 (1998)

  10. Hock, W., Schittkowski, K.: Test Examples for Nonlinear Programming Codes. Lecture Notes in Economics and Mathematical Systems, vol. 187. Springer, Berlin (1981)

11. Loh, W.: On Latin hypercube sampling. Ann. Stat. 24, 2058–2080 (1996)

12. Lukšan, L., Vlček, J.: Test problems for unconstrained optimization. Technical Report no. 8. Academy of Sciences of the Czech Republic, Institute of Computer Science (2003)

  13. Martinez, W.L., Martinez, A.R.: Computational Statistics Handbook with MATLAB. Chapman & Hall/CRC Press, London/Boca Raton (2002)

  14. Moré, J.J., Garbow, B.S., Hillstrom, K.E.: Testing unconstrained optimization software. ACM Trans. Math. Softw. 7, 17–41 (1981)

  15. Nelder, J.A., Mead, R.: A simplex method for function minimization. Comput. J. 7, 308–313 (1965)

  16. Price, C.J., Reale, M., Robertson, B.L.: A direct search method for smooth and nonsmooth unconstrained optimization. ANZIAM J. 48, C927–C948 (2006)

17. Price, C.J., Robertson, B.L., Reale, M.: A hybrid Hooke and Jeeves–Direct method for nonsmooth optimization. Adv. Model. Optim. 11, 43–61 (2009)

  18. Price, C.J., Reale, M., Robertson, B.L.: A cover partitioning method for bound constrained global optimization. Optim. Methods Softw. 27, 1059–1072 (2012). doi:10.1080/10556788.2011.557726

  19. Rinnooy Kan, A.H.G., Timmer, G.T.: Stochastic global optimization methods. Part I. Clustering methods. Math. Program. 39, 27–56 (1987)

  20. Robertson, B.L., Price, C.J., Reale, M.: Nonsmooth optimization using classification and regression trees. In: Proceedings of the 18th IMACS World Congress and MODSIM09 International Congress on Modelling and Simulation, Cairns, Australia, July 13–17, 2009, pp. 1195–1201 (2009)

  21. Robertson, B.L.: Direct search methods for nonsmooth problems using global optimization techniques. Ph.D. Thesis, University of Canterbury, Christchurch, New Zealand (2010)

  22. Schittkowski, K.: More Test Examples for Nonlinear Programming Codes. Lecture Notes in Economics and Mathematical Systems, vol. 282. Springer, Berlin (1987)

  23. Tang, Z.B.: Adaptive partitioned random search to global optimization. IEEE Trans. Autom. Control 39, 2235–2244 (1994)

24. Törn, A., Žilinskas, A.: Global Optimization. Lecture Notes in Computer Science, vol. 350. Springer, Berlin (1989)

  25. Vicente, L.N., Custódio, A.L.: Analysis of direct searches for discontinuous functions. Math. Program. 133, 299–325 (2012)

Acknowledgements

We acknowledge the helpful comments of two anonymous referees, which led to an improved version of the paper.

Author information

Corresponding author

Correspondence to C. J. Price.

Appendix

A nonsmooth version of the Cosine Mixture problem [3] is

$$ f(x) = \begin{cases} 0.1 \sum_{i = 1}^n \cos(5\pi x_i) - \sum_{i = 1}^n |x_i|, & \text{if } \|x\|_{\infty} \leq 1\\[4pt] \infty & \text{otherwise}, \end{cases} $$

where \(x_{0} = (0,0,\ldots,0)\). The cases \(n=4\) and \(n=6\) were studied, for which \(f^{*} = -4.4\) and \(f^{*} = -6.6\) respectively.
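
The following is a direct transcription of this test function into NumPy; the function name and interface are illustrative and not taken from the paper's software.

import numpy as np

def cosine_mixture_ns(x):
    # Nonsmooth Cosine Mixture: finite only inside the box ||x||_inf <= 1.
    x = np.asarray(x, dtype=float)
    if np.max(np.abs(x)) > 1.0:
        return np.inf
    return 0.1 * np.sum(np.cos(5.0 * np.pi * x)) - np.sum(np.abs(x))

At a corner of the box, for example x = (1, 1, 1, 1), the value is 0.1(-4) - 4 = -4.4, matching the quoted optimum for n = 4.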

A nonsmooth version of the Exponential problem [3] is

$$ f(x) = -\exp \Biggl(-0.5 \sum_{i=1}^n |x_i| \Biggr), $$

where \(x_{0} = (1,1,\ldots,1)\) and \(f^{*} = -1\). The cases \(n=6\) and \(n=8\) were studied.
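
A matching transcription of this problem (again, the name is illustrative only):

import numpy as np

def exponential_ns(x):
    # Nonsmooth Exponential: the minimum f* = -1 is attained at the origin.
    return -np.exp(-0.5 * np.sum(np.abs(np.asarray(x, dtype=float))))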

The discontinuous versions of the Rosenbrock function are defined as follows.

Each problem uses \(x_{0} = (-1.2, 1)\) and has an essential local minimizer at \(x^{*} = (1,1)\) with \(f^{*} = 0\).

The discontinuous versions of the Beale function are defined as follows. Let

Each problem uses \(x_{0} = (1,1)\) and has an essential local minimizer at \(x^{*} = (3, 0.5)\) with \(f^{*} = 0\).

About this article

Cite this article

Robertson, B.L., Price, C.J. & Reale, M. CARTopt: a random search method for nonsmooth unconstrained optimization. Comput Optim Appl 56, 291–315 (2013). https://doi.org/10.1007/s10589-013-9560-9
