Nonmonotone line search methods with variable sample size

  • Original Paper
  • Published in Numerical Algorithms

Abstract

Nonmonotone line search methods for unconstrained minimization are considered for objective functions given in the form of a mathematical expectation. The objective function is approximated by the sample average approximation (SAA) with a large sample of fixed size. The nonmonotone line search framework is embedded with a variable sample size strategy, so that a different sample size at each iteration allows the cost of the sample average approximation to be reduced. The variable sample scheme considered takes into account both the decrease in the approximate objective function and the quality of the approximation of the objective function at each iteration; the sample size may therefore increase or decrease from one iteration to the next. Nonmonotonicity of the line search combines well with the variable sample size scheme: while the sample size is smaller than the maximal one, it allows more freedom in choosing the search direction and the step size, and it increases the chances of finding a global solution. Eventually the maximal sample size is used, so the variable sample size strategy generates a solution of the same quality as the SAA method but with a significantly smaller number of function evaluations. Various nonmonotone strategies are compared on a set of test problems.
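To make the interplay described in the abstract concrete, the following Python sketch couples a nonmonotone (Grippo–Lampariello–Lucidi-style) Armijo line search with a heuristic variable sample size update for an SAA objective. It is a minimal illustration under assumed choices, not the authors' algorithm: the toy integrand F, the steepest-descent direction, the error estimate eps_N, and the doubling/halving rule in update_sample_size are all hypothetical stand-ins for the paper's adaptive scheme.

    import numpy as np

    rng = np.random.default_rng(0)

    def F(x, xi):
        # Toy integrand: f(x) = E[ ||x - xi||^2 ] is minimized at x = E[xi] = 0.
        return np.sum((x - xi) ** 2)

    def saa_value_grad(x, sample):
        # Sample average approximation f_N(x) and its gradient over `sample`.
        vals = np.array([F(x, xi) for xi in sample])
        grads = np.array([2.0 * (x - xi) for xi in sample])
        return vals.mean(), grads.mean(axis=0)

    def update_sample_size(N, decrease, eps_N, N_min, N_max):
        # Hypothetical rule: grow the sample when the achieved decrease in f_N
        # is small relative to the error estimate eps_N, shrink it otherwise.
        if decrease < eps_N:
            return min(2 * N, N_max)
        return max(N // 2, N_min)

    def nonmonotone_vss(x0, N_min=10, N_max=1000, max_iter=100,
                        eta=1e-4, beta=0.5, M=5):
        x = np.asarray(x0, dtype=float)
        N = N_min
        pool = rng.normal(size=(N_max, x.size))   # fixed maximal sample
        history = []                              # last M values of f_N
        for _ in range(max_iter):
            sample = pool[:N]
            f, g = saa_value_grad(x, sample)
            history = (history + [f])[-M:]
            f_ref = max(history)                  # nonmonotone reference value
            d = -g                                # steepest-descent direction
            alpha = 1.0
            # Nonmonotone Armijo: compare against the max of the last M values,
            # which leaves room for occasional increases of f_N.
            while True:
                f_new, _ = saa_value_grad(x + alpha * d, sample)
                if f_new <= f_ref + eta * alpha * (g @ d) or alpha < 1e-10:
                    break
                alpha *= beta
            # Crude standard-error estimate of the SAA accuracy at x.
            eps_N = np.std([F(x, xi) for xi in sample]) / np.sqrt(N)
            N = update_sample_size(N, f - f_new, eps_N, N_min, N_max)
            x = x + alpha * d
        return x

    print(nonmonotone_vss(np.full(3, 2.0)))   # approaches E[xi] = 0

Because the pool of N_max realizations is fixed in advance, each iteration works with a prefix of the maximal sample, mirroring the idea that the method eventually operates with the full fixed-size sample and so delivers an SAA-quality solution.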



Author information


Corresponding author

Correspondence to Nataša Krejić.

Additional information

Research supported by the Serbian Ministry of Education, Science and Technological Development, grant no. 174030.

About this article


Cite this article

Krejić, N., Krklec Jerinkić, N. Nonmonotone line search methods with variable sample size. Numer Algor 68, 711–739 (2015). https://doi.org/10.1007/s11075-014-9869-1

