Computer Algebra and Line Search

  • Predrag Stanimirović
  • Marko Miladinović
  • Ivan M. Jovanović
Chapter
Part of the Springer Optimization and Its Applications book series (SOIA, volume 42)

Abstract

We investigate symbolic transformations on unevaluated expressions representing objective functions, which generate the unevaluated composite objective functions required when implementing unconstrained nonlinear optimization methods based on exact line search. Using these transformations, we develop a MATHEMATICA implementation of the main nonlinear optimization methods. We also implement the same optimization methods using an approximate minimization rule as well as three inexact line search procedures. A comparison of the number of iterations and the CPU time among these five variants of the optimization methods is presented.
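The inexact line search variants mentioned in the abstract typically enforce a sufficient-decrease condition such as Armijo's rule. The following minimal Python sketch of Armijo backtracking is a hypothetical illustration, not the chapter's actual MATHEMATICA implementation; the function names and the quadratic test objective are assumptions made for this example.

```python
def backtracking_armijo(f, grad_f, x, d, alpha=1.0, rho=0.5, c=1e-4):
    """Shrink the step alpha until the Armijo sufficient-decrease condition
    f(x + alpha*d) <= f(x) + c*alpha*(grad f(x) . d) holds."""
    fx = f(x)
    # Directional derivative along the search direction d; it must be
    # negative for d to be a descent direction.
    slope = sum(g * di for g, di in zip(grad_f(x), d))
    while f([xi + alpha * di for xi, di in zip(x, d)]) > fx + c * alpha * slope:
        alpha *= rho
    return alpha

# Illustrative objective: f(x) = x1^2 + 4*x2^2, minimized from x = (2, 1)
# along the steepest-descent direction d = -grad f(x).
f = lambda x: x[0] ** 2 + 4 * x[1] ** 2
grad_f = lambda x: [2 * x[0], 8 * x[1]]
x = [2.0, 1.0]
d = [-g for g in grad_f(x)]  # d = (-4, -8)
step = backtracking_armijo(f, grad_f, x, d)
```

In contrast to exact line search, which minimizes f(x + t*d) over t symbolically or numerically, this rule only guarantees sufficient decrease, trading step-length optimality for cheap iterations.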


Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  • Predrag Stanimirović (1)
  • Marko Miladinović (1)
  • Ivan M. Jovanović (2)

  1. Faculty of Science, Department of Mathematics, University of Niš, Niš, Serbia
  2. Technical Faculty, University of Belgrade, Bor, Serbia