Part of the book series: Springer Optimization and Its Applications (SOIA, volume 28)

Summary

We study the asymptotic behaviour of Forsythe's s-optimum gradient algorithm for the minimization of a quadratic function in \({\mathbb R}^d\), using a renormalization that converts the algorithm into iterations applied to a probability measure. Bounds on the performance of the algorithm (its rate of convergence) are obtained through optimum design theory, and the limiting behaviour of the algorithm for s = 2 is investigated in detail. Algorithms that switch periodically between s = 1 and s = 2 are shown to converge much faster than when s is fixed at 2.
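
The summary above describes the algorithm only in words; as a rough illustration, the sketch below implements the standard optimum s-gradient step for a quadratic f(x) = (1/2) x'Ax - b'x, in which the next iterate minimizes f over the Krylov subspace x + span{g, Ag, ..., A^{s-1} g} with g the current gradient, together with a driver that switches periodically between values of s (e.g. s = 1 and s = 2). This is a minimal sketch assuming that formulation of Forsythe's method; the names s_gradient_step and switching_s_gradient are illustrative and do not come from the chapter.

```python
import numpy as np

def s_gradient_step(A, b, x, s):
    """One optimum s-gradient step for f(x) = 0.5 x'Ax - b'x, A symmetric positive definite.

    The next iterate minimizes f over x + span{g, Ag, ..., A^{s-1} g},
    where g = Ax - b is the current gradient; s = 1 is steepest descent
    with exact line search.
    """
    g = A @ x - b
    if np.linalg.norm(g) == 0.0:        # already at the minimizer
        return x
    # Krylov basis K = [g, Ag, ..., A^{s-1} g] (a d-by-s matrix).
    K = np.empty((len(x), s))
    K[:, 0] = g
    for j in range(1, s):
        K[:, j] = A @ K[:, j - 1]
    # The optimal coefficients solve the small s-by-s system (K'AK) c = -K'g.
    c = np.linalg.solve(K.T @ A @ K, -(K.T @ g))
    return x + K @ c

def switching_s_gradient(A, b, x0, schedule=(1, 2), n_iter=100):
    """Run the s-gradient algorithm, cycling periodically through `schedule`
    (e.g. alternating s = 1 and s = 2, the switching strategy studied here)."""
    x = np.array(x0, dtype=float)
    for k in range(n_iter):
        x = s_gradient_step(A, b, x, schedule[k % len(schedule)])
    return x
```

For example, with A = np.diag(np.linspace(1.0, 100.0, 10)), b = np.ones(10) and x0 = np.zeros(10), running switching_s_gradient with schedule=(1, 2) and comparing it with schedule=(2,) gives a simple numerical check of the faster convergence of the switching strategy reported in the summary.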

References

  • Akaike, H. (1959). On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method. Annals of the Institute of Statistical Mathematics, Tokyo, 11, 1–16.

  • Elaydi, S. (2005). An Introduction to Difference Equations. Springer, Berlin. Third Edition.

  • Fedorov, V. (1972). Theory of Optimal Experiments. Academic Press, New York.

  • Forsythe, G. (1968). On the asymptotic directions of the s-dimensional optimum gradient method. Numerische Mathematik, 11, 57–76.

  • Hoel, P. and Levine, A. (1964). Optimal spacing and weighting in polynomial prediction. Annals of Mathematical Statistics, 35(4), 1553–1560.

  • Kantorovich, L. and Akilov, G. (1982). Functional Analysis. Pergamon Press, London. Second edition.

  • Kiefer, J. and Wolfowitz, J. (1959). Optimum designs in regression problems. Annals of Mathematical Statistics, 30, 271–294.

  • Luenberger, D. (1973). Introduction to Linear and Nonlinear Programming. Addison-Wesley, Reading, Massachusetts.

  • Meinardus, G. (1963). Über eine Verallgemeinerung einer Ungleichung von L.V. Kantorowitsch. Numerische Mathematik, 5, 14–23.

  • Molchanov, I. and Zuyev, S. (2001). Variational calculus in the space of measures and optimal design. In A. Atkinson, B. Bogacka, and A. Zhigljavsky, editors, Optimum Design 2000, chapter 8, pages 79–90. Kluwer, Dordrecht.

  • Nocedal, J., Sartenaer, A., and Zhu, C. (1998). On the accuracy of nonlinear optimization algorithms. Technical Report Nov. 1998, ECE Department, Northwestern Univ., Evanston, IL 60208.

  • Nocedal, J., Sartenaer, A., and Zhu, C. (2002). On the behavior of the gradient norm in the steepest descent method. Computational Optimization and Applications, 22, 5–35.

  • Pronzato, L., Wynn, H., and Zhigljavsky, A. (2000). Dynamical Search. Chapman & Hall/CRC, Boca Raton.

  • Pronzato, L., Wynn, H., and Zhigljavsky, A. (2001). Renormalised steepest descent in Hilbert space converges to a two-point attractor. Acta Applicandae Mathematicae, 67, 1–18.

  • Pronzato, L., Wynn, H., and Zhigljavsky, A. (2005). Kantorovich-type inequalities for operators via D-optimal design theory. Linear Algebra and its Applications (Special Issue on Linear Algebra and Statistics), 410, 160–169.

  • Pronzato, L., Wynn, H., and Zhigljavsky, A. (2006). Asymptotic behaviour of a family of gradient algorithms in \({\mathbb R}^d\) and Hilbert spaces. Mathematical Programming, A107, 409–438.

  • Sahm, M. (1998). Optimal designs for estimating individual coefficients in polynomial regression. Ph.D. Thesis, Fakultät für Mathematik, Ruhr Universität Bochum.

  • Silvey, S. (1980). Optimal Design. Chapman & Hall, London.

Author information

Correspondence to L. Pronzato.

Copyright information

© 2009 Springer Science+Business Media LLC

Cite this chapter

Pronzato, L., Wynn, H., Zhigljavsky, A. (2009). A Dynamical-System Analysis of the Optimum s-Gradient Algorithm. In: Pronzato, L., Zhigljavsky, A. (eds) Optimal Design and Related Areas in Optimization and Statistics. Springer Optimization and Its Applications, vol 28. Springer, New York, NY. https://doi.org/10.1007/978-0-387-79936-0_3
