Abstract
This paper presents, for the first time, a robust exact line-search method based on a full pseudospectral (PS) numerical scheme employing orthogonal polynomials. The proposed method adopts an adaptive search procedure and combines the superior accuracy of Chebyshev PS approximations with the high-order derivative approximations obtained through Chebyshev PS differentiation matrices. In addition, the method achieves a quadratic convergence rate by employing an adaptive Newton iterative scheme. A rigorous error analysis of the proposed method is presented, along with a detailed set of pseudocodes for the established computational algorithms. Several numerical experiments on one- and multi-dimensional optimization test problems illustrate the advantages of the proposed strategy.
References
Baltensperger, R.: Improving the accuracy of the matrix differentiation method for arbitrary collocation points. Appl. Numer. Math. 33(1), 143–149 (2000)
Baltensperger, R., Trummer, M.R.: Spectral differencing with a twist. SIAM J. Sci. Comput. 24(5), 1465–1487 (2003)
Canuto, C., Hussaini, M.Y., Quarteroni, A., Zang, T.A.: Spectral Methods in Fluid Dynamics. Springer Series in Computational Physics. Springer, Berlin (1988)
Chong, E.K., Zak, S.H.: An Introduction to Optimization, vol. 76. Wiley, New York (2013)
Clenshaw, C.W., Curtis, A.R.: A method for numerical integration on an automatic computer. Numer. Math. 2, 197–205 (1960)
Costa, B., Don, W.S.: On the computation of high order pseudospectral derivatives. Appl. Numer. Math. 33(1), 151–159 (2000)
Day, D., Romero, L.: Roots of polynomials expressed in terms of orthogonal polynomials. SIAM J. Numer. Anal. 43(5), 1969–1987 (2005)
Elbarbary, E.M., El-Sayed, S.M.: Higher order pseudospectral differentiation matrices. Appl. Numer. Math. 55(4), 425–438 (2005)
Elgindy, K.T.: High-order numerical solution of second-order one-dimensional hyperbolic telegraph equation using a shifted Gegenbauer pseudospectral method. Numer. Methods for Partial Differ. Equ. 32(1), 307–349 (2016)
Elgindy, K.T., Hedar, A.: A new robust line search technique based on Chebyshev polynomials. Appl. Math. Comput. 206(2), 853–866 (2008)
Elgindy, K.T., Smith-Miles, K.A.: Fast, accurate, and small-scale direct trajectory optimization using a Gegenbauer transcription method. J. Comput. Appl. Math. 251, 93–116 (2013a)
Elgindy, K.T., Smith-Miles, K.A.: Optimal Gegenbauer quadrature over arbitrary integration nodes. J. Comput. Appl. Math. 242, 82–106 (2013b)
Elgindy, K.T., Smith-Miles, K.A.: Solving boundary value problems, integral, and integro-differential equations using Gegenbauer integration matrices. J. Comput. Appl. Math. 237(1), 307–325 (2013)
Elgindy, K.T., Smith-Miles, K.A., Miller, B.: Solving optimal control problems using a Gegenbauer transcription method. In: 2012 2nd Australian Control Conference (AUCC), IEEE, pp. 417–424 (2012)
Gottlieb, D., Orszag, S.A.: Numerical analysis of spectral methods: theory and applications, no. 26. In: CBMS-NSF Regional Conference Series in Applied Mathematics. SIAM, Philadelphia (1977)
Hesthaven, J.S.: From electrostatics to almost optimal nodal sets for polynomial interpolation in a simplex. SIAM J. Numer. Anal. 35(2), 655–676 (1998)
Kopriva, D.A.: Implementing Spectral Methods for Partial Differential Equations: Algorithms for Scientists and Engineers. Springer, Berlin (2009)
Mason, J.C., Handscomb, D.C.: Chebyshev Polynomials. Chapman & Hall/CRC, Boca Raton (2003)
Nickalls, R.: Viète, Descartes and the cubic equation. Math. Gaz. 203–208 (2006)
Nocedal, J., Wright, S.: Numerical Optimization. Springer, Berlin (2006)
Pedregal, P.: Introduction to Optimization, vol. 46. Springer, Berlin (2006)
Peherstorfer, F.: Zeros of linear combinations of orthogonal polynomials. In: Mathematical Proceedings of the Cambridge Philosophical Society, vol. 117, pp. 533–544. Cambridge University Press, Cambridge (1995)
Sherman, J., Morrison, W.J.: Adjustment of an inverse matrix corresponding to changes in the elements of a given column or a given row of the original matrix. Ann. Math. Stat. 20, 621 (1949)
Snyder, M.A.: Chebyshev Methods in Numerical Approximation, vol. 2. Prentice-Hall, Englewood Cliffs (1966)
Weideman, J.A.C., Reddy, S.C.: A MATLAB differentiation matrix suite. ACM Trans. Math. Softw. 26(4), 465–519 (2000)
Appendices
Appendix 1: Preliminaries
In this section, we present some useful results from approximation theory. The first-kind Chebyshev polynomial (or simply the Chebyshev polynomial) of degree \(n\), \(T_n(x)\), is given by the explicit trigonometric formula
$$T_n(x) = \cos \left( n \arccos x \right), \quad x \in [-1, 1], \qquad (8.1)$$
or in the following explicit polynomial form [24]:
$$T_n(x) = \sum_{k=0}^{\lfloor n/2 \rfloor} t_k^{(n)}\, x^{n-2k}, \quad n \ge 1, \qquad (8.2)$$
where
$$t_k^{(n)} = \frac{(-1)^k\, 2^{n-2k-1}\, n}{n-k} \binom{n-k}{k}, \quad k = 0, \ldots, \lfloor n/2 \rfloor. \qquad (8.3)$$
The Chebyshev polynomials can be generated by the three-term recurrence relation
$$T_{n+1}(x) = 2x\, T_n(x) - T_{n-1}(x), \quad n \ge 1, \qquad (8.4)$$
starting with \(T_0(x) = 1\) and \(T_1(x) = x\). They are orthogonal on the interval \([-1, 1]\) with respect to the weight function \(w(x) = \left( 1 - x^2 \right)^{-1/2}\), and their orthogonality relation is given by
$$\int_{-1}^{1} T_n(x)\, T_m(x)\, w(x)\, \mathrm{d}x = \frac{\pi}{2}\, c_n\, \delta_{nm}, \qquad (8.5)$$
where \(c_0 = 2\), \(c_n = 1\) for \(n \ge 1\), and \(\delta_{nm}\) is the Kronecker delta function defined by \(\delta_{nm} = 1\) if \(n = m\) and \(\delta_{nm} = 0\) otherwise.
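The three-term recurrence above is the standard way to evaluate \(T_n\) numerically. A minimal Python sketch (the function name `chebyshev_T` is ours, not from the paper) confirms that the recurrence reproduces the trigonometric formula \(T_n(x) = \cos(n \arccos x)\):

```python
import math

def chebyshev_T(n, x):
    """Evaluate the degree-n Chebyshev polynomial T_n(x) by the
    three-term recurrence T_{k+1}(x) = 2*x*T_k(x) - T_{k-1}(x),
    starting from T_0(x) = 1 and T_1(x) = x."""
    if n == 0:
        return 1.0
    t_prev, t_curr = 1.0, x          # T_0, T_1
    for _ in range(n - 1):
        t_prev, t_curr = t_curr, 2.0 * x * t_curr - t_prev
    return t_curr

# The recurrence agrees with the trigonometric form on [-1, 1].
x = 0.3
for n in range(8):
    assert abs(chebyshev_T(n, x) - math.cos(n * math.acos(x))) < 1e-12
```

The recurrence is preferred over the trigonometric form when \(x\) lies outside \([-1, 1]\), where \(\arccos x\) is undefined over the reals.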
The roots (also known as the Chebyshev–Gauss points) of \(T_n(x)\) are given by
$$x_k = \cos \left( \frac{(2k+1)\pi}{2n} \right), \quad k = 0, \ldots, n - 1, \qquad (8.6)$$
and the extrema (also known as the Chebyshev–Gauss–Lobatto (CGL) points) are defined by
$$x_j = \cos \left( \frac{j \pi}{n} \right), \quad j = 0, \ldots, n. \qquad (8.7)$$
The derivative of \(T_n(x)\) can be obtained in terms of Chebyshev polynomials as follows [18]:
$$T_n'(x) = 2n \sum_{\substack{k=0 \\ (n-k)\ \mathrm{odd}}}^{n-1} \frac{T_k(x)}{c_k}. \qquad (8.8)$$
Clenshaw and Curtis [5] showed that a continuous function \(f(x)\) with bounded variation on \([-1, 1]\) can be approximated by the truncated series
$$f(x) \approx \sideset{}{''}\sum_{k=0}^{n} a_k T_k(x), \qquad (8.9)$$
where
$$a_k = \frac{2}{n} \sideset{}{''}\sum_{j=0}^{n} f_j\, T_k(x_j), \quad k = 0, \ldots, n, \qquad (8.10)$$
\(x_j, j = 0, \ldots, n\), are the CGL points defined by Eq. (8.7), \(f_j = f(x_j)\, \forall j\), and the summation symbol with double primes denotes a sum with both the first and last terms halved. For a smooth function \(f\), the Chebyshev series (8.9) converges exponentially, i.e., faster than any finite power of \(1/n\) [15].
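The coefficient formula and truncated series can be sketched directly in Python; using \(T_k(x_j) = \cos(kj\pi/n)\) at the CGL points, the double-primed sums become plain loops with the endpoint terms halved (the function names `chebyshev_coeffs` and `chebyshev_eval` are ours). For a smooth function such as \(e^x\), a modest number of nodes already reaches near machine precision:

```python
import math

def chebyshev_coeffs(f, n):
    """Coefficients a_k of f at the CGL points x_j = cos(j*pi/n):
    a_k = (2/n) * sum''_{j=0}^{n} f(x_j) * cos(k*j*pi/n),
    where the double prime halves the first and last terms."""
    fs = [f(math.cos(j * math.pi / n)) for j in range(n + 1)]
    coeffs = []
    for k in range(n + 1):
        s = 0.0
        for j in range(n + 1):
            term = fs[j] * math.cos(k * j * math.pi / n)
            if j == 0 or j == n:
                term *= 0.5              # double-primed sum
            s += term
        coeffs.append(2.0 * s / n)
    return coeffs

def chebyshev_eval(coeffs, x):
    """Evaluate the truncated series sum''_k a_k T_k(x),
    halving the first and last coefficients."""
    n = len(coeffs) - 1
    s = 0.0
    for k, a in enumerate(coeffs):
        w = 0.5 if k in (0, n) else 1.0  # double-primed sum
        s += w * a * math.cos(k * math.acos(x))
    return s

a = chebyshev_coeffs(math.exp, 20)
assert abs(chebyshev_eval(a, 0.3) - math.exp(0.3)) < 1e-12
```

In production code the inner double loop would be replaced by a fast cosine transform, which lowers the cost of computing all coefficients from \(O(n^2)\) to \(O(n \log n)\).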
Appendix 2: Pseudocodes of developed computational algorithms
Elgindy, K.T. Optimization via Chebyshev polynomials. J. Appl. Math. Comput. 56, 317–349 (2018). https://doi.org/10.1007/s12190-016-1076-x
Keywords
- Adaptive
- Chebyshev polynomials
- Differentiation matrix
- Line search
- One-dimensional optimization
- Pseudospectral method