Abstract
Three term relations for orthogonal polynomials in several variables associated with a moment linear functional obtained by a Uvarov modification of a given moment functional are studied. The existence of Uvarov orthogonal polynomials is analyzed, and conditions ensuring it are stated. The matrices of the three term relations of the Uvarov orthogonal polynomials are explicitly given in terms of the matrices of the three term relations satisfied by the original family. Two examples are presented in order to show that the results are valid for positive definite linear functionals and also for some quasi definite linear functionals which are not positive definite.
1 Introduction
Let \(\{P_{n}\}_{n \ge 0}\) be a sequence of orthogonal polynomials with respect to a moment linear functional u. One of the most interesting problems in the theory of orthogonal polynomials, in one as well as in several variables, is the modification of the moment linear functional u and the study of the new family of orthogonal polynomials \(\{Q_{n}\}_{n \ge 0}\) with respect to the modified moment linear functional v. Among the orthogonality properties, the problem of how to compute the new family of orthogonal polynomials \(\{Q_{n}\}_{n \ge 0}\) in terms of the original family \(\{P_{n}\}_{n \ge 0}\) has been the subject of several papers, both in the univariate and in the multivariate case. Modifications of moment functionals are related to quasi orthogonality, connection formulas between different families of orthogonal polynomials or between adjacent families of classical orthogonal polynomials, quadrature formulas, and orthogonal polynomials as solutions of higher order differential equations, among other topics.
In the univariate case, perturbations of positive Borel measures supported on the real line by the addition of mass points have been considered in the literature in the framework of spectral problems for differential operators of higher order. More precisely, for fourth order ordinary linear differential operators the analysis of their polynomial eigenfunctions was carried out in Krall (1940), where, besides the classical families (Hermite, Laguerre, Jacobi and Bessel) appearing as trivial solutions, three new families of orthogonal polynomials arise. They are the so called Legendre type (the Lebesgue measure on the interval \([-1,1]\) plus two equal positive masses located at \(\pm 1\)), the Laguerre type (the absolutely continuous measure \(\exp (-x){\text {d}}x\) supported on \((0,+\infty )\) plus a positive mass located at \(x=0\)) and the Jacobi type (the absolutely continuous measure \((1-x)^{\alpha }{\text {d}}x, \alpha >-1\), supported on (0, 1) plus a positive mass located at \(x=0\)). Notice that in Krall (1981) an updated approach is presented. Later on, in Chihara (1985) the author focuses his attention on the study of algebraic properties of orthogonal polynomials with respect to positive measures with masses located at the end points of the convex hull of the support of the measure, and some interesting examples are shown. In Marcellán and Maroni (1992) properties of orthogonal polynomials associated with a perturbation of a quasi definite linear functional by the addition of a mass at any point of the real line were analyzed in depth. Notice that in this case the authors deal with a general framework (without restrictions on the location of the mass points or on the linear functional defined on the linear space of polynomials with real coefficients). The main problem is to find necessary and sufficient conditions for the quasi definiteness of the linear functional to be preserved.
Such transformations are known in the literature as Uvarov transformations (see Uvarov 1969). They have been considered in the framework of the linear spectral transformations generated by Christoffel and Geronimus transformations (see Zhedanov 1997). The first ones are discrete Darboux transformations of the Jacobi matrix associated with the sequence of orthogonal polynomials with respect to the first linear functional (by using an LU factorization of the above matrix), while the second ones are related to UL factorizations of the Jacobi matrix and are also known as discrete Darboux transformations with parameter (see Bueno and Marcellán 2004 and García-Ardila et al. 2021). The modifications of moment linear functionals are usually classified in three types, namely: Uvarov or Krall-type modifications, defined by adding Dirac delta(s) at one or several fixed points; Christoffel modifications, constructed by multiplying the moment functional by a fixed low degree polynomial; and Geronimus modifications, obtained by dividing the moment functional by a polynomial. The main difference between Uvarov and Krall-type modifications is based on the fact that Krall-type modifications are made on classical moment functionals such as Jacobi and Laguerre, and the perturbed polynomials satisfy higher order linear differential equations (Krall 1981). Several papers have been devoted to the study of modifications of univariate moment functionals. Among others, and referring also to the references therein, we can cite Álvarez-Nodarse et al. (2004), Bueno and Marcellán (2004), Maroni (1991).
In several variables, the study of the Uvarov and Krall modifications of moment functionals was started in Delgado et al. (2010), where the modification was considered for a bilinear form obtained by adding a Dirac mass to a positive definite moment functional. Explicit formulas for the perturbed orthogonal polynomials were derived in terms of the orthogonal polynomials associated with the original moment functional. In Fernández et al. (2010), the case when the new measure is obtained by adding a set of mass points to another measure is analysed. Also in this case, orthogonal polynomials in several variables associated with the modified measure can be explicitly expressed in terms of orthogonal polynomials associated with the first measure, as can the reproducing kernels associated with these polynomials. A Uvarov modification of the bivariate classical measure on the unit disk by adding a finite set of equally spaced mass points on the border was studied in Delgado et al. (2012). In this situation, both orthogonal polynomials and reproducing kernels associated with the new measure were explicitly expressed in terms of those corresponding to the classical one, and asymptotics of kernel functions were studied.
Besides Uvarov modifications by adding Dirac masses at a finite and discrete set of points, in the context of several variables it is possible to modify the moment functional by other moment functionals defined on lower-dimensional manifolds such as curves, surfaces, etc. A family of orthogonal polynomials with respect to such a Uvarov modification of the classical ball measure, by means of a mass uniformly distributed over the sphere, was introduced in Martínez and Piñar (2016), and the authors proved that, at least in the Legendre case, these multivariate orthogonal polynomials satisfy a fourth order partial differential equation, which constitutes a natural extension of Krall orthogonal polynomials to the multivariate case. Moreover, in Delgado et al. (2018), an inner product on the triangle defined by adding Krall terms over the border and the vertices of the triangle is studied. For particular values of the parameters, orthogonal polynomials associated with this inner product satisfy fourth order partial differential equations with polynomial coefficients, as an extension of the classical theory introduced in Krall (1940) and developed later in Krall (1981).
The aim of this paper is to study the three term relations for orthogonal polynomials in several variables associated with a moment linear functional obtained by a Uvarov modification of a given moment functional. In this way, we define a general framework for Uvarov perturbations of a multivariate moment functional, we establish conditions for the existence of the perturbed polynomials, and we analyse the impact of the perturbation on the coefficients of the three term relations. Moreover, we analyse two interesting examples. First, we compute explicitly the matrix coefficients of the three term relations for the monic polynomials in the case of the Jacobi measure on the simplex with mass points added at the vertices, completing the study of this case started in Delgado et al. (2010). In addition, we show the explicit expressions of the perturbed and non perturbed polynomials of low degrees for a special choice of the parameters, and we plot both families of polynomials and their zeros, where we can see the influence of the mass points. Finally, a non positive definite bivariate moment functional based on Bessel and Laguerre polynomials perturbed by a mass point at the origin is analysed. In this case we also compute the explicit expressions for the matrix coefficients of both families of bivariate orthogonal polynomials. The question about the spectral problem associated with the orthogonal polynomials corresponding to the Uvarov perturbation of the measure supported on the simplex presented in the first example remains open. In view of the results given in Delgado et al. (2018) and Martínez and Piñar (2016), we cannot expect these polynomials to be eigenfunctions of a fourth order partial differential equation, since the Uvarov term is supported on discrete points. Moreover, whether this kind of Uvarov polynomials satisfies higher order linear partial differential equations remains an open problem.
The structure of the paper is as follows. In Sect. 2 we recall the basic background about orthogonal polynomials in several variables, including the necessary definitions and notations. In Sect. 3 Uvarov modifications of given linear functionals are studied in detail, providing results for the existence of orthogonal polynomials with respect to the new linear functional. In Sect. 4 we obtain the expressions of the three term relations for Uvarov orthogonal polynomials from the recurrence relations of the starting family. Finally, in Sect. 5 two examples are analyzed in detail, in order to show that the results are valid for positive definite linear functionals and also for non positive definite linear functionals. The first one deals with the Uvarov orthogonal polynomials on the simplex, obtained by adding mass points to the positive definite linear functional of the very classical bivariate Jacobi polynomials on the simplex. Explicit expressions for the matrices of the three term relations satisfied by the Uvarov modification are presented, and for specific values of the parameters we compare the zeros of the classical bivariate Jacobi polynomials with those of the Uvarov family. We must point out that the theory of zeros of multivariate polynomials remains largely open; only a few analytic results concerning zeros are known (see Dunkl and Xu 2014; Area et al. 2015). In general, the zero set of a multivariate polynomial is an algebraic curve, and different bases have different zeros. Finally, the second example is devoted to the Bessel-Laguerre case, which corresponds to a non positive definite moment functional. For this case we show that the presented approach is also valid, giving rise to the matrices of the three term relations for the Uvarov modification.
2 Orthogonal polynomials in several variables
Let \({\mathbb {N}}_0\) be the set of nonnegative integers and \(d \in {{\mathbb {N}}}_{0}\). For \({m}=(m_1,\dots ,m_d)\in {\mathbb {N}}_0^d\) and \({x}=(x_1,\dots ,x_d)\in {\mathbb {R}}^d\), we write a monomial as
$$\begin{aligned} {x}^{{m}} = x_1^{m_1}\,x_2^{m_2}\cdots x_d^{m_d}. \end{aligned}$$
The number \(|{m}|=m_1+\dots +m_d\) is called the total degree of \({x}^{{m}}\).
Along this paper, we denote by \(\Pi \) the linear space of polynomials in d variables with real coefficients, by \(\Pi _n\) the linear space of real polynomials of total degree not greater than n, and we will denote by \({\mathcal {P}}_n\) the space of homogeneous polynomials of total degree n in d variables. It is well known that (Dunkl and Xu 2014)
$$\begin{aligned} \dim \Pi _n = \binom{n+d}{n}, \qquad r_n := \dim {\mathcal {P}}_n = \binom{n+d-1}{n}. \end{aligned}$$
For \(h, k\ge 1\), let \({{\mathcal {M}}}_{h\times k}({\mathbb {R}})\) and \({{\mathcal {M}}}_{h\times k}({\Pi })\) be the linear spaces of \((h\times k)\) matrices with real and polynomial entries, respectively, and \(I_h\) will represent the identity matrix of size \(h\times h\). In general, if \(h=k\), we will omit one of the subscripts.
We define the degree of a polynomial matrix \(A=\left( a_{i,j} ({x})\right) _{i,j=1}^{h,k} \in {{\mathcal {M}}}_{h\times k}(\Pi )\) as the maximum of the degrees of the entries in such a matrix, i. e.,
$$\begin{aligned} \deg A = \max \{\deg a_{i,j}({x}) :\ 1\le i\le h,\ 1\le j\le k\}. \end{aligned}$$
As usual, given a matrix M, we will denote by \(M^{\top }\) its transpose, and, if M is a square matrix, then we will say that it is non-singular if \(\det M\ne 0\).
Following (Dunkl and Xu 2014, p. 71), if \(M_1\), \(M_2\), \(\ldots \), \(M_d\) are matrices of the same size \(p\times q\), then their joint matrix M is defined as
$$\begin{aligned} M = \begin{pmatrix} M_1 \\ M_2 \\ \vdots \\ M_d \end{pmatrix}, \end{aligned}$$
of size \(d\,p\times q\).
Given a sequence of polynomials of total degree n, \(\{P^n_{m}\}_{|m|=n}\), we will use the vector notation (see Dunkl and Xu 2014) to define \({\mathbb {P}}_n\) as the column vector polynomial
$$\begin{aligned} {\mathbb {P}}_n = \left( P^n_{{m}_1}({x}), P^n_{{m}_2}({x}), \ldots , P^n_{{m}_{r_n}}({x})\right) ^\top , \end{aligned}$$
where \({m}_{1},{m}_{2},\ldots ,{m}_{r_n}\) are the elements in \(\{{m} \in {\mathbb {N}}_0^d:\,|{m}|=n \}\) arranged according to the reverse lexicographical order. Observe that \({\mathbb {P}}_0\) is a constant (\(r_0 = 1\)), and \({\mathbb {P}}_1 \) is a column vector of independent multivariate polynomials of degree 1 of dimension \(r_1=d\).
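For concreteness, the index sets and dimensions above can be generated programmatically. The following Python sketch (the function names are ours, and the ordering shown for \(d=2\), with the power of \(x_1\) decreasing along the list, is one common reading of the reverse lexicographical order) enumerates the multi-indices of total degree n and checks the values of \(r_n\) and \(\dim \Pi _n\):

```python
from math import comb

def multi_indices(n, d):
    """Multi-indices m in N_0^d with |m| = n; for d = 2 the list reads
    (n,0), (n-1,1), ..., (0,n), i.e. decreasing power of x_1."""
    out = []
    def rec(prefix, rem, k):
        if k == 1:                      # last coordinate takes the remainder
            out.append(tuple(prefix) + (rem,))
            return
        for m1 in range(rem, -1, -1):   # first coordinate decreases
            rec(prefix + [m1], rem - m1, k - 1)
    rec([], n, d)
    return out

# r_n = dim P_n and dim Pi_n, as recalled above
r = lambda n, d: comb(n + d - 1, d - 1)
dim_Pi = lambda n, d: comb(n + d, d)
```

For instance, `multi_indices(2, 2)` returns `[(2, 0), (1, 1), (0, 2)]`, which are the exponents of \(x^2, xy, y^2\).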
Definition 1
A polynomial system (PS) is a vector polynomial sequence \(\{{\mathbb {P}}_n\}_{n\ge 0}\) such that the set of the entries of \(\{{\mathbb {P}}_m\}_{m=0}^n\) is a basis of \(\Pi _n\) for each \(n\ge 0\), and by extension we will say that \(\{{\mathbb {P}}_m\}_{m=0}^n\) is a basis of \(\Pi _n\).
The simplest case of polynomial system is the so–called canonical polynomial system, defined as
$$\begin{aligned} {\mathbb {X}}_n = \left( {x}^{{m}_1}, {x}^{{m}_2}, \ldots , {x}^{{m}_{r_n}}\right) ^\top , \qquad n\ge 0, \end{aligned}$$
where the monomials of total degree n are arranged in reverse lexicographical order.
Using the vector notation, for a given polynomial system \(\{{\mathbb {P}}_n\}_{n\ge 0}\), the vector polynomial \({\mathbb {P}}_n\) can be written as
$$\begin{aligned} {\mathbb {P}}_n = \sum _{m=0}^{n} G_{n,m}\,{\mathbb {X}}_m, \qquad G_{n,m}\in {{\mathcal {M}}}_{r_n\times r_m}({\mathbb {R}}), \end{aligned}$$
where \(G_n = G_{n,n}\) is called the leading coefficient of \({\mathbb {P}}_n\), which is a square matrix of size \(r_n\). Moreover, since \(\{{\mathbb {P}}_m\}_{m=0}^n\) is a basis of \(\Pi _n\), then \(G_n\) is non-singular.
We will say that two PS \(\{{\mathbb {P}}_n\}_{n\ge 0}\) and \(\{{\mathbb {Q}}_n\}_{n\ge 0}\) have the same leading coefficient if \({\mathbb {P}}_0={\mathbb {Q}}_0\), and \({\mathbb {P}}_n\) and \({\mathbb {Q}}_n\) have the same leading coefficient matrix for \(n\ge 1\), that is, the entries of the vector \({\mathbb {P}}_n - {{\mathbb {Q}}}_n\) are polynomials in \(\Pi _{n-1}\).
In addition, a polynomial system is called monic if every polynomial contains only one monic term of highest degree, that is, for \(n\ge 0\),
$$\begin{aligned} P^n_{{m}_k}({x}) = {x}^{{m}_k} + R({x}), \qquad 1\le k\le r_n, \end{aligned}$$
where \(|{m}_k| = n\), and \(R({x})\in \Pi _{n-1}\). Equivalently, a monic polynomial system is a polynomial system such that its leading coefficient is the identity matrix, i. e., \(G_n = I_{r_n}\), for \(n\ge 0\).
Usually, a moment linear functional u defined on \(\Pi \) is introduced from its moments. In fact (Dunkl and Xu 2014), let \(\{\mu _{{m}}\}_{{m}\in {\mathbb {N}}_0^d}\) be a multi–sequence of real numbers and let u be a real valued functional defined on \({\Pi }\) by means of
$$\begin{aligned} \langle u, {x}^{{m}}\rangle = \mu _{{m}}, \qquad {m}\in {\mathbb {N}}_0^d, \end{aligned}$$
and extended by linearity. Then, u is called the moment functional determined by \(\{\mu _{{m}}\}_{{m}\in {\mathbb {N}}_0^d}\), and the number \(\mu _{{m}}\) is called the moment of order m.
The action of a moment linear functional u is extended over polynomial matrices in the following way (see, for instance, Dunkl and Xu 2014). Let \(A= \left( a_{i,j} ({x})\right) _{i,j=1}^{h,k} \in {{\mathcal {M}}}_{h\times k}(\Pi )\) be a polynomial matrix. Then
$$\begin{aligned} \langle u, A \rangle = \left( \langle u, a_{i,j}({x})\rangle \right) _{i,j=1}^{h,k} \in {{\mathcal {M}}}_{h\times k}({\mathbb {R}}). \end{aligned}$$
Given a moment functional u in \(\Pi \), two polynomials p and q are said to be orthogonal with respect to u if \(\langle u, p\,q\rangle =0\). For each \(n\ge 0\), let \({\mathcal {V}}_n \subset \Pi _n\) be the set of polynomials of total degree n that are orthogonal with respect to the linear functional u to all polynomials in \(\Pi _{n-1}\), together with the zero polynomial. Then, \({\mathcal {V}}_n\) is a linear space of dimension less than or equal to \(r_n\) (Dunkl and Xu 2014).
Definition 2
An orthogonal polynomial system (OPS) with respect to a given linear functional u is a PS \(\{{\mathbb {P}}_n\}_{n\ge 0}\) such that
$$\begin{aligned} \langle u, {\mathbb {P}}_n\,{\mathbb {P}}_m^\top \rangle = H_n\,\Delta _{n,m}, \qquad n, m\ge 0, \end{aligned}$$(1)where \(\Delta _{n,m}\) is the \(r_n\times r_m\) zero matrix for \(n\ne m\), and the identity matrix for \(n=m\), and \(H_n\in {{\mathcal {M}}}_{r_n}({\mathbb {R}})\) is a symmetric and non-singular matrix.
Moreover, if \(H_n\) is a diagonal matrix, we will say that \(\{{\mathbb {P}}_n\}_{n\ge 0}\) is a mutually orthogonal polynomial system with respect to the linear functional u.
A moment functional u is called quasi definite if there exists an OPS with respect to u. Given a quasi definite moment linear functional u, orthogonal polynomial systems with respect to u are not unique. In fact, \(\{{\mathbb {P}}_n\}_{n\ge 0}\) and \(\{\widehat{{\mathbb {P}}}_n\}_{n\ge 0}\) are OPS associated with u if and only if there exist non-singular matrices \(F_n\) such that
$$\begin{aligned} \widehat{{\mathbb {P}}}_n = F_n\,{\mathbb {P}}_n, \qquad n\ge 0. \end{aligned}$$
In this case, for \(n, m\ge 0\) we get
$$\begin{aligned} \langle u, \widehat{{\mathbb {P}}}_n\,\widehat{{\mathbb {P}}}_m^\top \rangle = {\widehat{H}}_n\,\Delta _{n,m}, \end{aligned}$$
where \({\widehat{H}}_n = F_n\,H_n\,F_n^\top \) is a non-singular and symmetric matrix. Therefore, there exists a unique monic orthogonal polynomial system that can be obtained by
$$\begin{aligned} G_n^{-1}\,{\mathbb {P}}_n, \qquad n\ge 0, \end{aligned}$$
where \(G_n\) are the respective leading coefficients of \({\mathbb {P}}_n\).
Observe that u is quasi definite if and only if
$$\begin{aligned} \dim {\mathcal {V}}_n = r_n, \qquad n\ge 0. \end{aligned}$$
In this case, a PS \(\{{\mathbb {P}}_n\}_{n\ge 0}\) is an OPS if and only if the set of entries of the vector \({\mathbb {P}}_n\) is a basis of \({\mathcal {V}}_n\), \(n\ge 0\).
A moment functional u is called positive definite if \(\langle u, p^2({x})\rangle > 0\), for all \(p({x}) \in \Pi \), \(p({x})\not \equiv 0\). A positive definite moment functional is quasi definite, and it is possible to construct an orthogonal polynomial system such that \(H_n\) is positive definite. If \(H_n\) is the identity matrix, then \(\{P_m^n\}_{|m|=n}\) is an orthonormal basis for \({\mathcal {V}}_n\) and the OPS is called an orthonormal polynomial system.
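For a positive definite functional realized by a measure, an orthonormal polynomial system can be computed numerically by a Cholesky factorization of the moment (Gram) matrix in the monomial basis. A minimal Python sketch, using the Lebesgue measure on \([-1,1]^2\) as an illustrative positive definite functional (the quadrature rule is exact for all integrands involved):

```python
import numpy as np

# tensor Gauss-Legendre rule on [-1,1]^2 realizing <u, p> = int p dx dy
t, w = np.polynomial.legendre.leggauss(8)
X, Y = np.meshgrid(t, t)
W = np.outer(w, w).ravel()
x, y = X.ravel(), Y.ravel()

# monomial basis of Pi_3 in graded order
mons = [(i - j, j) for i in range(4) for j in range(i + 1)]
V = np.column_stack([x**a * y**b for a, b in mons])

G = V.T @ (W[:, None] * V)      # moment (Gram) matrix, symmetric positive definite
L = np.linalg.cholesky(G)       # G = L L^T
C = np.linalg.inv(L)            # rows of C = coefficients of an orthonormal basis

# the polynomials with coefficient rows C are orthonormal with respect to u
I = C @ G @ C.T
```

Since `C` is lower triangular in the graded order, each block of rows of total degree n spans \({\mathcal {V}}_n\), so this construction produces an OPS with \(H_n = I_{r_n}\).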
Orthogonal polynomials in several variables are characterized by d vector–matrix three term relations (see Dunkl and Xu 2014, Theorem 3.3.7, p. 74). More precisely,
Theorem 3
(Dunkl and Xu 2014) Let \(\{{\mathbb {P}}_n\}_{n\ge 0} = \{ P^n_{\mathrm {m}}(\mathrm {x}): |\mathrm {m}| = n, n\in {\mathbb {N}}_0\}, {\mathbb {P}}_0=1\), be an arbitrary sequence in \(\Pi \). Then the following statements are equivalent.
-
1.
There exists a linear functional u which defines a quasi definite moment functional on \(\Pi \) and which makes \(\{{\mathbb {P}}_n\}_{n\ge 0}\) an orthogonal basis in \(\Pi \).
-
2.
For \(n\ge 0\), \(1\le i\le d\), there exist matrices \(A_{n,i}\), \(B_{n,i}\) and \(C_{n,i}\) of respective sizes \(r_n\times r_{n+1}, \, r_n\times r_{n}\) and \( r_n \times r_{n-1}\), such that
-
(a)
the polynomials \({\mathbb {P}}_n\) satisfy the three term relations
$$\begin{aligned} x_i {\mathbb {P}}_n = A_{n,i} {\mathbb {P}}_{n+1} + B_{n,i} {\mathbb {P}}_{n} +C_{n,i} {\mathbb {P}}_{n-1}, \quad 1\le i\le d, \end{aligned}$$(2)with \({\mathbb {P}}_{-1}=0\), \(C_{-1,i} =0\), and \(\mathrm {x} = (x_1, x_2, \ldots , x_d)\),
-
(b)
for \(n\ge 0\) and \(1\le i\le d\), the matrices \(A_{n,i}\) and \(C_{n+1,i}\) satisfy the rank conditions
$$\begin{aligned} {\text {rank}}\, A_{n,i} = {\text {rank}}\, C_{n+1,i}=r_n, \end{aligned}$$(3)and, for the joint matrices \(A_n\) of \(A_{n,1}, A_{n,2},\ldots , A_{n,d}\), of size \(d\,r_{n}\times r_{n+1}\) and \(C_{n+1}^\top \) of \(C_{n+1,1}^\top , C_{n+1,2}^\top ,\ldots , C_{n+1,d}^\top \), of size \(d\,r_{n}\times r_{n+1}\), we get
$$\begin{aligned} {\text {rank}}\, A_{n} = {\text {rank}}\, C_{n+1} = r_{n+1}. \end{aligned}$$(4)
In that case, we get
$$\begin{aligned} {\left\{ \begin{array}{ll} A_{n,i} \, H_{n+1} = \langle u, x_i\, {\mathbb {P}}_{n}{\mathbb {P}}_{n+1}^\top \rangle , \\ B_{n,i} \, H_{n} = \langle u, x_i\, {\mathbb {P}}_{n}{\mathbb {P}}_{n}^\top \rangle , \\ C_{n,i} \, H_{n-1} = \langle u, x_i\, {\mathbb {P}}_{n}{\mathbb {P}}_{n-1}^\top \rangle , \end{array}\right. } \end{aligned}$$(5)where \(H_n = \langle u, {\mathbb {P}}_{n}{\mathbb {P}}_{n}^\top \rangle \), and \(A_{n,i} \,H_{n+1} = H_n\,C_{n+1,i}^\top \).
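In the univariate case (\(d = 1\), so \(r_n = 1\)), the relations (2) and the formulas (5) reduce to the classical three term recurrence. A minimal numerical check with the monic Legendre polynomials (Lebesgue measure on \([-1,1]\)), for which \(A_n = 1\), \(B_n = 0\) and \(C_n = n^2/(4n^2-1)\):

```python
import numpy as np

# realize <u, f> = int_{-1}^{1} f(t) dt by Gauss-Legendre quadrature
t, w = np.polynomial.legendre.leggauss(20)
ip = lambda f, g: np.sum(w * f(t) * g(t))

# monic orthogonal polynomials by Gram-Schmidt on the monomials
N = 6
P = [np.poly1d([1.0])]
for n in range(1, N + 1):
    q = np.poly1d([1.0] + [0.0] * n)          # x^n
    for p in P:
        q = q - (ip(q, p) / ip(p, p)) * p
    P.append(q)

H = [ip(p, p) for p in P]                      # H_n (here a 1 x 1 "matrix")
x = np.poly1d([1.0, 0.0])

# formulas (5) with d = 1, r_n = 1
n = 3
A = ip(x * P[n], P[n + 1]) / H[n + 1]          # expected: 1 (monic case)
B = ip(x * P[n], P[n]) / H[n]                  # expected: 0 (symmetric measure)
C = ip(x * P[n], P[n - 1]) / H[n - 1]          # expected: n^2/(4n^2 - 1)
```

The identity \(A_{n,i}\,H_{n+1} = H_n\,C_{n+1,i}^\top \) is visible here as well: both sides equal \(\langle u, x\,p_n\,p_{n+1}\rangle \).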
Relations (2) can be written in a block matrix way (Kowalski 1982a, b; Dunkl and Xu 2014). In fact, for \(1\le i \le d\), we define the block Jacobi matrices
$$\begin{aligned} J_i = \begin{pmatrix} B_{0,i} & A_{0,i} & & \\ C_{1,i} & B_{1,i} & A_{1,i} & \\ & C_{2,i} & B_{2,i} & \ddots \\ & & \ddots & \ddots \end{pmatrix}, \qquad 1\le i\le d. \end{aligned}$$
Observe that the entries of the block Jacobi matrices are the coefficients of the ith three term relation, whose sizes increase to infinity. If we denote by
$$\begin{aligned} {\mathbb {P}} = \left( {\mathbb {P}}_0^\top , {\mathbb {P}}_1^\top , {\mathbb {P}}_2^\top , \ldots \right) ^\top \end{aligned}$$
the column vector of all polynomials, then the three term relations (2) become
$$\begin{aligned} x_i\,{\mathbb {P}} = J_i\,{\mathbb {P}}, \qquad 1\le i\le d. \end{aligned}$$
The version of the above theorem for orthonormal polynomial systems \(\{{\mathbb {P}}_n\}_{n\ge 0}\) is obtained by replacing \(C_{n,i}\) with \(A_{n-1,i}^\top \), \( 1 \le i \le d\), since \(H_n = I_{r_n}\), \( \, n \ge 0\).
If \(\{\widehat{{\mathbb {P}}}_n\}_{n\ge 0}\) is another OPS associated with u, then there exist non-singular matrices \(F_n\) such that \(\widehat{{\mathbb {P}}}_n = F_n\,{\mathbb {P}}_n, n\ge 0\). Multiplying (2) by \(F_n\), we deduce that \(\{\widehat{{\mathbb {P}}}_n\}_{n\ge 0}\) satisfy the three term relations
$$\begin{aligned} x_i\,\widehat{{\mathbb {P}}}_n = {\widehat{A}}_{n,i}\,\widehat{{\mathbb {P}}}_{n+1} + {\widehat{B}}_{n,i}\,\widehat{{\mathbb {P}}}_{n} + {\widehat{C}}_{n,i}\,\widehat{{\mathbb {P}}}_{n-1}, \quad 1\le i\le d, \end{aligned}$$
where \({\widehat{A}}_{n,i} = F_n A_{n,i}F_{n+1}^{-1}\), \({\widehat{B}}_{n,i} = F_n B_{n,i} F_n^{-1}\), and \({\widehat{C}}_{n,i} = F_n C_{n,i}F_{n-1}^{-1}\). Obviously, the rank conditions (3) and (4) are preserved since the rank is unchanged upon left or right multiplication by a non-singular matrix (Horn and Johnson 2013, p. 13).
When the orthogonal polynomial system \(\{{\mathbb {P}}_n\}_{n\ge 0}\) is monic, comparing the highest coefficient matrices at both sides of (2), it follows that \(A_{n,i} = L_{n,i}\), for \(n\ge 0\), and \(1\le i\le d\) (Dunkl and Xu 2014), where \(L_{n,i}\) are matrices of size \(r_n\times r_{n+1}\) defined by
$$\begin{aligned} x_i\,{\mathbb {X}}_n = L_{n,i}\,{\mathbb {X}}_{n+1}, \qquad 1\le i\le d. \end{aligned}$$
These matrices verify \(L_{n,i}L_{n,i}^\top =I_{r_n}\), and \({\text {rank}}L_{n,i} = r_n\). Moreover, the rank of the joint matrix \(L_n\) of \(L_{n,i}\) is \(r_{n+1}\) (Dunkl and Xu 2014, p. 71).
For the particular case \(d=2\), we have that \(L_{n,i}\), \(i=1,2\), are the \((n+1)\times (n+2)\) matrices defined as
$$\begin{aligned} L_{n,1} = \begin{pmatrix} I_{n+1}&{\mathbf {0}} \end{pmatrix}, \qquad L_{n,2} = \begin{pmatrix} {\mathbf {0}}&I_{n+1} \end{pmatrix}, \end{aligned}$$(8)where \({\mathbf {0}}\) denotes a zero column vector of dimension \(n+1\).
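These rank properties can be checked numerically. The Python sketch below assumes the monomials of degree n are ordered as \(x^n, x^{n-1}y, \ldots , y^n\), so that \(L_{n,1} = (I_{n+1}\,|\,{\mathbf {0}})\) and \(L_{n,2} = ({\mathbf {0}}\,|\,I_{n+1})\):

```python
import numpy as np

def L(n, i):
    # L_{n,1} = (I | 0), L_{n,2} = (0 | I), size (n+1) x (n+2), assuming
    # degree-n monomials ordered as x^n, x^(n-1) y, ..., y^n
    I = np.eye(n + 1)
    z = np.zeros((n + 1, 1))
    return np.hstack([I, z]) if i == 1 else np.hstack([z, I])

n = 4
L1, L2 = L(n, 1), L(n, 2)
joint = np.vstack([L1, L2])     # joint matrix L_n, size 2(n+1) x (n+2)

# check x * X_n = L_{n,1} X_{n+1} and y * X_n = L_{n,2} X_{n+1} at a point
x0, y0 = 0.3, -0.7
Xv = lambda k: np.array([x0 ** (k - j) * y0 ** j for j in range(k + 1)])
```

Here `L1 @ L1.T` is the identity of size \(n+1\), each \(L_{n,i}\) has full row rank \(n+1\), and the joint matrix has full column rank \(n+2\), in agreement with the statements above.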
In the general case, comparing the leading coefficient matrices at both sides of (2), we get \( G_n\,L_{n,i} = A_{n,i}\, G_{n+1} \), that is,
$$\begin{aligned} A_{n,i} = G_n\,L_{n,i}\,G_{n+1}^{-1}, \qquad 1\le i\le d, \end{aligned}$$(9)
where \(G_{n}\) is the leading coefficient matrix of \({\mathbb {P}}_n\).
Let u be a quasi definite moment linear functional, and let \(\{{\mathbb {P}}_n\}_{n\ge 0}\) be an OPS with respect to u. In terms of \(\{{\mathbb {P}}_n\}_{n\ge 0}\), the kernel of \({\mathcal {V}}_m\), denoted by \({\mathbf {P}}_m(u;{x},{y})\) (Dunkl and Xu 2014, p. 97), is defined by
$$\begin{aligned} {\mathbf {P}}_m(u;{x},{y}) = {\mathbb {P}}_m^\top ({x})\,H_m^{-1}\,{\mathbb {P}}_m({y}). \end{aligned}$$
Similarly, the kernel of \(\Pi _n\) takes the form
$$\begin{aligned} {\mathbf {K}}_n(u;{x},{y}) = \sum _{m=0}^{n} {\mathbf {P}}_m(u;{x},{y}). \end{aligned}$$
The definition of both kernels does not depend on a particular basis.
For orthogonal polynomials in one variable, the kernel function is called reproducing kernel function. In several variables, we have an analogous property (Dunkl and Xu 2014), that is,
$$\begin{aligned} \langle u, {\mathbf {K}}_n(u;{x},\cdot )\,p \rangle = p({x}), \qquad \forall p \in \Pi _n. \end{aligned}$$
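The reproducing property can be verified numerically. The univariate sketch below (Lebesgue measure on \([-1,1]\), an illustrative choice) builds \({\mathbf {K}}_n(u;x,y) = \sum _{m\le n} p_m(x)\,H_m^{-1}\,p_m(y)\) from monic orthogonal polynomials and checks that it reproduces a polynomial of degree n:

```python
import numpy as np

# realize <u, f> = int_{-1}^{1} f(t) dt by Gauss-Legendre quadrature
t, w = np.polynomial.legendre.leggauss(20)
ip = lambda F: np.sum(w * F)

# monic orthogonal polynomials p_0, ..., p_n and their norms H_m
n = 4
P = [np.poly1d([1.0])]
for k in range(1, n + 1):
    q = np.poly1d([1.0] + [0.0] * k)                 # x^k
    for p in P:
        q = q - (ip(q(t) * p(t)) / ip(p(t) ** 2)) * p
    P.append(q)
H = [ip(p(t) ** 2) for p in P]

def K(x, y):
    # kernel K_n(u; x, y) = sum_{m<=n} p_m(x) H_m^{-1} p_m(y)
    return sum(p(x) * p(y) / h for p, h in zip(P, H))

# reproducing property: <u, K_n(x0, .) q> = q(x0) for q in Pi_n
q = np.poly1d([2.0, -1.0, 0.5, 3.0, -0.2])           # degree 4
x0 = 0.37
val = ip(K(x0, t) * q(t))
```

The value `val` agrees with `q(x0)` up to roundoff, as the property requires.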
3 Uvarov orthogonal polynomials in several variables
From now on, we consider a quasi definite moment functional u defined on \(\Pi \). Then orthogonal polynomials in several variables with respect to u exist, and we denote by \(\{{\mathbb {P}}_n\}_{n\ge 0}\) an OPS associated with it.
Let \(N \ge 1\) be a positive integer and let \(\xi ^{(1)}, \xi ^{(2)}, \ldots , \xi ^{(N)}\) be distinct points in \({\mathbb {R}}^d\). Every point has d entries, so we will write
$$\begin{aligned} \xi ^{(j)} = \left( \xi ^{(j)}_1, \xi ^{(j)}_2, \ldots , \xi ^{(j)}_d\right) , \qquad 1\le j\le N, \end{aligned}$$
with \(\xi ^{(j)}_i \in {\mathbb {R}}\), for \(1\le i\le d\).
Let \(\Lambda \) be a symmetric matrix of size \(N \times N\). We define the new moment functional v as a Uvarov modification of the original moment functional given by
$$\begin{aligned} \langle v, p\,q \rangle = \langle u, p\,q \rangle + {\mathsf {p}}(\xi )^\top \,\Lambda \,{\mathsf {q}}(\xi ), \end{aligned}$$(10)for \(p, q \in \Pi \), where
$$\begin{aligned} {\mathsf {p}}(\xi ) = \left( p(\xi ^{(1)}), p(\xi ^{(2)}), \ldots , p(\xi ^{(N)})\right) ^\top \end{aligned}$$
denotes the vector of evaluations of the polynomial p(x) at the points \(\xi ^{(1)}, \xi ^{(2)}, \ldots , \xi ^{(N)}\).
We want to know how the new inner product (10) acts over polynomial systems. Given a PS \(\{{\mathbb {P}}_n\}_{n\ge 0}\), if we denote by \({\mathsf {P}}_n(\xi )\) the matrix that has \({\mathbb {P}}_n(\xi ^{(j)})\) as columns,
$$\begin{aligned} {\mathsf {P}}_n(\xi ) = \left( {\mathbb {P}}_n(\xi ^{(1)}), {\mathbb {P}}_n(\xi ^{(2)}), \ldots , {\mathbb {P}}_n(\xi ^{(N)})\right) \in {{\mathcal {M}}}_{r_n\times N}({\mathbb {R}}), \end{aligned}$$(11)
then, the action of (10) is as follows:
$$\begin{aligned} \langle v, {\mathbb {P}}_n\,{\mathbb {Q}}_m^\top \rangle = \langle u, {\mathbb {P}}_n\,{\mathbb {Q}}_m^\top \rangle + {\mathsf {P}}_n(\xi )\,\Lambda \,{\mathsf {Q}}_m(\xi )^\top . \end{aligned}$$
If the moment functional v defined in (10) is quasi definite, we denote by \(\{{\mathbb {Q}}_n\}_{n\ge 0}\) an orthogonal polynomial system associated with it, such that \({\mathbb {P}}_n\) and \({\mathbb {Q}}_n\) have the same leading coefficient, for all \(n\ge 0\).
Following Delgado et al. (2010), if u is given by means of a measure \(d\mu ({x})\) on \({\mathbb {R}}^d\) with all moments finite, and we assume that \(d \mu \) is positive definite in the sense that
$$\begin{aligned} \int _{{\mathbb {R}}^d} p({x})^2\,d\mu ({x}) > 0, \qquad \forall p\in \Pi , \quad p \not \equiv 0, \end{aligned}$$
and the matrix \(\Lambda \) is positive definite, then v is positive definite and an OPS with respect to v exists.
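This positive definiteness argument can be checked numerically: in a monomial basis, the Gram matrix of v is the Gram matrix of u plus the positive semidefinite term \(E^\top \Lambda E\), where E collects the evaluations of the basis at the mass points, so the smallest eigenvalue can only increase. A Python sketch with illustrative choices (Lebesgue measure on the square \([-1,1]^2\), three mass points, a diagonal positive definite \(\Lambda \); all names are ours):

```python
import numpy as np

# <u, p q> realized by a tensor Gauss-Legendre rule on [-1,1]^2
t, w = np.polynomial.legendre.leggauss(8)
X, Y = np.meshgrid(t, t)
W = np.outer(w, w).ravel()
xs, ys = X.ravel(), Y.ravel()

mons = [(i - j, j) for i in range(4) for j in range(i + 1)]   # basis of Pi_3
V = np.column_stack([xs**a * ys**b for a, b in mons])
Gu = V.T @ (W[:, None] * V)                    # Gram matrix of u

# Uvarov modification: add N = 3 mass points with weights Lambda
xi = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, -1.0]])
Lam = np.diag([0.5, 0.5, 1.0])
E = np.column_stack([xi[:, 0]**a * xi[:, 1]**b for a, b in mons])  # N x dim
Gv = Gu + E.T @ Lam @ E                        # Gram matrix of v

eigs_u = np.linalg.eigvalsh(Gu)
eigs_v = np.linalg.eigvalsh(Gv)
```

Both Gram matrices come out positive definite, and by Weyl's inequality the smallest eigenvalue of `Gv` is at least that of `Gu`.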
Our first goal is to study the existence of orthogonal polynomials with respect to the moment functional v defined in (10).
We need to introduce some extra notation. If \(\{{\mathbb {P}}_n\}_{n\ge 0}\) is an OPS with respect to u, then we denote by \({\mathsf {K}}_{n-1}\) the square matrix of constants whose entries are \({\mathbf {K}}_{n-1}(u;\xi ^{(j)},\xi ^{(k)})\),
$$\begin{aligned} {\mathsf {K}}_{n-1} = \left( {\mathbf {K}}_{n-1}(u;\xi ^{(j)},\xi ^{(k)})\right) _{j,k=1}^{N}, \end{aligned}$$(12)and denote by \({\mathbb {K}}_{n-1}(\xi ,{x})\) the \(N\times 1\) vector of polynomials
$$\begin{aligned} {\mathbb {K}}_{n-1}(\xi ,{x}) = \left( {\mathbf {K}}_{n-1}(u;\xi ^{(1)},{x}), \ldots , {\mathbf {K}}_{n-1}(u;\xi ^{(N)},{x})\right) ^\top . \end{aligned}$$(13)
In (12) and (13) for \(n=0\) we assume \({\mathbf {K}}_{-1}(u;{x},{y}) = 0\).
From the fact that \({\mathbf {K}}_{n}(u; {x}, {y}) - {\mathbf {K}}_{n-1}(u; {x}, {y}) = {\mathbf {P}}_n(u;{x},{y})\), we have immediately the following relations,
$$\begin{aligned} {\left\{ \begin{array}{ll} {\mathsf {K}}_{n} = {\mathsf {K}}_{n-1} + {\mathsf {P}}_n(\xi )^\top \,H_n^{-1}\,{\mathsf {P}}_n(\xi ), \\ {\mathbb {K}}_{n}(\xi ,{x}) = {\mathbb {K}}_{n-1}(\xi ,{x}) + {\mathsf {P}}_n(\xi )^\top \,H_n^{-1}\,{\mathbb {P}}_n({x}), \end{array}\right. } \end{aligned}$$(14)
which will be used below.
In Fernández et al. (2010), a necessary and sufficient condition for v to be quasi definite when \(N=1\) is given. In addition, orthogonal polynomials with respect to v can be expressed in terms of those with respect to the linear functional u. That result can be extended to \(N \ge 1\) by using a similar technique as in Delgado et al. (2010), but in this case, we focus our attention on the analysis of the quasi definite character of the perturbed linear functional.
Theorem 4
The moment linear functional v is quasi definite if and only if the \(N\times N\) matrix
is non-singular for \(n\ge 0\). In this case, if \(\{{\mathbb {P}}_n\}_{n\ge 0}\) denotes an OPS with respect to the linear functional u, the system of polynomials \(\{{\mathbb {Q}}_n\}_{n\ge 0}\) defined by
is an OPS with respect to the linear functional v taking \({\mathsf {K}}_{-1}(\cdot ,\cdot )=0\), where
Moreover,
In the rest of the section, we will suppose that v is quasi definite, and we denote by \(\{{\mathbb {Q}}_n\}_{n\ge 0}\) an OPS with respect to v defined by (15). Also we denote
$$\begin{aligned} {\widetilde{H}}_n = \langle v, {\mathbb {Q}}_n\,{\mathbb {Q}}_n^\top \rangle . \end{aligned}$$
Then \({\widetilde{H}}_n\) is an \(r_n\times r_n\) symmetric non-singular matrix. It turns out that both \({\widetilde{H}}_n\) and \({\widetilde{H}}_n^{-1}\) can be expressed in terms of matrices that only involve \(\{{\mathbb {P}}_m\}_{m \ge 0}\). In Delgado et al. (2010) the simplest case was considered, when u is positive definite and \(\{{\mathbb {P}}_n\}_{n\ge 0}\) is orthonormal, that is, \(H_n = I_{r_n}\).
Proposition 5
For \(n \ge 0\),
where \(\Xi _{n}\) is defined in (16).
Proof
We compute directly,
In order to study the inverse, we calculate
We study this last term. Observe that, from (14), we get
and then,
Substituting above, we get
and the result holds. \(\square \)
The next result gives explicit formulas for the reproducing kernels associated with v, which we denote by \({\mathbf {K}}_n(v;{x},{y})\), \(n\ge 0\).
Theorem 6
For \(m\ge 0\), \(\Xi _{m}\) defined in (16) is a symmetric matrix, and
where we assume \({\mathbb {K}}_{-1}(\mathrm {x},\mathrm {y}) \equiv 0\). Furthermore, for \(n \ge 0\),
4 Three term relations for Uvarov orthogonal polynomials
In this section, let \(\{{\mathbb {P}}_n\}_{n\ge 0}\) be an OPS with respect to u, and let \(\{{\mathbb {Q}}_n\}_{n\ge 0}\) be an OPS with respect to v such that they have the same leading coefficient.
Both OPS satisfy three term relations, denoted by
$$\begin{aligned} x_i\,{\mathbb {P}}_n = A_{n,i}\,{\mathbb {P}}_{n+1} + B_{n,i}\,{\mathbb {P}}_{n} + C_{n,i}\,{\mathbb {P}}_{n-1}, \quad 1\le i\le d, \end{aligned}$$(17)where \({\mathbb {P}}_{-1}=0\) and \({\mathbb {P}}_{0}=G_0 \ne 0\), and by
$$\begin{aligned} x_i\,{\mathbb {Q}}_n = {\widetilde{A}}_{n,i}\,{\mathbb {Q}}_{n+1} + {\widetilde{B}}_{n,i}\,{\mathbb {Q}}_{n} + {\widetilde{C}}_{n,i}\,{\mathbb {Q}}_{n-1}, \quad 1\le i\le d, \end{aligned}$$(18)
with \({\mathbb {Q}}_{-1}=0\) and \({\mathbb {Q}}_{0}=G_0 \ne 0\). The coefficients \(B_{n,i}\) and \({\widetilde{B}}_{n,i}\), for \(n\ge 0\), are \(r_n\times r_n\) matrices, and, \(A_{n,i}\), \({\widetilde{A}}_{n,i}\), \(C_{n,i}\), and \({\widetilde{C}}_{n,i}\) are, respectively, \(r_n \times r_{n+1}\), and \(r_n\times r_{n-1}\) coefficient matrices given by (5) satisfying the respective rank conditions (3) and (4). Here, \( {x} = (x_1, x_2, \ldots , x_d). \)
Theorem 7
The matrices \({\widetilde{A}}_{n,i}\), \({\widetilde{B}}_{n,i}\) and \({\widetilde{C}}_{n,i}\) of the three term relations (18) for the vector polynomials \(\{{\mathbb {Q}}_n\}_{n\ge 0}\) orthogonal with respect to the linear functional v defined in (10) are given by
where the matrices \(A_{n,i}\), \(B_{n,i}\) and \(C_{n,i}\) are those of the three term relations for the polynomials \(\{{\mathbb {P}}_n\}_{n\ge 0}\) orthogonal with respect to the linear functional u, \({\mathsf {P}}_n(\xi )\) is defined in (11), the matrices \(H_{n}\) are defined in (1), and \(\Xi _{n}\) is defined in (16).
Proof
Since both OPS have the same leading coefficient, by (9) we get \(A_{n,i} = {\widetilde{A}}_{n,i}\), for \(1\le i \le d\), and \(n\ge 0\).
We will compute \({\widetilde{B}}_{n,i}\) from the explicit expressions of the vector polynomials in terms of the canonical basis. In this way, we know that \({\mathbb {P}}_0({x}) = {\mathbb {Q}}_0({x})\), and, for \(n\ge 1\), we can express
$$\begin{aligned} {\mathbb {P}}_n = G_n\,{\mathbb {X}}_n + \sum _{m=0}^{n-1} G_m^n\,{\mathbb {X}}_m, \qquad {\mathbb {Q}}_n = {\widetilde{G}}_n\,{\mathbb {X}}_n + \sum _{m=0}^{n-1} {\widetilde{G}}_m^n\,{\mathbb {X}}_m, \end{aligned}$$
where \(G_n = {\widetilde{G}}_n\) are non-singular matrices of size \(r_n\), and \(G_m^n\) and \({\widetilde{G}}_m^n\), for \(m=0, 1, \ldots , n-1\), are constant matrices of dimension \(r_n\times r_m\). Comparing the coefficients of the term \({\mathbb {X}}_n\) in (18), we get
and, analogously, for the first family, using (17)
by defining \(L_{-1,i} =0\).
Next, we want to deduce the matrix coefficient of \({\mathbb {X}}_n\) in expression (15) written for \(n+1\). First, we observe that, for \(1\le j\le d\) and \(n\ge 0\), we have
and then, the coefficient of \({\mathbb {X}}_n\) in \( {{\textbf {K}}}_n(u; \xi ^{(j)},{x})\) is given by \({\mathbb {P}}_{n}^\top (\xi ^{(j)})\, H_n^{-1}\).
Therefore, the vector of kernels can be written as
Thus, the coefficient of \({\mathbb {X}}_n\) in (15) for \(n+1\) is
Substituting in (20), and using (21), we get
Since \(G_n= {\widetilde{G}}_n\) is a non-singular matrix, we get the announced expression.
Now, we compute \({\widetilde{C}}_{n+1,i}\), \(n\ge 0\). From (5), we know that
Using Proposition 5, we get
which completes the proof. \(\square \)
We would like to point out that, in the monic case, the result simplifies by substituting \(A_{n,i} = {\widetilde{A}}_{n,i} = L_{n,i}\).
Remark 1
We can give a block matrix perspective of expressions (19). Let
be the block Jacobi matrices associated with the three term relations for the Uvarov polynomials. We also define the block matrices
where \(\bigcirc \) means a zero matrix of adequate size, and the omitted elements are considered as zeros. Then, the block Jacobi matrix associated with the Uvarov orthogonal polynomials is a perturbation of the original block Jacobi matrix in the form
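The effect of this perturbation on the Jacobi matrix can be observed numerically in the univariate analogue, where the block Jacobi matrix is the classical tridiagonal one. The sketch below computes, via the Stieltjes procedure, the monic recurrence coefficients of the Lebesgue measure on \([-1,1]\) and of its Uvarov modification by a mass 1/2 at \(x = 1\) (all concrete choices are illustrative):

```python
import numpy as np

def monic_jacobi(nodes, weights, N):
    """Stieltjes procedure: coefficients b_k (diagonal) and c_k
    (subdiagonal) of the monic three term recurrence
    x p_k = p_{k+1} + b_k p_k + c_k p_{k-1} for the discrete measure."""
    ip = lambda f, g: np.sum(weights * f * g)
    p_prev = np.zeros_like(nodes)
    p = np.ones_like(nodes)
    b, c = [], []
    for k in range(N):
        bk = ip(nodes * p, p) / ip(p, p)
        ck = 0.0 if k == 0 else ip(p, p) / ip(p_prev, p_prev)
        b.append(bk); c.append(ck)
        p_next = (nodes - bk) * p - ck * p_prev
        p_prev, p = p, p_next
    return np.array(b), np.array(c)

t, w = np.polynomial.legendre.leggauss(30)

# original measure: Lebesgue on [-1,1]
b0, c0 = monic_jacobi(t, w, 5)

# Uvarov modification: add a mass 1/2 at x = 1
tu = np.append(t, 1.0)
wu = np.append(w, 0.5)
bu, cu = monic_jacobi(tu, wu, 5)
```

For the original measure the diagonal vanishes and \(c_k = k^2/(4k^2-1)\) (monic Legendre), while the mass point breaks the symmetry: for instance \(b_0\) becomes \(0.5/2.5 = 0.2\), so the perturbed Jacobi matrix differs from the original one in its upper-left entries, in the spirit of the block matrix perturbation above.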
5 Examples
In this section we analyse two examples in the bivariate case to apply our results. In the first example we study Uvarov polynomials on the bivariate simplex with mass points at the vertices. We transform the standard basis of the polynomials on the simplex to the monomial basis, and compute explicitly all the matrices that we need. We compare some polynomials of low total degree, showing their explicit expressions both for the classical case and for the Uvarov modification, for some specific values of the parameters. The second example is devoted to a non positive definite case based on Bessel and Laguerre polynomials.
In both cases, we compute explicitly the involved coefficients. For simplicity in this bivariate case, we will denote \(x= x_1\), \(y =x_2\).
5.1 Bivariate Uvarov orthogonal polynomials on the simplex
For \(0 \le m \le n\), and \(\alpha , \beta , \gamma > -1\), let us consider the family of classical bivariate polynomials on the simplex
where
stands for classical univariate Jacobi orthogonal polynomials in [0, 1], i.e.
where \(\delta _{n,m}\) denotes the Kronecker delta.
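For orientation, in one common normalization (the shift \(x \mapsto 2x-1\) of the standard Jacobi polynomials; the authors' normalization may differ by constant factors) the orthogonality relation on [0, 1] reads

```latex
\[
\int_0^1 (1-x)^{a}\, x^{b}\, P_n^{(a,b)}(x)\, P_m^{(a,b)}(x)\, \mathrm{d}x
  = \frac{\Gamma(n+a+1)\,\Gamma(n+b+1)}{(2n+a+b+1)\,\Gamma(n+a+b+1)\, n!}\;
    \delta_{n,m}, \qquad a, b > -1.
\]
```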
The orthogonality relation for the bivariate polynomials (23) can be written as
Hence, if
then the OPS \(\{{\mathbb {P}}_n^{(\alpha ,\beta ,\gamma )} \}_{n\ge 0}\) is a mutually orthogonal polynomial system, and
where the diagonal and non-singular matrix \({\mathbb {H}}_{n}^{(\alpha ,\beta ,\gamma )}\) has as i-th entry
for \(0\le i \le n\).
Moreover, the bivariate polynomials (23) are solutions of the following potentially self-adjoint second order partial differential equation of hypergeometric type (Area et al. 2012a, b) which has been deeply analyzed in the literature (see e.g. Appell and Kampé de Fériet (1926), p. 104, formula (28), Suetin (1999), Chapter III, Krall and Sheffer (1967), or Dunkl and Xu (2014))
The family of bivariate monic polynomials
is also solution of the partial differential equation (26). Let
In the monic case, the recurrence relations can be written as
with the initial conditions \(\widehat{{\mathbb {P}}}_{-1}^{(\alpha ,\beta ,\gamma )}=0\) and \(\widehat{{\mathbb {P}}}_{0}^{(\alpha ,\beta ,\gamma )}=1\), where the matrices \(L_{n,j}\) of size \((n+1) \times (n+2)\) are defined by (8), \({{\widehat{B}}}_{n,j}^{(\alpha ,\beta ,\gamma )}\) are of size \((n+1) \times (n+1)\), and \( {{\widehat{C}}}_{n,j}^{(\alpha ,\beta ,\gamma )}\) are matrices of size \((n+1) \times n\).
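As an illustration of how these three term relations determine the monic vector polynomials, the following sketch evaluates the vectors \(\widehat{\mathbb {P}}_n\) at a point, assuming the structure \(L_{n,1}=(I\,|\,0)\), \(L_{n,2}=(0\,|\,I)\) of (8). The helper `monic_vector_eval` is hypothetical, and the test data (all \(B\) and \(C\) blocks zero) reduce the recurrence to the monomial vectors rather than the simplex polynomials.

```python
import numpy as np

def monic_vector_eval(x, y, B1, B2, C1, C2, N):
    """Evaluate the column vectors of monic bivariate orthogonal polynomials
    at (x, y) via the three term relations
        x_j * P_n = L_{n,j} P_{n+1} + B_{n,j} P_n + C_{n,j} P_{n-1},
    assuming L_{n,1} = (I | 0), L_{n,2} = (0 | I).  Hypothetical helper."""
    P = [np.array([1.0])]
    for n in range(N):
        r1 = x * P[n] - B1[n] @ P[n]
        r2 = y * P[n] - B2[n] @ P[n]
        if n > 0:
            r1 -= C1[n] @ P[n - 1]
            r2 -= C2[n] @ P[n - 1]
        # L_{n,1} selects the first n+1 entries of P_{n+1} and L_{n,2} the
        # last n+1, so P_{n+1} is r1 followed by the last entry of r2
        P.append(np.concatenate([r1, r2[-1:]]))
    return P
```

With all \(B\) and \(C\) blocks zero the recurrence returns the monomial vectors \((x^n, x^{n-1}y, \ldots, y^n)^\top\), a quick structural sanity check.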
The square matrices \({\widehat{B}}_{n,1}^{(\alpha ,\beta ,\gamma )}\) and \({\widehat{B}}_{n,2}^{(\alpha ,\beta ,\gamma )}\) are, respectively, lower and upper bidiagonal
where
Moreover,
where
and
with entries
for \(0 \le i \le n-1\), and
for \(0 \le i \le n-2\).
Both families of orthogonal polynomials (23) and (27) (with respect to the same weight function on the same domain, and solution of the same partial differential equation) are related as
with \(u_{i,j,n}^{(\alpha ,\beta ,\gamma )}=0\) if \(j>i\) and
if \(j \le i\). Therefore, if we multiply (28) by \({{\mathbb {U}}}_{n}^{(\alpha ,\beta ,\gamma )}\) we obtain the following recurrence relations for the initial family (24)
where
By using (see e.g. Area et al. (2017), p. 776 or Suetin (1999), Eq. (15), p. 81)
we get
The orthogonality relation for the monic polynomials reads
where for \(0\le i,j\le n\) the matrix \(\widehat{{{\mathbb {H}}}}_{n}^{(\alpha ,\beta ,\gamma )}\) of size \((n+1) \times (n+1) \) has as (i, j)-entry
Let us compute the inverse of \(\widehat{{{\mathbb {H}}}}_{n}^{(\alpha ,\beta ,\gamma )}\). For standard polynomials, we just need to compute the inverse of the diagonal matrix \({{\mathbb {H}}}_{n}^{(\alpha ,\beta ,\gamma )}\) of size \((n+1)\times (n+1)\) with entries given in (25). By using (32) we have
Thus,
where \( h_{n,k-1}^{(\alpha ,\beta ,\gamma )}\) and \(u_{k,i,n}^{(\alpha ,\beta ,\gamma )}\) are defined in (25) and (33), respectively.
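If, as the entry formula above suggests, \(\widehat{{\mathbb {H}}}_{n}\) factors as \({\mathbb {U}}_{n}\, {\mathbb {H}}_{n}\, {\mathbb {U}}_{n}^{\top }\) with \({\mathbb {U}}_{n}\) unit lower triangular and \({\mathbb {H}}_{n}\) diagonal (an assumption on the omitted display, stated here only for illustration), then the inverse inherits the triangular factorization. A numerical sketch with arbitrary toy data:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
# Unit lower triangular connection matrix and diagonal norm matrix (toy data)
U = np.tril(rng.standard_normal((n, n)), -1) + np.eye(n)
H = np.diag(rng.uniform(0.5, 2.0, n))

Hhat = U @ H @ U.T                       # assumed factorization of H-hat
Ui = np.linalg.inv(U)
# Inverse via the factors: (U H U^T)^{-1} = U^{-T} H^{-1} U^{-1}
Hhat_inv = Ui.T @ np.diag(1.0 / np.diag(H)) @ Ui
assert np.allclose(np.linalg.inv(Hhat), Hhat_inv)
```

This is why, as noted in the text, only the inverse of the diagonal matrix \({\mathbb {H}}_{n}\) needs to be computed explicitly.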
Now, we introduce the Uvarov moment functional on the simplex. Let us denote by u the linear functional associated with the bivariate orthogonal polynomials on the simplex
and let the matrix \(\Lambda \) in (10) be the diagonal matrix
where \(M_1, M_2, M_3\) are positive real numbers. Hence, the Uvarov linear functional v is defined by
In this case, the matrices \({\mathsf {K}}_{n}\) defined in (12) are explicitly given by
where we denote \(u=(1,1,1)^\top \), \(A_{n} = {{\,\mathrm{diag}\,}}\left\{ a_{n,1},a_{n,2},a_{n,3}\right\} \), and
Since \(\Lambda \) is positive definite, the inverse of the matrix \(I_{3}+\Lambda {{\mathsf {K}}}_{n}\) is computed as
Let us denote
where \(z_{n,i}=M_{i}^{-1}+a_{n,i}-b_{n}.\) Using the Sherman–Morrison–Woodbury identity, it follows (see Golub and Van Loan 1996)
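The Sherman–Morrison–Woodbury identity invoked here, \((A+UCV)^{-1}=A^{-1}-A^{-1}U\,(C^{-1}+VA^{-1}U)^{-1}\,VA^{-1}\), can be checked numerically. The sketch below uses arbitrary well-conditioned random data, not the specific matrices of this section.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 6, 3
A = np.eye(n) + 0.1 * rng.standard_normal((n, n))  # well-conditioned base matrix
U = 0.3 * rng.standard_normal((n, k))
C = np.eye(k)
V = 0.3 * rng.standard_normal((k, n))

# Direct inverse of the rank-k correction
direct = np.linalg.inv(A + U @ C @ V)

# Sherman-Morrison-Woodbury:
# (A + U C V)^{-1} = A^{-1} - A^{-1} U (C^{-1} + V A^{-1} U)^{-1} V A^{-1}
Ai = np.linalg.inv(A)
smw = Ai - Ai @ U @ np.linalg.inv(np.linalg.inv(C) + V @ Ai @ U) @ V @ Ai

assert np.allclose(direct, smw)
```

The identity is what makes the inversion cheap here: the correction has small rank, so only a \(k \times k\) system is solved.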
The matrix \({\mathsf {P}}_n(\xi )\) defined in (11) is explicitly given by
where for \(m=0,1,\ldots ,n\)
from the properties
Then, we can express the monic Uvarov polynomials on the simplex by using the explicit expression (15). Applying Theorem 7, the coefficients of the three term relations for Uvarov polynomials are given by (19), where the involved matrices are already explicitly computed.
Finally, we analyse a particular example. Consider \(\alpha = \beta = 1\), \(\gamma =1/2\), and \(M_1 = M_2 = M_3 =1/2\). The Uvarov inner product is given by
Table 1 shows the explicit expressions of the first classical monic polynomials on the simplex and of the Uvarov monic polynomials perturbed as above. We have plotted the zeros (as algebraic curves, Area et al. (2015)) of the polynomials of degree 2 up to degree 5 of both families, as well as the corresponding surfaces, in Figs. 1, 2, 3, 4, 5. We must point out that the theory of zeros of multivariate polynomials remains largely open; only a few analytic results concerning zeros are known (see Dunkl and Xu 2014; Area et al. 2015). In general, the zero set of a multivariate polynomial is an algebraic curve, and different bases have different zeros. Here we want to show the impact of the Uvarov modification on the orthogonal polynomials as well as on their zeros.
5.2 Bivariate Uvarov Bessel–Laguerre orthogonal polynomials
Let us consider the classical Laguerre polynomials
orthogonal with respect to the positive definite univariate moment functional
with \(w^{(\alpha )}(t) = t^\alpha \, e^{-t}, \quad \alpha > -1\). Formula (5.1.1) in Szegő (1975) provides
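That formula (Szegő 1975, Eq. (5.1.1)) is the classical Laguerre orthogonality relation, which in the standard normalization reads

```latex
\[
\int_0^{+\infty} t^{\alpha} e^{-t}\, L_n^{(\alpha)}(t)\, L_m^{(\alpha)}(t)\,\mathrm{d}t
  = \frac{\Gamma(n+\alpha+1)}{n!}\,\delta_{n,m}
  = \Gamma(\alpha+1)\binom{n+\alpha}{n}\,\delta_{n,m},
  \qquad \alpha > -1.
\]
```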
Let
be the classical univariate Bessel polynomials orthogonal with respect to the quasi definite but not positive definite moment functional (Krall and Frink 1949)
where
for \(a\ne 0, -1, -2, \ldots \) and \(b\ne 0\), where \(i^{2}=-1\), and T is the unit circle oriented in the counter-clockwise direction (Krall and Frink (1949), Eq. (58)). Moreover, for integer \(a \ge 2\) and real b, we have
which implies
This means that Bessel polynomials are associated with a quasi definite but not positive definite moment functional.
For \(n\ge 0\), \(g, \gamma \in {\mathbb {R}}\), such that \(g + n \ne 0\), \(g \gamma + n \ne 0\), the bivariate Bessel-Laguerre orthogonal polynomials are defined by
Following (Dunkl and Xu 2014, p. 39), Bessel-Laguerre polynomials (35) are mutually orthogonal with respect to a non positive definite moment functional w acting as follows. We define
on the region \(R=\{(x,y): x\in T,\,\, 0< gy/x < + \infty \}.\) The bivariate moment functional is defined as
Therefore, if \(g\ge 2\) is integer
Bessel–Laguerre polynomials appear in Kwon et al. (2001) as solutions of the partial differential equation
and were considered later in Area et al. (2012a) and Marriaga et al. (2017), among others. The partial differential equation (37) has as monic solution
Let us define the Bessel–Laguerre mutually orthogonal polynomial system \(\{{\mathbb {P}}_n^{(g,\gamma )}\}_{n\ge 0}\) given by
We get
where \( {\mathbb {H}}_{n}^{(g,\gamma )} = {{\,\mathrm{diag}\,}}\{h_{n,0}^{(g,\gamma )} , h_{n,1}^{(g,\gamma )} , \ldots , h_{n,n}^{(g,\gamma )} \}\) is a diagonal matrix of size \((n+1)\times (n+1)\) with entries
Let us also consider the monic Bessel-Laguerre system
Both families of orthogonal polynomials (35) and (38) (with respect to the same weight function on the same domain, and solution of the same partial differential equation) are related as
with \(u_{i,j,n}^{(g,\gamma )}=0\) if \(j>i\) and
if \( j \le i\). The orthogonality relation for the monic polynomials reads
where \(\widehat{{{\mathbb {H}}}}_{n}^{(g,\gamma )}\) is a non-singular matrix of size \((n+1) \times (n+1)\). By using (41) we have
Thus
The monic Bessel-Laguerre polynomials \( \{\widehat{{\mathbb {P}}}_n^{(g,\gamma )}\}_{n\ge 0}\) satisfy three term relations
where now
Moreover, \({\widehat{C}}_{n,1}^{(g,\gamma )}\) and \({\widehat{C}}_{n,2}^{(g,\gamma )}\) have the same structure as (30) and (31), respectively, where now
and
The mutually orthogonal (non monic) Bessel–Laguerre polynomial system \( \{{\mathbb {P}}_n^{(g,\gamma )}\}_{n\ge 0}\) also satisfies the three term relations (see Marriaga et al. 2017)
If we multiply (43) by \({{\mathbb {U}}}_{n}^{(g,\gamma )}\) we obtain
The explicit expressions of the matrices in (43) are as follows. The recursion coefficients \(A_{n,j}^{(g,\gamma )}\) are the \((n+1)\times (n+2)\) matrices
where
and
with entries
The coefficients \(B_{n,j}^{(g,\gamma )}\) are
where
Moreover,
where
and
with entries
Now, we define the Uvarov modification. We take \(\xi = (0,0)\). Let us denote by w the non positive definite moment functional associated with the bivariate Bessel-Laguerre polynomials defined in (36), and let \(\Lambda = M\) be a real number such that
Hence, the Uvarov linear functional v is defined by
By using (35) as well as \(B_{n-m}^{(g+2m,-g)}(0)=1\), we get
and then,
Observe that
The matrix \({\mathsf {P}}_n(\xi )\) defined in (11) is explicitly given by using the above value in
Now, we compute the matrix \(\Lambda _n = 1 + \Lambda \,{{\mathsf {K}}_{n}}\).
In this case, \({\mathsf {K}}_{n}\) is explicitly given by
and
We can now give explicitly the matrices \({\widetilde{A}}_{n,i}\), \({\widetilde{B}}_{n,i}\) and \({\widetilde{C}}_{n,i}\) of the three term relations for the vector polynomials \(\{{\mathbb {Q}}_n\}_{n\ge 0}\) orthogonal with respect to the linear functional v defined in (44), according to Theorem 7. First of all,
Moreover,
where
and
with entries
Finally,
where
and
with entries
References
Álvarez-Nodarse R, Arvesú J, Marcellán F (2004) Modifications of quasi-definite linear functionals via addition of delta and derivatives of delta Dirac functions. Indag Math New Ser 15(1):1–20
Appell P, Kampé de Fériet J (1926) Fonctions hypergéométriques et hypersphériques. Polynômes d’Hermite, Gauthier-Villars, Paris
Area I, Godoy E, Ronveaux A, Zarzo A (2012a) Bivariate second-order linear partial differential equations and orthogonal polynomial solutions. J Math Anal Appl 387:1188–1208
Area I, Godoy E, Rodal J (2012b) On a class of bivariate second-order linear partial difference equations and their monic orthogonal polynomial solutions. J Math Anal Appl 389:165–178
Area I, Dimitrov DK, Godoy E (2015) Zero sets of bivariate Hermite polynomials. J Math Anal Appl 421:830–841
Area I, Foupouagnigni M, Godoy E, Guemo Y (2017) On moments of hypergeometric bivariate weight functions. Bull Sci Math 141(8):766–784
Bueno MI, Marcellán F (2004) Darboux transformation and perturbation of linear functionals. Linear Algebra Appl 384:215–242
Chihara TS (1985) Orthogonal polynomials and measures with end point masses. Rocky Mt J Math 15(3):705–719
Delgado AM, Fernández L, Pérez TE, Piñar MA, Xu Y (2010) Orthogonal polynomials in several variables for measures with mass points. Numer Algorithms 55:245–264
Delgado AM, Fernández L, Pérez TE, Piñar MA (2012) On the Uvarov modification of two variable orthogonal polynomials on the disk. Complex Anal Oper Theory 6(3):665–676
Delgado AM, Fernández L, Pérez TE (2018) Fourth order partial differential equations for Krall-type orthogonal polynomials on the triangle. Proc Am Math Soc 146(9):3961–3974
Dunkl CF, Xu Y (2014) Orthogonal polynomials of several variables, 2nd edition, Encyclopedia of mathematics and its applications, vol 155. Cambridge Univ. Press, Cambridge
Fernández L, Pérez TE, Piñar MA, Xu Y (2010) Krall-type orthogonal polynomials in several variables. J Comput Appl Math 233:1519–1524
García-Ardila JC, Marcellán F, Marriaga ME (2021) Orthogonal polynomials and linear functionals-an algebraic approach and applications, EMS Series of Lectures in Mathematics. EMS Press, Berlin
Golub GH, Van Loan CF (1996) Matrix computations, 3rd edn. Johns Hopkins, Baltimore
Horn RA, Johnson CR (2013) Matrix analysis, 2nd edn. Cambridge Univ. Press, Cambridge
Kowalski MA (1982a) The recursion formulas for orthogonal polynomials in n variables. SIAM J Math Anal 13(2):309–315
Kowalski MA (1982b) Orthogonality and recursion formulas for polynomials in n variables. SIAM J Math Anal 13(2):316–323
Krall HL (1940) On orthogonal polynomials satisfying a certain fourth order differential equation. Pennsyl State Coll Stud 1940(6):24
Krall AM (1981) Orthogonal polynomials satisfying fourth order differential equations. Proc R Soc Edinb Sect A Math 87:271–288
Krall HL, Frink O (1949) A new class of orthogonal polynomials: the Bessel polynomials. Trans Am Math Soc 65:100–115
Krall HL, Sheffer IM (1967) Orthogonal polynomials in two variables. Ann Mat Pura Appl 76(4):325–376
Kwon KH, Lee JK, Littlejohn LL (2001) Orthogonal polynomial eigenfunctions of second-order partial differential equations. Trans Am Math Soc 353:3629–3647
Marcellán F, Maroni P (1992) Sur l’adjonction d’une masse de Dirac à une forme régulière et semi-classique. (French). Ann Mat Pura Appl 162(4):1–22
Maroni P (1991) Une théorie algébrique des polynômes orthogonaux. Application aux polynômes orthogonaux semi-classiques. In: Orthogonal polynomials and their applications (Erice, 1990), IMACS Ann Comput Appl Math, vol 9. Baltzer, Basel, pp 95–130
Marriaga ME, Pérez TE, Piñar MA (2017) Three term relations for a class of bivariate orthogonal polynomials. Mediterr J Math 14(2):26
Martínez C, Piñar MA (2016) Orthogonal polynomials on the unit ball and fourth-order partial differential equations. SIGMA Symmetry Integrability Geom Methods Appl 12:020, 11 pp
Suetin PK (1999) Orthogonal polynomials in two variables, analytical methods and special functions, vol 3. Gordon and Breach Science Publishers, Amsterdam
Szegő G (1975) Orthogonal polynomials, vol 23, 4th edn. Amer. Math. Soc. Colloq. Publ., Amer. Math. Soc., Providence
Uvarov VB (1969) The connection between systems of polynomials that are orthogonal with respect to different distribution functions (Russian). Ž Vyčisl Mat i Mat Fiz 9:1253–1262 (English translation in USSR Comput Math Math Phys 9:25–36)
Zhedanov A (1997) Rational spectral transformations and orthogonal polynomials. J Comput Appl Math 85(1):67–86
Acknowledgements
The authors would like to thank the referee for his/her many valuable suggestions and comments which led us to improve this paper.
Funding
Open Access funding provided thanks to the CRUE-CSIC agreement with Springer Nature.
Communicated by Armin Iske.
The work of the first author (RA) has been partially supported by TUBITAK Research Grant Proj. no. 120F140. The work of the second author (IA) has been partially supported by the Agencia Estatal de Investigación (AEI) of Spain under Grant PID2020-113275GB-I00, cofinanced by the European Community fund FEDER. Third author (TEP) thanks FEDER/J. Andalucía under grant A-FQM-246-UGR20; MCIN/AEI 10.13039/501100011033 and FEDER funds under grant PGC2018-094932-B-I00; and IMAG-María de Maeztu grant CEX2020-001105-M.
Cite this article
Aktaş, R., Area, I. & Pérez, T.E. Three term relations for multivariate Uvarov orthogonal polynomials. Comp. Appl. Math. 41, 330 (2022). https://doi.org/10.1007/s40314-022-02030-x
Keywords
- Multivariate orthogonal polynomials
- Uvarov orthogonal polynomials
- Moment linear functional
- Modification of moment linear functionals