Three term relations for multivariate Uvarov orthogonal polynomials

Three term relations for orthogonal polynomials in several variables associated with a moment linear functional obtained by a Uvarov modification of a given moment functional are studied. The existence of Uvarov orthogonal polynomials is analyzed, and conditions ensuring it are stated. The matrices of the three term relations of the Uvarov orthogonal polynomials are given explicitly in terms of the matrices of the three term relations satisfied by the original family. Two examples are presented in order to show that the results are valid for positive definite linear functionals and also for some quasi definite linear functionals which are not positive definite.


Introduction
Let {P_n}_{n≥0} be a sequence of orthogonal polynomials with respect to a moment linear functional u. One of the most interesting problems in the theory of orthogonal polynomials, in one as well as in several variables, is the modification of the moment linear functional u and the study of the new family of orthogonal polynomials {Q_n}_{n≥0} with respect to the modified moment linear functional v. Among the orthogonality properties, the problem of how to compute the new family of orthogonal polynomials {Q_n}_{n≥0} in terms of the original family {P_n}_{n≥0} has been the subject of several papers, both in the univariate and multivariate cases. Modifications of moment functionals are related to quasi-orthogonality, connection formulas between different families of orthogonal polynomials or adjacent families of classical orthogonal polynomials, quadrature formulas, and orthogonal polynomials as solutions of higher order differential equations, among other topics.
In the univariate case, perturbations of positive Borel measures supported on the real line by the addition of mass points have been considered in the literature in the framework of spectral problems for differential operators of higher order. More precisely, for fourth order ordinary linear differential operators the analysis of their polynomial eigenfunctions was done in Krall (1940), where three new families of orthogonal polynomials appear, different from the classical ones (Hermite, Laguerre, Jacobi, and Bessel), which arise as trivial solutions. They are the so-called Legendre type (the Lebesgue measure on the interval [−1, 1] plus two equal positive masses located at ±1), the Laguerre type (the absolutely continuous measure exp(−x)dx supported on (0, +∞) plus a positive mass located at x = 0), and the Jacobi type (the absolutely continuous measure (1 − x)^α dx, α > −1, supported on (0, 1) plus a positive mass located at x = 0). Notice that in Krall (1981) an updated approach is presented. Later on, in Chihara (1985) the author focuses his attention on the algebraic properties of orthogonal polynomials with respect to positive measures with masses located at the end points of the convex hull of the support of the measure, and some interesting examples are shown. In Marcellán and Maroni (1992), properties of orthogonal polynomials associated with a perturbation of a quasi definite linear functional by the addition of a mass at any point of the real line have been deeply analyzed. Notice that in this case the authors deal with a general framework (without restrictions on the location of the mass points, and with the linear functional acting on the linear space of polynomials with real coefficients). The main problem is to find necessary and sufficient conditions for the quasi definiteness of the linear functional to be preserved. Such transformations are known in the literature as Uvarov transformations (see Uvarov 1969).
They have been considered in the framework of linear spectral transformations, which are generated by Christoffel and Geronimus transformations (see Zhedanov 1997). The first ones are discrete Darboux transformations of the Jacobi matrix associated with the sequence of orthogonal polynomials with respect to the first linear functional (by using an LU factorization of that matrix), while the second ones are related to UL factorizations of the Jacobi matrix and are also known as discrete Darboux transformations with parameter (see Bueno and García-Ardila et al. 2021). The modifications of moment linear functionals are usually classified in three types, namely: Uvarov or Krall-type modifications, defined by adding Dirac delta(s) at one or several fixed points; Christoffel modifications, constructed by multiplying the moment functional by a fixed low degree polynomial; and Geronimus modifications, obtained by dividing the moment functional by a polynomial. The main difference between Uvarov and Krall-type modifications is based on the fact that Krall-type modifications are made on classical moment functionals such as Jacobi and Laguerre, and the perturbed polynomials satisfy higher order linear differential equations (Krall 1981). Several papers have been devoted to the study of modifications of univariate moment functionals. Among others, and referring to the references therein, we can cite Álvarez-Nodarse et al. (2004), Bueno and Marcellán (2004), and Maroni (1991).
In several variables, the study of the Uvarov and Krall modifications of moment functionals was started in Delgado et al. (2010), where the modification was considered for a bilinear form obtained by adding a Dirac mass to a positive definite moment functional. Explicit formulas for the perturbed orthogonal polynomials were derived in terms of the orthogonal polynomials associated with the original moment functional. In Fernández et al. (2010), the case when the new measure is obtained by adding a set of mass points to another measure is analysed. Also in this case, orthogonal polynomials in several variables associated with the modified measure can be explicitly expressed in terms of orthogonal polynomials associated with the first measure, and so can the reproducing kernels associated with these polynomials. A Uvarov modification of the bivariate classical measure on the unit disk by adding a finite set of equally spaced mass points on the border was studied in Delgado et al. (2012). In this situation, both orthogonal polynomials and reproducing kernels associated with the new measure were explicitly expressed in terms of those corresponding to the classical one, and asymptotics of the kernel functions were studied.
Besides Uvarov modifications by adding Dirac masses at a finite and discrete set of points, in the context of several variables it is possible to modify the moment functional with other moment functionals defined on lower-dimensional manifolds such as curves, surfaces, etc. A family of orthogonal polynomials with respect to such a type of Uvarov modification of the classical ball measure, by means of a mass uniformly distributed over the sphere, was introduced in Martínez and Piñar (2016), and the authors proved that, at least in the Legendre case, these multivariate orthogonal polynomials satisfy a fourth order partial differential equation, which constitutes a natural extension of Krall orthogonal polynomials to the multivariate case. Moreover, in Delgado et al. (2018), an inner product on the triangle defined by adding Krall terms over the border and the vertices of the triangle is studied. For particular values of the parameters, orthogonal polynomials associated with this inner product satisfy fourth order partial differential equations with polynomial coefficients, as an extension of the classical theory introduced in Krall (1940) and developed later in Krall (1981).
The aim of this paper is to study the three term relations for orthogonal polynomials in several variables associated with a moment linear functional obtained by a Uvarov modification of a given moment functional. In this way, we define a general framework for Uvarov perturbations of a multivariate moment functional, we establish conditions for the existence of the perturbed polynomials, and we analyse the impact of the perturbation on the coefficients of the three term relations. Moreover, we analyse two interesting examples. First, we compute explicitly the matrix coefficients of the three term relations for the monic polynomials in the case of the Jacobi measure on the simplex with mass points added on the vertices, completing the study of this case started in Delgado et al. (2010). In addition, we show the explicit expressions of the perturbed and non perturbed polynomials of low degree for a special choice of the parameters, and we plot both families of polynomials and their zeros, where the influence of the mass points can be seen. Finally, a non positive definite bivariate moment functional based on Bessel and Laguerre polynomials perturbed by a mass point at the origin is analysed. In this case we also compute the explicit expressions for the matrix coefficients of both families of bivariate orthogonal polynomials. The question about the spectral problem associated with the orthogonal polynomials corresponding to the Uvarov perturbation of the measure supported on the simplex presented in the first example remains open. In view of the results given in Delgado et al. (2018) and Martínez and Piñar (2016), we cannot expect these polynomials to be eigenfunctions of a fourth order partial differential equation, since the Uvarov term is supported on discrete points. Whether such Uvarov polynomials satisfy higher order linear partial differential equations is a topic that remains open.
The structure of the paper is as follows. In Sect. 2 we recall the basic background about orthogonal polynomials in several variables, including the necessary definitions and notations. In Sect. 3, Uvarov modifications of given linear functionals are studied in detail, providing results for the existence of orthogonal polynomials with respect to the new linear functional. In Sect. 4 we obtain the expressions of the three term relations for Uvarov orthogonal polynomials from the recurrence relations of the starting family. Finally, in Sect. 5 two examples are analyzed in detail, in order to show that the results are valid for positive definite linear functionals and also for non positive definite linear functionals. The first one deals with the Uvarov orthogonal polynomials on the simplex, obtained by adding mass points to the positive definite linear functional of the very classical bivariate Jacobi polynomials on the simplex. Explicit expressions for the matrices of the three term relations satisfied by the Uvarov modification are presented, and for specific values of the parameters we compare the zeros of the classical bivariate Jacobi polynomials and those of the Uvarov family. We must point out that the zeros of multivariate polynomials constitute a theory that remains open; only a few analytic results concerning zeros are known (see Dunkl and Xu 2014; Area et al. 2015). In general, the zero set of a multivariate polynomial is an algebraic curve, and different bases have different zeros. Finally, the second example is devoted to the Bessel-Laguerre case, which is a non positive definite moment functional. For this case we show that the presented approach is also valid, giving rise to the matrices of the three term relations for the Uvarov modification.

Orthogonal polynomials in several variables
Let N_0 be the set of nonnegative integers and let d ∈ N. For m = (m_1, ..., m_d) ∈ N_0^d and x = (x_1, ..., x_d) ∈ R^d, we write a monomial as x^m = x_1^{m_1} x_2^{m_2} · · · x_d^{m_d}. The number |m| = m_1 + · · · + m_d is called the total degree of x^m. Along this paper, we denote by Π the linear space of polynomials in d variables with real coefficients, by Π_n the linear space of real polynomials of total degree not greater than n, and we will denote by P_n the space of homogeneous polynomials of total degree n in d variables. It is well known (Dunkl and Xu 2014) that dim Π_n = (n+d)!/(n! d!) and dim P_n = r_n = (n+d−1)!/(n! (d−1)!). For h, k ≥ 1, let M_{h×k}(R) and M_{h×k}(Π) be the linear spaces of (h × k) matrices with real and polynomial entries, respectively, and I_h will represent the identity matrix of size h × h. In general, if h = k, we will omit one of the subscripts.
We define the degree of a polynomial matrix A = (a_{i,j}(x)) ∈ M_{h×k}(Π) as the maximum of the degrees of its entries, i.e., deg A = max{deg a_{i,j}(x) : 1 ≤ i ≤ h, 1 ≤ j ≤ k}. As usual, given a matrix M, we will denote by M^T its transpose and, if M is a square matrix, we will say that it is non-singular if det M ≠ 0.
Following (Dunkl and Xu 2014, p. 71), if M_1, M_2, ..., M_d are matrices of the same size p × q, then their joint matrix M is the (d p × q) matrix obtained by stacking them, M = (M_1^T, M_2^T, ..., M_d^T)^T. Given a sequence of polynomials of total degree n, {P_m^n}_{|m|=n}, we will use the vector notation (see Dunkl and Xu 2014) to define P_n as the column vector polynomial P_n(x) = (P_{m_1}^n(x), P_{m_2}^n(x), ..., P_{m_{r_n}}^n(x))^T, where m_1, m_2, ..., m_{r_n} are the elements in {m ∈ N_0^d : |m| = n} arranged according to the reverse lexicographical order. Observe that P_0 is a constant (r_0 = 1), and P_1 is a column vector of independent multivariate polynomials of degree 1 of dimension r_1 = d.
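As an illustration of this ordering and of the dimension count r_n, the following Python sketch enumerates the multi-indices of total degree n (using one common realization of the reverse lexicographical order; conventions differ slightly across references) and checks that their number equals the binomial coefficient:

```python
from math import comb

def multi_indices(n, d):
    """All m in N_0^d with |m| = n, ordered so that the power of the first
    variable decreases (one common realization of reverse lexicographic
    order; conventions differ across references)."""
    if d == 1:
        return [(n,)]
    return [(m1,) + rest
            for m1 in range(n, -1, -1)
            for rest in multi_indices(n - m1, d - 1)]

# r_n = dim(space of homogeneous polynomials of degree n) = C(n + d - 1, d - 1)
assert multi_indices(2, 2) == [(2, 0), (1, 1), (0, 2)]
for d in (1, 2, 3, 4):
    for n in range(7):
        assert len(multi_indices(n, d)) == comb(n + d - 1, d - 1)
```

For d = 2 this reproduces the familiar ordering x^n, x^{n−1}y, ..., y^n with r_n = n + 1 entries.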
Definition 1 A polynomial system (PS) is a vector polynomial sequence {P_n}_{n≥0} such that the set of the entries of {P_m}_{m=0}^n is a basis of Π_n for each n ≥ 0, and by extension we will say that {P_m}_{m=0}^n is a basis of Π_n.
The simplest case of polynomial system is the so-called canonical polynomial system {X_n}_{n≥0}, where X_n denotes the column vector of the monomials x^m with |m| = n arranged in reverse lexicographical order. Using the vector notation, for a given polynomial system {P_n}_{n≥0}, the vector polynomial P_n can be written as P_n(x) = G_{n,n} X_n + G_{n,n−1} X_{n−1} + · · · + G_{n,0} X_0, where G_n = G_{n,n} is called the leading coefficient of P_n, which is a square matrix of size r_n. Moreover, since {P_m}_{m=0}^n is a basis of Π_n, G_n is non-singular. We will say that two PS {P_n}_{n≥0} and {Q_n}_{n≥0} have the same leading coefficient if P_0 = Q_0, and P_n and Q_n have the same leading coefficient matrix for n ≥ 1, that is, if the entries of the vector P_n − Q_n are polynomials in Π_{n−1}.
In addition, a polynomial system is called monic if every polynomial contains only one monic term of highest degree, that is, for n ≥ 0, P_{m_k}^n(x) = x^{m_k} + R(x), where |m_k| = n and R(x) ∈ Π_{n−1}. Equivalently, a monic polynomial system is a polynomial system whose leading coefficient is the identity matrix, i.e., G_n = I_{r_n}, for n ≥ 0.
Usually, a moment linear functional u defined on Π is introduced from its moments μ_m = ⟨u, x^m⟩, m ∈ N_0^d (Dunkl and Xu 2014). The action of a moment linear functional u is extended over polynomial matrices entrywise (see, for instance, Dunkl and Xu 2014), that is, ⟨u, A⟩ = (⟨u, a_{i,j}⟩)_{i,j} for A = (a_{i,j}(x)) ∈ M_{h×k}(Π). Given a moment functional u on Π, two polynomials p and q are said to be orthogonal with respect to u if ⟨u, p q⟩ = 0. For each n ≥ 0, let V_n ⊂ Π_n be the set of polynomials of total degree n that are orthogonal with respect to the linear functional u to all polynomials in Π_{n−1}, together with zero. Then V_n is a linear space of dimension less than or equal to r_n (Dunkl and Xu 2014).
Definition 2 An orthogonal polynomial system (OPS) with respect to a given linear functional u is a PS {P_n}_{n≥0} such that
⟨u, P_n P_m^T⟩ = H_n Δ_{n,m}, (1)
where Δ_{n,m} is the r_n × r_m zero matrix for n ≠ m, the identity matrix for n = m, and H_n ∈ M_{r_n}(R) is a symmetric and non-singular matrix. Moreover, if H_n is a diagonal matrix, we will say that {P_n}_{n≥0} is a mutually orthogonal polynomial system with respect to the linear functional u.
A moment functional u is called quasi definite if there exists an OPS with respect to u. Given a quasi definite moment linear functional u, orthogonal polynomial systems with respect to u are not unique. In fact, {P_n}_{n≥0} and {P̃_n}_{n≥0} are OPS associated with u if and only if there exist non-singular matrices F_n such that P̃_n = F_n P_n, n ≥ 0.
In this case, for n, m ≥ 0 we get ⟨u, P̃_n P̃_m^T⟩ = F_n ⟨u, P_n P_m^T⟩ F_m^T = H̃_n Δ_{n,m}, where H̃_n = F_n H_n F_n^T is a non-singular and symmetric matrix. Therefore, there exists a unique monic orthogonal polynomial system, which can be obtained as P̂_n = G_n^{-1} P_n, n ≥ 0, where G_n is the respective leading coefficient of P_n.
Observe that u is quasi definite if and only if dim V n = r n , ∀n ≥ 0.
In this case, a PS {P_n}_{n≥0} is an OPS if and only if, for each n ≥ 0, the set of entries of the vector P_n is a basis of V_n. A positive definite moment functional is quasi definite, and in this case it is possible to construct orthogonal polynomial systems such that H_n is positive definite. If H_n is the identity matrix, then {P_m^n}_{|m|=n} is an orthonormal basis of V_n and the OPS is called an orthonormal polynomial system.
Orthogonal polynomials in several variables are characterized by d vector-matrix three term relations (see Dunkl and Xu 2014, Theorem 3.3.7, p. 74). More precisely,
Theorem 3 (Dunkl and Xu 2014) Let {P_n}_{n≥0} = {P_m^n(x) : |m| = n, n ∈ N_0}, P_0 = 1, be an arbitrary sequence in Π. Then the following statements are equivalent.

1. There exists a linear functional u which defines a quasi definite moment functional on Π
and which makes {P_n}_{n≥0} an orthogonal basis in Π.
2. For n ≥ 0 and 1 ≤ i ≤ d, there exist matrices A_{n,i}, B_{n,i} and C_{n,i} of respective sizes r_n × r_{n+1}, r_n × r_n and r_n × r_{n−1}, such that
(a) the polynomials P_n satisfy the three term relations
x_i P_n(x) = A_{n,i} P_{n+1}(x) + B_{n,i} P_n(x) + C_{n,i} P_{n−1}(x), 1 ≤ i ≤ d, (2)
with P_{−1} = 0 and C_{0,i} = 0, and
(b) for the joint matrix A_n of A_{n,1}, A_{n,2}, ..., A_{n,d}, of size d r_n × r_{n+1}, and the joint matrix C_{n+1}^T of C_{n+1,1}^T, C_{n+1,2}^T, ..., C_{n+1,d}^T, of size d r_n × r_{n+1}, we get
rank A_{n,i} = rank C_{n+1,i} = r_n, (3)
rank A_n = rank C_{n+1}^T = r_{n+1}. (4)
In that case, we get
C_{n+1,i} = H_{n+1} A_{n,i}^T H_n^{-1}, (5)
where H_n = ⟨u, P_n P_n^T⟩, so that A_{n,i} H_{n+1} = H_n C_{n+1,i}^T.
Relations (2) can be written in block matrix form (Kowalski 1982a, b; Dunkl and Xu 2014). In fact, for 1 ≤ i ≤ d, we define the block Jacobi matrix J_i as the block tridiagonal matrix with diagonal blocks B_{n,i}, superdiagonal blocks A_{n,i}, and subdiagonal blocks C_{n,i}, n ≥ 0. Observe that the entries of the block Jacobi matrices are the coefficients of the ith three term relation, and their sizes increase to infinity. If we denote by P = (P_0^T, P_1^T, P_2^T, ...)^T the column vector of all polynomials, then the three term relations (2) become x_i P = J_i P, 1 ≤ i ≤ d. The version of the above theorem for orthonormal polynomial systems {P_n}_{n≥0} is obtained by replacing C_{n,i} with A_{n−1,i}^T, 1 ≤ i ≤ d, since H_n = I_{r_n}, n ≥ 0.
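As a concrete sanity check of the three term relations (2) and the rank condition (3), the following Python sketch builds the coefficient matrices for the variable x_1 in the case of bivariate product Chebyshev polynomials (an illustrative choice, not an example from the text; the basis ordering T_{n−j}(x)T_j(y), j = 0, ..., n, is an assumption):

```python
import numpy as np
from numpy.polynomial.chebyshev import Chebyshev

def T(k):
    return Chebyshev.basis(k)

def P_vec(n, x, y):
    # P_n ordered as (T_n(x)T_0(y), T_{n-1}(x)T_1(y), ..., T_0(x)T_n(y))
    return np.array([T(n - j)(x) * T(j)(y) for j in range(n + 1)])

# From x*T_m = (T_{m+1} + T_{m-1})/2 for m >= 1 and x*T_0 = T_1, we get
# x * P_n = A_{n,1} P_{n+1} + C_{n,1} P_{n-1}   (B_{n,1} = 0 here).
def A1(n):
    A = np.zeros((n + 1, n + 2))                 # size r_n x r_{n+1}
    for j in range(n + 1):
        A[j, j] = 0.5 if n - j >= 1 else 1.0
    return A

def C1(n):
    C = np.zeros((n + 1, n))                     # size r_n x r_{n-1}
    for j in range(n):
        C[j, j] = 0.5
    return C

rng = np.random.default_rng(0)
x, y = rng.uniform(-1, 1, 2)
for n in range(1, 6):
    lhs = x * P_vec(n, x, y)
    rhs = A1(n) @ P_vec(n + 1, x, y) + C1(n) @ P_vec(n - 1, x, y)
    assert np.allclose(lhs, rhs)                 # three term relation (2)
    assert np.linalg.matrix_rank(A1(n)) == n + 1  # rank condition (3): r_n
```

The second variable x_2 is handled analogously, with the roles of the two Chebyshev indices exchanged.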
If {P̃_n}_{n≥0} is another OPS associated with u, then there exist non-singular matrices F_n such that P̃_n = F_n P_n, n ≥ 0. Multiplying (2) by F_n, we deduce that {P̃_n}_{n≥0} satisfies the three term relations x_i P̃_n(x) = Ã_{n,i} P̃_{n+1}(x) + B̃_{n,i} P̃_n(x) + C̃_{n,i} P̃_{n−1}(x), where Ã_{n,i} = F_n A_{n,i} F_{n+1}^{-1}, B̃_{n,i} = F_n B_{n,i} F_n^{-1}, and C̃_{n,i} = F_n C_{n,i} F_{n−1}^{-1}. Obviously, the rank conditions (3) and (4) are preserved, since the rank is unchanged upon left or right multiplication by a non-singular matrix (Horn and Johnson 2013, p. 13).
When the orthogonal polynomial system {P_n}_{n≥0} is monic, comparing the highest coefficient matrices on both sides of (2), it follows that A_{n,i} = L_{n,i}, for n ≥ 0 and 1 ≤ i ≤ d (Dunkl and Xu 2014), where L_{n,i} are the matrices of size r_n × r_{n+1} defined by x_i X_n = L_{n,i} X_{n+1}. These matrices verify L_{n,i} L_{n,i}^T = I_{r_n} and rank L_{n,i} = r_n. Moreover, the rank of the joint matrix L_n of the L_{n,i} is r_{n+1} (Dunkl and Xu 2014, p. 71).
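In the bivariate case these matrices can be realized and checked numerically. A short Python sketch (assuming the monomial ordering X_n = (x^n, x^{n−1}y, ..., y^n)^T) verifies the defining relation x_i X_n = L_{n,i} X_{n+1} and the rank properties:

```python
import numpy as np

# For d = 2 with X_n = (x^n, x^{n-1}y, ..., y^n)^T:
def L1(n):  # (n+1) x (n+2): identity padded with a zero column on the right
    return np.hstack([np.eye(n + 1), np.zeros((n + 1, 1))])

def L2(n):  # (n+1) x (n+2): zero column on the left
    return np.hstack([np.zeros((n + 1, 1)), np.eye(n + 1)])

def X(n, x, y):
    return np.array([x ** (n - j) * y ** j for j in range(n + 1)])

x, y = 0.3, -1.7
for n in range(6):
    assert np.allclose(x * X(n, x, y), L1(n) @ X(n + 1, x, y))   # x X_n = L_{n,1} X_{n+1}
    assert np.allclose(y * X(n, x, y), L2(n) @ X(n + 1, x, y))   # y X_n = L_{n,2} X_{n+1}
    assert np.allclose(L1(n) @ L1(n).T, np.eye(n + 1))           # L_{n,i} L_{n,i}^T = I_{r_n}
    joint = np.vstack([L1(n), L2(n)])
    assert np.linalg.matrix_rank(joint) == n + 2                 # rank L_n = r_{n+1}
```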
For the particular case d = 2, the L_{n,i}, i = 1, 2, are the (n + 1) × (n + 2) matrices defined as L_{n,1} = (I_{n+1} | 0) and L_{n,2} = (0 | I_{n+1}). In the general case, comparing the leading coefficient matrices on both sides of (2), we get
A_{n,i} = G_n L_{n,i} G_{n+1}^{-1}, (9)
where G_n is the leading coefficient matrix of P_n. Let u be a quasi definite moment linear functional, and let {P_n}_{n≥0} be an OPS with respect to u. In terms of {P_n}_{n≥0}, the kernel of V_m, denoted by P_m(u; x, y) (Dunkl and Xu 2014, p. 97), is defined by P_m(u; x, y) = P_m(y)^T H_m^{-1} P_m(x). Similarly, the kernel of Π_n takes the form K_n(u; x, y) = Σ_{m=0}^{n} P_m(u; x, y). The definition of both kernels does not depend on a particular basis. For orthogonal polynomials in one variable, the kernel function is called the reproducing kernel function. In several variables, we have an analogous reproducing property (Dunkl and Xu 2014), that is, ⟨u, K_n(u; x, ·) p⟩ = p(x) for every p ∈ Π_n.

Uvarov orthogonal polynomials in several variables
From now on, we consider a quasi definite moment functional u defined on Π. Then orthogonal polynomials in several variables with respect to u exist, and we denote by {P_n}_{n≥0} an OPS associated with it. Let N ≥ 1 be a positive integer and let ξ^{(1)}, ξ^{(2)}, ..., ξ^{(N)} be distinct points in R^d. Every point has d entries, and we will write ξ^{(j)} = (ξ_1^{(j)}, ..., ξ_d^{(j)}). Let Λ be a symmetric matrix of size N × N. We define the new moment functional v as the Uvarov modification of the original moment functional given by
⟨v, p q⟩ = ⟨u, p q⟩ + p(ξ)^T Λ q(ξ), (10)
for p, q ∈ Π, where p(ξ) = (p(ξ^{(1)}), p(ξ^{(2)}), ..., p(ξ^{(N)}))^T denotes the vector of evaluations of the polynomial p(x) at the points ξ^{(1)}, ξ^{(2)}, ..., ξ^{(N)}. We want to know how the new inner product (10) acts on polynomial systems. Given a PS {P_n}_{n≥0}, if we denote by
P_n(ξ) = (P_n(ξ^{(1)}) | P_n(ξ^{(2)}) | · · · | P_n(ξ^{(N)})) (11)
the r_n × N matrix that has P_n(ξ^{(j)}) as columns, then the action of (10) is as follows:
⟨v, P_n P_m^T⟩ = ⟨u, P_n P_m^T⟩ + P_n(ξ) Λ P_m(ξ)^T.
If v is quasi definite, we denote by {Q_n}_{n≥0} an orthogonal polynomial system associated with it, such that P_n and Q_n have the same leading coefficient, for all n ≥ 0. Following Delgado et al. (2010), if u is given by means of a measure dμ(x) on R^d with all finite moments, dμ is positive definite in the sense that ⟨u, p^2⟩ = ∫ p(x)^2 dμ(x) > 0 for every nonzero polynomial p, and the matrix Λ is positive definite, then v is positive definite and an OPS with respect to v exists.
Our first goal is to study the existence of orthogonal polynomials with respect to the moment functional v defined in (10).
We need to introduce some extra notation. If {P_n}_{n≥0} is an OPS with respect to u, then we denote by
K_{n−1} = (K_{n−1}(u; ξ^{(j)}, ξ^{(k)}))_{j,k=1}^{N} (12)
the N × N matrix of constants whose entries are K_{n−1}(u; ξ^{(j)}, ξ^{(k)}), and by
K_{n−1}(ξ, x) = (K_{n−1}(u; ξ^{(1)}, x), ..., K_{n−1}(u; ξ^{(N)}, x))^T (13)
the N × 1 vector of polynomials with these kernels as entries. In (12) and (13), for n = 0 we assume K_{−1}(u; x, y) = 0. From the fact that K_n(u; x, y) − K_{n−1}(u; x, y) = P_n(u; x, y), we immediately have the relations
K_n − K_{n−1} = P_n(ξ)^T H_n^{-1} P_n(ξ), K_n(ξ, x) − K_{n−1}(ξ, x) = P_n(ξ)^T H_n^{-1} P_n(x), (14)
which will be used below.
In Fernández et al. (2010), a necessary and sufficient condition for v to be quasi definite when N = 1 is given. In addition, orthogonal polynomials with respect to v can be expressed in terms of those with respect to the linear functional u. That result can be extended to N ≥ 1 by using a technique similar to that in Delgado et al. (2010), but in this case we focus our attention on the analysis of the quasi definite character of the perturbed linear functional.

Theorem 4 The moment linear functional v is quasi definite if and only if the N × N matrix
I_N + Λ K_{n−1}
is non-singular for n ≥ 0. In this case, if {P_n}_{n≥0} denotes an OPS with respect to the linear functional u, the system of polynomials {Q_n}_{n≥0} defined by
Q_n(x) = P_n(x) − P_n(ξ) (I_N + Λ K_{n−1})^{-1} Λ K_{n−1}(ξ, x), (15)
taking K_{−1}(·, ·) = 0, is an OPS with respect to the linear functional v. Moreover,
H̃_n := ⟨v, Q_n Q_n^T⟩ = H_n + P_n(ξ) (I_N + Λ K_{n−1})^{-1} Λ P_n(ξ)^T.
In the rest of the section, we will suppose that v is quasi definite, and we denote by {Q_n}_{n≥0} the OPS with respect to v defined by (15). Then H̃_n is an r_n × r_n symmetric non-singular matrix. It turns out that both H̃_n and H̃_n^{-1} can be expressed in terms of matrices that only involve {P_m}_{m≥0}. In Delgado et al. (2010), the simplest case was considered, when u is positive definite and {P_n}_{n≥0} is orthonormal, that is, H_n = I_{r_n}.
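A minimal numerical illustration of the construction, in the simplest univariate setting d = 1, N = 1, with orthonormal Legendre polynomials on [−1, 1] and an illustrative mass λ = 0.7 at ξ = 1 (these values are assumptions, not taken from the text). The sketch builds Q_n through the univariate analogue of the kernel formula (15) and checks orthogonality with respect to the perturbed functional:

```python
import numpy as np
from numpy.polynomial.legendre import Legendre, leggauss

def p(n):
    """Orthonormal Legendre polynomial on [-1, 1]."""
    return Legendre.basis(n) * np.sqrt((2 * n + 1) / 2)

lam, xi = 0.7, 1.0                      # illustrative mass lam at the point xi
nodes, weights = leggauss(40)           # exact for the degrees used below

def u_bil(f, g):                        # original functional u
    return np.sum(weights * f(nodes) * g(nodes))

def v_bil(f, g):                        # Uvarov modification, cf. (10) with d = N = 1
    return u_bil(f, g) + lam * f(xi) * g(xi)

def Q(n):
    """Uvarov polynomial via the univariate analogue of formula (15)."""
    if n == 0:
        return p(0)
    Kxx = sum(p(k)(xi) ** 2 for k in range(n))        # K_{n-1}(xi, xi)
    Kpoly = sum(p(k)(xi) * p(k) for k in range(n))    # K_{n-1}(xi, .)
    return p(n) - (lam * p(n)(xi) / (1 + lam * Kxx)) * Kpoly

for n in range(1, 8):
    for m in range(n):
        assert abs(v_bil(Q(n), Q(m))) < 1e-9          # Q_n orthogonal w.r.t. v
    assert abs(v_bil(p(n), p(n - 1))) > 1e-3          # p_n are NOT v-orthogonal
```

Note that the scalar 1 + λ K_{n−1}(ξ, ξ) appearing in the denominator is the univariate instance of the matrix I_N + Λ K_{n−1} whose non-singularity Theorem 4 requires.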
Proposition 5 For n ≥ 0, let
Λ_n = (I_N + Λ K_n)^{-1} Λ. (16)
Then
H̃_n^{-1} = H_n^{-1} − H_n^{-1} P_n(ξ) Λ_n P_n(ξ)^T H_n^{-1}.
Proof In order to study the inverse, since H̃_n = H_n + P_n(ξ) Λ_{n−1} P_n(ξ)^T, we calculate
H̃_n [H_n^{-1} − H_n^{-1} P_n(ξ) Λ_n P_n(ξ)^T H_n^{-1}] = I_{r_n} − P_n(ξ) Λ_n P_n(ξ)^T H_n^{-1} + P_n(ξ) Λ_{n−1} P_n(ξ)^T H_n^{-1} − P_n(ξ) Λ_{n−1} P_n(ξ)^T H_n^{-1} P_n(ξ) Λ_n P_n(ξ)^T H_n^{-1}.
We study this last term. Observe that, from (14), we get
Λ P_n(ξ)^T H_n^{-1} P_n(ξ) = Λ (K_n − K_{n−1}) = (I_N + Λ K_n) − (I_N + Λ K_{n−1}),
and then
Λ_{n−1} P_n(ξ)^T H_n^{-1} P_n(ξ) Λ_n = (I_N + Λ K_{n−1})^{-1} [(I_N + Λ K_n) − (I_N + Λ K_{n−1})] Λ_n = Λ_{n−1} − Λ_n.
Substituting above, we get H̃_n [H_n^{-1} − H_n^{-1} P_n(ξ) Λ_n P_n(ξ)^T H_n^{-1}] = I_{r_n}, and the result holds.
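The matrix manipulation in this proof is, in essence, an instance of the Sherman-Morrison-Woodbury identity. A generic numerical check (random matrices standing in for H_n, P_n(ξ) and Λ; the sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
r, N = 6, 3                                  # generic sizes, not the r_n of the text
H = rng.normal(size=(r, r))
H = H @ H.T + r * np.eye(r)                  # symmetric positive definite "H_n"
P = rng.normal(size=(r, N))                  # stands in for P_n(xi)
L = rng.normal(size=(N, N))
Lam = L @ L.T + np.eye(N)                    # symmetric positive definite "Lambda"

inv = np.linalg.inv
Ht = H + P @ Lam @ P.T                       # perturbed matrix "H~_n"
# Woodbury:
# (H + P Lam P^T)^{-1} = H^{-1} - H^{-1} P (Lam^{-1} + P^T H^{-1} P)^{-1} P^T H^{-1}
lhs = inv(Ht)
rhs = inv(H) - inv(H) @ P @ inv(inv(Lam) + P.T @ inv(H) @ P) @ P.T @ inv(H)
assert np.allclose(lhs, rhs)
```

In the proposition, the middle factor (Λ^{-1} + K_n)^{-1} collapses to Λ_n precisely because P_n(ξ)^T H_n^{-1} P_n(ξ) = K_n − K_{n−1} by (14).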
The next result gives explicit formulas for the reproducing kernels associated with v, which we denote by K_n(v; x, y).
Theorem 6 For m ≥ 0, the matrix Λ_m defined in (16) is symmetric, and
K_n(v; x, y) = K_n(u; x, y) − K_n(ξ, y)^T Λ_n K_n(ξ, x),
where we assume K_{−1}(x, y) ≡ 0. Furthermore, for n ≥ 0, the reproducing property ⟨v, K_n(v; x, ·) p⟩ = p(x) holds for every p ∈ Π_n.

Three term relations for Uvarov orthogonal polynomials
In this section, let {P_n}_{n≥0} be an OPS with respect to u, and let {Q_n}_{n≥0} be an OPS with respect to v such that they have the same leading coefficient. Both OPS satisfy three term relations, denoted by
x_i P_n(x) = A_{n,i} P_{n+1}(x) + B_{n,i} P_n(x) + C_{n,i} P_{n−1}(x), (17)
where P_{−1} = 0 and P_0 = G_0, and by
x_i Q_n(x) = Ã_{n,i} Q_{n+1}(x) + B̃_{n,i} Q_n(x) + C̃_{n,i} Q_{n−1}(x), (18)
with Q_{−1} = 0 and Q_0 = G_0. The coefficients B_{n,i} and B̃_{n,i}, for n ≥ 0, are r_n × r_n matrices, and A_{n,i}, Ã_{n,i} and C_{n,i}, C̃_{n,i} are, respectively, r_n × r_{n+1} and r_n × r_{n−1} coefficient matrices related to H_n and H̃_n as in (5) and satisfying the respective rank conditions (3) and (4).
Theorem 7 The matrices Ã_{n,i}, B̃_{n,i} and C̃_{n,i} of the three term relations (18) for the vector polynomials {Q_n}_{n≥0} orthogonal with respect to the linear functional v defined in (10) are given by (19) in terms of the matrices A_{n,i}, B_{n,i} and C_{n,i} of the three term relations for the polynomials {P_n}_{n≥0} orthogonal with respect to the linear functional u, where P_n(ξ) is defined in (11), the matrices H_n are defined in (1), and Λ_n is defined in (16).
Proof Since both OPS have the same leading coefficient, by (9) we get Ã_{n,i} = A_{n,i}, for 1 ≤ i ≤ d and n ≥ 0.
We will compute B̃_{n,i} from the explicit expressions of the vector polynomials in terms of the canonical basis. In this way, we know that P_0(x) = Q_0(x) and, for n ≥ 1, we can express
P_n(x) = G_n X_n + G_{n−1}^n X_{n−1} + · · · + G_0^n X_0, Q_n(x) = G̃_n X_n + G̃_{n−1}^n X_{n−1} + · · · + G̃_0^n X_0,
where G_n = G̃_n are non-singular matrices of size r_n, and G_m^n and G̃_m^n, for m = 0, 1, ..., n − 1, are constant matrices of dimension r_n × r_m. Relating the coefficients of the term X_n in (18), and, analogously, for the first family using (17) and defining L_{−1,i} = 0, we obtain expressions for B̃_{n,i} and B_{n,i}. Next, we deduce the matrix coefficient of X_n in expression (15) written for n + 1. First, we observe that, for 1 ≤ j ≤ N and n ≥ 0, the coefficient of X_n in K_n(u; ξ^{(j)}, x) is given by P_n(ξ^{(j)})^T H_n^{-1} G_n, and therefore the vector of kernels K_n(ξ, x) can be written as P_n(ξ)^T H_n^{-1} G_n X_n plus terms of lower degree.
Since G_n = G̃_n is a non-singular matrix, we get the announced expression. Now, we compute C̃_{n+1,i}, n ≥ 0. From (5), we know that C̃_{n+1,i} = H̃_{n+1} Ã_{n,i}^T H̃_n^{-1} = H̃_{n+1} A_{n,i}^T H̃_n^{-1}. Using Proposition 5, we get the announced expression, which completes the proof.
We would like to point out that, in the monic case, the result simplifies by substituting Ã_{n,i} = A_{n,i} = L_{n,i}.

Remark 1 We can give a block matrix perspective of expressions (19). Let J̃_i, 1 ≤ i ≤ d, be the block Jacobi matrices associated with the three term relations for the Uvarov polynomials. We also define the block matrices collecting the perturbation terms, where the omitted elements are zero matrices of adequate size. Then, the block Jacobi matrix associated with the Uvarov orthogonal polynomials is a perturbation of the original block Jacobi matrix J_i.

Examples
In this section, we analyse two examples in the bivariate case in order to apply our results. In the first example, we study Uvarov polynomials on the bivariate simplex with mass points at the vertices. We transform the standard basis of the polynomials on the simplex to the monomial basis, and compute explicitly all the matrices that we need. We compare some polynomials of low total degree by showing their explicit expressions, both for the classical case and for the Uvarov modification, for some specific values of the parameters. The second example is devoted to a non positive definite case based on Bessel and Laguerre polynomials.
In both cases, we compute explicitly the involved coefficients. For simplicity in this bivariate case, we will denote x = x 1 , y = x 2 .
Let us compute the inverse of H_n^{(α,β,γ)}. For standard polynomials, we just need to compute the inverse of the diagonal matrix H_n^{(α,β,γ)} of size (n + 1) × (n + 1) with entries given in (25). By using (32), its inverse is obtained in terms of the quantities h_{n,k−1}^{(α,β,γ)} and u_{k,i,n}^{(α,β,γ)} defined in (25) and (33), respectively. Now, we introduce the Uvarov moment functional on the simplex. Let us denote by u the linear functional associated with the bivariate orthogonal polynomials on the simplex, and let the matrix Λ in (10) be the diagonal matrix Λ = diag(M_1, M_2, M_3), where M_1, M_2, M_3 are positive real numbers. Hence, the Uvarov linear functional v is defined by (10) with the mass points located at the vertices of the simplex. In this case, the matrices K_n defined in (12) are explicitly given in terms of u = (1, 1, 1)^T and A_n = diag(a_{n,1}, a_{n,2}, a_{n,3}), with a_{n,1} = (α + β + γ + n + 3) … Since Λ is positive definite, the inverse of the matrix I_3 + Λ K_n can be computed explicitly. Let us denote z_{n,i} = M_i^{-1} + a_{n,i} − b_n. Using the Sherman-Morrison-Woodbury identity, it follows (see Golub and Van Loan 1996) that (I_3 + Λ K_n)^{-1} admits a closed form. The matrix P_n(ξ) defined in (11) is explicitly given by the evaluations of the simplex polynomials at the vertices. Then, we can express the monic Uvarov polynomials on the simplex by using the explicit expression (15). Applying Theorem 7, the coefficients of the three term relations for Uvarov polynomials are given by (19), where all the involved matrices have already been explicitly computed.
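The inversion step above uses the Sherman-Morrison formula for a diagonal matrix plus a rank-one term, mirroring the structure of I_3 + Λ K_n with the rank-one direction u = (1, 1, 1)^T. A generic numerical sketch (the diagonal entries and the scalar are illustrative values, not the a_{n,i}, b_n of the text):

```python
import numpy as np

# Sherman-Morrison:
# (D + s u u^T)^{-1} = D^{-1} - s D^{-1} u u^T D^{-1} / (1 + s u^T D^{-1} u)
D = np.diag([2.0, 3.0, 5.0])                 # illustrative diagonal part
u = np.ones(3)                               # rank-one direction, as in the text
s = -0.4                                     # illustrative scalar

lhs = np.linalg.inv(D + s * np.outer(u, u))
Dinv = np.diag(1 / np.diag(D))
rhs = Dinv - s * (Dinv @ np.outer(u, u) @ Dinv) / (1 + s * u @ Dinv @ u)
assert np.allclose(lhs, rhs)
```

The formula is valid whenever the scalar denominator 1 + s u^T D^{-1} u does not vanish, which is the rank-one analogue of the non-singularity condition of Theorem 4.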
Finally, we analyse a particular example. Consider α = β = 1, γ = 1/2, and M_1 = M_2 = M_3 = 1/2. The Uvarov inner product is given by (10) with Λ = diag(1/2, 1/2, 1/2). Table 1 shows the explicit expressions of the first classical monic polynomials on the simplex and of the Uvarov monic polynomials perturbed as above. We have plotted the zeros (as algebraic curves, Area et al. 2015) of the polynomials of degree 2 up to degree 5 of both families, as well as the corresponding surfaces, in Figs. 1, 2, 3, 4, 5. We must point out that the zeros of multivariate polynomials constitute a theory that remains open; only a few analytic results concerning zeros are known (see Dunkl and Xu 2014; Area et al. 2015). In general, the zero set of a multivariate polynomial is an algebraic curve, and different bases have different zeros. Here, we wanted to show the impact of the Uvarov modification on the orthogonal polynomials as well as on their zeros. Fig. 2 shows the zeros of the monic polynomials of degree 3, P_{3,0}, P_{3,1}, P_{3,2}, and P_{3,3}: in the first row, the classical polynomials on the simplex, and in the second row, the Uvarov modification, for x ∈ [0, 1] and y ∈ [0, x].
Both families of orthogonal polynomials (35) and (38) (with respect to the same weight function on the same domain, and solutions of the same partial differential equation) are related through a triangular connection with coefficients u_{i,j,n}^{(g,γ)}, where u_{i,j,n}^{(g,γ)} = 0 if j > i. The orthogonality relation for the monic polynomials involves H_n^{(g,γ)}, a non-singular matrix of size (n + 1) × (n + 1), which can be computed by using (41). The monic Bessel-Laguerre polynomials {P̂_n^{(g,γ)}}_{n≥0} satisfy three term relations whose matrix coefficients are given explicitly, with entries indexed by 0 ≤ i ≤ n − 1.
Hence, the Uvarov linear functional v is defined by ⟨v, f⟩ = ⟨w, f⟩ + M f(0, 0).