Structural formulas for matrix-valued orthogonal polynomials related to $2\times 2$ hypergeometric operators

We give some structural formulas for the family of matrix-valued orthogonal polynomials of size $2\times 2$ introduced by C. Calder\'on et al. in an earlier work, which are common eigenfunctions of a differential operator of hypergeometric type. Specifically, we give a Rodrigues formula that allows us to write this family of polynomials explicitly in terms of the classical Jacobi polynomials, and write, for the sequence of orthonormal polynomials, the three-term recurrence relation and the Christoffel-Darboux identity. We obtain a Pearson equation, which enables us to prove that the sequence of derivatives of the orthogonal polynomials is also orthogonal, and to compute a Rodrigues formula for these polynomials as well as a matrix-valued differential operator having these polynomials as eigenfunctions. We also describe the second-order differential operators of the algebra associated with the weight matrix.


Introduction
In the last few years, the search for examples of matrix-valued orthogonal polynomials that are common eigenfunctions of a second-order differential operator, that is to say, polynomials satisfying a bispectral property in the sense of [13], has received a lot of attention after the seminal work of A. Durán in [15].
The theory of matrix-valued orthogonal polynomials was initiated by Krein in 1949 [37], [38] (see also [1] and [2]), in connection with spectral analysis and moment problems. Nevertheless, the first examples of orthogonal matrix polynomials satisfying this extra property and not reducible to the scalar case appeared more recently in [27,28,29,25] and [19]. The collection of examples has been growing lately (see for instance [16,17,21,22,26,41,40,3,35,36,42,34,4]). Moreover, the problem of giving a general classification of these families of matrix-valued orthogonal polynomials as solutions of the so-called Matrix Bochner Problem has also been addressed recently in [8], and in [7] for the special case of $2 \times 2$ hypergeometric matrix differential operators.
As in the case of classical orthogonal polynomials, families of matrix-valued orthogonal polynomials satisfy many formal properties, such as structural formulas (see for instance [20,24,18,3,34]), which have been very useful for computing explicitly the orthogonal polynomials related to several of these families. Having these explicit formulas is essential when one looks for applications of these matrix-valued bispectral polynomials, such as the problem of time and band limiting over a non-commutative ring and matrix-valued commuting operators; see [30,10,31,11,12,32].
Recently, in [4], a new family of matrix-valued orthogonal polynomials of size $2 \times 2$ was introduced, whose members are common eigenfunctions of a differential operator of hypergeometric type (in the sense defined by Juan A. Tirao in [44]). In particular, the polynomials $(P_n^{(\alpha,\beta,v)})_{n \geq 0}$ introduced in [4], orthogonal with respect to the weight matrix $W^{(\alpha,\beta,v)}$ given in (2.4) and (2.5), are common eigenfunctions of a hypergeometric operator with matrix eigenvalues $\Lambda_n$, which are diagonal matrices with no repeated entries. This fact could be especially useful if one intends to use this family of polynomials in the context of time and band limiting, where the commutativity of the matrix-valued eigenvalues $(\Lambda_n)_n$ could play an important role.
In this paper we give some structural formulas for the family of matrix-valued orthogonal polynomials introduced in [4]. In particular, in Section 3 we give a Rodrigues formula (see Theorem 3.1), which allows us to write this family of polynomials explicitly in terms of the classical Jacobi polynomials (see Corollary 3.3).
In Section 4, this Rodrigues formula allows us to compute the norms of the sequence of monic orthogonal polynomials and therefore, we can find the coefficients of the three-term recurrence relation and the Christoffel-Darboux identity for the sequence of orthonormal polynomials.
In Section 5, we obtain a Pearson equation (see Theorem 5.4), which allows us to prove that the sequence of $k$-th order derivatives, $k \geq 1$, of the orthogonal polynomials is also orthogonal, with respect to the weight matrix given explicitly in Proposition 5.3.
In Section 6, following the ideas in [34, Section 5.1], we use the Pearson equation to give explicit lowering and raising operators for the sequence of derivatives. Thus, we deduce a Rodrigues formula for these polynomials and find a matrix-valued differential operator that has these matrix-valued polynomials as common eigenfunctions.
Finally, in Section 7, we describe the algebra of second-order differential operators associated with the weight matrix $W^{(\alpha,\beta,v)}$ given in (2.4) and (2.5). Indeed, for a given weight matrix $W$, the analysis of the algebra $D(W)$ of all differential operators that have a sequence of matrix-valued orthogonal polynomials with respect to $W$ as eigenfunctions has received much attention in the literature over the last fifteen years [9,33,45,42,47,6,8]. While for classical orthogonal polynomials the structure of this algebra is very well known (see [39]), in the matrix setting, where this algebra is non-commutative, the situation is highly non-trivial.

Preliminaries
In this section we give some background on matrix-valued orthogonal polynomials (see [23] for further details). A weight matrix $W$ is a complex $N \times N$ matrix-valued integrable function on the interval $(a,b)$, such that $W(t)$ is positive definite almost everywhere and has finite moments of all orders, i.e., $\int_a^b t^n W(t)\,dt \in \mathbb{C}^{N \times N}$, $n \in \mathbb{N}$. The weight matrix $W$ induces the Hermitian sesquilinear form $\langle P, Q \rangle_W = \int_a^b P(t)\,W(t)\,Q^*(t)\,dt$ for any pair of $N \times N$ matrix-valued functions $P(t)$ and $Q(t)$, where $Q^*(t)$ denotes the conjugate transpose of $Q(t)$.
A sequence $(P_n)_{n \geq 0}$ of orthogonal polynomials with respect to a weight matrix $W$ is a sequence of matrix-valued polynomials such that $P_n(t)$, $n \geq 0$, is a matrix polynomial of degree $n$ with non-singular leading coefficient and $\langle P_n, P_m \rangle_W = \Delta_n \delta_{n,m}$, where $\Delta_n$, $n \geq 0$, is a positive definite matrix. When $\Delta_n = I$, where $I$ denotes the identity matrix, we say that the polynomials $(P_n)_{n \geq 0}$ are orthonormal. In particular, when the leading coefficient of $P_n(t)$, $n \geq 0$, is the identity matrix, we say that the polynomials $(P_n)_{n \geq 0}$ are monic.
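For $N = 1$ the matrix inner product above reduces to the familiar scalar weighted integral. As a quick illustration (a scalar sanity check, not part of the paper's matrix setting), the following sketch numerically verifies the orthogonality of the shifted Jacobi polynomials on $(0,1)$ with respect to the scalar weight $w(t) = t^{\alpha}(1-t)^{\beta}$, for the sample values $\alpha = 1$, $\beta = 2$.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import eval_jacobi

alpha, beta = 1.0, 2.0   # sample parameters with alpha, beta > -1

def inner(n, m):
    # <p_n, p_m>_w = int_0^1 p_n(t) w(t) p_m(t) dt, with w(t) = t^alpha (1-t)^beta
    # and p_n(t) = P_n^{(beta, alpha)}(2t - 1), the Jacobi polynomial shifted to (0, 1)
    f = lambda t: (eval_jacobi(n, beta, alpha, 2*t - 1)
                   * t**alpha * (1 - t)**beta
                   * eval_jacobi(m, beta, alpha, 2*t - 1))
    val, _ = quad(f, 0.0, 1.0)
    return val

gram = np.array([[inner(n, m) for m in range(4)] for n in range(4)])
off_diag = gram - np.diag(np.diag(gram))
print(np.max(np.abs(off_diag)))  # essentially zero
```

The Gram matrix comes out diagonal with positive diagonal entries, which is exactly the scalar case of the orthogonality relation $\langle P_n, P_m \rangle_W = \Delta_n \delta_{n,m}$.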
Given a weight matrix $W$, there exists a unique sequence of monic orthogonal polynomials $(P_n)_{n \geq 0}$ in $\mathbb{C}^{N \times N}[t]$; any other sequence of orthogonal polynomials $(Q_n)_{n \geq 0}$ can be written as $Q_n(t) = K_n P_n(t)$ for some non-singular matrices $K_n$.
Any sequence of monic orthogonal matrix-valued polynomials $(P_n)_{n \geq 0}$ satisfies a three-term recurrence relation $t P_n(t) = P_{n+1}(t) + B_n P_n(t) + A_n P_{n-1}(t)$, $n \geq 0$, where $P_{-1}(t) = 0$ and $P_0(t) = I$. The $N \times N$ matrix coefficients $A_n$ and $B_n$ enjoy certain properties; in particular, $A_n$ is non-singular for every $n$. Two weights $W$ and $\widetilde{W}$ are said to be equivalent if there exists a non-singular matrix $M$, which does not depend on $t$, such that $\widetilde{W}(t) = M W(t) M^*$ for all $t \in (a,b)$.
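A scalar sketch of this recurrence (assuming $N = 1$ and the sample weight $w(t) = t^{\alpha}(1-t)^{\beta}$ on $(0,1)$ with $\alpha = 1$, $\beta = 2$): build the monic orthogonal polynomials by Gram-Schmidt, then check that $t P_n(t) - P_{n+1}(t) - B_n P_n(t) - A_n P_{n-1}(t)$ vanishes, where $B_n$ and $A_n$ are computed as the appropriate inner-product ratios.

```python
import numpy as np
from numpy.polynomial import Polynomial as Poly
from scipy.integrate import quad

alpha, beta = 1.0, 2.0
w = lambda t: t**alpha * (1 - t)**beta

def ip(p, q):
    # scalar inner product <p, q>_w on (0, 1)
    val, _ = quad(lambda t: p(t) * w(t) * q(t), 0.0, 1.0)
    return val

# monic orthogonal polynomials via Gram-Schmidt applied to 1, t, t^2, ...
ps = []
for deg in range(6):
    p = Poly([0.0]*deg + [1.0])              # the monomial t^deg
    for q in ps:
        p = p - (ip(p, q) / ip(q, q)) * q    # subtracting lower degrees keeps p monic
    ps.append(p)

n = 3
t_pn = Poly([0.0, 1.0]) * ps[n]              # t * P_n
B_n = ip(t_pn, ps[n]) / ip(ps[n], ps[n])
A_n = ip(t_pn, ps[n - 1]) / ip(ps[n - 1], ps[n - 1])
resid = t_pn - ps[n + 1] - B_n*ps[n] - A_n*ps[n - 1]
print(np.max(np.abs(resid.coef)))  # essentially zero
```

Since $t P_n$ is monic of degree $n+1$, orthogonality forces every component of $t P_n - P_{n+1}$ of degree below $n-1$ to vanish, which is why only the two coefficients $B_n$ and $A_n$ appear.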
A weight matrix $W$ reduces to a smaller size if there exists a non-singular matrix $M$ such that $M W(t) M^* = \mathrm{diag}(W_1(t), W_2(t))$, where $W_1$ and $W_2$ are weights of smaller size. A weight matrix $W$ is said to be irreducible if it does not reduce to a smaller size (see [19], [46]).

Let $D$ be a right-hand side ordinary differential operator with matrix-valued polynomial coefficients, $D = \sum_{i=0}^s \partial^i F_i(t)$, $\partial = \frac{d}{dt}$. The operator $D$ acts on a polynomial function $P(t)$ as $PD = \sum_{i=0}^s \partial^i(P) F_i(t)$. We say that the differential operator $D$ is symmetric with respect to $W$ if $\langle PD, Q \rangle_W = \langle P, QD \rangle_W$ for all matrix-valued polynomials $P$ and $Q$.

We will need the following result to find the Rodrigues formula for the sequence of orthogonal polynomials with respect to a weight matrix $W$.

Theorem 2.1. ([18, Lemma 1.1]) Let $F_2$, $F_1$ and $F_0$ be matrix polynomials of degrees not larger than 2, 1, and 0, respectively. Let $W$ and $R_n$ be $N \times N$ matrix functions, twice and $n$ times differentiable, respectively, in an open set $\Omega$ of the real line. Assume that $W(t)$ is non-singular for $t \in \Omega$ and satisfies the identity and the differential equations in (2.2). Define the functions $P_n(t) = (R_n(t))^{(n)} (W(t))^{-1}$, $n \geq 1$. If, for a matrix $\Lambda_n$, the function $R_n$ satisfies the corresponding differential equation, then the function $P_n$ satisfies $P_n D = \Lambda_n P_n$, where $D = \partial^2 F_2(t) + \partial F_1(t) + F_0(t)$.

The family of matrix-valued orthogonal polynomials. In [4] the authors introduce a Jacobi-type weight matrix $W^{(\alpha,\beta,v)}(t)$ and a differential operator $D^{(\alpha,\beta,v)}$ such that $D^{(\alpha,\beta,v)}$ is symmetric with respect to the weight matrix $W^{(\alpha,\beta,v)}(t)$.
Let $\alpha, \beta, v \in \mathbb{R}$, with $\alpha, \beta > -1$ and $|\alpha - \beta| < |v| < \alpha + \beta + 2$. We consider the weight matrix function $W^{(\alpha,\beta,v)}$ given in (2.4), where, for the sake of clarity in the rest of the paper, we use the notation in (2.5). The weight $W^{(\alpha,\beta,v)}$ is irreducible, and the hypergeometric-type differential operator $D^{(\alpha,\beta,v)}$ given in (2.7) is symmetric with respect to the weight matrix $W^{(\alpha,\beta,v)}$.
In the same paper, the authors also give the corresponding monic orthogonal polynomials in terms of the matrix hypergeometric function ${}_2H_1(C, U, V; t)$ defined by J. A. Tirao in [44], together with their three-term recurrence relation.

Rodrigues formula
In this section we provide a Rodrigues formula for the sequence of monic orthogonal polynomials $(P_n^{(\alpha,\beta,v)})_{n \geq 0}$ with respect to the weight matrix $W = W^{(\alpha,\beta,v)}$ in (2.4). Moreover, the Rodrigues formula allows us to find an explicit expression for the polynomials in terms of Jacobi polynomials.
Theorem 3.1. Let $R_n(t)$ be the matrix-valued function defined in (3.1), where $(c_n)_n$ and $(d_n)_n$ are arbitrary sequences of complex numbers. Then $P_n(t)$ is a polynomial of degree $n$ whose non-singular leading coefficient is given in (3.2), where $(a)_n = a(a+1) \cdots (a+n-1)$ denotes the usual Pochhammer symbol. Moreover, if $c_n$ and $d_n$ are chosen as in (3.3), then $(P_n)_{n \geq 0}$ is a sequence of monic orthogonal polynomials with respect to $W$, and $P_n = P_n^{(\alpha,\beta,v)}$.

Proof. Let $W$ be the weight matrix given in (2.4), and let $F_2$, $F_1$, $F_0$ and $\Lambda_n$ be the polynomial coefficients defined in (2.8)-(2.10).
By straightforward computations, one can prove that the matrix-valued function $R_n(t)$ satisfies the hypotheses of Theorem 2.1, which guarantees that the function $P_n(t) = (R_n(t))^{(n)} (W(t))^{-1}$ is an eigenfunction of $D^{(\alpha,\beta,v)}$ with eigenvalue $\Lambda_n$ given in (2.10). Then $P_n^{(\alpha,\beta,v)}(t)$ and $P_n(t)$ satisfy the same differential equation. We now prove that $P_n$ is a polynomial of degree $n$ with non-singular leading coefficient. We use the Rodrigues formula for the classical Jacobi polynomials $p_n^{(\alpha,\beta)}$, together with the explicit expression of $(W(t))^{-1}$. Observe that $R_{n,0} J_0 = 0$. Hence, $P_n(t)$ is a polynomial of degree $n$ if and only if $t = 0$ and $t = 1$ are zeros of a certain polynomial $Q(t)$, with $t = 1$ a zero of multiplicity two. Taking the derivative of $Q(t)$ with respect to $t$ and using a standard identity for the Jacobi polynomials, one shows that $Q(t)$ is divisible by $t(t-1)^2$; therefore $P_n(t)$ is a polynomial of degree $n$, since $\deg(Q(t)) = n + 3$.
Observe that the leading coefficient of $P_n(t)$ is the matrix given in (3.2), which is non-singular since $|\alpha - \beta| < |v| < \alpha + \beta + 2$. Moreover, if (3.3) holds, then $P_n(t)$ is a monic polynomial and equal to $P_n^{(\alpha,\beta,v)}(t)$.

Corollary 3.2. Consider the weight matrix $W^{(\alpha,\beta,v)}(t)$ given in (2.4) and (2.5). Then the monic orthogonal polynomials $P_n^{(\alpha,\beta,v)}(t)$ satisfy the Rodrigues formula given above.

As seen in the proof of Theorem 3.1, the Rodrigues formula allows us to find an explicit expression for the polynomials in terms of the classical Jacobi polynomials.

Corollary 3.3. Consider the weight matrix $W^{(\alpha,\beta,v)}(t)$ given in (2.4) and (2.5), and $c_n$, $d_n$ as in (3.3). Then the sequence of monic orthogonal polynomials $(P_n)_{n \geq 0}$ defined by (3.1) can be written as in (3.5); moreover, (3.6) holds.

Proof. The expression in (3.5) follows from the proof above; to obtain (3.6) we use a standard derivative property of the classical Jacobi polynomials $p_n^{(\alpha,\beta)}$.
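In the scalar case the analogous Rodrigues formula is classical: with $w(t) = t^{\alpha}(1-t)^{\beta}$ on $(0,1)$, the functions $w(t)^{-1}\,\frac{d^n}{dt^n}\left[t^{n+\alpha}(1-t)^{n+\beta}\right]$ are polynomials of degree $n$, orthogonal with respect to $w$. The following symbolic sketch (with illustrative integer parameters, so all integrals are exact) checks both claims; it is a scalar analogue only, not the paper's matrix formula.

```python
import sympy as sp

t = sp.symbols('t')
a, b = 1, 2          # sample integer parameters (keep the integrals exact)

def q(n):
    # q_n = w^{-1} d^n/dt^n [ t^{n+a} (1-t)^{n+b} ],  w(t) = t^a (1-t)^b
    expr = sp.diff(t**(n + a) * (1 - t)**(n + b), t, n)
    return sp.expand(sp.cancel(expr / (t**a * (1 - t)**b)))

n = 3
qn = q(n)
# orthogonality of q_n against 1, t, t^2 with respect to w on (0, 1)
moments = [sp.integrate(qn * t**s * t**a * (1 - t)**b, (t, 0, 1))
           for s in range(n)]
print(sp.degree(qn, t), moments)  # 3 [0, 0, 0]
```

The vanishing moments follow by repeated integration by parts, since $t^{n+a}(1-t)^{n+b}$ and its first $n-1$ derivatives vanish at both endpoints.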

Orthonormal Polynomials
In this section we give an explicit expression for the norms of the matrix-valued polynomials $P_n^{(\alpha,\beta,v)}$. In addition, for the sequence of orthonormal polynomials we give the three-term recurrence relation and the Christoffel-Darboux formula, introduced for a general sequence of matrix-valued orthogonal polynomials in [14] (see also [23]).
Proposition 4.1. The norms of the monic orthogonal polynomials $P_n^{(\alpha,\beta,v)}$ are given by (4.1). Therefore, the corresponding sequence of normalized polynomials is orthonormal with respect to $W$.
Using the expressions in (3.2), after some straightforward computations we complete the proof.
The sequence of orthonormal polynomials satisfies the following properties. The orthonormal polynomials satisfy the three-term recurrence relation (4.2), where $B_n^{(\alpha,\beta,v)}$ is the coefficient of the three-term recurrence relation (2.13) for the monic orthogonal polynomials $(P_n^{(\alpha,\beta,v)})_{n \geq 0}$. Clearly, $B_n^{(\alpha,\beta,v)}$ is a symmetric matrix.
Proof. Replacing the orthonormal polynomials in the three-term recurrence relation (2.13) and using identity (2.17), we obtain (4.2); by (2.18) one verifies that $B_n^{(\alpha,\beta,v)}$ is symmetric.

We also have the Christoffel-Darboux formula for the sequence of orthonormal polynomials $(P_n^{(\alpha,\beta,v)})_{n \geq 0}$. Hence, the sequence of monic polynomials $(P_n^{(\alpha,\beta,v)})_{n \geq 0}$ satisfies the corresponding Christoffel-Darboux identity, where the explicit expression of $\|P_n^{(\alpha,\beta,v)}\|^{-2}$ follows from (4.1).

The Derivatives of the Orthogonal Matrix-Valued Polynomials
In this section we prove that the polynomials in the sequence of derivatives of the orthogonal matrix polynomials $(P_n^{(\alpha,\beta,v)})_{n \geq 0}$ are also orthogonal, by obtaining a Pearson equation for the weight. Let $P_n^{(\alpha,\beta,v,k)}(t)$ denote the (normalized) derivative of order $k$ of the monic polynomial $P_n^{(\alpha,\beta,v)}(t)$, for $n \geq k$, as defined in (5.1); these are monic polynomials of degree $n - k$ for all $n \geq k$. The polynomial $P_n^{(\alpha,\beta,v)}(t)$ is an eigenfunction of the operator $D^{(\alpha,\beta,v)}$ given above in (2.7)-(2.9). Differentiating $k$ times, we find that $P_n^{(\alpha,\beta,v,k)}(t)$ is an eigenfunction of the hypergeometric differential operator $D^{(\alpha,\beta,v,k)}$ in (5.2), with $C^{(k)} = C + kI$ and $U^{(k)} = U + 2kI = (\alpha + \beta + 4 + 2k) I$, where $C$, $U$ and $V$ are the matrix coefficients of the operator $D^{(\alpha,\beta,v)}$ given in (2.9). One has $P_n^{(\alpha,\beta,v,k)} D^{(\alpha,\beta,v,k)} = \Lambda_n^{(k)} P_n^{(\alpha,\beta,v,k)}$, where $\Lambda_n^{(k)} = \Lambda_n + kU + k(k-1)I$, with $\Lambda_n$ given in (2.10). In particular, one obtains the standard expression for the eigenvalue $\Lambda_n^{(k)}$ shown in [4, Proposition 3.3].

Remark 5.1. Notice that $D^{(\alpha,\beta,v,k)} = D^{(\alpha+k,\beta+k,v)}$. Thus, the derivatives are still common eigenfunctions of a hypergeometric operator with diagonal matrix eigenvalues $\Lambda_n^{(k)}$, with no repeated entries.

Proposition 5.2. As in (2.11), we have the explicit expression (5.4) for the sequence of polynomials $(P_n^{(\alpha,\beta,v,k)})_{n \geq k}$ in terms of hypergeometric functions, where $C^{(k)}$, $U^{(k)}$ and $V$ are the coefficients of the differential operator in (5.2). We include the proof for completeness.
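The scalar counterpart of Remark 5.1 is the classical fact that differentiating a Jacobi polynomial shifts both parameters up by one: $\frac{d}{dx} P_n^{(a,b)}(x) = \frac{n+a+b+1}{2}\, P_{n-1}^{(a+1,b+1)}(x)$. The following symbolic check (for one sample choice of $n$, $a$, $b$) illustrates it.

```python
import sympy as sp

x = sp.symbols('x')
n, a, b = 4, 1, 2    # sample values

# d/dx P_n^{(a,b)}(x) vs. ((n+a+b+1)/2) * P_{n-1}^{(a+1,b+1)}(x)
lhs = sp.diff(sp.jacobi(n, a, b, x), x)
rhs = sp.Rational(n + a + b + 1, 2) * sp.jacobi(n - 1, a + 1, b + 1, x)
resid = sp.expand(lhs - rhs)
print(resid)  # 0
```

This is exactly the mechanism behind $D^{(\alpha,\beta,v,k)} = D^{(\alpha+k,\beta+k,v)}$ in the scalar setting: the derivatives of the Jacobi family are again a Jacobi family with shifted parameters.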
We will use the following Pearson equation to prove that the sequence of polynomials $(P_n^{(\alpha,\beta,v,k)})_{n \geq k}$ is orthogonal with respect to $W^{(k)}$.
Theorem 5.4. The weight matrix $W^{(k)}$ satisfies the Pearson equation (5.6).

Proof. Replacing the expressions of $\Phi^{(k)}(t)$ and $\Psi^{(k)}(t)$ in (5.6) and differentiating, we obtain that the left-hand side of (5.12) is a product of a polynomial of degree five and $t^{\alpha+k-1}(1-t)^{\beta+k-1}$. Therefore, equating the entries of this polynomial to zero, and taking into account (5.10) and the equality $W^{(k)} = W^{(\alpha+k,\beta+k,v)}$, the result follows.

Remark 5.5. Let us consider the matrix-valued functions $W^{(\alpha,\beta,v,k)}(t) = W^{(k)}(t)$, $\Phi^{(k)}(t)$ and $\Psi^{(k)}(t)$, $k \in \mathbb{N}$, defined in (5.5) and Theorem 5.4, respectively. Then, by straightforward computations, one can verify the identities (5.13) and (5.14).

Taking into account that $\deg \Phi^{(k)}(t) = 2$ and $\deg \Psi^{(k)}(t) = 1$, we obtain from [5, Corollary 3.10] the following.

Corollary 5.6. The sequence of polynomials $(P_n^{(\alpha,\beta,v,k)})_{n \geq k}$ is orthogonal with respect to the weight $W^{(k)}$.

The following results are obtained in a similar way to Theorem 3.1 and Corollary 3.3.
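In the scalar case ($N = 1$, $k = 0$) the Pearson equation takes the classical form $(\Phi w)' = \Psi w$ with $w(t) = t^{\alpha}(1-t)^{\beta}$, $\Phi(t) = t(1-t)$ and $\Psi(t) = (\alpha + 1) - (\alpha + \beta + 2)t$. The following symbolic sketch verifies this scalar analogue for generic $\alpha, \beta$.

```python
import sympy as sp

t, alpha, beta = sp.symbols('t alpha beta', positive=True)
w = t**alpha * (1 - t)**beta
Psi = (alpha + 1) - (alpha + beta + 2) * t

# (Phi * w)' with Phi(t) = t(1 - t), i.e. d/dt [ t^(alpha+1) (1-t)^(beta+1) ]
lhs = sp.diff(t**(alpha + 1) * (1 - t)**(beta + 1), t)
residual = sp.simplify(lhs / w - Psi)
print(residual)  # 0
```

The same computation, entry by entry with matrix-valued $\Phi^{(k)}$ and $\Psi^{(k)}$, is what the proof of Theorem 5.4 carries out.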
Proposition 5.9. The orthogonal monic polynomials $(P_n^{(\alpha,\beta,v,k)})_{n \geq k}$ satisfy the corresponding three-term recurrence relation; the explicit expressions of its coefficients $B_{n-k}^{(\alpha+k,\beta+k,v)}$ and $A_{n-k}^{(\alpha+k,\beta+k,v)}$ follow from (2.14)-(2.16). Considering that $W^{(k)}(t) = W^{(\alpha+k,\beta+k,v)}(t)$ (see Proposition 5.3), the recurrence follows directly from (2.3). Nevertheless, we include the following proof for completeness.
Proof. If we write $P_n^{(\alpha,\beta,v,k)}(t) = \sum_{s=0}^{n-k} P_s^{n-k} t^s$, then from (5.4) we have explicit expressions for these coefficients. Considering the coefficients of order $n-k$ and $n-k-1$ in the three-term recurrence relation, and comparing with the expressions of $B_{n-k}^{(\alpha+k,\beta+k,v)}$ and $A_{n-k}^{(\alpha+k,\beta+k,v)}$ obtained by substituting appropriately in (2.14)-(2.16), the proposition follows.

Shift operators
In this section we use the Pearson equation (5.6) to give explicit lowering and raising operators for the monic polynomials $P_{n+k}^{(\alpha,\beta,v,k)}(t)$ of degree $n$, $n \geq 0$, defined in (5.1). Moreover, from the existence of the shift operators we deduce a Rodrigues formula for the sequence of derivatives $(P_{n+k}^{(\alpha,\beta,v,k)})_{n \geq 0}$, and we find a matrix-valued differential operator for which these matrix-valued polynomials are eigenfunctions. In what follows we consider the matrix-valued functions $W^{(k)}(t)$, $\Phi^{(k)}(t)$ and $\Psi^{(k)}(t)$, $k \in \mathbb{N}$, as defined above in Theorem 5.4. For any pair of matrix-valued functions $P$ and $Q$, we denote $\langle P, Q \rangle_k = \int_0^1 P(t)\, W^{(k)}(t)\, Q^*(t)\, dt$.

Proposition 6.1. Let $\eta^{(k)}$ be the first-order matrix-valued right differential operator defined in (6.1). Then $\frac{d}{dt} \colon L^2(W^{(k)}) \to L^2(W^{(k+1)})$ and $\eta^{(k)} \colon L^2(W^{(k+1)}) \to L^2(W^{(k)})$ are, respectively, lowering and raising operators.

Proof. From $\left\langle \frac{dP}{dt}, Q \right\rangle_{k+1} = \int_0^1 \frac{dP(t)}{dt}\, W^{(k+1)}(t)\, Q^*(t)\, dt$, integrating by parts and taking into account the equalities (5.13) and (5.14) in Remark 5.5, we get the result.

Lemma 6.2. For any given $k \geq 0$, the polynomial $I\,\eta^{(k+n-1)} \cdots \eta^{(k+1)} \eta^{(k)}$ coincides with $C_n^k\, P_{n+k}^{(\alpha,\beta,v,k)}$, where the matrix $C_n^k$ is given by the expression in (6.2).

Proof. The function $I\,\eta^{(k+n-1)} \cdots \eta^{(k+1)} \eta^{(k)}$ is a polynomial of degree $n$. From the definition of the monic sequence of derivatives in (5.1) and Proposition 6.1, applying the raising operators $\eta^{(k+n-1)} \cdots \eta^{(k+1)} \eta^{(k)}$ to $P_{n+k}^{(\alpha,\beta,v,k+n)}$ yields a polynomial proportional to $P_{n+k}^{(\alpha,\beta,v,k)}$. For the leading coefficient $C_n^k$ of the polynomial $I\,\eta^{(k+n-1)} \cdots \eta^{(k+1)} \eta^{(k)}$ one obtains an expression in terms of the diagonal matrix entries $A_2^k$ and $B_1^k$ defined in (5.7) and (5.10). Then, replacing $B_1^k = (\alpha + \beta + 4 + 2k) A_2^k$ in that identity, the lemma follows.
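A scalar model of the raising operator $\eta^{(k)}$ (a sketch under the assumption $N = 1$, with sample parameters; it is not the paper's matrix operator): conjugating $\frac{d}{dt}$ by the weights, the map $p \mapsto w_k^{-1}\,(w_{k+1}\, p)' = t(1-t)p' + [(\alpha+k+1) - (\alpha+\beta+2k+2)t]\,p$, with $w_k(t) = t^{\alpha+k}(1-t)^{\beta+k}$, raises the degree by one and sends polynomials orthogonal with respect to $w_{k+1}$ to polynomials orthogonal with respect to $w_k$. The sketch below checks both facts symbolically.

```python
import sympy as sp

t = sp.symbols('t')
alpha, beta, k, n = 1, 2, 0, 2   # sample values

# degree-n orthogonal polynomial for w_{k+1}(t) = t^(alpha+k+1) (1-t)^(beta+k+1):
# the shifted Jacobi polynomial P_n^{(beta+k+1, alpha+k+1)}(2t - 1)
p = sp.jacobi(n, beta + k + 1, alpha + k + 1, 2*t - 1)

# eta_k p = t(1-t) p' + [(alpha+k+1) - (alpha+beta+2k+2) t] p
eta_p = sp.expand(t*(1 - t)*sp.diff(p, t)
                  + ((alpha + k + 1) - (alpha + beta + 2*k + 2)*t)*p)

w_k = t**(alpha + k) * (1 - t)**(beta + k)
moments = [sp.integrate(eta_p * t**s * w_k, (t, 0, 1)) for s in range(n + 1)]
print(sp.degree(eta_p, t), moments)  # 3 [0, 0, 0]
```

The vanishing moments follow by a single integration by parts, $\int_0^1 (w_{k+1} p)'\, t^s\, dt = -s \int_0^1 w_{k+1}\, p\, t^{s-1}\, dt = 0$ for $s \leq n$, since the boundary terms vanish; this is the scalar shadow of the adjointness in Proposition 6.1.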
From the proposition and the lemma above we obtain another expression for the Rodrigues formula: the polynomials $(P_{n+k}^{(\alpha,\beta,v,k)})_{n \geq 0}$ satisfy a Rodrigues formula in which the matrices $C_n^k$ are given by the expression in (6.2).
Proof. Let $Q$ be a matrix-valued function and $\eta^{(k)}$ the raising operator in (6.1). Using the identities (5.13) and (5.14) and iterating, the result follows by taking $Q(t) = I$ and using Lemma 6.2.

Corollary 6.4. Let $W^{(k)}(t)$ be the weight matrix (5.5). Then the polynomials $(P_{n+k}^{(\alpha,\beta,v,k)})_{n \geq 0}$ are eigenfunctions of the second-order differential operator $E^{(k)}$, with eigenvalue expressed in terms of $A_2^k$ given by (5.7).
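The scalar counterpart of this eigenfunction statement is classical: the shifted Jacobi polynomial $p_n(t) = P_n^{(\beta,\alpha)}(2t-1)$ satisfies the hypergeometric equation $t(1-t)\,p_n'' + [(\alpha+1) - (\alpha+\beta+2)t]\,p_n' = \lambda_n\, p_n$ with $\lambda_n = -n(n + \alpha + \beta + 1)$, checked symbolically below for sample values.

```python
import sympy as sp

t = sp.symbols('t')
alpha, beta, n = 1, 2, 3    # sample values

p = sp.jacobi(n, beta, alpha, 2*t - 1)   # Jacobi polynomial shifted to (0, 1)
lam = -n*(n + alpha + beta + 1)
residual = sp.expand(t*(1 - t)*sp.diff(p, t, 2)
                     + ((alpha + 1) - (alpha + beta + 2)*t)*sp.diff(p, t)
                     - lam*p)
print(residual)  # 0
```

Note that the eigenvalue $\lambda_n$ depends only on the degree $n$, which is the scalar shadow of the diagonal matrix eigenvalues $\Lambda_n^{(k)}$ with distinct entries appearing throughout the paper.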
Remark 6.5. The operators $E^{(k)}$ and $D^{(k)}$ in (5.2) commute. This follows from the fact that the corresponding eigenvalues $\Lambda_n(E^{(k)})$ and $\Lambda_{n+k}^{(k)}$ in (5.3) commute, and that the linear map assigning to each differential operator in the algebra of differential operators $D(W^{(k)})$ its corresponding sequence of eigenvalues is an isomorphism (see [33, Propositions 2.6 and 2.8]).
This operator is not symmetric with respect to $W^{(k)}$; it is, however, symmetric with respect to $W^{(k+1)}$. In fact, if we substitute the coefficient of the second derivative into the first symmetry condition in (2.2), we obtain an identity which does not hold: taking the leading coefficient $W$ of $W^{(\alpha,\beta,v,k)}$ in (5.5), one verifies the failure directly. The second statement follows from Proposition 6.1.

The algebra $D(W)$
In this section we discuss some properties of the structure of the algebra of matrix differential operators having as eigenfunctions a sequence of polynomials $(P_n)_{n \geq 0}$ orthogonal with respect to the weight matrix $W = W^{(\alpha,\beta,v)}$, i.e., $D(W) = \{ D : P_n D = \Lambda_n(D) P_n,\ \Lambda_n(D) \in \mathbb{C}^{N \times N},\ n \geq 0 \}$. The definition of $D(W)$ does not depend on the particular sequence of orthogonal polynomials (see [33, Corollary 2.5]).
Theorem 7.1. Consider the weight matrix function $W = W^{(\alpha,\beta,v)}(t)$. Then the differential operators of order at most two in $D(W)$ are of the form given in (7.1).

Proof. Let $(P_n^{(\alpha,\beta,v)})_{n \geq 0}$ be the monic sequence of orthogonal polynomials with respect to $W^{(\alpha,\beta,v)}$.
We obtain the explicit expressions of $P_{n-1}^{n}$ and $P_{n-2}^{n}$ by substituting $k = 0$ into the equalities (5.15) and (5.16), respectively.
From equation (7.2) with $k = n - 1$ we get (7.3). Multiplying equation (7.3) by $v\,(\alpha + \beta + 2(n+1))\,(\kappa_{v,\beta} + 2(n+1))\,(\kappa_{-v,\beta} + 2(n+1))\,n$, one obtains a matrix polynomial in $n$ of degree four, each of whose coefficients must vanish. From the coefficient of $n^4$ we get the expression for $A_1$ given above, and from the coefficient of $n^3$ we get $B_0$ in terms of $A_2$ and $B_1$. Looking at the entries $(1,1)$, $(1,2)$ and $(2,2)$ of the coefficient of $n^2$, and using that $\kappa_{v,-\beta}$ and $\kappa_{-v,-\beta}$ are nonzero, we get $(C_0)_{12}$, $(C_0)_{11}$ and $(B_1)_{12}$, respectively, in terms of $A_2$ and the other entries of $C_0$ and $B_1$. Finally, from the coefficient of $n$ we get the values of $(B_1)_{11}$, $(B_1)_{21}$, $(B_1)_{22}$ and $(C_0)_{21}$; consequently, we obtain the values of $B_1$, $B_0$ and $C_0$ written above.
Let $D_2$ be the complex vector space of differential operators in $D(W)$ of order at most two. We have already proved that $\dim D_2 \leq 5$.
If $D$ is symmetric, then $D \in D(W)$. Using the symmetry equations in (2.2), one verifies that the operator $D$ in (7.1) is symmetric with respect to $W$ if and only if $a, d, e \in \mathbb{R}$ and condition (7.5) holds, where $\mathrm{Im}(z)$ denotes the imaginary part of a complex number $z$. Then, since $\kappa_{v,\beta} + 2 > 0$ because of the restrictions on the parameters $\alpha$, $\beta$ and $v$ in the definition of $W^{(\alpha,\beta,v)}$ in (2.4), to verify (7.6) one needs $a, d \in \mathbb{R}$ and condition (7.5).
In addition, from the third symmetry equation in (2.2) we have that $e \in \mathbb{R}$. Thus, there exist at least five linearly independent symmetric operators of order at most two in $D(W)$. Therefore $\dim D_2 = 5$.
Corollary 7.5. The algebra $D(W)$ is not commutative.
Proof. Using the isomorphism between the algebra of differential operators $D(W)$ and the algebra of matrix-valued sequences in $n$ generated by the corresponding eigenvalues, we have that $D_1 D_3 \neq D_3 D_1$, since $\Lambda_n(D_1) \Lambda_n(D_3) \neq \Lambda_n(D_3) \Lambda_n(D_1)$.
Remark 7.6. In [42] the authors study the algebra $D(W^{(p,q)})$, where $W^{(p,q)}$ is, for $p = q^2$, the irreducible weight matrix considered there. Let us denote by $\widetilde{D}$ the differential operators appearing in [42]. Then, taking $\alpha = \beta =$