Inhomogeneous Jacobi Matrices on Trees

We study Jacobi matrices on trees with one end at infinity. We show that the defect indices cannot be greater than 1 and give criteria for essential self-adjointness.


Introduction
The aim of the paper is to study a special class of symmetric unbounded operators and their spectral properties. These are Jacobi operators defined on trees. They are immediate generalizations of classical Jacobi matrices, which act on sequences $\{u_n\}_{n=0}^{\infty}$ by the rule
$$(Ju)_n = \lambda_n u_{n+1} + \beta_n u_n + \lambda_{n-1} u_{n-1}, \qquad n \ge 0,$$
where $\{\lambda_n\}_{n=0}^{\infty}$ and $\{\beta_n\}_{n=0}^{\infty}$ are sequences of positive and real numbers, respectively, with the convention $u_{-1} = \lambda_{-1} = 0$. These matrices are closely related to the set of polynomials defined recursively by
$$x p_n(x) = \lambda_n p_{n+1}(x) + \beta_n p_n(x) + \lambda_{n-1} p_{n-1}(x), \qquad n \ge 0, \tag{1}$$
with $p_{-1} = 0$, $p_0 = 1$.
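The recurrence (1) determines the polynomials $p_n$ pointwise. As an informal numerical illustration (the helper name and coefficient choices below are ours, not the paper's), one can evaluate $p_0(x), \dots, p_N(x)$ directly:

```python
def eval_p(x, lam, beta, N):
    """Evaluate p_0(x), ..., p_N(x) from the three-term recurrence (1):
    x p_n = lam(n) p_{n+1} + beta(n) p_n + lam(n-1) p_{n-1},
    with the conventions p_{-1} = 0 and lam(-1) = 0."""
    p = [1.0]          # p_0 = 1
    prev = 0.0         # p_{-1} = 0
    for n in range(N):
        lam_prev = lam(n - 1) if n >= 1 else 0.0   # convention lam_{-1} = 0
        nxt = ((x - beta(n)) * p[n] - lam_prev * prev) / lam(n)
        prev = p[n]
        p.append(nxt)
    return p

# Free case lam_n = 1, beta_n = 0: p_1 = x, p_2 = x^2 - 1, p_3 = x^3 - 2x.
vals = eval_p(2.0, lambda n: 1.0, lambda n: 0.0, 3)
```

At $x = 2$ this yields $p_0 = 1$, $p_1 = 2$, $p_2 = 3$, $p_3 = 4$, consistent with the closed forms above.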
In the case when the coefficients of the matrix are bounded, the matrix $J$ represents a self-adjoint operator on $\ell^2(\mathbb{N}_0)$. If $E(x)$ denotes the resolution of the identity associated with $J$, then the polynomials $p_n(x)$ are orthonormal with respect to the measure $d\mu(x) = d(E(x)e_0, e_0)$, where $e_0$ is the sequence taking the value 1 at $n = 0$ and vanishing elsewhere, and $(u, v)$ denotes the standard inner product in $\ell^2(\mathbb{N}_0)$. The measure $\mu$ has bounded support.
When the coefficients are unbounded, the operator $J$ is well defined on the domain $D(J)$ consisting of sequences with finitely many nonzero terms. In that case, if this operator is essentially self-adjoint, then again the polynomials $p_n$ are orthonormal with respect to the measure $d\mu(x) = d(E(x)e_0, e_0)$, except that this measure has unbounded support. Moreover, there is a unique orthogonality measure for the polynomials $p_n$. By a classical theorem, if the operator $J$ is not essentially self-adjoint, there are many measures $\mu$ on the real line such that the polynomials belong to $L^2(\mu)$, i.e.,
$$\int_{-\infty}^{\infty} x^{2n}\, d\mu(x) < \infty, \qquad n \in \mathbb{N}_0,$$
and the polynomials $p_n$ are orthogonal with respect to the inner product of $L^2(\mu)$. Therefore essential self-adjointness is a crucial property that distinguishes between the so-called determinate and indeterminate cases. Intuitively, the unbounded matrix $J$ is essentially self-adjoint when the coefficients have moderate growth, but the converse is not true in general. For the classical theory of Jacobi matrices, orthogonal polynomials, and moment problems, we direct the reader to [1,2,6], and to [5] for a modern treatment.
In a recent paper [3], homogeneous Jacobi matrices on homogeneous trees were studied. Two types of homogeneous trees were considered. One of them was the tree with infinitely many origin points, called leaves, as in Fig. 1.
The tree consists of vertices with heights from zero to infinity. Every vertex $x$ with height $n \ge 1$ is connected to a unique vertex $\eta(x)$, the parent, with height $n + 1$, and to $d$ vertices $x_1, \dots, x_d$ with height $n - 1$, the children, as in Fig. 2.
Every vertex $x$ with height zero determines the infinite sequence $\eta^k(x)$ of vertices with height $k$. Moreover, for every two vertices $x$ and $y$ with height zero, the sequences $\eta^k(x)$ and $\eta^k(y)$ coincide for $k$ large enough. Therefore, we say that the tree has one end at infinity. The Jacobi matrices were defined on $\ell^2(\Gamma)$, where $\Gamma$ denotes the set of all vertices of the tree, by the formula
$$(Jv)(x) = \lambda_n v(\eta(x)) + \beta_n v(x) + \lambda_{n-1} \sum_{y \in \eta^{-1}(x)} v(y),$$
where $n$ denotes the height of the vertex $x$ (with the convention $\lambda_{-1} = 0$).
An interesting phenomenon occurred. It turned out that the operator $J$, defined on functions $\{v(x)\}_{x \in \Gamma}$ with finitely many nonzero terms, is always essentially self-adjoint, regardless of the growth of the coefficients $\lambda_n$ and $\beta_n$. For example, the operator $J$ with coefficients $\lambda_n = (n+1)^2$ and $\beta_n = 0$ is not essentially self-adjoint when considered as the classical Jacobi matrix on $\ell^2(\mathbb{N}_0)$, but it is essentially self-adjoint when it acts on $\ell^2(\Gamma)$.
Moreover, its spectrum is discrete and consists of the zeros of all the polynomials $p_n$ associated with the classical Jacobi matrix with coefficients $\sqrt{d}\,\lambda_n$ and $\beta_n$, i.e., satisfying
$$x p_n(x) = \sqrt{d}\,\lambda_n p_{n+1}(x) + \beta_n p_n(x) + \sqrt{d}\,\lambda_{n-1} p_{n-1}(x).$$
Every eigenvalue is of infinite multiplicity. Our aim is to study the inhomogeneous Jacobi matrix on that tree. This means we do not require that the coefficients of the matrix depend only on the height of the vertex. With every vertex $x$, we associate a positive number $\lambda_x$ and a real number $\beta_x$. We are going to study operators of the form
$$(Jv)(x) = \lambda_x v(\eta(x)) + \beta_x v(x) + \sum_{y \in \eta^{-1}(x)} \lambda_y v(y).$$
One of the main differences between the classical case and the case of the tree is that the eigenvalue equation
$$(Jv)(x) = z v(x), \qquad x \in \Gamma,$$
cannot be solved recursively, unlike the classical equation
$$\lambda_n u_{n+1} + \beta_n u_n + \lambda_{n-1} u_{n-1} = z u_n, \qquad n \ge 0.$$
Nonetheless, we will show that the equation has a nonzero solution for every nonreal number $z$ (Corollary 5).
Actually, when we give up homogeneity of the matrix $J$, we may as well give up homogeneity of the tree. This means the number of children of the vertices of $\Gamma$ need not be constant; i.e., the quantities $\#\eta^{-1}(x)$ may vary.
The operator $J$ is symmetric on $\ell^2(\Gamma)$ with respect to the natural inner product
$$(u, v) = \sum_{x \in \Gamma} u(x)\overline{v(x)}.$$
We are interested in studying the essential self-adjointness of the matrix $J$. It turns out that, unlike in the homogeneous case, the matrix $J$ may fail to be essentially self-adjoint. However, the defect indices cannot be greater than 1 (Proposition 6). We derive certain criteria ensuring essential self-adjointness. For example, an analog of the Carleman condition holds (see Theorem 16). Moreover, we relate essential self-adjointness of $J$ to essential self-adjointness of the classical Jacobi matrix $J_0$ obtained from $J$ by restriction to an infinite path of the tree (see Theorem 9).

Definitions and Basic Properties
We will consider a tree $\Gamma$ with one end at infinity. Its vertices are located on heights from zero to infinity. Let $l(x)$ denote the height of the vertex $x$. Every vertex $x$ with height $l(x) \ge 0$ is directly connected to a unique vertex $\eta(x)$ with height $l(x) + 1$, the parent. When $l(x) \ge 1$, the vertex $x$ is directly connected to a finite nonempty set of vertices $y$ on height $l(x) - 1$, called its children. The set of children of $x$ will be denoted by $\eta^{-1}(x)$ (see Fig. 3). The number of vertices in $\eta^{-1}(x)$ may vary with $x$. Vertices $x$ with $l(x) = 0$ have no children; i.e., $\eta^{-1}(x) = \emptyset$. For a given vertex $x$, let $\Gamma_x$ denote the finite subtree containing the vertex $x$ together with all its descendants, i.e., the vertices $y$ such that $\eta^k(y) = x$ for some $k \ge 0$; for such $y$ we have $l(y) = l(x) - k$.
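The definitions above can be mirrored concretely. The following is an informal sketch (the helper names and the example data are ours): a finite portion of the tree is represented by its parent map $\eta$, from which the children $\eta^{-1}(x)$ and the subtree $\Gamma_x$ are derived.

```python
# Represent a finite fragment of the tree by its parent map eta.

def children(eta, x):
    """The set eta^{-1}(x): all vertices whose parent is x."""
    return {y for y, p in eta.items() if p == x}

def subtree(eta, x):
    """Gamma_x: the vertex x together with all of its descendants."""
    result = {x}
    stack = [x]
    while stack:
        v = stack.pop()
        for c in children(eta, v):
            result.add(c)
            stack.append(c)
    return result

# Example fragment: leaves a, b with common parent c, and eta(c) = d.
eta = {"a": "c", "b": "c", "c": "d"}
```

For a leaf $a$ we get $\eta^{-1}(a) = \emptyset$ and $\Gamma_a = \{a\}$, in line with the conventions above.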
Define $\mathcal{F}(\Gamma)$ to be the set of all complex-valued functions on $\Gamma$, while $\mathcal{F}_0(\Gamma)$ denotes the subspace consisting of functions with finite support. By $\ell^2(\Gamma)$, we denote the space of square summable functions on $\Gamma$ with the standard inner product
$$(u, v) = \sum_{x \in \Gamma} u(x)\overline{v(x)}.$$
Consider the operator $J$ acting on $\mathcal{F}_0(\Gamma)$ according to the rule
$$(Jv)(x) = \lambda_x v(\eta(x)) + \beta_x v(x) + \sum_{y \in \eta^{-1}(x)} \lambda_y v(y),$$
where the $\lambda_x$ are positive constants while the $\beta_x$ are real ones.
Remark In the case when $\#\eta^{-1}(x) = 1$ for all $x \in \Gamma$ with $l(x) \ge 1$, the tree consists of one vertical line that can be identified with $\mathbb{N}_0$. The matrix $J$ then becomes a classical Jacobi matrix.
Let $S$ and $S^*$ be the operators acting on $\mathcal{F}_0(\Gamma)$ as follows:
$$(Sv)(x) = \lambda_x v(\eta(x)), \qquad (S^*v)(x) = \sum_{y \in \eta^{-1}(x)} \lambda_y v(y).$$
Then $(Su, v) = (u, S^*v)$ for $u, v \in \mathcal{F}_0(\Gamma)$. The operators $S$ and $S^*$ are straightforward generalizations of the weighted shift and backward weighted shift operators usually acting on $\ell^2(\mathbb{N}_0)$. Let $M$ be the multiplication operator on $\mathcal{F}_0(\Gamma)$ defined by $(Mv)(x) = \beta_x v(x)$. Then $J = S + S^* + M$. In particular, $J$ is a symmetric linear operator on $\mathcal{F}_0(\Gamma)$.
Let $v \in \mathcal{F}_0(\Gamma)$ and $x \in \Gamma$. Then
$$(Jv)(x) = \lambda_x v(\eta(x)) + \beta_x v(x) + \sum_{y \in \eta^{-1}(x)} \lambda_y v(y).$$
The formula makes sense for all functions $v$ in $\mathcal{F}(\Gamma)$. Hence we may extend the definition of $J$ to the whole space $\mathcal{F}(\Gamma)$ by the same formula, keeping the notation $Jv$ for the extension. Observe that the adjoint of $(J, \mathcal{F}_0(\Gamma))$ in $\ell^2(\Gamma)$ is the restriction of this extension to the domain of the adjoint operator, i.e., to the space of all functions $v \in \ell^2(\Gamma)$ such that $Jv$ belongs to $\ell^2(\Gamma)$.
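The symmetry of $J$ on finitely supported functions can be checked numerically. The sketch below is ours (hypothetical helper names and example data, real-valued for simplicity): it applies the rule $(Jv)(x) = \lambda_x v(\eta(x)) + \beta_x v(x) + \sum_{y \in \eta^{-1}(x)} \lambda_y v(y)$ on a small tree fragment and verifies $(Ju, v) = (u, Jv)$.

```python
# Finitely supported functions are dicts vertex -> value; missing keys mean 0.
# eta is the parent map; lam[x], beta[x] are the coefficients lambda_x, beta_x.

def apply_J(v, eta, lam, beta):
    """(Jv)(x) = lam_x v(eta(x)) + beta_x v(x) + sum_{y in eta^{-1}(x)} lam_y v(y)."""
    support = set(v)
    # Jv may be nonzero on the support, on parents, and on children of it.
    touched = support | {eta[x] for x in support if x in eta}
    touched |= {y for y, p in eta.items() if p in support}
    out = {}
    for x in touched:
        val = beta.get(x, 0.0) * v.get(x, 0.0)
        if x in eta:                              # the term lam_x v(eta(x))
            val += lam[x] * v.get(eta[x], 0.0)
        for y, p in eta.items():                  # the terms lam_y v(y), y a child of x
            if p == x:
                val += lam[y] * v.get(y, 0.0)
        out[x] = val
    return out

def inner(u, v):
    """Standard inner product on l^2(Gamma), restricted to real values."""
    return sum(u.get(x, 0.0) * v.get(x, 0.0) for x in set(u) | set(v))

# Fragment: leaves a, b -> c -> d; lambda_x is attached to the edge x -> eta(x).
eta = {"a": "c", "b": "c", "c": "d"}
lam = {"a": 1.0, "b": 2.0, "c": 3.0}
beta = {"a": 0.5, "b": -1.0, "c": 0.0, "d": 2.0}
u = {"a": 1.0, "c": 2.0}
v = {"b": 1.0, "c": -1.0}
```

Because $u$ and $v$ vanish near the boundary of the fragment, the symmetry relation holds exactly here.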
We will study eigenfunctions of the operator $J$, i.e., functions $v \in \mathcal{F}(\Gamma)$ satisfying
$$Jv = zv. \tag{5}$$
Unlike in the classical case, this equation cannot be solved recursively; i.e., prescribing the value of $v$ at a vertex does not determine the remaining values step by step. Therefore the existence of nonzero solutions of (5) is not obvious. Our aim is to show that such solutions exist for every nonreal number $z$.
For $x \in \Gamma$, let $J_x$ denote the truncation of the Jacobi matrix $J$ to the subtree $\Gamma_x$, i.e., the matrix with the parameters $\lambda^x_y, \beta^x_y$ obtained by restricting $\lambda$ and $\beta$ to $\Gamma_x$ and removing the term coupling $x$ to $\eta(x)$.

Lemma 1 Let $z$ be a nonreal number, and let $v$ satisfy $(Jv)(y) = zv(y)$ for $y \in \Gamma_x$, with $v$ not identically zero on $\Gamma_x$. Then $v(\eta(x)) \ne 0$.

Proof Assume for a contradiction that $v(\eta(x)) = 0$. Let $w$ denote the truncation of $v$ to $\Gamma_x$. Thus $J_x w = zw$. Moreover, $w \ne 0$. Therefore $z$ must be a real number, as $J_x$ is a finite dimensional symmetric linear operator.
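The core of the argument is that a finite dimensional symmetric matrix has only real eigenvalues, so $\det(J_x - zI) \ne 0$ for nonreal $z$ and the truncated eigenvalue problem has only the trivial solution. A minimal sketch (our own $2 \times 2$ toy example, not from the paper):

```python
# Toy 2x2 truncation J_x = [[b0, l0], [l0, b1]], symmetric with real entries.

def det_minus_z(b0, b1, l0, z):
    """det(J_x - z I) for the symmetric matrix [[b0, l0], [l0, b1]]."""
    return (b0 - z) * (b1 - z) - l0 * l0

# For nonreal z the determinant is nonzero, so J_x w = z w forces w = 0.
d = det_minus_z(0.0, 0.0, 2.0, 1j)
```

Here $d = (-i)^2 - 4 = -5 \ne 0$, so $J_x - iI$ is invertible, as the proof of Lemma 1 requires.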
Lemma 2 Let $z$ be a nonreal number, and let $v$ satisfy $(Jv)(y) = zv(y)$ for $y \in \Gamma_x \setminus \{x\}$, with $v$ not identically zero on $\Gamma_x$. Then
$$(z - \beta_x) v(x) - \sum_{y \in \eta^{-1}(x)} \lambda_y v(y) \ne 0.$$

Proof Assume for a contradiction that
$$(z - \beta_x) v(x) - \sum_{y \in \eta^{-1}(x)} \lambda_y v(y) = 0.$$
Define the function $u \in \mathcal{F}(\Gamma_x \cup \{\eta(x)\})$ by setting $u(y) = v(y)$ for $y \in \Gamma_x$ and $u(\eta(x)) = 0$. Then $(Ju)(y) = zu(y)$ for $y \in \Gamma_x$. In view of Lemma 1, we get a contradiction.

Corollary 3 Let $z$ be a nonreal number, and assume there exists a nonzero function $v \in \mathcal{F}(\Gamma)$ satisfying $Jv = zv$. Then $v$ does not vanish at any vertex of $\Gamma$.
Proof Assume for a contradiction that $v(y) = 0$ for a vertex $y$. First we will show that $v$ vanishes on $\Gamma_y$. If $l(y) = 0$, then $\Gamma_y = \{y\}$, and the conclusion follows. Assume $l(y) \ge 1$. Consider any vertex $x \in \eta^{-1}(y)$; i.e., $y = \eta(x)$. Then by Lemma 1, the function $v$ vanishes identically on $\Gamma_x$. But
$$\Gamma_y = \{y\} \cup \bigcup_{x \in \eta^{-1}(y)} \Gamma_x;$$
hence $v$ vanishes on $\Gamma_y$. From the recurrence relation $(Jv)(y) = zv(y)$ we get $v(\eta(y)) = 0$. Therefore, by the first part of the proof, $v$ vanishes identically on $\Gamma_{\eta(y)}$. Applying the same procedure infinitely many times, we conclude that $v$ vanishes at every vertex of $\Gamma$, as
$$\Gamma = \bigcup_{k=0}^{\infty} \Gamma_{\eta^k(y)}.$$

Lemma 4 For any nonreal number $z$ and any $x_0 \in \Gamma$, there exists a nonzero function $v$ defined on $\Gamma_{x_0}$ satisfying
$$(Jv)(y) = zv(y), \qquad y \in \Gamma_{x_0} \setminus \{x_0\}. \tag{6}$$
Moreover, the function $v$ cannot vanish and is unique up to a constant multiple.
Proof We will use induction on the height $l(x_0)$. Assume $l(x_0) = 0$. Then $\Gamma_{x_0} = \{x_0\}$, and the condition in (6) is vacuous. Thus we may set $v(x_0) = 1$. In this way (6) is fulfilled.
Assume the conclusion is true for all vertices on height $n$. Let $l(x_0) = n + 1$. Consider the vertices $x_1, x_2, \ldots, x_k \in \eta^{-1}(x_0)$. Then $l(x_j) = n$ for $j = 1, 2, \ldots, k$. By the induction hypothesis, for every vertex $x_j$, there exists a nonzero function $v_j$ defined on $\Gamma_{x_j}$ satisfying
$$(Jv_j)(y) = zv_j(y), \qquad y \in \Gamma_{x_j} \setminus \{x_j\}.$$
We have
$$\Gamma_{x_0} = \{x_0\} \cup \bigcup_{j=1}^{k} \Gamma_{x_j}.$$
We are going to define the function $v$ on $\Gamma_{x_0}$ in the following way. Set $v(x_0) = 1$ and
$$v|_{\Gamma_{x_j}} = c_j v_j$$
for some constants $c_j$, which will be specified later. In this way, we get $(Jv)(y) = zv(y)$ for $y \in \Gamma_{x_j} \setminus \{x_j\}$, $j = 1, \ldots, k$. In order to conclude the proof, we must show that
$$(Jv)(x_j) = zv(x_j), \qquad j = 1, 2, \ldots, k.$$
Thus we want to have
$$\lambda_{x_j} = c_j \Big[ (z - \beta_{x_j}) v_j(x_j) - \sum_{y \in \eta^{-1}(x_j)} \lambda_y v_j(y) \Big], \qquad j = 1, 2, \ldots, k. \tag{7}$$
The expression in the brackets on the right-hand side is nonzero for every $j = 1, 2, \ldots, k$, by Lemma 2. Therefore (7) is satisfied for an appropriate choice of nonzero constants $c_1, c_2, c_3, \ldots, c_k$. By Lemma 1, the function $v$ cannot vanish at any vertex. Moreover, if there were another function $\tilde{v}$ satisfying the conclusion of Lemma 4, then $v - c\tilde{v}$ would also satisfy the conclusion and would vanish at $x_0$ for an appropriate choice of the constant $c$; hence it would be identically zero.
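The inductive step can be mirrored numerically: build the solution bottom-up, scaling each child branch by the constant $c_j$ solving (7). The sketch below is ours (hypothetical helper names and example data), using complex arithmetic with a nonreal $z$:

```python
# eta: parent map; lam[x], beta[x]: coefficients lambda_x, beta_x.

def build_solution(x0, z, eta, lam, beta):
    """Build v on the subtree below x0 with v(x0) = 1, so that
    (Jv)(y) = z v(y) at every vertex y strictly below x0."""
    v = {x0: 1.0 + 0.0j}
    for xj in [y for y, p in eta.items() if p == x0]:       # children of x0
        vj = build_solution(xj, z, eta, lam, beta)          # solution on the branch of xj
        total = sum(lam[y] * vj[y] for y, p in eta.items() if p == xj)
        bracket = (z - beta[xj]) * vj[xj] - total           # nonzero by Lemma 2
        cj = lam[xj] / bracket                              # enforces (Jv)(xj) = z v(xj)
        for u, val in vj.items():
            v[u] = cj * val
    return v

# Hypothetical data: leaves a, b with parent c, and eta(c) = d (the local root).
eta = {"a": "c", "b": "c", "c": "d"}
lam = {"a": 1.0, "b": 2.0, "c": 1.5}
beta = {"a": 0.0, "b": 0.5, "c": -1.0, "d": 0.0}
v = build_solution("d", 1j, eta, lam, beta)
```

The eigenvalue equation then holds at every vertex strictly below the root, by construction.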

Corollary 5 For any nonreal number $z$, there exists a nonzero function $v \in \mathcal{F}(\Gamma)$ such that
$$Jv = zv.$$
The function $v$ cannot vanish and is unique up to a constant multiple.
Proof Fix a vertex $y$ with $l(y) = 0$. By Lemma 4, for any subtree $\Gamma_{\eta^k(y)}$, $k \ge 1$, there exists a unique function $v_k$ defined on $\Gamma_{\eta^k(y)}$ satisfying (6) and normalized by $v_k(y) = 1$. By uniqueness, $v_{k+1}$ restricted to $\Gamma_{\eta^k(y)}$ coincides with $v_k$. Since $\Gamma = \bigcup_k \Gamma_{\eta^k(y)}$, the function $v$ determined by $v|_{\Gamma_{\eta^k(y)}} = v_k$ is defined at every vertex of $\Gamma$, and the conclusion follows.
Remark The conclusion may not be true for some real values of $z$. Indeed, consider a tree with infinitely many vertices at height 0. Assume $\#\eta^{-1}(x) = 2$ for every vertex $x \in \Gamma$ with $l(x) \ge 1$. Let $\lambda_y = 1$ and $\beta_y = 0$ for all vertices $y$ such that $l(y) = 0$. Let $l(y_1) = l(y_2) = 0$ and $\eta(y_1) = \eta(y_2)$. Then the function $v = \delta_{y_1} - \delta_{y_2}$ satisfies $Jv = 0$. In this way, we obtain infinitely many solutions, all of which vanish at every vertex $x$ with $l(x) \ge 1$.

Essential Self-Adjointness and Defect Indices
The dimension of the defect space $N_z = \{v \in \ell^2(\Gamma) : Jv = zv\}$ is called the defect index. It is known that the defect index is constant for $z$ in the upper half-plane and for $z$ in the lower half-plane. In our case, the defect index is constant on $\mathbb{C} \setminus \mathbb{R}$, as $Jv = zv$ is equivalent to $J\bar{v} = \bar{z}\bar{v}$ (the coefficients of $J$ are real). We refer to [4,6] for the theory of symmetric operators in Hilbert space and their self-adjoint extensions.

Proposition 6 The defect indices of the operator J cannot be greater than 1.
Proof Fix a nonreal number $z$. Assume $J$ is not essentially self-adjoint. Then there exists $0 \ne v \in \ell^2(\Gamma)$ satisfying $Jv = zv$. By Corollary 5, the function $v$ is unique up to a constant multiple. Hence the defect space is one-dimensional.

Proposition 6 implies the following.

Corollary 7 Let $J$ be a Jacobi matrix on $\Gamma$. Fix a nonreal number $z$, and let $v$ denote the unique, up to a constant multiple, nonzero solution of the equation $Jv = zv$. Then $J$ is essentially self-adjoint if and only if $v \notin \ell^2(\Gamma)$.
Theorem 8 There exist Jacobi matrices on $\Gamma$ that are not essentially self-adjoint.
Proof We set $\beta_x \equiv 0$. Fix a nonreal number $z$. Choose an infinite path $\{x_n\}$ in $\Gamma$ with $l(x_n) = n$. We will construct the matrix $J$ by induction on $n$, starting from $v(x_0) = 1$. Assume that we have constructed the coefficients of $J$ on $\Gamma_{x_{n-1}} \setminus \{x_{n-1}\}$ and a nonvanishing function $v$ on $\Gamma_{x_{n-1}}$ such that
$$(Jv)(y) = zv(y), \qquad y \in \Gamma_{x_{n-1}} \setminus \{x_{n-1}\}.$$
We want to extend the definitions of $J$ and $v$ so that the conclusion remains valid when $n-1$ is replaced by $n$.

Our first task is to define $\lambda_{x_{n-1}}$ and $v(x_n)$ so that $(Jv)(x_{n-1}) = zv(x_{n-1})$, i.e.,
$$\lambda_{x_{n-1}} v(x_n) = z v(x_{n-1}) - \sum_{y \in \eta^{-1}(x_{n-1})} \lambda_y v(y). \tag{8}$$
The right-hand side of (8) cannot vanish, by Lemma 2. We define $\lambda_{x_{n-1}}$ and $v(x_n)$ so as to satisfy (8). By specifying $\lambda_{x_{n-1}}$ large enough, we may assume that $|v(x_n)| \le 2^{-n}$.

For any $y \in \eta^{-1}(x_n)$ with $y \ne x_{n-1}$, consider the subtree $\Gamma_y \setminus \{y\}$. Set $\lambda_x = 1$ for any $x \in \Gamma_y \setminus \{y\}$. By Lemma 4, there is a nonzero solution $v_y$ defined on $\Gamma_y$ satisfying
$$(Jv_y)(u) = zv_y(u), \qquad u \in \Gamma_y \setminus \{y\}.$$
By rescaling, we may assume that the norms $\|v_y\|_{\ell^2(\Gamma_y)}$ are so small that their squares sum up to at most $4^{-n}$. We want to define the numbers $\lambda_y$ for $y \in \eta^{-1}(x_n)$, $y \ne x_{n-1}$, so that $(Jv)(y) = zv(y)$. Hence we want to get
$$\lambda_y = \frac{z v_y(y) - \sum_{u \in \eta^{-1}(y)} \lambda_u v_y(u)}{v(x_n)}. \tag{9}$$
By Lemma 2, the numerator in (9) cannot vanish. We may multiply $v_y$ by a constant of absolute value 1 so that the expression on the right-hand side of (9) becomes positive. In this way, the values $\lambda_y$ for $y \in \eta^{-1}(x_n)$, $y \ne x_{n-1}$, are defined. We extend the definition of $v$ to $\Gamma_{x_n}$ by setting $v|_{\Gamma_y} = v_y$ and $v(x_n)$ as above. On the way, we have also extended the definition of $J$ so that
$$(Jv)(y) = zv(y), \qquad y \in \Gamma_{x_n} \setminus \{x_n\}.$$
Moreover, by construction, the norms $\|v|_{\Gamma_{x_n}}\|_{\ell^2}$ are bounded uniformly in $n$. Hence $v \in \ell^2(\Gamma)$, and by Corollary 7 the operator $J$ is not essentially self-adjoint.

Remark The Jacobi matrix $J$ constructed in the proof satisfies $\beta_x \equiv 0$ and $\lambda_x = 1$ for the vertices $x$ whose distance from the path $\{x_n\}$ is greater than 2.
Remark Another way of proving Theorem 8 is as follows. Fix any Jacobi matrix $J_0$ such that the operator $J_0$ is bounded on $\ell^2(\Gamma)$. For example, we may set $\beta_x \equiv 0$ and $\lambda_x = (\#\eta^{-1}(y))^{-1/2}$ whenever $x \in \eta^{-1}(y)$. Let $S$ denote the operator acting according to the rule
$$(Sv)(x) = \lambda_x v(\eta(x)).$$
By $\Gamma_0$ we denote the set of leaves, i.e., vertices with height 0. Then
$$\|Sv\|^2 = \|v\|^2 - \sum_{x \in \Gamma_0} |v(x)|^2.$$
The operator $S$ is thus bounded on $\ell^2(\Gamma)$ with $\|S\| \le 1$. The adjoint operator $S^*$ acts by the rule
$$(S^*v)(x) = \sum_{y \in \eta^{-1}(x)} \lambda_y v(y).$$
Then for $J_0 = S + S^*$, we get $\|J_0\| \le 2$. Fix an infinite path $\{x_n\}$ and a sequence of positive numbers $\{\lambda_n\}$. Let $J_1$ be the degenerate Jacobi matrix defined by $\beta_x \equiv 0$ and $\lambda_{x_n} = \lambda_n$, $\lambda_x = 0$ for $x \notin \{x_n\}$. Choose the coefficients $\lambda_n$ so that the classical Jacobi matrix associated with the coefficients $\lambda_n$ and $\beta_n \equiv 0$ is not essentially self-adjoint; for example, set $\lambda_n = 2^n$. Let $J = J_0 + J_1$. The matrix $J$ is nondegenerate. Moreover, $J$ is not essentially self-adjoint, being a bounded perturbation of an operator that is not essentially self-adjoint ([6], cf. Prop. 8.6 [4]).
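The norm identity for $S$ can be verified on a small fragment. The sketch below is ours (hypothetical data; the fragment has leaves $a, b$ with parent $c$ and $\eta(c) = d$), assuming $v$ vanishes at $d$ and above so that boundary terms drop out:

```python
# Check ||S v||^2 = ||v||^2 - sum over leaves |v(x)|^2 for (Sv)(x) = lam_x v(eta(x)),
# with lam_x = (#eta^{-1}(y))^{-1/2} whenever x is a child of y.

eta = {"a": "c", "b": "c", "c": "d"}

def lam(x):
    """lambda_x = (number of siblings of x, including x) ** (-1/2)."""
    siblings = sum(1 for y, p in eta.items() if p == eta[x])
    return siblings ** -0.5

v = {"a": 1.0, "b": -2.0, "c": 3.0}          # v vanishes at d and beyond

Sv = {x: lam(x) * v.get(eta[x], 0.0) for x in eta}
lhs = sum(val ** 2 for val in Sv.values())
rhs = sum(val ** 2 for val in v.values()) - v["a"] ** 2 - v["b"] ** 2  # subtract leaves
```

Here both sides equal $|v(c)|^2 = 9$: the two children of $c$ carry weight $2^{-1/2}$ each, so their squared contributions sum to $|v(c)|^2$.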
The next theorem provides a relation between Jacobi matrices on the tree and classical Jacobi matrices associated with an infinite path of $\Gamma$.
Theorem 9 Assume a Jacobi matrix $J$ on $\Gamma$ with $\beta_x \equiv 0$ is not essentially self-adjoint. Choose an infinite path $\{x_n\}$ with $l(x_0) = 0$ and $x_n = \eta^n(x_0)$. Then the classical Jacobi matrix $J_0$ with $\lambda_n = \lambda_{x_n}$ and $\beta_n \equiv 0$ is not essentially self-adjoint.
Proof of Theorem 9 By Corollary 5, there exists a nonvanishing function $v$ on $\Gamma$ such that $Jv = iv$ and $v(x_0) = 1$. In view of Corollary 7, the function $v$ is square summable on $\Gamma$. By (10) evaluated at $x = x_n$, we obtain
$$\lambda_{x_n} v(x_{n+1}) + \lambda_{x_{n-1}} v(x_{n-1}) + \sum_{y \in \eta^{-1}(x_n)\setminus\{x_{n-1}\}} \lambda_y v(y) = i\, v(x_n), \qquad n \ge 1.$$
Hence the sequence $\{v(x_n)\}_{n=0}^{\infty}$ is a square summable solution of a perturbed classical recurrence, and the estimate of the remaining sum follows from Lemma 10. Therefore
$$\sum_{n=0}^{\infty} \big( |p_n(i)|^2 + |q_n(i)|^2 \big) < \infty.$$
The last inequality is equivalent to the classical Jacobi matrix $J_0$ with $\lambda_n = \lambda_{x_n}$ and $\beta_n \equiv 0$ not being essentially self-adjoint. Indeed, let $p_n$ and $q_n$ denote the polynomials of the first and the second kind associated with $J_0$; i.e.,
$$x p_n(x) = \lambda_n p_{n+1}(x) + \lambda_{n-1} p_{n-1}(x), \qquad n \ge 0,$$
where $p_{-1} = 0$, $p_0 = 1$, and $q_0 = 0$, $q_1 = 1/\lambda_0$. In view of [1, Problem 10, p. 84] or [5, Thm. 3 (i), (iv)], the matrix $J_0$ is not essentially self-adjoint.
Fix a vertex $x_0$ with $l(x_0) = 0$, and let $x_1 = \eta(x_0)$. Assume also that
$$\#\eta^{-1}(x_1) \ge 2,$$
i.e., that $x_1$ has at least two children, i.e., $x_0$ has a sibling. By Corollary 5 applied to $\Gamma$, there exists a nonvanishing function $v_z$ on $\Gamma$ satisfying
$$Jv_z = z v_z. \tag{13}$$
Similarly, Corollary 5 applied to $\Gamma \setminus \{x_0\}$ implies the existence of a nonvanishing function $u_z$ on $\Gamma \setminus \{x_0\}$ such that
$$(Ju_z)(y) = z u_z(y), \qquad y \in \Gamma \setminus \{x_0\}. \tag{14}$$
We extend the definition of the function $u_z$ to $\Gamma$ by setting $u_z(x_0) = 0$. The functions $v_z$ and $u_z$ satisfying (13) and (14) will be called the solution and the associated solution of the equation
$$Jv = zv. \tag{15}$$
By Corollary 5, every solution of (15) is a linear combination of $v_z$ and $u_z$. The following lemma is straightforward but useful.

The operator $\tilde{A}$ is essentially self-adjoint if and only if $A$ is essentially self-adjoint.
Theorem 12 Assume $J$ is not essentially self-adjoint. Fix a vertex $x_0$ with $l(x_0) = 0$ such that $x_0$ has a sibling, and fix a nonreal number $z$. Then the associated solution $u_z$ is square summable on $\Gamma$.
Proof Let $H_0 = \mathbb{C}\delta_{x_0}$ and $\tilde{\Gamma} = \Gamma \setminus \{x_0\}$. The operator $\tilde{J}$ acts on $\ell^2(\tilde{\Gamma})$ and is not essentially self-adjoint by Lemma 11. Moreover, if $\tilde{u}_z$ denotes the truncation of $u_z$ to $\tilde{\Gamma}$, we have
$$\tilde{J}\tilde{u}_z = z\tilde{u}_z.$$
By Corollary 3 applied to $\tilde{\Gamma}$, we know that $\tilde{u}_z$ cannot vanish. Since $\tilde{J}$ is not essentially self-adjoint, there exists a function $0 \ne \tilde{v} \in \ell^2(\tilde{\Gamma})$ such that
$$\tilde{J}\tilde{v} = z\tilde{v}.$$
By Lemma 4, applied to $\tilde{\Gamma}$, we get that $\tilde{u}_z(x) = c\tilde{v}(x)$ for $x \in \tilde{\Gamma}$. Hence $\tilde{u}_z$ is square summable, which implies the conclusion.
Corollary 7 and Theorem 12 imply:

Corollary 13 Assume a Jacobi matrix $J$ on $\Gamma$ is not essentially self-adjoint. Fix a vertex $x_0$ with $l(x_0) = 0$ such that $x_0$ has a sibling. Then for any nonreal number $z$, every solution of the equation
$$(Jv)(y) = zv(y), \qquad y \in \Gamma \setminus \{x_0\},$$
is square summable on $\Gamma$.

Fix a vertex $x_0$, $l(x_0) = 0$, and remove from $\Gamma$ all the edges of the infinite path $\{x_n\}_{n=0}^{\infty}$, where $x_n = \eta^n(x_0)$. Note that we do not remove the vertices $x_n$, $n \ge 0$. In this way the tree splits into an infinite number of finite subtrees of the form
$$\Delta_n := \Gamma_{x_n} \setminus \Gamma_{x_{n-1}}.$$
In other words, $\Delta_n$ consists of $x_n$ and all its descendants with the exception of $x_{n-1}$ and its descendants.
Lemma 14 Let $x \in \Delta_n$ for some $n \ge 1$. Then
$$v_z(x_n) u_z(x) = u_z(x_n) v_z(x).$$

Proof By Lemma 1, we know that $v_z$ and $u_z$ cannot vanish. Both functions satisfy
$$(Jw)(y) = zw(y), \qquad y \in \Delta_n \setminus \{x_n\},$$
so by the uniqueness part of Lemma 4, applied to $\Delta_n$, they are proportional on $\Delta_n$, and the conclusion follows.

Proposition 15 For the solution $v_z$ and the associated solution $u_z$, we have
$$\lambda_{x_n}\big( u_z(x_n) v_z(x_{n+1}) - v_z(x_n) u_z(x_{n+1}) \big) = -\lambda_{x_0} v_z(x_0) u_z(x_1), \qquad n \ge 0.$$

Proof By (13) and (14), we get, for $n \ge 1$,
$$\lambda_{x_n} v_z(x_{n+1}) + \beta_{x_n} v_z(x_n) + \sum_{y \in \eta^{-1}(x_n)} \lambda_y v_z(y) = z v_z(x_n),$$
$$\lambda_{x_n} u_z(x_{n+1}) + \beta_{x_n} u_z(x_n) + \sum_{y \in \eta^{-1}(x_n)} \lambda_y u_z(y) = z u_z(x_n).$$
Observe that $\eta^{-1}(x_n) \setminus \{x_{n-1}\} \subset \Delta_n$. Hence Lemma 14 implies
$$v_z(x_n) u_z(y) = u_z(x_n) v_z(y), \qquad y \in \eta^{-1}(x_n) \setminus \{x_{n-1}\}.$$
Now, multiplying the equations by $u_z(x_n)$ and $v_z(x_n)$, respectively, and subtracting sidewise leads to
$$\lambda_{x_n}\big( u_z(x_n) v_z(x_{n+1}) - v_z(x_n) u_z(x_{n+1}) \big) = \lambda_{x_{n-1}}\big( u_z(x_{n-1}) v_z(x_n) - v_z(x_{n-1}) u_z(x_n) \big).$$
The conclusion follows as $u_z(x_0) = 0$.

The following theorem provides a natural analog of the Carleman criterion for essential self-adjointness.
Theorem 16 Let $J$ be a Jacobi matrix associated with the coefficients $\lambda_x$ and $\beta_x$. Let $\{x_n\}$ denote any infinite path such that $l(x_0) = 0$ and $x_n = \eta^n(x_0)$. Assume
$$\sum_{n=0}^{\infty} \frac{1}{\lambda_{x_n}} = \infty.$$
Then the operator $J$ is essentially self-adjoint.
Proof Assume first that $x_0$ has a sibling. Then the result follows by a standard argument from Corollary 13 and Proposition 15. Indeed, if $J$ were not essentially self-adjoint, then the functions $v_z$ and $u_z$ would be square summable for any nonreal $z$. Thus, by Proposition 15 combined with the Cauchy-Schwarz inequality, the series $\sum \lambda_{x_n}^{-1}$ would be summable as well, a contradiction.

Assume now that $x_0$ has no siblings. Set $x_1 = \eta(x_0)$, and consider the tree $\Gamma' = \Gamma \cup \{x_0'\}$ augmented by one vertex $x_0'$, so that $\eta(x_0') = x_1$. Let $J'$ denote the Jacobi matrix on $\Gamma'$ with coefficients $\lambda'_x$, $\beta'_x$ defined by
$$\lambda'_x = \lambda_x, \quad \beta'_x = \beta_x \quad (x \in \Gamma),$$
with arbitrary values at the new vertex, for instance $\lambda'_{x_0'} = 1$, $\beta'_{x_0'} = 0$. We have $\lambda'_{x_n} = \lambda_{x_n}$. By the first part of the proof, the operator $J'$ is essentially self-adjoint. However, $J'$ is a one-dimensional extension of the operator $J$. Hence, in view of Lemma 11, the operator $J$ is essentially self-adjoint.
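The Carleman-type condition of Theorem 16 is easy to probe numerically. An informal sketch (our own helper, not from the paper), contrasting linearly and quadratically growing path coefficients:

```python
def carleman_partial_sum(lam, N):
    """Partial sum sum_{n=0}^{N-1} 1/lam(n) of the series in Theorem 16."""
    return sum(1.0 / lam(n) for n in range(N))

# lam_n = n + 1: the harmonic series diverges, so Theorem 16 yields
# essential self-adjointness for such path coefficients.
s_linear = carleman_partial_sum(lambda n: n + 1, 10_000)

# lam_n = (n + 1)**2: the series converges (to pi^2/6 ~ 1.6449), so the
# criterion is inconclusive; cf. the example lambda_n = (n+1)^2 in the Introduction.
s_quad = carleman_partial_sum(lambda n: (n + 1) ** 2, 10_000)
```

The partial sums make the dichotomy visible: the first grows without bound, while the second stays below $\pi^2/6$.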

Remark The assumption does not depend on the choice of the infinite path, as any two such paths coincide from a certain vertex on; hence the corresponding series differ in finitely many terms only.