Inhomogeneous Jacobi matrices on trees

We study Jacobi matrices on trees with one end at infinity. We show that the defect indices cannot be greater than 1 and give criteria for essential selfadjointness. We construct certain polynomials associated with these matrices, which mimic the orthogonal polynomials of the classical case. Nonnegativity of Jacobi matrices is studied as well.


Introduction
The aim of the paper is to study a special class of symmetric unbounded operators and their spectral properties. These are Jacobi operators defined on a one-sided tree. They are immediate generalizations of classical Jacobi matrices, which act on sequences {u(n)}_{n=0}^∞ by the rule

Ju(n) = λ_n u(n + 1) + β_n u(n) + λ_{n−1} u(n − 1), n ≥ 0,

where {λ_n}_{n=0}^∞ and {β_n}_{n=0}^∞ are sequences of positive and real numbers, respectively, with the convention u(−1) = λ_{−1} = 0. These matrices are closely related to the set of polynomials defined recursively by

(1) x p_n(x) = λ_n p_{n+1}(x) + β_n p_n(x) + λ_{n−1} p_{n−1}(x), n ≥ 0,

with p_{−1} = 0, p_0 = 1.
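The recurrence (1) is easy to evaluate numerically. The following minimal Python sketch (the coefficient choices in the usage line are illustrative, not taken from the paper) generates p_0(z), ..., p_N(z):

```python
def jacobi_polys(z, lams, betas, N):
    """Evaluate p_0(z), ..., p_N(z) from the three-term recurrence
    z*p_n = lam_n*p_{n+1} + beta_n*p_n + lam_{n-1}*p_{n-1},
    with the convention p_{-1} = 0, p_0 = 1 and lam_{-1} = 0."""
    p = [1.0]
    p_prev = 0.0  # plays the role of p_{-1}
    for n in range(N):
        lam_prev = lams[n - 1] if n >= 1 else 0.0
        p_next = ((z - betas[n]) * p[n] - lam_prev * p_prev) / lams[n]
        p_prev = p[n]
        p.append(p_next)
    return p

# Illustration: lam_n = 1, beta_n = 0 gives (rescaled) Chebyshev-like
# polynomials of the second kind.
print(jacobi_polys(0.0, [1.0] * 6, [0.0] * 6, 5))
```

Solving the recurrence for p_{n+1} requires λ_n > 0, which is exactly the standing positivity assumption on the coefficients.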
In case the coefficients of the matrix are bounded, the matrix J represents a selfadjoint operator on ℓ²(N_0). If E(x) denotes the resolution of the identity associated to J, then the polynomials p_n(x) are orthonormal with respect to the measure dµ(x) = d(E(x)δ_0, δ_0), where δ_0 is the sequence taking value 1 at n = 0 and vanishing elsewhere, and (u, v) denotes the standard inner product in ℓ²(N_0). The measure µ has bounded support.
When the coefficients are unbounded the operator J is well defined on the domain D(J) consisting of sequences with finitely many nonzero terms. In that case, if this operator is essentially selfadjoint then again the polynomials p_n are orthonormal with respect to the measure dµ(x) = d(E(x)δ_0, δ_0), except that this measure has unbounded support. Moreover there is a unique orthogonality measure for the polynomials p_n. By a classical theorem, if the operator J is not essentially selfadjoint, there are many measures µ on the real line so that the polynomials belong to L²(µ), i.e. ∫_{−∞}^{∞} x^{2n} dµ(x) < ∞, and the polynomials p_n are orthogonal with respect to the inner product of L²(µ). Therefore essential selfadjointness is a crucial property that distinguishes between the so called determinate and indeterminate cases. Intuitively the unbounded matrix J is essentially selfadjoint when the coefficients have moderate growth. But the converse is not true in general. For the classical theory of Jacobi matrices, orthogonal polynomials and moment problems we refer the reader to [1], [2], [5], and to [4] for a modern treatment.
In a recent paper [3] homogeneous Jacobi matrices on one-sided homogeneous trees were studied. Two types of homogeneous trees were considered. One of them was the tree with infinitely many origin points (on level 0) and one end at infinity. The tree Γ consists of vertices on levels from zero to infinity. Every vertex x on level n is connected with a unique vertex x′ on level n + 1 and with d vertices x_1, . . . , x_d on level n − 1 for n ≥ 1, as in the figure below. The Jacobi matrices were defined on ℓ²(Γ), where Γ denotes the set of all vertices of the tree. The formula is as follows (when the degree equals d + 1):

Jv(x) = λ_n v(x′) + β_n v(x) + λ_{n−1} [v(x_1) + . . . + v(x_d)],

where n denotes the level of the vertex x counting from above.
An interesting phenomenon occurred. It turned out that the operator J, defined on functions {v(x)}_{x∈Γ} with finitely many nonzero terms, is always essentially selfadjoint, regardless of the growth of the coefficients λ_n and β_n. For example the operator J with coefficients λ_n = (n + 1)² and β_n = 0 is not essentially selfadjoint when considered as the classical Jacobi matrix on ℓ²(N_0). But it is essentially selfadjoint when it acts on ℓ²(Γ).
Moreover its spectrum is discrete and consists of the zeros of all the polynomials p_n associated with the classical Jacobi matrix with coefficients √d λ_n and β_n, i.e. satisfying

x p_n(x) = √d λ_n p_{n+1}(x) + β_n p_n(x) + √d λ_{n−1} p_{n−1}(x).

Our aim is to study inhomogeneous Jacobi matrices on that tree. This means we do not require that the coefficients of the matrix are constant on each level of the tree. With every vertex x we associate a positive number λ_x and a real number β_x. We are going to study operators of the form

Jv(x) = λ_x v(x′) + β_x v(x) + λ_{x_1} v(x_1) + . . . + λ_{x_d} v(x_d).

One of the main differences between the classical case and the case of the tree Γ is that the eigenvalue equation

(2) Jv = zv

cannot be solved recursively, unlike the classical equation zu(n) = λ_n u(n + 1) + β_n u(n) + λ_{n−1} u(n − 1). This is not a coincidence, as we are going to show that the equation (2) may not admit nonzero solutions for real values of z (cf. Proposition 5). But we will show the equation has a nonzero solution for every nonreal z (Corollary 3).
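For orientation, recall the classical Carleman criterion: if Σ_n λ_n^{−1} = ∞ then the classical Jacobi matrix is essentially selfadjoint. The short Python check below (purely illustrative) confirms that for the coefficients λ_n = (n + 1)² mentioned above the series converges, so the criterion does not apply, which is consistent with the indeterminacy of that classical example:

```python
import math

def carleman_partial_sum(lam, N):
    """Partial sum of the Carleman series sum_n 1/lam(n)."""
    return sum(1.0 / lam(n) for n in range(N))

# For lam_n = (n+1)^2 the series converges (its sum is pi^2/6), so the
# classical Carleman test is inconclusive for this matrix.
s = carleman_partial_sum(lambda n: (n + 1) ** 2, 100000)
print(s, math.pi ** 2 / 6)
```

By contrast, for linearly growing coefficients such as λ_n = n + 1 the partial sums grow without bound and the criterion does apply.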
Actually, when we give up homogeneity of the matrix J, we can as well give up homogeneity of the tree. Again x′ is the only vertex one level up connected with x, and x_1, x_2, ..., x_d are all the vertices one level down connected with x. The number d may now vary if we do not assume homogeneity of the tree.
The operator J is symmetric on ℓ²(Γ) with respect to the natural inner product (u, v) = Σ_{x∈Γ} u(x)\overline{v(x)}. We are interested in studying the essential selfadjointness of the matrix J. It turns out that, unlike in the homogeneous case, the matrix J may not be essentially selfadjoint. However the defect indices cannot be greater than 1 (Corollary 3). We derive certain criteria assuring essential selfadjointness. For example an analog of the Carleman condition holds (see Theorem 6). Moreover we relate essential selfadjointness of J to essential selfadjointness of the classical Jacobi matrix J_0 obtained from J by restriction to an infinite path of the tree (see Theorem 4 and the Remark following it). Classical Jacobi matrices are associated with orthogonal polynomials through the formula (1). In the case of the tree Γ there is no natural way of defining polynomials associated with Jacobi matrices on Γ, since (as was mentioned above) the eigenvalue equation may not be solvable. In Section 3 we define certain polynomials associated with J. We prove that they have real and simple zeros. Also we show an interlacing property for the roots of two consecutive polynomials. We also prove that the zeros of these polynomials describe the spectrum of restrictions of J to finite subtrees of Γ. However, unlike in the classical case, there is no natural orthogonality relation between these polynomials.
In Section 5 we give a criterion for nonnegativity of the Jacobi matrix J on Γ. In the classical case the Jacobi matrix J is positive definite if and only if (−1)^n p_n(0) > 0 for every n, where p_n are the orthogonal polynomials associated with J. In the case of the tree Γ we do not have solutions of the eigenvalue problem, or orthogonal polynomials, at our disposal. Therefore we had to find another way of obtaining the result. The nonnegativity of the matrix J proved to be a useful tool in the construction of a Jacobi matrix on Γ for which the eigenvalue equation (2) does not admit solutions for some real values of z.

Definitions and basic properties
We will consider a tree Γ with one end at infinity. Its vertices are located on levels from zero to infinity. Every vertex x on level k ≥ 0 is connected with a unique vertex x′ on level k + 1. Moreover, when k ≥ 1 the vertex x is connected with a finite number of vertices y on level k − 1; this set will be denoted by N_x. Let ℓ(x) denote the level of the vertex x.
For a given vertex x let Γ_x denote the finite subtree containing the vertex x together with the vertices y such that ℓ(y) < ℓ(x) which are connected with x by a path. Set Γ′_x = Γ_x ∪ {x′}.
Define F(Γ) to be the set of all complex valued functions with finite support on Γ. Consider the operator J acting on F(Γ) according to the rule

Jv(x) = λ_x v(x′) + β_x v(x) + Σ_{y∈N_x} λ_y v(y),

where the λ_x are positive constants while the β_x are real ones. Let S be the operator acting by the rule Sv(x) = λ_x v(x′). Then the adjoint operator S* is given by S*v(x) = Σ_{y∈N_x} λ_y v(y). Let M be the multiplication operator defined by Mv(x) = β_x v(x). Then J = S + S* + M; in particular J is a symmetric linear operator. We will study formal eigenfunctions of the operator J, i.e. functions v defined on Γ satisfying Jv = zv. Evaluation at the vertex x gives

zv(x) = λ_x v(x′) + β_x v(x) + Σ_{y∈N_x} λ_y v(y);

equivalently we have the recurrence relation

(6) v(x′) = λ_x^{−1} [ (z − β_x) v(x) − Σ_{y∈N_x} λ_y v(y) ].

In order to simplify the notation set N_x = ∅ if ℓ(x) = 0; then (6) holds for every x ∈ Γ.

Lemma 1. Let z ∉ ℝ and let v be a nonzero function on Γ′_x such that Jv(y) = zv(y) for y ∈ Γ_x. Then v(x′) ≠ 0.
Proof. Assume for a contradiction that v(x′) = 0. Let J_x denote the finite Jacobi matrix defined on Γ_x obtained from J by truncation, namely J_x = P_x J P_x, where P_x denotes the orthogonal projection from ℓ²(Γ) onto ℓ²(Γ_x). Let w denote the truncation of v to Γ_x. Since v(x′) = 0 we have J_x w = zw. Moreover w ≠ 0. Therefore z must be a real number, as J_x is a finite dimensional symmetric linear operator, which contradicts z ∉ ℝ.
Lemma 2. Let z ∉ ℝ and let v be a nonzero function on Γ_x satisfying Jv(y) = zv(y) for y ∈ Γ_x \ {x}. Then (z − β_x) v(x) − Σ_{y∈N_x} λ_y v(y) ≠ 0.

Proof. Assume for a contradiction that (z − β_x) v(x) − Σ_{y∈N_x} λ_y v(y) = 0. Define the function u ∈ F(Γ′_x) by setting u(y) = v(y) for y ∈ Γ_x and u(x′) = 0. Then Ju(y) = zu(y) for y ∈ Γ_x. In view of Lemma 1 we get a contradiction.

Corollary 1. Let z ∉ ℝ and let v be a nonzero function on Γ satisfying Jv = zv. Then v does not vanish at any vertex of Γ.

Proof. Assume for a contradiction that v(x) = 0 for a vertex x. By Lemma 1 we get that the function v vanishes identically on Γ_x. From the recurrence relation we get v(x′) = 0. Therefore v vanishes identically on Γ_{x′}. Applying the same procedure infinitely many times we conclude that v vanishes at every vertex of Γ.
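To make the action of J concrete, here is a small self-contained Python model which implements the rule Jv(x) = λ_x v(x′) + β_x v(x) + Σ_{y∈N_x} λ_y v(y) on a finite subtree, dropping the λ_x v(x′) term at the top vertex (this corresponds to the truncation J_x used above), and checks the symmetry (Ju, v) = (u, Jv). The tree, the coefficients and the vectors are arbitrary illustrative choices, not from the paper:

```python
# Toy finite tree given by a parent map; "r" is the top vertex.
parent = {"a": "r", "b": "r", "c": "a", "d": "a"}
vertices = ["r", "a", "b", "c", "d"]
children = {x: [y for y, p in parent.items() if p == x] for x in vertices}
lam = {"r": 1.0, "a": 2.0, "b": 0.5, "c": 1.5, "d": 3.0}
beta = {"r": 0.0, "a": 1.0, "b": -1.0, "c": 2.0, "d": 0.0}

def J(v):
    """Truncated tree Jacobi operator: the parent term is omitted at the
    top vertex, exactly as in the truncation J_x = P_x J P_x."""
    out = {}
    for x in vertices:
        val = beta[x] * v[x] + sum(lam[y] * v[y] for y in children[x])
        if x in parent:                      # lam_x * v(x') below the top
            val += lam[x] * v[parent[x]]
        out[x] = val
    return out

def inner(u, v):
    return sum(u[x] * v[x] for x in vertices)

u = {"r": 1.0, "a": -2.0, "b": 0.0, "c": 3.0, "d": 1.0}
v = {"r": 0.5, "a": 1.0, "b": 2.0, "c": -1.0, "d": 0.0}
# Symmetry of the truncated operator on real vectors: (Ju, v) == (u, Jv).
print(abs(inner(J(u), v) - inner(u, J(v))) < 1e-9)
```

The symmetry holds because each edge (x, x′) contributes the same weight λ_x to both matrix entries, which is the matrix form of the identity J = S + S* + M.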

Lemma 3. For any nonreal number z and any vertex x_0 there exists a function v on Γ_{x_0} such that Jv(x) = zv(x) for x ∈ Γ_{x_0} \ {x_0}. Moreover the function v cannot vanish and is unique up to a constant multiple.
Proof. We will prove Lemma 3 by induction on the level ℓ(x_0). Assume ℓ(x_0) = 1. Set v(x_0) = 1. Let x ∈ N_{x_0}. Then ℓ(x) = 0. We want to have

(8) zv(x) = λ_x v(x_0) + β_x v(x).

Thus we may set

v(x) = λ_x v(x_0) / (z − β_x).
In this way (8) is fulfilled. Assume the conclusion is true for all vertices on the level n. Let ℓ(x_0) = n + 1 and N_{x_0} = {y_1, y_2, . . . , y_k}. By the induction hypothesis, for every i there is a nonzero function v_i on Γ_{y_i} satisfying Jv_i(x) = zv_i(x) for x ∈ Γ_{y_i} \ {y_i}. We are going to define the function v on Γ_{x_0} in the following way: set v = v_1 on Γ_{y_1}, v = c_i v_i on Γ_{y_i} for i = 2, . . . , k, and choose a value v(x_0). In order to conclude the proof we must show that Jv(y_i) = zv(y_i) for i = 1, . . . , k. Thus we want to have

(9) λ_{y_i} v(x_0) = c_i [ (z − β_{y_i}) v_i(y_i) − Σ_{y∈N_{y_i}} λ_y v_i(y) ], i = 1, 2, . . . , k,

where c_1 = 1. The expression in the brackets on the right hand side is nonzero for every i = 1, 2, . . . , k by Lemma 2. Therefore (9) is satisfied for an appropriate choice of the value v(x_0) and nonzero constants c_2, c_3, . . . , c_k. By Lemma 1 the function v cannot vanish. Moreover if there were another function ṽ satisfying the conclusion of Lemma 3, then v − cṽ would also satisfy the conclusion and would vanish for an appropriate choice of the constant c. Thus v = cṽ.
By the proof of Lemma 3 we get the following.

Corollary 2. For any nonreal number z there exists a nonzero function v so that Jv(x) = zv(x) for every x ∈ Γ. The function v cannot vanish and is unique up to a constant multiple.
Remark. In Section 5 we are going to show that the conclusion of Corollary 2 may fail for real numbers z. Observe that for classical Jacobi matrices (when Γ = N_0) the recurrence relation

zv(n) = λ_n v(n + 1) + β_n v(n) + λ_{n−1} v(n − 1), λ_{−1} = 0,

has nonzero solutions for any z ∈ C.

Polynomials and zeros
The classical Jacobi matrices are related to orthogonal polynomials. Namely, setting v(0) = 1 in (10) gives v(n) = p_n(z), where p_n is a polynomial of degree n with real coefficients. The question arises whether Jacobi matrices on trees are connected to polynomials as well. In general we cannot expect that a solution of Jv = zv will satisfy v(t) = P_t(z), with P_t a polynomial, for every t ∈ Γ. But we may expect that P_t(z) is a polynomial for t in a subtree Γ_x, for some x ∈ Γ.
Proposition 1. Let x ∈ Γ. There exists a nonzero solution v_x of the equation Jv_x(t) = zv_x(t), t ∈ Γ_x, such that for every t ∈ Γ′_x the value P_{x,t}(z) := v_x(t) is a polynomial with real coefficients and positive leading coefficient. Moreover if t ∈ Γ′_y ⊂ Γ′_x then the polynomial P_{x,t} is divisible by P_{y,t}.

Proof. We will use induction on the level ℓ(x). Let ℓ(x) = 0. Set v_x(x) = 1. By the recurrence relation v_x(x′) = λ_x^{−1}(z − β_x). Hence P_{x,x}(z) = 1 and P_{x,x′}(z) = λ_x^{−1}(z − β_x) have real coefficients and positive leading coefficients. Assume now the conclusion is valid for vertices on the level n. Let ℓ(x) = n + 1. By the induction hypothesis, for any y ∈ N_x there is a nonzero solution v_y so that P_{y,t}(z) is a polynomial with real coefficients for t ∈ Γ′_y. In particular the polynomial v_y(x) = P_{y,x}(z) has real coefficients. Moreover by Lemma 1 the polynomial P_{y,x}(z) cannot vanish for z ∉ ℝ. Fix y_1 ∈ N_x and let

(11) v_x = ( Π_{u∈N_x, u≠y} P_{u,x}(z) ) v_y on Γ′_y, for y = y_1.

Since the value v_x(x) determines the solution, the function v_x does not depend on the choice of y_1 ∈ N_x. Thus the formula (11) and the above reasoning are valid for any choice of y ∈ N_x. Hence

(12) P_{x,x}(z) = Π_{y∈N_x} P_{y,x}(z),

(13) P_{x,t}(z) = ( Π_{u∈N_x, u≠y} P_{u,x}(z) ) P_{y,t}(z), t ∈ Γ′_y.

By (13) and by the induction hypothesis the functions P_{x,t} are polynomials with real coefficients and positive leading coefficients, and P_{x,t} is divisible by P_{y,t} for t ∈ Γ′_y. This implies the last part of the conclusion.

Remarks
The formulas (12) and (13) imply that for y ∈ N_x and t ∈ Γ′_y we have

(14) P_{x,t}(z) = P_{x,y}(z) P_{y,t}(z) / P_{y,y}(z).

Let y ∈ Γ_x. Then y and x are connected in Γ_x by a path y = y_0, y_1, . . . , y_n = x. By iterating (14) we get

P_{x,y}(z) = [P_{y_n,y_n}(z)/P_{y_{n−1},y_n}(z)] · [P_{y_{n−1},y_{n−1}}(z)/P_{y_{n−2},y_{n−1}}(z)] · . . . · [P_{y_1,y_1}(z)/P_{y_0,y_1}(z)] · P_{y_0,y_0}(z).

These formulas and (12) imply that the polynomial P_{x,y}(z) can be described in terms of the polynomials of the form P_{t,t′}(z) for t ∈ Γ_x.
where a n,x (z) and b n (z) are polynomials with real coefficients. Moreover the polynomial b n+1 is divisible by b n .
Proof. Consider the subtree Γ_{x_n}. Let x ∈ Γ_{x_n}. By Proposition 1 there is a solution v_n so that v_n(x) and v_n(x_0) are polynomials with real coefficients. Then the conclusion follows.

Theorem 1. The polynomials P_{x,t}, t ∈ Γ′_x, have only real zeros. Moreover for any x ∈ Γ the zeros of P_{x,x} and P_{x,x′} are simple, and the zeros of P_{x,x} interlace with the zeros of P_{x,x′}, i.e. if x_1 < x_2 < . . . < x_n denote the zeros of P_{x,x′}, then P_{x,x} has n − 1 zeros y_1 < y_2 < . . . < y_{n−1} and x_i < y_i < x_{i+1}.

Proof. We will use induction on ℓ(x). Let ℓ(x) = 0. Then P_{x,x} = 1 and P_{x,x′}(z) = λ_x^{−1}(z − β_x), so the conclusion holds. Let ℓ(x) = n. By the recurrence relation we have

(15) λ_x P_{x,x′}(z) = (z − β_x) P_{x,x}(z) − Σ_{j=1}^{k} λ_{y_j} P_{x,y_j}(z),

where N_x = {y_1, y_2, . . . , y_k}. By (14), with t = y_j, we get

(16) P_{x,y_j}(z) = P_{x,x}(z) P_{y_j,y_j}(z) / P_{y_j,x}(z).

By the induction hypothesis the zeros of P_{y_j,y_j}(z) are real and simple and interlace with the zeros of P_{y_j,x}(z) for any j. This implies deg P_{y_j,x} = deg P_{y_j,y_j} + 1.
In view of (15) and (16) we get

λ_x P_{x,x′}(z) = (z − β_x) P_{x,x}(z) − Σ_{j=1}^{k} λ_{y_j} P_{x,x}(z) P_{y_j,y_j}(z) / P_{y_j,x}(z).

Let r be a root of P_{x,x}(z). We are going to study the sign of P_{x,x′}(r), making use of (15). If P_{y_j,x}(r) ≠ 0, then (16) implies P_{x,y_j}(r) = 0. But since P_{x,x}(r) = 0, we have P_{y_{j_0},x}(r) = 0 for some j_0, by (12). Consider the quantity

P_{x,x}(r + ε) P_{y_{j_0},y_{j_0}}(r + ε) / P_{y_{j_0},x}(r + ε),

where ε > 0 is infinitesimally small. We have

P_{y_{j_0},y_{j_0}}(r + ε) / P_{y_{j_0},x}(r + ε) > 0,

as the polynomials P_{y_{j_0},y_{j_0}}(z) and P_{y_{j_0},x}(z) have the same number of roots to the right of r + ε, by the induction hypothesis and by the fact that the leading coefficients are positive. Consider the limit

(17) P_{x,y_{j_0}}(r) = lim_{ε→0+} P_{x,x}(r + ε) P_{y_{j_0},y_{j_0}}(r + ε) / P_{y_{j_0},x}(r + ε).

The polynomials P_{y,x}, for y ∈ N_x, have simple roots by the induction hypothesis. Thus the limit on the right hand side of (17) is nonzero, in view of (12). Since P_{y_{j_0},y_{j_0}}(r) ≠ 0 (by the induction hypothesis) we get that P_{x,y_{j_0}}(r) ≠ 0. Hence the sign of the limit is determined by the sign of P_{x,x}(r + ε). By plugging z = r into (15) we get that P_{x,x′}(r) and P_{x,x}(r + ε) have opposite signs.
Consider now two consecutive roots r 1 < r 2 of P x,x (z). The signs of P x,x (r 1 + ε) and P x,x (r 2 + ε) are opposite. Therefore the signs of P x,x ′ (r 1 ) and P x,x ′ (r 2 ) are also opposite. Thus P x,x ′ (z) must vanish in the interval (r 1 , r 2 ).
Assume now that r is the largest root of P x,x (z). Then P x,x (r +ε) > 0 for small positive ε. By the above reasoning we have P x,x ′ (r) < 0, which means that P x,x ′ must vanish somewhere to the right of r, as the leading coefficient is positive. Similarly if r is the smallest root of P x,x (z) then the signs of P x,x ′ (r) and P x,x (r + ε) are opposite. But since the degree of P x,x ′ is by one greater than the degree of P x,x (z) and the leading coefficients are positive, we get that P x,x ′ must vanish below r.
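In the classical case Γ = N_0 the interlacing just proved can be observed numerically. In the sketch below, D_k(x) = det(xI − J_k) plays the role of the polynomials above; the coefficients λ_n = 1, β_n = 0 are chosen for illustration because then the zeros of D_n are known in closed form (D_n is a rescaled Chebyshev polynomial of the second kind), and the sign of D_{n−1} alternates at consecutive zeros of D_n, forcing a zero of D_{n−1} between them:

```python
import math

def char_poly(x, lams, betas, k):
    """D_k(x) = det(x*I - J_k) for the k x k top corner of a classical
    Jacobi matrix, via D_k = (x - beta_{k-1})*D_{k-1} - lam_{k-2}^2*D_{k-2},
    with D_{-1} = 0, D_0 = 1."""
    d_prev, d = 0.0, 1.0
    for j in range(k):
        lam2 = lams[j - 1] ** 2 if j >= 1 else 0.0
        d_prev, d = d, (x - betas[j]) * d - lam2 * d_prev
    return d

n = 6
lams, betas = [1.0] * n, [0.0] * n
# With lam = 1, beta = 0 the zeros of D_n are 2*cos(k*pi/(n+1)).
roots_n = sorted(2 * math.cos(k * math.pi / (n + 1)) for k in range(1, n + 1))
signs = [char_poly(r, lams, betas, n - 1) > 0 for r in roots_n]
alternates = all(signs[i] != signs[i + 1] for i in range(len(signs) - 1))
print(alternates)
```

Strict sign alternation of D_{n−1} at consecutive zeros of D_n yields, by the intermediate value theorem, exactly the interlacing pattern established for P_{x,x} and P_{x,x′}.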
For x ∈ Γ let J_x denote the truncation of the Jacobi matrix J to the subtree Γ_x, i.e. J_x = P_x J P_x, where P_x is the orthogonal projection from ℓ²(Γ) onto ℓ²(Γ_x).

Theorem 2. Let r belong to the spectrum of J_x. Then r satisfies at least one of the two conditions:
(a) P_{x,x′}(r) = 0;
(b) there exist y ∈ Γ_x and y_1, y_2 ∈ N_y such that P_{y_1,y}(r) = P_{y_2,y}(r) = 0.
Proof. First we will show that the numbers described in the theorem belong to the spectrum of J_x. Assume P_{x,x′}(r) = 0. By Theorem 1 we have P_{x,x}(r) ≠ 0. By Lemma 2, for any nonreal z there is a solution v_x of the equation Jv_x = zv_x so that v_x(y) = P_{x,y}(z) for y ∈ Γ′_x. Let

u(y) = lim_{ε→0} P_{x,y}(r + iε) = P_{x,y}(r), y ∈ Γ_x.

Then u satisfies J_x u = ru. Moreover u is nonzero as u(x) = P_{x,x}(r) ≠ 0.
Assume now that there exist y ∈ Γ_x and y_1, y_2 ∈ N_y so that P_{y_1,y}(r) = P_{y_2,y}(r) = 0. By the above reasoning there are two nonzero solutions u_1, u_2, defined on Γ_{y_1}, Γ_{y_2}, respectively, of the equations J_{y_1}u_1 = ru_1 and J_{y_2}u_2 = ru_2, with u_1(y_1) ≠ 0 and u_2(y_2) ≠ 0. Consider the function u_{y_1,y_2} defined on Γ_y by u_{y_1,y_2} = u_1 on Γ_{y_1}, u_{y_1,y_2} = cu_2 on Γ_{y_2} and u_{y_1,y_2} = 0 elsewhere, where the constant c is chosen so that λ_{y_1}u_1(y_1) + cλ_{y_2}u_2(y_2) = 0. Then u_{y_1,y_2} ≠ 0 and J_x u_{y_1,y_2} = Ju_{y_1,y_2} = ru_{y_1,y_2}.
Then the eigenvectors u y 1 ,y 2 , . . . , u y 1 ,yn are linearly independent, as the support of u y 1 ,y i coincides with Γ y 1 ∪Γ y i . Hence the dimension of the space spanned by these eigenvectors is at least n − 1.
In the previous part of the proof we have constructed eigenvectors corresponding to the set of numbers described in the theorem. We will calculate the dimension of the space spanned by these eigenvectors. The proof will be complete if this dimension coincides with the dimension of the space ℓ²(Γ_x), i.e. with #Γ_x. We will use induction with respect to ℓ(x). Assume the conclusion is valid for ℓ(x) = n. Let ℓ(x) = n + 1. Denote N_x = {y_1, y_2, . . . , y_k}. Let n_j = deg P_{y_j,x}. Every eigenvector of J_{y_j} corresponding to the case (b) is an eigenvector of J_x as well. Therefore, by the induction hypothesis, the dimension of the linear span of all eigenvectors of J_{y_j} corresponding to the case (b) is equal to #Γ_{y_j} − deg P_{y_j,x}. Such eigenvectors corresponding to J_{y_i} and J_{y_j}, for i ≠ j, have disjoint supports, hence the total dimension of the eigenvectors corresponding to the case (b) for J_{y_1}, . . . , J_{y_k} is equal to

Σ_{j=1}^{k} #Γ_{y_j} − Σ_{j=1}^{k} deg P_{y_j,x}.

Consider the product P_{y_1,x}(z) · · · P_{y_k,x}(z).
We know that every polynomial P_{y_j,x} has simple roots. Let r_1 < r_2 < . . . denote the distinct roots of this product, and let n_l denote the number of factors vanishing at r_l. By the reasoning performed in the first part of the proof, the root r_l gives rise to n_l − 1 linearly independent eigenvectors of J_x. Moreover the degree of the polynomial P_{x,x′} is equal to L + 1, where L = deg P_{x,x} (cf. (12)). The roots of P_{x,x′} lead to L + 1 linearly independent eigenvectors of J_x, which are linearly independent from the ones constructed in (b), as they do not vanish at x. Summarizing, the number of linearly independent eigenvectors of J_x is not less than

Σ_{j=1}^{k} (#Γ_{y_j} − deg P_{y_j,x}) + Σ_l (n_l − 1) + L + 1.

Essential selfadjointness and defect indices
Let z ∈ C \ ℝ. The function v ∈ ℓ²(Γ) belongs to the defect space corresponding to z if v is orthogonal to Im(zI − J) = (zI − J)(F(Γ)). In particular v is orthogonal to (zI − J)δ_x for any x ∈ Γ. This implies Jv = zv.

Proposition 2. The defect indices of the operator J cannot be greater than 1.

Proof. Fix a nonreal number z. Let v ∈ ℓ²(Γ) satisfy v ≠ 0 and Jv = zv. By Corollary 1 the function v is unique up to a constant multiple. Hence the defect space corresponding to z is at most one dimensional.

Theorem 3. There exist Jacobi matrices J on Γ which are not essentially selfadjoint.

Proposition 2 implies that it suffices to construct a matrix J and a nonzero function v ∈ ℓ²(Γ) satisfying Jv = zv for some nonreal z.

Proof. We set β_x ≡ 0. Fix a nonreal number z. Choose an infinite path x_n in Γ so that ℓ(x_n) = n. We will construct the matrix J by induction on n. Assume we have constructed the matrix J on Γ_{x_{n−1}} \ {x_{n−1}} and a nonvanishing function v on Γ_{x_{n−1}} so that ‖v|_{Γ_{x_{n−1}}}‖₂² ≤ 1 − 2^{−(n−1)} and

Jv(x) = zv(x), x ∈ Γ_{x_{n−1}} \ {x_{n−1}}.

We want to extend the definition of J and v so that the conclusion remains valid when n − 1 is replaced by n.
Our first task is to define λ_{x_{n−1}} and v(x_n) so that

(18) λ_{x_{n−1}} v(x_n) = zv(x_{n−1}) − Σ_{y∈N_{x_{n−1}}} λ_y v(y).

The right hand side of (18) cannot vanish by Lemma 2. We will define λ_{x_{n−1}} and v(x_n) so as to satisfy (18). By specifying λ_{x_{n−1}} large enough we may assume that |v(x_n)|² ≤ 2^{−n−1}. For any y ∈ N_{x_n} with y ≠ x_{n−1} consider the subtree Γ_y \ {y}. Set λ_x = 1 for any x ∈ Γ_y \ {y}. By Lemma 3 there is a nonzero solution v_y defined on Γ_y satisfying

Jv_y(x) = zv_y(x), x ∈ Γ_y \ {y}.
We want to define the numbers λ_y for y ∈ N_{x_n}, y ≠ x_{n−1}, so that

zv_y(y) = Jv_y(y) = λ_y v(x_n) + Σ_{x∈N_y} λ_x v_y(x).
Hence we want to have

(19) λ_y = [ zv_y(y) − Σ_{x∈N_y} λ_x v_y(x) ] / v(x_n).

By Lemma 2 the numerator in (19) cannot vanish. We may multiply v_y by a constant of absolute value 1 so that the expression on the right hand side of (19) becomes positive. In this way the values λ_y for y ∈ N_{x_n}, y ≠ x_{n−1}, are defined. We extend the definition of v to Γ_{x_n} by setting v(x) = v_y(x) for x ∈ Γ_y, y ≠ x_{n−1}. On the way we have also extended the definition of J so that

Jv(x) = zv(x), x ∈ Γ_{x_n} \ {x_n}.

Remark 1.
The Jacobi matrix J constructed in the proof satisfies β x ≡ 0 and λ x = 1 for vertices x whose distance from the path {x n } is greater than 2.
Remark 2. Another way of proving Theorem 3 is as follows. Fix any Jacobi matrix J_0 such that the operator J_0 is bounded on ℓ²(Γ). For example, let β_x ≡ 0 and choose the coefficients λ_x so that the operator S, acting according to the rule Sv(x) = λ_x v(x′), is bounded. The adjoint operator S* then acts by the rule S*v(x) = Σ_{y∈N_x} λ_y v(y). Then J_0 = S + S* is a Jacobi matrix with ‖J_0‖_{2→2} ≤ 2‖S‖. Fix an infinite path {x_n} and a sequence of positive numbers {λ_n}. Let J_1 be the degenerate Jacobi matrix defined by β_x ≡ 0 and λ_{x_n} = λ_n, λ_x = 0 for x ∉ {x_n}. Choose the coefficients λ_n so that the classical Jacobi matrix associated with the coefficients λ_n and β_n ≡ 0 is not essentially selfadjoint; for example let λ_n = 2^n. Let J = J_0 + J_1. The matrix J is nondegenerate. Moreover J is not essentially selfadjoint, being a bounded perturbation of a non essentially selfadjoint operator.
The next theorem provides a relation between Jacobi matrices on the tree Γ and classical Jacobi matrices associated with the infinite paths of Γ.
The Jacobi matrix J on Γ will be called symmetric if β x ≡ 0.
Theorem 4. Let J be a non essentially selfadjoint symmetric Jacobi matrix on Γ. Choose an infinite path {x_n} so that ℓ(x_n) = n. Then the classical Jacobi matrix J_0 with λ_n = λ_{x_n} and β_n ≡ 0 is not essentially selfadjoint.
Before proving the theorem we will need the following lemma.

Proof. By assumptions we have (20). We know that ṽ cannot vanish and is unique up to a constant multiple. The function Re ṽ satisfies (20) and takes the value 1 at x_0. Thus ṽ = Re ṽ, i.e. ṽ is real valued. We will show that ṽ(x) is positive by induction. Observe that if ṽ(x) is positive at every vertex on level zero then by (20) ṽ is positive. Assume the opposite, i.e. that ṽ is negative at some vertices on level zero. Since ṽ(x_0) = 1, there are two vertices y_1, y_2 on level zero so that y′_1 = y′_2 and ṽ(y_1) > 0, ṽ(y_2) < 0. By (20) evaluated at x = y_1 and x = y_2 we get that ṽ(y′_1) > 0 and ṽ(y′_2) < 0, which gives a contradiction.

The last inequality follows from Lemma 4. By assumptions the sequence ṽ(x_n) is square summable. Thus (21) holds. The inequality (21) is equivalent to non essential selfadjointness of the classical Jacobi matrix J_0 with λ_n = λ_{x_n} and β_n ≡ 0. Indeed, let p_n and q_n denote the polynomials of the first and the second kind associated with J_0. Then (21) reduces to the square summability of the sequences p_n(0) and q_n(0).

Remark. The symmetry assumption β_x ≡ 0 is essential. There exist non essentially selfadjoint Jacobi matrices J on Γ for which the classical Jacobi matrix J_0 associated with the path {x_n} is essentially selfadjoint. Indeed, for every vertex x_n, n ≥ 1, fix a vertex y_{n−1} ≠ x_{n−1} in N_{x_n}. Let P denote the orthogonal projection from ℓ²(Γ) onto ℓ²({x_n, y_n}_{n=0}^∞). We will consider Jacobi matrices J so that β_x = 0 for x ∉ {y_n}_{n=0}^∞. Let J_1 = PJP. First we are going to study the essential selfadjointness of the operator J_1. To this end consider the equation J_1 v(x) = zv(x). This is equivalent to

(22) zv(x_n) = λ_{x_n} v(x_{n+1}) + λ_{x_{n−1}} v(x_{n−1}) + λ_{y_{n−1}} v(y_{n−1}),
     zv(y_{n−1}) = λ_{y_{n−1}} v(x_n) + β_{y_{n−1}} v(y_{n−1}).
Assume the classical Jacobi matrix with coefficients λ_n and −β_n is not essentially selfadjoint. Then the sequence v_n is square summable. Moreover (22) implies

|v(y_{n−1})|² = (µ_n² / (1 + β_n²)) |v_n|² = |v_n|².

Hence v is square summable, i.e. the operator J_1 is not essentially selfadjoint. Let J_2 be any bounded Jacobi matrix on Γ. Then J = J_1 + J_2 is a non essentially selfadjoint Jacobi matrix on Γ.
The matrix J 0 is associated with the coefficients λ n = λ xn and β n ≡ 0. Thus, in order to conclude the reasoning it suffices to prove the following.

Lemma 5. There exists a non essentially selfadjoint classical Jacobi matrix J,

Jx_n = λ_n x_{n+1} − β_n x_n + λ_{n−1} x_{n−1},

such that the Jacobi matrix J′,

J′x_n = λ_n x_{n+1} + λ_{n−1} x_{n−1},

is essentially selfadjoint.
Proof. We will assume that β_{2n} = 0. Non essential selfadjointness of J is equivalent to the fact that every solution of the recurrence relation (23) is square summable. Assume the sequence x_n satisfies this recurrence relation. Plugging the last two equations into (23), let β_{2n−1} = aλ_{2n−1}. Choose an increasing sequence λ_{2n} so that every solution u_{2n} of the last equation is square summable. Assume also that λ_{2n} = λ_{2n+1}. Then by (24) the sequence x_n is square summable, i.e. the Jacobi matrix J is not essentially selfadjoint.
The following lemma is straightforward but useful. Then u is square summable on Γ.
Fix an infinite path {x_n} so that ℓ(x_n) = n. By Corollary 2, for a nonreal number z there exist two nonzero solutions v_z and u_z on Γ. The functions v_z and u_z, satisfying (6) and (7), will be called the solution and the associated solution of the equation Jw = zw.

Summarizing we get
Consider the graph obtained from Γ by removing all edges (x_n, x_{n+1}) (we do not remove vertices). This graph splits into an infinite union of finite subtrees Γ_n. The tree Γ_n contains the vertex x_n. Moreover x_n is the only vertex of Γ_n on the level n.
Proof. By Lemma 1 we know that v z and u z cannot vanish. Both functions satisfy Ju z (x) = zu z (x), Jv z (x) = zv z (x) for x ∈ Γ n \ {x n }. By Lemma 3 we get v z (x) = cu z (x) for x ∈ Γ n . Plugging in x = x n gives the conclusion.

Proposition 4. For the solution v_z and the associated solution u_z the quantity

λ_{x_n} [ v_z(x_{n+1}) u_z(x_n) − u_z(x_{n+1}) v_z(x_n) ]

is a nonzero constant independent of n.

Proof. By the recurrence relation (6) we get

zv_z(x_n) = λ_{x_n} v_z(x_{n+1}) + β_{x_n} v_z(x_n) + Σ_{y∈N_{x_n}} λ_y v_z(y),
zu_z(x_n) = λ_{x_n} u_z(x_{n+1}) + β_{x_n} u_z(x_n) + Σ_{y∈N_{x_n}} λ_y u_z(y).

On multiplying the equations by u_z(x_n) and v_z(x_n), respectively, subtracting, and making use of Lemma 6 we get

λ_{x_n} [ v_z(x_{n+1}) u_z(x_n) − u_z(x_{n+1}) v_z(x_n) ] = λ_{x_{n−1}} [ v_z(x_n) u_z(x_{n−1}) − u_z(x_n) v_z(x_{n−1}) ].

The conclusion follows by induction on n.

Theorem 6. Let J be a Jacobi matrix associated with the coefficients λ_x and β_x. Let x_n denote any infinite path so that ℓ(x_n) = n. Assume

Σ_{n=0}^{∞} λ_{x_n}^{−1} = ∞.

Then the operator J is essentially selfadjoint.
Proof. The result follows from Proposition 4 by the standard argument. If J were not essentially selfadjoint, then the functions v_z and u_z would be square summable, and hence the series Σ_n λ_{x_n}^{−1} would be convergent, contrary to the assumption.
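In the classical setting Γ = N_0 the quantity appearing in Proposition 4 is the discrete Wronskian of the solutions of the first and second kind, and its independence of n can be checked directly. The following Python sketch (the coefficients λ_n = n + 1, β_n = ±0.3 and the point z = 0.7 are arbitrary illustrative choices) verifies that λ_n(p_{n+1}q_n − p_n q_{n+1}) is the same constant for all n:

```python
def first_and_second_kind(z, lams, betas, N):
    """Solutions of z*y_n = lam_n*y_{n+1} + beta_n*y_n + lam_{n-1}*y_{n-1}:
    first kind p (p_0 = 1) and second kind q (q_0 = 0, q_1 = 1/lam_0)."""
    p = [1.0, (z - betas[0]) / lams[0]]
    q = [0.0, 1.0 / lams[0]]
    for n in range(1, N):
        p.append(((z - betas[n]) * p[n] - lams[n - 1] * p[n - 1]) / lams[n])
        q.append(((z - betas[n]) * q[n] - lams[n - 1] * q[n - 1]) / lams[n])
    return p, q

z = 0.7
lams = [float(n + 1) for n in range(9)]
betas = [0.3 * (-1) ** n for n in range(9)]
p, q = first_and_second_kind(z, lams, betas, 8)
# Discrete Wronskian: constant in n (here it equals -1, up to rounding).
wronskians = [lams[n] * (p[n + 1] * q[n] - p[n] * q[n + 1]) for n in range(8)]
print(wronskians)
```

The constancy is exactly the telescoping identity used in the proof of Proposition 4, and the Carleman-type bound of Theorem 6 follows by estimating 1/λ_n through this constant and the Cauchy–Schwarz inequality.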

Remark.
The assumption does not depend on the choice of the infinite path, as any two such paths meet at some vertex and coincide from that vertex upwards.

Nonnegative Jacobi matrices on trees
We say that a matrix J is positive definite if (Jv, v) ≥ 0 for every v ∈ F(Γ). The next theorem gives a characterization of positive definite Jacobi matrices on Γ.

Theorem 7.
(i) Assume there exists a positive function m(x) on Γ such that

(27) λ_x m(x′) + Σ_{y∈N_x} λ_y m(y) ≤ β_x m(x), x ∈ Γ.

Then the matrix J is positive definite.
(ii) If the matrix J is positive definite, then there exists a positive function m(x) on Γ such that equality holds in (27) at every x ∈ Γ.

Proof. (i) On dividing by m(x), the formula (27) takes the form

β_x ≥ λ_x m(x′)/m(x) + Σ_{y∈N_x} λ_y m(y)/m(x).

We then estimate the quadratic form (Jv, v) (see (5)).

(ii) Consider the operator U acting by the rule Uv(x) = (−1)^{ℓ(x)} v(x). Clearly U is a unitary operator. Let J̃ = −UJU. Then J̃ is a nonpositive operator and

J̃v(x) = λ_x v(x′) − β_x v(x) + Σ_{y∈N_x} λ_y v(y).

Fix an infinite path x_n so that ℓ(x_n) = n. Thus Γ = ∪_{n=0}^∞ Γ_{x_n}. Let P_n denote the orthogonal projection from ℓ²(Γ) onto ℓ²(Γ_{x_n}) and J̃_n = P_n J̃ P_n. Then J̃_n is a bounded nonpositive linear operator. Therefore −a_n I < J̃_n ≤ 0 < (1/n) I, for a positive constant a_n. Hence 0 < J̃_n + a_n I < (a_n + 1/n) I. We have

(30) 0 < ((J̃_n + a_n I)δ_x, δ_x) = a_n − β_x, x ∈ Γ_{x_n}.
Observe that

(31) the entries of the matrix J̃_n + a_n I are nonnegative.

Let

f_n := ((1/n) I − J̃_n)^{−1} δ_{x_0} = [(a_n + 1/n) I − (J̃_n + a_n I)]^{−1} δ_{x_0} = Σ_{k=0}^{∞} (a_n + 1/n)^{−(k+1)} (J̃_n + a_n I)^k δ_{x_0}.

By (30) and (31) the function (J̃_n + a_n I)^k δ_{x_0} is nonnegative, and positive on all vertices of Γ_{x_n} at distance from x_0 less than or equal to k. Hence f_n ≥ 0 and f_n(x) > 0 for any x ∈ Γ_{x_n}. Moreover, set m_n(x) = f_n(x)/f_n(x_0). Then m_n(x_0) = 1 and

(32) λ_x m_n(x′) + Σ_{y∈N_x} λ_y m_n(y) = (β_x + 1/n) m_n(x), x ∈ Γ_{x_n} \ {x_0},

(33) λ_{x_0} m_n(x_0′) + Σ_{y∈N_{x_0}} λ_y m_n(y) ≤ (β_{x_0} + 1/n) m_n(x_0),

where for x = x_n the term λ_x m_n(x′) is omitted. Observe that for any fixed t ∈ Γ the sequence m_n(t) is bounded. Indeed, assume the opposite. Let t be the vertex closest to x_0 so that m_n(t) is unbounded. Let s be the vertex adjacent to t, so that d(x_0, t) = d(x_0, s) + 1.
Then applying (32) with x = s implies that the sequence m n (s) is unbounded, which gives a contradiction.
Observe also that for any fixed t ∈ Γ the sequence m n (t) cannot accumulate at zero. Indeed, assume the opposite. Let t be the vertex closest to x 0 so that m n (t) accumulates at zero. Again let s be the vertex adjacent to t, so that d(x 0 , t) = d(x 0 , s) + 1.
Then applying (32) with x = t implies that the sequence m n (s) also accumulates at zero, which gives a contradiction.
Consider the sequence of functions m_n. Let m be any pointwise accumulation point of this sequence. Then m(x) > 0 and by (32) and (33) we obtain

(34) λ_x m(x′) + Σ_{y∈N_x} λ_y m(y) = β_x m(x), x ∈ Γ \ {x_0},

(35) λ_{x_0} m(x_0′) + Σ_{y∈N_{x_0}} λ_y m(y) ≤ β_{x_0} m(x_0).

In order to get the conclusion (i.e. to guarantee equality also in (35)) we have to modify slightly the function m(x).
Observe that after removing all the edges of the path {x_n} the tree Γ splits into the sequence of disjoint trees Γ_n with x_n ∈ Γ_n. By (34) evaluated at x = x_n we have

λ_{x_n} m(x_{n+1}) + λ_{x_{n−1}} m(x_{n−1}) + Σ_{y∈N_{x_n}, y≠x_{n−1}} λ_y m(y) = β_{x_n} m(x_n), n ≥ 1.
Let the coefficients c_n be defined by c_0 = 0 and

(36) Σ_{y∈N_{x_n}, y≠x_{n−1}} λ_y m(y) = c_n m(x_n), n ≥ 1.
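The mechanism behind part (i) of Theorem 7 can be illustrated on a finite subtree: if β_x is defined by equality in the criterion for a positive function m, then the quadratic form (Jv, v) becomes a sum of squares, one square per edge, and is therefore nonnegative. The sketch below is our own verification device rather than the paper's argument; the tree, the weights and the function m are arbitrary illustrative choices:

```python
import math
import random

# Toy finite tree given by a parent map; "r" is the top vertex.
parent = {"a": "r", "b": "r", "c": "a"}
vertices = ["r", "a", "b", "c"]
lam = {"a": 1.0, "b": 2.0, "c": 0.5}          # weight of the edge (x, x')
m = {"r": 1.0, "a": 2.0, "b": 0.5, "c": 1.5}  # an arbitrary positive function

# Equality case of the criterion: beta_x * m(x) equals the sum of the edge
# weights times m over all neighbours of x.
beta = {}
for x in vertices:
    acc = 0.0
    for c in parent:                  # edge (c, parent[c]) with weight lam[c]
        if c == x:
            acc += lam[c] * m[parent[c]]
        elif parent[c] == x:
            acc += lam[c] * m[c]
    beta[x] = acc / m[x]

def quad_form(v):
    """(Jv, v) = sum beta_x v(x)^2 + 2 sum over edges of lam_e v(x) v(x')."""
    s = sum(beta[x] * v[x] ** 2 for x in vertices)
    s += 2 * sum(lam[x] * v[x] * v[parent[x]] for x in parent)
    return s

def sum_of_squares(v):
    """Edge-wise decomposition exhibiting (Jv, v) >= 0."""
    return sum(
        lam[x] * (math.sqrt(m[parent[x]] / m[x]) * v[x]
                  + math.sqrt(m[x] / m[parent[x]]) * v[parent[x]]) ** 2
        for x in parent)

random.seed(0)
v = {x: random.uniform(-1.0, 1.0) for x in vertices}
print(quad_form(v), sum_of_squares(v))  # the two values agree and are >= 0
```

With strict inequality in the criterion the slack only adds nonnegative diagonal terms, so the form stays nonnegative; this is the discrete analogue of a ground state transform.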

Proposition 5.
There exist Jacobi matrices J on trees so that the equation Jv = tv does not admit nonzero solutions for some real values of t.
Proof. We may assume that t = 0. Consider a tree Γ with #N_x = 2 for every vertex x with ℓ(x) ≥ 1. Fix an infinite path x_n so that ℓ(x_n) = n. Then N_{x_n} = {x_{n−1}, y_{n−1}} for n ≥ 1. We will define the coefficients λ_x and β_x on Γ_{y_k} in such a way that the operator J restricted to ℓ²(Γ_{y_k} \ {y_k}) is positive. For example we may set λ_x = 1 and β_x = 4 for any x ∈ Γ_{y_k} \ {y_k}. In this way, if the function v satisfies Jv(y) = 0 for y ∈ Γ_{y_k} \ {y_k}, then either v = 0 on Γ_{y_k} or v cannot vanish on Γ_{y_k}. If v does not vanish on Γ_{y_k}, its restriction to Γ_{y_k} is unique up to a constant multiple. Let λ_{y_k} = 1 and choose β_{y_k} in such a way that v(y′_k) = v(x_{k+1}) = 0. Set also λ_{x_k} = 1 and β_{x_k} = 0 for any k. Thus the matrix J is defined. Assume Ju = 0. If u vanishes on every subtree Γ_{y_n} then by the recurrence relation u vanishes at every vertex x_n with n ≥ 1, as y′_n = x_{n+1}. Moreover, by the recurrence relation evaluated at x_1 we obtain u(x_0) = 0, i.e. u = 0. If u does not vanish on every subtree Γ_{y_n}, let n be the smallest index for which u does not vanish on Γ_{y_n}.
Then u(x_k) = 0 for any k ≤ n. We must have u(y_n) ≠ 0. By construction we also get u(x_{n+1}) = 0. By the recurrence relation evaluated at x_{n+1} we conclude that u(x_{n+2}) ≠ 0. This implies that u does not vanish on Γ_{y_{n+1}}. But by construction u(x_{n+2}) = 0, which is a contradiction.