Orthogonal Cauchy-like matrices

Cauchy-like matrices often arise as building blocks in decomposition formulas and fast algorithms for various displacement-structured matrices. A complete characterization of orthogonal Cauchy-like matrices is given here. In particular, we show that orthogonal Cauchy-like matrices correspond to eigenvector matrices of certain symmetric matrices related to the solution of secular equations. Moreover, the construction of orthogonal Cauchy-like matrices is related to that of orthogonal rational functions with variable poles.


Introduction
A matrix C ∈ R^{n×n} is of Cauchy type if its entries C_{ij} have the form

C_{ij} = 1/(x_i − y_j),  i, j = 1, …, n,   (1)

where x_i, y_j for i, j = 1, …, n are mutually distinct real numbers, called nodes. Besides their pervasive occurrence in computations with rational functions [1], Cauchy matrices play an important role in deriving algebraic and computational properties of many relevant structured matrix classes. For example, they occur as fundamental blocks (together with trigonometric transforms) in decomposition formulas and fast solvers for Toeplitz, Hankel, and related matrices, see, e.g., [2].
The main goal of this contribution is to provide a complete answer to the following question: can the rows and columns of a Cauchy matrix be scaled so that the matrix becomes orthogonal? Interest in this question arises from the paper [3], where orthogonal matrices obtained by scaling rows and columns of Cauchy matrices are needed in the design of all-pass filters for signal processing purposes. Moreover, in [4] Cauchy matrices have been characterized as transition matrices between eigenbases of certain pairs of diagonalizable matrices, as better described below. Thus, it is interesting to characterize Cauchy matrices that can be orthogonalized by a row/column scaling, as the related eigenbases are similarly conditioned.
Matrices obtained by scaling rows and columns of a Cauchy matrix of order n can be parametrized by about 4n coefficients. On the other hand, an invertible matrix X is orthogonal if and only if it fulfills the identity X^T X = I, which boils down to about n²/2 quadratic scalar equations. It may therefore seem that orthogonalization of a Cauchy matrix by scaling its rows and columns is feasible only for small n, since the number of constraints grows faster than the number of free variables. Instead, the main result of this paper shows that, for any fixed order n, there are infinitely many Cauchy matrices that can be orthogonalized in the way said before.
Let us briefly explain the structure of the paper. A few basic facts and concepts on Cauchy matrices are recalled in the next section. Section 3 contains the main results of this paper, namely, the complete description of the set of orthogonal Cauchy-like matrices, that is, the orthogonal matrices obtained by diagonal scalings of Cauchy matrices, of any order n. Section 4 is devoted to displaying various algebraic and computational properties of orthogonal Cauchy-like matrices. There, we illustrate their relationships with secular equations, the diagonalization of a subclass of symmetric quasiseparable matrices and the construction of orthogonal rational functions with free poles. Moreover, in that section we specialize to orthogonal Cauchy-like matrices the characterization of Cauchy matrices obtained in [4], and provide a complete description of matrix sets that are simultaneously diagonalized by orthogonal Cauchy-like matrices. Finally, Section 5 exhibits a particular sequence of orthogonal Cauchy-like matrices of arbitrary order, whose nodes are based on Chebyshev points.
In the sequel, we adopt the following notation. The matrix in (1) is referred to simply as Cauchy(x, y). Let R^n_0 be the set of vectors in R^n without null entries. We denote by 1 the all-ones vector in R^n. For 0 ≠ x ∈ R we set sign(x) = 1 if x > 0 and sign(x) = −1 if x < 0. For any z = (z_1, …, z_n)^T ∈ R^n let D_z ∈ R^{n×n} be the diagonal matrix with diagonal entries z_1, …, z_n.

Cauchy matrices and their properties
Algebraic and computational properties of Cauchy matrices are better understood by making use of the so-called displacement operators. Let M, N ∈ R^{n×n} and define

D_{M,N}(X) = MX − XN.

The matrix operator D_{M,N} is widely known as a (Sylvester-type) displacement operator [5]. The map D_{M,N} is invertible if and only if the spectra of M and N are disjoint. For any nonnegative integer r the set

S^r_{M,N} = { X ∈ R^{n×n} : rank(D_{M,N}(X)) ≤ r }

is a displacement-structured matrix space. Various interesting matrix sets, such as circulant, Toeplitz, Vandermonde, Hankel, and also Cauchy matrices, are actually subsets of displacement-structured matrix spaces with a small rank r. These spaces share two important features. First, matrices in S^r_{M,N} can be parametrized by a small set of coefficients (typically, their number is O(nr)) by means of appropriate inversion formulas for the related displacement operator. Furthermore, if an invertible matrix belongs to S^r_{M,N} then its inverse belongs to S^r_{N,M}. By means of these simple facts, linear systems and least squares problems with displacement-structured matrices can be solved by fast algorithms requiring O(n²) arithmetic operations or even less [6–8].
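As a concrete illustration of these two features, the following sketch (illustrative code with arbitrarily chosen nodes, not part of the original exposition) checks numerically that a Cauchy matrix has displacement rank one, and that its inverse also has displacement rank one with respect to the operator with the node matrices swapped:

```python
import numpy as np

def displacement(M, N, X):
    """Sylvester-type displacement operator D_{M,N}(X) = M X - X N."""
    return M @ X - X @ N

n = 6
x = np.arange(1, n + 1, dtype=float)       # nodes x_i
y = x - 0.5                                # nodes y_j, distinct from the x_i
C = 1.0 / (x[:, None] - y[None, :])        # C = Cauchy(x, y)
Dx, Dy = np.diag(x), np.diag(y)

# Cauchy(x, y) has displacement rank 1: D_{x,y}(C) = 1 1^T
rank_C = np.linalg.matrix_rank(displacement(Dx, Dy, C))

# The inverse also has displacement rank 1, w.r.t. the swapped operator D_{y,x}
rank_Cinv = np.linalg.matrix_rank(displacement(Dy, Dx, np.linalg.inv(C)))
```

Here the identity D_{x,y}(C) = 11^T follows entrywise from (x_i − y_j)/(x_i − y_j) = 1.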
Cauchy matrices are very special displacement-structured matrices. For x, y ∈ R^n we adopt the simplified notations D_{x,y} and S^r_{x,y} for the displacement operator D_{x,y}(X) = D_x X − X D_y and the related displacement-structured matrix space, respectively. If the numbers x_1, …, x_n, y_1, …, y_n are all distinct then the operator D_{x,y} is invertible. Indeed, for any given matrix A ∈ R^{n×n} the solution of the matrix equation D_{x,y}(X) = A is the matrix with entries X_{ij} = A_{ij}/(x_i − y_j). In particular, D_{x,y}(Cauchy(x, y)) = 11^T, so that Cauchy(x, y) ∈ S^1_{x,y}. More generally, a matrix A ∈ R^{n×n} belonging to a S^r_{x,y} space with r ≪ n is said to possess a Cauchy-like displacement structure [5–7]. In this work, we term Cauchy-like matrix any matrix that belongs to some S^1_{x,y} space.
Definition 1 Let x_i, y_j ∈ R be pairwise distinct numbers, for i, j = 1, …, n. A Cauchy-like matrix with nodes x_1, …, x_n and y_1, …, y_n is any matrix K ∈ R^{n×n} such that D_{x,y}(K) = vw^T for some v, w ∈ R^n, i.e.,

K_{ij} = v_i w_j/(x_i − y_j),  i, j = 1, …, n.   (2)

The matrices of the previous definition are also called generalized Cauchy matrices by some authors, see, e.g., [9]. It is immediate to observe that the vector parameters x, y, v and w in (2) are not unique. Indeed, adding the same constant to all entries of both x and y does not modify the denominators in (2). Moreover, the vectors v and w are defined only up to a reciprocal scaling by a nonzero constant. However, this ambiguity is harmless in what follows. The lemma below, whose proof is immediate, provides a useful factored form for Cauchy-like matrices.
Lemma 1 A Cauchy-like matrix with nodes x 1 , . . . , x n and y 1 , . . . , y n can be factored as K = D v CD w where C = Cauchy(x, y) and v, w ∈ R n are such that D x,y (K) = vw T . Conversely, a matrix factored as K = D v CD w where C = Cauchy(x, y) and v, w ∈ R n is Cauchy-like.
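Lemma 1 can be verified in a few lines; the sketch below (with arbitrary nodes and scaling vectors, purely for illustration) checks that K = D_v C D_w satisfies D_{x,y}(K) = vw^T:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5
x = np.arange(n, dtype=float)            # nodes x_i
y = x + 0.5                              # nodes y_j
v = rng.standard_normal(n)
w = rng.standard_normal(n)

C = 1.0 / (x[:, None] - y[None, :])      # C = Cauchy(x, y)
K = np.diag(v) @ C @ np.diag(w)          # K = D_v C D_w, entries v_i w_j/(x_i - y_j)

disp = np.diag(x) @ K - K @ np.diag(y)   # D_{x,y}(K)
ok = np.allclose(disp, np.outer(v, w))   # equals the rank-one matrix v w^T
```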
A notable feature of the set of Cauchy-like matrices is its invariance under row and column permutations. This fact is at the basis of stable numerical methods for solving linear systems with Cauchy-like matrices and, more generally, matrices with a Cauchy-like displacement structure [6,7]. Indeed, let K be the matrix in (2) and let P, Q be two permutation matrices. Introduce the permuted vectors x̃ = Px, ỹ = Qy, ṽ = Pv and w̃ = Qw. Direct inspection proves the identity

D_{x̃,ỹ}(PKQ^T) = ṽ w̃^T.   (3)

Hence, when dealing with Cauchy-like matrices there is no loss of generality in supposing that the vectors x and y are ordered monotonically, i.e.,

x_1 < x_2 < ⋯ < x_n,  y_1 < y_2 < ⋯ < y_n.   (4)

Such an ordering can always be obtained by a row and column permutation of the matrix.

The inverse and determinant of a Cauchy matrix
The invertibility of Cauchy matrices is a well-known fact which has been rediscovered many times. We refer to [10] for an early exposition of explicit formulas for the determinant and inverse of a generic Cauchy matrix. For example, the formula

det Cauchy(x, y) = ∏_{1≤i<j≤n} (x_j − x_i)(y_i − y_j) / ∏_{i,j=1}^n (x_i − y_j)   (5)

shows that every Cauchy matrix is nonsingular. To illustrate the convenience of displacement operators, we recall hereafter a simple displacement-based derivation of the structure of the inverse of a Cauchy matrix. Let C = Cauchy(x, y). From D_{x,y}(C) = 11^T we derive

D_{y,x}(C^{−1}) = D_y C^{−1} − C^{−1} D_x = −C^{−1}(D_x C − C D_y)C^{−1} = −C^{−1} 11^T C^{−1}.

Let a = C^{−1}1 and b = C^{−T}1. Then we conclude D_{y,x}(C^{−1}) = −ab^T and, by Lemma 1,

C^{−1} = D_a C^T D_b.   (6)

The last passage exploits the identity Cauchy(x, y)^T = −Cauchy(y, x). Incidentally, (6) inspired the authors of [9] to investigate nonsingular matrices X such that X^{−1} = D_a X^T D_b for some diagonal matrices D_a and D_b, particularly regarding the sign patterns that appear in these matrices. Clearly, equation (6) implies that a, b ∈ R^n_0. In fact, explicit expressions for the vectors a and b above can be obtained from the solution of certain polynomial interpolation problems, see, e.g., [10,11]. In particular, for i = 1, …, n,

a_i = −p(y_i)/q′(y_i),  b_i = q(x_i)/p′(x_i),   (7)

where p(x) and q(x) are the polynomials

p(x) = ∏_{j=1}^n (x − x_j),  q(x) = ∏_{j=1}^n (x − y_j).   (8)
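The inversion formula (6) and the expressions for the vectors a and b can be tested numerically. The sketch below (illustrative code, with arbitrarily chosen nodes) computes a and b both by solving linear systems and by residue-type quotients of node differences, and verifies C^{−1} = D_a C^T D_b:

```python
import numpy as np

n = 5
x = np.linspace(0.0, 4.0, n)             # nodes x_i
y = x + 0.3                              # nodes y_j, distinct from the x_i
C = 1.0 / (x[:, None] - y[None, :])      # C = Cauchy(x, y)

ones = np.ones(n)
a = np.linalg.solve(C, ones)             # a = C^{-1} 1
b = np.linalg.solve(C.T, ones)           # b = C^{-T} 1

# Inversion formula (6): C^{-1} = D_a C^T D_b
ok_inv = np.allclose(np.linalg.inv(C), np.diag(a) @ C.T @ np.diag(b))

# Residue-type expressions, a_i = -p(y_i)/q'(y_i) and b_i = q(x_i)/p'(x_i),
# written out as quotients of node differences
a_res = np.array([-np.prod(y[i] - x) / np.prod(np.delete(y[i] - y, i))
                  for i in range(n)])
b_res = np.array([np.prod(x[i] - y) / np.prod(np.delete(x[i] - x, i))
                  for i in range(n)])
ok_res = np.allclose(a, a_res) and np.allclose(b, b_res)
```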

Main results
On the basis of the facts recalled in the previous section, the construction of orthogonal Cauchy-like matrices with given nodes x_1, …, x_n and y_1, …, y_n amounts to solving the quadratic matrix equation K^T K = I with K ∈ S^1_{x,y}. In this section, we provide necessary and sufficient conditions for the solvability of this problem, together with a complete description of the solution set. First, we characterize the Cauchy matrices that can be diagonally scaled to orthogonality. Subsequently, we describe all orthogonal Cauchy-like matrices with prescribed nodes. Finally, we solve the inverse problem of choosing one set of nodes, given the other set, so that the resulting Cauchy matrix can be scaled to have orthogonal columns.

Definition 2
Let K_n be the set of Cauchy matrices C ∈ R^{n×n} for which there exist v, w ∈ R^n_0 such that D_v C D_w is orthogonal.
Thus, K n consists of all Cauchy matrices that can be made orthogonal by scaling rows and columns. Owing to (3), the set K n is closed under row/column permutations, i.e., permuting rows and columns of a Cauchy matrix has no effect on whether the matrix belongs to K n or not. We formalize this fact in the next proposition.
Proposition 2 Let C = Cauchy(x, y) and let P , Q be two arbitrary permutation matrices. Then P CQ T ∈ K n if and only if C ∈ K n .
Hence, to characterize the matrices in K_n we can restrict our attention to Cauchy matrices whose nodes verify the inequalities (4). The following results provide necessary and sufficient conditions for an n × n Cauchy matrix to belong to K_n. The condition in the forthcoming Lemma 3 concerns the signs of the numbers a_i and b_i defined in (7), while that in Theorem 4 only involves the positions of the nodes on the real line.

Lemma 3
The matrix C = Cauchy(x, y) belongs to K n if and only if the numbers a 1 , . . . , a n and b 1 , . . . , b n from (7) are either all positive or all negative.
Proof From (6) we have C^{−1} = D_a C^T D_b with a, b ∈ R^n_0 given by (7). Furthermore, C ∈ K_n if and only if there exist v, w ∈ R^n_0 such that the Cauchy-like matrix K = D_v C D_w is orthogonal. The latter factorization yields the representation K^T = D_w C^T D_v or, equivalently,

K^{−1} = D_w^{−1} C^{−1} D_v^{−1} = D_w^{−1} D_a C^T D_b D_v^{−1}.

Comparing entrywise the matrices on the two sides of the identity K^T = K^{−1}, we see that the identity holds if and only if w_i² v_j² = a_i b_j for all i, j = 1, …, n, whence all the products a_i b_j must be positive. Conversely, let a_i b_j > 0 for all i, j. Then let σ = sign(a_1), w_i = ±√(σ a_i) and v_i = ±√(σ b_i) for i = 1, …, n and define K = D_v C D_w. We obtain w_i² v_j² = a_i b_j and the identity K^T = K^{−1} can be derived by reversing the order of the previous arguments.

Theorem 4 Let C = Cauchy(x, y) where the vectors x and y fulfill (4). Then, C ∈ K_n if and only if the nodes x_i and y_j interlace, that is, either x_1 < y_1 < x_2 < y_2 < ⋯ < x_n < y_n or y_1 < x_1 < y_2 < x_2 < ⋯ < y_n < x_n. More precisely, we have the first sequence of inequalities when the numbers a_i, b_i in (7) are all negative and the second sequence when they are all positive.
Proof Firstly, we prove that the node interlacing condition is necessary for having C ∈ K_n. Arguing by contradiction, let v, w ∈ R^n_0 be such that K = D_v C D_w is orthogonal and suppose that the nodes x_i and y_j do not interlace. Then at least one of the following conditions is true: (1) there is an index i such that no node y_j lies between x_i and x_{i+1}; (2) there is an index j such that no node x_i lies between y_j and y_{j+1}. In the first case consider the identity

0 = (K K^T)_{i,i+1} = v_i v_{i+1} ∑_{j=1}^n w_j² / ((x_i − y_j)(x_{i+1} − y_j)).

By hypothesis, y_j < x_i if and only if y_j < x_{i+1}. Hence, the terms 1/(x_i − y_j) and 1/(x_{i+1} − y_j) have the same sign for j = 1, …, n. Consequently, the rightmost expression in the previous equation is nonzero and we have a contradiction. Case (2) can be treated analogously by considering the formula for (K^T K)_{j,j+1} and interchanging the role of x and y. Conversely, suppose that the entries of x and y interlace as follows: x_1 < y_1 < x_2 < y_2 < ⋯ < x_n < y_n. Let p(x) and q(x) be the polynomials in (8). Then, counting the signs of the factors of p and q at the nodes, from (7) we obtain sign(a_i) = sign(b_i) = −1 for i = 1, …, n, and the claim follows from Lemma 3. The case where y_1 < x_1 < y_2 < x_2 < ⋯ < y_n < x_n can be treated analogously by interchanging the role of x and y. Here we have sign(a_i) = sign(b_i) = +1, and this completes the proof.
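The scaling construction appearing in the proofs above can be sketched numerically as follows (illustrative code; the interlaced nodes are chosen arbitrarily, and a, b are evaluated through quotients of node differences):

```python
import numpy as np

n = 8
# Interlaced nodes: y_1 < x_1 < y_2 < x_2 < ... < y_n < x_n
y = np.arange(n, dtype=float)
x = y + 0.5
C = 1.0 / (x[:, None] - y[None, :])      # C = Cauchy(x, y)

# a = C^{-1} 1 and b = C^{-T} 1 via residue-type quotients of node differences
a = np.array([-np.prod(y[i] - x) / np.prod(np.delete(y[i] - y, i))
              for i in range(n)])
b = np.array([np.prod(x[i] - y) / np.prod(np.delete(x[i] - x, i))
              for i in range(n)])

sigma = np.sign(a[0])                    # common sign of the a_i, b_i (here +1)
w = np.sqrt(sigma * a)
v = np.sqrt(sigma * b)
K = np.diag(v) @ C @ np.diag(w)          # orthogonal Cauchy-like matrix

orth_err = np.linalg.norm(K.T @ K - np.eye(n))
```

With this interlacing pattern all a_i and b_i come out positive, in agreement with Theorem 4, and K is orthogonal to working precision.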
Remark 1 As shown in the preceding theorem, the set K_n splits into two disjoint subsets, K_n = K^1_n ∪ K^2_n with K^1_n ∩ K^2_n = ∅, where K^1_n contains all Cauchy matrices whose nodes (reordered as in (4)) fulfill the inequalities x_1 < y_1 < ⋯ < x_n < y_n and K^2_n consists of the Cauchy matrices such that y_1 < x_1 < ⋯ < y_n < x_n. Thus, up to permutations of rows and columns, matrices in K^1_n have negative entries on and above the main diagonal and positive entries strictly below it, while matrices in K^2_n have positive entries on and below the main diagonal and negative entries strictly above it. Any sign pattern that can be traced back to one of these two by permuting rows and columns can be realized by a Cauchy matrix in K^1_n or K^2_n, respectively. Observing these patterns it is not difficult to realize that the two sets are invariant under matrix transposition, that is, C ∈ K^i_n ⇔ C^T ∈ K^i_n for i = 1, 2. Furthermore, using the inversion formula (6) and Lemma 3, we also conclude that C = Cauchy(x, y) ∈ K^1_n if and only if Cauchy(y, x) ∈ K^2_n. The next result characterizes all orthogonal Cauchy-like matrices having prescribed nodes x_i and y_i that verify the interlacing inequalities in Theorem 4.
Corollary 5 Let C = Cauchy(x, y) ∈ K_n, and let a_i, b_i be as in (7). The Cauchy-like matrix K = D_v C D_w is orthogonal if and only if there exists a nonzero constant α such that

v_i² = α b_i,  w_i² = a_i/α,  i = 1, …, n.   (10)

Proof The equations above can be rewritten as D_v² = α D_b and D_w² = α^{−1} D_a. Since v, w ∈ R^n_0 by construction, the matrices D_v and D_w are invertible and condition (10) implies the identity K^T = K^{−1}. Indeed,

K^{−1} = D_w^{−1} C^{−1} D_v^{−1} = D_w^{−1} D_a C^T D_b D_v^{−1} = α D_w C^T α^{−1} D_v = K^T.

Conversely, if K^T = K^{−1} then D_w C^T D_v = D_w^{−1} D_a C^T D_b D_v^{−1}. By matching the (i, j)-entry of the leftmost and rightmost matrices in the previous equation we find w_i² v_j² = a_i b_j for i, j = 1, …, n, which implies (10).

Supplementary results
This section is divided into sub-sections dedicated to showing various algebraic and computational properties of orthogonal Cauchy-like matrices. First, we illustrate their relationships with secular equations and a family of quasiseparable matrices. Next, we show their occurrence in the construction of orthogonal rational functions with free poles, and specialize to orthogonal Cauchy-like matrices the characterization of Cauchy matrices obtained in [4]. Lastly, we extend results from [12,13] to provide a complete description of matrix sets that are simultaneously diagonalized by orthogonal Cauchy-like matrices.

Secular equations and quasiseparable matrices
Let A_n ⊂ R^{n×n} be the set of all matrices A ∈ R^{n×n} that admit the decomposition

A = D_x + α vv^T,  α ≠ 0,  v ∈ R^n_0,

where in addition the entries of x = (x_1, …, x_n)^T are pairwise distinct. Thus A_n consists of particular symmetric, irreducible matrices that can be decomposed into the sum of a diagonal and a rank-one matrix. Our next goal is to prove that orthogonal Cauchy-like matrices are exactly the eigenvector matrices of matrices belonging to A_n. One part of this claim is actually known. Indeed, consider the following theorem.

Theorem 6 Let A = D_x + α vv^T ∈ A_n. The eigenvalues of A are equal to the n roots y_1, …, y_n of the rational function

r(t) = 1 + α ∑_{i=1}^n v_i²/(x_i − t).   (11)

The corresponding normalized eigenvectors k_1, …, k_n are given by

k_i = (D_x − y_i I)^{−1} v / ‖(D_x − y_i I)^{−1} v‖_2,  i = 1, …, n.   (12)

The preceding theorem merely restates results of Golub [14] and of Bunch, Nielsen and Sorensen [15], who added formula (12); see also [16, Lemma 10.3]. The nonlinear equation r(t) = 0 with r(t) as in (11) is known as a secular equation and recurs in a variety of modified matrix eigenvalue problems [14,15]. A close look at (12) shows that the eigenvector matrix of A is an orthogonal Cauchy-like matrix. Indeed, the vector k_i is the i-th column of the matrix K = D_v Cauchy(x, y) D_w where w_i = 1/‖(D_x − y_i I)^{−1} v‖_2 normalizes the i-th column of K to unit 2-norm. We prove hereafter that this result can be somewhat reversed, that is, every orthogonal Cauchy-like matrix is the eigenvector matrix of some matrix A ∈ A_n.
Theorem 7 Let K = D_v C D_w be an orthogonal Cauchy-like matrix, where C = Cauchy(x, y) and v, w ∈ R^n_0. Then K D_y K^T = D_x + α vv^T ∈ A_n, where

α = ∑_{i=1}^n (y_i − x_i) / v^T v.   (13)

Proof From the displacement equation D_x K − K D_y = vw^T we easily get

D_x − K D_y K^T = vw^T K^T = v(Kw)^T.

The left-hand side of this equation is symmetric. Hence, the rank-one term v(Kw)^T must be symmetric too, that is, we can set Kw = −αv for some scalar α ≠ 0.
Actually, the value of α can be obtained from the identity αvv^T = K D_y K^T − D_x by taking traces:

α v^T v = trace(K D_y K^T) − trace(D_x) = ∑_{i=1}^n (y_i − x_i).

Hence (13) follows. We conclude that K D_y K^T is the spectral factorization of A = D_x + αvv^T ∈ A_n.
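Theorems 6 and 7 can be illustrated numerically: the orthonormal eigenvector matrix of a diagonal-plus-rank-one matrix in A_n coincides, up to column signs, with a column-normalized Cauchy-like matrix. The sketch below uses arbitrarily chosen data, purely for illustration:

```python
import numpy as np

n = 6
x = np.arange(n, dtype=float)             # pairwise distinct x_i
v = np.ones(n)                            # v in R^n_0
alpha = 0.7

A = np.diag(x) + alpha * np.outer(v, v)   # a matrix in the class A_n
y, K = np.linalg.eigh(A)                  # eigenvalues y_i, orthonormal eigenvectors

# Formula (12): the i-th eigenvector is proportional to (D_x - y_i I)^{-1} v,
# whose entries are v_j/(x_j - y_i) -- the i-th column of a Cauchy-like matrix
Kc = v[:, None] / (x[:, None] - y[None, :])
Kc /= np.linalg.norm(Kc, axis=0)          # normalize columns to unit 2-norm

signs = np.sign(np.sum(K * Kc, axis=0))   # match columns up to their sign
err = np.linalg.norm(K - Kc * signs)
```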
Remark 2 Theorem 7 proves that the orthogonal matrix K = D_v C D_w with C = Cauchy(x, y) ∈ K_n diagonalizes the matrix A = D_x + αvv^T. In view of the splitting K_n = K^1_n ∪ K^2_n shown in Remark 1, it is worth pointing out that the sign of α determines whether C belongs to K^1_n or K^2_n. Indeed, from (13) we have sign(α) = sign(∑_{i=1}^n (y_i − x_i)). Owing to the interlacing conditions in Theorem 4, we conclude that C ∈ K^1_n ⇔ α > 0.

An inverse problem for orthogonal rational functions
The QR factorization of Cauchy-like matrices can be performed in O(n²) arithmetic operations by taking advantage of the displacement structure, and it allows one to efficiently orthogonalize a given set of rational functions with respect to a discrete inner product [17,18]. In fact, the numerical computation of rational orthogonal functions with prescribed poles is an interesting problem related to the numerical solution of inverse eigenvalue problems with quasiseparable matrices and of secular equations, see Chapter 14 of [16]. In this section, we propose a different approach to the construction of orthogonal rational functions, considering the poles as variables. More precisely, we want to solve the following problem.
Problem 1 Given pairwise distinct real numbers x_1, …, x_n and a weight vector ω = (ω_1, …, ω_n)^T ∈ R^n_0, find n distinct real poles y_1, …, y_n, different from the nodes x_i, such that the rational functions ϕ_j(t) = 1/(t − y_j), j = 1, …, n, are mutually orthogonal with respect to the discrete inner product ⟨ϕ, ψ⟩ = ∑_{i=1}^n ω_i² ϕ(x_i)ψ(x_i).

Let ω = (ω_1, …, ω_n)^T ∈ R^n_0, C = Cauchy(x, y) and consider the Cauchy-like matrix K = D_ω C, i.e.,

K_{ij} = ω_i/(x_i − y_j),  i, j = 1, …, n.   (14)

It is readily seen that the inner product ⟨ϕ_i, ϕ_j⟩ coincides with the (i, j)-entry of the matrix K^T K. Hence, solving Problem 1 amounts to constructing the matrix K so that its columns are orthogonal, given the coefficients ω_i and x_i. In what follows, we present a complete description of the set of solutions to Problem 1 that is also amenable to numerical methods for computing the required poles y_1, …, y_n.
Introduce the rational function

f(t) = ∑_{i=1}^n ω_i²/(x_i − t).   (15)

Note that this function depends only on the data x, ω ∈ R^n of Problem 1. In particular, its poles are the nodes of the discrete inner product and not the poles of the sought functions ϕ_1(t), …, ϕ_n(t). As the following result shows, the latter are the solutions of the secular equation f(t) = α for some α ≠ 0.

Theorem 8 The numbers y_1, …, y_n are the poles of a solution to Problem 1 if and only if there exists α ≠ 0 such that f(y_i) = α for i = 1, …, n.

Observe that f(t) is increasing on every interval of its domain. Furthermore, lim_{t→±∞} f(t) = 0. Hence, for any fixed nonzero scalar α there exist n distinct numbers y_1, …, y_n such that f(y_i) = α. Figure 1 illustrates a scenario where α > 0 (left panel) or α < 0 (right panel). In particular, if α < 0 then the numbers x_i and y_i can be reordered so that x_1 < y_1 < x_2 < y_2 < ⋯ < x_n < y_n, while for α > 0 we obtain the other inequalities in the claim of Theorem 4.
The solutions of the secular equation f(t) = α with α ≠ 0 and f(t) as in (15) generally have no closed form. However, due to the relevance of the secular equation in numerical linear algebra, a wealth of efficient and accurate numerical methods is available to solve it, see, e.g., [19,20]. In particular, it is worth noting that the matrices in A_n also belong to the wider class of quasiseparable matrices, for which very efficient eigenvalue methods are available, see [16]. The equivalence between the eigenproblem for matrices in A_n, the solution of the secular equation, and the construction of orthogonal Cauchy-like matrices developed in the preceding pages yields the following result.

Theorem 9 Let y_1, …, y_n be the poles of a solution to Problem 1. Then there exists β ≠ 0 such that y_1, …, y_n are the eigenvalues of A = D_x + βωω^T. Conversely, let y_1, …, y_n be the eigenvalues of A = D_x + βωω^T where β ≠ 0 is arbitrary. Then y_1, …, y_n are the poles of a solution to Problem 1.
Proof If y_1, …, y_n solve Problem 1 then the matrix K in (14) has orthogonal columns. Hence there exists a vector w ∈ R^n_0 such that K̃ = K D_w is an orthogonal matrix. By Theorem 7, there exists A ∈ A_n such that A = K̃ D_y K̃^T, namely, A = D_x + βωω^T with β as in (13). Conversely, let y_1, …, y_n be the eigenvalues of A = D_x + βωω^T for some β ≠ 0. From Theorem 6 we have

1 + β ∑_{i=1}^n ω_i²/(x_i − y_j) = 0,  that is,  f(y_j) = −1/β,  j = 1, …, n,

where f(t) is as in (15). We only need to apply Theorem 8 with α = −1/β to get that the poles y_1, …, y_n solve Problem 1, and the proof is complete.
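The construction behind Theorem 9 is easy to try out numerically. The sketch below (with arbitrarily chosen nodes and weights) computes the poles as eigenvalues of D_x + βωω^T and checks them against the secular equation f(y_j) = −1/β and the orthogonality of the columns of K = D_ω Cauchy(x, y):

```python
import numpy as np

n = 5
x = np.arange(n, dtype=float)             # nodes of the inner product
omega = np.ones(n)                        # weights, omega in R^n_0
beta = -2.0

# Poles: eigenvalues of the diagonal-plus-rank-one matrix D_x + beta*omega*omega^T
y = np.linalg.eigvalsh(np.diag(x) + beta * np.outer(omega, omega))

# Secular equation: f(y_j) = -1/beta, with f(t) = sum_i omega_i^2/(x_i - t)
f = lambda t: np.sum(omega**2 / (x - t))
sec_err = max(abs(f(t) + 1.0 / beta) for t in y)

# K = D_omega Cauchy(x, y) has orthogonal columns: K^T K is diagonal
K = omega[:, None] / (x[:, None] - y[None, :])
G = K.T @ K
offdiag_err = np.linalg.norm(G - np.diag(np.diag(G)))
```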

Characterization of K n in terms of Cauchy pairs
We borrow from [4] the following definition.

Definition 3 A pair (A, B) of matrices A, B ∈ R^{n×n} is called a Cauchy pair if A and B are diagonalizable, rank(A − B) = 1, and the eigenvalues α_1, …, α_n of A and β_1, …, β_n of B are 2n pairwise distinct real numbers.

Actually, the original definition in the paper cited above is stated in terms of invariant subspaces rather than eigenvalues and applies to arbitrary fields. The one given here is equivalent to the real-valued case of that in [4]. The main result in that paper characterizes Cauchy matrices in terms of Cauchy pairs. More precisely, the pair (A, B) is a Cauchy pair if and only if the matrices A and B admit diagonalizations A = X D_α X^{−1} and B = Y D_β Y^{−1} such that X^{−1}Y = Cauchy(α, β). This fact allows the author of [4] to derive a bijection between suitably defined equivalence classes of Cauchy pairs and permutationally equivalent Cauchy matrices. The goal of this section is to prove a similar characterization for matrices in K_n.

Lemma 10 Let (A, B) be a Cauchy pair where A, B ∈ R^{n×n} are symmetric matrices. Let α_1, …, α_n and β_1, …, β_n be the eigenvalues of A and B, respectively. Then C = Cauchy(α, β) ∈ K_n and there exist invertible matrices X and Y such that A = X D_α X^{−1}, B = Y D_β Y^{−1} and X^{−1}Y = C.

Proof Let A = U D_α U^T and B = V D_β V^T be spectral factorizations of the given matrices A and B, where U and V are orthogonal matrices. Since rank(A − B) = 1, there exist z ∈ R^n and a scalar σ ≠ 0 such that A − B = σ zz^T, implying that

U D_α U^T − V D_β V^T = σ zz^T.

Now, let K = U^T V, p = σ U^T z and q = V^T z. It is immediate to see that K is an orthogonal matrix that verifies the identity D_{α,β}(K) = pq^T, that is, K is an orthogonal Cauchy-like matrix. By Lemma 1, that matrix admits the factorization K = D_v C D_w where C = Cauchy(α, β) and v, w ∈ R^n_0. In particular, C ∈ K_n. Finally, define X = U D_v and Y = V D_w^{−1}. Note that X and Y are invertible. Since diagonal matrices commute, we have

A = X D_α X^{−1},  B = Y D_β Y^{−1},  X^{−1}Y = D_v^{−1} U^T V D_w^{−1} = D_v^{−1} K D_w^{−1} = C,

and the proof is complete.
Lemma 11 Let C = Cauchy(α, β) ∈ K_n and let A ∈ R^{n×n} be a symmetric matrix with eigenvalues α_1, …, α_n. Then there exists a symmetric matrix B with eigenvalues β_1, …, β_n such that (A, B) is a Cauchy pair.

Proof By assumption, there exist v, w ∈ R^n_0 such that K = D_v C D_w is orthogonal. From the displacement equation D_α K − K D_β = vw^T we get

D_α − K D_β K^T = vw^T K^T = v(Kw)^T,

which is a rank-one matrix. Thus (D_α, K D_β K^T) is a Cauchy pair of symmetric matrices. Now, consider a spectral factorization A = U D_α U^T with an orthogonal matrix U. Define B = U K D_β K^T U^T and the claim follows.
Putting the previous two lemmas together, we easily get our next result.

Corollary 12
Let C be a Cauchy matrix. We have C ∈ K_n if and only if there is a Cauchy pair of symmetric matrices (A, B) whose eigenvalues are, respectively, the two families of nodes of C.

Related matrix algebras
Previous results show that every matrix in the set A_n is diagonalized by an orthogonal Cauchy-like matrix. Equivalently, for any given orthogonal Cauchy-like matrix K there exists a diagonal matrix Λ such that K Λ K^T belongs to A_n. The goal of this section is to characterize all matrices that are diagonalized by a given orthogonal Cauchy-like matrix. More precisely, for a given orthogonal matrix X ∈ R^{n×n} let

L(X) = { X D_λ X^T : λ ∈ R^n }.

This set is a commutative matrix algebra, that is, a vector space that is closed under matrix multiplication, of dimension n. If X is an orthogonal Cauchy-like matrix then Theorem 7 proves that L(X) has nonempty intersection with A_n.
Corollary 13 below provides a complete description of L(X) when X is an orthogonal Cauchy-like matrix. A similar characterization has been carried out in [12,13] in the case where X = D_v C D_w is an orthogonal Cauchy-like matrix such that v = 1 or, more generally, v = α1 for some α ≠ 0. The interest in that special case arises in the construction of matrix algebras of Loewner matrices.

Corollary 13 Let K = D_v C D_w be an orthogonal Cauchy-like matrix where C = Cauchy(x, y) ∈ K_n and v and w are normalized so that the constant α in (10) equals 1. Then A ∈ L(K) if and only if A is symmetric and

D_{x,x}(A) = vz^T − zv^T,   (16)

where Av = z. In this case, the eigenvalues of A are the entries of the vector λ = C^T D_v z.

Proof Let A = K D_λ K^T be the spectral factorization of an arbitrary matrix A ∈ L(K). Then A is a symmetric matrix such that

D_{x,x}(A) = D_x K D_λ K^T − K D_λ K^T D_x = (K D_y + vw^T) D_λ K^T − K D_λ (D_y K^T + wv^T) = vz^T − zv^T,

where we set z = K D_λ w and we used the displacement equation D_x K = K D_y + vw^T. By assumption and (7), K^T v = w, whence Av = K D_λ K^T v = K D_λ w = z. To prove the formula for the eigenvalues, let λ = (λ_1, …, λ_n)^T be the vector containing the eigenvalues of A. By the previous arguments we have the identity z = K D_λ w = K D_w λ. Then λ = D_w^{−1} K^T z = C^T D_v z, as claimed. Moreover, this identity shows that the linear map z ↦ λ is invertible. Noting that the map λ ↦ K D_λ K^T is linear and invertible, we conclude that the compound map z ↦ K D_λ K^T is a vector space isomorphism between R^n and L(K), and the proof is complete.
The characterization provided by the corollary above allows one to recover the entries of a generic matrix A ∈ L(K) from the knowledge of the vectors x, v and z. Indeed, let 1 ≤ i, j ≤ n be distinct integers. The displacement formula (16) yields (x_i − x_j) A_{ij} = v_i z_j − z_i v_j. Thus the off-diagonal entries of A admit the expression

A_{ij} = (v_i z_j − z_i v_j)/(x_i − x_j),  i ≠ j.

The diagonal entries of A cannot be retrieved from the previous formula, since the displacement operator D_{x,x} is singular and its kernel consists precisely of the diagonal matrices. However, the matrix A can be identified by means of the additional information provided by the identity Av = z. In fact, with the previous notation we have

A_{ii} = ( z_i − ∑_{j≠i} A_{ij} v_j ) / v_i,  i = 1, …, n.

Recall that v_i ≠ 0 due to the nonsingularity of K = D_v C D_w. Finally, the identity Av = z suggests a method for calculating the eigenvalues of A other than the one in the corollary. From K D_λ K^T v = z we obtain D_λ K^T v = K^T z. Hence, the i-th eigenvalue of A is λ_i = (K^T z)_i/(K^T v)_i, for i = 1, …, n.
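The two eigenvalue-recovery formulas above are easy to exercise numerically. In the sketch below (illustrative code; the orthogonal Cauchy-like matrix is built from arbitrarily chosen interlaced nodes, with v_i² = b_i and w_i² = a_i so that α = 1 in (10)):

```python
import numpy as np

n = 6
y = np.arange(n, dtype=float)            # interlaced nodes y_1 < x_1 < ... < y_n < x_n
x = y + 0.5
C = 1.0 / (x[:, None] - y[None, :])      # C = Cauchy(x, y)

# a = C^{-1} 1, b = C^{-T} 1 via quotients of node differences
a = np.array([-np.prod(y[i] - x) / np.prod(np.delete(y[i] - y, i))
              for i in range(n)])
b = np.array([np.prod(x[i] - y) / np.prod(np.delete(x[i] - x, i))
              for i in range(n)])
v, w = np.sqrt(b), np.sqrt(a)
K = np.diag(v) @ C @ np.diag(w)          # orthogonal Cauchy-like matrix

lam = np.arange(1.0, n + 1.0)            # prescribed eigenvalues
A = K @ np.diag(lam) @ K.T               # a matrix in L(K)
z = A @ v

lam1 = C.T @ (v * z)                     # eigenvalues via lambda = C^T D_v z
lam2 = (K.T @ z) / (K.T @ v)             # eigenvalues via lambda_i = (K^T z)_i/(K^T v)_i
```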

A numerical example
In this section, we consider a sequence of orthogonal Cauchy-like matrices of arbitrary order n. The construction is based on properties of Chebyshev polynomials, which make the matrices easily computable. Let T_n(x) and U_n(x) denote the n-th degree Chebyshev polynomials of the first and second kind, respectively: for x ∈ [−1, 1],

T_n(x) = cos(n arccos x),  U_n(x) = sin((n + 1) arccos x) / sin(arccos x).
For any fixed integer n ≥ 1, define the polynomials p(x) and q(x) in (8) as follows: p(x) = T_n(x), q(x) = (x + 1)U_{n−1}(x). Actually, the polynomials in (8) are monic, while these are not. However, our construction does not depend on p(x) and q(x) being monic. Indeed, the products a_i b_j entering the expression of the entries of the sought orthogonal matrix are unaffected by scaling p(x) and q(x) by arbitrary (nonzero) constants, as a consequence of (7). Consider the roots of p(x) and q(x) as nodes of a Cauchy matrix. Numbering them in ascending order, we have

x_i = cos((2(n − i) + 1)π/(2n)),  y_i = cos((n − i + 1)π/n),   (17)

respectively, for i = 1, …, n. Hence −1 = y_1 < x_1 < y_2 < x_2 < ⋯ < x_n < 1. Furthermore, p′(x) = n U_{n−1}(x) and q′(x) = U_{n−1}(x) + (x + 1)U′_{n−1}(x). Using known formulas for the differentiation of Chebyshev polynomials, after some simplification we get, by (7), the coefficients of the vectors a and b:

a_1 = 1/n,  a_i = (1 − y_i)/n for i = 2, …, n,  b_i = (1 + x_i)/n for i = 1, …, n.   (18)

Note that a_i > 0 and b_i > 0 for i = 1, …, n, as expected from Theorem 4. The aforementioned construction of orthogonal Cauchy-like matrices is described in the following statement.

Corollary 14
For any fixed integer n ≥ 1 and i = 1, . . . , n let x i , y i , a i , b i be as in (17) and (18). Moreover, let v i = √ b i and w i = √ a i for i = 1, . . . , n.
• The Cauchy-like matrix K = (v i w j /(x i − y j )) is orthogonal.
• The matrix A = D_x − vv^T ∈ A_n admits the spectral factorization A = K D_y K^T. Moreover, the matrix B = D_y + ww^T ∈ A_n admits the spectral factorization B = K^T D_x K.

Proof The first claim follows from (6) and Corollary 5. The first part of the second claim is a consequence of Theorem 7. Indeed, we have ∑_i (y_i − x_i) = −1 and v^T v = ∑_i b_i = 1, thus α = −1 in (13). The second part can be deduced from the previous one by noting that B = K^T(A + vv^T)K = D_y + K^T vv^T K and K^T v = w.

To avoid numerical cancellation, the denominators x_i − y_j appearing in the entries of the matrix Cauchy(x, y) can be computed using the right-hand side of the formula

x_i − y_j = 2 sin((Θ_i + Φ_j)/2) sin((Φ_j − Θ_i)/2),  Θ_i = (2(n − i) + 1)π/(2n),  Φ_j = (n − j + 1)π/n,   (19)

which does not involve subtraction of similar quantities. Analogously, the formulas for a_i and b_i in (18) can be revised as follows:

a_1 = 1/n,  a_i = (2/n) sin²(Φ_i/2) for i = 2, …, n,  b_i = (2/n) cos²(Θ_i/2) for i = 1, …, n.   (20)

These alternative formulas provide a significant improvement in the quality of calculations in machine arithmetic. Figure 2 illustrates the growth of ‖K^T K − I‖, where K is the orthogonal Cauchy-like matrix defined in Corollary 14, with respect to n = 2^k for k = 2, …, 12. This measurement quantifies the lack of orthogonality of K due to finite precision computations. Matrix norms are the ∞-norm (left panel) and the Frobenius norm (right panel). The results obtained using the formulas in Corollary 14 are graphed with blue diamonds, while the red circles represent the results obtained with the subtraction-free formulas (19) and (20). Dotted lines plot the functions y = nu and y = n²u, where u ≈ 2.2 · 10^{−16} is the machine precision, and are included for eye guidance. Computations are performed in standard floating-point arithmetic using MATLAB© R2021a on a computer equipped with a 1.4GHz Intel i5 dual-core processor and 8GB RAM. The similarity of the two graphs seems to indicate that the errors in forming K^T K in computer arithmetic are strongly localized.
Actually, a closer inspection shows that errors accumulate mainly on the computed diagonal entries of the matrix product. In any case, errors arising from the use of the modified formulas (19) and (20) are consistent with a relative perturbation of the order of u in the entries of K.
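The construction of Corollary 14 takes only a few lines. The sketch below (an illustration, not the paper's own code) builds K from the Chebyshev-based nodes and coefficients and checks its orthogonality and the spectral factorization of A = D_x − vv^T:

```python
import numpy as np

n = 16
i = np.arange(1, n + 1)
x = np.cos((2 * (n - i) + 1) * np.pi / (2 * n))   # roots of T_n, ascending
y = np.cos((n - i + 1) * np.pi / n)               # roots of (t+1)U_{n-1}; y[0] = -1

a = (1.0 - y) / n
a[0] = 1.0 / n                                    # the node y_1 = -1 is special
b = (1.0 + x) / n

v, w = np.sqrt(b), np.sqrt(a)
K = (v[:, None] * w[None, :]) / (x[:, None] - y[None, :])

orth_err = np.linalg.norm(K.T @ K - np.eye(n))    # lack of orthogonality

# Spectral factorization from Corollary 14: A = D_x - v v^T = K D_y K^T
A = np.diag(x) - np.outer(v, v)
fact_err = np.linalg.norm(A - K @ np.diag(y) @ K.T)
```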
Conclusions

In this work, we have provided a complete characterization of orthogonal matrices with a Cauchy-like structure. Moreover, we have highlighted their relationships with the solution of secular equations, the diagonalization of symmetric quasiseparable matrices, and the computation of orthogonal rational functions with free poles. Furthermore, we have found all matrices that are diagonalized by matrices of that type. These results were obtained by making extensive use of the displacement structure of the matrices involved.
Interest in orthogonal Cauchy-like matrices originally stemmed from their appearance in the design of special filters for signal processing purposes [3]. However, our results may have more than just theoretical interest. In fact, linear systems with various displacement-structured matrices can be solved in numerically efficient ways by means of algorithms built around so-called fast orthogonal transforms, i.e., matrix-vector products with orthogonal matrices that can be performed with algorithms using O(n log n) arithmetic operations [2,5,21]. Using the notation introduced in Section 2, the basic technique is as follows. Let A ∈ S^r_{M,N} and let U, V be invertible matrices. Then, UAV ∈ S^r_{P,Q} where P = UMU^{−1} and Q = V^{−1}NV. In this way, different displacement-structured matrix spaces can be transformed into each other. This technique is at the basis of viable numerical algorithms for numerical linear algebra with displacement-structured matrices. Indeed, the matrices U and V above are often related to Fourier-type trigonometric transforms, which are fast, numerically stable and amenable to effective parallelization. Matrix-vector products with Cauchy-like matrices can also be performed in comparable polylogarithmic arithmetic complexity, see, e.g., [21,22], owing to the diagonally scaled form of those matrices. Therefore, fast transforms based on orthogonal Cauchy-like matrices could be considered in the design of new structured linear solvers, transforming matrices through different structures. Admittedly, the issue of numerical stability of this kind of calculation is quite controversial. While some authors claim that in practice fast algorithms for matrix-vector multiplication with Cauchy-like matrices perform satisfactorily [21], known error analyses are not always supportive. On the other hand, a possible decrease in the numerical stability of fast linear solvers can be compensated for by iterative refinement techniques, as suggested in, e.g., [7].
In addition, the knowledge of matrix algebras that are simultaneously diagonalized by orthogonal Cauchy-like matrices could be exploited for structured eigensolvers.
Finally, it seems appropriate to shed light on the possible construction of orthogonal Cauchy-like matrices with displacement rank greater than 1 and other displacement structures, making room for further work. For this purpose, it is helpful to recall more properties of the displacement operators D M,N and their associated rank-structured spaces S r M,N introduced in Section 2. A matrix having displacement rank r > 1 can be written as the sum of (at most) r matrices having displacement rank 1. However, this decomposition is unsuitable for the numerical treatment of orthogonal matrices. On the other hand, higher displacement-rank matrices can be factored in terms of low displacement-rank factors, by considering appropriate displacement operators. Indeed, let X and Y be two displacement-structured matrices, X ∈ S p M,N and Y ∈ S q N,P . It is not difficult to verify that D M,P (XY ) = D M,N (X)Y + XD N,P (Y ). Hence XY ∈ S p+q M,P . An immediate consequence of this is that the product of k orthogonal Cauchy-like matrices is an orthogonal matrix with displacement rank k. Accordingly, the orthogonal Cauchy-like matrices discussed in this paper can be used to build higher displacement-rank orthogonal matrices in factorized form. Conversion to other displacement-structured spaces can be carried out as mentioned above.
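The additivity of displacement ranks under multiplication can be checked directly (illustrative sketch with arbitrarily chosen nodes; here the factors are plain Cauchy matrices, so the product has displacement rank at most 2):

```python
import numpy as np

def disp_rank(M, P, X):
    """Rank of the displacement D_{M,P}(X) = M X - X P."""
    return np.linalg.matrix_rank(M @ X - X @ P)

n = 6
x = np.arange(n, dtype=float)
y = x + 0.4
z = x + 0.8
C1 = 1.0 / (x[:, None] - y[None, :])     # C1 = Cauchy(x, y), in S^1_{x,y}
C2 = 1.0 / (y[:, None] - z[None, :])     # C2 = Cauchy(y, z), in S^1_{y,z}

Dx, Dz = np.diag(x), np.diag(z)
r = disp_rank(Dx, Dz, C1 @ C2)           # the product lies in S^2_{x,z}
```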
Funding Open access funding provided by Università degli Studi di Udine within the CRUI-CARE Agreement. Partial financial support was received from GNCS-INdAM, Italy.
Data availability Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.

Competing interests
The author declares no competing interests.
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.