1 Introduction

We denote by \(\mathbb{C}^{n\times n}\) and \(\mathbb{R}^{n\times n}\) the sets of \(n \times n\) complex matrices and \(n \times n\) real matrices, respectively. For a positive integer n, let \(S_{n}\) be the set of all n! permutations of \(\{1, 2, \ldots, n\}\). If \(A=(a_{i,j}) \in\mathbb{C}^{n\times n}\) and \(\sigma\in S_{n}\), then the sequence \(a_{1,\sigma(1)}, a_{2,\sigma(2)}, \ldots, a_{n,\sigma (n)}\) is called a transversal of A [2]. Let \(A\in \mathbb{C}^{n\times n}\), \(1\leq i_{1} < i_{2}<\cdots< i_{k}\leq n\), and \(1\leq j_{1} < j_{2}<\cdots< j_{s}\leq n\). We denote by \(A[i_{1}, i_{2}, \ldots, i_{k}|j_{1}, j_{2}, \ldots, j_{s}]\) the \(k\times s\) submatrix of A that lies in rows \(i_{1}, i_{2}, \ldots, i_{k}\) and columns \(j_{1}, j_{2}, \ldots, j_{s}\), and by \(A(i_{1}, i_{2}, \ldots, i_{k}|j_{1}, j_{2}, \ldots, j_{s})\) the \((n-k)\times(n-s)\) submatrix of A obtained by deleting those rows and columns. A matrix \(A=(a_{i,j})\in\mathbb{C}^{n\times n}\) is called diagonally magic if

$$ \sum_{i=1}^{n} a_{i,\sigma(i)}=\sum_{i=1}^{n} a_{i,\pi(i)} $$

for all \(\sigma, \pi\in S_{n}\).

Obviously, the zero matrix \(0_{n\times n}\) and \(J=[1]_{n\times n}\), the matrix of all ones, are diagonally magic matrices. In [1], we prove that

$$ B_{n}= \begin{pmatrix} 1&2 & \cdots& n\\ n+1&n+2 & \cdots& 2n\\ \vdots&\vdots& \ddots& \vdots\\ (n-1)n+1&(n-1)n+2 & \cdots& n^{2} \end{pmatrix} $$

and the Hankel matrix

$$ C_{n}= \begin{pmatrix} 1&2 & \cdots& n\\ 2&3 & \cdots& n+1\\ \vdots&\vdots& \ddots& \vdots\\ n&n+1 & \cdots& 2n-1 \end{pmatrix} $$

are diagonally magic matrices. Thus, diagonally magic matrices are plentiful. The nonnegative matrices \(B_{n}\) and \(C_{n}\) have attracted considerable research interest [3, 4].
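As a quick sanity check (not part of the original argument), the definition can be verified by brute force for small n; the helper names below are our own.

```python
# Brute-force check that B_n and C_n are diagonally magic: enumerate all
# n! transversals and confirm that every transversal sum is the same.
from itertools import permutations

def is_diagonally_magic(A):
    """Return True if all transversal sums of the square matrix A coincide."""
    n = len(A)
    sums = {sum(A[i][s[i]] for i in range(n)) for s in permutations(range(n))}
    return len(sums) == 1

def B(n):
    # B_n: entry (i, j) is i*n + j + 1 (0-indexed), i.e. rows 1..n, n+1..2n, ...
    return [[i * n + j + 1 for j in range(n)] for i in range(n)]

def C(n):
    # C_n: Hankel matrix with entry (i, j) equal to i + j + 1 (0-indexed)
    return [[i + j + 1 for j in range(n)] for i in range(n)]

for n in range(1, 6):
    assert is_diagonally_magic(B(n))
    assert is_diagonally_magic(C(n))
```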

2 Main result

The rank inequality for diagonally magic matrices can be stated as follows.

Theorem 2.1

([1], Theorem 2.1)

Let \(A \in\mathbb{C}^{n\times n}\) be a diagonally magic matrix. Then \(\operatorname {rank}(A)\leq2\).

There are diagonally magic matrices of ranks 0, 1, 2. Indeed, \(\operatorname {rank}(0_{n\times n} )=0\), \(\operatorname {rank}([1]_{n\times n} )=1\), and \(\operatorname {rank}(B_{n})=\operatorname {rank}(C_{n})=2\).
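These rank values are easy to confirm numerically; the following is a small sketch using NumPy (the choice \(n=5\) and the variable names are ours).

```python
# Numerical confirmation that the examples above realize ranks 0, 1, and 2.
import numpy as np

n = 5
B_n = np.arange(1, n * n + 1).reshape(n, n)            # 1..n^2, row by row
C_n = np.fromfunction(lambda i, j: i + j + 1, (n, n))  # Hankel matrix
assert np.linalg.matrix_rank(np.zeros((n, n))) == 0
assert np.linalg.matrix_rank(np.ones((n, n))) == 1
assert np.linalg.matrix_rank(B_n) == 2
assert np.linalg.matrix_rank(C_n) == 2
```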

The purpose of this note is to give a simple proof of Theorem 2.1. Our proof depends only on the following fact.

Theorem 2.2

Let \(C =(c_{i,j})\in\mathbb{R}^{n\times n}\) be a positive matrix with

$$\prod_{i=1}^{n}c_{i,\gamma(i)}=\prod _{j=1}^{n}c_{j,\tau(j)} $$

for all \(\gamma, \tau\in S_{n}\). Then there exist positive diagonal matrices \(X=\operatorname {diag}(x_{1},x_{2},\ldots,x_{n})\) and \(Y= \operatorname {diag}(y_{1},y_{2},\ldots,y_{n})\) such that

$$C=XJY. $$

Proof

Let B be a \(k\times k\) submatrix of C. Then there are row indices \(\alpha=(i_{1},i_{2},\ldots,i_{k})\) and column indices \(\beta=(j_{1},j_{2},\ldots,j_{k})\) such that \(B=C[\alpha|\beta]\). Note that the union of a transversal of B and a transversal of \(C(\alpha|\beta)\) is a transversal of C. Choose an arbitrary but fixed transversal T of the square matrix \(C(\alpha|\beta)\) (if \(k=n\), then \(C(\alpha|\beta)\) is empty, and we take T to be empty). For any \(\sigma,\pi\in S_{k}\), the entries \(c_{i_{1},j_{\sigma(1)}},\ldots ,c_{i_{k},j_{\sigma(k)}}\) together with the entries of T constitute a transversal of C, and so do \(c_{i_{1},j_{\pi(1)}},\ldots,c_{i_{k},j_{\pi(k)}}\) together with the entries of T. Let b be the product of the entries of T (an empty product equals 1). Obviously, \(b>0\). Since

$$\prod_{i=1}^{n}c_{i,\gamma(i)}=\prod _{j=1}^{n}c_{j,\tau(j)} $$

for all \(\gamma, \tau\in S_{n}\), we have

$$b\prod_{t=1}^{k} c_{i_{t},j_{\sigma(t)}}=b\prod _{t=1}^{k} c_{i_{t},j_{\pi(t)}}, $$

which yields

$$\prod_{t=1}^{k} c_{i_{t},j_{\sigma(t)}}=\prod _{t=1}^{k} c_{i_{t},j_{\pi(t)}}. $$

Particularly, this shows that any \(2\times2\) submatrix

$$B= \begin{pmatrix} c_{i_{1},j_{1}} &c_{i_{1},j_{2}}\\ c_{i_{2},j_{1}}&c_{i_{2},j_{2}} \end{pmatrix} $$

of C satisfies

$$ c_{i_{1},j_{1}}c_{i_{2},j_{2}}=c_{i_{1},j_{2}}c_{i_{2},j_{1}}. $$
(1)
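A spot check of (1), in our own notation: taking C to be the entrywise exponential of the diagonally magic matrix \(B_{3}\) yields a positive matrix with equal transversal products, and every \(2\times2\) submatrix then satisfies the cross-product identity.

```python
# Verify (1) for C = exp(B_3) entrywise: every 2x2 submatrix of C satisfies
# c_{i1,j1} c_{i2,j2} = c_{i1,j2} c_{i2,j1}.
import numpy as np
from itertools import combinations

n = 3
A = np.arange(1, n * n + 1, dtype=float).reshape(n, n)  # B_3 is diagonally magic
C = np.exp(A)  # positive matrix; all transversal products are equal

for i1, i2 in combinations(range(n), 2):
    for j1, j2 in combinations(range(n), 2):
        assert np.isclose(C[i1, j1] * C[i2, j2], C[i1, j2] * C[i2, j1])
```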

For any \(x_{1}>0\), let

$$ y_{j}=\frac{c_{1,j}}{x_{1}} $$
(2)

for \(j=1,2,\ldots,n\) and

$$ x_{i}=\frac{c_{i,1}}{c_{1,1}} x_{1} $$
(3)

for \(i=2,3,\ldots,n\). According to (1), (2), and (3), we have

$$c_{i,j}=\frac{c_{i,1}c_{1,j}}{c_{1,1}}=x_{i}y_{j} $$

for all \(i,j=1,2,\ldots,n\). Let \(X=\operatorname {diag}(x_{1},x_{2},\ldots,x_{n})\) and \(Y= \operatorname {diag}(y_{1},y_{2},\ldots,y_{n})\). Obviously, X and Y are positive diagonal matrices, and we have

$$C=XJY. $$

This completes the proof. □
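The construction (2)-(3) can be illustrated numerically. Below, C is built as a positive outer product (which is exactly the class of matrices the theorem characterizes, since an outer product has all transversal products equal), and the recipe recovers X and Y; the variable names are ours.

```python
# Numerical sketch of the proof of Theorem 2.2: recover positive diagonal
# X, Y with C = X J Y via (2) and (3).
import numpy as np

rng = np.random.default_rng(0)
n = 4
u = rng.uniform(0.5, 2.0, n)
v = rng.uniform(0.5, 2.0, n)
C = np.outer(u, v)  # positive; every transversal product equals prod(u)*prod(v)

x1 = 1.0                    # any x_1 > 0 works
y = C[0] / x1               # (2): y_j = c_{1,j} / x_1
x = C[:, 0] * x1 / C[0, 0]  # (3): x_i = c_{i,1} x_1 / c_{1,1}
X, Y = np.diag(x), np.diag(y)
J = np.ones((n, n))
assert np.allclose(C, X @ J @ Y)
```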

We are now ready to present our proof of Theorem 2.1.

Proof of Theorem 2.1

First, suppose that A is real. Since A is diagonally magic, the elementwise exponential \(C =\exp (A):=(e^{a_{i,j}})\in\mathbb{R}^{n\times n}\) is a positive matrix whose transversal products are all equal, because \(\prod_{i=1}^{n} e^{a_{i,\sigma(i)}}=e^{\sum_{i=1}^{n} a_{i,\sigma(i)}}\). Hence, by Theorem 2.2, C is diagonally equivalent to J, the all-ones matrix, that is,

$$c_{i,j} = x_{i}y_{j},\quad i,j=1, 2, \ldots, n, $$

for suitable positive vectors \(x=(x_{1}, x_{2}, \ldots, x_{n})^{T}\) and \(y=(y_{1}, y_{2}, \ldots, y_{n})^{T}\). Hence, if \(q = (\log (x_{1}), \log (x_{2}), \ldots, \log (x_{n}) )^{T}\) and \(r= (\log (y_{1}), \log(y_{2}), \ldots, \log (y_{n}) )^{T}\), then

$$a_{i,j} = \log (x_{i}) + \log (y_{j}),\quad i,j=1, 2, \ldots, n. $$

Hence,

$$A = q\cdot e_{n}^{T} + e_{n}\cdot r^{T}, $$

where \(e_{n}=(\underbrace{1, 1,\ldots, 1}_{n})^{T}\). Thus, A is the sum of two matrices of rank at most 1 and, hence, has rank at most 2.

Now let A be complex, so that

$$A = B + iC \quad (B, C \text{ real}). $$

Since A is a diagonally magic matrix, so are B and C. Hence, both B and C are of the form

$$\begin{aligned}& B=q_{1}\cdot e_{n}^{T} + e_{n}\cdot r_{1}^{T} \quad \text{with } q_{1}, r_{1} \text{ real}, \\& C=q_{2}\cdot e_{n}^{T} + e_{n}\cdot r_{2}^{T} \quad \text{with } q_{2}, r_{2} \text{ real,} \end{aligned}$$

and hence

$$ A=(q_{1}+iq_{2})\cdot e_{n}^{T} + e_{n}\cdot(r_{1}+ir_{2})^{T}. $$
(4)

Again, A is the sum of two matrices of rank at most 1, so \(\operatorname {rank}(A)\leq2\). This completes the proof. □
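The complex case can be illustrated with the matrices from the introduction: \(A=B_{n}+iC_{n}\) is diagonally magic because both parts are, and its rank is 2, consistent with (4).

```python
# Numerical illustration of the complex case: A = B_4 + i*C_4 has rank 2.
import numpy as np

n = 4
B = np.arange(1, n * n + 1, dtype=float).reshape(n, n)  # B_4
C = np.fromfunction(lambda i, j: i + j + 1, (n, n))     # C_4 (Hankel)
A = B + 1j * C
assert np.linalg.matrix_rank(A) <= 2
```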

According to (4), we obtain that a diagonally magic matrix A can be written in the form

$$ A=x\cdot e^{T}_{n} + e_{n}\cdot y^{T}= \begin{pmatrix} x_{1}+y_{1}&x_{1}+y_{2} & \cdots& x_{1}+y_{n}\\ x_{2}+y_{1}&x_{2}+y_{2} & \cdots& x_{2}+y_{n}\\ \vdots&\vdots& \ddots& \vdots\\ x_{n}+y_{1}&x_{n}+y_{2} & \cdots& x_{n}+y_{n} \end{pmatrix} . $$
(5)

If \(A=(a_{i,j}) \in\mathbb{C}^{n\times n}\) is a diagonally magic matrix, then for any \(x_{1}\in\mathbb{C}\), we may let

$$y_{j}=a_{1,j}-x_{1} $$

for \(j=1,2,\ldots,n\) and

$$x_{i}=a_{i,1}-a_{1,1}+x_{1} $$

for \(i=2,3,\ldots,n\). By (5) we have

$$a_{i,j}=x_{i}+y_{j} $$

for all \(i,j=1,2,\ldots,n\). For example,

$$\begin{aligned} B_{n} =& \begin{pmatrix} 1&2 & \cdots& n\\ n+1&n+2 & \cdots& 2n\\ \vdots&\vdots& \ddots& \vdots\\ (n-1)n+1&(n-1)n+2 & \cdots& n^{2} \end{pmatrix} \\ =& \begin{pmatrix} 0+1&0+2 & \cdots& 0+n\\ n+1&n+2 & \cdots& n+n\\ \vdots&\vdots& \ddots& \vdots\\ (n-1)n+1&(n-1)n+2 & \cdots& (n-1)n+n \end{pmatrix} \end{aligned}$$

and

$$ C_{n}= \begin{pmatrix} 1&2 & \cdots& n\\ 2&3 & \cdots& n+1\\ \vdots&\vdots& \ddots& \vdots\\ n&n+1 & \cdots& 2n-1 \end{pmatrix} = \begin{pmatrix} 1+0&1+1 & \cdots& 1+(n-1)\\ 2+0&2+1 & \cdots& 2+(n-1)\\ \vdots&\vdots& \ddots& \vdots\\ n+0&n+1 & \cdots& n+(n-1) \end{pmatrix} . $$

From (5) we can obtain the characteristic polynomial, the eigenvalues, and the eigenvectors of A. In fact, for \(n\geq2\), the characteristic polynomial of A is

$$ p_{A}(\lambda)=\lambda^{n-2} \Biggl( \lambda^{2}-\lambda \Biggl(\sum_{i=1}^{n}(x_{i}+y_{i}) \Biggr)+\sum_{i=1}^{n}x_{i}\sum _{j=1}^{n}y_{j}-n\sum _{i=1}^{n}(x_{i}y_{i}) \Biggr). $$
(6)

From (6) we can see that the algebraic multiplicity of the eigenvalue 0 of the diagonally magic matrix A is at least \(n-2\).
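Formula (6) can be verified numerically: building A from (5) with fixed vectors x and y (our choice below), the characteristic polynomial of A should be \(\lambda^{n-2}(\lambda^{2}-s\lambda+p)\) with s and p as in (6).

```python
# Check (6): char. poly of A = (x_i + y_j) is lambda^{n-2}(lambda^2 - s*lambda + p),
# where s = sum(x_i + y_i) and p = (sum x_i)(sum y_j) - n * sum(x_i y_i).
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, 1.0, -1.0, 2.0])
n = len(x)
A = x[:, None] + y[None, :]

s = np.sum(x + y)                          # coefficient of lambda in the quadratic
p = x.sum() * y.sum() - n * np.sum(x * y)  # constant term of the quadratic

coeffs = np.poly(A)  # characteristic polynomial coefficients, highest degree first
expected = np.zeros(n + 1)
expected[:3] = [1.0, -s, p]
assert np.allclose(coeffs, expected, atol=1e-8)
```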