Sums of Square-Zero Infinite Matrices Revisited

We improve some earlier results and prove that every N × N column finite matrix over a field of characteristic different from 2 is a sum of at most 10 square-zero matrices.


Introduction
Representing the elements of an algebra as sums or products of elements with some particular property is of interest to many authors. One such property is being square-zero. It is known [9] that an n × n complex matrix can be written as a sum of square-zero matrices if and only if its trace is equal to 0. It is also known [1] that every bounded linear operator is a sum of (at most) 64 square-zero operators. Recently, Hou, Li and Zheng [3] showed that every infinite strictly upper triangular matrix over an associative ring with identity can be written as a sum of at most 4 square-zero matrices.
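The "only if" direction of the trace criterion is elementary: a square-zero matrix is nilpotent and hence has trace 0, so any sum of square-zero matrices has trace 0. A small script (our illustration, not part of the paper) checks this on a few concrete 2 × 2 square-zero matrices:

```python
# Illustration (not from the paper): every square-zero matrix has trace 0,
# so a sum of square-zero matrices necessarily has trace 0.

def matmul(p, q):
    n = len(p)
    return [[sum(p[i][k] * q[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def trace(m):
    return sum(m[i][i] for i in range(len(m)))

examples = [
    [[0, 1], [0, 0]],      # nilpotent Jordan block
    [[2, 4], [-1, -2]],    # rank-one square-zero matrix
    [[3, -9], [1, -3]],
]
for m in examples:
    assert matmul(m, m) == [[0, 0], [0, 0]]  # square-zero
    assert trace(m) == 0                     # hence trace 0
print("ok")
```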
In [7,8] it was shown that every N × N column finite matrix (i.e., a matrix having only a finite number of nonzero entries in each column) defined over a field F with char(F) ≠ 2 can be written as a sum of (at most) 12 square-zero matrices.
In this short note we will improve the result from [7,8] and prove the following.

Theorem 1.1 Every N × N column finite matrix over a field of characteristic different from 2 is a sum of at most 10 square-zero matrices.

Additionally, we will exhibit a class of matrices for which 8 square-zero matrices suffice. More precisely, we will also prove the following.

Theorem 1.2 Let F be a field with char(F) ≠ 2. If the diagonal entries a_ii satisfy condition (1.1), then any N × N column finite matrix a whose i-th diagonal entry is equal to a_ii can be written as a sum of 8 square-zero matrices.

Proofs
Let us introduce some notation. The algebra of all N × N column finite matrices over a field F will be denoted by M_cf(F), and we write T_∞(F), NT_∞(F), D_∞(F) and NLT_∞(F) for its subalgebras of upper triangular, strictly upper triangular, diagonal and strictly lower triangular column finite matrices, respectively. By j_∞ we will mean the infinite Jordan block, i.e., the matrix with (j_∞)_{i,i+1} = 1 for all i ∈ N and all other entries equal to 0. For any a and any invertible b, by a^b we will denote the element b^{-1}ab. Before we prove the main theorems, let us cite some auxiliary results. From Lemma 2.1 given in [4] one can deduce the following.

Lemma 2.1 Let F be a field. Every matrix in NT_∞(F) all of whose entries on the first superdiagonal are nonzero is similar to j_∞.

Lemma 2.2 Let F be a field and let t ∈ T_∞(F) have pairwise different diagonal entries. Then t^x is a diagonal matrix for some invertible x ∈ T_∞(F).
Now, we move to our problem. Let a ∈ M_cf(F) be an arbitrary matrix. We write it as the sum

a = t + l,  where t ∈ T_∞(F) and l ∈ NLT_∞(F).  (2.1)

The proposition below deals with the summand l from the decomposition given above.

Proposition 2.3 [7, Prop. 2.5] Let F be a field and let l ∈ NLT_∞(F) be a column finite matrix. Then l is a sum of at most 4 square-zero matrices.
We will focus on t.
In the investigations of [7] a theorem from [9] was used, namely the theorem stating that an n × n matrix over C is a sum of (some number of) square-zero matrices if and only if its trace is 0. Moreover, any 2 × 2 complex matrix with zero trace is a sum of at most 2 square-zero matrices. In our case this can be generalized to fields of characteristic different from 2. Let us explain this issue in more detail. The proof presented by Wang and Wu uses the fact that every matrix with zero trace defined over a field of characteristic 0 is similar to a matrix with zero main diagonal. This statement can be found in [2] as an exercise. In the same exercise one reads that if char(F) ≠ 0, then the above conclusion may be false. However, there are always some exceptions. In particular, we have the following.

Lemma 2.4 Let F be a field of characteristic different from 2. For any a ∈ F, the matrix diag(a, -a) is a sum of 2 square-zero matrices.

Proof It can be checked that for a = 0 the claim is trivial, whereas for any b ≠ 0 the matrices

s_1 = [[a/2, b], [-a^2/(4b), -a/2]],  s_2 = [[a/2, -b], [a^2/(4b), -a/2]]

are square-zero and s_1 + s_2 = diag(a, -a).

Notice that the assumption char(F) ≠ 2 in the above lemma is indispensable. From the latter result the following corollary is immediate.

Corollary 2.5 Let F be a field with char(F) ≠ 2. Then any diagonal matrix d ∈ D_∞(F) with d_{2i,2i} = -d_{2i-1,2i-1} for all i ∈ N is a sum of 2 square-zero matrices.

Now we get back to t, defined in accordance with (2.1). Instead of writing it as a sum of a diagonal and a strictly upper triangular matrix, we decompose it as follows:

t = t_1 + t_2,  (2.2)

splitting the first superdiagonal of t between the two summands so that every entry of t_2 on the first superdiagonal is nonzero.
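The explicit 2 × 2 matrices in the proof of Lemma 2.4 are our reconstruction from the surviving text; the following sketch verifies them numerically over the rationals (any b ≠ 0 works, and division by 2 is exactly where char(F) ≠ 2 is used):

```python
# Verification of the (reconstructed) decomposition from Lemma 2.4:
# diag(a, -a) = s1 + s2 with s1^2 = s2^2 = 0, valid whenever 2 is invertible.
from fractions import Fraction

def matmul(p, q):
    n = len(p)
    return [[sum(p[i][k] * q[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

a, b = Fraction(5), Fraction(3)          # sample values; b must be nonzero
s1 = [[a / 2,  b], [-a * a / (4 * b), -a / 2]]
s2 = [[a / 2, -b], [ a * a / (4 * b), -a / 2]]

zero = [[0, 0], [0, 0]]
assert matmul(s1, s1) == zero and matmul(s2, s2) == zero
total = [[s1[i][j] + s2[i][j] for j in range(2)] for i in range(2)]
assert total == [[a, 0], [0, -a]]
print("diag(a, -a) is a sum of 2 square-zero matrices")
```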
Obviously t_2 is a strictly upper triangular matrix with no zero entries on the first superdiagonal, whereas t_1 has nonzero entries only on the main and (possibly) the first superdiagonal. Thus, by Lemma 2.1, t_2 is similar to j_∞. Therefore we have the following.

Lemma 2.6 For any field F, the matrix t_2 defined as in (2.2) is a sum of 2 square-zero matrices.
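The standard way to split j_∞ into two square-zero matrices is to separate the odd- and even-indexed superdiagonal entries. A finite truncation (our illustration, under the assumption that this is the intended splitting) can be checked directly:

```python
# Sketch (finite truncation, our illustration): the n x n shift matrix j_n,
# like j_infinity, splits into two square-zero summands by putting the
# odd- and even-position superdiagonal entries into separate matrices.

def matmul(p, q):
    n = len(p)
    return [[sum(p[i][k] * q[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

n = 8
odd  = [[1 if j == i + 1 and i % 2 == 0 else 0 for j in range(n)]
        for i in range(n)]
even = [[1 if j == i + 1 and i % 2 == 1 else 0 for j in range(n)]
        for i in range(n)]

zero = [[0] * n for _ in range(n)]
# no two selected superdiagonal positions are adjacent, so each part squares to 0
assert matmul(odd, odd) == zero and matmul(even, even) == zero
shift = [[odd[i][j] + even[i][j] for j in range(n)] for i in range(n)]
assert all(shift[i][i + 1] == 1 for i in range(n - 1))
print("j_n = (odd part) + (even part), both square-zero")
```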
Consider now t_1. We decompose it into a sum t_1 = t_3 + t_4, with t_3 and t_4 of the forms (2.3), where the entries x_i and y_i can be found inductively, using the recurrence with the x_i defined as in (2.4). Clearly, in this case (t_1)_ii = y_i for all i ∈ N. Since a satisfies assumption (1.1), t satisfies it as well. Hence the elements of the main diagonal of t_1 are pairwise different. By Lemma 2.2, this means that for some x ∈ T_∞(F) the matrix t_1^x is diagonal, with (t_1^x)_ii = y_i and y_{2i} = -y_{2i-1}. By our previous considerations, t_1^x is a sum of 2 square-zero matrices. Clearly, d is a sum of 2 square-zero matrices as well. Concluding, t is a sum of 4 square-zero matrices, and our result follows.
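The final step rests on Corollary 2.5: a diagonal matrix whose entries come in pairs (y, -y) splits into two square-zero matrices by applying Lemma 2.4 blockwise, one 2 × 2 block per pair. The following sketch (our illustration, on a finite truncation) assembles that block decomposition:

```python
# Our sketch of the blockwise construction behind Corollary 2.5: a diagonal
# matrix with paired entries y, -y, ... is a sum of 2 square-zero matrices,
# built from the 2x2 decomposition of Lemma 2.4 block by block.
from fractions import Fraction

def matmul(p, q):
    n = len(p)
    return [[sum(p[i][k] * q[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

ys = [Fraction(v) for v in (1, 4, -2)]   # sample pairs (y, -y) on the diagonal
n = 2 * len(ys)
s1 = [[Fraction(0)] * n for _ in range(n)]
s2 = [[Fraction(0)] * n for _ in range(n)]
for k, y in enumerate(ys):
    i = 2 * k
    # 2x2 square-zero pair summing to diag(y, -y), with b = 1
    s1[i][i], s1[i][i+1], s1[i+1][i], s1[i+1][i+1] = y / 2,  1, -y * y / 4, -y / 2
    s2[i][i], s2[i][i+1], s2[i+1][i], s2[i+1][i+1] = y / 2, -1,  y * y / 4, -y / 2

zero = [[0] * n for _ in range(n)]
assert matmul(s1, s1) == zero and matmul(s2, s2) == zero
d = [[s1[i][j] + s2[i][j] for j in range(n)] for i in range(n)]
expected = [[(ys[i // 2] if i % 2 == 0 else -ys[i // 2]) if i == j else 0
             for j in range(n)] for i in range(n)]
assert d == expected
print("paired diagonal = sum of 2 square-zero matrices")
```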
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.