An existence and uniqueness result about algebras of Schwartz distributions

We prove that there exists essentially one {\it minimal} differential algebra of distributions $\A$ satisfying all the properties stated in the Schwartz impossibility result [L. Schwartz, Sur l'impossibilit\'e de la multiplication des distributions, 1954], and such that $\C_p^{\infty} \subseteq \A \subseteq \DO' $ (where $\C_p^{\infty}$ is the set of piecewise smooth functions and $\DO'$ is the set of Schwartz distributions over $\RE$). This algebra is endowed with a multiplicative product of distributions, which generalizes the product defined in [N. C. Dias, J. N. Prata, A multiplicative product of distributions and a class of ordinary differential equations with distributional coefficients, 2009]. If an algebra is not minimal, but satisfies the previous conditions, is closed under anti-differentiation and the dual product by smooth functions, and has a distributional product that is continuous at zero, then it is necessarily an extension of $\A$.


1. Introduction
Schwartz's famous impossibility result [17] states that:

Theorem 1.1. There is no associative algebra (G, +, ⊛) satisfying the property (A1) -- the space of Schwartz distributions D′ over R is linearly embedded into G -- together with the conditions (A2)-(A4).

If we replace (A1) by

(A1') C∞_p ⊆ G ⊆ D′, where C∞_p is the set of piecewise smooth functions,

then such algebras do exist (Theorem 1.2). The products *_M, M ⊆ R (cf. Definition 2.7), are extensions (to the case of possibly intersecting singular supports) of the product of distributions with disjoint singular supports presented by Hörmander in [11, p. 55]. Theorem 1.2 was proved by two of us for the case M = R in [4], and will be (easily) extended to the general case M ⊆ R in section 2.3.
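The obstruction behind Theorem 1.1 can be illustrated by a classical computation (a standard example, not part of this paper's argument), involving the distributions x, δ and the principal value vp(1/x):

```latex
% Classical obstruction: suppose an associative product \circledast
% extends both the dual product by smooth functions (so that
% x \circledast \delta = x\delta = 0) and the identity
% x \cdot \mathrm{vp}(1/x) = 1. Then the two bracketings disagree:
\[
\Bigl(\mathrm{vp}\tfrac{1}{x} \circledast x\Bigr) \circledast \delta
   = 1 \circledast \delta = \delta ,
\qquad
\mathrm{vp}\tfrac{1}{x} \circledast \bigl(x \circledast \delta\bigr)
   = \mathrm{vp}\tfrac{1}{x} \circledast 0 = 0 .
\]
```

Note that vp(1/x) is not a derivative of a piecewise smooth function (ln|x| has an infinite lateral limit at 0), so the restriction (A1') removes precisely this kind of obstruction.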
In this paper we want to study the related problem of whether the associative algebras (A, +, *_M) are unique, i.e. the only ones satisfying the conditions (A1') and (A2)-(A4).
Let us introduce the following notation: we say that in G the product ⊛ by smooth functions is continuous (or simply that ⊛ is partially continuous) at F ∈ G iff for every ξ ∈ C∞ and every sequence (F_n)_{n∈N} in G such that F_n → F in D′, we have ξ ⊛ F_n → ξ ⊛ F in D′. We note that the dual product and the family of products *_M (defined in A) are all partially continuous (cf. Theorem 2.11(vi)). We also remark that if ⊛ (defined in G) is partially continuous at zero then it is partially continuous everywhere in G (because ⊛ is bilinear and G is a vector space).
Our results are summarized in the following Theorem and Corollary:

Main Theorem. Let (G, +, ⊛) be an associative algebra of distributions that satisfies the conditions (A1'), (A2)-(A4) given above, and
(A5.1) Every F ∈ G is locally a finite order derivative of some G ∈ G ∩ C,
(A5.2) The product ⊛ is partially continuous at zero;
then A ⊆ G and the restriction of ⊛ to A is given by *_M for some M ⊆ R.
In other words, (A, +, *_M) is a subalgebra of (G, +, ⊛).

Remark 1.3. Notice that every F ∈ D′ is locally a finite order derivative of some continuous function G ∈ C (cf. Theorem 3.4.2, [19]). The condition (A5.1) adds the requirement that if F ∈ G then also G ∈ G. This condition can be replaced by (cf. Remark 3.3):
(A5.1') Anti-differentiation and the dual product by smooth functions are inner operations in G.
The conditions (A5.1) and (A5.1') are both satisfied by G = D′ and G = A. We also note that the conditions (A5.1) and (A5.2) can be replaced by the single, stronger condition (cf. Remark 3.2):
(A6) Every F ∈ G is globally a finite order derivative of some G ∈ G ∩ C;
which is satisfied by G = A and by G = D′(Ω) for arbitrary compact sets Ω ⊂ R. We will see that (A6) implies (A5.2) (and, of course, also (A5.1)).

Corollary 1.4. Let (A, +, ⊛) be an associative algebra satisfying the conditions (A1'), (A2)-(A4). Then ⊛ = *_M for some M ⊆ R.

The proof of this Corollary is straightforward: since the space A satisfies (A1') and (A6), and thus (cf. Remark 1.3) also (A5.1) and (A5.2), it follows from the Main Theorem that ⊛ = *_M for some M ⊆ R.
The problem of proving the uniqueness of the algebras (A, +, *_M) was considered before in an article by B. Fuchssteiner that was published in Mathematische Annalen [8] (see also [9]) and recently reviewed in the Ph.D. thesis [18]. The main result of [8] is essentially our Corollary 1.4. Unfortunately, the paper [8] is not well known and came to our knowledge only after we had concluded our own proof of the uniqueness result. In spite of the obvious overlap with the results of [8], we decided to write down our own results because: (i) our proof is different and, in our view, simpler than the one presented in [8, 18]; and (ii) our results are more general, because they do not apply only to the space A, but to the whole family of spaces G ⊆ D′. In practice this means that we do not impose the restriction that the product ⊛ is an inner operation in A; instead, we prove that this is a consequence of the properties (A1'), (A2)-(A5) for a general space G ⊆ D′.
Finally, we remark that the algebras (A, +, *_M) provide an interesting setting to obtain intrinsic formulations (i.e. defined within the space of Schwartz distributions) for some classes of differential operators and differential equations with singular coefficients. This approach has been explored, in particular, for Schrödinger operators with point interactions and for ODEs with singular coefficients [5, 6, 7]. It yields a formulation which is more general than the ones based on other intrinsic products, like the model products [1, 12, 15], and which is an alternative to non-intrinsic formulations, like those in terms of Colombeau generalized functions [2, 3, 10, 13, 15, 16].
In the next section we study the main properties of the product *_M and show that, for all M ⊆ R, the algebras (A, +, *_M) satisfy the conditions in Theorem 1.2. In section 3 we prove the Main Theorem. Capital roman letters F, G and J denote general distributions; φ, ψ and ξ are smooth functions; and f, g and h are locally integrable functions or regular distributions (we normally use the same notation for both objects). If we need to be more specific, we use the subscript D′ for regular distributions; for instance, f_{D′} is the regular distribution associated to the locally integrable function f.
The characteristic function of Ω ⊆ R is written χ_Ω; the Heaviside step function is H = χ_{R+}. As usual, δ_x is the Dirac measure with support at x; if x = 0 we write only δ.

2.1. General definitions. Let D be the space of smooth functions with support on a compact subset of R, and let D′ be its dual (the space of Schwartz distributions). As usual, supp F denotes the support of F ∈ D′, and sing supp F denotes its singular support.
For every locally integrable function f ∈ L¹_loc one defines a regular distribution f_{D′} by ⟨f_{D′}, t⟩ = ∫_R f(x) t(x) dx, for all t ∈ D. By abuse of notation, we will usually identify f_{D′} with f. The nth-order Schwartz distributional derivative of the distribution F is defined by ⟨D_x^n F, t⟩ = (−1)^n ⟨F, (d^n/dx^n) t⟩, where (d^n/dx^n) t denotes the nth-order classical (pointwise) derivative of t. If f is absolutely continuous, the Schwartz distributional derivative and the classical pointwise derivative (defined a.e.) coincide, i.e. D_x f = (f′)_{D′}.
The dual product of a function φ ∈ C∞ by a distribution F ∈ D′ is defined by ⟨φF, t⟩ = ⟨F, φt⟩, for all t ∈ D, and it is a generalization of the standard product of functions, i.e. φ f_{D′} = (φf)_{D′} for every f ∈ L¹_loc.
The dual product is bilinear. Moreover, the distributional derivative D_x satisfies the Leibniz rule with respect to the dual product: D_x(φF) = φ′ F + φ D_x F.
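As a simple illustration (a standard computation, using only the two definitions above):

```latex
% Leibniz rule for the dual product with \varphi(x) = x and F = H:
\[
D_x(xH) \;=\; x' H + x\, D_x H \;=\; H + x\delta \;=\; H ,
\]
% since the dual product x\delta vanishes:
% \langle x\delta, t\rangle = \langle \delta, x t\rangle = 0\cdot t(0) = 0
% for every t \in \mathcal{D}.
```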

2.2. The multiplicative product *. For a detailed presentation and proofs of the main results, the reader should refer to [4]. Let C∞_p be the space of piecewise smooth functions on R: f ∈ C∞_p iff there is a finite set I ⊂ R such that f ∈ C∞(R\I) and the lateral limits lim_{x→x_0^±} f^(n)(x) exist and are finite for all x_0 ∈ I and all n ∈ N_0. We have, of course, C∞_p ⊂ L¹_loc.

Definition 2.1. Let A be the space of piecewise smooth functions C∞_p (regarded as regular distributions) together with their distributional derivatives to all orders.
All the elements of A are distributions with finite singular supports, and can be written explicitly in the following form (Lemma 2.2): for each F ∈ A there is a finite set I = {x_1 < ... < x_m} and associated open intervals Ω_i = (x_i, x_{i+1}), i = 0, ..., m (where x_0 = −∞ and x_{m+1} = +∞) such that F = f + Δ_F, where f ∈ C∞_p is smooth on each Ω_i, and Δ_F is a finite linear combination of Dirac deltas and their derivatives with support on a subset of I. We have, of course, sing supp F ⊆ I.
Let us recall the definition of the Hörmander product of distributions with non-intersecting singular supports [11, p. 55].

Definition 2.3. Let F, G ∈ D′ be two distributions such that sing supp F and sing supp G are finite disjoint sets. Let {Ω_i ⊂ R, i = 1, ..., d} be a finite open covering of R such that, on each open set Ω_i, either F or G is a smooth function. The Hörmander product of F by G is then defined on each Ω_i by the corresponding dual product: (F · G)|_{Ω_i} = F|_{Ω_i} G|_{Ω_i}.

Let us emphasise that the Hörmander product is well-defined for all F, G ∈ A provided that sing supp F and sing supp G are finite disjoint sets. We now extend the Hörmander product to the case of distributions with intersecting singular supports (see [4] for details).

Definition 2.4. The product * is defined by
(2.8) F * G = lim_{ε→0^+} F(x) · G(x + ε) ,
where the product F(x) · G(x + ε) is the Hörmander product and the limit is taken in the distributional sense.
Notice that for sufficiently small ε > 0, F(x) and G(x + ε) have disjoint singular supports, hence the Hörmander product in (2.8) is well-defined.
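For instance, computing directly from (2.8) (the values below also follow from the explicit formula of Theorem 2.5):

```latex
% The shift by \epsilon > 0 separates the singular supports;
% the limits are taken in D'.
\[
\delta * H \;=\; \lim_{\epsilon \to 0^+} \delta(x)\, H(x+\epsilon)
           \;=\; \lim_{\epsilon \to 0^+} H(\epsilon)\, \delta \;=\; \delta ,
\qquad
H * \delta \;=\; \lim_{\epsilon \to 0^+} H(x)\, \delta(x+\epsilon)
           \;=\; \lim_{\epsilon \to 0^+} H(-\epsilon)\, \delta_{-\epsilon} \;=\; 0 .
\]
```

In particular, the product * is non-commutative, in agreement with Theorem 2.6 below.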
The next theorem provides an explicit formula for F * G. Let F, G ∈ A, let I = sing supp F ∪ sing supp G = {x_1 < ... < x_m}, and consider the associated set of open intervals Ω_i = (x_i, x_{i+1}), i = 0, ..., m (with x_0 = −∞ and x_{m+1} = +∞). Then, in view of Lemma 2.2, F and G can be written in the form (2.9): F = f + Σ_{x_i ∈ I} Δ_F^{x_i} and G = g + Σ_{x_i ∈ I} Δ_G^{x_i}, where Δ_F^{x_i} = 0 for x_i ∈ I \ sing supp F, and likewise for Δ_G^{x_i}. Then we have:

Theorem 2.5. Let F, G ∈ A be written in the form (2.9). Then F * G is given explicitly by formula (2.10).

Finally, the main properties of the product * are summarized in the following Theorem (cf. Theorems 3.16 and 3.18, [4]):

Theorem 2.6. The product * is an inner operation in A, it is associative, distributive and non-commutative, and it reproduces the product of continuous functions in A ∩ C. The distributional derivative D_x is an inner operator in A and satisfies the Leibniz rule with respect to the product *.
We conclude that the space A, endowed with the product * , is an associative (but non-commutative) differential algebra of distributions that satisfies the properties stated in Theorem 1.2. It is, however, not the unique algebra that satisfies these conditions, as we now show.
2.3. The algebras (A, +, *_M). Let F, G ∈ A be written in the form (2.11). The products F * G and G * F are then given explicitly by the formulas (2.12) and (2.13). We can combine both formulas and obtain a slightly more general product, one that acts as F * G on the points that belong to a given set M ⊆ R, and as G * F on the points that do not belong to M; this is the product *_M of Definition 2.7, given explicitly by formula (2.14). Notice that for M = R we have F *_M G = F * G, and for M = ∅, F *_M G = G * F. The next Remark provides some explicit formulas:

Remark 2.8. Let n, m ∈ N_0 and s, t ∈ R. Let M ⊆ R and let H be the Heaviside step function. Explicit formulas for the products *_M of the distributions H(x − s), δ_s^(n) and δ_t^(m) follow from (2.10) and (2.14).

Let us introduce the following distributions:

Definition 2.9. Let M ⊆ R and let F ∈ A be written in the form (2.11). The distribution F_M ∈ A associated with F is defined by formula (2.15).

We can now write F *_M G in a compact form:

Lemma 2.10. Let F, G ∈ A and let F_M, G_M be the associated distributions of the form (2.15). Then F *_M G can be written compactly in terms of F_M and G_M.

Proof. Using (2.15) we obtain the decompositions of F_M and G_M; the result then follows from (2.14).

We now study the main properties of *_M (Theorem 2.11). A direct computation shows that the product *_M is right-distributive; in the same way one proves that it is also left-distributive.
A simple calculation, together with the associativity of *, then shows that the product *_M is associative.
(v) If F ∈ C∞ then in (2.11) we have F = f. It follows from (2.10) that F * G = G * F = fG for all G ∈ A and so, from (2.14), that F *_M G = G *_M F = fG for all M ⊆ R. More generally, for F, G ∈ A we have F *_M G = F * G + J_{R\M}, where J = G * F − F * G and J_{R\M} is defined by (2.15); notice that from (2.12) and (2.13) we have an explicit expression for J. Moreover, J is of finite support (supp J ⊆ I_F ∪ I_G). It follows that, for ξ ∈ C∞, ξ *_M G = ξG, and so the partial continuity of *_M is a consequence of the partial continuity of the dual product.
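To illustrate the role of the set M (a sketch consistent with the defining property of *_M stated above, using the values H(x − t) * δ_t = 0 and δ_t * H(x − t) = δ_t obtained from Definition 2.4):

```latex
% *_M acts as F * G at the points of M and as G * F elsewhere;
% at the common singular point t this gives
\[
H(x-t) *_M \delta_t =
\begin{cases}
  H(x-t) * \delta_t = 0 , & t \in M ,\\[3pt]
  \delta_t * H(x-t) = \delta_t , & t \notin M .
\end{cases}
\]
```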

3. Main Theorem
In this section we prove the Main Theorem. Several preparatory results that are required for the proof will be given in section 3.1 (Theorems 3.1, 3.5, 3.6 and 3.7).

3.1. Preparatory results.
Theorem 3.1. Let ξ ∈ C∞ and F ∈ G. Then
(3.1) ξ ⊛ F = F ⊛ ξ = ξF .

Proof. If F ∈ G ∩ C then (3.1) follows from (A4). Moreover, if (3.1) is valid for some F ∈ G (and all ξ ∈ C∞) then it is also valid for F′ = D_x F, because ⊛ satisfies the Leibniz rule, (3.1) holds for F, and the dual product satisfies the Leibniz rule. This proves that
(3.2) ξ ⊛ D_x^k g = D_x^k g ⊛ ξ = ξ D_x^k g
for all g ∈ G ∩ C, ξ ∈ C∞ and all k ∈ N_0.

We now extend the previous result to all F ∈ G; for this we will need to impose the extra conditions (A5.1) and (A5.2). Let (φ_i)_i be a countable family of smooth real functions satisfying the conditions (P1)-(P3) of a locally finite partition of unity: each φ_i has compact support, every compact set intersects only finitely many of the supports, and Σ_i φ_i = 1. Then, for every F ∈ G, we can write F = Σ_i φ_i F. The distributions F_i = φ_i F belong to G and are of compact support, and so F_i = D_x^{s_i} h_i for some h_i ∈ G ∩ C, s_i ∈ N_0 (cf. (A5.1)). Using (3.2) we get:
(3.3) ξ ⊛ Σ_{i=1}^n F_i = Σ_{i=1}^n ξ ⊛ F_i = Σ_{i=1}^n ξ F_i ,
and, for every F ∈ G, Σ_{i=1}^n F_i → F as n → +∞ (in D′). We then rewrite (3.3) in the form ξ ⊛ F = ξ ⊛ (F − Σ_{i=1}^n F_i) + Σ_{i=1}^n ξ F_i and take the limit n → +∞. Using the partial continuity of the product (cf. (A5.2)), we find: ξ ⊛ F = ξF, where we used (3.2) in the second step. In the same manner one proves that F ⊛ ξ = ξF.
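The induction step in the proof above can be written out explicitly (a sketch using only the Leibniz rule for ⊛, the Leibniz rule for the dual product, and the induction hypothesis ξ ⊛ F = ξF for all smooth ξ):

```latex
% If \xi \circledast F = \xi F for every smooth \xi, then the same
% identity holds for D_x F: apply the Leibniz rule for \circledast,
% the induction hypothesis (with \xi and \xi'), and the Leibniz
% rule for the dual product.
\[
\xi \circledast D_x F
  \;=\; D_x(\xi \circledast F) - \xi' \circledast F
  \;=\; D_x(\xi F) - \xi' F
  \;=\; \xi\, D_x F .
\]
```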

Remark 3.2. Notice that if we assume that G satisfies (A6), the proof of the previous Theorem is concluded at eq. (3.2), since every F ∈ G is then (globally) a finite order derivative of some g ∈ G ∩ C. The condition (A6) is satisfied by G = A, and also by G = D′(Ω) for Ω ⊆ R a compact set. If G satisfies (A6) we can also conclude that ⊛ is partially continuous (i.e. it also satisfies (A5.2)). This follows from the partial continuity of the dual product and the fact that, from (3.2), ξ ⊛ F = ξF for all ξ ∈ C∞ and F ∈ G.

Remark 3.3. The conclusion of Theorem 3.1 remains valid if (A5.1) is replaced by (A5.1'). To prove this let us consider the partition of unity (φ_i)_i defined above by (P1)-(P3). For F ∈ G we have φ_i F = D_x^{s_i} h_i for some s_i ∈ N_0 and h_i ∈ C (cf. Corollary 3.4-2a, [19]). Moreover, since anti-differentiation and the dual product by smooth functions are inner operations in G (cf. (A5.1')), we also have h_i ∈ G. Hence F ∈ G can be written in the form required by the proof of Theorem 3.1, and the rest of the proof follows from eq. (3.3).
Lemma 3.4. Let F, G ∈ G. Then supp (F ⊛ G) ⊆ supp F ∩ supp G.

Proof. We will prove that supp (F ⊛ G) ⊆ supp F (the same result is valid for G). The previous statement is equivalent to proving that ⟨F ⊛ G, t⟩ = 0 for every t ∈ D with supp t ⊆ (supp F)^c (Ω^c denotes the complement of Ω), with the obvious exception of the case supp F = R, for which the result is trivial. Let us consider a partition of unity φ_1, φ_2 ∈ C∞ such that: (i) φ_1 + φ_2 = 1; (ii) supp φ_1 ∩ supp F = ∅; (iii) supp φ_2 ∩ supp t = ∅. Then ⟨F ⊛ G, t⟩ = ⟨(φ_1 F) ⊛ G, t⟩ + ⟨F ⊛ G, φ_2 t⟩, where we used (3.1) to obtain the second term. It follows that ⟨F ⊛ G, t⟩ = 0 because, by (ii) and (iii), φ_1 F = 0 and φ_2 t = 0, respectively.
Theorem 3.5. For all s, t ∈ R: H(x − s) ⊛ H(x − t) = H(x − s) H(x − t).

Proof. Let us first consider s < t. Since supp H(s − x) ∩ supp H(x − t) = ∅, it follows from Lemma 3.4 that H(s − x) ⊛ H(x − t) = 0, and so H(x − s) ⊛ H(x − t) = (1 − H(s − x)) ⊛ H(x − t) = H(x − t). The other case s > t is proved in the same way. Let now s = t. It follows from (A4) that |x − s| ⊛ |x − s| = (x − s)². Twice differentiating this equation, we get
(3.7) 2 δ_s ⊛ |x − s| + 2 D_x|x − s| ⊛ D_x|x − s| + 2 |x − s| ⊛ δ_s = 2 ,
where we took into account that D_x²|x − s| = 2δ_s. We now prove that δ_s ⊛ |x − s| = 0. To make it simple let s = 0. We have: δ ⊛ |x| = δ ⊛ (2xH − x) = 2 (δ ⊛ x) ⊛ H − δ ⊛ x = 0. Notice that x ⊛ H = xH and δ ⊛ x = x ⊛ δ = xδ = 0 (cf. Theorem 3.1). In the same way one proves that |x − s| ⊛ δ_s = 0. Hence eq. (3.7) reduces to D_x|x − s| ⊛ D_x|x − s| = 1, i.e. H(x − s) ⊛ H(x − s) = H(x − s), which concludes the proof.
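The last step can be expanded as follows (a sketch, using D_x|x − s| = 2H(x − s) − 1 and Theorem 3.1):

```latex
% From D_x|x-s| \circledast D_x|x-s| = 1 we recover the
% \circledast-idempotency of the Heaviside function:
\[
\bigl(2H(x-s) - 1\bigr) \circledast \bigl(2H(x-s) - 1\bigr)
  \;=\; 4\, H(x-s) \circledast H(x-s) - 4\, H(x-s) + 1 \;=\; 1 ,
\]
% using 1 \circledast H = H \circledast 1 = H (cf. Theorem 3.1);
% hence H(x-s) \circledast H(x-s) = H(x-s).
```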
Before we proceed to the next Theorem, let us recall the following useful formula, valid for all n, m ∈ N_0 (eq. (26), section 2.6, [14]):
(3.9) x^n δ^(m) = 0 if m < n , and x^n δ^(m) = (−1)^n (m!/(m − n)!) δ^(m−n) if m ≥ n ,
where we used the convention 0! = 1. The case m ≥ n can be easily inverted, yielding
(3.10) δ^(j) = (−1)^n (j!/(j + n)!) x^n δ^(j+n) , ∀j, n ∈ N_0 .

Theorem 3.6. For every s, t ∈ R and every i, j ∈ N_0 we have δ_s^(i) ⊛ δ_t^(j) = 0.

Proof. If t ≠ s, the supports of δ_s^(i) and δ_t^(j) are disjoint and the product vanishes by Lemma 3.4. Moreover, if t = s, Lemma 3.4 yields supp (δ_s^(i) ⊛ δ_s^(j)) ⊆ {s}, and so
(3.12) δ_s^(i) ⊛ δ_s^(j) = Σ_{k=0}^n a_k δ_s^(k)
for some n ∈ N_0 and a_k ∈ R, k = 0, .., n. To simplify the presentation, assume that s = 0 and i ≤ j. Then: x^{i+1} (δ^(i) ⊛ δ^(j)) = (x^{i+1} δ^(i)) ⊛ δ^(j) = 0, where we used (3.1), the associativity of ⊛ and (3.9). Substituting (3.12) in the previous expression: Σ_{k=i+1}^n a_k x^{i+1} δ^(k) = 0 ⟹ a_k = 0, ∀k ≥ i + 1, and so
(3.14) δ^(i) ⊛ δ^(j) = Σ_{k=0}^{i} a_k δ^(k) .
Let us return to (3.10) and set n = i + 1: δ^(j) = (−1)^{i+1} (j!/(j + i + 1)!) x^{i+1} δ^(j+i+1). It follows that:
(3.16) δ^(i) ⊛ δ^(j) = (−1)^{i+1} (j!/(j + i + 1)!) x^{i+1} (δ^(i) ⊛ δ^(j+i+1)) ,
where we used (3.1) and the associativity of ⊛. Since (3.14) is valid for all j, we have: δ^(i) ⊛ δ^(j+i+1) = Σ_{k=0}^{i} a′_k δ^(k) for some a′_k ∈ R. Substituting in (3.16) and using (3.9), we finally get δ^(i) ⊛ δ^(j) = 0, which concludes the proof.
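As a quick consistency check of (3.9) and (3.10) at the lowest non-trivial order (a standard computation against a test function):

```latex
% (3.9) with n = m = 1 gives x\delta' = -\delta; verification:
\[
\langle x\delta', t\rangle
  \;=\; \langle \delta', x\,t\rangle
  \;=\; -\,(x\,t)'(0)
  \;=\; -\,t(0)
  \;=\; \langle -\delta, t\rangle ,
\qquad t \in \mathcal{D} .
\]
% Inverting, (3.10) with j = 0, n = 1 gives \delta = -\,x\delta'.
```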
Theorem 3.7. For all n ∈ N_0 and s, t ∈ R we have:
(1) if s ≠ t: H(x − t) ⊛ δ_s^(n) = δ_s^(n) ⊛ H(x − t) = H(s − t) δ_s^(n) ;
(2) if s = t: H(x − t) ⊛ δ_t^(n) = c_t δ_t^(n) and δ_t^(n) ⊛ H(x − t) = (1 − c_t) δ_t^(n) ,
where c_t is some function c_t : R −→ {0, 1}.
Proof. (1) For s < t, supp H(x − t) ∩ supp δ_s^(n) = ∅ and thus the product is zero (cf. Lemma 3.4). For s > t, we have H(x − t) = 1 − H(t − x) with supp H(t − x) ∩ supp δ_s^(n) = ∅, and thus H(x − t) ⊛ δ_s^(n) = δ_s^(n). The products in the reversed order are computed in the same way.
(2) Let us begin by proving that the formulas are true for n = 0, assuming for simplicity that t = 0. By Lemma 3.4, H ⊛ δ = Σ_{k=0}^m c_k δ^(k), for some m ∈ N_0 and c_k ∈ R. As before, multiplication by x (together with (3.1), the associativity of ⊛ and x ⊛ δ = xδ = 0) kills all the terms with k ≥ 1, and thus H ⊛ δ = cδ for some c ∈ R. Moreover (cf. Theorem 3.5) H ⊛ H = H, and so c² δ = H ⊛ (H ⊛ δ) = (H ⊛ H) ⊛ δ = cδ; hence c² = c and c ∈ {0, 1}. Generalizing now for t ∈ R: H(x − t) ⊛ δ_t = c_t δ_t, where c_t is an arbitrary function c_t : R −→ {0, 1}. Each function c_t defines a particular ⊛-product. Let us denote it by ⊛_{c_t}. Thus H(x − t) ⊛_{c_t} δ_t = c_t δ_t. Let us proceed. Differentiating H(x − t) ⊛_{c_t} H(x − t) = H(x − t) we get: δ_t ⊛_{c_t} H(x − t) + H(x − t) ⊛_{c_t} δ_t = δ_t, and so δ_t ⊛_{c_t} H(x − t) = (1 − c_t) δ_t, which completes the proof of (2) for n = 0.
Assume now that
(3.18) H(x − t) ⊛ δ_t^(n) = c_t δ_t^(n)
holds for some n ∈ N. Differentiating (3.18) we get δ_t ⊛ δ_t^(n) + H(x − t) ⊛ δ_t^(n+1) = c_t δ_t^(n+1), and since the first term is zero (cf. Theorem 3.6) we conclude that (3.18) is valid for n + 1, and thus for all n ∈ N_0. In the same way one proves that δ_t^(n) ⊛ H(x − t) = (1 − c_t) δ_t^(n), for all n ∈ N_0.

3.2. Main Theorem.
We can now easily prove the Main Theorem.
Proof. The inclusion A ⊆ G follows directly from (A1') and the fact that the distributional derivative D_x is an inner operator in G. Let then F, G ∈ A; we want to prove that F ⊛ G = F *_M G for some M ⊆ R. Let us write F, G in the form (2.5). From the distributive property of ⊛: F ⊛ G = f ⊛ g + f ⊛ Δ_G + Δ_F ⊛ g + Δ_F ⊛ Δ_G. Let us consider each term separately.
1) We first prove that
(3.20) f ⊛ g = f *_M g , ∀f, g ∈ C∞_p , ∀M ⊆ R .
Since (cf. Theorem 3.1): ξ H(x − a) = ξ ⊛ H(x − a) = H(x − a) ⊛ ξ, ∀ξ ∈ C∞, and also (cf. Theorem 3.5): H(x − a) ⊛ H(x − b) = H(x − a) H(x − b), we have for f = f_i H(x − a) and g = g_j H(x − b), f_i, g_j ∈ C∞, using the associativity of ⊛:
(3.21) f ⊛ g = f_i g_j (H(x − a) ⊛ H(x − b)) = f_i H(x − a) g_j H(x − b) = f *_M g ,
where the last identity follows from (2.14) and holds for all M ⊆ R.
Moreover, for f or g smooth, (3.21) also holds (cf. Theorem 3.1). Since every f, g ∈ C ∞ p is the sum of a smooth function with a finite number of functions of the form ξH(x − a), ξ ∈ C ∞ , a ∈ R, using the distributive property of ⊛ and * M we get (3.20).
2) We now prove that for some M ⊆ R
(3.22) f ⊛ Δ_G = f *_M Δ_G , ∀f ∈ C∞_p , ∀G ∈ A .
It follows from Theorem 3.7 and Remark 2.8 that the ⊛-products of terms of the form ξ H(x − t), ξ ∈ C∞, by the distributions δ_t^(n) coincide with the corresponding *_M-products, where the set M is determined by the function c_t. Moreover, Δ_G is of the form (2.7) and every f ∈ C∞_p is the sum of some ψ ∈ C∞ with a finite linear combination of terms of the form ξ H(x − t), ξ ∈ C∞. Hence, using the distributive property of ⊛ and *_M, we get (3.22). An analogous proof shows that
(3.26) Δ_F ⊛ g = Δ_F *_M g , ∀F ∈ A , ∀g ∈ C∞_p .
Notice that once the set M is fixed by the product f ⊛ Δ_G, the product in the reversed order is also fixed, i.e. Δ_F ⊛ g = Δ_F *_M g (with the same M); this follows from Theorem 3.7(2), which determines that the function c_t (and thus the set M) is the same in both products.
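As an illustration of this last point (a sketch combining Theorem 3.7(2) with the values H(x − t) * δ_t = 0 and δ_t * H(x − t) = δ_t that follow from Definition 2.4):

```latex
% The function c_t selects, point by point, which ordering of the
% product * the operation \circledast reproduces:
\[
H(x-t) \circledast \delta_t = c_t\, \delta_t =
\begin{cases}
  0 = H(x-t) * \delta_t , & \text{if } c_t = 0 ,\\[3pt]
  \delta_t = \delta_t * H(x-t) , & \text{if } c_t = 1 ,
\end{cases}
\]
% and the set M collects these pointwise choices, so that the
% restriction of \circledast to \mathcal{A} coincides with *_M.
```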