Asymptotically uniform functions: a single hypothesis which solves two old problems

The asymptotic study of a time-dependent function $f$ as the solution of a differential equation often leads to the question of whether its derivative $\dot f$ vanishes at infinity. We show that a necessary and sufficient condition for this is that $\dot f$ is what may be called "asymptotically uniform". We generalize the result to higher-order derivatives. We further show that the same property for $f$ itself is also necessary and sufficient for its one-sided improper integrals to exist. On the way, the article provides a broad study of such asymptotically uniform functions.


Introduction
When does the derivative of a function f : R+ → R tend to zero at infinity? This is certainly the case when both f and its derivative ḟ := df/dt converge (see Section 2). However, it has long been known that this set of hypotheses is too strong.
A similar situation occurs with (Riemann) integrals: the fact that a function is integrable does not imply that it tends to zero at infinity.
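This phenomenon is easy to observe numerically. The following sketch (our own illustration; the function `spikes` and all names are ours, not from the text) builds a continuous function made of triangular spikes of height 1 centred at the integers n ≥ 2, each with base width 2/n², so that the n-th spike has area 1/n²: the improper integral converges, yet the function returns to the value 1 at every such integer.

```python
def spikes(t):
    """Triangular spike of height 1 around each integer n >= 2, base width 2/n^2."""
    n = int(round(t))
    if n < 2:
        return 0.0
    return max(0.0, 1.0 - n**2 * abs(t - n))

def integral_up_to(nmax):
    """Integrate the spikes over [0, nmax] by trapezoids on each spike's support."""
    total = 0.0
    for n in range(2, nmax + 1):
        a, b, m = n - 1.0 / n**2, n + 1.0 / n**2, 2000
        ts = [a + (b - a) * i / m for i in range(m + 1)]
        ys = [spikes(t) for t in ts]
        total += sum((ys[i] + ys[i + 1]) * (ts[i + 1] - ts[i]) / 2 for i in range(m))
    return total

print(integral_up_to(200))  # stays below pi^2/6 - 1 ~ 0.6449, so the integral converges
print(spikes(100))          # the function itself keeps returning to 1
```

Each spike contributes exactly 1/n², so the total area is bounded by Σ 1/n², while lim sup of the function at infinity is 1.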
In the present work we shall address the following two questions.
Differential case: Let f : R+ → R be differentiable with lim_{t→∞} f(t) = α ∈ R. Is there a necessary and sufficient condition for ḟ to converge to 0 at infinity?
Integral case: Let g : R+ → R be integrable over every interval [0, t], with lim_{t→∞} ∫_0^t g(s) ds = α ∈ R. Does there exist a necessary and sufficient condition for g to converge to 0 at infinity?

Since the domain of definition of all our functions will be R+, we shall write ∞ instead of +∞.

General discussion
Historically, the differential case seems to have received more attention. A simple result states that, if f and ḟ converge, then the limiting value of the latter has to be 0. This is an immediate consequence of the following lemma.
Lemma 1 Let f : R+ → R be differentiable with lim_{t→∞} f(t) = α ∈ R. Then, for any sequence of intervals ([a_n, b_n])_{n≥1} with a_n < b_n, a_n ↑ ∞ and inf_{n≥1}(b_n − a_n) > 0, there exists ξ_n ∈ (a_n, b_n) with lim_{n→∞} ḟ(ξ_n) = 0.
Proof. For every n ≥ 1, the mean value theorem provides ξ_n ∈ (a_n, b_n) with ḟ(ξ_n) = (f(b_n) − f(a_n))/(b_n − a_n). Since f(a_n) and f(b_n) both converge to α and b_n − a_n is bounded below, ḟ(ξ_n) → 0 as n → ∞.

Numerous convergent functions have been given whose first derivatives do not converge [p. 40, and p. 45 for counterexamples in the integral case].
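Lemma 1 can be watched at work numerically on a standard example (our own choice, not taken from the text): f(t) = sin(t²)/t converges to 0, and its derivative ḟ(t) = 2cos(t²) − sin(t²)/t² does not converge, yet every interval of unit length contains points where ḟ is close to 0.

```python
import math

def f(t):
    # converges to 0 at infinity
    return math.sin(t * t) / t

def fdot(t):
    # derivative of sin(t^2)/t; oscillates with amplitude ~2 and does not converge
    return 2.0 * math.cos(t * t) - math.sin(t * t) / (t * t)

def scan(n, m=100000):
    """Sample |fdot| on [n, n+1]; return the smallest and largest sampled values."""
    vals = [abs(fdot(n + i / m)) for i in range(m + 1)]
    return min(vals), max(vals)

for n in (10, 20, 30):
    small, big = scan(n)
    print(n, small, big)  # min near 0 (as Lemma 1 predicts), max near 2 (no convergence)
```

Taking a_n = n and b_n = n + 1 in Lemma 1, the near-zeros found by `scan` play the role of the points ξ_n.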
To our knowledge, apart from Thieme, who provided a necessary and sufficient condition for the differential case [Thi, Corollary A.17], only sufficient conditions have been given so far. The uniform continuity of ḟ, respectively of g for the integral case, is one of the best known and most frequently used among those sufficient conditions; in control theory [Mul], the result for g is known as Barbălat's lemma [Bar, Far-Weg].
If one assumes continuity of g, the integral case is a consequence of the differential case. However, in view of the existence of differentiable functions whose derivatives are not integrable, i.e., for which the fundamental theorem of calculus does not hold, one has to treat the two questions separately: recall for instance the existence of differentiable functions whose derivatives are bounded but not integrable [p. 88].
Already in the nineteenth century, a considerable number of mathematicians looked for sufficient conditions for the differential case. As mentioned in [Had2], the first conditions for the vanishing of the derivative seem to appear in studies by A. Kneser (1896) on the asymptotic behaviour of solutions y of ordinary differential equations ÿ = F(t, y), when F possesses the same sign as y [Kne, p. 182]. In his considerations on the asymptotics of a function of the solution of a system of ordinary differential equations, Hadamard [Had1, p. 334] appears to be the first to have stated the result independently of such equations; his supplementary assumption is the boundedness of the second derivative.
Obviously unaware of [Had1], Littlewood [Lit] gave an easier proof of the result for an arbitrary differentiable function, which he deems "fundamental" for his study of power series. In 1913, under somewhat stronger (but, as they stress, not necessary) conditions on f, Hardy and Littlewood [Har-Lit] initiated the study of bounds on the first derivative at infinity involving f and f̈. We give one of their results.
Theorem 2 Let g and h be positive non-decreasing functions and f̈ be continuous. If f = o(g), or if g = 1 and f → α, and if f̈ = O(h), then ḟ = o(√(gh)).
In the same article, the authors extended the result to the case of several derivatives; this led them to the following version of Hadamard's lemma (p. 423), which merely requires the boundedness of the last derivative.
Theorem 3 If f has derivatives of all orders and if f → s at infinity, then, if any derivative is bounded, all preceding derivatives tend to zero; or, to state the matter in another way, if any derivative does not tend to zero, no subsequent derivative is bounded.
A different proof of this result has been given by Coppel [Cop,p. 141] in 1965.
In 1914, Landau [Lan] originated a quantitative study of the problem by obtaining bounds on |f| and |f̈| which guarantee that |ḟ(t)| lies below a certain value for sufficiently large t. This result led to an enormous body of research, exhibited in the book [Mit-Pec-Fin], which mentions it on its very first page.
Most results involve assumptions about f̈ for the differential case, on ġ for the integral one. We rather look for conditions on ḟ only that solve the differential case, resp. on g only for the integral one. As mentioned already, a class of functions introduced by Thieme provides the solution to the differential case. We show that it solves the integral case as well. Furthermore, we perform a detailed study of this class, and also give a more suggestive characterization of the latter.
The next section introduces the class of asymptotically uniform functions, the subject of this manuscript, and shows that it solves the two problems mentioned in the Introduction. Section 4 lists and discusses properties of these functions. Section 5 studies some sufficient and/or necessary conditions for a function itself to vanish at infinity. In Section 6 we show that the class of asymptotically uniform functions may also be characterized as those functions which can be written as the sum of a uniformly continuous function and one which vanishes at infinity. The last section extends the differential case to functions with more derivatives.

Asymptotically uniform functions
To our knowledge, for the integral case, the usual sufficient conditions presuppose the continuity of the integrand. However, there exist asymptotically discontinuous functions, i.e., functions discontinuous on every unbounded interval, which simultaneously have a convergent integral and vanish at infinity; for instance, on R+,

g(t) := 0 for t ∈ N, and g(t) := e^{−t} elsewhere.
On every interval [0, T ], T > 0, g is Riemann integrable, as it has only finitely many discontinuities.As T → ∞, the integral converges to 1 and g to zero.For a necessary and sufficient condition, we therefore include asymptotically discontinuous functions in our considerations and introduce the following class, which is equivalent to that proposed by Thieme.Additional examples are provided in Section 4.
Definition 1 A function f : R+ → R will be called asymptotically uniform, (a.u.) for short, if for any ε > 0 there exist T ≥ 0 and δ > 0 such that for all s, t ≥ T with |t − s| ≤ δ we have |f(t) − f(s)| < ε.

Lemma 2 If f : R+ → R converges in R at infinity, then f is (a.u.).

Proof. If lim_{t→∞} f(t) = β ∈ R, then for ε > 0 there exists T ≥ 0 such that |f(t) − β| < ε/2 for all t ≥ T; hence |f(t) − f(s)| ≤ |f(t) − β| + |β − f(s)| < ε for all s, t ≥ T, and any δ > 0 works.

Theorem 4 Let f : R+ → R be differentiable with lim_{t→∞} f(t) = α ∈ R. Then lim_{t→∞} ḟ(t) = 0 if and only if ḟ is (a.u.).

Proof. According to Lemma 2, the convergence of ḟ in R entails its uniform asymptoticity. Conversely, if ḟ is (a.u.), for every ε > 0 there exist T ≥ 0 and δ > 0 such that |ḟ(t) − ḟ(s)| < ε/2 for all s, t ≥ T with |t − s| ≤ δ. Lemma 1 applied to the intervals [T + nδ, T + (n + 1)δ], n ∈ N, provides points ξ_n in these intervals with ḟ(ξ_n) → 0. Every t ≥ T lies in one of these intervals, hence within δ of the corresponding ξ_n, so that |ḟ(t)| ≤ |ḟ(t) − ḟ(ξ_n)| + |ḟ(ξ_n)| < ε for t large enough, and thus lim_{t→∞} ḟ(t) = 0.
We can even say more when the derivative is continuous.
Theorem 5 Let f : R+ → R be differentiable with a continuous derivative and lim_{t→∞} f(t) = α ∈ R. Then lim_{t→∞} ḟ(t) = 0 if and only if ḟ is uniformly continuous.
Proof. This follows from Theorem 4 and Property 5 in Section 4.

Integral case
Theorem 6 Let g : R+ → R be Riemann integrable over every interval [0, t] and let lim_{t→∞} ∫_0^t g(s) ds = α ∈ R. Then lim_{t→∞} g(t) = 0 if and only if g is (a.u.).
Proof. Lemma 2 again entails the uniform asymptoticity of g if the latter converges. For the converse, let ε > 0 be given. Then there exist T ≥ 0 and δ > 0 such that |g(t) − g(s)| < ε/2 for all s, t ≥ T with |t − s| ≤ δ, and, by the convergence of the integral, T′ ≥ 0 such that |∫_t^{t+δ} g(s) ds| < εδ/2 for all t ≥ T′. According to the triangle inequality, for every t > max{T, T′},

δ|g(t)| = |∫_t^{t+δ} g(t) ds| ≤ ∫_t^{t+δ} |g(t) − g(s)| ds + |∫_t^{t+δ} g(s) ds| < δε/2 + δε/2 = δε.

Consequently, |g(t)| < ε for t > max{T, T′} and thus lim_{t→∞} g(t) = 0.
As in the differential case, we have the following theorem when g is continuous.
Theorem 7 Let g : R+ → R be continuous and Riemann integrable over every interval [0, t]. If lim_{t→∞} ∫_0^t g(s) ds = α ∈ R, then lim_{t→∞} g(t) = 0 if and only if g is uniformly continuous.
Note that the sufficient condition is the already mentioned Barbălat's lemma [Far-Weg].
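The necessity of uniform continuity in Theorem 7 can be observed on the classical example g(t) = sin(t²), which is continuous but not uniformly continuous: its improper integral converges to the Fresnel value √(π/8) ≈ 0.6267, while g keeps oscillating between −1 and 1. A numerical sketch (our own; the names are ours):

```python
import math

def g(t):
    # continuous, bounded, but not uniformly continuous
    return math.sin(t * t)

def integral(T, steps_per_unit=10000):
    """Composite midpoint rule for the integral of g over [0, T]."""
    n = int(T * steps_per_unit)
    h = T / n
    return sum(g((k + 0.5) * h) for k in range(n)) * h

fresnel = math.sqrt(math.pi / 8.0)   # value of the improper Fresnel integral
print(integral(40.0), fresnel)       # partial integrals settle near 0.6267
print(max(abs(g(40.0 + i / 1000)) for i in range(2001)))  # g still reaches ~1 on [40, 42]
```

So the integral converges although the integrand does not vanish; by Theorem 6 this also shows that sin(t²) is not (a.u.).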

5. A continuous function f : R+ → R is (a.u.) if and only if it is uniformly continuous. Indeed, for ε > 0, let T and δ be numbers provided by the definition of asymptotic uniformity. Since f is uniformly continuous over [0, T + δ], there exists δ* such that for all s, t ≤ T + δ with |t − s| ≤ δ* we have |f(t) − f(s)| < ε. The number min{δ, δ*} yields the uniform continuity over R+. The converse is given by item 4. This result suggests that asymptotic uniformity is mainly interesting for asymptotically discontinuous functions. For instance, the everywhere discontinuous function e^{−t} I_{Q∩R+}(t), which obviously converges to 0 at infinity, is (a.u.) according to Lemma 2.

Examples:
(a) The function f(t) = t is unbounded but uniformly continuous and thus (a.u.).
(b) The function f(t) = sin(t), which obviously does not converge at infinity, is uniformly continuous and thus (a.u.).
(c) The bounded function f(t) = sin(t²) is continuous but not uniformly continuous. According to property 5, it is not (a.u.).
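The contrast between (b) and (c) can be quantified numerically (a sketch of our own; the names are ours): for a fixed small gap δ, the oscillation sup_t |f(t + δ) − f(t)| stays of order δ for sin(t), but grows to nearly 2 for sin(t²) as t increases, which is precisely the failure of uniform continuity.

```python
import math

def oscillation(f, delta, tmax, samples=100000):
    """Sampled sup over t in [0, tmax] of |f(t + delta) - f(t)|."""
    step = tmax / samples
    return max(abs(f(i * step + delta) - f(i * step)) for i in range(samples + 1))

delta = 0.01
print(oscillation(math.sin, delta, 1000.0))                   # at most delta = 0.01
print(oscillation(lambda t: math.sin(t * t), delta, 1000.0))  # close to 2
```

For sin(t²), the phase gap (t + δ)² − t² ≈ 2tδ grows without bound, so points at distance δ end up with values as far apart as the range of sin allows.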
9. An (a.u.) function is not necessarily measurable. Consider any bounded non-measurable one, for example the indicatrix g of a non-measurable subset of R+. Then f(t) = e^{−t} g(t) converges to 0 at infinity and thus is (a.u.) according to Lemma 2, but is clearly not measurable.
10. We now give a differentiable function which converges at infinity, whose derivative is discontinuous on an unbounded sequence of points, but is nevertheless (a.u.), as it converges to 0 at infinity (Lemma 2). Consider the function periodically extended over R+. f is discontinuous at the integers, and bounded. Clearly, it admits a primitive, say g, which satisfies 0 ≤ ġ(t) = f(t) ≤ 1. Choosing g(0) = 0, we get 0 ≤ g(t) ≤ t. Then the primitive of e^{−t} ġ(t) is

h(t) = e^{−t} g(t) − ∫_0^t (−e^{−s}) g(s) ds = e^{−t} g(t) + ∫_0^t e^{−s} g(s) ds,

and the last integrand is continuous. Since h is increasing and bounded, it converges as t → ∞. h is our function: its derivative converges to 0 and is discontinuous at t = k, k ∈ N.
11. Contrary to its derivative, the unbounded function f(t) = t² is not uniformly continuous. Our investigations led us to the following result, which is certainly known, but which we did not find in the literature: if f : R+ → R is bounded and differentiable with a uniformly continuous derivative ḟ, then the latter is bounded and f is uniformly continuous.

As ḟ is uniformly continuous, choosing ε = 1, there exists δ > 0 such that for all 0 < h < δ and t ≥ 0, |ḟ(t + h) − ḟ(t)| < 1. Furthermore, for any fixed h₀ with 0 < h₀ < δ and t ≥ 0, according to the mean value theorem, there exists ξ_t ∈ (t, t + h₀) with f(t + h₀) − f(t) = h₀ ḟ(ξ_t). Hence |ḟ(t)| ≤ |ḟ(t) − ḟ(ξ_t)| + |ḟ(ξ_t)| < 1 + 2 sup_{s≥0} |f(s)|/h₀, so ḟ is bounded; the mean value theorem then shows that f is Lipschitz continuous, hence uniformly continuous.

12. It follows from the definition that if f : R+ → R is (a.u.) and if g : R → R is uniformly continuous, then the composite function g ∘ f is (a.u.) as well.
13. The (a.u.) property is not preserved by almost-everywhere equality. For instance, f(t) = e^{−t} is (a.u.) but g(t) = e^{−t} + I_N(t) clearly is not, despite the fact that the two functions are equal almost everywhere.

About converging functions
Definition 2 We shall say that f : R+ → R satisfies condition (C) if

(C) for a.e. (i.e., for almost every) t > 0, lim_{n→∞} f(nt) = 0,

and condition (C*) if the convergence holds for every t > 0.

Condition (C) is for instance satisfied by Lebesgue integrable functions, as shown in [Les]. The following result (Lemma 4) holds for continuous functions.

(b) ⇒ (a): let E be the set of all t ∈ R+ for which f(nt) does not converge to 0 as n ↑ ∞. Suppose that E ≠ ∅ and let t ∈ E. By definition of the convergence to 0, there exist ε₀ > 0 and (n_k)_{k≥1} ↑ ∞ such that |f(n_k t)| ≥ ε₀ for every k ≥ 1. Set ε′ := ε₀/3 and, according to the uniform continuity of f, let δ(ε′) > 0 satisfy |f(x) − f(y)| < ε′ whenever |x − y| < δ(ε′). Since (C) holds, E has measure zero, so it is always possible to choose δ′ ∈ E^c such that 0 < δ′ < δ(ε′). It follows that the sequence (f(nδ′))_{n≥1} converges to 0 as n ↑ ∞. Therefore, there exists N ≥ 1 such that |f(nδ′)| < ε′ for all n ≥ N. The points nδ′, n ≥ N, partition [Nδ′, ∞) into intervals of length δ′ < δ(ε′); hence, for every sufficiently large k, there is an n ≥ N with |n_k t − nδ′| < δ(ε′), and |f(n_k t)| ≤ |f(n_k t) − f(nδ′)| + |f(nδ′)| < 2ε′ < ε₀. This contradiction entails E = ∅ and thus (C*) holds. f being continuous, Lemma 3 (Croft's lemma) completes the proof.
One obtains another equivalence when the continuity assumption is lifted.
Theorem 8 Let f : R+ → R be such that (C) holds. Then the following two statements are equivalent: (a) lim_{t→∞} f(t) = 0; (b) f is a (u, r)-function in the sense of Definition 3 below.

Proof. (a) ⇒ (b): take u = 0 and r = f. (b) ⇒ (a): since r converges to 0 at infinity, (C) is also valid for u. Since the latter is uniformly continuous, Lemma 4 provides its convergence to zero at infinity, and thus that of f.

Definition 3 We shall say that f : R+ → R is a (u, r)-function if f = u + r, with u uniformly continuous and r converging to 0 at infinity.
Remark This decomposition is obviously not unique: for instance, one may add to u any uniformly continuous function vanishing at infinity and subtract it from r.

Back to asymptotically uniform functions
Let N* denote the positive integers. In our context, a function f : R+ → R is piecewise differentiable if there exists a strictly increasing sequence (t_n)_{n∈N*} going to ∞ as n → ∞ such that f is differentiable over the open intervals (t_n, t_{n+1}), n ∈ N*; f is required to be everywhere continuous, but differentiable only over these open intervals. The following consequence of the mean value theorem is likely known. We nevertheless provide a proof, which we did not find in the literature.
Lemma 5 Let f : R+ → R be piecewise differentiable with associated sequence (t_n)_{n∈N*}, and suppose that there exists M > 0 such that |ḟ(t)| ≤ M wherever ḟ is defined. Then f is Lipschitz continuous with constant M.

Proof. Let 0 ≤ s < t. If the interval [s, t] does not contain elements of (t_n)_{n∈N*}, the mean value theorem entails |f(t) − f(s)| ≤ M(t − s). Otherwise there exist n_p ≤ n_q in N* such that 0 ≤ s ≤ t_{n_p} < t_{n_p+1} < ... < t_{n_q} ≤ t, where t_{n_p} is the smallest element of the sequence (t_n)_{n∈N*} to the right of s and similarly t_{n_q} is the largest to the left of t (n_p = n_q means that [s, t] contains only one element t_{n_p} with 0 ≤ s ≤ t_{n_p} ≤ t). The triangle inequality implies

|f(t) − f(s)| ≤ |f(s) − f(t_{n_p})| + Σ_{k=n_p}^{n_q−1} |f(t_{k+1}) − f(t_k)| + |f(t) − f(t_{n_q})|.

Since f is continuous over each interval [t_n, t_{n+1}] and differentiable over (t_n, t_{n+1}), the mean value theorem applies to each term and

|f(t) − f(s)| ≤ M(t_{n_p} − s) + Σ_{k=n_p}^{n_q−1} M(t_{k+1} − t_k) + M(t − t_{n_q}) = M(t − s).

Hence f is Lipschitz continuous with constant M.
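Lemma 5 can be sanity-checked numerically (our own sketch; all names are ours) on the distance-to-nearest-integer function, which is continuous and piecewise differentiable with slope ±1 away from the kinks at the integers and half-integers, hence Lipschitz with constant M = 1:

```python
import random

def f(t):
    # distance from t to the nearest integer; |f'| = 1 away from the kinks
    return abs(t - round(t))

random.seed(0)
M = 1.0
for _ in range(10000):
    s = random.uniform(0.0, 100.0)
    t = random.uniform(0.0, 100.0)
    # Lipschitz bound of Lemma 5 (small float tolerance added)
    assert abs(f(t) - f(s)) <= M * abs(t - s) + 1e-12
print("Lipschitz bound with constant M = 1 holds on all sampled pairs")
```

The bound holds across the kinks as well, which is exactly the point of the lemma.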
Theorem 9 For f : R+ → R, the following two statements are equivalent: (a) f is (a.u.); (b) for every ε > 0 there exists T ≥ 0 such that the strip V_T^ε(f) := {(t, y) : t ≥ T, |y − f(t)| < ε} contains the graph of a function defined over [T, ∞) and Lipschitz continuous there.
Proof. (a) ⇒ (b): for ε > 0, let T ≥ 0 and δ > 0 be numbers guaranteed by the definition of the uniform asymptoticity of f for ε/2. Consider the equispaced nodes (points at which a function is evaluated) t_n := T + nδ, n ∈ N, and let g be the piecewise affine function over [T, ∞) with g(t_n) = f(t_n) for every n. Let a_n denote the slope of g over [t_n, t_{n+1}]; then |a_n| = |f(t_{n+1}) − f(t_n)|/δ < ε/(2δ). For t ∈ [t_n, t_{n+1}] we have |g(t) − g(t_n)| ≤ |a_n| δ < ε/2 (a point moving on a line always drifts away from its initial vertical position, so the deviation is largest at the endpoint). Thus |f(t) − g(t)| ≤ |f(t) − f(t_n)| + |g(t_n) − g(t)| < ε/2 + ε/2 = ε. Accordingly, for all n ∈ N, the graph of g belongs to V_T^ε(f). Furthermore, the slope of g over [t_n, t_{n+1}] satisfies |a_n| < ε/(2δ).
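The construction in (a) ⇒ (b) is easy to reproduce numerically for a uniformly continuous (hence (a.u.)) function such as f(t) = sin(t): since |sin t − sin s| ≤ |t − s|, node spacing δ guarantees that the piecewise affine interpolant stays within 2δ of f. A sketch (our own; the names are ours):

```python
import math

def interpolant(f, T, delta, nmax):
    """Piecewise affine function through the nodes t_n = T + n*delta, n = 0..nmax."""
    nodes = [T + n * delta for n in range(nmax + 1)]
    vals = [f(t) for t in nodes]
    def g(t):
        n = min(int((t - T) / delta), nmax - 1)
        a = (vals[n + 1] - vals[n]) / delta   # slope a_n over [t_n, t_{n+1}]
        return vals[n] + a * (t - nodes[n])
    return g

eps, delta = 0.1, 0.05                       # 2*delta = eps for f = sin
g = interpolant(math.sin, 0.0, delta, 1000)  # covers [0, 50]
gap = max(abs(math.sin(t) - g(t)) for t in (50.0 * i / 100000 for i in range(100001)))
print(gap)  # well below eps: the graph of g lies in the eps-strip around f
```

The slopes are bounded by 1 here, so g is also Lipschitz continuous, exactly as statement (b) requires.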
Corollary 1 For a function f : R+ → R, the following three statements are equivalent: (a) f is (a.u.); (b) there exists a uniformly continuous function u : R+ → R such that lim_{t→∞}(f(t) − u(t)) = 0; (c) f is a (u, r)-function.
Proof. (a) ⇒ (b): we shall construct u as a piecewise affine function, as in the first part of the proof of Theorem 9, but with a graph contained in strips of decreasing thickness. For ε > 0 and every k ≥ 1 we define ε_k := ε/k and denote by u_k the function corresponding to ε_k/2 and t ≥ T_k in the first part of the proof of Theorem 9. As we want T_k to be a node, i.e., u_{k−1}(T_k) = f(T_k) = u_k(T_k), we choose the corresponding δ_k as δ_{k−1} divided by some integer power of 2. Furthermore, we choose T_k sufficiently large to warrant that T_k > T_{k−1} + 1. Now define the function u to be u_k over [T_k, T_{k+1}] and u_1(T_1) over [0, T_1]. Clearly u is continuous and |f(t) − u(t)| < ε_k for t ≥ T_k; since ε_k ↓ 0 as k → ∞, we deduce that lim_{t→∞}(f(t) − u(t)) = 0. According to property 6, u is (a.u.). Since u is continuous, property 5 entails its uniform continuity and we get (b). (b) ⇒ (a): by property 4, u is (a.u.), and by property 6, f is (a.u.). Since (b) ⇔ (c) is obvious, we have proved the equivalence of the three statements.
Extension to higher-order derivatives

Theorem 10 Let f : R+ → R be n-times differentiable with lim_{t→∞} f(t) = α ∈ R. Then the following two assertions are equivalent: (a) lim_{t→∞} f^(k)(t) = 0 for k = 1, ..., n; (b) f^(n) is (a.u.).

Note that the case n = 1 is Theorem 4.
We shall say that a function F defined over R + ultimately satisfies a property P if there exists T ≥ 0 such that F has the property P for all t ≥ T .
To prove (a), it is sufficient to show that f^(n) is ultimately bounded. Indeed, if this is true, the already mentioned Theorem 6 in [Cop, p. 141] provides the convergence to zero of f^(k) for 1 ≤ k ≤ n − 1. Since the derivative of f^(n−1) is (a.u.) by hypothesis, the convergence of f^(n) to 0 is given by that of f^(n−1) and Theorem 4.
First we note that, under the hypothesis, none of the derivatives is ultimately 0. Indeed, if one of them is, let i, 1 ≤ i ≤ n, be the smallest index such that f^(i) is identically 0 over some interval [T, ∞). If i = 1, then f is constant over [T, ∞) and (a) is clearly true, since all subsequent derivatives are also 0 over (T, ∞). On the other hand, the case i ≥ 2 leads to a contradiction: since i is minimal, the first derivative is not ultimately 0, so that f would be a polynomial of degree ≥ 1 over [T, ∞); in that case f could not converge in R as t → ∞. We can thus assume that none of the derivatives is ultimately 0.
Let f : R+ → R, n ∈ N*, t ≥ 0, h > 0. According to Taylor's theorem over closed intervals [Fle, p. 168], if f^(n−1) is continuous over [t, t + h] and f^(n) defined over (t, t + h), then there exists t < ξ_{t,h} < t + h such that

f(t + h) = Σ_{k=0}^{n−1} f^(k)(t) h^k/k! + f^(n)(ξ_{t,h}) h^n/n!.

f being n times differentiable over R+, it satisfies these conditions. Extending the index range to n, we get

f(t + h) − f(t) = Σ_{k=1}^{n} f^(k)(t) h^k/k! + (f^(n)(ξ_{t,h}) − f^(n)(t)) h^n/n!.

Since f converges to α ∈ R when t → ∞ and f^(n) is (a.u.), for every M > 0 there exist T_M ≥ 0 and 0 < δ_M (we can always choose δ_M < 1) such that, for t ≥ T_M and 0 < h ≤ δ_M, both |f(t + h) − f(t)| and |f^(n)(ξ_{t,h}) − f^(n)(t)| h^n/n! are bounded by M/2. Hence, for every h with 0 < h ≤ δ_M, the function

R_h(t) := Σ_{k=1}^{n} f^(k)(t) h^k/k!

satisfies |R_h(t)| ≤ M for t ≥ T_M. Consequently R_h is ultimately bounded, uniformly in h for 0 < h ≤ δ_M < 1. Clearly, if two functions are bounded over [T_M, ∞), then the same property holds for any multiple or linear combination of them. In the spirit of Richardson extrapolation [Hil, p. 292], we shall use these two operations to successively eliminate all the terms of R_h except the last one, which will be a non-zero multiple of f^(n); f^(n) will thus be bounded over [T_M, ∞) as well. We introduce the functions f_k(t) := f^(k)(t)/k!, 1 ≤ k ≤ n,
and recall that none of them is ultimately 0. From the definition we get R_h = Σ_{k=1}^{n} f_k h^k. Then one considers the function

R_h^(2) := R_{2h} − 2R_h = Σ_{k=2}^{n} (2^k − 2) f_k h^k,

in which the term of order h has been eliminated; by the triangle inequality, |R_h^(2)| ≤ (1 + 2)M. Repeating the operation with R_h^(3) := R_{2h}^(2) − 2² R_h^(2) eliminates the term of order h²; again, by the triangle inequality, |R_h^(3)| ≤ (1 + 2²)(1 + 2)M. After n − 1 steps, we get a single term, which is a multiple of f_n, namely

R_h^(n) = Π_{k=1}^{n−1} (2^n − 2^k) f_n h^n.

Since |R_h^(n)| is ultimately bounded and h can be held fixed, f_n, and thus f^(n), is bounded over [T_M, ∞).

We can also adapt Hadamard's lemma (Theorem 1) to the case when the derivatives are uniformly continuous. This also generalizes Theorem 5.
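For n = 2, one step of the elimination performed above is already familiar: R_h(t) ≈ ḟ(t)h + f̈(t)h²/2, and the combination R_{2h} − 2R_h removes the first-order term, leaving f̈(t)h² up to higher-order corrections; equivalently, (f(t + 2h) − 2f(t + h) + f(t))/h² approximates f̈(t). A numerical sketch (our own; the names are ours):

```python
import math

def R(f, t, h):
    # the increment R_h(t) = f(t+h) - f(t) used in the elimination
    return f(t + h) - f(t)

def second_derivative(f, t, h):
    """(R_{2h} - 2 R_h) / h^2 = (f(t+2h) - 2 f(t+h) + f(t)) / h^2."""
    return (R(f, t, 2 * h) - 2 * R(f, t, h)) / (h * h)

t, h = 1.0, 1e-4
approx = second_derivative(math.sin, t, h)
print(approx, -math.sin(t))  # the exact second derivative of sin is -sin
```

This is the standard forward second difference, which is exactly what the first elimination step isolates when n = 2.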
We stress that the result holds for one more derivative than in Hadamard's lemma.

Lemma 3 ([p. 17]) If f : R+ → R is continuous and (C*) holds, then lim_{t→∞} f(t) = 0.

On the other hand, (C) is sufficient to provide convergence for uniformly continuous functions. We prove the following equivalence.

Lemma 4 Let f : R+ → R be continuous and let (C) hold. Then the following two statements are equivalent: (a) lim_{t→∞} f(t) = 0; (b) f is uniformly continuous.

Proof. (a) ⇒ (b): if a continuous function converges at infinity, it is uniformly continuous [And-Mor-Tet, p. 233].
The slope bound is independent of n; g being continuous over [T, ∞) and piecewise differentiable with bounded derivative, it is Lipschitz continuous according to Lemma 5. (b) ⇒ (a): let T correspond to ε/3 > 0 in point (b) and let g be a Lipschitz continuous function whose graph is contained in V_T^{ε/3}(f). As g is obviously uniformly continuous, there exists δ > 0 such that for all s, t ≥ T with |t − s| < δ we have |g(t) − g(s)| < ε/3. For such s and t, the triangle inequality yields |f(t) − f(s)| ≤ |f(t) − g(t)| + |g(t) − g(s)| + |g(s) − f(s)| < ε. Consequently, f is (a.u.).