Covariance and comparison inequalities under quadrant dependence

We study the difference between the distribution of the random vector $[X,Y]$ and that of $[X',Y']$, where $X'$ and $Y'$ are independent, $X'$ has the same law as $X$, and $Y'$ has the same law as $Y$. Particular interest is focused on positively quadrant dependent random variables $X$ and $Y$; in this case the bounds for the difference in question are expressed in terms of the covariance of $X$ and $Y$.

Let us recall that the random variables $X, Y$ are positively quadrant dependent (PQD) if $H_{X,Y}(t,s) := P(X \le t, Y \le s) - P(X \le t)P(Y \le s) \ge 0$ for all $t,s \in \mathbb{R}$, and negatively quadrant dependent (NQD) if $H_{X,Y}(t,s) \le 0$ for all $t,s \in \mathbb{R}$. It is well known that $X, Y$ are NQD iff $X, -Y$ are PQD. In view of this duality, we shall focus only on the PQD case later on.
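For a quick numerical illustration, positive quadrant dependence can be checked directly on a discrete distribution. The following minimal sketch uses a hypothetical $2\times 2$ pmf (chosen, not taken from the paper) and verifies $H_{X,Y}(t,s) \ge 0$ on the support grid.

```python
# Numerical check of positive quadrant dependence (PQD) for a toy
# discrete distribution; the pmf below is a hypothetical example.
import numpy as np

# Joint pmf of (X, Y) on {0,1} x {0,1}; mass is pushed to the diagonal,
# so we expect H_{X,Y}(t,s) >= 0 everywhere.
pmf = np.array([[0.4, 0.1],
                [0.1, 0.4]])

def H(pmf, i, j):
    """H_{X,Y}(i,j) = P(X<=i, Y<=j) - P(X<=i)P(Y<=j) on the integer grid."""
    joint = pmf[:i + 1, :j + 1].sum()
    px = pmf[:i + 1, :].sum()
    py = pmf[:, :j + 1].sum()
    return joint - px * py

values = [H(pmf, i, j) for i in range(2) for j in range(2)]
print(all(h >= 0 for h in values))   # True: X and Y are PQD
```

Since $H_{X,Y}$ is constant between consecutive integers, checking it on the integer grid suffices for integer-valued random variables.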
Denote by
$$\mathrm{Cov}_H(X,Y) := \int_{\mathbb{R}}\int_{\mathbb{R}} H_{X,Y}(t,s)\,dt\,ds$$
the so-called Hoeffding covariance; it is always well defined for PQD or NQD r.v.'s (even though it may be infinite), and if the usual product-moment covariance exists, then it is equal to the Hoeffding covariance. From this fact it follows that uncorrelated PQD (or NQD) r.v.'s are independent. Therefore, in the study of limit theorems, covariance is usually used to "control" the dependence of r.v.'s. It is also well known that monotone functions of positively (negatively) dependent r.v.'s inherit such dependence properties. In particular, the indicators $I(X \le t)$ and $I(Y \le s)$ are PQD whenever $X, Y$ are. It is easy to see that
$$\mathrm{Cov}\big(I(X \le t), I(Y \le s)\big) = H_{X,Y}(t,s);$$
thus, in the study of limit theorems for empirical processes based on positively or negatively dependent observations, it is important to control $H_{X,Y}(t,s)$. In this context, upper bounds for $H_{X,Y}(t,s)$ in terms of $\mathrm{Cov}_H(X,Y)$ are very useful. On the other hand, it is interesting to establish how far the random variables $X, Y$ are from independent ones, in the sense of the difference between the joint distribution and the product of its marginals. Bounds for the covariance of the indicator functions in terms of the covariance of the random variables are called covariance inequalities. The first inequalities of the form
$$\sup_{t,s} H_{X,Y}(t,s) \le C\,(\mathrm{Cov}(X,Y))^{1/3}, \qquad (1.1)$$
where $X, Y$ are associated and absolutely continuous and $C$ depends on the densities of $X, Y$, were obtained by Bagai and Prakasa Rao (cf. [1]) and Roussas ([13]). These authors studied properties of estimators of the survival function and kernel estimators of the density based on samples of associated r.v.'s. The inequalities (1.1) were intensively studied in [6,7] and [8]. In particular, in [8] it was proved that if $X, Y$ are absolutely continuous PQD r.v.'s with bounded densities, then
$$\sup_{t,s} H_{X,Y}(t,s) \le C\,(\mathrm{Cov}_H(X,Y))^{1/3}, \qquad (1.2)$$
where $C$ depends on the $L_\infty$ norms of the densities. The discrete case was considered in [6], where it was proved that if $X, Y$ are integer-valued PQD r.v.'s, then $\sup_{t,s} H_{X,Y}(t,s) \le \mathrm{Cov}_H(X,Y)$. We would also like to refer the reader to the monograph [2], where covariance inequalities for Lipschitz functions of associated r.v.'s are studied.
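The coincidence of the Hoeffding covariance and the product-moment covariance can be verified numerically. For integer-valued r.v.'s, $H_{X,Y}$ is constant on each cell $[i,i+1)\times[j,j+1)$, so the double integral reduces to a sum over the integer grid. The following sketch uses a hypothetical $3\times 3$ pmf.

```python
# A sketch verifying Hoeffding's formula Cov(X,Y) = sum_{i,j} H_{X,Y}(i,j)
# for integer-valued r.v.'s; the joint pmf is a hypothetical example.
import numpy as np

pmf = np.array([[0.3, 0.1, 0.0],
                [0.1, 0.1, 0.1],
                [0.0, 0.1, 0.2]])    # P(X=i, Y=j), i, j in {0, 1, 2}

xs = np.arange(3)
px = pmf.sum(axis=1)                 # marginal of X
py = pmf.sum(axis=0)                 # marginal of Y
mean_x = (xs * px).sum()
mean_y = (xs * py).sum()
# product-moment covariance E[XY] - E[X]E[Y]
exy = sum(i * j * pmf[i, j] for i in xs for j in xs)
cov = exy - mean_x * mean_y

# Hoeffding covariance: H is constant on [i,i+1) x [j,j+1) and vanishes
# outside the support, so the double integral is a sum over the grid.
Fxy = pmf.cumsum(axis=0).cumsum(axis=1)
Fx, Fy = px.cumsum(), py.cumsum()
cov_H = sum(Fxy[i, j] - Fx[i] * Fy[j] for i in xs for j in xs)

print(abs(cov - cov_H) < 1e-10)      # True: the two covariances agree
```

Incidentally, all grid values of $H$ for this pmf are nonnegative, so the example is also PQD.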
In recent years, lively interest in positively and negatively dependent r.v.'s has led to fruitful results beyond the range of covariance inequalities. Several limit theorems have been proved under this kind of dependence, in particular: laws of large numbers, the CLT and rates of convergence in the CLT, invariance principles, moment bounds, convergence of empirical processes, etc. (see [2,10,11], where further references are given). Now let $X, Y$ be any random variables and $X', Y'$ independent copies of $X$ and $Y$, i.e. $X'$ has the same distribution as $X$, $Y'$ has the same distribution as $Y$, and $X', Y'$ are independent. We may write
$$H_{X,Y}(t,s) = P\big([X,Y] \in Q(t,s)\big) - P\big([X',Y'] \in Q(t,s)\big),$$
where $Q(t,s) = (-\infty,t] \times (-\infty,s]$ is a quadrant. Let us introduce the following notation:
$$H_{X,Y}(D) := P\big([X,Y] \in D\big) - P\big([X',Y'] \in D\big) \quad \text{for } D \subset \mathbb{R}^2.$$
Because the notions of uncorrelatedness and independence coincide, the key assumption in limit theorems for positively or negatively dependent r.v.'s always involves their covariance structure. Indeed, covariance plays the role of a measure of dependence between r.v.'s. Taking this into account, it appears important to establish how, in fact, the covariance assesses the dependence. The answer may be given by finding bounds for $H_{X,Y}(D)$ in terms of $\mathrm{Cov}_H(X,Y)$ (we shall call these comparison inequalities from now on). This is the main goal of the paper.
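The quantity $H_{X,Y}(D)$ can be estimated by Monte Carlo: the independent coupling $[X',Y']$ is obtained by randomly permuting one coordinate of the sample. The bivariate normal law, the correlation value, the sample size, and the set $D$ in this sketch are all illustrative choices, not taken from the paper.

```python
# A minimal Monte Carlo sketch of
#   H_{X,Y}(D) = P([X,Y] in D) - P([X',Y'] in D);
# the second probability uses an independent coupling obtained by
# shuffling one coordinate of the sample.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
cov = [[1.0, 0.6], [0.6, 1.0]]            # positively correlated, hence PQD
x, y = rng.multivariate_normal([0, 0], cov, size=n).T
y_indep = rng.permutation(y)              # Y' independent of X, same law as Y

in_D = lambda a, b: (a <= 0) & (b <= 0)   # D = (-inf,0] x (-inf,0], a quadrant
h_D = in_D(x, y).mean() - in_D(x, y_indep).mean()
print(h_D > 0)   # True: for PQD variables, H_{X,Y}(D) >= 0 on quadrants
```

For this Gaussian example the exact value is $P(X \le 0, Y \le 0) - 1/4 \approx 0.10$, and the estimate lands close to it.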
The paper is organized as follows. In the second section we consider comparison inequalities for integer-valued r.v.'s, while the third is devoted to absolutely continuous random vectors. Section 4 presents the special case of Farlie-Gumbel-Morgenstern (FGM) r.v.'s.

Discrete case
With a view to stating the main result of this section, we introduce the notion of the $\delta$-hull of a set $D \subset \mathbb{R}^2$:
$$D^{\delta} := \{(t+a,\, s+b) : (t,s) \in D,\ |a| \le \delta,\ |b| \le \delta\}.$$
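For a finite set $D$, membership in the $\delta$-hull can be tested directly from the definition. The function and point values below are illustrative only.

```python
# A sketch of the delta-hull D^delta = {(t+a, s+b) : (t,s) in D,
# |a| <= delta, |b| <= delta} for a finite set D.
def in_delta_hull(p, D, delta):
    """Check whether the point p lies in the delta-hull of the finite set D."""
    t, s = p
    return any(abs(t - u) <= delta and abs(s - v) <= delta for (u, v) in D)

D = {(0, 0), (2, 1)}
print(in_delta_hull((0.3, -0.4), D, 0.5))   # True : within the 0.5-hull of (0,0)
print(in_delta_hull((1.0, 1.0), D, 0.5))    # False: too far from both points
```

Equivalently, $D^{\delta}$ is the union of the closed axis-parallel squares of side $2\delta$ centred at the points of $D$.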
Let $\mathbb{Z}$ denote the set of integers. For integer-valued r.v.'s we have the following general comparison theorem.

Theorem 2.1 Let $X, Y$ be any random variables with values in $\mathbb{Z}$ and let $D \subset \mathbb{R}^2$ be any set.
For the proof of this theorem and the results of the next section, we shall need the following identity of Newman (cf. (4.10) in [9]).

Lemma 2.2
Let $g_1$ and $g_2$ be absolutely continuous functions and $X, Y$ random variables such that $g_1(X)$ and $g_2(Y)$ are square-integrable. Then
$$\mathrm{Cov}\big(g_1(X), g_2(Y)\big) = \int_{\mathbb{R}}\int_{\mathbb{R}} g_1'(t)\,g_2'(s)\,H_{X,Y}(t,s)\,dt\,ds.$$
It is worth mentioning here that taking $g_1(x) = g_2(x) = x$ recovers the Hoeffding formula $\mathrm{Cov}(X,Y) = \mathrm{Cov}_H(X,Y)$.

Proof of Theorem 2.1 For $i \in \mathbb{Z}$, let us define piecewise linear functions which are absolutely continuous and differentiable except at the points $i - 1/2$, $i$, $i + 1/2$. We shall use these functions to approximate the indicators, and by Lemma 2.2 applied to these approximations we get the desired bound. For PQD r.v.'s we immediately obtain the following corollary.
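Newman's identity can be checked numerically for integer-valued r.v.'s: $H_{X,Y}$ is constant on each cell $[i,i+1)\times[j,j+1)$, so for $g_1(x) = g_2(x) = x^2$ the double integral reduces to a weighted sum with weights $\int_i^{i+1} 2t\,dt = 2i+1$. The pmf below is a hypothetical example.

```python
# A numerical sketch of Newman's identity
#   Cov(g1(X), g2(Y)) = int int g1'(t) g2'(s) H_{X,Y}(t,s) dt ds
# for integer-valued r.v.'s with g1(x) = g2(x) = x^2.
import numpy as np

pmf = np.array([[0.3, 0.1, 0.0],
                [0.1, 0.1, 0.1],
                [0.0, 0.1, 0.2]])     # P(X=i, Y=j), i, j in {0, 1, 2}
xs = np.arange(3)
px, py = pmf.sum(axis=1), pmf.sum(axis=0)

g = xs.astype(float) ** 2             # g1 = g2 = x^2 on the support
e_g1g2 = sum(g[i] * g[j] * pmf[i, j] for i in xs for j in xs)
lhs = e_g1g2 - (g * px).sum() * (g * py).sum()

# H is constant on [i,i+1) x [j,j+1); the weight of each cell is
# int_i^{i+1} 2t dt = 2i + 1, and similarly in the second variable.
Fxy = pmf.cumsum(axis=0).cumsum(axis=1)
Fx, Fy = px.cumsum(), py.cumsum()
rhs = sum((2 * i + 1) * (2 * j + 1) * (Fxy[i, j] - Fx[i] * Fy[j])
          for i in xs for j in xs)

print(abs(lhs - rhs) < 1e-10)   # True: both sides agree
```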

Corollary 2.3 Let $X, Y$ be PQD r.v.'s with values in $\mathbb{Z}$ and let $D \subset \mathbb{R}^2$ be any set; then
The inequality (2.2) is optimal up to a constant, in the sense that its left- and right-hand sides may approach 0 with the same speed; let us illustrate this with an example in which both sides are of the order of $\mathrm{Cov}_H(X,Y)$. By rescaling, we get another corollary, for PQD r.v.'s with values in the lattice $L(a,b)$ and any set $D \subset \mathbb{R}^2$.

Absolutely continuous case
In this section we shall study absolutely continuous random vectors $[X, Y]$. Denote by $f_{X,Y}(x,y)$ the joint density of $[X,Y]$ and by $f_X(x)$, $f_Y(y)$ the marginal densities of $X$ and $Y$, respectively. We shall assume that these densities are bounded in the essential supremum norm ($L_\infty$ norm). We will obtain bounds for $H_{X,Y}(D)$, where $D \subset \mathbb{R}^2$ is a compact set whose boundary $\partial D$ is a Jordan curve (continuous, piecewise $C^1$, without self-intersections). The length of $\partial D$ will be denoted by $L$ and the planar measure of $D$ by $\mu(D)$. Recall that, by the isoperimetric inequality, we have $\mu(D) \le \frac{1}{4\pi} L^2$.
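As a quick sanity check on the isoperimetric bound, equality holds for a disk:

```latex
% If D is a disk of radius r, then L = 2\pi r and
\mu(D) = \pi r^2 = \frac{(2\pi r)^2}{4\pi} = \frac{L^2}{4\pi},
% so the inequality \mu(D) \le L^2/(4\pi) is attained.
```

Thus the factor $\frac{1}{4\pi}$ cannot be improved.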

Theorem 3.1 Let the r.v.'s X, Y and the set D be as above, then
In the proof we shall use the following elementary lemma.
We begin with a comparison inequality for a square $S_{i,j,\delta}$. Let us observe that, by Lemma 3.2 and then by Lemma 2.2, we get a bound for $H_{X,Y}(S_{i,j,\delta})$. Let us split the set of indices $I$ into two disjoint parts $I_{\partial}$ and $I_{int}$. The family $(S_{i,j,\delta})_{(i,j)\in I_{\partial}}$ covers the boundary $\partial D$, i.e.
which follows from the fact that $\partial D$ may be divided into at most $\lceil L/\delta \rceil$ consecutive parts of length $\delta$ (the last of length at most $\delta$), each of which may be covered by a square of side $\delta$ parallel to the axes, which in turn may be covered by at most four squares $S_{i,j,\delta}$. Now we have (3.5). Therefore, from (3.2), (3.3) and (3.4), it follows that
Let us put $h = \sup_{(t,s)\in D} H_{X,Y}(t,s)$ and assume that $h > 0$ (if $h = 0$, then it is easy to prove that $H_{X,Y}(D) = 0$). Further, let $\bar L = \max(L, 1)$ and assume that $\delta \le 1$. Then we put $\delta = \sqrt{2}\,\sqrt[4]{h}$ and $\eta = \sqrt{h}$ to obtain an optimal exponent at $h$. Since $h \le 1/4$, we see that $\delta \le 1$ and $\eta \le \delta/2$. Thus we get the conclusion and the proof is completed.
By direct application of (1.2) to Theorem 3.1, for PQD r.v.'s, we get the following inequality. A more careful study of the proof of Theorem 3.1, in the case of PQD r.v.'s, leads to the following result.

Theorem 3.3 Let the assumptions of Theorem 3.1 be satisfied and X, Y be PQD r.v.'s, then
Proof As in the proof of Theorem 3.1, from (3.6) we get the required estimate. If $\mathrm{Cov}_H(X,Y) \ge 1$, then the conclusion is a trivial inequality. Assume $\mathrm{Cov}_H(X,Y) < 1$ and take $\delta = \sqrt[5]{\mathrm{Cov}_H(X,Y)}$ and $\eta = (\mathrm{Cov}_H(X,Y))^{2/5}/2$ to optimize the exponent in $\mathrm{Cov}_H(X,Y)$ and arrive at the conclusion.

FGM case
It is said that the r.v.'s $X, Y$ have a joint FGM distribution function if
$$F_{X,Y}(x,y) = F_X(x)F_Y(y)\big[1 + \rho\,(1 - F_X(x))(1 - F_Y(y))\big], \qquad (4.1)$$
where $\rho \in [-1,1]$ and $F_{X,Y}$, $F_X$, $F_Y$ are the joint distribution function and the marginal d.f.'s, respectively. Denote by $f_{X,Y}$, $f_X$, $f_Y$ their densities. The corresponding copula takes the form
$$C(u,v) = uv\big[1 + \rho(1-u)(1-v)\big]$$
(for details on copulas see [12]). In this section we shall consider a more general form of (4.1). Let $\mathcal{H}$ be a family of monotonically nonincreasing functions $h : [0,1] \to \mathbb{R}$ with $h(1) = 0$, satisfying conditions under which
$$C(u,v) = uv\big[1 + h_1(u)h_2(v)\big], \qquad h_1, h_2 \in \mathcal{H}, \qquad (4.2)$$
defines a copula. It is easy to see that $C(u,v)$ is now a copula with the PQD property. Covariance inequalities for absolutely continuous r.v.'s $X, Y$ with copula of the form (4.2) were studied in [7], under measurability assumptions on the functions $h_1$, $h_2$. The resulting inequality (4.5) enables us to prove the following comparison theorem for FGM distributed r.v.'s. Proof Under our assumptions, for any Borel set $B \subset \mathbb{R}^2$, we get the conclusion by applying the transformation $u = F_X(x)$, $v = F_Y(y)$ and inequality (4.5).
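As a sanity check on the FGM form, the Hoeffding covariance of the copula's uniform marginals can be computed numerically: since $H_{U,V}(u,v) = C(u,v) - uv = \rho\,u(1-u)\,v(1-v)$ factorizes, $\mathrm{Cov}_H(U,V) = \rho\,(1/6)^2 = \rho/36$. The value of $\rho$ and the grid size below are arbitrary.

```python
# A numerical check that the FGM copula C(u,v) = uv[1 + rho(1-u)(1-v)]
# yields Hoeffding covariance rho/36 for its uniform marginals:
#   Cov_H(U,V) = int_0^1 int_0^1 (C(u,v) - uv) du dv = rho * (1/6)^2.
import numpy as np

rho = 0.8
u = np.linspace(0.0, 1.0, 2001)
g = u * (1.0 - u)                       # int_0^1 u(1-u) du = 1/6
# C(u,v) - uv factorizes, so the double integral is the product of two
# one-dimensional trapezoidal quadratures.
one_dim = ((g[:-1] + g[1:]) / 2 * np.diff(u)).sum()
cov_H = rho * one_dim ** 2

print(abs(cov_H - rho / 36) < 1e-6)     # True
```

This also confirms that positive $\rho$ gives PQD marginals, since $C(u,v) - uv \ge 0$ everywhere in that case.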
As in Example 2.4, it may be shown that (4.6) is optimal up to a constant, i.e. the left- and right-hand sides of this inequality may converge to 0 at the same rate.