New copula families and mixing properties

In this paper, we characterize absolutely continuous symmetric copulas with square integrable densities. This characterization is used to create new copula families that are perturbations of the independence copula. A full study of the mixing properties of Markov chains generated by these copula families is conducted. An extension that includes the Farlie-Gumbel-Morgenstern family of copulas is proposed. We provide examples of copulas that generate non-mixing Markov chains, but whose convex combinations generate $\psi$-mixing Markov chains. Some general results on $\psi$-mixing are given. Spearman's correlation $\rho_S$ and Kendall's $\tau$ are computed for the created copula families, together with some general remarks on $\rho_S$ and $\tau$. A central limit theorem is provided for parameter estimators in one example, and a simulation study supports the derived asymptotic distributions.


Introduction
A bivariate copula, defined as the restriction to the unit square $[0,1]^2$ of a bivariate joint cumulative distribution function with uniform marginals, is a tool that has been gaining popularity in dependence modeling. An equivalent definition can be found in Nelsen (2006) [32], where one can find several related notions. The popularity of copulas is due to Sklar's theorem, which relates them to the joint distribution and marginals of a bivariate random vector (see Sklar (1959) [35]).
Many authors have worked on building copulas with various properties. Some constructions can be found in Nelsen (2006) [32], where several construction methods are presented. Arakelian and Karlis (2014) [2] studied mixtures of copulas, which were investigated for mixing by Longla (2015) [26]. Longla (2014) [27] constructed copulas based on prescribed $\rho$-mixing properties, extending results of Beare (2010) [3] on mixing for copula-based Markov chains. Longla et al. (2022) [23] and Longla et al. (2022b) [24] constructed copulas based on perturbations. In these two works, two perturbation methods were considered. In the first method, copulas are perturbed at the level of the variables, by adding a noise component to each variable. In the second method, a copula $C(u,v)$ is perturbed by creating $C_D(u,v) = C(u,v) + D(u,v)$ and requiring that $C_D(u,v)$ satisfy the definition of a copula. This last perturbation method was also presented in Komornik et al. (2017) [19] and the references therein. Several other authors have considered extensions of existing copula families via other methods that are not the focus of this work; see for instance Morillas (2005) [31], Aas et al. (2009) [1], Klement et al. (2017) [18] and Chesneau (2021) [8], among others.
Chesneau (2021) [8] considered multivariate trigonometric copulas, which seem close to one of the example copula families of this paper, but are in fact very different from the work we provide here. We are concerned with the important question of characterizing absolutely continuous symmetric copulas with square integrable densities. After characterizing such copulas, we extract some general copula families based on specific functions, one of which is made of trigonometric functions. These copula families come with a predefined mixing structure of the copula-based Markov chains they generate. This, in general, cannot be said of the copula families of Chesneau (2021) [8], which were published when this paper was already written. We also provide the mixing structure of Markov chains generated by any of the copulas constructed in this paper. Therefore, one of the main points of the paper is the presentation of several copula families and the mixing structure of the Markov chains they generate. Moreover, for each of these copulas, we provide the joint distribution of any two variables along the Markov chain they generate, and show that the copulas of these variables also belong to the initial copula family. We show that the set of all copulas for each of the selected bases of functions is closed under the operation $*$, defined by $C^1(u,v) = C(u,v)$ and $C^n(u,v) = C^{n-1} * C(u,v)$ for $n > 1$, where $A * B(u,v) = \int_0^1 A_{,2}(u,t)\,B_{,1}(t,v)\,dt$ and $C_{,i}(u,v)$ is the derivative of $C$ with respect to the $i$-th component of $(u,v)$.
This product plays a central role in our study, and its properties can be found in Darsow et al. (1992) [9]. Recall that for $n = 2$ this product is the joint distribution of $(X_1, X_3)$ when $(X_1, X_2, X_3)$ is a stationary Markov chain with copula $C(u,v)$ and the uniform marginal distribution. In general, the copula $C^n(u,v)$ is the joint distribution of $(X_0, X_n)$ when $X_0, \cdots, X_n$ is a stationary Markov chain generated by $C(u,v)$ and the Uniform(0,1) distribution. This is a simple consequence of applying the product formula recursively (see Longla (2022) [23]).
The product $*$ and related notions are used in this paper to derive properties of Markov chains, including mixing and association. In Longla et al. (2022) [23], measures of association such as Spearman's $\rho$ and Kendall's $\tau$ were studied for perturbations of copulas. Recall that in terms of copulas, these coefficients are defined as
$$\rho_S(C) = 12\int_0^1\!\!\int_0^1 C(u,v)\,du\,dv - 3 \quad (1) \qquad \text{and} \qquad \tau(C) = 1 - 4\int_0^1\!\!\int_0^1 C_{,1}(u,v)\,C_{,2}(u,v)\,du\,dv. \quad (2)$$
Formulas (1) and (2) are respectively formula (5.1.15c) and formula (5.1.12) of Nelsen (2006) [32]. For definitions and practical use of these measures, we refer the reader to Nelsen (2006) [32] and references therein. In this paper, our interest in these measures is in finding the relationship between them and the parameters of the copulas, and then proposing an estimation procedure. Among the copula families for which we derive measures of association is the family of copulas based on Legendre polynomials. This family is an extension of the Farlie-Gumbel-Morgenstern (FGM) copula family. The FGM family has been extensively studied in the literature, with several extensions (see Hurlimann (2017) [17] or Ebaid et al. (2022) [13]). Our extension is unrelated to the extensions that exist in the literature; we use a new approach to extend this family in a way that has not been done before. Moreover, we show that copulas from this family generate $\psi$-mixing stationary Markov chains for all values of the family parameter $\theta \in [-1,1]$. This improves a previous result of Longla et al. (2022) [23], which did not cover the boundary points $\theta = \pm 1$. The new characterization of copulas obtained in this paper results in a new, elegant proof of $\psi$-mixing that includes the boundary points. For more on mixing coefficients, see Bradley (2005) [6], Beare (2010) [3] and the references therein. To avoid confusion, in this paper we say that a function $f(x)$ is a version of the function $g(x)$ if and only if the Lebesgue measure (in the appropriate dimension) of the set on which they differ is 0.
This paper is structured as follows. In Section 2, we present a characterization of copulas with square integrable densities. The obtained characterization is used to construct new copula families. Among these general functional families of copulas are finite and infinite sums, including new sine and cosine copula families. A second group of copulas presented as an example here is based on Legendre polynomials and contains the FGM copula. We also provide the Spearman's $\rho_S$ and Kendall's $\tau$ coefficients for each of the example copula families, along with some useful remarks. All these results are new to the best of our knowledge. In Section 3, we develop mixing properties of the extended FGM copula family based on Legendre polynomials. Several examples are provided, including some copulas that do not generate mixing Markov chains but whose convex combinations generate mixing Markov chains. These examples answer an open question on convex combinations of copulas and mixing. A new general result on mixing for Markov chains generated by copulas with square integrable densities is given. We provide an extension of a previous result of Longla et al. (2022c) [25] on $\psi$-mixing and $\psi'$-mixing. Namely, we show that $\psi$-mixing follows from the two conditions $c_{s_1}(u,v) > 0$ on a set of Lebesgue measure 1 and $c_{s_2}(u,v) < 2$ for all $(u,v) \in [0,1]^2$. In Section 4, we provide some parameter estimators and their asymptotic distributions via central limit theorems, together with a simulation study. Section 5 covers comments and conclusions. The Appendix of proofs ends the paper.

Square integrable symmetric densities
In this section, we provide a characterization of copulas with square integrable densities. The aim is to show that any such copula can be represented as a sum of possibly infinitely many non-zero terms. Such a representation indicates a new way to construct copula families by modifying the independence copula $\Pi(u,v) = uv$. We use this method in this section to provide new sets of copulas and compute their coefficients $\rho_S$ and $\tau$.
Recall that a copula in general is defined as a multivariate cumulative distribution function whose restriction to $[0,1]^2$ has uniform marginals on $(0,1)$. For any bivariate random variable $(X,Y)$ with continuous marginal distributions $(F_X, F_Y)$, the copula of $(X,Y)$ is the joint cumulative distribution function of $(F_X(X), F_Y(Y))$. Note that both $F_X(X)$ and $F_Y(Y)$ are uniform on $[0,1]$. This is why most works on copula theory ignore the marginal distributions and are limited to uniform marginals. This is done without loss of generality when the variables are assumed continuous (see Nelsen (2006) [32], Durante and Sempi (2016) [12] or Sklar (1959) [35] for more on the topic). This work is concerned mainly with absolutely continuous symmetric bivariate copulas $C(u,v)$ for which the density function $c(u,v)$ exists and satisfies
$$C(u,v) = \int_0^u\!\!\int_0^v c(s,t)\,dt\,ds. \quad (3)$$
Formula (3) follows from the fact that the singular part of the copula vanishes for absolutely continuous copulas (see Darsow et al. (1992) [9]). It is known that the density of a copula is a non-negative bivariate function on $[0,1]^2$. Many authors have worked on copulas and their properties. Copulas are used to model dependence in various applied fields, for problems that include but are not limited to estimation, classification and statistical tests of hypotheses. When used to model dependence, they help establish mixing properties of Markov chains, which are useful in establishing central limit theorems for sample averages of functions of observations (see Doukhan et al. (2009) [10], Chen and Fan (2006) [7], Peligrad and Utev (1997) [33], and Jones (2004) [20]). For more on properties of copulas, see Darsow et al. (1992) [9] and Beare (2010) [3].
In this paper, we say that a copula $C(u,v)$ is symmetric if and only if $C(u,v) = C(v,u)$ for all $(u,v) \in [0,1]^2$. Such copulas are useful in modeling reversible Markov chains. For a symmetric absolutely continuous copula, the density $c(u,v)$ is square integrable if and only if
$$\int_0^1\!\!\int_0^1 c^2(u,v)\,du\,dv < \infty. \quad (4)$$
Square integrable copula densities are kernels of Hilbert-Schmidt operators on the Hilbert space $L^2(0,1)$. This means that the linear operator defined by
$$(Kf)(x) = \int_0^1 c(x,y)f(y)\,dy \quad (5)$$
is a Hilbert-Schmidt operator on $L^2(0,1)$ (see Ferreira and Menegatto (2009) [15]). Let $\{\varphi_k(x), k \in \mathbb{N}\}$ be an orthonormal basis made of eigenfunctions of the operator $K$, i.e. $K\varphi_k = \lambda_k\varphi_k$, where $\lambda_k$ is the eigenvalue of $K$ associated to the eigenfunction $\varphi_k(x)$ and the equality is understood in the sense of $L^2(0,1)$. Based on this, for a symmetric copula with square integrable density, we have
$$c(u,v) = \sum_{k=1}^{\infty} \lambda_k\,\varphi_k(u)\varphi_k(v) \quad (6)$$
(see also Beare (2010) [3], Longla (2014) [27]). Moreover:
• The sequence of $|\lambda_k|$ takes a finite number of values (counting multiplicities), or converges to 0.
• $\{\varphi_k(x), k \in \mathbb{N}\}$ can be selected to form an orthonormal system in $L^2(0,1)$.
Since $L^2(0,1)$ is a separable Hilbert space and the orthonormal system made of eigenfunctions of this operator is complete, formula (6) holds in the sense of $L^2(0,1)$. Moreover, by Mercer's theorem (see formula 27, page 445 of Mercer (1909) [29], or Theorem 2.4 of Ferreira and Menegatto (2009) [15]), the convergence of the series is point-wise and uniform on $(0,1)^2$ when the operator is non-negative definite (when all eigenvalues are non-negative). The same conclusion holds when all eigenvalues are non-positive (when the operator is non-positive definite). In fact, Mercer (1909) [29] showed that for any continuous non-negative definite kernel $k(x,y)$, the series converges point-wise and uniformly. A non-negative definite kernel is a function $k(x,y)$ such that $\sum_{i,j=1}^n k(x_i,x_j)s_i s_j \geq 0$ for all points $x_1, \ldots, x_n$ and all real numbers $s_1, \ldots, s_n$.
Moreover, for a continuous symmetric kernel $k(x,y)$, the trace of the operator is $\int_0^1 k(x,x)\,dx = \sum_k \lambda_k$. Therefore, the series of partial sums of eigenvalues converges when the density of the copula is symmetric, square integrable, positive definite and continuous on $[0,1]^2$. It is important to mention that these conditions force the eigenvalues to be all positive (or all negative), therefore ruling out some copulas. However, the decomposition can be extended by continuity to copulas with eigenvalues of both signs.
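To make the spectral picture concrete, one can discretize a copula density and inspect the eigenvalues of the resulting kernel matrix. The sketch below is our illustration, not part of the original derivation: it uses the FGM density $c(u,v) = 1 + \theta(1-2u)(1-2v)$ (a member of the extended family studied later), whose spectrum is $\{1, \theta/3, 0, 0, \ldots\}$; the grid size and $\theta = 0.6$ are our choices.

```python
import numpy as np

n = 400
theta = 0.6  # FGM parameter, |theta| <= 1 (our choice for illustration)
x = (np.arange(n) + 0.5) / n  # midpoints of a uniform grid on (0, 1)

# FGM copula density c(u, v) = 1 + theta * (1 - 2u)(1 - 2v)
c = 1.0 + theta * np.outer(1.0 - 2.0 * x, 1.0 - 2.0 * x)

# Discretized kernel operator (K f)(x) = int_0^1 c(x, y) f(y) dy
K = c / n
eig = np.sort(np.linalg.eigvalsh(K))[::-1]

# Spectrum: lambda_1 = 1 (eigenfunction phi = 1) and lambda_2 = theta/3
# (eigenfunction phi(x) = sqrt(3)(2x - 1)); all other eigenvalues vanish.
print(eig[:3])
```

The constant eigenfunction with eigenvalue 1 from Proposition 2.1 appears exactly in the discretization, and the single non-trivial eigenvalue matches $\theta/3$ up to quadrature error.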
Proposition 2.1 For any absolutely continuous copula $C(u,v)$, the following holds.
1. The kernel operator (5) has an eigenvalue $\lambda = 1$ associated to the eigenfunction $\varphi(x) = 1$.
2. For any square integrable symmetric copula density, there exists a basis of eigenfunctions of (5). Moreover, there exists a decomposition (6) containing $\varphi_1(x) = 1$ associated to the eigenvalue $\lambda_1 = 1$.
3. A selected basis might not contain $\varphi(x) = 1$ when the eigenvalue $\lambda = 1$ has multiplicity at least 2, generating a subspace of dimension higher than 1 with an orthonormal basis not containing $\varphi(x) = 1$.
Note that when the eigenvalue $\lambda = 1$ has multiplicity $s > 1$ in the decomposition, $s$ has to be finite because of square integrability. It is also worth mentioning that some functions cannot be eigenfunctions associated to the eigenvalue 1; for instance, $\varphi_k(x)$ has to be bounded.
We devote the following subsections of this work to the case when $\varphi(x) = 1$ is used. This case is interesting in itself because it turns into the question of perturbations of the independence copula (see Longla et al. (2022) [23]). As we will see below, the mixing structure of Markov chains generated by our constructed copula families is easily established in general. This eases the study of estimators of coefficients and measures of association.

Perturbations of the copula Π(u, v)
The copula $\Pi(u,v) = uv$ is called the independence copula. It is used to model data under the assumption of independence; that is, the copula of independent random variables is $\Pi(u,v)$. The notion of perturbation has recently been used in many papers to address modifications made to a copula to improve estimation results. For the independence copula, this is done by introducing some level of dependence (see Durante et al. (2013) [11]) through a function $D(u,v)$. When $\Pi(u,v)$ is perturbed, the obtained copula exhibits some dependence that can be estimated to better fit the data at hand.
Various types of perturbations have been considered in Durante et al. (2013) [11], Komornik et al. (2017) [19], Longla et al. (2022) [23], Longla et al. (2022c) [25] and the references therein. In this section, we investigate perturbations of the kind $C_D(u,v) = \Pi(u,v) + D(u,v)$, where $D(u,v)$ is the perturbation factor, to be determined using (6). This perturbation method was studied in Komornik et al. (2017) [19] and the references therein. Longla et al. (2022) [23] and Longla et al. (2022c) [25] considered mixing for Markov chains generated by such perturbations.
Remark 1 The use of $M(u,v)$ to perturb a copula $C(u,v)$ for data with uniform marginals assumes that values of the Markov chain have a positive probability of repeating themselves before new values are provided from the transition copula $C(u,v)$.
Considering the decomposition (6), when the first term uses $\varphi(x) = 1$, reindexing the remaining functions $\varphi_k(x)$ and integrating leads to
$$C(u,v) = uv + \sum_{k=1}^{\infty} \lambda_k\,\Phi_k(u)\Phi_k(v), \quad \text{where } \Phi_k(u) = \int_0^u \varphi_k(t)\,dt. \quad (7)$$
This implies $C(u,v) = uv + D(u,v)$, where $D(u,v) = \sum_k \lambda_k\,\Phi_k(u)\Phi_k(v)$. Note that $D(u,0) = D(u,1) = 0$ and $D(u,v)$ is symmetric. Therefore, the perturbation is a copula when its second-order mixed derivative is non-negative (refer to Longla et al. (2022) [23]). So, any representation (7) with $\lambda_k \in \mathbb{R}$ defines a copula with square integrable density if and only if conditions (8) and (9) hold. Condition (9) is required for square integrability of the density and takes into account the fact that the $\varphi_k(x)$ form an orthonormal system of functions. Condition (8) is satisfied when condition (10) holds. Note that condition (10) is not necessary: for different values of $k$, the functions $\varphi_k(x)$ might attain their minima or maxima at different points. It is a strong condition, but easy to handle in practice. Note that when condition (10) holds and the $\varphi_k(x)$ are uniformly bounded, then (9) holds. This can be shown by first showing that the sum of the $|\lambda_k|$ is convergent and larger than the sum in (9). All these conditions are satisfied for every system of functions considered in this section. Thus, the following characterization theorems hold.
Theorem 2.2 Any symmetric absolutely continuous copula with square integrable density can be represented by (7), where $\{1, \varphi_k(x), k \geq 1\}$ is an orthonormal system of $L^2(0,1)$.
In Theorem 2.2, equality is understood as equality of operators in $L^2(0,1)$. This equality turns into point-wise or uniform convergence when the density of the copula is continuous and defines a positive semi-definite operator.
Theorem 2.3 Suppose the sequence $\Lambda = (\lambda_k)$ is such that conditions (9) and (10) hold for an orthonormal system $\Phi$ of functions of $L^2(0,1)$ that contains $\varphi_1(x) = 1$. Then the function (7) is an absolutely continuous symmetric copula with square integrable density, which we call a Type-I Longla copula, with notation $L(u,v,\Lambda,\Phi)$.
For any system of functions $\varphi_k(x)$ that is uniformly bounded, condition (10) implies condition (9). Theorem 2.3 opens the way to various copula families with functions as parameters. These copulas can have finite or infinite sums of terms. They depend both on the system of functions $\varphi_k(x)$ and on the coefficients $\lambda_k$. The characterization formulated above summarizes this analysis.

Examples of new copula families
Theorems 2.2 and 2.3 provide a relationship between the extreme values of the functions $\varphi_k(x)$ and the possible eigenvalues $\lambda_k$ of (5). For example, a function $\varphi(x)$ with maximum 2 and minimum $-3$ cannot be associated to an eigenvalue less than $-1/9$ or larger than $1/6$ when all other coefficients are equal to 0. The larger the values of $|\varphi_k(x)|$, the smaller the admissible range of $\lambda_k$ when the sum (6) contains only $\varphi_k(x)$ and 1. Setting $\Phi(x) = \int_0^x \varphi(s)\,ds$ for any mean-zero normalized function $\varphi(x)$ on $[0,1]$, we obtain the copula (11). An example of copula (11) is the copula (12). Copula (12) is also a representative of the only class of copulas with square integrable symmetric densities based on a single function $\varphi(x)$ that takes two values on $[0,1]$. This class of copulas is defined using a set $A$ of measure $\frac{1}{\alpha+1}$ for some strictly positive real number $\alpha$; copula (12) is obtained via integration to get $\Phi(x)$ of formula (11).
For copula (12), while the parameter $\lambda$ accounts for dependence (we will show later that its absolute value is the maximal coefficient of correlation), the parameter $\alpha$ accounts for the portions of $[0,1]^2$ that have constant likelihood values. In fact, the density of this copula is constant on each of the four portions of its support. Via simple computations, we establish the values of $\rho_S(C_{\alpha,\lambda})$ and $\tau(C_{\alpha,\lambda})$ for copula (12).
Remark 2 The coefficients $\rho_S(C_{\alpha,\lambda})$ and $\tau(C_{\alpha,\lambda})$ show that, for the purpose of correlation or association, the range of $\alpha$ can be limited to $(0,1]$, because $1/\alpha$ and $\alpha$ produce the same value of $\rho_S(C_{\alpha,\lambda})$ and $\tau(C_{\alpha,\lambda})$. The maximum of these coefficients is reached at $\alpha = 1$.
Interest in the copula family $C_{1,\lambda}(u,v)$ lies in the fact that the range of the Pearson correlation coefficient of functions of variables $(U,V)$ generated by this copula family is $[-1,1]$. For $\lambda \neq \pm 1$, the procedure to generate a stationary Markov chain from $C_{1,\lambda}(u,v)$ with the uniform marginal distribution is as follows. We generate an observation $U_0$ from Uniform(0,1); then, for every integer $i$, we generate $Q_i$ from Uniform(0,1) and define $U_i$ using the quantile function of the conditional distribution of $U_i | U_{i-1}$. The quantile function is the inverse, with respect to $v$, of the derivative with respect to $u$ of $C(u,v)$ at the point $(U_{i-1}, U_i)$ (see Longla and Peligrad (2012) [28]). This yields formula (13). Recall that the joint cumulative distribution function of $(U_0, U_n)$ is the copula $C_{1,\lambda^n}(u,v)$. Therefore, the following holds.
Lemma 1 For any stationary Markov chain generated by $C_{1,\lambda}(u,v)$ and the uniform marginal distribution, with $\lambda \neq \pm 1$, we have $U_n - Q_n \to 0$ in probability as $n \to \infty$.
The limiting value of $U_n$ as $n \to \infty$ does not depend on the initial value $U_0$. This is true because $\lambda$ is replaced by $\lambda^n \to 0$ in formula (13) and $Q_n$ is a random sample obtained independently. This fact is one of the selling points of the study of mixing properties: the initial point of the Markov chain does not affect the long-run behavior under mixing assumptions.
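The generation procedure described above can be sketched in code. Since formula (13) is not reproduced here, the sketch below implements the conditional-quantile step for our reading of copula (12) with $\alpha = 1$: density $1+\lambda$ on $[0,\tfrac12]^2 \cup (\tfrac12,1]^2$ and $1-\lambda$ elsewhere. The function name and the values $\lambda = 0.5$, $n = 20000$ are our choices.

```python
import numpy as np

def next_state(u_prev, q, lam):
    """One conditional-quantile step: invert F(v | u) = dC/du in v.

    Assumes the piecewise-constant density of copula (12) with alpha = 1:
    1 + lam on the diagonal quarter-squares, 1 - lam off them (our reading)."""
    lo = 1.0 + lam if u_prev <= 0.5 else 1.0 - lam  # density of v on [0, 1/2]
    hi = 2.0 - lo                                   # density of v on (1/2, 1]
    if q <= lo / 2.0:
        return q / lo
    return 0.5 + (q - lo / 2.0) / hi

rng = np.random.default_rng(0)
lam, n = 0.5, 20000
u = np.empty(n)
u[0] = rng.uniform()                    # stationary start: Uniform(0, 1)
for i in range(1, n):
    u[i] = next_state(u[i - 1], rng.uniform(), lam)

print(u.mean())  # stationary marginal is Uniform(0,1), so the mean is near 1/2
```

Consecutive states fall in the same half of $(0,1)$ with probability $(1+\lambda)/2$, which is one way to see the dependence that vanishes as the chain mixes.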
An extension of the copula family (12) is obtained by considering, for all $s \in \mathbb{N}^*$, the density given by formula (14). Formula (14) defines the density of a copula for any admissible set of parameters. Via simple computations, we show that the corresponding functions form an orthonormal system. Moreover, this copula coincides with copula (12) when $s = 1$ and $a_1 = 0$. This copula departs from independence by adding local perturbations of size $\pm\theta$ to rectangles $[a_i, a_{i+1}] \times [a_i, a_{i+1}]$. An interesting case is when all functions have the same maximum and the same minimum, and the absolute values of these two numbers coincide; the range of the $\lambda_k$ is then symmetric. This is the case when $\varphi_k(x)$ is $a\sin(w_k x)$ or $a\cos(w_k x)$. We consider these cases in the subsections below.

Copulas based on trigonometric functions
Here, we look into copula families based on trigonometric bases of $L^2(0,1)$. These copulas are different from those introduced in Chesneau (2021) [8] because they do not involve sums of variables. Moreover, the mixing structure of the copulas of Chesneau (2021) [8] is not as evident as that of the copula families we construct here. It is also worth mentioning that the copula families introduced here can be finite or infinite sums, and that their fold products remain in the considered classes. When modeling with these copulas, the existence of waves in the scatter plot can indicate the number of functions to consider in the sum. An observation on $\rho_S(C)$ and $\tau(C)$ shows that some terms can be added to increase linear correlation without affecting association.
• The sine-cosine copulas as perturbations of $\Pi(u,v)$. We call sine-cosine copulas the perturbations of the independence copula built on the trigonometric orthonormal basis of $L^2(0,1)$ that consists of the functions $\{1, \sqrt{2}\cos(2\pi kx), \sqrt{2}\sin(2\pi kx), k \geq 1\}$. The resulting function (15) is a copula when its coefficients satisfy the non-negativity condition, and condition (9) then holds automatically. The series (15) also converges uniformly and absolutely on $[0,1]^2$, thanks to the Weierstrass $M$-test and the fact that the functions in the series are uniformly bounded on $[0,1]^2$. Figure 1 shows examples of densities from this family. The Spearman's $\rho$ and Kendall's $\tau$ for this copula family are given below.
We can see in Figure 1 that the range of the copula density decreases as we increase the number of terms in the sum. For sine-cosine copulas, Lemma 2 shows that the cosine terms in the density of the copula do not affect $\rho_S(C)$, while the sine terms affect $\rho_S(C)$ linearly.
Remark 3 Any copula from this family with $\mu_k = 0$ for all integers $k \geq 1$ satisfies $\rho_S(C) = \tau(C) = 0$. This is the subclass of copulas whose densities contain only cosine terms. Therefore, Spearman's $\rho$ and Kendall's $\tau$ are not good parameters for tests of independence, because they do not uniquely identify a member of this family.
Fig. 2: Examples of sine copula densities
Lemma 3 gives $\rho_S(C)$ and $\tau(C)$ for the copulas defined by formula (17). This family of copulas seems close to the sine-cosine family, but the formula for $\rho_S(C)$ or $\tau(C)$ in Lemma 3 shows that they are not comparable. We see that for these copulas $\rho_S(C) = 0$ when all odd coefficients are equal to zero. Formula (16) defines a subclass of copulas from (17) for which all odd coefficients are equal to zero. Any finite sum of terms from (17) that includes the first term $uv$, and whose coefficients satisfy condition (18), is a copula. An example is the copula (19).
Remark 4 The maximal coefficient of correlation of copula-based Markov chains generated by the copulas (19) is $|\lambda|$. We find $\rho_S(C) = \frac{96}{\pi^4}\lambda$ and $\tau(C) = 64(\cdots)$.
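A quick numerical check of the copula properties of a one-term sine-cosine density may be helpful. Since the display (15) is not reproduced above, the density below is our one-term reading of it, with coefficients chosen so that $2(|\lambda_1| + |\mu_1|) \leq 1$ keeps the density non-negative; the grid size is our choice.

```python
import numpy as np

# One-term sine-cosine density (our reading of (15)):
# c(u,v) = 1 + 2*lam*cos(2 pi u)cos(2 pi v) + 2*mu*sin(2 pi u)sin(2 pi v)
lam, mu = 0.3, 0.1  # chosen so 2*(|lam| + |mu|) <= 1, keeping c >= 0

x = (np.arange(500) + 0.5) / 500  # midpoint grid on (0, 1)
U, V = np.meshgrid(x, x)
c = (1 + 2 * lam * np.cos(2 * np.pi * U) * np.cos(2 * np.pi * V)
       + 2 * mu * np.sin(2 * np.pi * U) * np.sin(2 * np.pi * V))

print(c.min())                            # lower bound 1 - 2(|lam|+|mu|) = 0.2
print(np.abs(c.mean(axis=0) - 1).max())   # each conditional slice integrates to 1
```

The midpoint average over a full period of the trigonometric terms vanishes, so the marginal check is exact up to floating-point rounding; this is the uniform-marginals property that makes (15) a copula density.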

The extended Farlie-Gumbel-Morgenstern copulas
In this subsection, we construct an example of a copula family based on formula (7) that extends the Farlie-Gumbel-Morgenstern copulas. The FGM copula is a perturbation of the independence copula (see Longla et al. (2022c) [25]). The form of the FGM copula allows an extension based on shifted Legendre polynomials, an orthonormal basis of $L^2(0,1)$. It is important to mention that formula (7) allows various families of extensions of the FGM copula, depending on the orthonormal system of functions used. Legendre polynomials are defined by Rodrigues' formula
$$P_k(x) = \frac{1}{2^k k!}\,\frac{d^k}{dx^k}(x^2-1)^k,$$
discovered by Rodrigues (1816) [34]. Legendre polynomials form an orthogonal basis of $L^2(-1,1)$, with $\int_{-1}^{1} P_j(x)P_k(x)\,dx = \frac{2}{2k+1}\delta_{jk}$. They appear in many fields of mathematics as solutions of a special differential equation and are used for approximation of functions. When they are shifted to $[0,1]$, a renormalization is used to obtain shifted orthonormal polynomials: the transformation $y = 2x - 1$ gives the orthonormal basis of shifted Legendre polynomials as $\varphi_k(x) = \sqrt{2k+1}\,P_k(2x-1)$. The minimum of each of these functions is achieved at at least one point. Therefore, when (7) contains $\varphi_k(x)$ alone, the range of $\lambda_k$ is determined by the extreme values of $\varphi_k(x)$.
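To make the shifted basis concrete, it can be built and checked numerically with NumPy's Legendre module. This is a minimal sketch under the normalization stated above; the quadrature order and the number of polynomials checked are our choices.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Shifted, normalized Legendre polynomials on [0, 1]:
# phi_k(x) = sqrt(2k + 1) * P_k(2x - 1)
def phi(k, x):
    coef = np.zeros(k + 1)
    coef[k] = 1.0  # select P_k in the Legendre-coefficient representation
    return np.sqrt(2 * k + 1) * L.legval(2 * x - 1, coef)

# Check orthonormality <phi_j, phi_k> = delta_jk by Gauss-Legendre quadrature,
# which is exact for these polynomial products.
nodes, weights = L.leggauss(20)
x = (nodes + 1) / 2   # map the quadrature nodes from [-1, 1] to [0, 1]
w = weights / 2       # rescale the weights accordingly
G = np.array([[np.sum(w * phi(j, x) * phi(k, x)) for k in range(5)]
              for j in range(5)])
print(np.abs(G - np.eye(5)).max())  # ~ 0 up to rounding error
```

Note that $\varphi_0(x) = 1$, so this basis contains the constant eigenfunction required by the construction (7).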
Lemma 4 gives $\rho_S(C)$ and $\tau(C)$ for any copula of the extended FGM copula family. Simple computations show that $-1/3 \leq \rho_S(C) \leq 1/3$, and $\tau(C)$ can be increased or decreased by adding more non-zero terms to the series. In general, copulas with $\varphi_k(x)$ given by formula (22) satisfy condition (10). We call this set of copulas the extended Legendre-Farlie-Gumbel-Morgenstern family of copulas. Some copulas of this form are given below, for parameter ranges such as $|\lambda| \leq 1/3$. The first example is the FGM copula; for more on this copula family, see Farlie (1960) [14], Gumbel (1960) [16], Johnson and Kotz (1975) [21] or Morgenstern (1956) [30]. In general, this copula has parameter $\theta$ in place of $3\lambda_1$. The second example is one of the possible extensions of this copula family, for which we compute the measures of association. Any subclass of the copula family defined with Legendre polynomials and containing the copula given by formula (24) is an extension of the FGM family.
Depending on what the investigator is looking for, it might be better to set some coefficients equal to zero, or to adjust the signs of the coefficients to increase or reduce the strength of dependence. This is justified by formula (26). The FGM copula family has parameter $\theta = 3\lambda$, with $|\theta| \leq 1$, so $|\lambda| \leq 1/3$. No copula from our extension can have $\rho_S(C)$ outside this range, because increasing the number of non-zero parameters reduces the range of $\lambda_1$. Example (25) has $-1/5 \leq \lambda_2 \leq 1/5$ when $\lambda_1 = 0$, but can be extended to $-1/5 \leq \lambda_2 \leq 2/5$. For any copula from this family, it is worth formulating the following.
Remark 5 Lemma 4 shows that any copulas from this family with the same $\lambda_1$ have the same $\rho_S(C)$, while this is not the case for $\tau(C)$. Other formulations of these copulas use the basis $\{1, \varphi_{j_k}(x)\}$, where $j_k$ is a sequence of positive integers.
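As a sanity check on the association coefficients of this family, one can verify a Monte Carlo estimate of Spearman's $\rho_S$ for the FGM member, for which the classical value is $\rho_S = \theta/3$ (that is, $\lambda_1$ in the notation above). The conditional-quantile inversion below is the standard quadratic-root formula for the FGM copula, written in a numerically stable form; the parameter value, sample size and seed are our choices.

```python
import numpy as np
from scipy.stats import spearmanr

# FGM copula C(u,v) = uv + theta*u*v*(1-u)*(1-v); known result: rho_S = theta/3.
theta = 0.9
rng = np.random.default_rng(1)
n = 100000
u = rng.uniform(size=n)
q = rng.uniform(size=n)

# Invert the conditional CDF F(v|u) = v + theta*(1-2u)*v*(1-v) in v:
# the quadratic a*v^2 - (1+a)*v + q = 0 with a = theta*(1-2u), taking the
# root in [0, 1], written stably (as a -> 0 this reduces to v = q).
a = theta * (1.0 - 2.0 * u)
v = 2.0 * q / (1.0 + a + np.sqrt((1.0 + a) ** 2 - 4.0 * a * q))

rho, _ = spearmanr(u, v)
print(rho)  # close to theta / 3 = 0.3
```

The same pairs $(u, v)$ can be reused to estimate Kendall's $\tau$, whose classical FGM value is $2\theta/9$.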

Counter example -the sine basis
This case shows that not all orthonormal bases of $L^2(0,1)$ can be used to construct copulas with square integrable densities. We take the case of functions constructed using the sine basis; the resulting functions are not copulas under any assumptions. In fact, suppose such an expansion defines a copula. Then, using $C(u,1) = u$ and taking the derivative of the series, we obtain an identity which, after multiplying by $\sin(k\pi u)$ and integrating, leads to $\frac{1-\cos k\pi}{k\pi} = \frac{\lambda_k}{\pi k}(1-\cos k\pi)$. Therefore, $\lambda_{2k+1} = 1$ for all $k$. As said earlier, the function $\varphi(x) = 1$ would then be embedded in an infinite-dimensional subspace of eigenfunctions associated to the eigenvalue 1. This contradicts condition (9). So, no such decomposition represents a square integrable copula density.

Copula-based Markov chains and ψ-mixing
We study here the long-run behavior of several copula families constructed with various bases of $L^2(0,1)$. We provide the study of the mixing structure of copula-based Markov chains generated by our constructed copulas. In some cases, we provide a modification of the copulas to present them as convex combinations of members of the constructed families. This transformation provides a link between this work and the results of Longla et al. (2022) [23] and Longla (2015) [26]. Mixing coefficients are used in probability theory to establish central limit theorems for sums of dependent random variables (see Peligrad and Utev (1997) [33]), which are used to construct confidence intervals for means of functions of Markov chains. Copulas characterize the dependence structure of Markov chains, or of multivariate random variables in general, via Sklar's theorem (see Sklar (1959) [35]) when the marginal distributions are continuous. Therefore, it is important to look into conditions on copulas that guarantee a given dependence structure (see Beare (2010) [3], Longla (2015) [26] among others). Dependence structures considered in the literature include $\alpha$-mixing, $\beta$-mixing, $\rho$-mixing, $\phi$-mixing, $\psi$-mixing, $\psi'$-mixing, $\psi^*$-mixing and others. We define only the mixing coefficients of interest in this paper; for more on mixing coefficients, see Bradley (2007) [5]. For a stationary Markov chain $X_0, \cdots, X_n$ with $\mathcal{A} = \sigma(X_0)$, the mixing coefficients are defined as
$$\psi'_n = \inf \frac{P_n(A\cap B)}{P(A)P(B)}, \qquad \psi^*_n = \sup \frac{P_n(A\cap B)}{P(A)P(B)}, \qquad \psi_n = \sup \frac{|P_n(A\cap B) - P(A)P(B)|}{P(A)P(B)},$$
where the supremum and infimum are taken over $A \in \sigma(X_0)$ and $B \in \sigma(X_n)$ with $P(A)P(B) > 0$, $P_n(B|A)$ stands for $P(X_n \in B \,|\, X_0 \in A)$, and $P_n(A \cap B)$ stands for $P(X_0 \in A \text{ and } X_n \in B)$. Using these coefficients, Bradley (2005) [6] presented the following result in his survey. Suppose $X_0, \cdots, X_n$ is a strictly stationary sequence of random variables. If there exists $n$ such that $\psi'_n > 0$ and $\psi^*_n < \infty$, then $\psi_n \to 0$ (the sequence is $\psi$-mixing). Based on this statement and the results of Longla (2015) [26] and Longla et al. (2022c) [25], we can conclude the following.
Theorem 3.1 An absolutely continuous copula $C(u,v)$ generates $\psi$-mixing stationary Markov chains when there exist integers $s_1$ and $s_2$ such that some versions of the densities of $C^{s_1}(u,v)$ and $C^{s_2}(u,v)$ satisfy one of the following conditions.
1. $c_{s_1}(u,v) > 0$ on a set of Lebesgue measure 1, and there exists a constant real number $M < 2$ such that $c_{s_2}(u,v) \leq M$.
It is important in Theorem 3.1 that the version of the density be bounded above on $[0,1]^2$. Failure of this condition on a set of Lebesgue measure 0 can imply that there is no $\psi$-mixing.
Theorem 3.2 Assume that a version of the density of the fold product of the convex combination $\sum_i a_i C_i$ is bounded away from zero on a set of Lebesgue measure 1 for some positive integers $s_1, s_2$. Assume that a version of the density of $(\sum_i a_i C_i)^{s_3}(u,v)$ is bounded above for some positive integer $s_3$, or that a version of the density of $(\sum_i a_i C_i)^{s_4}(u,v)$ is strictly less than 2 for some $s_4 \in \mathbb{N}$. Then, any convex combination of the copulas $C_1(u,v), \cdots, C_k(u,v)$ generates exponentially $\psi$-mixing stationary Markov chains with continuous marginals.
Note that when $\alpha = 1$, the range of $\lambda$ includes $\pm 1$, for which there is no $\rho$-mixing, $\psi^*$-mixing or $\psi'$-mixing. This case corresponds to the extreme cases of the density (30). When $|\lambda| < 1$, the density (30) splits $[0,1]^2$ into 4 subsets; it is constant on each of the subsets, and the union of the 4 subsets has full Lebesgue measure. Clearly, for $|\lambda| = 1$, this density is not bounded away from zero.

Mixing properties of the extended FGM family
We investigate the long-run behavior of Markov chains generated by extended FGM copulas. Define $\alpha_k = 1$ if $k$ is odd and $\alpha_k = \min_x P_k(x)$ if $k$ is even.
Theorem 3.3 For large enough values of $n$, a version of the copula density $c_n(u,v)$ is bounded above by $M < 2$ when $\varphi_k(x)$ is defined using Legendre polynomials with admissible coefficients. Therefore, the copula $C(u,v)$ generates $\psi^*$-mixing stationary Markov chains (which is equivalent to $\psi$-mixing for stationary Markov chains) for all values of its parameters.
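The convergence behind Theorem 3.3 can be observed numerically by iterating the fold product on a grid. The sketch below is our discretized illustration with the FGM member ($\theta = 0.9$), for which $c_n(u,v) = 1 + 3(\theta/3)^n(2u-1)(2v-1)$, so the sup-distance to 1 decays geometrically; the grid size is our choice.

```python
import numpy as np

# Numerical fold product: c_{m+1}(u,v) = int_0^1 c_m(u,t) c(t,v) dt,
# discretized on a midpoint grid (matrix product with weight 1/n).
n = 300
theta = 0.9  # FGM parameter (our choice); eigenvalues are 1 and theta/3
x = (np.arange(n) + 0.5) / n
C1 = 1.0 + theta * np.outer(1 - 2 * x, 1 - 2 * x)  # FGM density on the grid

Cm = C1.copy()
sup_err = []
for _ in range(5):
    sup_err.append(np.abs(Cm - 1.0).max())  # sup |c_m - 1| on the grid
    Cm = Cm @ C1 / n                        # one fold product step

print(sup_err)  # decays like 3 * (theta/3)^m, i.e. ratio theta/3 per step
```

The geometric decay of $\sup|c_n - 1|$ is exactly the bound $c_n \leq M < 2$ for large $n$ that drives the $\psi^*$-mixing conclusion.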
Remark 6 It follows from the proof of Theorem 3.3 that $c_n(x,y) \to 1$ as $n \to \infty$ when the conditions of Theorem 3.3 are satisfied. For any system of eigenfunctions $\varphi_k(x)$ and for all values of the parameters described in Theorem 3.3, Theorem 3.2 holds because $\psi$-mixing implies $\rho$-mixing. In particular, Theorem 3.2 holds when extended FGM copulas are considered. A result similar to Theorem 3.3 is valid for our trigonometric extensions (15) and (17). Based on the analysis above, we can derive several facts about mixing properties of Markov chains generated by absolutely continuous copulas with square integrable densities. In general, if we assume that a Markov chain is generated by a copula (7) satisfying conditions (9) and (10) with an absolutely continuous marginal distribution, then the following holds.

Answer to an open question on mixing
Longla (2015) [26] was solely devoted to mixing properties of copula-based Markov chains. Results were provided for several mixing coefficients, in one direction.
It was proved there that mixing for Markov chains generated by copulas implies mixing for Markov chains generated by their convex combinations. In this subsection, we show that mixing for Markov chains generated by a convex combination does not require mixing for Markov chains generated by any of the copulas in the combination. To do this, we take the extreme cases of formula (30), defined by λ = ±1. Neither of them exhibits mixing in the sense of the coefficients defined in this paper, but both have square integrable densities.

Central limit theorem and simulation study
In this section, we consider some functions of the Markov chain and derive central limit theorems for estimators of parameters of the copula and of the population mean. We use an example of a copula from the derived copula families with special association properties. In fact, we select a copula that has ρ_S(C) = 0, and a modification that additionally has τ(C) = 0. For these examples, standard estimation procedures based on these measures of association fail.

Central limit theorem for parameter estimators
We consider in this section central limit theorems for averages of functions of the Markov chain generated by the copula (32). The central limit theorem holds in a form whose limiting variance σ_f² depends on the chosen function. Based on properties of the copulas that we have constructed here, the variance σ_f² exists, is finite, and is strictly positive. To construct a confidence set or test a specific value of (µ_1, µ_2), we can use the following consequence of Theorem 4.1.
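The precise display of this CLT is not reproduced in this excerpt. In the notation of the appendix proof of Theorem 4.1, with A_i = ∫_0^1 φ_i(u)f(u) du, σ² the variance of f(U_1), S_n(f) = Σ_{i=1}^n f(U_i) and µ = Ef(U_1), a standard statement of such a result for reversible Markov chains reads (a hedged reconstruction, consistent with the symbols used in the appendix):

$$\frac{S_n(f) - n\mu}{\sqrt{n}} \xrightarrow{d} N\big(0, \sigma_f^2\big), \qquad \sigma_f^2 = \sigma^2 + 2\sum_{i=1}^{\infty} \frac{\lambda_i}{1 - \lambda_i}\, A_i^2.$$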
Corollary 3 Assume that U_1, …, U_n is a realization of the stationary Markov chain generated by (32) and the uniform marginal distribution. The following holds.
1. For f(x) = I(x ≤ a), a ∈ (0, 1), we get a dependent Bernoulli sequence, and a CLT for the sample relative frequency of success holds with the variance given by formula (36). Note that for µ_1 ≠ 0 the observations are dependent, but there exists a value a = (1/π) arccos(…) at which the asymptotic variance matches the independent case. This means that the sample behaves as if it were a simple random sample. A researcher who starts with this simple random sample assumption would reach wrong conclusions. 2. For f(x) = −λ ln(1 − x), we get a Markov chain with exponential marginal distribution of parameter λ. The estimator of λ is S_n(f)/n, and the CLT holds with the corresponding σ_f² (these values were obtained using R to integrate).
This example shows that we can still estimate λ using the sample mean, but the variance of the estimator has to be treated more carefully. The asymptotic variance is not λ²/n; it is multiplied by a coefficient that can be less than or greater than 1, depending on the choice of µ_1. 3. For f(x) = x, the asymptotic variance is always larger than it would be if the data were assumed to come from a simple random sample.
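Point 2 warns that the naive variance λ²/n is wrong under dependence. A minimal, model-free sketch of how a practitioner can adjust for this (a batch-means estimator of the long-run variance σ_f²; this tool is our illustration and is not taken from the paper):

```python
import numpy as np

def batch_means_var(x, n_batches=20):
    """Batch-means estimate of the long-run variance sigma_f^2 of a
    stationary series: split into batches and scale the variance of
    the batch means by the batch length."""
    n = len(x)
    b = n // n_batches
    means = x[: b * n_batches].reshape(n_batches, b).mean(axis=1)
    return b * means.var(ddof=1)

def ci_mean(x, z=1.96):
    """95% confidence interval for the mean that replaces the naive
    variance by the batch-means long-run variance."""
    se = np.sqrt(batch_means_var(x) / len(x))
    m = x.mean()
    return m - z * se, m + z * se

# For an i.i.d. series the two variances agree; for a dependent chain
# the batch-means interval widens or narrows accordingly.
x = np.random.default_rng(1).standard_normal(10_000)
print(ci_mean(x))
```

This generic estimator requires no knowledge of the eigenvalues λ_k, at the price of some efficiency compared with the exact σ_f² of the paper.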

Simulation study
We generate a Markov chain (U_1, …, U_1000) using the uniform distribution on (0,1) and the copula (35) for µ_1 = 0.05. We use the following standard procedure to generate the Markov chain: randomly generate an observation U_1 from Uniform(0, 1); for every i > 1, generate a new observation W from Uniform(0, 1) and solve for U_i the equation C_{,1}(U_{i−1}, U_i) = W. This step is implemented using the R function uniroot. After generating this sample, we performed a correlation test. This test showed that one could conclude there is no correlation between variables along the Markov chain. Moreover, the scatter plot would further convince the investigator that this data seems to be a random sample. We use α = 0.05, i.e. 95% confidence intervals.
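The copula (35) itself is not reproduced in this excerpt, so the following sketch uses an illustrative stand-in of the same type: the sine-perturbation density c(u, v) = 1 + 2µ_1 sin(2πu) sin(2πv) + 2µ_2 sin(4πu) sin(4πv) with µ_1 = 0.05 and µ_2 = −4µ_1 = −0.2 (an assumption on the form, chosen to match the {sin 2πkx} constructions and ρ_S(C) = 0). Bisection plays the role of R's uniroot:

```python
import numpy as np

MU1, MU2 = 0.05, -0.2  # mu_2 = -4 * mu_1, as in the simulation study

def cond_cdf(u, v):
    """C_{,1}(u, v): conditional CDF of U_i given U_{i-1} = u, obtained by
    integrating the assumed sine-perturbation density in its second argument."""
    return (v
            + MU1 / np.pi * np.sin(2 * np.pi * u) * (1 - np.cos(2 * np.pi * v))
            + MU2 / (2 * np.pi) * np.sin(4 * np.pi * u) * (1 - np.cos(4 * np.pi * v)))

def invert(u, w, tol=1e-10):
    """Solve cond_cdf(u, v) = w for v by bisection on [0, 1] (the role played
    by uniroot in the text); valid because the density stays positive here."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cond_cdf(u, mid) < w:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def generate_chain(n, seed=0):
    rng = np.random.default_rng(seed)
    u = np.empty(n)
    u[0] = rng.uniform()
    for i in range(1, n):
        u[i] = invert(u[i - 1], rng.uniform())
    return u

chain = generate_chain(1000)
# as in the paper, consecutive observations show essentially no linear correlation
print(np.corrcoef(chain[:-1], chain[1:])[0, 1])
```

Under this stand-in, the choice µ_2 = −4µ_1 makes the lag-one covariance vanish, which is consistent with the correlation test in the text detecting nothing.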
1. Case 1: We use a = 0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9 to obtain the variables Y_i = I(U ≤ a_i) for our sample. For this function of the observations, we get samples of values (Y_1, …, Y_9). We then consider the problem of estimating the parameters a_i of the model using the obtained dependent Bernoulli sequences. We get â_i = Ȳ_i, and the variance is given by formula (36) for every i. A confidence interval is constructed. We run N such studies and record the proportion of intervals that contain the true value of the parameter. This is done under the independence assumption and under our model. We obtain the following comparative results, which report the coverage probabilities under each of the models for N = 100 and N = 1000. These tables show that the researcher would be off by a lot by assuming independence for the considered data. Despite the small departure from independence with µ_1 = 0.05 and µ_2 = −0.2, there is a serious impact on the coverage probability of the confidence interval. If dependence is neglected, the interval would not be close to a 95% confidence interval. 2. Case 2: We create samples of Y = −λ ln(1 − U) with λ = 0.5, 1, 5, 10, 20, 30.
For each of the values of λ, we create N samples. Each sample produces a confidence interval, and we report the proportion of samples that cover the true parameter λ as coverage probabilities. For the given value of µ_1, we have σ_f² = 0.9702667λ² and σ_f = 0.9953594λ. For this example (Table 3: coverage probabilities under the exponential marginal), the low departure from independence doesn't influence much the distribution of the estimator or the confidence interval. Assuming independence in this case would not have serious consequences. When the same sample is used for all values of λ, we obtain equal coverage probabilities. 3. Case 3: In this case, we consider data from a copula with correlation 0, ρ_S = 0 and τ = 0. We estimate the mean of the marginal distribution and build a confidence interval for samples of 1000 observations. This is repeated 100 times to obtain coverage probabilities. The procedure is repeated for sample sizes 500 and 100. These tables also show that the confidence intervals under independence tend to capture the true value of the mean more than 94% of the time. 4. Case 4: Estimation of µ_1 for a sample with the uniform marginal distribution. We construct 100 confidence intervals for each of the true values of µ_1 and w. This table shows that the best estimator would use values of w close to 1, giving more weight to the impact of φ_1(x) on the estimator of µ_1. This is true because when w ≤ 0.6, even for a sample of size 5000, the confidence intervals for µ_1 are too wide and cover the true value of µ_1 almost always.
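For Case 1, the dependence-adjusted variance can be assembled from the quantities computed in the appendix, σ² = a(1 − a) and A_k² = (1 − cos 2πka)²/(2π²k²). The sketch below combines them through the reversible-chain correction 2Σ_k µ_k A_k²/(1 − µ_k); this assembly is our assumption about the content of formula (36), whose exact display is not reproduced here:

```python
import numpy as np

MU = {1: 0.05, 2: -0.2}  # mu_2 = -4 * mu_1, the simulation setting

def sigma2_f(a):
    """Asymptotic variance for the relative frequency of {U <= a}:
    sigma^2 = a(1 - a) plus the assumed correction
    2 * sum_k mu_k A_k^2 / (1 - mu_k),
    with A_k^2 = (1 - cos(2 pi k a))^2 / (2 pi^2 k^2)."""
    s2 = a * (1.0 - a)
    corr = sum(
        2.0 * mu * (1.0 - np.cos(2 * np.pi * k * a)) ** 2
        / (2.0 * np.pi**2 * k**2) / (1.0 - mu)
        for k, mu in MU.items()
    )
    return s2 + corr

# naive (independence) variance vs dependence-adjusted variance
for a in (0.1, 0.3, 0.5):
    print(a, a * (1 - a), sigma2_f(a))
```

At a = 0.5 the k = 2 term vanishes and the correction is positive, so assuming independence underestimates the variance, consistent with the poor coverage under the independence assumption reported in the tables.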

Conclusion and comments
We have provided a characterization of copulas with square integrable symmetric densities. We have used this representation to construct several families of copulas. One of the constructed copula families is an extension of the FGM copula family. Other families involve trigonometric functions and are all new in the literature. Mixing properties of the constructed copula families have been established. Theorems have been provided for ρ-mixing and ψ-mixing of Markov chains generated by these copulas and the uniform distribution. This result is equivalent to the same mixing for Markov chains generated by these copulas with any continuous marginal distribution.
Copulas based on {1, sin 2πkx, cos 2πkx, k ∈ N} have been proposed. Their Spearman's ρ and Kendall's τ indicate a way to modify the copula without modifying τ(C) = ρ_S(C) = 0; these equalities hold for the independence copula. We have investigated central limit theorems for these Markov chains. An example with |µ_1| ≤ 0.11, µ_2 = −4µ_1, µ_k = 0 for k ≥ 3 was used in simulations to illustrate departure from independence while conserving τ(C) = ρ_S(C) = 0. This simulation study has shown that in some cases, the dependence is not seen graphically, nor even by the correlation test. However, assuming independence produces poor confidence intervals. Various constructions of this form can be used to model the dependence structure of the data while keeping the key measures of association that the researcher doesn't want to modify. We have provided in this case a central limit theorem for the estimator of (µ_1, µ_2), and indicated how it can be used for testing and confidence intervals.
We have shown through examples that copulas based on finite sums of terms can increase ρ_S(C) up to 0.49 and τ(C) up to 0.32. These values are larger than those of the popular FGM copula family. A conclusion on the obtained extension of the FGM copula family is that it can help modify the Kendall coefficient while keeping Spearman's ρ at the level λ_1 by introducing non-zero coefficients. This fact is based on ρ_S(C) = λ_1 for all copulas of the family.
We also mention that the functions φ_k(x) can act as parameters of the constructed copula families and be subject to estimation. This would be the case when one needs to find the perturbation that best fits the relevant situation based on the information at hand. This question will rely on mixing properties of the constructed copula families and is one of the topics for further research on estimation problems based on these copulas. A drawback of the constructed copula families is that, being absolutely continuous, they do not exhibit any tail dependence. Consideration of their tail-dependent extensions is part of ongoing work.

Appendix of proofs
Proof of Proposition 2.1 1. Note that for any copula, we have C(x, 1) = x and C(x, 0) = 0. Therefore, for any absolutely continuous copula, φ(x) = 1 satisfies the eigenfunction equation of the associated kernel operator. It follows that the function φ(x) = 1 is an eigenfunction, associated to the eigenvalue 1, of any kernel operator defined by an absolutely continuous copula. 2. The proof of the second point relies on formula (6) and the first point of Proposition 2.1. 3. The proof of the third point is a consequence of the fact that when the eigenvalue 1 has multiplicity higher than 1, it generates a subspace of dimension greater than 1. In this subspace, an orthonormal basis not containing φ(x) = 1 can be constructed. □
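The first point can be checked numerically for any concrete absolutely continuous copula, since ∫_0^1 c(u, v) dv = 1 for every u. A small sketch using the FGM density as the example:

```python
import numpy as np

def fgm_density(u, v, theta=0.5):
    """FGM copula density c(u, v) = 1 + theta (1 - 2u)(1 - 2v)."""
    return 1.0 + theta * (1.0 - 2.0 * u) * (1.0 - 2.0 * v)

# Midpoint rule for (K phi)(u) = int_0^1 c(u, v) * phi(v) dv with phi = 1:
# the operator returns 1 for every u, i.e. phi(x) = 1 is an eigenfunction
# associated to the eigenvalue 1.
m = 100_000
v = (np.arange(m) + 0.5) / m
for u in (0.1, 0.5, 0.9):
    print(u, fgm_density(u, v).mean())  # each value equals 1 up to rounding
```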

Proof of Theorem 2.3
The statement of Theorem 2.3 implicitly claims that the sum converges pointwise. To prove this, we recall the Weierstrass M-test: for a sequence of functions f_n(x) defined on the same support E, if |f_n(x)| ≤ M_n for all x and Σ M_n < ∞, then Σ f_n(x) converges uniformly and absolutely on E. For the sum (7), the bounding series converges as a consequence of Condition (10). Thus, by the Weierstrass M-test, the series that defines the density of the copula converges absolutely and uniformly; hence (7) also holds with absolute and uniform convergence. The rest of the proof relies on the fact that the functions φ_k form an orthonormal system; orthogonality implies C(1, x) = C(x, 1) = x.
Proof of Theorem 3.1. The first condition of this theorem implies ψ′-mixing as a consequence of Longla (2015) [26], where it is shown that if the density of the absolutely continuous part of C^{s_1}(u, v) is bounded away from 0 on a set of Lebesgue measure 1, then ψ′_{s_1} > 0. Moreover, c^{s_2}(u, v) < M implies ψ*_{s_2} < ∞. Therefore, by Bradley (2005) [6], the first condition of Theorem 3.1 implies ψ-mixing. The second condition implies ψ-mixing as a result of Longla et al. (2022c) [25]. □
Proof of Example 1.
If this copula generated ψ-mixing stationary Markov chains with continuous marginal distributions, then it would generate ψ′-mixing, and would therefore generate ρ-mixing, contradicting Longla (2014) [27]. Therefore, it doesn't generate ψ-mixing. Though its density is c(u, v) = 0 for all (u, v) ∈ (0, 1)² such that v ≠ u ≠ 1 − v, this density fails to exist only on a set of Lebesgue measure 0 and is bounded on a set of Lebesgue measure 1. The second example is based on the fact that the density of the given copula is equal to 2 on a set of Lebesgue measure 1/2 and equal to 0 on its complement. □
Proof of Lemma 2. The upper bounds on these values follow from the assumptions on the coefficients of the series. These conditions imply that |µ_k| ≤ 1/2, and the largest value in absolute value is achieved at the boundary. The last inequality uses |λ_1| + |µ_1| ≤ 1/2; equality happens for |µ_1| = 1/2 and λ_1 = 0. These inequalities justify the bounds on ρ_S(C) and τ(C). □
Proof of Theorem 3.2. The proof of Theorem 3.2 relies on the facts that C^n(u, v) is the n-th fold product of C(u, v), that the φ_k(x) form an orthonormal basis of L²(0, 1), and that the λ_k^n are the eigenvalues of the Hilbert-Schmidt operator associated to c^n(u, v). Note that although we have square integrability, the supremum in Theorem 3.2 can be equal to 1. The rest is an application of Longla and Peligrad (2012) [28]. □
Proof of Theorem 3.3 The proof of Theorem 3.3 is an application of Longla et al. (2022c) [25], where it was shown that if the density satisfies c^n(u, v) < 2 on [0, 1]² for some n, then the copula C(u, v) generates ψ*-mixing. Let λ = sup_k |λ_k| < 1. Square integrability of the density yields a bound showing that c^n(u, v) is bounded away from 0 on a set of Lebesgue measure 1; this implies ψ′-mixing. Moreover, |λ_k| ≤ (2k + 1)^{−1} when k is even. Thus, M is a constant free of n, and as n → ∞, Mλ^{n−3} → 0. Therefore, c^n(x, y) < 2 for sufficiently large values of n. By Longla et al. (2022c) [25], it follows that the Markov chains are ψ-mixing.
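The mechanism of the proof (the n-fold density contracting toward 1, so that c^n < 2 eventually) can be illustrated on the plain FGM copula, whose spectral representation uses the single eigenfunction √3(1 − 2x) with eigenvalue θ/3; this is an illustration of the argument, not the extended family itself:

```python
import numpy as np

THETA = 1.0  # FGM parameter, |theta| <= 1

def c1(u, v):
    """FGM density c(u, v) = 1 + theta (1 - 2u)(1 - 2v)."""
    return 1.0 + THETA * (1.0 - 2.0 * u) * (1.0 - 2.0 * v)

def cn(n, u, v):
    """n-fold density via the spectral form: c_n = 1 + 3 (theta/3)^n (1-2u)(1-2v)."""
    return 1.0 + 3.0 * (THETA / 3.0) ** n * (1.0 - 2.0 * u) * (1.0 - 2.0 * v)

# sanity check: numerical composition of c1 with itself reproduces cn(2, .)
t = (np.arange(100_000) + 0.5) / 100_000
print(np.mean(c1(0.2, t) * c1(t, 0.7)), cn(2, 0.2, 0.7))  # both ~ 0.92

# sup of c_n: equal to 2 at n = 1 for theta = 1, but strictly below 2 for n >= 2
for n in (1, 2, 5):
    print(n, 1.0 + 3.0 * abs(THETA / 3.0) ** n)
```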
Note that there are still some values of λ_k > (2k + 1)^{−1} that are not covered by Theorem 3.3; these can increase up to the minimum of the considered even Legendre polynomials. Using the same arguments, we extend the theorem to the cases when the supremum of the λ_k is less than 1.
Proof of Theorem 4.1. In the context of reversible Markov chains, the limiting variance σ_f² has a series representation in terms of A_i = ∫_0^1 φ_i(u)f(u) du and σ², the variance of f(X_i). In this case, µ_2 = −4µ_1, and simple computations lead to the formula of σ_f². • Case 1: Dependent Bernoulli observations. Select f(x) = I(x ≤ a) for a ∈ (0, 1). Via simple computations, we obtain µ = Ef(U_1) = a, σ² = a(1 − a) and A_i² = (1 − cos 2πia)²/(2π²i²) = 2 sin⁴(πia)/(π²i²), which gives equation (36). Theorem 4.1 and Formula (41) conclude the proof. □

Corollary 2
Moreover, ρ_n = sup_k |λ_k|^n, and Markov chains generated with uniform marginals are both geometrically ergodic and exponentially ρ-mixing if and only if sup_k |λ_k| < 1. If the sum has a finite number of non-zero terms with |λ_k| < 1 for all k, then the Markov chains generated with uniform marginals are ψ-mixing. For any set {a_i, i = 1, …, s : 0 ≤ a_1 < a_2 < … < a_s < a_{s+1} = 1} and |θ_i| ≤ 1, formula (14) defines the density of a copula that generates ψ-mixing stationary Markov chains with continuous marginal distributions.

Theorem 3.4
Under the assumptions of Theorem 3.2: 1. If the sequence φ_k(x) is uniformly bounded and sup_k |λ_k| < 1, then the Markov chain is ψ-mixing. 2. If sup_k |λ_k| < 1, then the Markov chain is ρ-mixing. 3. If Condition (10) holds with strict inequality, then the Markov chain is ψ′-mixing.
Each of the example densities of Figure 1 is bounded away from zero on [0, 1]² and is strictly less than 2; therefore, by Theorem 3.4, these examples generate ψ-mixing Markov chains. This means that the sine copulas and sine-cosine copulas generate ψ-mixing Markov chains. This is justified by the fact that each of the basis functions is bounded by √2 and Condition (10) implies |λ_k| ≤ 1/2. It is important to note that the uniform bound on the sequence φ_k(x) is crucial in statement 1 of Theorem 3.4.
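For the one-term sine copula, the two bounds used above (density bounded away from 0 and strictly below 2) can be verified directly. A sketch with λ_1 = 0.3, so that the √2-bounded basis functions give 1 ± 2λ_1 as the extreme values:

```python
import numpy as np

LAM = 0.3  # |lambda_1| < 1/2 keeps the density strictly inside (0, 2)

def sine_density(u, v):
    """One-term sine copula density with orthonormal basis sqrt(2) sin(2 pi x):
    c(u, v) = 1 + 2 * lam * sin(2 pi u) * sin(2 pi v)."""
    return 1.0 + 2.0 * LAM * np.sin(2 * np.pi * u) * np.sin(2 * np.pi * v)

g = np.linspace(0.0, 1.0, 401)
U, V = np.meshgrid(g, g)
c = sine_density(U, V)
print(c.min(), c.max())  # 0.4 and 1.6: inside (0, 2), as Theorem 3.4 requires
```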

Theorem 4.1
Assume that U_1, …, U_n is a realization of the stationary Markov chain generated by (32) and the uniform marginal distribution. The following holds.

Table 1:
Coverage probabilities under assumption of independence

Table 2:
Coverage probabilities under our model C(u,v)

Table 4:
Coverage probabilities under uniform marginal

Table 6:
Coverage probabilities for µ_1 as a function of w and µ_1.