Abstract
In this paper, the marginal distributions of concomitants of k-record values based on the Sarmanov family of bivariate distributions are obtained, extending several recent papers. In addition, we derive the joint distribution of concomitants of k-record values for this family. Furthermore, some new and useful properties of information measures, namely the extropy, Shannon entropy, inaccuracy measure, cumulative entropy, cumulative residual entropy, and cumulative residual Fisher information, are studied. Finally, we offer various examples accompanied by numerical investigations that support the theoretical findings.
1 Introduction
Let \(\{X_i, ~i \ge 1\}\) be a sequence of independent and identically distributed random variables (RVs) with a common continuous distribution function (DF) \(F_X(x)\) and probability density function (PDF) \(f_X(x).\) An observation \(X_j\) is called an upper record value if \(X_j>X_i\) for every \(i<j.\) An analogous definition can be given for lower record values. The model of record values becomes inadequate in several situations when the expected waiting time between two record values is very large. For example, serious difficulties for statistical inference based on records arise because record data are extremely rare in practical contexts and the expected waiting time for each record following the first is infinite. In those situations, the second or third largest values are of great importance. These issues are avoided, however, if we consider the k-record value model; refer to Aly et al. [9], Berred [21], and Fashandi and Ahmadi [26]. The PDFs of the nth upper and lower k-record values are, respectively, given by Dziubdziela and Kopocinski [25] as
\(f_{X_{n,k}^{(u)}}(x)=\frac{k^{n}}{\Gamma (n)}\left[ -\log \overline{F}_X(x)\right] ^{n-1}\left[ \overline{F}_X(x)\right] ^{k-1}f_X(x)\)
and
\(f_{X_{n,k}^{(\ell )}}(x)=\frac{k^{n}}{\Gamma (n)}\left[ -\log F_X(x)\right] ^{n-1}\left[ F_X(x)\right] ^{k-1}f_X(x),\)
where \(\Gamma (.)\) is the gamma function and \(\overline{F}_X(x)=1-F_X(x).\) Moreover, for \(n<m,\) the joint PDF of the nth upper k-record value, \(X_{n,k}^{(u)},\) and the mth upper k-record value, \(X_{m,k}^{(u)},\) is given by
\(f_{X_{n,k}^{(u)},X_{m,k}^{(u)}}(x_1,x_2)=\frac{k^{m}}{\Gamma (n)\Gamma (m-n)}\left[ -\log \overline{F}_X(x_1)\right] ^{n-1}\left[ \log \frac{\overline{F}_X(x_1)}{\overline{F}_X(x_2)}\right] ^{m-n-1}\frac{f_X(x_1)}{\overline{F}_X(x_1)}\left[ \overline{F}_X(x_2)\right] ^{k-1}f_X(x_2),\quad x_1<x_2.\)
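To make the k-record model concrete, here is a minimal simulation sketch (ours, not taken from the paper). It extracts the upper k-record values of an i.i.d. sample as the successive values taken by the k-th largest observation of the growing sample, and it uses the fact that, for a standard exponential marginal, \(-\log \overline{F}_X(x)=x,\) so that the nth upper k-record behaves like a gamma variable with shape n and scale 1/k, which serves as a sanity check; the function name and parameter values are our own choices.

```python
# Sketch (not from the paper): extract upper k-record values from an i.i.d. sample.
import heapq
import numpy as np

def upper_k_records(sample, k):
    """Successive values taken by the k-th largest observation of the growing sample."""
    heap, records = [], []            # min-heap holding the k largest observations seen so far
    for x in sample:
        if len(heap) < k:
            heapq.heappush(heap, x)
            if len(heap) == k:
                records.append(heap[0])   # first upper k-record: k-th largest of the first k obs.
        elif x > heap[0]:
            heapq.heapreplace(heap, x)    # the k-th largest has just changed
            records.append(heap[0])
    return records

rng = np.random.default_rng(1)
n, k, reps = 3, 2, 5000
sims = [upper_k_records(rng.standard_exponential(300), k)[n - 1] for _ in range(reps)]
print(np.mean(sims))   # for the standard exponential marginal, E[X_{n,k}^{(u)}] = n/k = 1.5
```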
When prior information is available in the form of marginal distributions, it is advantageous to consider families of bivariate DFs with specified marginals when modelling bivariate data. Among these families, the Farlie–Gumbel–Morgenstern (FGM) family has been studied extensively by many authors. The FGM family is represented by the bivariate DF \(F_{X,Y}(x,y)=F_X(x)F_Y(y)[1+\theta (1-F_X(x))(1-F_Y(y))],\) \(-1\le \theta \le 1,\) where \(F_X(x)\) and \(F_Y(y)\) are the marginal DFs of two RVs X and Y, respectively. According to the literature, the FGM family has undergone a number of modifications designed to broaden the range of the correlation between its marginals. These extended families have been the subject of numerous studies from various angles. Examples of these studies are Abd Elgawad and Alawady [1], Abd Elgawad et al. [2, 5], Alawady et al. [6, 7], Barakat et al. [12,13,14, 16, 17], Beg and Ahsanullah [19], and Bekrizadeh et al. [20].
Sarmanov [34] proposed a new mathematical model of hydrological processes described by the following family of bivariate DFs
denoted by SAR\((\alpha ).\) The corresponding PDF is given by
The correlation coefficient of the Sarmanov copula is \(\alpha .\) As a result, the maximum correlation coefficient attainable by this copula is 0.529 (cf. [10], page 74). Recently, some aspects of concomitants of order statistics (OSs) and record values from this family were presented by Barakat et al. [15] and Husseiny et al. [27], respectively. In the present paper, we reveal some additional motivating properties of SAR\((\alpha )\). We also discuss some aspects of the concomitants of k-record values and some information measures that are relevant to this hitherto rather neglected family.
Let \((X_i, Y_i), i=1,2,...,\) be a random bivariate sample, with a common continuous DF \(F_{X,Y}(x,y)\) \(=P(X\le x,Y\le y).\) When the investigator is interested only in studying the sequence of k-records of the first component X, the second component associated with the k-record value of the first one is termed the concomitant of that k-record value. k-record values and their concomitants arise in a wide range of practical experiments; see, e.g., Bdair and Raqab [18], Chacko and Mary [22], Chacko and Muraleedharan [23], and Thomas et al. [35]. The PDFs of the concomitants \(Y_{[n,k]}^{(u)}\) (the nth upper concomitant of \(X_{n,k}^{(u)}\)) and \(Y_{[n,k]}^{(\ell )}\) (the nth lower concomitant of \(X_{n,k}^{(\ell )}\)) are given by
respectively, where \(f_{Y| X}(y|x)\) is the conditional PDF of Y given X. Moreover, the joint PDF of concomitants \(Y_{[n,k]}^{(u)}\) and \(Y_{[m,k]}^{(u)}\) (\( n<m\)) is given by
We now take a quick look at some of the information measures that will be discussed in this study. The Shannon entropy is a statistical measure of information that quantifies, on average, the variability associated with an RV. This measure is used in domains as disparate as computer science, financial analysis, and medical research. For a continuous RV X with a PDF \(f_X(x),\) the Shannon entropy is defined by \(H(X)=-\text{ E }(\log f_X(X)).\) Obviously, \(H(X + b)=H(X),\) for any \(b \ge 0.\) For further details about this measure, see Abd Elgawad et al. [3, 4], Alawady et al. [8], and Barakat and Husseiny [11]. Lad et al. [30] presented a new measure of uncertainty termed extropy, which is a complementary dual of the Shannon entropy and is defined by \(J(X)=-\frac{1}{2}\text{ E }(f_X(X)).\) Clearly, \(J(X)\le 0.\) One statistical application of extropy is the total log scoring rule, which is used to score forecasting distributions.
In the literature, there are several different versions of entropy, each one suitable for a specific situation. Rao et al. [33] introduced the cumulative residual entropy (abbreviated by CRE) of X as \(CRE(X)=-\int _{0}^{\infty }\overline{F}_X(x)\log \overline{F}_X(x)dx\). The Shannon entropy presents various drawbacks when used as a continuous counterpart of the entropy for discrete RVs, and various efforts have been made to define alternative information measures. One example is given in Di Crescenzo and Longobardi [24], who proposed a measure called the cumulative entropy (abbreviated by CE), defined by \(CE(X)=-\int _{0}^{\infty }F_X(x)\log (F_X(x))dx,\) for any non-negative RV X with DF \(F_X.\) Unlike the Shannon entropy, the CE is always non-negative. Moreover, the CE is appropriate for dealing with reliability systems whose uncertainty is related to the past. We also consider the Kerridge measure of inaccuracy, an extension of uncertainty linking two RVs, which was introduced by Kerridge [28]. This inaccuracy measure is defined as \(I(X;Y)= -\int _{0}^{\infty }f_X(t)\log (f_Y(t))\,dt\) for any non-negative RVs X and Y. Kharazmi and Balakrishnan [29] recently introduced the cumulative residual Fisher information (abbreviated as CF) for the location parameter, which is defined as \(CF_{\overline{F}_Y(y)}=\int _{0}^{\infty }\left( \frac{\partial \log \overline{F}_Y(y)}{\partial y}\right) ^2 \overline{F}_Y(y)dy .\)
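As a quick illustration of these definitions (a sketch of ours, not part of the paper), the measures can be evaluated by direct numerical integration. Here we use an exponential(\(\theta \)) variable, for which the closed forms \(H(Y)=1-\log \theta ,\) \(J(Y)=-\theta /4,\) and \(CF=\theta \) quoted in the examples of Section 5 provide a check; \(CRE(Y)=1/\theta \) is a standard exponential result.

```python
# Sketch (not from the paper): information measures of an exponential(theta) RV
# by numerical integration; the upper limit 60 truncates a negligible tail.
import numpy as np
from scipy.integrate import quad

theta = 2.0
f  = lambda y: theta * np.exp(-theta * y)      # PDF
F  = lambda y: 1.0 - np.exp(-theta * y)        # DF
Fb = lambda y: np.exp(-theta * y)              # survival function

H   = quad(lambda y: -f(y) * np.log(f(y)), 0, 60)[0]          # Shannon entropy
J   = -0.5 * quad(lambda y: f(y) ** 2, 0, 60)[0]              # extropy
CE  = quad(lambda y: -F(y) * np.log(F(y)), 1e-12, 60)[0]      # cumulative entropy
CRE = quad(lambda y: -Fb(y) * np.log(Fb(y)), 0, 60)[0]        # cumulative residual entropy
CF  = quad(lambda y: theta ** 2 * Fb(y), 0, 60)[0]            # (d/dy log Fb)^2 = theta^2 here

print(H, 1 - np.log(theta))     # Shannon entropy vs closed form
print(J, -theta / 4)            # extropy vs closed form
print(CRE, 1 / theta)           # CRE vs closed form
print(CF, theta)                # CF vs closed form
print(CE)                       # CE, numerically
```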
The rest of the paper is organized as follows. In Sect. 2, we obtain some new interesting results pertaining to SAR\((\alpha )\). In Sect. 3, we obtain the marginal DF of concomitants of k-record values and the joint DF of concomitants of k-record values based on SAR\((\alpha ).\) In Sect. 4, we derive some new and useful general relations for the information measures based on the Sarmanov copula. In Sect. 5, the Shannon entropy, inaccuracy measure, extropy, CE, CRE, and CF for SAR\((\alpha )\) are derived, with some illustrative examples. Finally, in Sect. 6, we perform some numerical studies, with discussion, which lend further support to our theoretical results.
2 Properties of Concomitants of k-Record Values Based on Sarmanov Copula
The FGM copula is \(C(u,v;\theta )=uv(1+\theta (1-u)(1-v)),~0\le u,v\le 1,\) and the corresponding copula density is \(\mathcal{C}(u,v;\theta )=1+\theta (1-2u)(1-2v).\) Moreover, in view of (1.1) and (1.2), the Sarmanov copula is \(S(u,v;\alpha )=uv[1+3\alpha (1-u)(1-v)+5\alpha ^2 (1-u)(1-v) (2u-1)(2v-1)],\,0\le u,v\le 1,\) with corresponding PDF \(\mathcal{S}(u,v;\alpha )=1+3\alpha (2u-1)(2v-1)+\frac{5}{4} \alpha ^2 (3(2u-1)^2-1)(3(2v-1)^2-1).\) Clearly, the FGM copula is radially symmetric about \((\frac{1}{2},\frac{1}{2}),\) i.e. \(\mathcal{C}(\frac{1}{2}-u,\frac{1}{2}-v;\theta )=\mathcal{C}(\frac{1}{2}+u,\frac{1}{2}+v;\theta )\) (cf. [32]). It is easy to check that SAR\((\alpha )\) is the only one of the known extended FGM families whose copula is radially symmetric about \((\frac{1}{2},\frac{1}{2}).\) In this section, we go over two more intriguing aspects of the Sarmanov copula.
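The radial symmetry just mentioned is easy to confirm numerically from the two densities quoted above; the following small check (ours, not the paper's) verifies that both copula densities satisfy density\((\frac{1}{2}-u,\frac{1}{2}-v)\) = density\((\frac{1}{2}+u,\frac{1}{2}+v)\) up to rounding error.

```python
# Sketch (not from the paper): numerical check of radial symmetry about (1/2, 1/2)
# for the FGM and Sarmanov copula densities quoted above.
import numpy as np

def fgm_density(u, v, theta):
    return 1 + theta * (1 - 2 * u) * (1 - 2 * v)

def sarmanov_density(u, v, alpha):
    return (1 + 3 * alpha * (2 * u - 1) * (2 * v - 1)
            + 1.25 * alpha ** 2 * (3 * (2 * u - 1) ** 2 - 1) * (3 * (2 * v - 1) ** 2 - 1))

rng = np.random.default_rng(0)
u, v = rng.uniform(0, 0.5, 1000), rng.uniform(0, 0.5, 1000)
for dens, par in [(fgm_density, 0.7), (sarmanov_density, 0.4)]:
    gap = np.max(np.abs(dens(0.5 - u, 0.5 - v, par) - dens(0.5 + u, 0.5 + v, par)))
    print(dens.__name__, "max asymmetry:", gap)   # ~ 1e-16 for both densities
```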
Proposition 1
Let \(\mathcal{S}_{[n,k]}^{(\ell )}(.;\alpha )\) and \(\mathcal{S}_{[n,k]}^{(u)}(.;\alpha )\) be the PDFs of concomitants of the nth lower and upper k-record values based on the Sarmanov copula, respectively. Then,
Proof
According to (1.3), we get \(\mathcal{S}_{[n,k]}^{(u)}(v;\alpha )=\int _{0}^{1}\mathcal{S}(u,v;\alpha ) g_{n,k}^{(u)}(u) du,\) where \(g_{n,k}^{(u)}(u)\) is the PDF of the nth upper k-record value from the uniform distribution over (0,1). Applying the transformation \(u=\frac{1}{2}-z\) and changing v to \(\frac{1}{2}-v,\) we get
since \(g_{n,k}^{(u)}(\frac{1}{2}-z)=g_{n,k}^{(\ell )}(\frac{1}{2}+z).\) Putting \(\frac{1}{2}+z =\eta ,\) we get \(\mathcal{S}^{(u)}_{[n,k]}~ (\frac{1}{2}-v;\alpha )=\int _{0}^{1}\mathcal{S}(\eta ,\frac{1}{2}+v;\alpha )g_{n,k}^{(\ell )}(\eta ) d\eta = \mathcal{S}^{(\ell )}_{[n,k]}(\frac{1}{2}+v;\alpha ).\) This proves the proposition. \(\square \)
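Relation (2.1) can also be checked numerically. The sketch below (ours) computes the upper- and lower-record concomitant densities by integrating the Sarmanov copula density against the k-record densities of a uniform variable, taken here in the standard Dziubdziela–Kopociński form, and compares \(\mathcal{S}^{(u)}_{[n,k]}(\frac{1}{2}-v;\alpha )\) with \(\mathcal{S}^{(\ell )}_{[n,k]}(\frac{1}{2}+v;\alpha ).\)

```python
# Sketch (not from the paper): numerical check of relation (2.1).
import numpy as np
from scipy.integrate import quad
from math import gamma

def sarmanov_density(u, v, alpha):
    return (1 + 3 * alpha * (2 * u - 1) * (2 * v - 1)
            + 1.25 * alpha ** 2 * (3 * (2 * u - 1) ** 2 - 1) * (3 * (2 * v - 1) ** 2 - 1))

def g_upper(u, n, k):   # PDF of the nth upper k-record from U(0,1)
    return k ** n / gamma(n) * (-np.log(1 - u)) ** (n - 1) * (1 - u) ** (k - 1)

def g_lower(u, n, k):   # PDF of the nth lower k-record from U(0,1)
    return k ** n / gamma(n) * (-np.log(u)) ** (n - 1) * u ** (k - 1)

def conc_density(v, n, k, alpha, g):
    return quad(lambda u: sarmanov_density(u, v, alpha) * g(u, n, k), 0, 1)[0]

n, k, alpha = 3, 2, 0.4
for v in (0.1, 0.25, 0.4):
    upper = conc_density(0.5 - v, n, k, alpha, g_upper)
    lower = conc_density(0.5 + v, n, k, alpha, g_lower)
    print(v, upper, lower)      # the two columns should agree
```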
Remark 2.1
The proof of Proposition 1 shows that for the concomitants of lower and upper k-records based on any radially symmetric copula, the relation (2.1) is valid.
Remark 2.2
The case \(k=1,\) which mainly includes the case of record values, was handled by Husseiny et al. [27].
The following elegant result links the FGM and Sarmanov copulas through the concomitants of k-record values.
Theorem 2.1
Let \(\mathcal{C}_{[n,k]}^{(\ell )}(.;\alpha )\) and \(\mathcal{C}_{[n,k]}^{(u)}(.;\alpha )\) be the PDFs of the concomitants of nth lower and upper k-record values based on the FGM copula, respectively. Then
Proof
From the form of the Sarmanov copula, we can write
where \(L(u,v;\alpha )=\frac{5}{4}\alpha ^2[3(2u-1)^2-1][3(2v-1)^2-1]-2.\) Clearly, the function \(L(u,v;\alpha )\) is radially symmetric. Moreover, the relation (2.3) yields
where \(g_{n,k}^{(i)}(u)\) is the PDF of the nth (lower, \(i=\ell ,\) or upper, \(i=u\)) k-record values based on the uniform distribution over (0,1). Because the function \(L(u,v;\alpha )\) is radially symmetric, we can proceed as we did in the proof of Proposition 1 to show that \(J ^{(\ell )}_{[n,k]}(v;\alpha )=J^{(u)}_{[n,k]}(v;\alpha ).\) Therefore, we can get the relation (2.2) by using the relation (2.4). \(\square \)
Remark 2.3
When \(k=1\) in (2.2), we get the result which was handled by Husseiny et al. [27].
3 Concomitants of k-Record Values Based on SAR\((\alpha )\)
In this section, the marginal DF, moment generating function (MGF), and moments of the upper k-record values based on SAR\((\alpha )\) are obtained. Moreover, the joint DF of the bivariate concomitants of k-record values based on SAR\((\alpha )\) is derived.
3.1 Marginal DF of Concomitants of k-Record Values
The PDF of \(Y_{[n,k]}^{(u)}\) is represented in the following theorem in a useful way. We use the notation \(X\sim F\) to signify that X is distributed as F, for a given RV X and some DF F.
Theorem 3.1
Let \(V_1\sim F_Y^{2}\) and \(V_2\sim F_Y^{3}.\) Then
where \(\delta ^{(\alpha )}_{n,k:1}=\alpha \left[ 1-2\left( \frac{k}{k+1}\right) ^{n}\right] \) and \(\delta ^{(\alpha )}_{n,k:2}=\alpha ^{2}\left[ 12\left( \left( \frac{k}{k+2}\right) ^{n}-\left( \frac{k}{k+1}\right) ^{n}\right) +2\right] .\)
Proof
By using (1.2) and (1.3), we get
where
and
This completes the proof. \(\square \)
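As a computational companion to Theorem 3.1 (a sketch based on our reading of the representation; the explicit density below is the form that reappears via \(U_{n,k}\) and \(V_{n,k}\) in the proof of Theorem 5.2), the following code builds the concomitant density of the nth upper k-record for an exponential marginal, checks that it integrates to one, and confirms that it coincides with the mixture \(a(n,k)f_Y+b(n,k)f_{V_1}+c(n,k)f_{V_2},\) where \(a(n,k), b(n,k), c(n,k)\) are the coefficients later defined in Theorem 5.2, \(f_{V_1}=2f_YF_Y,\) and \(f_{V_2}=3f_YF_Y^{2}.\) This identification should be checked against the paper's Eq. (3.1).

```python
# Sketch (our reconstruction): concomitant density of the nth upper k-record under
# SAR(alpha), written directly and as a mixture of f_Y, f_{V1} and f_{V2}.
import numpy as np
from scipy.integrate import quad

def deltas(n, k, alpha):
    d1 = alpha * (1 - 2 * (k / (k + 1)) ** n)
    d2 = alpha ** 2 * (12 * ((k / (k + 2)) ** n - (k / (k + 1)) ** n) + 2)
    return d1, d2

theta = 1.0
fY = lambda y: theta * np.exp(-theta * y)
FY = lambda y: 1 - np.exp(-theta * y)

def conc_pdf(y, n, k, alpha):
    d1, d2 = deltas(n, k, alpha)
    t = 2 * FY(y) - 1
    return fY(y) * (1 + 3 * d1 * t + 1.25 * d2 * (3 * t ** 2 - 1))

def mixture_pdf(y, n, k, alpha):
    d1, d2 = deltas(n, k, alpha)
    a = 1 - 3 * d1 + 2.5 * d2          # a(n,k) of Theorem 5.2
    b = 3 * d1 - 7.5 * d2              # b(n,k) of Theorem 5.2
    c = 1 - a - b                      # c(n,k) = -(a + b - 1) = 5*d2
    return a * fY(y) + b * 2 * fY(y) * FY(y) + c * 3 * fY(y) * FY(y) ** 2

n, k, alpha = 4, 2, 0.3
print(quad(lambda y: conc_pdf(y, n, k, alpha), 0, 50)[0])        # should be ~ 1
ys = np.linspace(0.01, 10, 7)
print(np.max(np.abs(conc_pdf(ys, n, k, alpha) - mixture_pdf(ys, n, k, alpha))))  # ~ 0
```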
Remark 3.1
If \(k=1\) in Theorem 3.1, which mainly includes the case of record values, we obtain the results of Husseiny et al. [27].
Relying on (3.1), the MGF of \(Y_{[n,k]}^{(u)}\) based on SAR\((\alpha )\) is given by
where \(M_{Y}(t), M_{V_1}(t),\) and \(M_{V_2}(t)\) are the MGFs of the RVs \(Y, V_1\), and \(V_2,\) respectively. Thus, by using (3.1) the pth moment of \(Y_{[n,k]}^{(u)}\) based on SAR\((\alpha )\) is given by
where \(\mu _Y^{(p)}=E[Y^{p}],~\mu _{V_1}^{(p)}=E[{V_1}^{p}]\), and \(\mu _{V_2}^{(p)}=E[{V_2}^{p}].\)
Remark 3.2
For large n, we can use the approximations \(\delta ^{(\alpha )}_{n,k:1}\approx \alpha \) and \(\delta ^{(\alpha )}_{n,k:2}\approx 2\alpha ^{2}.\)
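A quick numerical check of this approximation (ours, not from the paper):

```python
# Sketch: as n grows, delta_{n,k:1} -> alpha and delta_{n,k:2} -> 2*alpha^2 (Remark 3.2).
alpha, k = 0.4, 3
for n in (1, 5, 10, 25, 50):
    d1 = alpha * (1 - 2 * (k / (k + 1)) ** n)
    d2 = alpha ** 2 * (12 * ((k / (k + 2)) ** n - (k / (k + 1)) ** n) + 2)
    print(n, round(d1, 4), round(d2, 4))
print("limits:", alpha, 2 * alpha ** 2)
```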
The FGM and Sarmanov families share an intriguing property of the concomitants of k-record values based on them, as the following theorem shows.
Theorem 3.2
Let \(\mathcal{F}_{[n,k]}^{(i)}(y;\theta )\) be the PDF of \(Y_{[n,k]}^{(i)}, i=\ell ,u,\) based on the FGM\((\theta )\) family. Furthermore, throughout this theorem, let \(f_{[n,k]}^{(i)}(y;\alpha )\) denote the PDF of \(Y_{[n,k]}^{(i)}, ~i=\ell ,u,\) based on SAR\((\alpha ).\) Then
-
1.
\(\mathcal{F}_{[n,k]}^{(u)}(y;-\theta )=\mathcal{F}_{[n,k]}^{(\ell )}(y;\theta ),\)
-
2.
\(f_{[n,k]}^{(u)}~ (y;-\alpha )= f_{[n,k]}^{(\ell )}(y;\alpha ).\)
Proof
It is simple to verify that \(\mathcal{F}_{[n,k]}^{(u)}(v;\theta )=1-\delta ^{(\theta )}_{n,k:1}[2F_Y(y)-1]=\mathcal{F}_{[n,k]}^{(\ell )}(v;-\theta ),\) where \(\delta ^{(\theta )}_{n,k:1} =\theta \left[ 1-2\left( \frac{k}{k+1}\right) ^{n}\right] ,\) since \(\delta ^{(-\theta )}_{n,k:1}=-\delta ^{(\theta )}_{n,k:1}.\) The first part of the theorem is thus proved. By using the same method as the proof of Theorem 3.1, we can now prove that
Using the simple-to-check relationships \(\delta ^{(-\alpha )}_{n,k:1}=-\delta ^{(\alpha )}_{n,k:1}\) and \(\delta ^{(-\alpha )}_{n,k:2}=\delta ^{(\alpha )}_{n,k:2},\) the relation (3.1) yields \(f_{[n,k]}^{(u)}(y;-\alpha )= f_{[n,k]}^{(\ell )}(y;\alpha ).\) The second part of the theorem is thus proved. \(\square \)
Remark 3.3
For \(k=1\) (record case) in Theorem 3.2, we obtain the results of Husseiny et al. [27].
3.2 Joint DF of Bivariate Concomitants of k-Record Values Based on SAR(\(\alpha \))
The following theorem gives the joint PDF \(f_{[n,m,k]}^{(u)}(y_1,y_2)\) (defined by (1.4)) of the concomitants of \(Y_{[n,k]}^{(u)}\) and \(Y_{[m,k]}^{(u)},~ n<m,\) in SAR\((\alpha ).\)
Theorem 3.3
Let \(V_1\sim F_Y^{2}\) and \(V_2\sim F_Y^{3}.\) Then
where \(\delta ^{(\alpha )}_{m,k:1}\) and \(\delta ^{(\alpha )}_{m,k:2}\) are defined via Theorem 3.1 by replacing n with m in \(\delta ^{(\alpha )}_{n,k:1}\) and \(\delta ^{(\alpha )}_{n,k:2}\), respectively,
and
Proof
Consider the integration
Taking the transformation \( u_1=-\log (\overline{F}_X(x_1))\) and \( u_2=-\log (\overline{F}_X(x_2)),\) we get
By using the transformation \( z=u_2-u_1,\) and after some algebra, we get
Now, by using (1.4), we get
By using the relations \(f_{V_1}=2f_YF_Y\) and \(f_{V_2}=3f_YF^2_Y\) and carrying out some algebra, we get
On the other hand, upon using (3.2), with \(p=1\) and \(q=0,\) for \(t=1,\) and with \(p=0\) and \(q=1,\) for \(t=2,\) we get after some algebra
Similarly, we can obtain \(\delta ^{(\alpha )}_{n,k:2}, \delta ^{(\alpha )}_{m,k:2}, \delta ^{(\alpha )}_{n,m,k:1}, \delta ^{(\alpha )}_{n,m,k:2}, \delta ^{(\alpha )}_{n,m,k:3},\) and \(\delta ^{(\alpha )}_{n,m,k:4}.\) This completes the proof. \(\square \)
Remark 3.4
In Theorem 3.3 for \(k=1,\) we obtain the results of Husseiny et al. [27].
The joint MGF of upper concomitants \(Y_{[n,k]}^{(u)}\) and \(Y_{[m,k]}^{(u)},~ n<m,\) based on SAR(\(\alpha \)) is given by
Thus, the product moment \(\text{ E }[Y_{[n,k]}Y_{[m,k]}]=\mu _{[n,m,k]}\) is obtained from (3.3) by
4 General Theoretical Relations of Entropy, Extropy, CE, CRE, and CF Based on Sarmanov Copula
We have the following general results concerning any radially symmetric copula and especially concerning the FGM and Sarmanov copulas.
Proposition 2
For any radially symmetric copula about \((\frac{1}{2},\frac{1}{2}),\) we get \(H^{(u)}_{[n,k]}=H^{(\ell )}_{[n,k]},\) where \(H^{(\ell )}_{[n,k]}\) and \(H^{(u)}_{[n,k]}\) are the Shannon entropies of the concomitants of the nth lower and upper k-record values of that copula, respectively.
Proof
The Shannon entropies \(H^{(\ell )}_{[n,k]}\) and \(H^{(u)}_{[n,k]}\) are given by
where \(\mathcal{L}^{(\ell )}_{[n,k]}(.)\) and \(\mathcal{L}^{(u)}_{[n,k]}(.)\) are the PDFs of the concomitants of the nth lower and upper k-record values based on that copula, respectively. Putting \(i=u\) in (4.1) and using the transformation \(v=\frac{1}{2}-w,\) in view of Proposition 1 and Remark 2.1, we get \(H^{(u)}_{[n,k]} = -\int _{-\frac{1}{2}}^{\frac{1}{2}}\mathcal{L}^{(\ell )}_{[n,k]}(\frac{1}{2}+w)\log \mathcal{L}^{(\ell )}_{[n,k]}(\frac{1}{2}+w) dw.\) Thus, upon using the transformation \(\eta =\frac{1}{2}+w,\) we get
This completes the proof. \(\square \)
Theorem 4.1
Let the Shannon entropy associated with the FGM and Sarmanov copulas be denoted by \(H^{*(u)}_{[n,k]}(\theta )\) and \(H^{(u)}_{[n,k]}(\alpha ),\) respectively. Then, we get
Proof
From (4.1), Theorem 3.2, and Proposition 2, we get
This proves the first part of the theorem. The proof of the second part is similar. The proof is completed. \(\square \)
Proposition 3
For any radially symmetric copula about \((\frac{1}{2},\frac{1}{2}),\) we get \(J^{(u)}_{[n,k]}=J^{(\ell )}_{[n,k]},\) where \(J^{(\ell )}_{[n,k]}\) and \(J^{(u)}_{[n,k]}\) are extropies of the concomitants of the nth lower and upper k-record values of that copula, respectively.
Proof
The extropies \(J^{(\ell )}_{[n,k]}\) and \(J^{(u)}_{[n,k]}\) are given by \(J^{(i)}_{[n,k]} = -\frac{1}{2}\int _{0}^{1}\left[ \mathcal{L}^{(i)}_{[n,k]}(v)\right] ^{2} dv, ~i=\ell , u.\) In view of Proposition 1 and Remark 2.1, we get
Thus, upon using the transformation \(\eta =\frac{1}{2}+w,\) we get
This completes the proof. \(\square \)
Theorem 4.2
Let the extropy associated with the FGM and Sarmanov copulas be denoted by \(J^{*(u)}_{[n,k]}(\theta )\) and \(J^{(u)}_{[n,k]}(\alpha ),\) respectively. Then, we get
Proof
From (4.1), Theorem 3.2, and Proposition 2, we get
This proves the part based on FGM copula. The proof of the part based on Sarmanov copula is similar. The proof is completed. \(\square \)
Theorem 4.3
Let \({\overline{F}}_{[n,k]}^{(\ell )}(.;\alpha )\) and \({F}_{[n,k]}^{(u)}(.;\alpha )\) be the survival function and DF of concomitants of the nth lower and upper k-record values based on the Sarmanov copula, respectively. Then,
Proof
We have \({F}^{(u)}_{[n,k]}(v)=v\left[ 1+3\delta ^{(\alpha )}_{n,k:1}(v-1)+\frac{5}{2}\delta ^{(\alpha )}_{n,k:2}(2v-1)(v-1)\right] .\) Thus,
This completes the proof. \(\square \)
Proposition 4
For any radially symmetric copula about \((\frac{1}{2},\frac{1}{2}),\) we get \(CE^{(u)}_{[n,k]}=CRE^{(\ell )}_{[n,k]}.\)
Theorem 4.4
For the FGM and Sarmanov copulas, we, respectively, get
Proof
From Theorem 4.3 and Proposition 4, we get
This proves the theorem for FGM copula. The proof for Sarmanov copula is similar. \(\square \)
Proposition 5
Let \(\widetilde{Y}=aY_{[n,k]}^{(u)}+b,\) where \(a>0\) and \(b\ge 0.\) Then \(CE^{(u)}_{[n,k]}(\widetilde{Y}) = a CE^{(u)}_{[n,k]}(Y).\)
Proof
The proof follows directly by the definition of CE and the obvious relation \(F_{aY_{[n,k]}^{(u)}+b}(y)=F_{Y_{[n,k]}^{(u)}}(\frac{y-b}{a}).\) \(\square \)
Proposition 5 means that \(CE_{[n,k]}^{(u)}\) is a shift-independent measure. This property also holds for \(CRE^{(u)}_{[n,k]}.\) On the other hand, the effect of linear transformations on the differential entropy and extropy is quite different, since \(H_{[n,k]}^{(u)}(\widetilde{Y})= H^{(u)}_{[n,k]}(Y)+\log a\) and \(J_{[n,k]}^{(u)}(\widetilde{Y})=\frac{1}{a} J^{(u)}_{[n,k]}(Y).\)
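These transformation rules are straightforward to confirm numerically for any fixed distribution; the sketch below (ours, not from the paper) does so for an exponential example with \(a=2\) and \(b=3,\) using that \(aY+b\) has PDF \(f_Y((y-b)/a)/a\) and DF \(F_Y((y-b)/a)\) on \([b,\infty ).\)

```python
# Sketch: numerical check of CE(aY+b) = a*CE(Y), H(aY+b) = H(Y) + log(a),
# J(aY+b) = J(Y)/a for an exponential Y.
import numpy as np
from scipy.integrate import quad

theta, a, b = 1.5, 2.0, 3.0
f  = lambda y: theta * np.exp(-theta * y)      # PDF of Y on [0, inf)
F  = lambda y: 1 - np.exp(-theta * y)          # DF of Y
ft = lambda y: f((y - b) / a) / a              # PDF of a*Y + b on [b, inf)
Ft = lambda y: F((y - b) / a)                  # DF of a*Y + b (zero below b)

H  = lambda pdf, lo: quad(lambda y: -pdf(y) * np.log(pdf(y)), lo, lo + 60)[0]
J  = lambda pdf, lo: -0.5 * quad(lambda y: pdf(y) ** 2, lo, lo + 60)[0]
CE = lambda cdf, lo: quad(lambda y: -cdf(y) * np.log(cdf(y)), lo + 1e-9, lo + 60)[0]

print(H(ft, b), H(f, 0.0) + np.log(a))   # entropy picks up log(a)
print(J(ft, b), J(f, 0.0) / a)           # extropy is scaled by 1/a
print(CE(Ft, b), a * CE(F, 0.0))         # CE is scaled by a (shift-independent)
```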
Proposition 6
Let \({\overline{F}_Y}^{(u)}_{[n,k]}(y)\) and \(R_{F^{(u)}_{[n,k]}}(y)\) be the survival and hazard functions of concomitants of the nth upper k-record values from \(SAR(\alpha ),\) respectively, where \(R_{F^{(u)}_{[n,k]}}(y)=\frac{f^{(u)}_{[n,k]}(y)}{\overline{F}^{(u)}_{[n,k]}(y)}.\) Then,
Proof
From the definition of \(CF_{\overline{F}_{[n,k]}}^{(u)}(\alpha )\) and by using the result of Nanda [31], that \(E[R_{F_{X}}(X)]\ge \frac{1}{E(X)},\) we get
The proof is completed. \(\square \)
5 Information Measures Based on SAR\((\alpha )\) with Illustrative Examples
Theorems 5.1–5.6 give explicit forms of the extropy, Shannon entropy, inaccuracy measure, CE, CRE, and CF for concomitants of the nth upper k-record value based on SAR\((\alpha )\), respectively.
Theorem 5.1
Let \(V_i\sim F_Y^{i+1},~ i=1,2,3.\) Then, the extropy of \(Y_{[n,k]}^{(u)},\) for \(n\ge 1,\) is given by
where \(J_Y(Y)=-\frac{1}{2}\int _{0}^{\infty }f^{2}_Y(y)dy~\) and \(J_{Y}(V_i)=-\frac{1}{2}\int _{0}^{\infty }f^{2}_{V_i}(y)dy,~i=1,2,\) are the extropy measures of the RVs Y and \(V_i,\) respectively, and \( E_Y(f_{V_i})=\int _{0}^{\infty }f_Y(y)f_{V_i}(y)dy,~i=1,2,3.\)
Proof
Clearly, we have
The proof is completed. \(\square \)
Example 5.1
Suppose that X and Y have exponential distributions with means \( \frac{1}{\theta ^{*}}\) and \(\frac{1}{\theta },\) respectively. We get \( J_Y(Y)=\frac{-\theta }{4},\) \( J_{Y}(V_1)=\frac{-\theta }{6},\) \(J_{Y}(V_2)=\frac{-3\theta }{20},\) \( E_Y(f_{V_1})=\frac{\theta }{3},\) \( E_Y(f_{V_2})=\frac{\theta }{4},\) and \(E_Y(f_{V_3})=\frac{\theta }{5}.\) Moreover,
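These constants are easy to verify by numerical integration; the following sketch (ours, not part of the paper) does so for an arbitrary rate \(\theta ,\) using \(f_{V_i}=(i+1)f_YF_Y^{i}.\)

```python
# Sketch: numerical verification of the constants in Example 5.1.
import numpy as np
from scipy.integrate import quad

theta = 1.3
fY = lambda y: theta * np.exp(-theta * y)
FY = lambda y: 1 - np.exp(-theta * y)
fV = lambda y, i: (i + 1) * fY(y) * FY(y) ** i   # PDF of V_i with DF F_Y^{i+1}

J = lambda g: -0.5 * quad(lambda y: g(y) ** 2, 0, 60)[0]
E = lambda g: quad(lambda y: fY(y) * g(y), 0, 60)[0]

print(J(fY), -theta / 4)                          # J_Y(Y)
print(J(lambda y: fV(y, 1)), -theta / 6)          # J_Y(V_1)
print(J(lambda y: fV(y, 2)), -3 * theta / 20)     # J_Y(V_2)
print(E(lambda y: fV(y, 1)), theta / 3)           # E_Y(f_{V_1})
print(E(lambda y: fV(y, 2)), theta / 4)           # E_Y(f_{V_2})
print(E(lambda y: fV(y, 3)), theta / 5)           # E_Y(f_{V_3})
```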
Example 5.2
Suppose that X and Y have power distributions (i.e. \(f_{Y}(y)=c y^{c-1},~\) \(c>0,\) \( 0\le y\le 1\)). After simple algebra, we get \( J_Y(Y)=\frac{-c^{2}}{2(2c-1)},\) \( J_{Y}(V_1)=\frac{-2c^{2}}{4c-1},\) \(J_{Y}(V_2)=\frac{-9c^{2}}{2(6c-1)},\) \( E_Y(f_{V_1})=\frac{2c^{2}}{3c-1},\) \( E_Y(f_{V_2})=\frac{3c^{2}}{4c-1},\) and \(E_Y(f_{V_3})=\frac{4c^{2}}{5c-1}.\) Then,
Theorem 5.2
Let \(a(n,k)=1-3\delta ^{(\alpha )}_{n,k:1}+\frac{5}{2}\delta ^{(\alpha )}_{n,k:2},\) \(b(n,k)=3\delta ^{(\alpha )}_{n,k:1}-\frac{15}{2}\delta ^{(\alpha )}_{n,k:2},\) and \(c(n,k)=-(a(n,k)+b(n,k)-1)\). Furthermore, let \(3a(n,k)c(n,k)-b^2(n,k)>0\) and \( b(n,k)+2c(n,k)+1>0.\) Then the explicit form of the Shannon entropy of \(Y^{(u)}_{[n,k]},\) \(n\ge 1,\) based on SAR\((\alpha )\) is given by
where \(H(Y)=-E(\log f_Y(Y))\) is the Shannon entropy of Y, \(\phi _{f}(p)=\int _{0}^{\infty }F^{p}_Y(y)f_Y(y)\log f_Y(y)dy=\int _{0}^{1}z^{p}\log f_Y(F_Y^{-1}(z))dz,~p=1,2,~\) \(\psi _{[n,k]}=-\log (1+3\delta ^{(\alpha )}_{n,k:1}+\frac{5}{2}\delta ^{(\alpha )}_{n,k:2})+2b(n,k)J_{0}(n,k)+6c(n,k)J_{1}(n,k),~\) and
Proof
The Shannon entropy of \( Y_{[n,k]}^{(u)}\) is given by
where \(\psi _{[n,k]}=-E(\log (1+3\delta ^{(\alpha )}_{n,k:1}(2F_Y(Y_{[n,k]})-1)+\frac{5}{4}\delta ^{(\alpha )}_{n,k:2}(3(2F_Y(Y_{[n,k]})-1)^2-1))).\) Upon integrating by parts, we get
where \(U_{n,k}=\log (1+3\delta ^{(\alpha )}_{n,k:1}(2F_Y(y)-1)+\frac{5}{4}\delta ^{(\alpha )}_{n,k:2}(3(2F_Y(y)-1)^2-1))\) and \(V_{n,k}=F_Y(y)(1+3\delta ^{(\alpha )}_{n,k:1}\) \((F_Y(y)-1)+\frac{5}{2}\delta ^{(\alpha )}_{n,k:2}(2F^2_Y(y)-3F_Y(y)+1)).\) Thus, by using the integral probability transformation \(z=F_Y(y)\) and simplifying the result, we get
The proof is completed. \(\square \)
Theorem 5.3
The inaccuracy measure between \(f_{[n,k]}^{(u)}(y)\) and \(f_Y(y)\) for \(n\ge 1,\) \(\alpha \ne 0,\) is given by
where \(H(Y)=-\int _{0}^{\infty }f_Y(y)\log f_Y(y)dy~\) is the Shannon entropy of the RV Y and
Proof
Clearly, we have
This completes the proof. \(\square \)
Example 5.3
Suppose that X and Y have exponential distributions with means \( \frac{1}{\theta ^{*}}\) and \(\frac{1}{\theta },\) respectively. After simple algebra, we get \( H(Y)= 1-\log \theta ,\) \( \phi _{f}(1)=\frac{-3+2\log \theta }{4},\) and \(\phi _{f}(2)=\frac{-11+6\log \theta }{18}.\) Then,
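Again, these quantities can be verified by direct numerical integration (a sketch of ours, with \(\phi _{f}(p)\) taken as defined in Theorem 5.2):

```python
# Sketch: numerical verification of H(Y), phi_f(1) and phi_f(2) in Example 5.3.
import numpy as np
from scipy.integrate import quad

theta = 0.8
fY = lambda y: theta * np.exp(-theta * y)
FY = lambda y: 1 - np.exp(-theta * y)
logf = lambda y: np.log(theta) - theta * y     # log f_Y(y), written analytically

H    = quad(lambda y: -fY(y) * logf(y), 0, 80)[0]
phi1 = quad(lambda y: FY(y) * fY(y) * logf(y), 0, 80)[0]
phi2 = quad(lambda y: FY(y) ** 2 * fY(y) * logf(y), 0, 80)[0]

print(H,    1 - np.log(theta))                  # H(Y)
print(phi1, (-3 + 2 * np.log(theta)) / 4)       # phi_f(1)
print(phi2, (-11 + 6 * np.log(theta)) / 18)     # phi_f(2)
```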
Theorem 5.4
The CE of \(Y_{[n,k]}^{(u)},\) for \(n\ge 1,\) is given by
where \(CE(Y)\!=-\!\int _{0}^{\infty }F_Y(y)\log F_Y(y)dy\) \(=-\!\int _{0}^{1}\frac{z \log z}{f(F^{-1}{_Y(z)})}dz,~\) \(\phi _{f}(p)\) \(=\int _{0}^{\infty }F^{p}_Y(y)\log F_Y(y)dy\) \(=\int _{0}^{1}\frac{z^{p} \log z}{f(F^{-1}{_Y(z)})}dz,\) \(p=2,3,\) and \(\tau _{[n,k]}=-\int _{0}^{\infty }F_{[n,k]}^{(u)}(y)\log (1+3\delta ^{(\alpha )}_{n,k:1}(F_Y(y)-1)+\frac{5}{4}\delta ^{(\alpha )}_{n,k:2}(4F^{2}_Y(y)-6F_Y(y)+2)) dy.~\)
Proof
Clearly, we have
The proof is completed. \(\square \)
Example 5.4
For the Sarmanov copula, after simple algebra we have \(CE(Y)=\frac{1}{4},\) \(\phi _{f}(2)=-\frac{1}{9},\) and \(\phi _{f}(3)=-\frac{1}{16}.\) Thus, the CE of \( Y_{[n,k]}^{(u)}\) is given by
where
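The constants of this example correspond to a uniform (0,1) marginal, i.e. the copula case, and can be checked numerically (a sketch of ours, with \(\phi _{f}(p)=\int _{0}^{1}z^{p}\log z\,dz\) as in Theorem 5.4):

```python
# Sketch: numerical check of CE(Y), phi_f(2) and phi_f(3) in Example 5.4 (uniform marginal).
import numpy as np
from scipy.integrate import quad

CE   = quad(lambda z: -z * np.log(z), 1e-12, 1)[0]
phi2 = quad(lambda z: z ** 2 * np.log(z), 1e-12, 1)[0]
phi3 = quad(lambda z: z ** 3 * np.log(z), 1e-12, 1)[0]
print(CE, 0.25)           # CE(Y) = 1/4
print(phi2, -1 / 9)       # phi_f(2)
print(phi3, -1 / 16)      # phi_f(3)
```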
Remark 5.1
For \(k=1\) (the record case), all the results concerning the Shannon entropy, inaccuracy measure, extropy, and CE were obtained by Husseiny et al. [27].
Theorem 5.5
The CRE of \(Y_{[n,k]}^{(u)}\) for \(n\ge 1,\) is given by
where \(CRE(Y)\!=\!-\!\int _{0}^{\infty }\overline{F}_Y(y)\log \overline{F}_Y(y)dy \!=\!-\!\int _{0}^{1}\frac{z\log z}{f(F^{-1}{_Y(1-z)})}dz,\) \(\phi _{\overline{F}}(p+1)\!=\!\int _{0}^{\infty } F^{p+1}_Y(y)\overline{F}_Y(y)\log \overline{F}_Y(y)dy \) \(=\int _{0}^{1}\frac{z (1-z)^{p+1} \log z}{f(F^{-1}{_Y(1-z)})}dz,~p=0,1,~\) and \(\Omega _{[n,k]}=-\int _{0}^{\infty }\overline{F}_{[n,k]}^{(u)}(y)\log (1+3\delta ^{(\alpha )}_{n,k:1}F_Y(y)+\frac{5}{2}\delta ^{(\alpha )}_{n,k:2}(2F^{2}_Y(y)-F_Y(y))) dy.~\)
Proof
Clearly, we have
The proof is completed. \(\square \)
Example 5.5
For the Sarmanov copula, after simple algebra we have \(CRE(Y)=\frac{1}{4},\) \(\phi _{\overline{F}_Y}(1)=-\frac{5}{36},\) and \(\phi _{\overline{F}_Y}(2)=-\frac{13}{144}.\) Thus, the CRE of \( Y_{[n,k]}^{(u)}\) is given by
where
Theorem 5.6
Let \(F_{[n,k]}^{(u)}(y)\) be the DF of the concomitant of the nth upper k-record value based on SAR\((\alpha ).\) Then, the CF for location parameter based on \(Y_{[n,k]}^{(u)},\) for \(n\ge 1,\) is given by
where
and
Proof
By using (3.1), the CF of \( Y_{[n,k]}^{(u)}\) is given by
\(\square \)
Remark 5.2
Let \(Y_{[n,k]}^{(i)},~i=\ell ,u,\) be based on SAR\((\alpha )\) with arbitrary marginals. In view of Theorem 3.2 and the relation between \(f_{[n,k]}^{(u)}(y)\) and \(f_{[n,k]}^{(\ell )}(y),\) it is easy to check that \(J^{(u)}_{[n,k]}(-\alpha )=J^{(\ell )}_{[n,k]}(\alpha ),\) \(H^{(u)}_{[n,k]}(-\alpha )=H^{(\ell )}_{[n,k]}(\alpha ),\) \(I^{(u)}_{[n,k]}(-\alpha )=I^{(\ell )}_{[n,k]}(\alpha ), CE^{(u)}_{[n,k]}(-\alpha )=CE^{(\ell )}_{[n,k]}(\alpha ), CRE^{(u)}_{[n,k]}(-\alpha )=CRE^{(\ell )}_{[n,k]}(\alpha ),\) and \(CF_{\overline{F}_{[n,k]}}^{(u)}(-\alpha )=CF_{\overline{F}_{[n,k]}}^{(\ell )}(\alpha ),\) where \(J^{(i)}_{[n,k]}(\alpha ), H^{(i)}_{[n,k]}(\alpha ), I^{(i)}_{[n,k]}(\alpha ), CE^{(i)}_{[n,k]}(\alpha ), CRE^{(i)}_{[n,k]}(\alpha ),\) and \(CF_{\overline{F}_{[n,k]}}^{(i)}(\alpha )\) are the extropy, Shannon entropy, inaccuracy measure, CE, CRE, and CF of \(Y_{[n,k]}^{(i)}, i=\ell ,u,\) respectively.
Example 5.6
Let X and Y have exponential distributions with means \( \frac{1}{\theta ^{*}}\) and \(\frac{1}{\theta },\) respectively. Then, \( CF_{\overline{F}_{[n,k]}}(Y)=\theta ,\) \(\tau _{\overline{F}_Y(y)}=\frac{\theta }{2}(3\delta ^{(\alpha )}_{n,k:1}-\frac{5}{2}\delta ^{(\alpha )}_{n,k:2})+\frac{5\theta }{3}\delta ^{(\alpha )}_{n,k:2},\) and \( \phi _{\overline{F}_Y(y)}=\frac{\theta }{2}(-3\delta ^{(\alpha )}_{n,k:1}+\frac{5}{2}\delta ^{(\alpha )}_{n,k:2})-\frac{5\theta }{3}\delta ^{(\alpha )}_{n,k:2}.\) Thus, the CF of \( Y_{[n,k]}^{(u)}\) is given by
where
6 Numerical Study and Discussion
Tables 1 and 2 display the values of the extropy, entropy, inaccuracy, CE, and CF of concomitants of the nth upper k-record values based on SAR\((\alpha )\) with marginals from the most popular distributions. The entries were evaluated using MATHEMATICA version 12 and Theorems 5.1–5.6. From these tables, the following properties can be extracted (a short computational sketch of this type of calculation is given after the list):
-
For the exponential marginals, the greatest value of the extropy is \(J_{[8,2]}^{(u)}(0.52)\simeq -0.069\) and the smallest value is \(J_{[8,2]}^{(u)}(-0.52)\simeq -0.4.\) In addition, with fixed \(\alpha ,\) the value of \(J_{[n,k]}^{(u)}(\alpha )\) slowly increases as n increases for \(\alpha >0.\) In contrast, the value of \(J_{[n,k]}^{(u)}(\alpha )\) slowly decreases as n increases for \(\alpha <0\) (see Table 1, Part 1).
-
For the power marginals, the greatest value of the extropy is \(J_{[8,2]}^{(u)}(-0.3)\simeq -0.55\) and the smallest value is \(J_{[8,2]}^{(u)}(0.52)\simeq -1.76.\) Moreover, with fixed \(\alpha ,\) the value of \(J_{[n,k]}^{(u)}(\alpha )\) slowly decreases as n increases for \(\alpha >0.\) In contrast, the value of \(J_{[n,k]}^{(u)}(\alpha )\) slowly increases as n increases for \(\alpha <0\) (see Table 1, Part 2).
-
By using the Sarmanov copula, Table 1, Part 3, shows that \(H_{[n,k]}^{(u)}(-\alpha )=H_{[n,k]}^{(u)}(\alpha ),\) which endorses the theoretical results given in Theorem 4.1. Moreover, the greatest value of the entropy is \( H_{[3,4]}^{(u)}(0.2)\simeq -0.0004\) and the smallest value is \(H_{[8,2]}^{(u)}(-0.52)\simeq -0.42.\) Also, the value of \(H_{[n,k]}^{(u)}(\alpha )\) increases as the value of k increases for large n \((n\ge 5).\)
-
For the exponential marginals, the value of \(I_{[n,k]}^{(u)}(\alpha )\) slowly increases as n increases for \(\alpha >0\). In contrast, the value of \(I_{[n,k]}^{(u)}(\alpha )\) slowly decreases as n increases for \(\alpha <0\) (see Table 2, Part 1).
-
By using the Sarmanov copula, the greatest value of CE is \(CE_{[8,2]}^{(u)}(0.2)\simeq 0.259\) and the smallest value is \(CE_{[8,2]}^{(u)}(-0.52)\simeq 0.166.\) Also, with fixed \(\alpha \) and k, the value of \(CE_{[n,k]}^{(u)}(\alpha )\) slowly increases as n increases for \(0<\alpha \le 0.4,\) whereas it slowly decreases as n increases for \(\alpha <0.\) In addition, the value of \(CE_{[n,k]}^{(u)}(\alpha )\) decreases as \(|\alpha |\) increases (see Table 2, Part 2).
-
For the exponential marginals, the greatest value of CF is \(CF_{\overline{F}_{[8,2]}}^{(u)}(-0.52)\simeq 2.86\) and the smallest value is \(CF_{\overline{F}_{[8,2]}}^{(u)}(0.52)\simeq 0.683.\) Moreover, with fixed k and \(\alpha >0,\) the value of \(CF_{\overline{F}_{[n,k]}}^{(u)}(\alpha )\) slowly decreases as n increases. In contrast, the value of \(CF_{\overline{F}_{[n,k]}}^{(u)}(\alpha )\) slowly increases as n increases for \(\alpha <0\) (see Table 2, Part 3).
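As mentioned before the list, the following sketch (ours, not the authors' MATHEMATICA code) shows how entries of this type can be recomputed, here for the extropy of the concomitant with exponential marginals, by integrating the concomitant density used in Section 5. The marginal rate \(\theta \) below is our own choice, since the tables' parameter values are not reproduced here, so only the qualitative trends in n and \(\alpha \) should be compared.

```python
# Sketch: extropy of the concomitant of the nth upper k-record for an exponential
# marginal under SAR(alpha), by numerical integration of the concomitant density.
import numpy as np
from scipy.integrate import quad

def deltas(n, k, alpha):
    d1 = alpha * (1 - 2 * (k / (k + 1)) ** n)
    d2 = alpha ** 2 * (12 * ((k / (k + 2)) ** n - (k / (k + 1)) ** n) + 2)
    return d1, d2

def extropy_conc(n, k, alpha, theta=1.0):
    fY = lambda y: theta * np.exp(-theta * y)
    FY = lambda y: 1 - np.exp(-theta * y)
    def pdf(y):
        d1, d2 = deltas(n, k, alpha)
        t = 2 * FY(y) - 1
        return fY(y) * (1 + 3 * d1 * t + 1.25 * d2 * (3 * t ** 2 - 1))
    return -0.5 * quad(lambda y: pdf(y) ** 2, 0, 60)[0]

k = 2
for alpha in (0.52, -0.52):
    print([round(extropy_conc(n, k, alpha), 4) for n in range(1, 9)])
    # for alpha > 0 the values slowly increase in n; for alpha < 0 they decrease
```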
References
Abd Elgawad, M.A., Alawady, M.A.: On concomitants of generalized order statistics from generalized FGM family under a general setting. Math. Slov. 72(2), 507–526 (2022)
Abd Elgawad, M.A., Barakat, H.M., Alawady, M.A.: Concomitants of generalized order statistics under the generalization of Farlie-Gumbel-Morgenstern type bivariate distributions. Bull. Iran. Math. Soc. 47(4), 1045–1068 (2021)
Abd Elgawad, M.A., Barakat, H.M., Alawady, M.A.: Concomitants of generalized order statistics from bivariate Cambanis family: some information measures. Bull. Iran. Math. Soc. 48(2), 563–585 (2021)
Abd Elgawad, M.A., Alawady, M.A., Barakat, H.M., Xiong, Shengwu: Concomitants of generalized order statistics from Huang-Kotz Farlie-Gumbel-Morgenstern bivariate distribution: some information measures. Bull. Malaysia Math. Sci. Soc. 43(3), 2627–2645 (2020)
Abd Elgawad, M.A., Barakat, H.M., Xiong, S., Alyami, S.A.: Information measures for generalized order statistics and their concomitants under general framework from Huang-Kotz FGM bivariate distribution. Entropy 23, 335 (2021). https://doi.org/10.3390/e23030335
Alawady, M.A., Barakat, H.M., Abd Elgawad, M.A.: Concomitants of generalized order statistics from bivariate Cambanis family of distributions under a general setting. Bull. Malaysia Math. Sci. Soc. 44(5), 3129–3159 (2021)
Alawady, M.A., Barakat, H.M., Xiong, S., Abd Elgawad, M.A.: Concomitants of generalized order statistics from iterated Farlie-Gumbel-Morgenstern type bivariate distribution. Commun. Statist. Theory Meth. 51(16), 5488–5504 (2022). https://doi.org/10.1080/03610926.2020.1842452
Alawady, M.A., Barakat, H.M., Xiong, Shengwu, Abd Elgawad, M.A.: On concomitants of dual generalized order statistics from Bairamov-Kotz-Becki Farlie-Gumbel-Morgenstern bivariate distributions. Asian-Eur. J. Math. 14(10), 2150185 (2021). https://doi.org/10.1142/S1793557121501850
Aly, A.E., Barakat, H.M., El-Adll, M.: Prediction intervals of the record-values process. Revstat 17(3), 401–427 (2019)
Balakrishnan, N., Lai, C.D.: Continuous bivariate distributions, 2nd edn. Springer, New York (2009)
Barakat, H.M., Husseiny, I.A.: Some information measures in concomitants of generalized order statistics under iterated FGM bivariate type. Quaestiones Math. 44(5), 581–598 (2021)
Barakat, H.M., Nigm, E.M., Husseiny, I.A.: Measures of information in order statistics and their concomitants for the single iterated Farlie-Gumbel-Morgenstern bivariate distribution. Math. Popul. Stud. 28(3), 154–175 (2021)
Barakat, H.M., Nigm, E.M., Syam, A.H.: Concomitants of order statistics and record values from Bairamov-Kotz-Becki-FGM bivariate-generalized exponential distribution. Filomat 32(9), 3313–3324 (2018)
Barakat, H.M., Nigm, E.M., Syam, A.H.: Concomitants of ordered variables from Huang-Kotz-FGM type bivariate-generalized exponential distribution. Bull. Malaysia Math. Sci. Soc. 42, 337–353 (2019)
Barakat, H.M., Alawady, M.A., Husseiny, I.A., Mansour, G.M.: Sarmanov family of bivariate distributions: statistical properties-concomitants of order statistics-information measures. Bull. Malaysia Math. Sci. Soc. 45, 49–83 (2022). https://doi.org/10.1007/s40840-022-01241-z
Barakat, H.M., Nigm, E.M., Alawady, M.A., Husseiny, I.A.: Concomitants of order statistics and record values from iterated of FGM bivariate-generalized exponential distribution. Revstat 19(2), 291–307 (2019)
Barakat, H.M., Nigm, E.M., Alawady, M.A., Husseiny, I.A.: Concomitants of order statistics and record values from generalization of FGM bivariate-generalized exponential distribution. J. Stat. Theory Appl. 18(3), 309–322 (2019)
Bdair, O.M., Raqab, M.Z.: Mean residual life of kth records under double monitoring. Bull. Malaysia Math. Sci. Soc. 37(2), 457–464 (2013)
Beg, M.I., Ahsanullah, M.: Concomitants of generalized order statistics from Farlie Gumbel Morgenstern distributions. Statist. Methodol. 5, 1–20 (2008)
Bekrizadeh, H., Parham, G.A., Zadkarmi, M.R.: The new generalization of Farlie-Gumbel-Morgenstern copulas. App. Math. Sci. 6(71), 3527–3533 (2012)
Berred, M.: k-record values and the extreme-value index. J. Stat. Plann. Inf. 45, 49–63 (1995)
Chacko, M., Mary, M.S.: Concomitants of k-record values arising from Morgenstern family of distributions and their applications in parameter estimation. Statist. Papers 54(1), 21–46 (2013)
Chacko, M., Muraleedharan, L.: Inference based on k-record values from generalized exponential distribution. Statistica LXXVIII(1), 37–56 (2018)
Di Crescenzo, A., Longobardi, M.: On cumulative entropies. J. Stat. Plann. Inf. 139(12), 4072–4087 (2009)
Dziubdziela, W., Kopociński, B.: Limiting properties of the k-th record values. Appl. Math. 15, 187–190 (1976)
Fashandi, M., Ahmadi, J.: Characterizations of symmetric distributions based on Renyi entropy. Statist. Probab. Lett. 82, 798–804 (2012)
Husseiny, I.A., Barakat, H.M., Mansour, G.M., Alawady, M.A.: Information measures in records and their concomitants arising from Sarmanov family of bivariate distributions. J. Comp. Appl. Math. 408, 114120 (2022). https://doi.org/10.1016/j.cam.2022.114120
Kerridge, D.F.: Inaccuracy and inference. J. R. Stat. Soc. 23(1), 184–194 (1961)
Kharazmi, O., Balakrishnan, N.: Cumulative residual and relative cumulative residual Fisher information and their properties. IEEE Trans. Inf. Theory. 67(10), 6306–6312 (2021)
Lad, F., Sanfilippo, G., Agro, G.: Extropy: complementary dual of entropy. Stat. Sci. 30, 40–58 (2015)
Nanda, A.K.: Characterization of distributions through failure rate and mean residual life functions. Statist. Probab. Let. 80, 752–755 (2010)
Nelsen, R.B.: An introduction to copulas, 2nd edn. Springer-Verlag, New York (2006)
Rao, M., Chen, Y., Vemuri, B., Wang, F.: Cumulative residual entropy: a new measure of information. IEEE Trans. Inf. Theory 50, 1220–1228 (2004)
Sarmanov, I.O.: New forms of correlation relationships between positive quantities applied in hydrology. In: Mathematical Models in Hydrology Symposium, IAHS Publication No. 100, International Association of Hydrological Sciences, pp. 104-109 (1974)
Thomas, P.V., Anne, P., Veena, T.G.: Characterization of bivariate distributions using concomitants of generalized (k) record values. Statistica 74(4), 431–446 (2014)
Acknowledgements
The authors would like to thank Prof. Rosihan M. Ali Editor in Chief and anonymous referees for their helpful comments, which led to improvements of an earlier version of this paper.
Funding
Open access funding provided by The Science, Technology & Innovation Funding Authority (STDF) in cooperation with The Egyptian Knowledge Bank (EKB).
Ethics declarations
Conflict of interest
The authors declare that they have no competing interests.
Additional information
Communicated by Anton Abdulbasah Kamil.
Keywords
- Sarmanov family
- FGM family
- Concomitants
- K-record values
- Shannon entropy
- Inaccuracy measure
- Extropy
- Cumulative entropy
- Cumulative residual entropy
- Cumulative residual Fisher information