Measures of information for concomitants of generalized order statistics from subfamilies of Farlie–Gumbel–Morgenstern distributions

In this paper, we study Shannon's entropy and the Fisher information number for concomitants of generalized order statistics from subfamilies of Farlie–Gumbel–Morgenstern (FGM) distributions whose marginal distributions are Weibull, exponential, Pareto and power function. We also provide numerical results for the Shannon entropy and Fisher information number of concomitants of order statistics.

Morgenstern [12] introduced the FGM distributions, Gumbel [8] studied the FGM family for the exponential distribution, and Farlie [5] considered the family in its general form. Let $F_X(x)$ and $F_Y(y)$ be the distribution functions of the random variables $X$ and $Y$, respectively. Then the probability density function (pdf) of the bivariate FGM distributions is given by
$$f_{X,Y}(x,y)=f_X(x)f_Y(y)\left[1+\alpha(2F_X(x)-1)(2F_Y(y)-1)\right].\qquad(1.1)$$
Here, $f_X(x)$ and $f_Y(y)$ are the marginal pdf's of $X$ and $Y$, respectively. The parameter $\alpha$ is known as the dependence parameter of the random variables $X$ and $Y$; if $\alpha$ is zero, then $X$ and $Y$ are independent. For the FGM family with pdf given by (1.1), the density function of the concomitant of the $r$-th GOS $Y_{[r,n,m,k]}$, $1\le r\le n$, is given by Beg and Ahsanullah [1] as follows:
$$g_{[r,n,m,k]}(y)=f_Y(y)\left[1+\alpha C^{*}(r,n,m,k)(2F_Y(y)-1)\right],\qquad(1.2)$$
where $C^{*}(r,n,m,k)=1-2\prod_{i=1}^{r}\frac{\gamma_i}{\gamma_i+1}$ and $\gamma_i=k+(n-i)(m+1)$.

Entropy is an index used to measure dispersion, volatility, risk and uncertainty; the concept was introduced by Shannon [13] in the information-theory literature. The Shannon entropy of a random variable $X$ is a mathematical measure of information which quantifies the average reduction of uncertainty about $X$. For a continuous random variable $X$ with pdf $f_X(x)$ it is defined as
$$H(X)=-\int_{-\infty}^{\infty}f_X(x)\log f_X(x)\,dx.\qquad(1.3)$$
Tahmasebi and Behboodian [15] introduced the Shannon entropy for concomitants of GOS's of the FGM family. Tahmasebi and Jafari [16] introduced the Fisher information number for concomitants of GOS's of the FGM family; for a continuous random variable $X$ with pdf $f_X(x)$ it is defined as
$$I(X)=\int_{-\infty}^{\infty}\left(\frac{\partial}{\partial x}\log f_X(x)\right)^{2}f_X(x)\,dx.$$
This is the Fisher information for a location parameter, also called the shift-invariant Fisher information number. It has been used to develop a unifying physical theory called the principle of "extreme physical information" (see Frieden [6,7]). Note that it is different from the quantity introduced by BuHamra and Ahsanullah [2].
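Since the coefficient $C^{*}(r,n,m,k)$ drives everything that follows, a small numerical check is useful. The Python sketch below is our own illustration (none of the names come from the paper); it assumes the standard GOS parameters $\gamma_i=k+(n-i)(m+1)$ and the product form $C^{*}(r,n,m,k)=1-2\prod_{i=1}^{r}\gamma_i/(\gamma_i+1)$, our reconstruction of the truncated formula, and verifies by midpoint quadrature that the concomitant density (1.2) integrates to one for a standard exponential marginal.

```python
import math

def gamma_i(i, n, m, k):
    # standard GOS parameters: gamma_i = k + (n - i)(m + 1)
    return k + (n - i) * (m + 1)

def c_star(r, n, m, k):
    # reconstructed coefficient: C* = 1 - 2 * prod_{i=1}^{r} gamma_i / (gamma_i + 1)
    prod = 1.0
    for i in range(1, r + 1):
        g = gamma_i(i, n, m, k)
        prod *= g / (g + 1.0)
    return 1.0 - 2.0 * prod

def concomitant_pdf(y, r, n, m, k, alpha, f, F):
    # density (1.2): g(y) = f(y) [1 + alpha * C* * (2 F(y) - 1)]
    return f(y) * (1.0 + alpha * c_star(r, n, m, k) * (2.0 * F(y) - 1.0))

# sanity check with a standard exponential marginal: the density must integrate to 1
f = lambda y: math.exp(-y)
F = lambda y: 1.0 - math.exp(-y)
h, total, y = 1e-4, 0.0, 0.5e-4
while y < 40.0:
    total += concomitant_pdf(y, 2, 5, 0, 1, 0.5, f, F) * h
    y += h
print(round(total, 3))  # 1.0
```

For ordinary order statistics ($m=0$, $k=1$) the product telescopes and $C^{*}$ reduces to $(2r-n-1)/(n+1)$.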
Remark 1.1 In the computations of this paper we use some standard formulas, valid for $t\ge 1$.

The rest of this article is organized as follows. In Sect. 2, we derive the Shannon entropy of concomitants of GOS's from the FGM family for some well-known distributions, namely the Weibull, Pareto and power function distributions. In Sect. 3, we develop the Fisher information number of concomitants of GOS's from the FGM family for the exponential, Pareto and power function distributions. In Sect. 4, we compute numerical values of our results for concomitants of order statistics.

Weibull distribution
The pdf and cdf of the Weibull distribution are given, respectively, by
$$f_Y(y)=\lambda\beta y^{\beta-1}e^{-\lambda y^{\beta}},\qquad F_Y(y)=1-e^{-\lambda y^{\beta}},\qquad y>0,\ \lambda,\beta>0.$$
To obtain the Shannon entropy, we first evaluate the two intermediate expectations in (2.8) and (2.9), where $\nu=-\psi(1)=0.57722\ldots$ is Euler's constant and $\psi$ denotes the digamma function. Substituting (2.8) and (2.9) into the entropy expression, the result follows.
From the Weibull distribution we can obtain the Shannon entropy of related distributions, such as the exponential and Rayleigh distributions, by suitable choices of the parameters.
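The Weibull results can be cross-checked numerically. The sketch below is our own Python illustration, assuming the parameterization $f_Y(y)=\lambda\beta y^{\beta-1}e^{-\lambda y^{\beta}}$ (which may differ from the paper's); it computes the concomitant entropy $-\int g\log g$ by midpoint quadrature. At $\alpha=0$ the concomitant reduces to the Weibull marginal, whose entropy has the well-known closed form $\nu(1-1/\beta)-(\log\lambda)/\beta-\log\beta+1$, where $\nu\approx 0.57722$ is Euler's constant.

```python
import math

EULER = 0.5772156649015329  # Euler's constant, nu = -psi(1)

def weibull_pdf(y, lam, beta):
    return lam * beta * y ** (beta - 1) * math.exp(-lam * y ** beta)

def weibull_cdf(y, lam, beta):
    return 1.0 - math.exp(-lam * y ** beta)

def entropy_concomitant(alpha, cstar, lam, beta, h=1e-4, upper=50.0):
    # H = -int g log g with g(y) = f(y)[1 + alpha*cstar*(2F(y) - 1)], midpoint rule
    total, y = 0.0, h / 2
    while y < upper:
        g = weibull_pdf(y, lam, beta) * (1 + alpha * cstar * (2 * weibull_cdf(y, lam, beta) - 1))
        total -= g * math.log(g) * h
        y += h
    return total

# at alpha = 0 the concomitant is the marginal, whose entropy is known in closed form
lam, beta = 1.0, 1.5
closed_form = EULER * (1 - 1 / beta) - math.log(lam) / beta - math.log(beta) + 1
numeric = entropy_concomitant(0.0, -1.0 / 3, lam, beta)
print(abs(numeric - closed_form) < 1e-3)  # True
```

Setting $\beta=1$ recovers the exponential case, for which the closed form reduces to $1-\log\lambda$.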

Pareto distribution
The pdf and cdf of the Pareto distribution are given, respectively, by
$$f_Y(y)=\frac{\theta c^{\theta}}{y^{\theta+1}},\qquad F_Y(y)=1-\left(\frac{c}{y}\right)^{\theta},\qquad y\ge c>0,\ \theta>0.$$

Proof The proof is similar to the proof of Theorem 2.2.
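An analogous numerical check works for the Pareto marginal. The sketch below (our own illustration, assuming the Type I parameterization $f_Y(y)=\theta c^{\theta}/y^{\theta+1}$, $y\ge c$) integrates $-g\log g$ by quadrature; at $\alpha=0$ it should match the known marginal entropy $\log(c/\theta)+1+1/\theta$.

```python
import math

def pareto_entropy_concomitant(alpha, cstar, theta, c, h=1e-3, upper=500.0):
    # H = -int g log g with g(y) = f(y)[1 + alpha*cstar*(2F(y) - 1)], midpoint rule
    total, y = 0.0, c + h / 2
    while y < upper:
        f = theta * c ** theta / y ** (theta + 1)
        F = 1.0 - (c / y) ** theta
        g = f * (1 + alpha * cstar * (2 * F - 1))
        total -= g * math.log(g) * h
        y += h
    return total

theta, c = 3.0, 1.0
closed_form = math.log(c / theta) + 1 + 1 / theta  # marginal entropy at alpha = 0
numeric = pareto_entropy_concomitant(0.0, -1.0 / 3, theta, c)
print(abs(numeric - closed_form) < 1e-2)  # True
```

The truncation at `upper = 500` is safe here because the integrand decays like $y^{-(\theta+1)}\log y$.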

Power distribution function
The pdf and cdf of the power function distribution are given, respectively, by
$$f_Y(y)=\theta y^{\theta-1},\qquad F_Y(y)=y^{\theta},\qquad 0<y<1,\ \theta>0.\qquad(2.14)$$

Theorem 2.4 If $Y_{[r,n,m,k]}$ is the concomitant of the $r$-th GOS for the power function distribution from (1.1) and (2.14), then the Shannon entropy of $Y_{[r,n,m,k]}$ is obtained from (2.1).
Proof The proof is similar to the proof of Theorem 2.2.
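For the power function marginal the same quadrature is straightforward, since the support is $(0,1)$. The sketch below (our own illustration, assuming $f_Y(y)=\theta y^{\theta-1}$, $F_Y(y)=y^{\theta}$, and the order-statistics coefficient $C^{*}=(2r-n-1)/(n+1)$) checks the independence case against the known marginal entropy $1-1/\theta-\log\theta$ and then tabulates the concomitant entropy over $r$, in the spirit of the numerical work of Sect. 4.

```python
import math

def power_entropy_concomitant(alpha, cstar, theta, h=1e-5):
    # power function marginal: f(y) = theta*y^(theta-1), F(y) = y^theta on (0, 1)
    total, y = 0.0, h / 2
    while y < 1.0:
        g = theta * y ** (theta - 1) * (1 + alpha * cstar * (2 * y ** theta - 1))
        total -= g * math.log(g) * h
        y += h
    return total

theta, n, alpha = 2.0, 5, 0.5
# independence check: marginal entropy has the closed form 1 - 1/theta - log(theta)
print(abs(power_entropy_concomitant(0.0, 0.0, theta) - (1 - 1 / theta - math.log(theta))) < 1e-3)  # True
# entropy of the concomitant of the r-th order statistic (m = 0, k = 1)
for r in range(1, n + 1):
    print(r, round(power_entropy_concomitant(alpha, (2 * r - n - 1) / (n + 1), theta), 4))
```

Note that $r=(n+1)/2$ gives $C^{*}=0$, so for odd $n$ the middle concomitant has the marginal entropy regardless of $\alpha$.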

Fisher information number for concomitants of GOS's from subfamilies of FGM family
Tahmasebi and Jafari [16] obtained the Fisher information number for concomitants of GOS's of the FGM family in the following theorem. In the subsections below, we apply this theorem to some subfamilies of the FGM family, namely those with exponential, Pareto and power function marginals.

Exponential distribution
The pdf and cdf of the exponential distribution are given, respectively, by
$$f_Y(y)=\theta e^{-\theta y},\qquad F_Y(y)=1-e^{-\theta y},\qquad y>0,\ \theta>0.$$
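The Fisher information number of the concomitant can also be evaluated directly by quadrature. The sketch below is our own illustration (assuming the density form (1.2) with an exponential marginal); it uses the exact derivative $g'(y)=-\theta g(y)+2\alpha C^{*}f^{2}(y)$ and verifies the textbook fact that at $\alpha=0$ the concomitant is $\mathrm{Exp}(\theta)$, whose Fisher information number is $\theta^{2}$.

```python
import math

def fisher_number_concomitant(theta, alpha, cstar, h=1e-4, upper=40.0):
    # I(Y) = int (g'(y)/g(y))^2 g(y) dy for g(y) = f(y)[1 + alpha*cstar*(2F(y) - 1)]
    total, y = 0.0, h / 2
    while y < upper:
        f = theta * math.exp(-theta * y)
        F = 1.0 - math.exp(-theta * y)
        g = f * (1.0 + alpha * cstar * (2.0 * F - 1.0))
        g_prime = -theta * g + 2.0 * alpha * cstar * f * f  # exact derivative of g
        total += (g_prime / g) ** 2 * g * h
        y += h
    return total

theta = 2.0
# alpha = 0: the concomitant is Exp(theta), with Fisher information number theta^2
print(abs(fisher_number_concomitant(theta, 0.0, -1.0 / 3) - theta ** 2) < 1e-3)  # True
# the same quadrature for a dependent case (alpha = 0.5)
print(fisher_number_concomitant(theta, 0.5, -1.0 / 3))
```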

Pareto distribution
To find the Fisher information number for the Pareto marginal, we first evaluate the required expectations.

Proof The proof is similar to the proof of Theorem 3.3.