
On the product of the bivariate beta components

  • M. Ghorbel
Original Paper

Abstract

The aim of the present paper is to derive the exact distribution and the corresponding moment function of the product \(P:=X_{1}X_{2}\) when \(X_{1}\) and \(X_{2}\) are distributed according to a bivariate beta distribution. We also give approximation for this distribution and show its robustness.

Keywords

Product of random variables · Bivariate beta distribution · Beta distribution

1 Introduction

For given random variables \(X_{1}\) and \(X_{2}\), the distribution of the product \(P:=X_{1}X_{2}\) arises in many fields, such as biology, economics, engineering, genetics, hydrology, medicine, number theory, order statistics and physics (see, for example, Frisch and Sornette 1997; Sornette 1998; Nadarajah 2008). It is one of the most important research areas from both theoretical and applied points of view.

In a number of applications, it is necessary to specify the properties of the product of random variables; this occurs in particular when the involved random variables are of ratio type, as for fuel consumption per mile, cost of a structure per pound of payload, amplification ratios, tolerances expressed as percentages of a desired value, etc. For instance, if the number of accidents during a period, the number of days spent in hospital by an accident victim and the total cost per patient-day can each be regarded as random variables, then the total cost is the product of these three random variables. Another example is the following: if \(X_{i}\) is the random variable describing the amplification of the \(i\hbox {th}\) amplifier, then the total amplification \(X_{1}X_{2}\cdots X_{n}\) is also a random variable, and it is important to know the distribution of this product.

Important examples in economics have been considered in detail by Nadarajah (2008), who remarked that traditional portfolio selection models often involve products of random variables. The best examples are investments in different foreign markets. In portfolio diversification models (see, for example, Grubel 1968), uncertainty in the prices of shares in local markets, together with uncertainty about exchange rates, makes the value of the portfolio in domestic currency a product of random variables. Similarly, in models of diversified production by multinational companies (see, for example, Rugman 1979), not only is the local production uncertain but so are the exchange rates, so that the profits in local currency are again related to a product of random variables. A further example is drawn from the econometric literature: in producing a forecast from an estimated equation, Feldstein (1971) pointed out that both the parameter and the value of the exogenous variable during the forecast period could be considered as random variables, so that the forecast was proportional to a product of random variables.

In physics, Frisch and Sornette (1997) developed a theory of extreme deviations generalizing the central limit theorem which, when applied to products of random variables, predicts the generic presence of stretched exponential probability density functions (pdf’s). Their problem comes down to determining the tail of the pdf of a product of random variables.

Products of random variables also arise in hydrology: stream flow is often defined as a product of two or more variables representing, for example, the periodic and the stochastic components (see Cigizoglu and Bayazit 2000).

The distribution of the product \(P=X_{1}X_{2}\) has been studied by many researchers and for many distributions, among them Sakamoto (1943) for the uniform distribution, Springer and Thompson (1970) for the normal distribution, Tang and Gupta (1984) for the beta distribution and Nadarajah and Kotz (2004) for the Dirichlet distribution.

In this paper, our aim is to provide another construction of the bivariate beta distribution discussed in Olkin and Liu (2003) and to derive the exact distribution and the corresponding moment function of the product \(P=X_{1}X_{2}\) when \(X_{1}\) and \(X_{2}\) are distributed according to the bivariate beta distribution, whose density is given by
$$\begin{aligned} f\left( x_{1},x_{2}\right) =\frac{\Gamma \left( \sum _{m=1}^{3}\theta _{m}\right) }{\prod _{m=1}^{3}\Gamma \left( \theta _{m}\right) }\frac{ \prod _{m=1}^{2}x_{m}^{\theta _{m}-1}\left( 1-x_{1}\right) ^{\theta _{2}+\theta _{3}-1}\left( 1-x_{2}\right) ^{\theta _{1}+\theta _{3}-1}}{ \left( 1-x_{1}x_{2}\right) ^{\sum _{m=1}^{3}\theta _{m}}}, \end{aligned}$$
(1)
where \(0<x_{1},x_{2}<1.\) This distribution is a particular case of the multivariate beta distribution discussed in Libby and Novick (1982). The most recent bivariate beta distributions are those given in Nadarajah (2006, 2007). Note that bivariate beta distributions have found useful applications in several areas: for example, in the modeling of the proportions of substances in a mixture; of brand shares, i.e., the proportions of brands of some consumer product bought by customers (Chatfield 1975); of the proportions of the electorate voting for the candidates in a two-candidate election (Hoyer and Mayer 1976); and of the dependence between two soil strength parameters (A-Grivas and Asaoka 1982). They have also been used extensively as priors in Bayesian statistics (see, for example, Apostolakis and Moieni 1987).
The paper is organized as follows. In Sect. 2, we give another construction of the bivariate beta distribution and we discuss its main properties. In Sect. 3, we derive exact expressions for the density function and the moment function of the product \(P=X_{1}X_{2}\). The calculations involve the Gauss hypergeometric functions defined by
$$\begin{aligned} _{2}F_{1}(a,b,c,x):=\sum _{k=0}^{\infty }\frac{\left( a\right) _{k}\left( b\right) _{k}}{\left( c\right) _{k}}\frac{x^{k}}{k!}, \end{aligned}$$
and
$$\begin{aligned} _{3}F_{2}(a,b,c,d,e,x):=\sum _{k=0}^{\infty }\frac{\left( a\right) _{k}\left( b\right) _{k}\left( c\right) _{k}}{\left( d\right) _{k}\left( e\right) _{k}} \frac{x^{k}}{k!} \end{aligned}$$
where \(\left( i\right) _{k}:=i(i+1)\cdots (i+k-1)\) denotes the ascending factorial. Finally, we propose an approximation for the distribution of P and show its robustness.
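For \(|x|<1\), both hypergeometric series can be evaluated directly by accumulating the ratio of consecutive terms, which avoids overflowing factorials. The following sketch is ours, not from the paper; the name `hyp_series` and the truncation length are illustrative choices.

```python
def hyp_series(num, den, x, terms=200):
    """Truncated generalized hypergeometric series pFq(num; den; x).

    The k-th term is updated from the previous one via the ratio
    prod(a + k for a in num) / prod(b + k for b in den) * x / (k + 1),
    so ascending factorials are never formed explicitly.  Valid for |x| < 1.
    """
    total, term = 0.0, 1.0
    for k in range(terms):
        total += term
        ratio = 1.0
        for a in num:
            ratio *= a + k
        for b in den:
            ratio /= b + k
        term *= ratio * x / (k + 1)
    return total

# Sanity check against the closed form 2F1(1, 1; 2; x) = -ln(1 - x)/x:
val = hyp_series([1, 1], [2], 0.5)   # -ln(0.5)/0.5 = 2 ln 2 ≈ 1.3862943611
```

For the arguments arising in this paper the series converges only inside the unit disk, so the closed forms below are evaluated instead with a library implementation.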
For later frequent use of the beta distribution, we recall that: if a random variable A has the beta distribution with parameters \(a,b>0\) (say \(A \overset{d}{\sim }\) \(\beta (a,b)),\) then its density function is
$$\begin{aligned} f_{A}\left( x\right) =\frac{1}{B\left( a,b\right) }x^{a-1}\left( 1-x\right) ^{b-1},x\in \left[ 0,1\right] , \end{aligned}$$
where \(B(a,b):=\frac{\Gamma \left( a\right) \Gamma \left( b\right) }{\Gamma \left( a+b\right) }\) and its moment function is \(\mathbf {E}\left( A^{q}\right) =\frac{\Gamma \left( a+q\right) \Gamma \left( a+b\right) }{ \Gamma \left( a\right) \Gamma \left( a+b+q\right) },\) \(q>-a.\)
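As a quick numerical illustration of the recalled moment formula (the helper name `beta_moment` is ours), it can be coded directly with the gamma function from the standard library:

```python
import math

def beta_moment(a, b, q):
    """E(A^q) for A ~ beta(a, b); requires q > -a."""
    return (math.gamma(a + q) * math.gamma(a + b)
            / (math.gamma(a) * math.gamma(a + b + q)))

# For q = 1 this reduces to the familiar mean a / (a + b):
m = beta_moment(2, 3, 1)   # = 2 / 5 = 0.4
```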

2 The bivariate beta distribution

First, suppose that \(\left( S_{1},S_{2},S_{3}\right)\) is distributed according to the inverted Dirichlet distribution (say \(ID\left( \theta _{1},\theta _{2},\theta _{3};\theta _{4}\right) )\) with positive parameters \(\theta _{1},\theta _{2},\theta _{3}\) and \(\theta _{4}\). Then it is known that its density function is given by
$$\begin{aligned} f\left( s_{1},s_{2},s_{3}\right) =\frac{\Gamma \left( \sum _{m=1}^{4}\theta _{m}\right) }{\prod _{m=1}^{4}\Gamma \left( \theta _{m}\right) } \prod _{m=1}^{3}s_{m}^{\theta _{m}-1}\cdot \left( 1+\sum _{m=1}^{3}s_{m}\right) ^{-\sum _{m=1}^{4}\theta _{m}}. \end{aligned}$$
From this, the joint density function of \(\left( X_{1},X_{2},S_{3}\right) =\left( \dfrac{S_{1}}{S_{1}+S_{3}},\dfrac{S_{2}}{S_{2}+S_{3}},S_{3}\right)\) can be written as
$$\begin{aligned} f\left( x_{1},x_{2},s_{3}\right)= & {} \frac{\Gamma \left( \sum _{m=1}^{4}\theta _{m}\right) }{\prod _{m=1}^{4}\Gamma \left( \theta _{m}\right) }\frac{ \prod _{m=1}^{2}x_{m}^{\theta _{m}-1}s_{3}^{\sum _{m=1}^{3}\theta _{m}-1}}{ \left( 1-x_{1}\right) ^{\theta _{1}+1}\left( 1-x_{2}\right) ^{\theta _{2}+1}} \\&\times \left( 1+s_{3}\left( 1+\frac{x_{1}}{1-x_{1}}+\frac{x_{2}}{1-x_{2}} \right) \right) ^{-\sum _{m=1}^{4}\theta _{m}}, \end{aligned}$$
when \(0<x_{1},x_{2}<1\) and \(0<s_{3}<\infty .\) As a consequence, the joint density function of \(\left( X_{1},X_{2}\right)\) is given by Eq. (1) which coincides with the bivariate beta distribution given in Olkin and Liu (2003).
Alternatively, the law of \(\left( X_{1},X_{2}\right)\) can easily be shown to be characterized by its joint moment function (see Appendix 1)
$$\begin{aligned} \mathbf {E}\left( X_{1}^{q_{1}}X_{2}^{q_{2}}\right)= & {} \frac{\Gamma \left( \sum _{m=1}^{3}\theta _{m}\right) }{\prod _{m=1}^{3}\Gamma \left( \theta _{m}\right) }B(\theta _{1}+q_{1},\theta _{2}+\theta _{3})\cdot B(\theta _{2}+q_{2},\theta _{1}+\theta _{3}) \nonumber \\&\times\, _{3}F_{2}\left( \theta _{1}+q_{1},\theta _{2}+q_{2},\sum _{m=1}^{3}\theta _{m},\sum _{m=1}^{3}\theta _{m}+q_{1},\sum _{m=1}^{3}\theta _{m}+q_{2},1\right) . \end{aligned}$$
(2)
Furthermore, it is straightforward to verify that \(X_{1}\) has the beta distribution \(\beta \left( \theta _{1},\theta _{3}\right)\) and \(X_{2}\) has the beta distribution \(\beta \left( \theta _{2},\theta _{3}\right) .\) Their moment functions are
$$\begin{aligned} \mathbf {E}\left( X_{1}^{q_{1}}\right) =\frac{\Gamma \left( \theta _{1}+q_{1}\right) \Gamma \left( \theta _{1}+\theta _{3}\right) }{\Gamma \left( \theta _{1}\right) \Gamma \left( \theta _{1}+\theta _{3}+q_{1}\right) },\text { }q_{1}>-\theta _{1} \end{aligned}$$
and
$$\begin{aligned} \mathbf {E}\left( X_{2}^{q_{2}}\right) =\frac{\Gamma \left( \theta _{2}+q_{2}\right) \Gamma \left( \theta _{2}+\theta _{3}\right) }{\Gamma \left( \theta _{2}\right) \Gamma \left( \theta _{2}+\theta _{3}+q_{2}\right) },\text { }q_{2}>-\theta _{2} \end{aligned}$$
respectively. In particular, the mean values are \(\mathbf {E}\left( X_{1}\right) =\theta _{1}/\left( \theta _{1}+\theta _{3}\right)\) and \(\mathbf {E}\left( X_{2}\right) =\theta _{2}/\left( \theta _{2}+\theta _{3}\right)\).
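The construction of this section also yields a simple simulation scheme: since the inverted Dirichlet vector can be represented as \(S_{m}=G_{m}/G_{4}\) for independent gamma variables \(G_{m}\) with shapes \(\theta_{m}\), we get \(X_{1}=G_{1}/(G_{1}+G_{3})\) and \(X_{2}=G_{2}/(G_{2}+G_{3})\), which is the gamma-ratio construction of Olkin and Liu (2003). A Monte Carlo sketch (function name ours) checking the marginal mean \(\theta_{1}/(\theta_{1}+\theta_{3})\):

```python
import random

def sample_bivariate_beta(t1, t2, t3, rng=random):
    """One draw of (X1, X2) as ratios of independent gammas;
    the shared G3 induces the dependence between the components."""
    g1 = rng.gammavariate(t1, 1.0)
    g2 = rng.gammavariate(t2, 1.0)
    g3 = rng.gammavariate(t3, 1.0)
    return g1 / (g1 + g3), g2 / (g2 + g3)

random.seed(0)
draws = [sample_bivariate_beta(2, 2, 2) for _ in range(100_000)]
mean_x1 = sum(x1 for x1, _ in draws) / len(draws)
# mean_x1 should be close to theta1 / (theta1 + theta3) = 0.5
```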
The case \(\theta _{1}=\theta _{2}=\theta _{3}=1\) corresponds to the uniform distribution for which
$$\begin{aligned} \mathbf {E}\left( X_{1}^{q}\right) =\mathbf {E}\left( X_{2}^{q}\right) =\frac{1 }{q+1}\text {, }q>-1. \end{aligned}$$

3 Product of the bivariate beta components

The aim of this work is to study the product \(P:=X_{1}X_{2}\) when \(X_{1}\) and \(X_{2}\) are distributed according to Eq. (1). In this section we derive the exact density and moment functions of P, give an approximation of its distribution and show its robustness.

3.1 Density function of P

In the following result, we derive the exact density function of the product P.

Theorem 1

If \(X_{1}\) and \(X_{2}\) are jointly distributed according to (1), then the density function of \(P=X_{1}X_{2}\) is given by
$$\begin{aligned} f_{P}(p)= & {} \frac{\Gamma \left( \sum _{m=1}^{3}\theta _{m}\right) }{ \prod _{m=1}^{3}\Gamma \left( \theta _{m}\right) }B(\theta _{1}+\theta _{3},\theta _{2}+\theta _{3})\frac{(1-p)^{\theta _{3}-1}}{p^{\theta _{3}+1}} \nonumber \\&\times\, _{2}F_{1}\left( \theta _{2}+\theta _{3},\theta _{1}+\theta _{3},\theta _{1}+\theta _{2}+2\theta _{3},\frac{p-1}{p}\right) \end{aligned}$$
(3)
for \(0<p<1.\)

Proof

From Eq. (1), the joint density function of \(\left( X_{1},P\right) =\left( X_{1},X_{1}X_{2}\right)\) can be written as
$$\begin{aligned} f(x_{1},p)= & {} \frac{\Gamma \left( \sum _{m=1}^{3}\theta _{m}\right) }{ \prod _{m=1}^{3}\Gamma \left( \theta _{m}\right) }\frac{x_{1}^{\theta _{1}-2}\left( 1-x_{1}\right) ^{\theta _{2}+\theta _{3}-1}}{\left( 1-p\right) ^{\sum _{m=1}^{3}\theta _{m}}}\left( \frac{p}{x_{1}}\right) ^{\theta _{2}-1}\left( 1-\frac{p}{x_{1}}\right) ^{\theta _{1}+\theta _{3}-1} \\= & {} \frac{\Gamma \left( \sum _{m=1}^{3}\theta _{m}\right) }{ \prod _{m=1}^{3}\Gamma \left( \theta _{m}\right) }\frac{x_{1}^{-(\theta _{2}+\theta _{3})}\left( 1-x_{1}\right) ^{\theta _{2}+\theta _{3}-1}p^{\theta _{2}-1}\left( x_{1}-p\right) ^{\theta _{1}+\theta _{3}-1}}{ \left( 1-p\right) ^{\sum _{m=1}^{3}\theta _{m}}}. \end{aligned}$$
for \(0<p<1\) and \(p<x_{1}<1.\) Thus, the density function of P can be written as
$$\begin{aligned} f_{P}(p)= & {} \frac{\Gamma \left( \sum _{m=1}^{3}\theta _{m}\right) }{ \prod _{m=1}^{3}\Gamma \left( \theta _{m}\right) }\frac{p^{\theta _{2}-1}}{ \left( 1-p\right) ^{\sum _{m=1}^{3}\theta _{m}}} \nonumber \\&\times \int _{p}^{1}x_{1}^{-(\theta _{2}+\theta _{3})}\left( 1-x_{1}\right) ^{\theta _{2}+\theta _{3}-1}\left( x_{1}-p\right) ^{\theta _{1}+\theta _{3}-1}dx_{1} \end{aligned}$$
(4)
for \(0<p<1.\) Using Eq. (2.2.6.1) in Prudnikov et al. (1986, volume 1), the integral in Eq. (4) is equal to
$$\begin{aligned}&\frac{\Gamma \left( \theta _{1}+\theta _{3}\right) \Gamma \left( \theta _{2}+\theta _{3}\right) }{\Gamma \left( \theta _{1}+\theta _{2}+2\theta _{3}\right) }\frac{(1-p)^{\theta _{1}+\theta _{2}+2\theta _{3}-1}}{p^{\theta _{2}+\theta _{3}}} \nonumber \\&\quad \times \,_{2}F_{1}\left( \theta _{2}+\theta _{3},\theta _{1}+\theta _{3},\theta _{1}+\theta _{2}+2\theta _{3},\frac{p-1}{p}\right) \end{aligned}$$
(5)
Inserting Eq. (5) in Eq. (4), the result follows. \(\square\)
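Formula (3) is straightforward to evaluate numerically. The sketch below (the function name `product_density` is ours) uses the mpmath library, whose hyp2f1 the paper itself reports using; the final quadrature checks that the density integrates to one.

```python
import mpmath as mp

def product_density(p, t1, t2, t3):
    """Exact density of P = X1 X2 from Eq. (3), for 0 < p < 1."""
    s = t1 + t2 + t3
    const = (mp.gamma(s) / (mp.gamma(t1) * mp.gamma(t2) * mp.gamma(t3))
             * mp.beta(t1 + t3, t2 + t3))
    return (const * (1 - p) ** (t3 - 1) / p ** (t3 + 1)
            * mp.hyp2f1(t2 + t3, t1 + t3, t1 + t2 + 2 * t3, (p - 1) / p))

# The argument (p - 1)/p lies in (-inf, 0); mpmath evaluates 2F1 there
# by analytic continuation.  The density must integrate to one:
total = mp.quad(lambda p: product_density(p, 2, 2, 2), [0, 1])
```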

3.2 Moments

In this section, we give the moments of the product \(P:=X_{1}X_{2}\) by establishing the following result.

Theorem 2

If \(X_{1}\) and \(X_{2}\) are jointly distributed according to (1), then the moment function of \(P=X_{1}X_{2}\) is given by
$$\begin{aligned} \mathbf {E}\left( P^{n}\right)= & {} \frac{\Gamma \left( \sum _{m=1}^{3}\theta _{m}\right) }{\prod _{m=1}^{3}\Gamma \left( \theta _{m}\right) }B(\theta _{1}+n,\theta _{2}+\theta _{3})B(\theta _{2}+n,\theta _{1}+\theta _{3}) \nonumber \\&\times \,_{3}F_{2}\left( \theta _{1}+n,\theta _{2}+n,\sum _{m=1}^{3}\theta _{m},\sum _{m=1}^{3}\theta _{m}+n,\sum _{m=1}^{3}\theta _{m}+n,1\right) \end{aligned}$$
(6)
for \(n\ge 1\). In particular, the first two moments of P are
$$\begin{aligned} \mathbf {E}\left( P\right)= & {} \frac{\Gamma \left( \sum _{m=1}^{3}\theta _{m}\right) }{\prod _{m=1}^{3}\Gamma \left( \theta _{m}\right) }B(\theta _{1}+1,\theta _{2}+\theta _{3})B(\theta _{2}+1,\theta _{1}+\theta _{3}) \nonumber \\&\times \,_{3}F_{2}\left( \theta _{1}+1,\theta _{2}+1,\sum _{m=1}^{3}\theta _{m},\sum _{m=1}^{3}\theta _{m}+1,\sum _{m=1}^{3}\theta _{m}+1,1\right) \end{aligned}$$
(7)
and
$$\begin{aligned} \mathbf {E}\left( P^{2}\right)= & {} \frac{\Gamma \left( \sum _{m=1}^{3}\theta _{m}\right) }{\prod _{m=1}^{3}\Gamma \left( \theta _{m}\right) }B(\theta _{1}+2,\theta _{2}+\theta _{3})B(\theta _{2}+2,\theta _{1}+\theta _{3}) \nonumber \\&\times\, _{3}F_{2}\left( \theta _{1}+2,\theta _{2}+2,\sum _{m=1}^{3}\theta _{m},\sum _{m=1}^{3}\theta _{m}+2,\sum _{m=1}^{3}\theta _{m}+2,1\right) \end{aligned}$$
(8)

Proof

Equation (6) follows immediately from the fact that \(\mathbf {E}\left( P^{n}\right) =\mathbf {E}\left( \left( X_{1}X_{2}\right) ^{n}\right)\) and using Eq. (2) for \(q_{1}=q_{2}=n\). \(\square\)
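Formula (6) is likewise direct to evaluate: the \(_{3}F_{2}\) series converges at the unit argument because the sum of lower parameters exceeds the sum of upper parameters by \(\theta _{3}>0\). A sketch with mpmath (the function name `product_moment` is ours); note that \(n=0\) must return one, which gives a useful sanity check.

```python
import mpmath as mp

def product_moment(n, t1, t2, t3):
    """E(P^n) for P = X1 X2, from Eq. (6)."""
    s = t1 + t2 + t3
    const = (mp.gamma(s) / (mp.gamma(t1) * mp.gamma(t2) * mp.gamma(t3))
             * mp.beta(t1 + n, t2 + t3) * mp.beta(t2 + n, t1 + t3))
    return const * mp.hyp3f2(t1 + n, t2 + n, s, s + n, s + n, 1)

m0 = product_moment(0, 2, 2, 2)   # total mass, must equal 1
m1 = product_moment(1, 2, 2, 2)   # E(P)
m2 = product_moment(2, 2, 2, 2)   # E(P^2); smaller than E(P) since 0 < P < 1
```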

3.3 Approximation

It is clear that the random variable P has support in the interval \(\left[ 0,1\right]\). It is therefore natural to approximate its distribution by the beta distribution with parameters \(a,b>0\) (say \(P\overset{d}{\sim }\) \(\beta (a,b))\) and density function
$$\begin{aligned} f_{P}\left( y\right) =\frac{\Gamma \left( a+b\right) }{\Gamma \left( a\right) \Gamma \left( b\right) }y^{a-1}\left( 1-y\right) ^{b-1}. \end{aligned}$$
(9)
The idea of approximating distributions that involve complicated formulae by the beta distribution is not new and is well established in the statistics literature, going back to the work of Das Gupta (1968) (see also Sculli and Wong 1985; Fan 1991; Johannesson and Giri 1995). Note that Nadarajah and Kotz (2004) recently proposed a single beta distribution of the form (9) for correlated beta random variables. The purpose is not that the calculation of Eq. (3) cannot be handled; it is rather to give a simple approximation in terms of the beta distribution so that one can use the known procedures for inference, prediction, etc. Furthermore, the approximation may be especially useful to practitioners of the bivariate beta distribution, since it avoids the Gauss hypergeometric function and since the beta distribution is widely available in standard statistical packages.
The choice of the beta parameters a and b is made using the method of moments. The first two moments of P can be written as
$$\begin{aligned} \mathbf {E}\left( P\right) =\frac{a}{a+b}\text { and }\mathbf {E}\left( P^{2}\right) =\frac{a\left( a+1\right) }{\left( a+b\right) \left( a+b+1\right) }. \end{aligned}$$
After some algebraic manipulation, it is straightforward to find that
$$\begin{aligned} a=\mathbf {E}\left( P\right) \frac{\mathbf {E}\left( P\right) -\mathbf {E} \left( P^{2}\right) }{\mathbf {E}\left( P^{2}\right) -\mathbf {E}^{2}\left( P\right) } \end{aligned}$$
(10)
and
$$\begin{aligned} b=\left( 1-\mathbf {E}\left( P\right) \right) \frac{\mathbf {E}\left( P\right) -\mathbf {E}\left( P^{2}\right) }{\mathbf {E}\left( P^{2}\right) -\mathbf {E} ^{2}\left( P\right) }. \end{aligned}$$
(11)
The two moments \(\mathbf {E}\left( P\right)\) and \(\mathbf {E}\left( P^{2}\right)\) are given by (7) and (8), respectively.
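Inverting the two beta moment equations is pure algebra; the following sketch (the function name `beta_from_moments` is ours, and the moment values are illustrative rather than computed from (7) and (8)) shows the fit and verifies the exact round trip.

```python
def beta_from_moments(m1, m2):
    """Method-of-moments beta parameters from E(P) = m1, E(P^2) = m2,
    via Eqs. (10) and (11).  Requires m1**2 < m2 < m1, i.e. positive
    variance and support in (0, 1)."""
    scale = (m1 - m2) / (m2 - m1 * m1)
    return m1 * scale, (1.0 - m1) * scale

# Illustrative moment values (not taken from the paper's tables):
a, b = beta_from_moments(0.3, 0.12)

# Round trip: the fitted beta reproduces both moments.
m1_check = a / (a + b)
m2_check = a * (a + 1) / ((a + b) * (a + b + 1))
```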
Table 1

Estimates of (a, b) for selected \(\left( \theta _{1},\theta _{2},\theta _{3}\right)\)

  \(\theta _{1}\)   \(\theta _{2}\)   \(\theta _{3}\)   a        b
  0.5              0.5              2                0.1718   3.2541
  0.5              2                2                0.3567   2.7357
  2                2                2                0.9889   2.6127
  4                2                2                1.3575   2.4623
  4                4                2                2.0834   2.4008
  4                2                4                1.2703   5.8306
  4                4                4                1.9729   5.5056
  6                6                6                2.9642   8.4547

To show the robustness of the approximation, we selected eight values of the parameters \(\left( \theta _{1},\theta _{2},\theta _{3}\right)\) and computed the corresponding estimates of a and b using (10) and (11); the selected parameters and the resulting estimates are shown in Table 1. Next, we checked robustness by comparing the exact and the approximated density functions of P, as given by (3) and (9). These comparisons are illustrated in Figs. 1 and 2. The approximation is quite good, although we remark that for case (3) in Fig. 1 the solid curve is unimodal while the broken curve is monotone decreasing. Similar findings were noted when this exercise was repeated for many other combinations of \(\left( \theta _{1},\theta _{2},\theta _{3}\right)\). The numerical results were obtained using Python with the hyp2f1 and hyp3f2 functions of mpmath 0.14.
Fig. 1

The exact density function of P (solid curve) and its approximated function (broken curve) for (1): \(\left( \theta _{1},\theta _{2},\theta _{3}\right) =(0.5,0.5,2);\) (2): \(\left( \theta _{1},\theta _{2},\theta _{3}\right) =(0.5,2,2);\) (3): \(\left( \theta _{1},\theta _{2},\theta _{3}\right) =(2,2,2);\) (4): \(\left( \theta _{1},\theta _{2},\theta _{3}\right) =(4,2,2)\)

Fig. 2

The exact density function of P (solid curve) and its approximated function (broken curve) for (1): \(\left( \theta _{1},\theta _{2},\theta _{3}\right) =(4,4,2);\) (2): \(\left( \theta _{1},\theta _{2}, \theta _{3}\right) =(4,2,4);\) (3): \(\left( \theta _{1}, \theta _{2},\theta _{3}\right) =(4,4,4);\) (4): \(\left( \theta _{1},\theta _{2},\theta _{3}\right) =(6,6,6)\)


Acknowledgements

The author would like to thank the coordinating editor and the referee for carefully reading the paper and for their comments which greatly improved the paper.

References

  1. A-Grivas, D., & Asaoka, A. (1982). Slope safety prediction under static and seismic loads. Journal of Geotechnical and Geoenvironmental Engineering, 108, 713–729.
  2. Apostolakis, F. J., & Moieni, P. (1987). The foundations of models of dependence in probabilistic safety assessment. Reliability Engineering, 18, 177–195.
  3. Chatfield, C. (1975). A marketing application of a characterization theorem. Statistical Distributions in Scientific Work, 2, 175–185.
  4. Cigizoglu, H. K., & Bayazit, M. (2000). A generalized seasonal model for flow duration curve. Hydrological Processes, 14, 1053–1067.
  5. Das Gupta, P. (1968). Two approximations for the distribution of double non-central beta. Sankhyā, 30, 83–88.
  6. Fan, D.-Y. (1991). The distribution of the product of independent beta variables. Communications in Statistics-Theory and Methods, 20, 4043–4052.
  7. Feldstein, M. S. (1971). The error of forecast in econometric models when the forecast-period exogenous variables are stochastic. Econometrica, 39, 55–60.
  8. Frisch, U., & Sornette, D. (1997). Extreme deviations and applications. Journal de Physique I France, 7, 1155–1171.
  9. Grubel, H. G. (1968). Internationally diversified portfolios: Welfare gains capital flows. The American Economic Review, 58, 1299–1314.
  10. Gupta, A. K., & Nadarajah, S. (2006). Exact and approximate distributions for the linear combination of inverted Dirichlet components. Journal of the Japan Statistical Society, 36, 225–236.
  11. Hoyer, R. W., & Mayer, L. S. (1976). The equivalence of various objective functions in a stochastic model of electoral competition. Tech. Rep. 114, Series 2, Department of Statistics, Princeton University.
  12. Johannesson, B., & Giri, N. (1995). On approximations involving the beta distribution. Communications in Statistics-Simulation and Computation, 24, 489–503.
  13. Libby, D. L., & Novick, M. R. (1982). Multivariate generalized beta-distributions with applications to utility assessment. Journal of Educational Statistics, 7, 271–294.
  14. Nadarajah, S. (2006). The bivariate \(F_{3}\)-beta distribution. Communications of the Korean Mathematical Society, 21, 363–374.
  15. Nadarajah, S. (2007). A new bivariate beta distribution with application to drought data. Metron, 2, 153–174.
  16. Nadarajah, S. (2008). On the product of generalized Pareto random variables. Applied Economics Letters, 15, 253–259.
  17. Nadarajah, S., & Kotz, S. (2004). Exact and approximate distributions for the product of Dirichlet components. Kybernetika, 40(6), 735–744.
  18. Olkin, I., & Liu, R. (2003). A bivariate beta distribution. Statistics and Probability Letters, 62, 407–412.
  19. Prudnikov, A. P., Brychkov, Y. A., & Marichev, O. I. (1986). Integrals and series (volumes 1 and 3). Amsterdam: Gordon and Breach Science Publishers.
  20. Rugman, A. M. (1979). International diversification and the multinational enterprise. Lexington, Mass: Lexington Books.
  21. Sakamoto, H. (1943). On the distributions of the product and the quotient of the independent and uniformly distributed random variables. Tohoku Mathematical Journal, 49, 243–260.
  22. Sculli, D., & Wong, K. L. (1985). The maximum and sum of two beta variables in the analysis of PERT networks. Omega, 13, 233–240.
  23. Sornette, D. (1998). Multiplicative processes and power laws. Physical Review E, 57, 4811–4813.
  24. Springer, M. D., & Thompson, W. E. (1970). The distribution of products of beta, gamma and Gaussian random variables. SIAM Journal on Applied Mathematics, 18, 721–737.
  25. Tang, J., & Gupta, A. K. (1984). On the distribution of the product of independent beta random variables. Statistics and Probability Letters, 2, 165–168.

Copyright information

© Japanese Federation of Statistical Science Associations 2018

Authors and Affiliations

  1. Laboratoire de Probabilités et Statistique, Faculté des Sciences de Sfax, Université de Sfax, Sfax, Tunisia
