
Statistical Inference of Burr-XII Distribution Under Adaptive Type II Progressive Censored Schemes with Competing Risks

Abstract

This paper studies adaptive type II progressively censored data under the competing risks model from several angles, including comparison of experimental methods, data analysis, and optimization of the censoring scheme. The existence and uniqueness of the maximum likelihood estimates are established, and approximate confidence intervals are constructed via the Fisher information matrix and the delta method. Bayesian estimates under three loss functions, together with the highest posterior density credible intervals, are obtained via Markov chain Monte Carlo simulation. To improve experimental efficiency through optimized censoring schemes, three optimality criteria are introduced that shorten the test duration while maintaining the amount of data collected. Finally, recommendations for experimental design are given with a view to practical applications.

References

  1. Kundu, D., Joarder, A.: Analysis of type II progressively hybrid censored data. Comput. Stat. Data Anal. 50(10), 2509–2528 (2006)

  2. Ng, H.K.T., Kundu, D., Ping, S.C.: Statistical analysis of exponential lifetimes under an adaptive type-II progressive censoring scheme. Nav. Res. Logist. 56(8), 687–698 (2009)

  3. Elshahhat, A., Nassar, M.: Bayesian survival analysis for adaptive Type-II progressive hybrid censored Hjorth data. Comput. Stat. 36(3), 1965–1990 (2021)

  4. Cui, W., Yan, Z., Peng, X.: Statistical analysis for constant-stress accelerated life test with Weibull distribution under adaptive type-II hybrid censored data. IEEE Access 7(2), 165336–165344 (2019)

  5. Almetwally, E.M., Almongy, H.M., Rastogi, M.K., Ibrahim, M.: Maximum product spacing estimation of Weibull distribution under adaptive type-II progressive censoring schemes. Ann. Data Sci. 7, 257–279 (2020)

  6. Hemmati, F., Khorram, E.: On adaptive progressively type-II censored competing risks data. Commun. Stat. Simul. Comput. 46(6), 4671–4693 (2017)

  7. Hemmati, F., Khorram, E.: Bayesian analysis of the adaptive type-II progressively hybrid censoring scheme in presence of competing risks. In: Proceedings of ICCS-11, Lahore, Pakistan, vol. 21, pp. 181–194 (2011)

  8. Ashour, S.K., Nassar, M.M.A.: Analysis of generalized exponential distribution under adaptive type-II progressive hybrid censored competing risks data. Int. J. Adv. Stat. Prob. 2(2), 108–113 (2014)

  9. Yan, W., Yimin, S., Min, W.: Statistical inference for dependence competing risks model under middle censoring. J. Syst. Eng. Electron. 30(1), 213–226 (2019)

  10. Mousa, M.A.M.A., Jaheen, Z.F.: Statistical inference for the Burr model based on progressively censored data. Comput. Math. Appl. 43(10–11), 1441–1449 (2002)

  11. Soliman, A.A.: Estimation of parameters of life from progressively censored data using Burr-XII model. IEEE Trans. Reliab. 54(1), 34–42 (2005)

  12. Abdel-Hamid, A.H.: Constant-partially accelerated life tests for Burr type-XII distribution with progressive type-II censoring. Comput. Stat. Data Anal. 53(7), 2511–2523 (2009)

  13. Kundu, D., Pradhan, B.: Bayesian inference and life testing plans for generalized exponential distribution. Sci. China 52(6), 1373–1388 (2009)

  14. Nassar, M., Abo-Kasem, O.E.: Estimation of the inverse Weibull parameters under adaptive type-II progressive hybrid censoring scheme. J. Comput. Appl. Math. 315, 228–239 (2017)

  15. Ni, W., Li, G., Zhao, J., Cui, J., Wang, R., Gao, Z., et al.: Use of Monte Carlo simulation to evaluate the efficacy of tigecycline and minocycline for the treatment of pneumonia due to carbapenemase-producing Klebsiella pneumoniae. Infect. Dis. 50, 1–7 (2018)

  16. Ma, Y., Xi, C., Biegler, L.T.: Monte-Carlo-simulation-based optimization for copolymerization processes with embedded chemical composition distribution. Comput. Chem. Eng. 109, 261–275 (2018)

  17. Ko, Y., Kim, J., Rodriguez-Zas, S.L.: Markov chain Monte Carlo simulation of a Bayesian mixture model for gene network inference. Genes Genomics 41, 547–555 (2019)

  18. Panahi, H., Asadi, S.: On adaptive progressive hybrid censored Burr type III distribution: application to the nano droplet dispersion data. Qual. Technol. Quant. Manag. 18(2), 179–201 (2021)

  19. Kamps, U., Cramer, E.: On distributions of generalized order statistics. Stat. J. Theor. Appl. Stat. 35(3), 269–280 (2001)

Funding

The authors’ work was partially supported by the Fundamental Research Funds for the Central Universities (2020YJS183) and the National Statistical Science Research Project of China (No. 2019LZ32).

Author information

Corresponding author

Correspondence to Wenhao Gui.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Proof of Theorem 1

Suppose \(m_k>0\) and that \(\alpha _k, k=1,2\), are fixed. Since \(\ln t\le t-1\) for \(t>0\), taking \(t=\beta _k/\beta _k^{'}\) gives

$$\begin{aligned} m_k\ln \beta _k&=m_k\ln \frac{\beta _k}{\beta _k^{'}}+m_k\ln \beta _k^{'}\le m_k\frac{\beta _k}{\beta _k^{'}}-m_k+m_k\ln \beta _k^{'}\\&=\beta _k(\sum _{i=1}^{J}R_i\ln u_{ki}+\sum _{i=1}^{m}\ln u_{ki}+R_m\ln u_{km})-m_k+m_k\ln \beta _k^{'}, \end{aligned}$$

which means that

$$\begin{aligned} l(&\alpha _1, \alpha _2, \beta _1, \beta _2)\\&=\sum _{k=1}^{2}m_k\ln \alpha _k+\sum _{k=1}^{2}m_k\ln \beta _k +\sum _{k=1}^{2}\sum _{i=1}^{m}\left( -\beta _k\ln u_{ki}+I_k\ln \frac{x_i^{\alpha _k-1}}{u_{ki}}\right) \\&\quad -\sum _{k=1}^{2}(\sum _{i=1}^{J}\beta _kR_i\ln u_{ki}+\beta _kR_m\ln u_{km})+\ln C\\&\le \sum _{k=1}^{2}m_k\ln \alpha _k +\sum _{k=1}^{2}\left( \beta _k(\sum _{i=1}^{J}R_i\ln u_{ki}\right. \\&\quad \left. +\sum _{i=1}^{m}\ln u_{ki}+R_m\ln u_{km})-m_k+m_k\ln \beta _k^{'}\right) \\&\quad +\sum _{k=1}^{2}\sum _{i=1}^{m}\left( -\beta _k\ln u_{ki}+I_k\ln \left( \frac{x_i^{\alpha _k-1}}{u_{ki}}\right) \right) \\&\quad -\sum _{k=1}^{2}\left( \sum _{i=1}^{J}\beta _kR_i\ln u_{ki}+\beta _kR_m\ln u_{km}\right) +\ln C\\&=\sum _{k=1}^{2}m_k\ln \alpha _k+\sum _{k=1}^{2}m_k\ln \beta _k^{'}-\sum _{k=1}^{2}m_k+\sum _{k=1}^{2}\sum _{i=1}^{m}I_k\ln \left( \frac{x_i^{\alpha _k-1}}{u_{ki}}\right) +\ln C. \end{aligned}$$

As \(m_k=\beta _k^{'}(\sum _{i=1}^{J}R_i\ln u_{ki}+\sum _{i=1}^{m}\ln u_{ki}+R_m\ln u_{km})\),

$$\begin{aligned} l(\alpha _1, \alpha _2, \beta _1, \beta _2)&\le \sum _{k=1}^{2}m_k\ln \alpha _k+\sum _{k=1}^{2}m_k\ln \beta _k^{'}\\&\quad +\sum _{k=1}^{2}\sum _{i=1}^{m}\left( -\beta _k^{'}\ln u_{ki}+I_k\ln \frac{x_i^{\alpha _k-1}}{u_{ki}}\right) \\&\quad -\sum _{k=1}^{2}\left( \sum _{i=1}^{J}\beta _k^{'}R_i\ln u_{ki}+\beta _k^{'}R_m\ln u_{km}\right) +\ln C\\&=l(\alpha _1, \alpha _2, \beta _1^{'}, \beta _2^{'}). \end{aligned}$$

Equality holds if and only if \(\beta _1=\beta _1^{'}\) and \(\beta _2=\beta _2^{'}\), which completes the proof of the theorem.
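For fixed \(\alpha _k\), the maximizing value \(\beta _k^{'}\) used above is available in closed form from \(m_k=\beta _k^{'}(\sum _{i=1}^{J}R_i\ln u_{ki}+\sum _{i=1}^{m}\ln u_{ki}+R_m\ln u_{km})\). The following minimal Python sketch evaluates this profile maximizer on toy data; it assumes, consistently with the notation of the appendix, that \(u_{ki}=1+x_i^{\alpha _k}\) and that \(I_{ki}\) indicates a cause-k failure, so that \(m_k=\sum _i I_{ki}\). All numerical values are illustrative only.

import numpy as np

# Toy adaptive progressively censored sample (illustrative values only).
x  = np.array([0.21, 0.35, 0.48, 0.66, 0.90, 1.25, 1.70, 2.10])  # ordered failure times x_1,...,x_m
I1 = np.array([1, 0, 1, 1, 0, 1, 0, 1])                          # cause-1 indicators; cause-2 indicators are 1 - I1
R  = np.array([1, 0, 2, 0, 0, 0, 0, 3])                          # removals R_1,...,R_m (final removal in the last slot)
J  = 4                                                           # number of failures observed before time T

def beta_profile(alpha, x, I_k, R, J):
    """Closed-form beta_k'(alpha_k) solving m_k = beta_k' * (sum_{i<=J} R_i ln u_i + sum_i ln u_i + R_m ln u_m)."""
    u = 1.0 + x ** alpha                      # assumed u_{ki} = 1 + x_i^{alpha_k}
    m_k = I_k.sum()
    denom = np.sum(R[:J] * np.log(u[:J])) + np.sum(np.log(u)) + R[-1] * np.log(u[-1])
    return m_k / denom

print(beta_profile(1.5, x, I1, R, J))       # beta_1'(alpha_1) at alpha_1 = 1.5
print(beta_profile(1.5, x, 1 - I1, R, J))   # beta_2'(alpha_2) at alpha_2 = 1.5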

Proof of Theorem 2

Substituting \(\beta _1^{'}\) and \(\beta _2^{'}\) into (2.8) yields the profile log-likelihood function \(l(\alpha _1,\alpha _2)\) in (B.1). Differentiating with respect to \(\alpha _k\) gives the score equation \(S(\alpha _k)=0\), whose root is the MLE of \(\alpha _k\).

$$\begin{aligned} l(\alpha _1,\alpha _2)&=\sum _{k=1}^{2}m_k\ln \alpha _k-\sum _{k=1}^{2}m_k\ln \left( \sum _{i=1}^{J+m+1}R_i^{'}\ln u_{ki}\right) \nonumber \\&\quad +\sum _{k=1}^{2}\sum _{i=1}^{m}I_k\ln \left( \frac{x_i^{\alpha _k-1}}{u_{ki}}\right) +\ln C. \end{aligned}$$
(B.1)
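Because (B.1) is a sum of terms that depend only on \(\alpha _1\) or only on \(\alpha _2\), each \(\alpha _k\) can be maximized separately. The sketch below is a minimal numerical illustration, reusing the toy data and beta_profile from the sketch in the proof of Theorem 1, dropping the additive constant \(\ln C\), and assuming the identification \(\sum _{i=1}^{J+m+1}R_i^{'}\ln u_{ki}=\sum _{i=1}^{J}R_i\ln u_{ki}+\sum _{i=1}^{m}\ln u_{ki}+R_m\ln u_{km}\) used there.

import numpy as np
from scipy.optimize import minimize_scalar

def profile_loglik_k(alpha, x, I_k, R, J):
    """Cause-k part of the profile log-likelihood (B.1), up to the additive constant ln C."""
    u = 1.0 + x ** alpha
    m_k = I_k.sum()
    S_R = np.sum(R[:J] * np.log(u[:J])) + np.sum(np.log(u)) + R[-1] * np.log(u[-1])  # = sum_i R_i' ln u_{ki}
    return m_k * np.log(alpha) - m_k * np.log(S_R) + np.sum(I_k * ((alpha - 1.0) * np.log(x) - np.log(u)))

# Maximize the cause-1 part over alpha_1, then recover beta_1 from the closed-form profile maximizer.
res = minimize_scalar(lambda a: -profile_loglik_k(a, x, I1, R, J), bounds=(1e-3, 20.0), method="bounded")
alpha1_hat = res.x
beta1_hat = beta_profile(alpha1_hat, x, I1, R, J)
print(alpha1_hat, beta1_hat)   # MLEs of (alpha_1, beta_1) on the toy data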

We now prove the existence of the maximum likelihood estimate of \(\alpha _k\). For \(\alpha _k\rightarrow 0^+, k=1,2\),

$$\begin{aligned} \lim _{\alpha _k\rightarrow 0^+}S_1(\alpha _k)= & {} \frac{1}{m_k}\sum _{i=1}^{m}I_k\lim _{\alpha _k\rightarrow 0^+}\frac{\ln x_i^{\alpha _k}}{1+x_i^{\alpha _k}}=0,\\ \lim _{\alpha _k\rightarrow 0^+}S_2(\alpha _k)= & {} 0, \lim _{\alpha _k\rightarrow 0^+}S_3(\alpha _k)>0, \end{aligned}$$

then

$$\begin{aligned} \lim _{\alpha _k\rightarrow 0^+}S(\alpha _k)=\lim _{\alpha _k\rightarrow 0^+}\frac{m_k}{\alpha _k}\left( 1+S_1(\alpha _k)-\frac{S_2(\alpha _k)}{S_3(\alpha _k)}\right) =+\infty . \end{aligned}$$

For \(\alpha _k\rightarrow +\infty , k=1,2\),

$$\begin{aligned} \lim _{\alpha _k\rightarrow +\infty }\frac{m_k}{\alpha _k}&=0, \quad \lim _{\alpha _k\rightarrow +\infty }\frac{m_k}{\alpha _k}S_1(\alpha _k)=\lim _{\alpha _k\rightarrow +\infty }\sum _{i=1}^{m}I_k\frac{\ln x_i}{1+x_i^{\alpha _k}}=0,\\ \lim _{\alpha _k\rightarrow +\infty }\frac{m_k}{\alpha _k}\frac{S_2(\alpha _k)}{S_3(\alpha _k)}&>0, \end{aligned}$$

then

$$\begin{aligned} \lim _{\alpha _k\rightarrow +\infty }S(\alpha _k)=\lim _{\alpha _k\rightarrow +\infty }\frac{m_k}{\alpha _k}+\frac{m_k}{\alpha _k}S_1(\alpha _k)-\frac{m_k}{\alpha _k}\frac{S_2(\alpha _k)}{S_3(\alpha _k)}<0. \end{aligned}$$

Let

$$\begin{aligned} S(\alpha _k)&=\frac{m_k}{\alpha _k}+S(\alpha _k)^{'}+\sum _{i=1}^{m}\frac{I_k\ln x_i}{1+x_i^{\alpha _k}}, \\ S(\alpha _k)^{'}&=-m_k\frac{\sum _{i=1}^{J+m+1}R_i^{'}\frac{x_i^{\alpha _k}\ln x_i}{1+x_i^{\alpha _k}}}{\sum _{i=1}^{J+m+1}R_i^{'}\ln (1+x_i^{\alpha _k}) }. \end{aligned}$$

Obviously, \(\frac{m_k}{\alpha _k}\) and \(\sum _{i=1}^{m}\frac{I_k\ln x_i}{1+x_i^{\alpha _k}}\) in \(S(\alpha _k)\) decrease monotonically with \(\alpha _k\). The monotonicity of \(S(\alpha _k)^{'}\) will be proved below.

$$\begin{aligned} \frac{dS(\alpha _k)^{'}}{d\alpha _k}&=-\frac{m_kH(\alpha _k)}{(\sum _{i=1}^{J+m+1}R_i^{'}\ln (1+x_i^{\alpha _k}))^2}\\ H(\alpha _k)&=\sum _{i=1}^{J+m+1}R_i^{'}\frac{x_i^{\alpha _k}\ln ^2 x_i}{(1+x_i^{\alpha _k})^2}\sum _{i=1}^{J+m+1}R_i^{'}\ln (1+x_i^{\alpha _k})\\&\quad -\left( \sum _{i=1}^{J+m+1}R_i^{'}\frac{x_i^{\alpha _k}\ln x_i}{1+x_i^{\alpha _k}}\right) ^2\\&\ge \sum _{i=1}^{J+m+1}R_i^{'}\frac{x_i^{\alpha _k}\ln ^2 x_i}{(1+x_i^{\alpha _k})^2}\sum _{i=1}^{J+m+1}R_i^{'} x_i^{\alpha _k}\\&\quad -\left( \sum _{i=1}^{J+m+1}R_i^{'}\frac{x_i^{\alpha _k}\ln x_i}{1+x_i^{\alpha _k}}\right) ^2>0 \end{aligned}$$

The last step follows from the Cauchy-Schwarz inequality. Hence \(S(\alpha _k)^{'}\) also decreases monotonically with \(\alpha _k\). Thus, \(S(\alpha _k)\) decreases monotonically from positive to negative values, which proves the existence and uniqueness of the maximum likelihood estimate.
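The sign-change argument can be checked numerically. With the same toy data as before and the decomposition of \(S(\alpha _k)\) above, the profile score is positive near zero, negative for large \(\alpha _k\), and has a single root, which coincides with the maximizer of (B.1). A minimal sketch, reusing the arrays from the earlier sketches:

import numpy as np
from scipy.optimize import brentq

def score_k(alpha, x, I_k, R, J):
    """Profile score S(alpha_k) = m_k/alpha_k + S(alpha_k)' + sum_i I_k ln x_i / (1 + x_i^alpha_k)."""
    u = 1.0 + x ** alpha
    m_k = I_k.sum()
    # Weighted sums with the weights R_i' of (B.1): R_i for i <= J, 1 for every failure, R_m at the end.
    def wsum(vals):
        return np.sum(R[:J] * vals[:J]) + np.sum(vals) + R[-1] * vals[-1]
    S_prime = -m_k * wsum((u - 1.0) * np.log(x) / u) / wsum(np.log(u))   # note x_i^alpha = u_i - 1
    return m_k / alpha + S_prime + np.sum(I_k * np.log(x) / u)

print(score_k(1e-3, x, I1, R, J) > 0, score_k(20.0, x, I1, R, J) < 0)    # sign change on (0, 20)
alpha1_root = brentq(score_k, 1e-3, 20.0, args=(x, I1, R, J))
print(alpha1_root)   # agrees with the maximizer of (B.1) found above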

Generalized Order Statistics of Burr-XII Distribution

Following [19], some results on generalized order statistics are presented.

$$\begin{aligned} f_{X_{j:m:n}}(x)&=C_{j-1}f(x)\sum _{i=1}^{j}a_{i,j}(1-F(x))^{\gamma _i-1}\\ F_{X_{j:m:n}}(x)&=1-C_{j-1}\sum _{i=1}^{j}\frac{a_{i,j}}{\gamma _i}(1-F(x))^{\gamma _i}\quad \end{aligned}$$

where \(\gamma _j=n-j+1-\sum _{i=1}^{j-1}R_i\), \(C_{j-1}=\prod _{i=1}^{j}\gamma _i\), \(a_{i,j}=\prod _{k=1, k\ne i}^{j}\frac{1}{\gamma _k-\gamma _i}\), \(1\le i\le j\le m\).
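For concreteness, the constants \(\gamma _i\), \(C_{j-1}\) and \(a_{i,j}\) can be computed directly from a censoring scheme. A small self-contained Python helper (illustrative; it relies on the \(\gamma _i\) being distinct, which holds because \(\gamma _{i+1}=\gamma _i-1-R_i<\gamma _i\)):

import numpy as np

def gos_constants(n, R, j):
    """gamma_1,...,gamma_j, C_{j-1} and a_{1,j},...,a_{j,j} as defined above."""
    gam = np.array([n - i + 1 - np.sum(R[: i - 1]) for i in range(1, j + 1)], dtype=float)
    C = np.prod(gam)                                                   # C_{j-1} = prod_{i=1}^{j} gamma_i
    a = np.array([np.prod([1.0 / (gam[k] - gam[i]) for k in range(j) if k != i]) for i in range(j)])
    return gam, C, a

R = np.array([1, 0, 2, 0, 0, 0, 0, 3])    # toy removal scheme with m = 8, n = m + sum(R) = 14
gam, C, a = gos_constants(n=14, R=R, j=3)
print(gam, C, a)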

For the Burr-XII distribution, the pdf and cdf of \(X_{j:m:n}\) are expressed as follows.

$$\begin{aligned} f_{X_{j:m:n}}(x)&=C_{j-1}\alpha _k\beta _k\sum _{i=1}^{j}a_{i,j}x^{\alpha _k-1}(1+x^{\alpha _k})^{-(\beta _k\gamma _i+1)}\nonumber \\ F_{X_{j:m:n}}(x)&=1-C_{j-1}\sum _{i=1}^{j}\frac{a_{i,j}}{\gamma _i}(1+x^{\alpha _k})^{-\beta _k\gamma _i} \end{aligned}$$
(C.1)
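A quick numerical consistency check of (C.1), reusing gos_constants and the toy scheme R from the sketch above: integrating the pdf from zero to a point should reproduce the cdf at that point. Parameter values are illustrative only.

import numpy as np
from scipy.integrate import quad

def burr_gos_pdf(x, alpha, beta, n, R, j):
    gam, C, a = gos_constants(n, R, j)
    return C * alpha * beta * np.sum(a * x ** (alpha - 1.0) * (1.0 + x ** alpha) ** (-(beta * gam + 1.0)))

def burr_gos_cdf(x, alpha, beta, n, R, j):
    gam, C, a = gos_constants(n, R, j)
    return 1.0 - C * np.sum(a / gam * (1.0 + x ** alpha) ** (-beta * gam))

alpha, beta, n, j = 1.5, 0.8, 14, 3
integral, _ = quad(burr_gos_pdf, 0.0, 1.2, args=(alpha, beta, n, R, j))
print(integral, burr_gos_cdf(1.2, alpha, beta, n, R, j))   # the two values should agree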

Expectations of the kth Moment of \(X_{m:m:n}\) Given \(J=j\) and \(J=m\)

When \(X_{j:m:n}=x_{j:m:n}\) is known, \(X_{j+1:m:n}\) can be regarded as the first order statistic of a random sample of size \(\gamma _{j+1}=n-j-\sum _{i=1}^{j}R_i\) from a truncated distribution. The conditional distribution of generalized order statistics follows from “Appendix C”.

$$\begin{aligned} f_{X_s|X_r}(x_s|x_r)&=\frac{C_{s-1}}{C_{r-1}} \sum _{i=r+1}^{s}a_{i,s}\left( \frac{1-F(x_s)}{1-F(x_r)}\right) ^{\gamma _i}\\&\quad \frac{f(x_s)}{1-F(x_s)},\quad x_r\le x_s,\ 1\le r<s\le n \end{aligned}$$

When the underlying lifetimes follow the Burr-XII distribution, the conditional probability density function and conditional distribution function of \(X_{j+1:m:n}\) given \(X_{j:m:n}=x_j\) are as follows.

$$\begin{aligned} f_{X_{j+1}|X_j}(x|x_j)&=\gamma _{j+1}(1+x_j^{\alpha _k})^{\beta _k\gamma _{j+1}}\alpha _k\beta _kx^{\alpha _k-1}(1+x^{\alpha _k})^{-(\beta _k\gamma _{j+1}+1)}\nonumber \\ F_{X_{j+1}|X_j}(x|x_j)&=1-(1+x_j^{\alpha _k})^{\beta _k\gamma _{j+1}}(1+x^{\alpha _k})^{-(\beta _k\gamma _{j+1})} \end{aligned}$$
(D.1)
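Because the conditional cdf in (D.1) has a closed-form inverse, \(X_{j+1:m:n}\) given \(X_{j:m:n}=x_j\) can be simulated by inverse transform sampling: solving \(u=F_{X_{j+1}|X_j}(x|x_j)\) gives \(x=\left[ (1+x_j^{\alpha _k})(1-u)^{-1/(\beta _k\gamma _{j+1})}-1\right] ^{1/\alpha _k}\). A self-contained sketch with illustrative parameter values:

import numpy as np

rng = np.random.default_rng(0)

def sample_next_failure(x_j, alpha, beta, gamma_next, size):
    """Draw X_{j+1:m:n} given X_{j:m:n} = x_j by inverting the conditional cdf (D.1)."""
    u = rng.uniform(size=size)
    return ((1.0 + x_j ** alpha) * (1.0 - u) ** (-1.0 / (beta * gamma_next)) - 1.0) ** (1.0 / alpha)

x_j, alpha, beta, gamma_next = 0.9, 1.5, 0.8, 6.0
draws = sample_next_failure(x_j, alpha, beta, gamma_next, size=100_000)
t = 1.4
empirical = np.mean(draws <= t)
theoretical = 1.0 - (1.0 + x_j ** alpha) ** (beta * gamma_next) * (1.0 + t ** alpha) ** (-beta * gamma_next)
print(empirical, theoretical)   # Monte Carlo estimate vs. (D.1); the two should be close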

In addition, the probabilities of the events \(J=0\) and \(J=m\) can be calculated directly.

$$\begin{aligned} P(J=0)&=P(X_{1:m:n}>T)=(1+T^{\alpha _k})^{-n\beta _k}\\ P(J=m)&=F_{X_{m:m:n}}(T)=1-C_{m-1}\sum _{i=1}^{m}\frac{a_{i,m}}{\gamma _i}(1+T^{\alpha _k})^{-\beta _k\gamma _i}. \end{aligned}$$

Then, the probability mass function of J, \(J=1, 2, \ldots , m-1\), can be computed with (C.1) and (D.1). Let \(\gamma _{m+1}\equiv 0\) and \(C_{0}\equiv 1\).

$$\begin{aligned} P(J=j)&=P(X_{j:m:n}<T\le X_{j+1:m:n})\\&=\int _{0}^{\infty }P(x<T\le X_{j+1:m:n}|X_{j:m:n}=x)f_{X_{j:m:n}}(x)dx\\&=\int _{0}^{T}[1-F_{X_{j+1:m:n}|X_{j:m:n}}(T|x_j)]f_{X_{j:m:n}}(x)dx\\&=C_{j-1}(1+x_j^{\alpha _k})^{\beta _k\gamma _{j+1}}\sum _{i=1}^{m}\frac{a_{i,j}}{\gamma _i+\gamma _{j+1}}[1-(1+T^{\alpha _k})^{-(\beta _k\gamma _i+\beta _k\gamma _{j+1})}] \end{aligned}$$

And the kth moment of \(X_{m:m:n}\) is

$$\begin{aligned} E(X^k_{m:m:n})=\displaystyle \sum _{j=1}^{m}P(J=j)E(X^k_{m:m:n}|J=j). \end{aligned}$$
(D.2)

Suppose \(Y_{r,s}\) is the rth order statistic of a random sample of size s from the Burr-XII distribution left-truncated at time T. Let \(s=n-j-\sum _{i=1}^{j}R_i\) and \(r=m-j\). Then the pdf g(x) and cdf G(x) of this truncated parent distribution are:

$$\begin{aligned} g(x)&=\frac{f(x; \alpha , \beta )}{1-F(T; \alpha , \beta )} =\frac{\alpha _k\beta _kx^{\alpha _k-1}(1+x^{\alpha _k})^{-(1+\beta _k)}}{(1+T^{\alpha _k})^{-\beta _k}},\quad T<x<\infty ,\\ G(x)&=\frac{F(x; \alpha , \beta )-F(T; \alpha , \beta )}{1-F(T; \alpha , \beta )}=1-\frac{(1+x^{\alpha _k})^{-\beta _k}}{(1+T^{\alpha _k})^{-\beta _k}},\quad T<x<\infty . \end{aligned}$$
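As a sanity check, the truncated density g integrates to one over \((T,\infty )\). A short self-contained sketch with illustrative parameter values:

import numpy as np
from scipy.integrate import quad

def g_trunc(x, alpha, beta, T):
    """pdf of the Burr-XII distribution left-truncated at T."""
    return alpha * beta * x ** (alpha - 1.0) * (1.0 + x ** alpha) ** (-(1.0 + beta)) / (1.0 + T ** alpha) ** (-beta)

alpha, beta, T = 1.5, 2.0, 1.0
total, _ = quad(g_trunc, T, np.inf, args=(alpha, beta, T))
print(total)   # should be close to 1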

Let Z follow the Burr-XII distribution with parameters \(\alpha _k\) and \(\beta _ks\), that is, \(Z\sim Burr(\alpha _k, \beta _k s)\), and denote its density by \(f_Z\). The density function and expectation of \(Y_{1,s}\) are:

$$\begin{aligned} f(Y_{1,s})&=s\cdot g(y)(1-G(y))^{s-1}\\&=s\alpha _k\beta _ky^{\alpha _k-1}(1+y^{\alpha _k})^{-\beta _ks-1}(1+T^{\alpha _k})^{\beta _ks}\\&=(1+T^{\alpha _k})^{\beta _ks}\cdot f_Z(y),\\ E(Y_{1,s})&=(1+T^{\alpha _k})^{\beta _ks} E(Z)=(1+T^{\alpha _k})^{\beta _ks} \beta _ks\cdot B\left( \beta _ks-\frac{1}{\alpha _k},1+\frac{1}{\alpha _k}\right) , \end{aligned}$$

where \(B(\cdot ,\cdot )\) is the beta function. Similarly, the expectation of \(Y_{r,s}\) and the conditional expectation \(E(X_{m:m:n}|J=j)\) can be obtained.

$$\begin{aligned} E(Y_{2,s})&=E(Y_{1,s})+E(Y_{1,s-1})\\&=(1+T^{\alpha _k})^{\beta _k(s-1)}\beta _k(s-1)\cdot \\&\quad B\left( \beta _k(s-1)-\frac{1}{\alpha _k},1+\frac{1}{\alpha _k}\right) +E(Y_{1,s})\\ E(Y_{r,s})&=\beta _k\sum _{l=s-r+1}^{s}l(1+T^{\alpha _k})^{\beta _kl}\cdot B\left( \beta _kl-\frac{1}{\alpha _k},1+\frac{1}{\alpha _k}\right) \\ E(X_{m:m:n}|J=j)&=T+E(Y_{r,s})=T\\&\quad +\beta _k\sum _{l=s-r+1}^{s}l(1+T^{\alpha _k})^{\beta _kl}\cdot B\left( \beta _kl-\frac{1}{\alpha _k},1+\frac{1}{\alpha _k}\right) \end{aligned}$$
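The beta-function expressions above rest on the Burr-XII moment identity \(E(Z)=\beta _ks\cdot B\left( \beta _ks-\frac{1}{\alpha _k},1+\frac{1}{\alpha _k}\right) \) for \(Z\sim Burr(\alpha _k,\beta _ks)\), which requires \(\beta _ks>1/\alpha _k\). A short self-contained numerical verification of that identity, with illustrative parameter values:

import numpy as np
from scipy.integrate import quad
from scipy.special import beta as beta_fn

def burr_pdf(z, alpha, c):
    """pdf of a Burr-XII variable with survival function (1 + z^alpha)^(-c)."""
    return alpha * c * z ** (alpha - 1.0) * (1.0 + z ** alpha) ** (-(c + 1.0))

alpha_k, c = 1.5, 0.8 * 5           # c plays the role of beta_k * s
numeric, _ = quad(lambda z: z * burr_pdf(z, alpha_k, c), 0.0, np.inf)
closed_form = c * beta_fn(c - 1.0 / alpha_k, 1.0 + 1.0 / alpha_k)
print(numeric, closed_form)         # the two values should agree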

To evaluate (D.2), the conditional expectation \(E(X^k_{m:m:n}|J=j)\) for \(J = j\) is

$$\begin{aligned} E(X^k_{m:m:n}|J=j)&=\int _{T}^{\infty }x^kf(Y_{r,s})dx\\&=\int _{T}^{\infty }x^k\frac{s!}{(s-r)!}\alpha _k\beta _kx^{\alpha _k-1}\\&\quad \displaystyle \sum _{h=0}^{r-1}\frac{(-1)^h(1+x^{\alpha _k})^{-\beta _k(s-r+h+1)-1}}{h!(r-h-1)!(1+T^{\alpha _k})^{-\beta _k(s-r+h+1)}} dx\\&=\frac{s!}{(s-r)!}\alpha _k\beta _k\displaystyle \sum _{h=0}^{r-1}\frac{(-1)^h}{h!(r-h-1)!}(1+T^{\alpha _k})^{\beta _k(s-r+h+1)}\\&\quad \times \int _{T}^{\infty }x^{\alpha _k+k-1}(1+x^{\alpha _k})^{-\beta _k(s-r+h+1)-1}dx, \end{aligned}$$

and for \(J = m\),

$$\begin{aligned} E(X^k_{m:m:n}|J=m)&=\int _{0}^{T}\frac{x^kf_{X_{m:m:n}}(x)}{P(J=m)}dx\\\nonumber&=\frac{C_{m-1}\alpha _k\beta _k\sum _{i=1}^{m}a_{i,m}\int _{0}^{T}x^{\alpha _k+k-1}(1+x^{\alpha _k})^{-\beta _k\gamma _i-1}dx}{1-C_{m-1}\sum _{i=1}^{m}\frac{a_{i,m}}{\gamma _i}(1+T^{\alpha _k})^{-\beta _k\gamma _i}}. \end{aligned}$$
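The conditional moments above reduce to one-dimensional integrals that are straightforward to evaluate numerically. As an illustration, the last display, \(E(X^k_{m:m:n}|J=m)\), can be computed with the gos_constants helper and the toy scheme R from the sketch in “Appendix C”, together with quadrature; all values are illustrative only.

import numpy as np
from scipy.integrate import quad

def cond_moment_given_J_m(k, alpha, beta, n, R, m, T):
    """E(X_{m:m:n}^k | J = m) by numerical quadrature of the last display."""
    gam, C, a = gos_constants(n, R, m)
    p_J_m = 1.0 - C * np.sum(a / gam * (1.0 + T ** alpha) ** (-beta * gam))        # P(J = m)
    integrals = np.array([
        quad(lambda x: x ** (alpha + k - 1.0) * (1.0 + x ** alpha) ** (-beta * g - 1.0), 0.0, T)[0]
        for g in gam
    ])
    return C * alpha * beta * np.sum(a * integrals) / p_J_m

print(cond_moment_given_J_m(k=1, alpha=1.5, beta=0.8, n=14, R=R, m=8, T=1.2))      # lies between 0 and T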

Cite this article

Du, Y., Gui, W. Statistical Inference of Burr-XII Distribution Under Adaptive Type II Progressive Censored Schemes with Competing Risks. Results Math 77, 81 (2022). https://doi.org/10.1007/s00025-022-01617-4
