
On robustness of the relative belief ratio and the strength of its evidence with respect to the geometric contamination prior

  • Review Article
  • Published in the Journal of the Korean Statistical Society

Abstract

The relative belief ratio has become a widespread tool in hypothesis testing problems. It measures the statistical evidence that a given statement is true, based on a combination of the data, the model and the prior. In addition, a measure of strength is used to calibrate its value. In this paper, the robustness of the relative belief ratio and its strength to the choice of prior is studied. Specifically, the Gâteaux derivative is used to measure their sensitivity when a geometric contamination prior is used. Examples are presented to illustrate the results.
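To make the objects in the abstract concrete, here is a minimal numerical sketch (illustrative only, not code from the paper). It assumes the definitions used in the appendices, namely \(RB(\theta |x)=f(x|\theta )/m(x)\) and the geometric contamination prior \(\pi _{\epsilon }(\theta )=c(\epsilon )\pi ^{1-\epsilon }(\theta )q^{\epsilon }(\theta )\); the normal model, the base prior \(\pi\), the contaminant \(q\), the observation \(x\) and the grid are all assumed choices made for illustration.

```python
# A minimal sketch (assumptions flagged below, not code from the paper):
# the relative belief ratio RB_eps(theta | x) = f(x | theta) / m_eps(x)
# under the geometric contamination prior pi_eps ~ pi^(1 - eps) * q^eps.
import numpy as np
from scipy.stats import norm
from scipy.integrate import trapezoid

theta = np.linspace(-10.0, 10.0, 4001)  # grid over the parameter space
pi = norm.pdf(theta, 0.0, 1.0)          # assumed base prior pi = N(0, 1)
q = norm.pdf(theta, 1.0, 2.0)           # assumed contaminant q = N(1, 2^2)
x = 0.5                                 # assumed single observation
lik = norm.pdf(x, theta, 1.0)           # f(x | theta) with x | theta ~ N(theta, 1)

def rb(eps):
    """Relative belief ratio under the eps-contaminated prior."""
    g = pi ** (1.0 - eps) * q ** eps
    g /= trapezoid(g, theta)            # c(eps) normalizes pi^(1-eps) * q^eps
    m_eps = trapezoid(g * lik, theta)   # prior predictive m_eps(x)
    return lik / m_eps

i0 = theta.searchsorted(0.5)            # grid index of theta_0 = 0.5
print(rb(0.0)[i0], rb(0.05)[i0])        # RB at theta_0: base prior vs. eps = 0.05
```

Values above (below) 1 indicate evidence in favour of (against) the corresponding value of \(\theta\), which is the calibration that the strength measure studied in the paper refines.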



Acknowledgements

The authors thank the Editor, the Associate Editor and anonymous referees for their important and constructive comments that led to significant improvement of the paper. In particular, the connection with covariance in Proposition 2 is highly appreciated.

Author information

Correspondence to Luai Al-Labadi.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix A Proof of Proposition 1

From (8),

$$\begin{aligned} \lim _{\epsilon \rightarrow 0} \frac{RB_{\epsilon }(\theta |x)-RB(\theta |x)}{\epsilon }&=RB(\theta |x)\lim _{\epsilon \rightarrow 0} \frac{\frac{m(x)}{m_{\epsilon }(x)}-1}{\epsilon }\nonumber \\&=RB(\theta |x)\lim _{\epsilon \rightarrow 0} \frac{m(x)-m_{\epsilon }(x)}{\epsilon m_{\epsilon }(x)} \end{aligned}$$
(9)

Using L’Hôpital’s rule, we get

$$\begin{aligned} (9)&=RB(\theta |x)\lim _{\epsilon \rightarrow 0} \frac{-\frac{d}{d\epsilon }m_{\epsilon }(x)}{\epsilon \frac{d}{d\epsilon }m_{\epsilon }(x)+m_{\epsilon }(x)}. \end{aligned}$$

From (6),

$$\begin{aligned} \frac{d}{d\epsilon }m_{\epsilon }(x)=\int _{\Theta }\frac{d}{d\epsilon }\left\{ c(\epsilon )\pi ^{1-\epsilon }(\theta )q^{\epsilon }(\theta )\right\} f(x|\theta )d\theta . \end{aligned}$$

Let \(h(\epsilon )=c(\epsilon )\pi ^{1-\epsilon }(\theta )q^{\epsilon }(\theta )\). Then

$$\begin{aligned} \log h(\epsilon )=\log c(\epsilon )+(1-\epsilon )\log \pi (\theta )+\epsilon \log q(\theta ). \end{aligned}$$
(10)

It follows that

$$\begin{aligned} \frac{d}{d\epsilon }h(\epsilon )&=\left[ \frac{\frac{d}{d\epsilon }c(\epsilon )}{c(\epsilon )}+ \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \right] c(\epsilon )\pi ^{1-\epsilon }(\theta )q^{\epsilon }(\theta ). \end{aligned}$$

Thus,

$$\begin{aligned} \frac{d}{d\epsilon }m_{\epsilon }(x)&=\int _{\Theta } \frac{d}{d\epsilon }c(\epsilon )\,\pi ^{1-\epsilon }(\theta )q^{\epsilon }(\theta ) f(x|\theta )\,d\theta \\&\quad +\int _{\Theta } \log \left( \frac{q(\theta )}{\pi (\theta )}\right) c(\epsilon ) \pi ^{1-\epsilon }(\theta )q^{\epsilon }(\theta ) f(x|\theta )\,d\theta . \end{aligned}$$

As \(\epsilon \rightarrow 0\) (so that \(c(\epsilon )\pi ^{1-\epsilon }(\theta )q^{\epsilon }(\theta )\rightarrow \pi (\theta )\), since \(c(0)=1\)), we have

$$\begin{aligned} \frac{d}{d\epsilon }m_{\epsilon }(x)&\rightarrow \int _{\Theta } \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \pi (\theta ) f(x|\theta )d\theta \\&= m(x)\int _{\Theta } \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \frac{\pi (\theta ) f(x|\theta )}{m(x)}d\theta \\&= m(x) E_{\pi (\theta |x)}\left[ \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \right] . \end{aligned}$$

Thus,

$$\begin{aligned} (9)= RB(\theta |x)\frac{m(x) E_{\pi (\theta |x)}\left[ \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \right] }{m(x)}= RB(\theta |x)E_{\pi (\theta |x)}\left[ \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \right] . \end{aligned}$$
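A hedged numerical companion to this proof (reusing the illustrative `theta`, `rb` and `i0` defined after the abstract; the step size below is arbitrary): the difference quotient on the left of (9) can be approximated directly by a finite difference in \(\epsilon\), which is one way to probe the Gâteaux derivative of Proposition 1 in examples.

```python
# Finite-difference approximation of the quotient in (9), reusing the
# sketch defined earlier (theta, rb, i0).  eps is an arbitrary small step.
eps = 1e-4
fd = (rb(eps) - rb(0.0)) / eps          # (RB_eps - RB) / eps on the grid
print(fd[i0])                           # approximate sensitivity of RB at theta_0
```

Shrinking `eps` (and refining the grid) should stabilize the printed value toward the Gâteaux derivative.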

Appendix B Proof of Proposition 2

Let \(\Pi _{\epsilon }(\cdot |x)\) be the posterior cumulative distribution function with posterior density \(\pi _{\epsilon }(\cdot |x)\). By (7) and (8),

$$\begin{aligned}&\lim _{\epsilon \rightarrow 0}\left\lbrace \frac{\Pi _{\epsilon }\left( RB_{\epsilon }(\theta |x)\le RB_{\epsilon }(\theta _{0}|x)|x\right) -\Pi \left( RB(\theta |x)\le RB(\theta _{0}|x)|x\right) }{\epsilon }\right\rbrace \nonumber \\ &=\lim _{\epsilon \rightarrow 0}\left\lbrace \frac{\Pi _{\epsilon }\left( RB(\theta |x)\frac{m(x)}{m_{\epsilon }(x)}\le RB(\theta _{0}|x)\frac{m(x)}{m_{\epsilon }(x)}|x\right) -\Pi \left( RB(\theta |x)\frac{m(x)}{m_{\epsilon }(x)}\le RB(\theta _{0}|x)\frac{m(x)}{m_{\epsilon }(x)}|x\right) }{\epsilon }\right\rbrace \nonumber \\ &=\lim _{\epsilon \rightarrow 0}\left\lbrace \frac{\Pi _{\epsilon }\left( RB(\theta |x)\le RB(\theta _{0}|x)|x\right) -\Pi \left( RB(\theta |x)\le RB(\theta _{0}|x)|x\right) }{\epsilon }\right\rbrace \nonumber \\ &=\lim _{\epsilon \rightarrow 0}\left\lbrace \frac{\Pi _{\epsilon }\left( \frac{f_{\theta }(x)}{m(x)}\le \frac{f_{\theta _{0}}(x)}{m(x)}|x\right) -\Pi \left( \frac{f_{\theta }(x)}{m(x)}\le \frac{f_{\theta _{0}}(x)}{m(x)}|x\right) }{\epsilon }\right\rbrace \nonumber \\ &=\lim _{\epsilon \rightarrow 0}\left\lbrace \frac{\Pi _{\epsilon }(f_{\theta }(x)\le f_{\theta _{0}}(x)|x) -\Pi \left( f_{\theta }(x)\le f_{\theta _{0}}(x)|x\right) }{\epsilon }\right\rbrace \nonumber \\ &=I. \end{aligned}$$
(11)

Observe that, with \({\mathscr {M}}=\{\theta \in \Theta : f_{\theta }(x)\le f_{\theta _{0}}(x)\}\),

$$\begin{aligned} \Pi _{\epsilon }\left( f_{\theta }(x)\le f_{\theta _{0}}(x)|x\right) =\int _{{\mathscr {M}}}\frac{c(\epsilon )\pi ^{(1-\epsilon )}(\theta )q^{\epsilon }(\theta )f(x|\theta )}{m_{\epsilon }(x)}\, d\theta . \end{aligned}$$

Therefore,

$$\begin{aligned} I=\lim _{\epsilon \rightarrow 0}\left\lbrace \frac{\int _{{\mathscr {M}}}\frac{c(\epsilon )\pi ^{(1-\epsilon )}(\theta )q^{\epsilon }(\theta )f(x|\theta )}{m_{\epsilon }(x)}\, d\theta -\int _{{\mathscr {M}}}\frac{\pi (\theta )f(x|\theta )}{m(x)}\, d\theta }{\epsilon }\right \rbrace . \end{aligned}$$

Using L’Hôpital’s rule,

$$\begin{aligned} I&=\lim _{\epsilon \rightarrow 0}\frac{\frac{d}{d\epsilon }\left[ \int _{{\mathscr {M}}}\frac{c(\epsilon )\pi ^{(1-\epsilon )}(\theta )q^{\epsilon }(\theta )f(x|\theta )}{m_{\epsilon }(x)}\, d\theta \right] -0}{1}\\&=\lim _{\epsilon \rightarrow 0}\int _{{\mathscr {M}}} \frac{d}{d\epsilon }\left[ \frac{c(\epsilon )\pi ^{(1-\epsilon )}(\theta )q^{\epsilon }(\theta )f(x|\theta )}{m_{\epsilon }(x)}\right] \, d\theta \\&=\lim _{\epsilon \rightarrow 0}\int _{{\mathscr {M}}} f(x|\theta )\frac{d}{d\epsilon }\left[ \frac{h(\epsilon )}{m_{\epsilon }(x)}\right] \, d\theta , \end{aligned}$$

where \(h(\epsilon )=c(\epsilon )\pi ^{(1-\epsilon )}(\theta )q^{\epsilon }(\theta )\). Hence,

$$\begin{aligned} I&=\lim _{\epsilon \rightarrow 0}\int _{{\mathscr {M}}} f(x|\theta ) \frac{m_{\epsilon }(x)\frac{d}{d\epsilon }h(\epsilon )-h(\epsilon )\frac{d}{d\epsilon }m_{\epsilon }(x)}{(m_{\epsilon }(x))^{2}} \, d\theta \\&=\lim _{\epsilon \rightarrow 0}\int _{{\mathscr {M}}} f(x|\theta )\frac{\frac{d}{d\epsilon }h(\epsilon )}{m_{\epsilon }(x)}d\theta -\lim _{\epsilon \rightarrow 0}\int _{{\mathscr {M}}}f(x|\theta )\frac{h(\epsilon )\frac{d}{d\epsilon }m_{\epsilon }(x)}{(m_{\epsilon }(x))^{2}} d\theta . \end{aligned}$$

Applying the expression for \(\frac{d}{d\epsilon }h(\epsilon )\) obtained from (10) in the above equation gives

$$\begin{aligned} I&=\lim _{\epsilon \rightarrow 0}\int _{{\mathscr {M}}}f(x|\theta ) \frac{\left[ \frac{\frac{d}{d\epsilon }c(\epsilon )}{c(\epsilon )}+ \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \right] c(\epsilon )\pi ^{1-\epsilon }(\theta )q^{\epsilon }(\theta )}{m_{\epsilon }(x)}\,d\theta \\&\quad -\lim _{\epsilon \rightarrow 0}\int _{{\mathscr {M}}}f(x|\theta ) \frac{h(\epsilon )\frac{d}{d\epsilon }m_{\epsilon }(x) }{(m_{\epsilon }(x))^{2}}\,d\theta \\&=I_1-I_2. \end{aligned}$$

After some simplification, we obtain

$$\begin{aligned}&I_1 =\lim _{\epsilon \rightarrow 0}\left\{ \int _{{\mathscr {M}}}\frac{f(x|\theta )}{m_{\epsilon }(x)}\pi ^{1-\epsilon }(\theta )q^{\epsilon }(\theta )\frac{d}{d\epsilon }c(\epsilon )\,d\theta \right. \\&\quad \left. +\int _{{\mathscr {M}}}\frac{f(x|\theta )}{m_{\epsilon }(x)} \log \left( \frac{q(\theta )}{\pi (\theta )}\right) c(\epsilon ) \pi ^{1-\epsilon }(\theta )q^{\epsilon }(\theta )\,d\theta \right\} . \end{aligned}$$

As \(\epsilon \rightarrow 0\),

$$\begin{aligned} I_1\rightarrow \int _{{\mathscr {M}}}\log \left( \frac{q(\theta )}{\pi (\theta )}\right) \frac{f(x|\theta )\pi (\theta )}{m(x)}\,d\theta =\int _{{\mathscr {M}}}\log \left( \frac{q(\theta )}{\pi (\theta )}\right) \pi (\theta |x)\,d\theta . \end{aligned}$$

On the other hand, since \(h(\epsilon )\rightarrow \pi (\theta )\) and \(\frac{d}{d\epsilon }m_{\epsilon }(x)\rightarrow m(x) E_{\pi (\theta |x)}\left[ \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \right]\) as \(\epsilon \rightarrow 0\), we have

$$\begin{aligned} I_2&\rightarrow \int _{{\mathscr {M}}}f(x|\theta )\frac{\pi (\theta )m(x) E_{\pi (\theta |x)}\left[ \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \right] }{(m(x))^2}\, d\theta \\&=E_{\pi (\theta |x)}\left[ \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \right] \int _{{\mathscr {M}}}\frac{f(x|\theta )\pi (\theta )}{m(x)}\,d\theta \\&= E_{\pi (\theta |x)}\left[ \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \right] \Pi \left( f_{\theta }(x)\le f_{\theta _{0}}(x)|x\right) \\&= E_{\pi (\theta |x)}\left[ \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \right] \Pi \left( RB(\theta |x)\le RB(\theta _{0}|x)|x\right) . \end{aligned}$$

Thus, as \(\epsilon \rightarrow 0\),

$$\begin{aligned} I&\rightarrow \int _{{\mathscr {M}}}\log \left( \frac{q(\theta )}{\pi (\theta )}\right) \pi (\theta |x)\, d\theta -E_{\pi (\theta |x)}\left[ \log \left( \frac{q(\theta )}{\pi (\theta )}\right) \right] \Pi \left( RB(\theta |x)\le RB(\theta _{0}|x)|x\right) . \end{aligned}$$
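As the acknowledgements point out, this limit is a posterior covariance. Writing \(U=\log \left( q(\theta )/\pi (\theta )\right)\) and recalling \({\mathscr {M}}=\{\theta : RB(\theta |x)\le RB(\theta _{0}|x)\}\), the two terms above are \(E_{\pi (\theta |x)}[U\mathbb {1}_{{\mathscr {M}}}]\) and \(E_{\pi (\theta |x)}[U]\,E_{\pi (\theta |x)}[\mathbb {1}_{{\mathscr {M}}}]\), so

$$\begin{aligned} I\rightarrow \mathrm {Cov}_{\pi (\theta |x)}\left( \log \left( \frac{q(\theta )}{\pi (\theta )}\right) ,\,\mathbb {1}\left\{ RB(\theta |x)\le RB(\theta _{0}|x)\right\} \right) . \end{aligned}$$

A hedged numerical check of this limit, reusing the illustrative grid, densities and `rb` from the earlier sketches (the point \(\theta _{0}\) and the step size are arbitrary choices):

```python
# Finite-difference check of the Appendix B limit against the covariance
# form above (reuses theta, pi, q, lik, rb, i0, trapezoid, np from earlier).
def strength(eps):
    """Posterior probability Pi_eps(RB <= RB(theta_0) | x)."""
    g = pi ** (1.0 - eps) * q ** eps
    g /= trapezoid(g, theta)
    post = g * lik / trapezoid(g * lik, theta)   # posterior pi_eps(theta | x)
    mask = lik <= lik[i0]                        # event M; the same for every eps
    return trapezoid(np.where(mask, post, 0.0), theta)

eps = 1e-4
fd = (strength(eps) - strength(0.0)) / eps       # left-hand side, approximately

post0 = pi * lik / trapezoid(pi * lik, theta)    # base posterior pi(theta | x)
u = np.log(q / pi)
mask = lik <= lik[i0]
cov = (trapezoid(np.where(mask, u * post0, 0.0), theta)
       - trapezoid(u * post0, theta) * trapezoid(np.where(mask, post0, 0.0), theta))
print(fd, cov)                                   # the two should roughly agree
```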


Cite this article

Al-Labadi, L., Asl, F.F. On robustness of the relative belief ratio and the strength of its evidence with respect to the geometric contamination prior. J. Korean Stat. Soc. 51, 961–975 (2022). https://doi.org/10.1007/s42952-022-00170-8
