
Assessing the robustness of estimators when fitting Poisson inverse Gaussian models

Published in: Metrika

Abstract

The generalized linear mixed model (GLMM) extends classical regression analysis to non-normal, correlated response data. Because inference for GLMMs can be computationally difficult, simplifying distributional assumptions are often made. We focus on the robustness of estimators when a main component of the model, the random effects distribution, is misspecified. Results for the maximum likelihood estimators of the Poisson inverse Gaussian model are presented.

[Figures 1–3 appear in the full article.]


References

  • Abramowitz M, Stegun I (eds) (1972) Handbook of mathematical functions. Dover, New York
  • Bickel PJ (1984) Robust regression based on infinitesimal neighbourhoods. Ann Stat 12:1349–1368
  • Bickel PJ, Doksum KA (2001) Mathematical statistics: basic ideas and selected topics. Prentice Hall, Upper Saddle River
  • de Bruijn NG (1953) The difference-differential equation \(f^\prime (x) = e^{\alpha x + \beta } f(x-1)\), I, II. Indag Math 15:449–458, 459–464
  • Dean CB, Nielsen JD (2007) Generalized linear mixed models: a review and some extensions. Lifetime Data Anal 13(4):497–512
  • Dean C, Lawless JF, Willmot GE (1989) A mixed Poisson-inverse-Gaussian regression model. Canad J Stat 17:171–181
  • Feller W (1966) An introduction to probability theory and its applications, vol 2. Wiley, New York
  • Gustafson P (1996) The effect of mixing-distribution misspecification in conjugate mixture models. Canad J Stat 24:307–318
  • Hampel FR (1974) The influence curve and its role in robust estimation. J Am Stat Assoc 69:383–393
  • Hampel FR, Ronchetti EM, Rousseeuw PJ, Stahel WA (1986) Robust statistics: the approach based on influence functions. Wiley, New York
  • Heagerty PJ, Zeger SL (2000) Marginalized multilevel models and likelihood inference (with comments and a rejoinder by the authors). Stat Sci 15(1):1–26
  • Heckman J, Singer B (1984) A method for minimizing the impact of distributional assumptions in econometric models for duration data. Econometrica 52(2):271–320
  • Hilbe JM (2014) Modeling count data. Cambridge University Press, New York
  • Hougaard P, Lee ML, Whitmore GA (1997) Analysis of overdispersed count data by mixtures of Poisson variables and Poisson processes. Biometrics 53:1225–1238
  • Huber PJ (1981) Robust statistics. Wiley, New York
  • Karlis D, Xekalaki E (2005) Mixed Poisson distributions. Int Stat Rev 73(1):35–58
  • Litière S, Alonso A, Molenberghs G (2008) The impact of a misspecified random-effects distribution on the estimation and the performance of inferential procedures in generalized linear mixed models. Stat Med 27(16):3125–3144
  • McCulloch CE, Neuhaus JM (2013) Misspecifying the shape of a random effects distribution: why getting it wrong may not matter. Biostatistics 14:477–490
  • McCulloch CE, Searle SR, Neuhaus JM (2008) Generalized, linear and mixed models. Wiley, New York
  • Neuhaus JM, Hauck WW, Kalbfleisch JD (1992) The effects of mixture distribution misspecification when fitting mixed-effects logistic models. Biometrika 79:755–762
  • Ong SH (1998) A note on the mixed Poisson formulation of the Poisson-inverse Gaussian distribution. Commun Stat Simul Comput 27(1):67–78
  • R Core Team (2017) R: a language and environment for statistical computing. R Foundation for Statistical Computing, Vienna. https://www.R-project.org
  • Rieder H (1994) Robust asymptotic statistics. Springer, New York
  • Seshadri V (1993) The inverse Gaussian distribution: a case study in exponential families. Clarendon, Oxford
  • Seshadri V (1999) The inverse Gaussian distribution: statistical theory and applications. Springer, New York
  • Shaban SA (1981) Computation of the Poisson-inverse Gaussian distribution. Commun Stat Theory Methods 10(14):1389–1399
  • Shoukri MM, Asyali MH, VanDorp R, Kelton DF (2004) The Poisson inverse Gaussian regression model in the analysis of clustered counts data. J Data Sci 2:17–32
  • Stasinopoulos MD, Rigby RA, Heller GZ, Voudouris V, De Bastiani F (2017) Flexible regression and smoothing: using GAMLSS in R. Chapman and Hall/CRC, London
  • Teugels JL, Willmot G (1987) Approximations for stop-loss premiums. Insur Math Econ 6:195–202
  • Van de Ven R, Weber NC (1995) Log-linear models for mean and dispersion in mixed Poisson regression models. Aust J Stat 37(2):205–216
  • Verbeke G, Molenberghs G (2013) The gradient function as an exploratory goodness-of-fit assessment of the random-effects distribution in mixed models. Biostatistics 14:477–490
  • Weems KS, Smith PJ (2004) On robustness of maximum likelihood estimates for Poisson-lognormal models. Stat Probab Lett 66:189–196
  • Willmot GE (1990) Asymptotic tail behaviour of Poisson mixtures with applications. Adv Appl Probab 22:147–159
  • Zha L, Lord D, Zou Y (2016) The Poisson inverse Gaussian (PIG) generalized linear regression model for analyzing motor vehicle crash data. J Transp Saf Secur 8(1):18–35


Acknowledgements

The authors thank the Editor, reviewers, Dr. Dennis Boos (North Carolina State University) and Dr. Kimberly F. Sellers (Georgetown University) for helpful comments and suggestions that significantly improved this paper.

Author information

Corresponding author

Correspondence to Kimberly S. Weems.

Ethics declarations

Conflict of interest

On behalf of all authors, the corresponding author states that there is no conflict of interest.

Additional information

This work was supported by National Science Foundation Grant #1700235. Additional support was provided by a grant from North Carolina Central University.

Appendix: Upper and lower bounds for Poisson inverse Gaussian probability ratios


Below we provide proofs of Lemmas 2 and 3, which establish an upper bound and a lower bound, respectively, for Poisson inverse Gaussian probability ratios.

Lemma 2

Let \(\nu =+\sqrt{\tau ^{-1}(\tau ^{-1} + 2\mu )}\). Then for \(y>0,\)

$$\begin{aligned} (y+1)\frac{P_{y+1}}{P_y} \le (1+2\tau )^{-1/2}(1+\nu ^{-1})(2y-1). \end{aligned}$$

Proof

Equation (7) gives us the following relationship:

$$\begin{aligned} (y+1)\frac{P_{y+1}}{P_y}= & {} (y+1) \left\{ \frac{(1+2\tau )^{-1/2}}{y+1}\frac{ K_{y+\frac{1}{2}}(\nu )}{K_{y-\frac{1}{2}}(\nu )}\right\} \nonumber \\= & {} (1+2\tau )^{-1/2} \frac{ K_{y+\frac{1}{2}}(\nu )}{K_{y-\frac{1}{2}}(\nu )}. \end{aligned}$$
(13)

From Abramowitz and Stegun (1972), we have that for \(k=0, 1, 2,\ldots \) (and, since \(K_{-v}=K_{v}\), for negative half-integer orders as well)

$$\begin{aligned} K_{k+\frac{1}{2}}(\nu ) = {C(\nu )}\sum _{r=0}^{k }\frac{(k+r)!}{r!(k -r)!(2\nu )^r}, \end{aligned}$$
(14)

where \({C(\nu )}={\sqrt{\pi /(2\nu )} \exp (-\nu )}\). Therefore, we may write

$$\begin{aligned} \frac{K_{y+\frac{1}{2}}(\nu )}{{C(\nu )}}= & {} \sum _{r=0}^{y}\frac{(y+r)!}{r!(y-r)!(2\nu )^r}\\= & {} \frac{(2y)!}{y! (2\nu )^y} + \sum _{r=0}^{y-1}\frac{(y-1+r)!}{r!(y-1-r)!(2\nu )^r}\frac{(y+r)}{(y-r)}\\\le & {} \frac{(2y)!}{y! (2\nu )^y} + (2y-1)\sum _{r=0}^{y-1}\frac{(y-1+r)!}{r!(y-1-r)!(2\nu )^r}\\= & {} \frac{(2y)!}{y! (2\nu )^y} + \frac{(2y-1)K_{y-\frac{1}{2}}(\nu )}{{C(\nu )}}, \end{aligned}$$

where the second line separates the \(r=y\) term and rewrites the factorials, the inequality uses \((y+r)/(y-r) \le 2y-1\) for \(0 \le r \le y-1\), and the final equality applies (14) with \(k=y-1\).

Substituting in the numerator of (13) we find that

$$\begin{aligned} (y+1)\frac{P_{y+1}}{P_y}\le & {} (1+2\tau )^{-1/2}\left[ \frac{{C(\nu )}(2y)!/(y!(2\nu )^y) + (2y-1) K_{y-\frac{1}{2}}(\nu )}{K_{y-\frac{1}{2}}(\nu )}\right] \nonumber \\= & {} (1+2\tau )^{-1/2}\left[ \frac{{C(\nu )}(2y)!}{y!(2\nu )^y K_{y-\frac{1}{2}}(\nu ) } + 2y-1\right] \nonumber \\\le & {} (1+2\tau )^{-1/2}(1+\nu ^{-1})(2y-1), \end{aligned}$$
(15)

where the last line uses the following inequality:

$$\begin{aligned} \frac{y!(2\nu )^y K_{y-\frac{1}{2}}(\nu )}{{C(\nu )}(2y)!}= & {} \frac{y!(2\nu )^y}{(2y)!}\sum _{r=0}^{y-1}\frac{(y-1+r)!}{r!(y-1-r)!(2\nu )^r}\\\ge & {} \frac{y!(2\nu )^y}{(2y)!} \frac{(2y-2)!}{(y-1)!(2\nu )^{y-1}} \\= & {} \frac{\nu }{2y-1}. \end{aligned}$$

\(\square \)
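The identity (14) and the bound (15) lend themselves to a quick numerical sanity check. The sketch below is not part of the paper: the helper names and the parameter values \(\mu = 2\), \(\tau = 0.5\) are illustrative assumptions. It evaluates \(K_{k+1/2}(\nu )\) two independent ways, by the finite sum (14) and by the standard three-term Bessel recurrence, and then checks the Lemma 2 bound on the ratio in (13) for small \(y\):

```python
import math

def K_half_sum(k, nu):
    # K_{k+1/2}(nu) via the finite-sum identity (14), for integer k >= 0
    c = math.sqrt(math.pi / (2 * nu)) * math.exp(-nu)
    return c * sum(
        math.factorial(k + r)
        / (math.factorial(r) * math.factorial(k - r) * (2 * nu) ** r)
        for r in range(k + 1)
    )

def K_half_rec(k, nu):
    # Same quantity via the recurrence K_{v+1}(x) = K_{v-1}(x) + (2v/x) K_v(x),
    # seeded with K_{-1/2}(x) = K_{1/2}(x) = sqrt(pi/(2x)) exp(-x)
    lo = hi = math.sqrt(math.pi / (2 * nu)) * math.exp(-nu)
    for j in range(k):
        v = j + 0.5
        lo, hi = hi, lo + (2 * v / nu) * hi
    return hi

mu, tau = 2.0, 0.5                       # illustrative parameter values
nu = math.sqrt((1 / tau) * (1 / tau + 2 * mu))
scale = (1 + 2 * tau) ** (-0.5)

# the two evaluations of (14) agree
for k in range(8):
    assert math.isclose(K_half_sum(k, nu), K_half_rec(k, nu), rel_tol=1e-9)

# Lemma 2: (y+1) P_{y+1}/P_y, computed exactly via (13), obeys the bound (15)
for y in range(1, 12):
    ratio = scale * K_half_sum(y, nu) / K_half_sum(y - 1, nu)
    assert ratio <= scale * (1 + 1 / nu) * (2 * y - 1)
```

At \(y=1\) the bound is attained with equality, since \(K_{3/2}(\nu )/K_{1/2}(\nu ) = 1 + \nu ^{-1}\); for larger \(y\) it holds with slack.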

Lemma 3

Let \(\nu =+\sqrt{\tau ^{-1}(\tau ^{-1} + 2\mu )}\). Then for \(y>0,\)

$$\begin{aligned} (y+1)\frac{P_{y+1}}{P_y} \ge (1+2\tau )^{-1/2} \left[ 1+\left( \frac{1}{1+2\nu }\right) ^y\right] . \end{aligned}$$

Proof

Recall from Eq. (13) that

$$\begin{aligned} (y+1)\frac{P_{y+1}}{P_y} =(1+2\tau )^{-1/2} \frac{ K_{y+\frac{1}{2}}(\nu )}{K_{y-\frac{1}{2}}(\nu )}. \end{aligned}$$
(16)

Using (14) with \({C(\nu )}={\sqrt{\pi /(2\nu )} \exp (-\nu )}\), we may write

$$\begin{aligned} \frac{K_{y-\frac{1}{2}}(\nu )}{{C(\nu )}}= & {} \sum _{r=0}^{y-1}\frac{(y-1+r)!}{r!(y-1-r)!(2\nu )^r} \\= & {} \sum _{r=0}^{y-1}\frac{(y+r)!}{r!(y-r)!(2\nu )^r} \frac{(y-r)}{(y+r)}\\\le & {} \sum _{r=0}^{y-1}\frac{(y+r)!}{r!(y-r)!(2\nu )^r}\\= & {} \sum _{r=0}^{y}\frac{(y+r)!}{r!(y-r)!(2\nu )^r} - \frac{(2y)!}{y! (2\nu )^y} \\= & {} \frac{K_{y+\frac{1}{2}}(\nu )}{{C(\nu )}} - \frac{(2y)!}{y! (2\nu )^y}. \end{aligned}$$

Therefore, substituting into (16), we have the following lower bound:

$$\begin{aligned} (y+1)\frac{P_{y+1}}{P_y}\ge & {} (1+2\tau )^{-1/2} \frac{ K_{y+\frac{1}{2}}(\nu )}{K_{y+\frac{1}{2}}(\nu ) -{C(\nu )}(2y)! / (y!(2\nu )^y)} \nonumber \\= & {} (1+2\tau )^{-1/2} \frac{1}{1 - {C(\nu )}(2y)!/(y!(2\nu )^y K_{y+\frac{1}{2}}(\nu ) )}\nonumber \\\ge & {} (1+2\tau )^{-1/2}\left[ 1+\frac{{C(\nu )}(2y)!}{y!(2\nu )^y K_{y+\frac{1}{2}}(\nu )}\right] , \end{aligned}$$
(17)

where we use the elementary inequality \(1/(1-x) \ge 1+x\), valid for \(0 \le x < 1\); here \(0< x <1\) since, by (14), \({C(\nu )}(2y)!/(y!(2\nu )^y)\) is one term of the sum defining \(K_{y+\frac{1}{2}}(\nu )\). Notice that

$$\begin{aligned} \frac{y! (2\nu )^y K_{y+\frac{1}{2}}(\nu ) }{{C(\nu )}(2y)!}= & {} \sum _{r=0}^{y}\frac{(y+r)!}{r!(y-r)!(2\nu )^r} \frac{y! (2\nu )^y }{(2y)!}\\= & {} \sum _{r=0}^{y}\frac{(y+r)!}{(2y)!} \frac{y!(2\nu )^{y-r}}{r!(y-r)!}\\= & {} \sum _{r=0}^{y}\frac{(y+r)!}{(2y)!} \left( {\begin{array}{c}y\\ r\end{array}}\right) (2\nu )^{y-r} \\\le & {} \sum _{r=0}^{y} \left( {\begin{array}{c}y\\ r\end{array}}\right) (2\nu )^{y-r} \\= & {} (1+2\nu )^y. \end{aligned}$$

Therefore, (17) becomes

$$\begin{aligned} (y+1)\frac{P_{y+1}}{P_y} \ge (1+2\tau )^{-1/2} \left[ 1+\left( \frac{1}{1+2\nu }\right) ^y\right] \end{aligned}$$

by substitution. \(\square \)
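As with Lemma 2, the lower bound can be spot-checked numerically. The sketch below is illustrative and not from the paper: the helper names `K_half` and `lemma3_holds` and the parameter grid are assumptions. It evaluates the exact ratio via (16), using the finite sum (14), and compares it with the Lemma 3 bound:

```python
import math

def K_half(k, nu):
    # K_{k+1/2}(nu) via the finite-sum identity (14), for integer k >= 0
    c = math.sqrt(math.pi / (2 * nu)) * math.exp(-nu)
    return c * sum(
        math.factorial(k + r)
        / (math.factorial(r) * math.factorial(k - r) * (2 * nu) ** r)
        for r in range(k + 1)
    )

def lemma3_holds(mu, tau, y_max=12):
    # Check (y+1) P_{y+1}/P_y >= (1+2*tau)^{-1/2} * [1 + (1+2*nu)^{-y}]
    nu = math.sqrt((1 / tau) * (1 / tau + 2 * mu))
    scale = (1 + 2 * tau) ** (-0.5)
    for y in range(1, y_max + 1):
        ratio = scale * K_half(y, nu) / K_half(y - 1, nu)  # exact value via (16)
        if ratio < scale * (1 + (1 / (1 + 2 * nu)) ** y):
            return False
    return True

# the lower bound holds across a small grid of illustrative parameter values
assert all(lemma3_holds(mu, tau) for mu in (0.5, 2.0, 10.0) for tau in (0.1, 1.0, 5.0))
```

The bound tightens toward \((1+2\tau )^{-1/2}\) as \(y\) grows, consistent with the \((1+2\nu )^{-y}\) correction term vanishing geometrically.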

About this article


Cite this article

Weems, K.S., Smith, P.J. Assessing the robustness of estimators when fitting Poisson inverse Gaussian models. Metrika 81, 985–1004 (2018). https://doi.org/10.1007/s00184-018-0664-1
