
Abstract

Generalized linear mixed models (GLMMs) are generalized linear models with normally distributed random effects in the linear predictor. Penalized quasi-likelihood (PQL), an approximate method of inference in GLMMs, involves repeated fitting of linear mixed models with “working” dependent variables and iterative weights that depend on parameter estimates from the previous cycle of iteration. The generality of PQL, and its implementation in commercially available software, has encouraged the application of GLMMs in many scientific fields. Caution is needed, however, since PQL may sometimes yield badly biased estimates of variance components, especially with binary outcomes.
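As a minimal sketch in generic notation (the symbols below are illustrative, not reproduced from the chapter), the model and the PQL working response can be written as
\[
g\bigl(\mathrm{E}[y_{ij}\mid b_i]\bigr) = \eta_{ij} = x_{ij}^{\top}\beta + z_{ij}^{\top}b_i, \qquad b_i \sim N(0, D),
\]
and each PQL cycle refits a linear mixed model to the working response with iterative weights
\[
y_{ij}^{*} = \hat{\eta}_{ij} + (y_{ij} - \hat{\mu}_{ij})\,g'(\hat{\mu}_{ij}),
\qquad
w_{ij} = \bigl[v(\hat{\mu}_{ij})\,g'(\hat{\mu}_{ij})^{2}\bigr]^{-1},
\]
where \(g\) is the link function, \(\hat{\mu}_{ij} = g^{-1}(\hat{\eta}_{ij})\) and \(v(\cdot)\) is the GLM variance function; the updated estimates feed the next cycle.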

Recent developments in numerical integration, including adaptive Gaussian quadrature, higher order Laplace expansions, stochastic integration and Markov chain Monte Carlo (MCMC) algorithms, provide attractive alternatives to PQL for approximate likelihood inference in GLMMs. Analyses of some well-known datasets, and simulations based on these analyses, suggest that PQL still performs remarkably well in comparison with more elaborate procedures in many practical situations. Adaptive Gaussian quadrature is a viable alternative for nested designs where the numerical integration is limited to a small number of dimensions. Higher order Laplace approximations hold the promise of accurate inference more generally. MCMC is likely the method of choice for the most complex problems that involve high-dimensional integrals.
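To make the adaptive Gaussian quadrature alternative concrete, the following is a minimal Python sketch for a single cluster of a logistic random-intercept model. The data vector, the fixed-effect linear predictor and the random-intercept standard deviation are hypothetical values chosen for illustration, and the code sketches the idea rather than the implementation in any particular package.

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss
from scipy.optimize import minimize_scalar

# Hypothetical data for one cluster: binary responses and their
# fixed-effect linear predictor x_ij * beta (illustrative values only).
y = np.array([1, 0, 1, 1, 0])
eta_fixed = np.array([0.2, -0.1, 0.4, 0.0, 0.3])
sigma = 1.0  # random-intercept standard deviation, treated as known here

def log_integrand(b):
    """log of p(y | b) * N(b; 0, sigma^2) for a logistic random-intercept model."""
    eta = eta_fixed + b
    loglik = np.sum(y * eta - np.log1p(np.exp(eta)))
    logprior = -0.5 * b**2 / sigma**2 - 0.5 * np.log(2.0 * np.pi * sigma**2)
    return loglik + logprior

# "Adaptive" step 1: centre the quadrature rule at the mode of the integrand.
b_hat = minimize_scalar(lambda b: -log_integrand(b)).x

# "Adaptive" step 2: scale by the curvature of the log integrand at the mode.
h = 1e-4
d2 = (log_integrand(b_hat + h) - 2.0 * log_integrand(b_hat)
      + log_integrand(b_hat - h)) / h**2
tau_hat = 1.0 / np.sqrt(-d2)

# Gauss-Hermite nodes and weights, shifted and scaled around the mode;
# the change of variables b = b_hat + sqrt(2) * tau_hat * x gives
# integral ~ sqrt(2) * tau_hat * sum_k w_k * exp(x_k^2) * g(b_k).
nodes, weights = hermgauss(10)
b_k = b_hat + np.sqrt(2.0) * tau_hat * nodes
vals = np.exp(nodes**2 + np.array([log_integrand(b) for b in b_k]))
L_i = np.sqrt(2.0) * tau_hat * np.sum(weights * vals)
print("likelihood contribution for this cluster:", L_i)
```

Centring and scaling the nodes at the mode of the integrand is what makes the rule adaptive, so a handful of nodes per random effect suffices for nested designs; because the number of evaluations grows exponentially with the number of random-effect dimensions, the approach is limited to low-dimensional integrals, which is where higher order Laplace approximations and MCMC come in.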

Keywords

Variance component · Markov chain Monte Carlo · Generalized linear mixed model · Royal Statistical Society · Quadrature point

Copyright information

© Springer Science+Business Media New York 2004

Authors and Affiliations

  • Norman Breslow, Department of Biostatistics, University of Washington, USA
