Abstract
Piecewise growth mixture models (PGMMs) are a flexible and useful class of methods for analyzing segmented trends in individual growth trajectories over time, where individuals come from a mixture of two or more latent classes. These models allow each segment of the overall developmental process within each class to have a different functional form; examples include two linear phases of growth, or a quadratic phase followed by a linear phase. The changepoint (knot) is the time of transition from one developmental phase (segment) to another. Inferring the location of the changepoint(s) is often of practical interest, along with inference for the other model parameters. A random changepoint allows for individual differences in the transition time within each class. The primary objectives of our study are as follows: (1) to develop a PGMM using a Bayesian inference approach that allows the estimation of multiple random changepoints within each class; (2) to develop a procedure to empirically detect the number of random changepoints within each class; and (3) to empirically investigate the bias and precision of the estimates of the model parameters, including the random changepoints, via a simulation study. To facilitate the adoption of this methodology in practice, we have developed the user-friendly R package BayesianPGMM, available at https://github.com/lockEF/BayesianPGMM. We describe an application to mouse-tracking data for a visual recognition task.
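To make the model structure concrete, here is a minimal sketch of a linear-linear PGMM with a single random changepoint per class; it illustrates the general form described above rather than the exact parameterization of the model in this article, which accommodates multiple changepoints and other segment shapes. For individual $i$ in latent class $c$, observed at time $t_{ij}$,

$$
y_{ij} = \beta^{(c)}_{0i} + \beta^{(c)}_{1i}\, t_{ij} + \beta^{(c)}_{2i}\, \big(t_{ij} - \gamma^{(c)}_{i}\big)_{+} + \epsilon_{ij}, \qquad (x)_{+} = \max(x, 0),
$$

where $\gamma^{(c)}_{i}$ is the individual-specific random changepoint for class $c$, $\beta^{(c)}_{1i}$ is the slope before the changepoint, and $\beta^{(c)}_{1i} + \beta^{(c)}_{2i}$ is the slope after it.

The R snippet below shows how the package can be obtained from GitHub; the commented fitting call is purely illustrative, and its function and argument names are assumptions for this sketch rather than names taken from the package documentation.

    # Install BayesianPGMM directly from GitHub (requires the devtools package)
    # install.packages("devtools")
    devtools::install_github("lockEF/BayesianPGMM")
    library(BayesianPGMM)

    # Hypothetical fit of a two-class model with up to two changepoints per class;
    # the function and argument names here are assumed, not verified:
    # fit <- bayes_pgmm(X = time_matrix, Y = outcome_matrix, n_class = 2, max_cp = 2)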


Acknowledgements
We would like to thank Sara Incera and Conor T. McLennan of Cleveland State University for graciously providing the mouse-tracking data described in Sect. 4. This work was supported in part by NIH grant UL1 RR033183/KL2 RR0333182 [to EFL].
Cite this article
Lock, E.F., Kohli, N. & Bose, M. Detecting Multiple Random Changepoints in Bayesian Piecewise Growth Mixture Models. Psychometrika 83, 733–750 (2018). https://doi.org/10.1007/s11336-017-9594-5
Keywords
- Bayesian
- longitudinal data
- Markov chain Monte Carlo
- mixture model
- nonlinear random effects models
- piecewise function