Purpose: To identify and evaluate methods for assessing pediatric patient-reported outcome (PRO) data quality at the individual level.
Methods: We conducted a systematic literature review to identify methods for detecting invalid responses to PRO measures. Eight data quality indicators were applied to child-report data collected from 1780 children ages 8–11 years. We grouped children with similar data quality patterns and tested for between-group differences in factors hypothesized to influence self-report capacity.
Results: We identified 126 articles that described 494 instances in which special measures or statistical techniques were applied to evaluate data quality at the individual level. We identified 22 data quality indicator subtypes: 9 direct methods (which require the administration of special items) and 13 archival techniques (statistical procedures applied to PRO data post hoc). Applying the archival techniques to child-report PRO data revealed 3 distinct patterns (classes) of data quality indicators. Compared to class 1 (56% of children), classes 2 (36%) and 3 (8%) showed greater variation in their PRO item responses. Three archival indicators were especially useful for differentiating plausible item response variation (class 2) from statistically unlikely response patterns (class 3). Neurodevelopmental conditions, which are associated with a range of cognitive processing challenges, were more common among children in class 3.
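For illustration, two archival indicators of the kind described above — a long-string index (the longest run of identical consecutive item responses) and intra-individual response variability (the standard deviation of a child's own responses) — can be computed directly from raw item responses. This is a minimal sketch: the function names, example responses, and scale are hypothetical and are not the indicators or data from the study.

```python
def long_string(responses):
    """Longest run of identical consecutive item responses.

    High values suggest straight-lining (e.g., marking the same
    option for every item regardless of content).
    """
    longest = run = 1
    for prev, cur in zip(responses, responses[1:]):
        run = run + 1 if cur == prev else 1
        longest = max(longest, run)
    return longest


def response_sd(responses):
    """Intra-individual standard deviation across item responses.

    Near-zero values indicate almost no variation in responding;
    very high values can indicate erratic responding.
    """
    mean = sum(responses) / len(responses)
    var = sum((r - mean) ** 2 for r in responses) / len(responses)
    return var ** 0.5


# A straight-lining responder vs. a varied responder (5-point scale)
flat = [3, 3, 3, 3, 3, 3, 3, 3]
varied = [1, 4, 2, 5, 3, 1, 4, 2]
print(long_string(flat), round(response_sd(flat), 2))      # 8 0.0
print(long_string(varied), round(response_sd(varied), 2))  # 1 1.39
```

Note that neither indicator is diagnostic on its own: the flat profile could be a valid response to a homogeneous item set, which is why the study's conclusion emphasizes combining multiple indicators.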
Conclusions: A multi-indicator approach is needed to identify invalid PRO responses. Once identified, assessment environments and measurement tools should be adapted to best support these individuals’ self-report capacity. Individual-level data quality indicators can be used to gauge the effectiveness of these accommodations.
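The multi-indicator logic of the conclusion can be sketched as a screen that flags a respondent only when several indicators exceed their cutoffs, rather than relying on any single indicator. The indicator names and cutoff values below are purely illustrative, not those reported in the review.

```python
def flag_invalid(indicators, cutoffs, min_flags=2):
    """Multi-indicator screen for potentially invalid responding.

    Counts how many data quality indicators meet or exceed their
    cutoffs and flags the respondent only when at least `min_flags`
    do, reducing false positives from any one noisy indicator.
    """
    hits = sum(1 for name, value in indicators.items()
               if value >= cutoffs[name])
    return hits >= min_flags


# Illustrative values for one child (names and cutoffs are hypothetical)
child = {"long_string": 9, "outlier_distance": 3.2, "inconsistency": 0.1}
cutoffs = {"long_string": 8, "outlier_distance": 3.0, "inconsistency": 2.0}
print(flag_invalid(child, cutoffs))  # True: 2 of 3 indicators exceeded
```

Because two of the three illustrative indicators exceed their cutoffs, this child's responses would be flagged for follow-up — for example, re-administration with the accommodations the conclusion recommends.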
Research reported in this publication was funded by the Office of the Director, National Institutes of Health (OD) under Award Number 4U24OD023319-02, with co-funding from the Office of Behavioral and Social Sciences Research (OBSSR), and by a grant from the National Institute of Child Health and Human Development (R01HD048850, PI Forrest). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.
Bevans, K.B., Ahuvia, I.L., Hallock, T.M. et al. Investigating child self-report capacity: a systematic review and utility analysis. Qual Life Res 29, 1147–1158 (2020). https://doi.org/10.1007/s11136-019-02387-3
Keywords: Patient-reported outcome measures; Data quality; Self-report capacity