
Investigating child self-report capacity: a systematic review and utility analysis



Purpose

To identify and evaluate methods for assessing pediatric patient-reported outcome (PRO) data quality at the individual level.


Methods

We conducted a systematic literature review to identify methods for detecting invalid responses to PRO measures. Eight data quality indicators were applied to child-report data collected from 1780 children ages 8–11 years. We grouped children with similar data quality patterns and tested for between-group differences in factors hypothesized to influence self-report capacity.
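Archival data quality indicators of the kind applied here are statistical procedures computed directly from a respondent's item responses. As a minimal illustration (names and data are hypothetical, following the careless-responding literature rather than this study's exact implementation), two common archival indicators are the long-string index (length of the longest run of identical consecutive responses) and intra-individual response variability (the standard deviation of one respondent's item responses):

```python
# Sketch of two archival data quality indicators (illustrative only).
from statistics import pstdev

def long_string(responses):
    """Length of the longest run of identical consecutive responses."""
    longest = run = 1
    for prev, cur in zip(responses, responses[1:]):
        run = run + 1 if cur == prev else 1
        longest = max(longest, run)
    return longest

def irv(responses):
    """Intra-individual response variability: SD of one child's responses."""
    return pstdev(responses)

child = [3, 3, 3, 3, 3, 2, 4, 3, 3, 3]  # hypothetical 10-item PRO responses
print(long_string(child))       # -> 5
print(round(irv(child), 2))     # -> 0.45
```

A very long identical run or near-zero variability can signal insufficient-effort responding, though neither indicator is diagnostic on its own.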


Results

We identified 126 articles that described 494 instances in which special measures or statistical techniques were applied to evaluate data quality at the individual level. We identified 22 data quality indicator subtypes: 9 direct methods (which require administration of special items) and 13 archival techniques (statistical procedures applied to PRO data post hoc). Application of archival techniques to child-report PRO data revealed 3 distinct patterns (or classes) of the data quality indicators. Compared to class 1 (56%), classes 2 (36%) and 3 (8%) had greater variation in their PRO item responses. Three archival indicators were especially useful for differentiating plausible item response variation (class 2) from statistically unlikely response patterns (class 3). Neurodevelopmental conditions, which are associated with a range of cognitive processing challenges, were more common among children in class 3.


Conclusions

A multi-indicator approach is needed to identify invalid PRO responses. Once individuals producing invalid responses are identified, assessment environments and measurement tools should be adapted to best support their self-report capacity. Individual-level data quality indicators can then be used to gauge the effectiveness of these accommodations.
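The multi-indicator approach described above can be sketched as a simple screening rule: a record is flagged for follow-up only when several independent indicators agree. The indicator names, values, and cutoffs below are hypothetical, chosen purely to illustrate the logic:

```python
# Minimal sketch of multi-indicator screening: flag a record only when
# at least `min_hits` indicators exceed their (hypothetical) cutoffs.

def flag_record(indicators, thresholds, min_hits=2):
    """Count threshold violations; flag when at least `min_hits` agree."""
    hits = sum(1 for name, value in indicators.items()
               if value >= thresholds[name])
    return hits >= min_hits

record = {"long_string": 9, "irv_inverted": 0.8, "mahalanobis": 1.2}
cutoffs = {"long_string": 8, "irv_inverted": 0.7, "mahalanobis": 3.0}
print(flag_record(record, cutoffs))  # -> True (two of three indicators exceed cutoffs)
```

Requiring agreement across indicators reduces false positives relative to any single cutoff, which is the rationale for a multi-indicator approach.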





Research reported in this publication was funded by the Office of the Director, National Institutes of Health (OD) under Award Number 4U24OD023319-02, with co-funding from the Office of Behavioral and Social Sciences Research (OBSSR), and by a grant from the National Institute of Child Health and Human Development (R01HD048850, PI Forrest). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

Author information



Corresponding author

Correspondence to Katherine B. Bevans.


Electronic supplementary material


About this article


Cite this article

Bevans, K.B., Ahuvia, I.L., Hallock, T.M. et al. Investigating child self-report capacity: a systematic review and utility analysis. Qual Life Res 29, 1147–1158 (2020).

Keywords


  • Patient-reported outcome measures
  • Data quality
  • Self-report capacity
  • Pediatric