Reliability, Validity, and Factor Structure of the Current Assessment Practice Evaluation-Revised (CAPER) in a National Sample

  • Aaron R. Lyon
  • Michael D. Pullmann
  • Shannon Dorsey
  • Prerna Martin
  • Alexandra A. Grigore
  • Emily M. Becker
  • Amanda Jensen-Doss


Measurement-based care (MBC) is an increasingly popular, evidence-based practice, but there are no tools with established psychometrics to evaluate clinician use of MBC practices in mental health service delivery. The current study evaluated the reliability, validity, and factor structure of scores generated from a brief, standardized tool to measure MBC practices, the Current Assessment Practice Evaluation-Revised (CAPER). Survey data from a national sample of 479 mental health clinicians were used to conduct exploratory and confirmatory factor analyses, as well as reliability and validity analyses (e.g., relationships between CAPER subscales and clinician MBC attitudes). Analyses revealed competing two- and three-factor models. Regardless of the model used, scores from CAPER subscales demonstrated good reliability and convergent and divergent validity with MBC attitudes in the expected directions. The CAPER appears to be a psychometrically sound tool for assessing clinician MBC practices. Future directions for development and application of the tool are discussed.


Compliance with Ethical Standards

Conflict of Interest

The authors have no conflicts of interest to report.

Supplementary material

ESM 1 (DOCX, 17 kb)



Copyright information

© National Council for Behavioral Health 2018

Authors and Affiliations

  1. Department of Psychiatry and Behavioral Sciences, School of Medicine, University of Washington, Seattle, USA
  2. Department of Psychology, University of Washington, Seattle, USA
  3. Department of Psychology, University of Miami, Coral Gables, USA
