Reliability, Validity, and Factor Structure of the Current Assessment Practice Evaluation-Revised (CAPER) in a National Sample

  • Aaron R. Lyon
  • Michael D. Pullmann
  • Shannon Dorsey
  • Prerna Martin
  • Alexandra A. Grigore
  • Emily M. Becker
  • Amanda Jensen-Doss

Abstract

Measurement-based care (MBC) is an increasingly popular evidence-based practice, but no tools with established psychometrics exist to evaluate clinician use of MBC practices in mental health service delivery. The current study evaluated the reliability, validity, and factor structure of scores generated from a brief, standardized tool to measure MBC practices, the Current Assessment Practice Evaluation-Revised (CAPER). Survey data from a national sample of 479 mental health clinicians were used to conduct exploratory and confirmatory factor analyses, as well as reliability and validity analyses (e.g., relationships between CAPER subscales and clinician MBC attitudes). Analyses revealed competing two- and three-factor models. Regardless of the model used, scores from CAPER subscales demonstrated good reliability and convergent and divergent validity with MBC attitudes in the expected directions. The CAPER appears to be a psychometrically sound tool for assessing clinician MBC practices. Future directions for development and application of the tool are discussed.

Notes

Compliance with Ethical Standards

Conflict of Interest

The authors have no conflicts of interest to report.

Supplementary material

ESM 1: 11414_2018_9621_MOESM1_ESM.docx (DOCX, 18 kb)


Copyright information

© National Council for Behavioral Health 2018

Authors and Affiliations

  1. Department of Psychiatry and Behavioral Sciences, School of Medicine, University of Washington, Seattle, USA
  2. Department of Psychology, University of Washington, Seattle, USA
  3. Department of Psychology, University of Miami, Coral Gables, USA