Validating the Ohio Scales in a Juvenile Justice Sample of Youth with Behavioral Health Issues
Between 65% and 75% of juvenile justice-involved (JJI) youth present with at least one behavioral health disorder. Many communities have developed diversion programs that provide behavioral health services to JJI youth, often in lieu of detention. A key component of successful diversion programming is accurate screening and assessment. The Ohio Scales, a validated instrument designed to track service effectiveness in clinical samples of youth, are now being used with juvenile justice populations. The purpose of this study is to validate the Ohio Scales in a JJI youth population (N = 2246). The sample (ages 12–18) is drawn from Ohio's Behavioral Health Juvenile Justice Initiative, a diversion program for JJI youth with behavioral health issues. We conducted Confirmatory Factor Analyses (CFA) on all forms of the Ohio Scales (parent, youth, and worker) to assess fit for one-factor, four-factor, and four-factor second-order solutions. We also conducted an Exploratory Factor Analysis (EFA) on the Problem Severity factor of the youth form to determine the appropriate number of sub-factors. The EFA indicated that the Problem Severity factor should be broken into three hypothesized sub-factors: Externalizing, Internalizing, and Delinquency; the CFA confirmed this solution. CFA results indicated that the four-factor second-order solution fit better than the other two solutions. Using the Ohio Scales Problem Severity measure as a three sub-factor measure may increase clinical applicability by allowing a clinician to specifically measure and target externalizing or internalizing issues during treatment.
Keywords: Juvenile justice, Behavioral health, Screening, Ohio Scales, Factor analysis
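The first EFA step described in the abstract, deciding how many sub-factors a set of Problem Severity items supports, can be sketched in code. The sketch below is a hypothetical illustration on simulated data (not the BHJJ data set, and not the authors' actual analysis): it builds nine items that load on three latent clusters and applies the common Kaiser heuristic (retain factors whose correlation-matrix eigenvalues exceed 1) to recover the number of factors.

```python
import numpy as np

# Hypothetical illustration only: simulate nine items that load on three
# latent clusters (think externalizing, internalizing, delinquency),
# then count factors via the Kaiser (eigenvalue > 1) rule.
rng = np.random.default_rng(0)

n_obs, n_items, n_latent = 500, 9, 3
latent = rng.normal(size=(n_obs, n_latent))

# Simple structure: each block of three items loads 0.8 on one factor.
loadings = np.zeros((n_items, n_latent))
loadings[0:3, 0] = 0.8
loadings[3:6, 1] = 0.8
loadings[6:9, 2] = 0.8
items = latent @ loadings.T + 0.5 * rng.normal(size=(n_obs, n_items))

# Eigenvalues of the item correlation matrix, largest first.
corr = np.corrcoef(items, rowvar=False)
eigenvalues = np.linalg.eigvalsh(corr)[::-1]

# Kaiser rule: retain factors with eigenvalue > 1.
n_factors = int(np.sum(eigenvalues > 1.0))
print(n_factors)  # recovers 3 for this simulated structure
```

In a real analysis, this eigenvalue screen would typically be followed by a rotated factor extraction and then, as in the study, a CFA on the retained solution using dedicated SEM software; the Kaiser rule is only one of several heuristics (scree plots and parallel analysis are common alternatives).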