Journal of Child and Family Studies, Volume 25, Issue 4, pp 1218–1228

Validating the Ohio Scales in a Juvenile Justice Sample of Youth with Behavioral Health Issues

  • Krystel Tossone
  • Jeff Kretschmar
  • Fredrick Butcher
  • Leon Harris
Original Paper

Abstract

Between 65 and 75% of juvenile justice-involved (JJI) youth present with at least one behavioral health disorder. Many communities have developed diversion programs that provide behavioral health services to JJI youth, often in lieu of detention. A key component of successful diversion programming is accurate screening and assessment. The Ohio Scales, a validated instrument designed to track service effectiveness in clinical samples of youth, are now being used with juvenile justice populations. The purpose of this study is to validate the Ohio Scales in a JJI youth population (N = 2246). The population (ages 12–18) is derived from Ohio’s Behavioral Health Juvenile Justice Initiative, a diversion program for JJI youth with behavioral health issues. We conducted Confirmatory Factor Analyses (CFA) on all three forms of the Ohio Scales (parent, youth, and worker) to measure fit for one-factor, four-factor, and four-factor second-order solutions. We also conducted an Exploratory Factor Analysis (EFA) on the Problem Severity factor in the youth form to determine the appropriate number of sub-factors. The EFA indicated that the Problem Severity factor should be broken into three hypothesized sub-factors: Externalizing, Internalizing, and Delinquency; a subsequent CFA confirmed this solution. CFA results indicated that the four-factor second-order solution fit better than the other two solutions. Using the Ohio Scales Problem Severity measure as a three sub-factor measure may increase clinical applicability by allowing a clinician to specifically measure and target externalizing or internalizing issues during treatment.
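The model comparison in the abstract rests on standard structural-equation-model fit indices rather than a computation unique to this paper. As a minimal illustrative sketch (not the authors' code), the two most commonly reported indices, RMSEA and CFI, can be computed from a fitted model's chi-square statistic and that of the independence (null) model; the sample size N = 2246 comes from the study, while all chi-square and degrees-of-freedom values below are hypothetical placeholders.

```python
import math

def rmsea(chi2: float, df: int, n: int) -> float:
    """Root Mean Square Error of Approximation; smaller is better,
    with values <= .06 conventionally taken as good fit."""
    return math.sqrt(max(chi2 - df, 0.0) / (df * (n - 1)))

def cfi(chi2: float, df: int, chi2_null: float, df_null: int) -> float:
    """Comparative Fit Index relative to the independence (null) model;
    values >= .95 conventionally indicate good fit."""
    d_model = max(chi2 - df, 0.0)
    d_null = max(chi2_null - df_null, d_model, 0.0)
    return 1.0 - d_model / d_null if d_null > 0 else 1.0

# N is the study's sample size; the chi-square values are invented.
n = 2246
print(round(rmsea(chi2=100.0, df=50, n=n), 4))                      # small -> close fit
print(round(cfi(chi2=100.0, df=50, chi2_null=2000.0, df_null=66), 4))
```

These are the conventional cutoffs discussed by Hu and Bentler (1999, reference 18); SEM software such as Mplus (reference 32) reports the same indices directly, so the sketch is only meant to make the comparison criterion concrete.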

Keywords

Juvenile justice · Behavioral health · Screening · Ohio Scales · Factor analysis

References

  1. Achenbach, T. M. (1991). Manual for the Child Behavior Checklist/4-18 and 1991 profile. Burlington: University of Vermont, Department of Psychiatry.
  2. Achenbach, T. M., & Edelbrock, C. S. (1981). Behavioral problems and competencies reported by parents of normal and disturbed children aged four through sixteen. Monographs of the Society for Research in Child Development, 46(1), 1–82.
  3. Achenbach, T., & Ruffle, T. (2000). The Child Behavior Checklist and related forms for assessing behavioral/emotional problems and competencies. Pediatrics in Review, 21, 265–271.
  4. Archer, R., Stredney, R., Mason, J., & Arnau, R. (2004). An examination and replication of the psychometric properties of the Massachusetts youth screening instrument-second edition (MAYSI-2) among adolescents in detention settings. Assessment, 11, 290–302.
  5. Baize, H. (2001). Implications of the Ohio scales factor structure for scale utility and scoring. In The Fourth Annual California Children’s System of Care Model Evaluation Conference, San Francisco, CA.
  6. Biederman, J., Monuteaux, M., Mick, E., Spencer, T., Wilens, T., Silva, J., et al. (2006). Young adult outcome of attention deficit hyperactivity disorder: A controlled 10 year prospective follow-up study. Psychological Medicine, 36, 167–179.
  7. Brown, T. A. (2006). Confirmatory factor analysis for applied research. New York, NY: Guilford.
  8. Carlston, D., & Ogles, B. (2006). The impact of items and anchors on parent–child reports of problem behavior. Child and Adolescent Social Work Journal, 1, 24–37.
  9. Colins, O., Vermeiren, R., Vreugdenhil, C., van den Brink, W., Doreleijers, T., & Broekaert, E. (2010). Psychiatric disorders in detained male adolescents: A systematic literature review. Canadian Journal of Psychiatry, 55, 255–263.
  10. Colwell, B., Villarreal, S., & Espinosa, E. (2012). Preliminary outcomes of a pre-adjudication diversion initiative for juvenile justice involved youth with mental health needs in Texas. Criminal Justice and Behavior, 39, 444–460.
  11. Cuellar, A. E., McReynolds, L. S., & Wasserman, G. A. (2006). A cure for crime: Can mental health treatment diversion reduce crime among youth? Journal of Policy Analysis and Management, 25, 197–214.
  12. Dowell, K. A., & Ogles, B. M. (2008). The Ohio scales youth form: Expansion and validation of a self-report outcome measure for young children. Journal of Child and Family Studies, 17, 291–305.
  13. Grisso, T., & Barnum, R. (2006). Massachusetts youth screening instrument-version 2: User’s manual and technical report. Sarasota, FL: Professional Resource Press.
  14. Grisso, T., Fusco, S., Paiva-Salisbury, M., Perrault, R., Williams, V., & Barnum, R. (2001). The Massachusetts youth screening instrument-version 2 (MAYSI-2): Comprehensive research review. Worcester, MA: University of Massachusetts Medical School.
  15. Hatfield, D., & Ogles, B. M. (2004). The use of outcome measures by psychologists in clinical practice. Professional Psychology: Research & Practice, 35, 485–491.
  16. Hatfield, D. R., & Ogles, B. M. (2007). Why some clinicians use outcome measures and others do not. Administration and Policy in Mental Health, 34, 283–291.
  17. Henggeler, S. W., Halliday-Boykins, C. A., Cunningham, P. B., Randall, J., Shapiro, S. B., & Chapman, J. E. (2006). Juvenile drug court: Enhancing outcomes by integrating evidence-based treatments. Journal of Consulting and Clinical Psychology, 74, 42–54.
  18. Hu, L., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1–55.
  19. Kenny, D. A. (2014). Measuring model fit. Retrieved May 20, 2015, from http://davidakenny.net/cm/fit.htm.
  20. Kenny, D. A., & McCoach, D. B. (2003). Effect of the number of variables on measures of fit in structural equation modeling. Structural Equation Modeling, 10, 333–351.
  21. Khurana, A., Cooksey, E. C., & Gavazzi, S. M. (2011). Juvenile delinquency and teenage pregnancy: A comparison of ecological risk profiles among Midwestern European American and African American female juvenile offenders. Psychology of Women Quarterly, 35, 282–292.
  22. Kline, R. B. (2001). Chapter 7: Estimation. In Principles and practice of structural equation modeling (pp. 154–188). New York, NY: Guilford Press.
  23. Ko, S. J., Ford, J. D., Kassam-Adams, N., Berkowitz, S. J., Wilson, C., Wong, M., et al. (2008). Creating trauma-informed systems: Child welfare, education, first responders, healthcare, juvenile justice. Professional Psychology: Research and Practice, 39, 396–404.
  24. Kretschmar, J., Butcher, F., & Flannery, D. (2013). An evaluation of the behavioral health/juvenile justice initiative. Behavioral Health in Ohio—Current Research Trends, 1, 18–30.
  25. Kretschmar, J. M., Butcher, F., Flannery, D. J., & Singer, M. I. (2014). Diverting juvenile justice-involved youth with behavioral health issues from detention: Preliminary findings from Ohio’s behavioral health juvenile justice (BHJJ) initiative. Criminal Justice Policy Review, 1–24.
  26. Kretschmar, J., & Flannery, D. J. (2011). Displacement and suicide risk for juvenile justice-involved youth with mental health issues. Journal of Clinical Child and Adolescent Psychology, 40, 797–806.
  27. Lambert, M. J., Harmon, C., Slade, K., Whipple, J. L., & Hawkins, E. J. (2005). Providing feedback to psychotherapists on their patients’ progress: Clinical results and practice suggestions. Journal of Clinical Psychology, 61, 165–174.
  28. Lambert, M. J., Whipple, J. L., Vermeersch, D. A., Smart, D. W., Hawkins, E. J., Nielsen, S. L., & Goates, M. (2002). Enhancing psychotherapy outcomes via providing feedback on client progress: A replication. Clinical Psychology and Psychotherapy, 9, 91–103.
  29. Leve, L. D., & Chamberlain, P. (2005). Association with delinquent peers: Intervention effects for youth in the juvenile justice system. Journal of Abnormal Child Psychology, 33, 339–347.
  30. MacCallum, R. C., Browne, M. W., & Sugawara, H. M. (1996). Power analysis and determination of sample size for covariance structure modeling. Psychological Methods, 1, 130–149.
  31. Marsh, H. W., Hau, K., & Wen, Z. (2004). In search of golden rules: Comment on hypothesis-testing approaches to setting cutoff values for fit indexes and dangers in overgeneralizing Hu and Bentler’s (1999) findings. Structural Equation Modeling: A Multidisciplinary Journal, 11, 320–341.
  32. Muthén, L. K., & Muthén, B. O. (1998–2014). Mplus user’s guide (7th ed.). Los Angeles, CA: Muthén & Muthén.
  33. Ogles, B. M., Melendez, G., Davis, D. C., & Lunnen, K. M. (2000). The Ohio youth problem, functioning, and satisfaction scales: Technical manual. Athens, OH: Ohio University.
  34. Ogles, B. M., Melendez, G., Davis, D. C., & Lunnen, K. M. (2001). The Ohio scales: Practical outcome assessment. Journal of Child and Family Studies, 10, 199–212.
  35. Schermelleh-Engel, K., Moosbrugger, H., & Muller, H. (2003). Evaluating the fit of structural equation models: Tests of significance and descriptive goodness-of-fit measures. Methods of Psychological Research Online, 8, 23–74.
  36. Shufelt, J. L., & Cocozza, J. J. (2006). Youth with mental health disorders in the juvenile justice system: Results from a multi-state prevalence study. Delmar, NY: National Center for Mental Health and Juvenile Justice.
  37. Soler, M. (2002). Health issues for adolescents in the justice system. Journal of Adolescent Health, 31, 321–333.
  38. Teplin, L. A., Abram, K. M., McClelland, G. M., Dulcan, M. K., & Mericle, A. A. (2002). Psychiatric disorders in youth in juvenile detention. Archives of General Psychiatry, 59, 1133–1143.
  39. Texas Department of State Health Services. (2005). Child and adolescent Texas recommended assessment guidelines (CA-TRAG). Retrieved May 20, 2015, from http://www.dshs.state.tx.us/mhsa/trr/.
  40. Turchik, J. A., Karpenko, V., & Ogles, B. M. (2007). Further evidence of the utility and validity of a measure of outcome for children and adolescents. Journal of Emotional and Behavioral Disorders, 15, 119–128.
  41. Warnick, E. M., Weersing, V. R., Scahill, L., & Woolston, J. L. (2009). Selecting measures for use in child mental health services: A scorecard approach. Administration and Policy in Mental Health, 26, 112–122.
  42. Wasserman, G. A., & McReynolds, L. S. (2011). Contributors to traumatic exposure and posttraumatic stress disorder in juvenile justice youths. Journal of Traumatic Stress, 24, 422–429.
  43. Whipple, J. L., Lambert, M. J., Vermeersch, D. A., Smart, D. W., Nielsen, S. L., & Hawkins, E. J. (2003). Improving the effects of psychotherapy: The use of early identification of treatment and problem-solving strategies in routine practice. Journal of Counseling Psychology, 50, 59–68.

Copyright information

© Springer Science+Business Media New York 2015

Authors and Affiliations

  • Krystel Tossone
  • Jeff Kretschmar
  • Fredrick Butcher
  • Leon Harris
  1. Begun Center for Violence Prevention Research and Education, Case Western Reserve University, Cleveland, USA