
Journal of Child and Family Studies, Volume 26, Issue 4, pp 1051–1055

In the Same Ballpark or a Whole New Ball Game? Staff as Raters of Youth Behavior

  • Valerie B. Shapiro
  • Sarah Accomazzo
  • Jennifer L. Robitaille
Original Paper

Abstract

There is a lack of psychometrically sound tools for measuring youth outcomes in out-of-school time (OST) settings. Consequently, behavior ratings completed by OST staff are being scored as though the raters were teachers, even though cross-informant correlations are notoriously low (meta-analysis r = .27). Across 26 schools, 227 students were assessed by both teachers and OST staff using the Devereux Student Strengths Assessment (DESSA) to measure Social Emotional Competence. These 4th and 5th grade students were 50% male; 53% 5th graders; and 51% Latino, 20% Mixed/Other, 11% Black, 11% Asian, and 7% White. In the full sample, OST staff rated children’s behavior more harshly than teachers (p < .001; d = .32), although the scores were associated (r = .31, p < .001). Among the ratings completed within the same week, teacher and staff distributions were not statistically different. Teacher and staff ratings had a “medium” correlation (r = .42; p = .01) and a classification consistency (88%) that exceeded chance by a “moderate” amount (κ = .43). Few, if any, studies have previously compared the ratings of the same children by teachers and OST providers. Cross-informant inter-rater reliability between teachers and OST staff was higher than expected on the DESSA.
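The abstract reports three agreement statistics: a Pearson correlation between teacher and staff scores, a percent classification consistency, and Cohen's kappa (agreement corrected for chance). The authors' analyses were run in Stata; as an illustrative sketch only, the statistics can be computed from paired ratings like this (the function names and toy data are hypothetical, not from the paper):

```python
import math

def pearson_r(x, y):
    # Pearson correlation between two equal-length lists of scores.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def classification_consistency(a, b):
    # Proportion of cases both raters place in the same category.
    return sum(1 for x, y in zip(a, b) if x == y) / len(a)

def cohens_kappa(a, b):
    # Observed agreement corrected for the agreement expected by chance
    # (Cohen, 1960): kappa = (p_o - p_e) / (1 - p_e).
    n = len(a)
    categories = set(a) | set(b)
    p_o = classification_consistency(a, b)
    p_e = sum((a.count(c) / n) * (b.count(c) / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)
```

For example, two raters who agree on every case yield kappa = 1.0, while agreement no better than chance yields kappa near 0; the paper's kappa = .43 falls in the conventionally "moderate" range.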

Keywords

Social emotional competence · Out-of-school time · After-school · DESSA · Cross-informant inter-rater reliability · Assessment and outcome measures

Notes

Acknowledgements

This project was supported by the Kaiser Permanente Northern California Community Benefit Program. The authors wish to acknowledge Jennette Claassen and all of the Playworks staff who participated in this project, and specifically the coaches, program managers, program associates, and program directors from the Silicon Valley and San Francisco offices. The authors also wish to thank Paul LeBuffe (Devereux Center for Resilient Children), Jack Naglieri & Chavaughn Brown (George Mason University), Jessica Adamson and Scott Marshall (Apperson Evo Social & Emotional), and Mechelle Timmons (CASO, Inc.) who were each generous and creative enablers of this work.

Compliance with Ethical Standards

Conflicts of Interest

Two of the authors are employees of the Devereux Center for Resilient Children, the non-profit organization that developed the Devereux Student Strengths Assessment (DESSA). Although one author of this paper had a role in the development of the DESSA, no author receives any direct financial remuneration from the sale of the DESSA or any other tool or resource mentioned within this manuscript.

Ethical Approval

Data analyzed in this paper were collected for evaluation purposes at a non-profit agency and de-identified data were obtained by the university through a data-share agreement, using procedures in accordance with the ethical standards of the Committee for Protection of Human Subjects (CPHS) at the University of California, Berkeley.

Informed Consent

A consent waiver was granted by the Committee for the Protection of Human Subjects (CPHS) at the University of California, Berkeley.


Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. School of Social Welfare, University of California, Berkeley, USA
  2. Center for Prevention Research in Social Welfare, University of California, Berkeley, USA
  3. Devereux Center for Resilient Children, Villanova, USA
