Classifying Changes to Preventive Interventions: Applying Adaptation Taxonomies
High-quality implementation is important for preventive intervention effectiveness. Although this implies fidelity to a practice model, some adaptation may be inevitable or even advantageous in routine practice settings. To organize the study of adaptation and its effect on intervention outcomes, scholars have proposed various adaptation taxonomies. This paper examines how four published taxonomies retrospectively classify adaptations: the Ecological Validity Framework (EVF; Bernal et al. in J Abnorm Child Psychol 23(1):67–82, 1995), the Hybrid Prevention Program Model (HPPM; Castro et al. in Prev Sci 5(1):41–45, 2004. https://doi.org/10.1023/B:PREV.0000013980.12412.cd), the Moore et al. (J Prim Prev 34(3):147–161, 2013. https://doi.org/10.1007/s10935-013-0303-6) taxonomy, and the Stirman et al. (Implement Sci 8:65, 2013. https://doi.org/10.1186/1748-5908-8-65) taxonomy. We used these taxonomies to classify teacher-reported adaptations made during the implementation of TOOLBOX™, a social-emotional learning program implemented in 11 elementary schools during the 2014–2015 academic year. Post-implementation, 271 teachers and staff responded to an online survey that included questions about adaptation, yielding 98 adaptation descriptions provided by 42 respondents. Four raters used each taxonomy to classify these descriptions. We assessed the extent to which raters agreed that the descriptions could be classified using each taxonomy (coverage), as well as the extent to which raters agreed on the subcategory they assigned (clarity). Results varied across taxonomies, and tensions between the ideals of coverage and clarity emerged. Further study of adaptation taxonomies as coding instruments may improve their performance, helping scholars more consistently assess adaptations and their effects on preventive intervention outcomes.
Keywords: Adaptation · Implementation · Measurement · Prevention · Social and emotional learning
This research was funded by the Stuart Foundation and a Hellman Foundation Graduate Fellow Award. We thank Mark Collin, Dr. Chuck Fisher, Pamela McVeagh-Lally, and our colleagues at the UC Berkeley Center for Prevention Research in Social Welfare (especially Dr. Sarah Accomazzo and Kimberly Knodel) for their contributions to this work. We also thank Dr. Stacey Alexeeff for statistical consultation. Finally, we thank the administrators, teachers, and staff who participated in this research, and Catherine Rodecker and Dr. Kathryn Mapps for their implementation and evaluation leadership. Aspects of this paper were previously presented at the 2016 Society for Prevention Research conference in San Francisco and the 2017 Society for Social Work and Research conference in New Orleans. All research protocols were approved by the Committee for the Protection of Human Subjects (CPHS) at the University of California, Berkeley.
Compliance With Ethical Standards
Conflict of Interest
The authors declare they have no conflicts of interest.
Ethical Approval
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards.
Informed Consent
Informed consent was obtained from all individual participants included in the study.
- Baumann, A. A., Powell, B. J., Kohl, P. L., Tabak, R. G., Penalba, V., Proctor, E. K., et al. (2015). Cultural adaptation and implementation of evidence-based parent-training: A systematic review and critique of guiding evidence. Children and Youth Services Review, 53, 113–120. https://doi.org/10.1016/j.childyouth.2015.03.025
- Castro, F. G., Barrera, M., Jr., & Martinez, C. R., Jr. (2004). The cultural adaptation of prevention interventions: Resolving tensions between fidelity and fit. Prevention Science, 5(1), 41–45. https://doi.org/10.1023/B:PREV.0000013980.12412.cd
- Center for Prevention Implementation Methodology. (n.d.). Retrieved July 13, 2018, from http://cepim.northwestern.edu/.
- Colby, M., Hecht, M. L., Miller-Day, M., Krieger, J. L., Syvertsen, A. K., Graham, J. W., et al. (2013). Adapting school-based substance use prevention curriculum through cultural grounding: A review and exemplar of adaptation processes for rural schools. American Journal of Community Psychology, 51, 190–205. https://doi.org/10.1007/s10464-012-9524-8
- Collin, M. A. (2015). TOOLBOX™ Primer. Sebastopol, CA: Dovetail Learning Inc.
- Elliott, D. S., & Mihalic, S. (2004). Issues in disseminating and replicating effective prevention programs. Prevention Science. https://doi.org/10.1023/b:prev.0000013981.28071.52
- Glasgow, R. E., & Emmons, K. M. (2007). How can we increase translation of research into practice? Types of evidence needed. Annual Review of Public Health, 28(1), 413–433. https://doi.org/10.1146/annurev.publhealth.28.021406.144145
- Goncy, E. A., Sutherland, K. S., Farrell, A. D., Sullivan, T. N., & Doyle, S. T. (2015). Measuring teacher implementation in delivery of a bullying prevention program: The impact of instructional and procedural adherence and competence on student responsiveness. Prevention Science, 16(3), 440–450. https://doi.org/10.1007/s11121-014-0508-9
- Gould, S. J. (2011). Full house. Cambridge: Harvard University Press.
- Lewis, C. C., Stanick, C. F., Martinez, R. G., Weiner, B. J., Kim, M., Barwick, M., et al. (2015). The Society for Implementation Research Collaboration instrument review project: A methodology to promote rigorous evaluation. Implementation Science, 10(1), 2. https://doi.org/10.1186/s13012-014-0193-x
- O’Connell, M. E. (2009). Preventing mental, emotional, and behavioral disorders among young people: Progress and possibilities. Washington, DC: National Academies Press.
- Ogden, T., Bjørnebekk, G., Kjøbli, J., Patras, J., Christiansen, T., Taraldsen, K., et al. (2012). Measurement of implementation components ten years after a nationwide introduction of empirically supported programs: A pilot study. Implementation Science, 7(1), 49. https://doi.org/10.1186/1748-5908-7-49
- Shapiro, V. B., Kim, B. K. E., Accomazzo, S., & Roscoe, J. N. (2016). Predictors of rater bias in the assessment of social-emotional competence. International Journal of Emotional Education, 8(2), 25.
- Spoth, R., Rohrbach, L. A., Greenberg, M., Leaf, P., Brown, C. H., Fagan, A., et al. (2013). Addressing core challenges for the next generation of type 2 translation research and systems: The Translation Science to Population Impact (TSci Impact) framework. Prevention Science, 14(4), 319–351. https://doi.org/10.1007/s11121-012-0362-6
- SPR MAPS II Task Force. (2008). Type 2 translational research: Overview and definitions. Retrieved June 13, 2017, from http://www.preventionresearch.org/SPR_Type2TranslationResearch_OverviewandDefinition.pdf.
- Wiltsey Stirman, S., Gutner, C. A., Crits-Christoph, P., Edmunds, J., Evans, A. C., & Beidas, R. S. (2015). Relationships between clinician-level attributes and fidelity-consistent and fidelity-inconsistent modifications to an evidence-based psychotherapy. Implementation Science, 10(1), 115. https://doi.org/10.1186/s13012-015-0308-z