Abstract
Determining the factors that influence implementation of school-based wellbeing and health programs is essential for achieving desired program effects. Using a convergent mixed-methods, multiple-informant design, this study examined which factors influence implementation of health programs for ninth-grade students and how implementation is perceived differently by multiple informants (i.e., participants, instructors, and independent observers). Two types of programs (mindfulness and health education) were implemented with ninth graders (N = 70) in three schools situated in low-resourced urban neighborhoods. Study outcomes were derived from four data sources: (1) focus group participants (N = 45); (2) program instructor fidelity ratings; (3) independent observer fidelity ratings and notes; and (4) instructor open-ended session responses. Using thematic analysis and mixed-methods integration analysis, we identified themes related to factors that promote or challenge implementation. Theme names differed when each informant group's data sources were analyzed separately. Mixed-methods integration analysis indicated that four themes were common across all informant groups: (1) competent, attentive, and engaging instructors are essential; (2) programs should involve interactive components (e.g., physical activities, applied learning opportunities); (3) adequate time for program delivery is key for student exposure and engagement; and (4) students’ availability and preferences should guide program scheduling. A fifth theme, unique to instructor and observer perspectives, was that program implementation was negatively affected by distractions from multiple sources, including instructors, students, and settings. Recommendations from students, instructors, and observers for optimizing implementation are discussed.
Data Availability
De-identified aggregate data analyzed for this paper are available from the corresponding author upon reasonable request.
Acknowledgements
We are grateful to the participating schools and program implementers, and to the students for their time and willingness to participate in this study. We also appreciate the support of our research associates and assistants: Jessica Stavig, Yuanfang Liu, Jeffrey Krick, Marcus Nole, Rachel Dows, Violet Odom, Alex Welna, Steven Sheridan, and Drs. Laura Clary and Qing Duan.
Funding
This work was supported by the National Center for Complementary and Integrative Health (grant number: R61AT009856; MPIs: Fishbein, Mendelson).
Author information
Authors and Affiliations
Corresponding author
Ethics declarations
Conflict of Interest
The authors declare no conflicts of interest.
Ethics Approval
This study was performed in accordance with the ethical standards as laid down in the 1964 Declaration of Helsinki and its later amendments or comparable ethical standards. Consenting and procedures were approved by the institutional review board at the Johns Hopkins Bloomberg School of Public Health, and reliance agreements were established with the Pennsylvania State University, the University of Cincinnati, and the University of Illinois.
Consent to Participate
Parental written consent and student written assent were obtained.
Disclaimer
The opinions and conclusions expressed are solely the authors’ and should not be construed as representing the opinions of NIH or any agency of the Federal Government. NIH had no role in study design; data collection, analysis, or interpretation; or the writing of the report and its submission for publication.
Cite this article
Dariotis, J.K., Mabisi, K., Jackson-Gordon, R. et al. Implementing Adolescent Wellbeing and Health Programs in Schools: Insights from a Mixed Methods and Multiple Informant Study. Prev Sci 24, 663–675 (2023). https://doi.org/10.1007/s11121-022-01481-2