The Journal of Primary Prevention, Volume 40, Issue 1, pp 69–87

Qualitative Comparative Analysis: A Mixed-Method Tool for Complex Implementation Questions

  • Laura G. Hill
  • Brittany Rhoades Cooper
  • Louise A. Parker
Original Paper


Abstract

The translation and scale-up of evidence-based programs require new methods to guide implementation decisions across varying contexts. As programs are translated to real-world settings, variability is introduced. Some program components may play minor roles in producing positive outcomes, and some may play major roles, but only if adapted to meet different contextual demands. While some sources of variability are likely to improve program outcomes, we currently lack methods to determine the critical components, or combinations of components, that serve as causal pathways to a desired outcome and then to advise practitioners accordingly. In this paper, we introduce a promising tool for this purpose and illustrate its use in a translational research context. Qualitative Comparative Analysis (QCA) is often used to examine causality in situations with complex, multiply determined outcomes. The basic premise of QCA is that different sets of causal conditions, or causal pathways, may lead to a single outcome (the principle of equifinality). We applied QCA to a selection of the highest- and lowest-performing programs from a multi-year, two-state dissemination of the Strengthening Families Program for Parents and Adolescents 10–14 to determine which components, or combinations of components, at the implementation, program delivery, and participant levels produced desired participant outcomes. In particular, we examined which components were necessary (i.e., in the absence of these factors, the outcome did not occur) and which were sufficient (i.e., in the presence of these factors, the outcome always occurred). Results demonstrated that certain conditions were necessary for program success. In addition, given those necessary conditions, there were two sets of conditions sufficient to produce success, regardless of the presence or absence of any of the others.
QCA, not previously used in prevention science research, helps to illuminate causal pathways, leading to concrete, evidence-based implementation decisions that facilitate generalization and scale-up.
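As a toy illustration of the necessity and sufficiency tests that crisp-set QCA performs, the following sketch uses hypothetical binary condition data (the condition names and values are invented for illustration and are not drawn from the study):

```python
# Toy crisp-set QCA sketch (illustrative only; hypothetical data).
# Each case records binary causal conditions and a binary outcome
# (1 = program success, 0 = no success).
cases = [
    {"fidelity": 1, "adaptation": 1, "engagement": 1, "success": 1},
    {"fidelity": 1, "adaptation": 0, "engagement": 1, "success": 1},
    {"fidelity": 1, "adaptation": 1, "engagement": 0, "success": 0},
    {"fidelity": 0, "adaptation": 1, "engagement": 1, "success": 0},
]

def is_necessary(condition, cases):
    """Necessary: every case with the outcome also exhibits the condition."""
    return all(c[condition] == 1 for c in cases if c["success"] == 1)

def is_sufficient(condition, cases):
    """Sufficient: every case exhibiting the condition shows the outcome."""
    return all(c["success"] == 1 for c in cases if c[condition] == 1)

for cond in ("fidelity", "adaptation", "engagement"):
    print(cond,
          "necessary:", is_necessary(cond, cases),
          "sufficient:", is_sufficient(cond, cases))
```

In this made-up data, fidelity and engagement are each necessary (all successful cases have them) but neither alone is sufficient (each also appears in a failing case). Dedicated QCA software such as fs/QCA goes further, minimizing a truth table of all observed condition combinations to identify the conjunctions of conditions (causal recipes) sufficient for the outcome.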


Keywords: Implementation and dissemination · Prevention · Translation · Scaling up · Strengthening Families Program 10–14


Compliance With Ethical Standards

Conflict of Interest

All authors declare that they have no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. The protocol for this study, which used archival data, was examined by the Institutional Review Board of [omitted for blind review] University and deemed exempt from formal review.



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. Washington State University, Pullman, USA
