The Prevention Delivery System: Organizational Context and Use of Comprehensive Programming Frameworks

Original Paper
American Journal of Community Psychology

Abstract

The purpose of this exploratory study is to investigate organizational-level mechanisms in the Prevention Delivery System (PDS) and their influence on implementing comprehensive programming frameworks (e.g., Communities that Care [CtC]) as the innovation. The PDS is part of the Interactive Systems Framework for Dissemination and Implementation (ISF) and describes key characteristics of innovation implementation and dissemination. The study addresses two research questions: (1) What types of organizational characteristics are related to successful use of each of the programming processes (i.e., planning, implementation, evaluation, and sustainability) that are part of comprehensive programming frameworks? (2) What are the similarities and differences in the organizational patterns correlated with use of each of the programming processes? Surveys, interview data, and other documents designed to assess organizational characteristics and the extent of use of a comprehensive programming framework over time were collected from 8 community boards and 23 provider agencies. These organizations were responsible for planning and delivering substance abuse prevention services as part of a statewide initiative in Ohio. Data were analyzed using Spearman rho (and rank-biserial) correlations, with an emphasis on effect sizes. Results indicated that leadership, shared vision, process advocates, and technical assistance were common correlates of use across programming processes. However, the role played by these organizational variables differed for each programming process, pointing to complex interactions of the organizational infrastructure with other variables (i.e., characteristics of the innovation itself and external macro-level factors). This exploratory study provides preliminary data on the organizational-level mechanisms of the PDS and the complexity of their relationships with the other systems in the Interactive Systems Framework.
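
As a rough illustration of the correlational analyses named above, the sketch below computes a Spearman rho and a rank-biserial correlation with SciPy. All data values and variable names are hypothetical; this is not the authors' analysis code.

```python
# Minimal sketch of the correlational analyses described in the abstract.
# All data values and variable names are hypothetical, for illustration only.
import numpy as np
from scipy.stats import spearmanr, mannwhitneyu

# Interval-level organizational characteristic vs. ordinal "use" score
# for 8 hypothetical boards.
leadership = np.array([2.8, 3.1, 3.4, 2.5, 3.9, 3.0, 3.6, 2.9])
use_score = np.array([2, 3, 4, 1, 4, 2, 3, 2])

rho, p_value = spearmanr(leadership, use_score)

# Dichotomous characteristic (e.g., presence of a process advocate) vs. use:
# the rank-biserial r can be derived from the Mann-Whitney U statistic.
advocate = np.array([1, 1, 0, 0, 1, 0, 1, 0], dtype=bool)
u_stat, _ = mannwhitneyu(use_score[advocate], use_score[~advocate],
                         alternative="two-sided")
rank_biserial = 1 - (2 * u_stat) / (advocate.sum() * (~advocate).sum())

print(f"Spearman rho = {rho:.2f}, rank-biserial r = {rank_biserial:.2f}")
```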


Notes

  1. Programming is defined as the process by which programs are planned, implemented, evaluated, and sustained.

  2. Use and implementation are used interchangeably throughout this paper.

  3. While the lead author tested the model in its entirety in her original study (Livet 2006), only results relevant to the PDS are presented in this article.

  4. Ordinal regression analyses for the implementation, evaluation, and sustainability models were also conducted, but are beyond the scope of this article.

  5. Because there was no turnover in the boards that first year, this variable was not included in further planning-related analyses. Due to the homogeneity of responses on the “sufficiency of financial resources” item, it was also dropped.

  6. Both framework type (CtC or PfS; for the boards) and cohort membership (for the provider agencies) were examined as potential confounding variables. Results indicated that the boards that selected PfS (N = 3) were more likely to have higher use scores than the boards that chose CtC (N = 5) (rank-biserial r = .60). However, based on the results of the Mann-Whitney U tests (interval-level organizational variables) and Fisher exact tests (dichotomous organizational variables), there were no significant differences between the PfS boards and the CtC boards on any of the organizational characteristics. Based on rank-biserial correlational data, cohort membership was not significantly correlated with implementation, evaluation, or sustainability process use by the provider agencies.

    Note: Framework type was not investigated as a potential confounder for the implementation, evaluation, and sustainability use scores since implementation, evaluation, and sustainability guidelines were standardized across provider agencies by the external evaluation team. Cohort membership was not evaluated for the planning use score since planning data from only one cohort of boards were used in this study.
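
A minimal sketch of the confound checks described in this note, assuming SciPy; the counts and scores are invented for illustration and are not the study data.

```python
# Sketch of the confound checks in Note 6; counts and scores are hypothetical.
from scipy.stats import mannwhitneyu, fisher_exact

# Interval-level organizational variable: PfS boards (n = 3) vs. CtC boards (n = 5).
pfs_scores = [3.2, 3.8, 3.5]
ctc_scores = [3.0, 3.4, 2.9, 3.6, 3.1]
u_stat, p_mwu = mannwhitneyu(pfs_scores, ctc_scores, alternative="two-sided")

# Dichotomous organizational variable: a 2x2 table of framework type (rows)
# by yes/no on the characteristic (columns), tested with Fisher's exact test.
table = [[2, 1],   # PfS boards: yes, no
         [1, 4]]   # CtC boards: yes, no
odds_ratio, p_fisher = fisher_exact(table)

print(f"Mann-Whitney p = {p_mwu:.3f}, Fisher exact p = {p_fisher:.3f}")
```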

References

  • Amodeo, M., & Gal, C. (1997). Strategies for ensuring use of needs assessment findings: Experiences of a community substance abuse prevention program. Journal of Primary Prevention, 18, 227–242.

  • Barnette, J. E., & Clendenen, F. (1996). The quality journey in a comprehensive mental health center: A case study. The Joint Commission Journal on Quality Improvement, 22, 8–17.

  • Bass, B. M. (1985). Leadership and performance beyond expectation. New York: Free Press.

  • Blake, R. R., & Mouton, J. S. (1978). The new managerial grid. Houston, TX: Gulf.

  • Bronfenbrenner, U. (1979). The ecology of human development. Cambridge, MA: Harvard University Press.

  • Caplan, R. D. (1971). Organizational stress and individual strain: A social psychological study of risk factors in coronary heart disease among administrators, engineers, and scientists. Institute for Social Research, University of Michigan, University Microfilms No. 72–14822, Ann Arbor, Michigan.

  • Caplan, R. D., & Jones, K. W. (1975). Effects of work load, role ambiguity, and type A personality on anxiety, depression, and heart rate. Journal of Applied Psychology, 60, 713–719.

  • Chinman, M., Hunter, S. B., Ebener, P., Paddock, S. M., Stillman, L., Imm, P., et al. (2008). The Getting To Outcomes Demonstration and Evaluation: An illustration of the Prevention Support System. American Journal of Community Psychology, in press.

  • DiFranceisco, W., Kelly, J. A., Otto, S. L., McAuliffe, T. L., Somlai, A. M., Hackl, K., Heckman, T. G., & Holtgrave, D. R. (1999). Factors influencing attitudes within AIDS service organizations toward the use of research-based HIV prevention interventions. AIDS Education and Prevention, 11, 72–86.

  • Fiedler, F. E., & Garcia, J. E. (1987). New approaches to effective leadership. New York: John Wiley.

  • Fixsen, D. L., Naoom, S. F., Blase, K. A., Friedman, R. M., & Wallace, F. (2005). Implementation research: A synthesis of the literature. Tampa, FL: University of South Florida, Louis de la Parte Mental Health Institute, The National Implementation Research Network.

  • Fredericksen, P., & London, R. (2000). Disconnect in the hollow state: The pivotal role of organizational capacity in community-based development organizations. Public Administration Review, 60, 230–239.

  • Gardner, J. (1989). On leadership. New York: Free Press.

  • Goodman, R. M., Becker, A. B., Wright, B., Shada, R., & Lempa, M. (2004). Capacities of Community-based initiatives that most influence desired outcomes: A cross-case research study. Manuscript submitted for publication.

  • Goodman, R. M., & Wandersman, A. (1994). FORECAST: A formative approach to evaluating community coalitions and community-based initiatives. Journal of Community Psychology, CSAP Special Issue, 22, 6–25.

  • Green, L. (2001). From research to “best practices” in other settings and populations. American Journal of Health Behavior, 25, 165–178.

  • Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: Systematic review and recommendations. Milbank Quarterly, 82, 581–629.

  • Hall, G. E., & Hord, S. M. (1987). Change in schools: Facilitating the process. Albany, NY: State University of New York Press.

  • Hall, G. E., & Hord, S. M. (2001). Implementing change: Patterns, principles, and potholes. Boston, MA: Allyn & Bacon.

  • Hanlon, P., Murie, J., Gregan, J., McEwen, J., Moir, D., & Russell, E. (1998). A study to determine how needs assessment is being used to improve health. Public Health, 112, 343–346.

  • Hawkins, J. D., & Catalano, R. F. (1992). Communities that care. San Francisco: Jossey-Bass.

  • Johnson, K. W. (1989). Knowledge utilization and planned change: An empirical assessment of the A VICTORY model. Knowledge in Society: The International Journal of Knowledge Transfer, 2, 57–79.

  • Johnson, K. W. (1999). Structural equation modeling in practice: Testing a theory for research use. Journal of Social Science Research, 24, 131–171.

  • Johnson, K., Hays, C., Center, H., & Daley, C. (2004). Building capacity and sustainable prevention innovations: A sustainability planning model. Evaluation and Program Planning, 27, 135–149.

  • Klein, K. J., & Sorra, J. S. (1996). The challenge of innovation implementation. Academy of Management Review, 21, 1055–1071.

  • Livet, M., & Wandersman, A. (2005). Organizational functioning: Facilitating effective interventions and increasing the odds of programming success. In D. Fetterman & A. Wandersman (Eds.), Empowerment evaluation principles in practice. New York: Guilford Press.

  • Livet, M. (2006). Organizational characteristics, use of comprehensive programming frameworks, and programming quality: An exploratory study. Doctoral Dissertation, University of South Carolina.

  • MacDonald, M. A., & Green, L. W. (2001). Reconciling concept and context: The dilemma of implementation in school-based health promotion. Health Education and Behavior, 28, 749–768.

  • Miller, R. L., Bedney, B. J., & Guenther-Grey, C. (2003). Assessing organizational capacity to deliver HIV prevention services collaboratively: Tales from the field. Health Education and Behavior, 30, 582–600.

  • Moss, N. (1983). An organization-environment framework for assessing program implementation. Evaluation and Program Planning, 6, 153–164.

  • Preskill, H., & Torres, R. T. (1998). Evaluative inquiry for learning in organizations. Thousand Oaks, CA: Sage.

  • Prestby, J. E., & Wandersman, A. (1985). An empirical exploration of a framework of organizational viability: Maintaining block organizations. Journal of Applied Behavioral Science, 21, 287–305.

  • Salem, E., Hooberman, J., & Ramirez, D. (2005). MAPP in Chicago: A model for public health systems development and community building. Journal of Public Health Management and Practice, 11, 393–400.

  • Schminke, M., Cropanzano, R., & Rupp, D. E. (2002). Organization structure and fairness perceptions: The moderating effect of organizational level. Organizational Behavior and Human Decision Processes, 89, 881–905.

  • Shediac-Rizkallah, M. C., & Bone, L. R. (1998). Planning for the sustainability of community-based health programs: Conceptual frameworks and future directions for research, practice, and policy. Health Education Research, 13, 87–108.

  • Simpson, D. D. (2002). A conceptual framework for transferring research to practice. Journal of Substance Abuse Treatment, 22, 171–182.

  • Wandersman, A., Duffy, J., Flaspohler, P., Noonan, R., Lubell, K., Stillman, L., Blachman, M., Dunville, R., & Saul, J. (2008). Bridging the gap between prevention science and practice: The Interactive Systems Framework for Dissemination and Implementation. American Journal of Community Psychology, in press.

  • Wandersman, A., & Florin, P. (2003). Community interventions and effective prevention. American Psychologist, 58, 441–448.

  • Wandersman, A., Imm, P., Chinman, M., & Kaftarian, S. (2000). Getting to outcomes: A results-based approach to accountability. Evaluation and Program Planning, 23, 389–395.

  • Wandersman, A., Goodman, R. M., & Butterfoss, F. D. (1997). Understanding coalitions and how they operate: An “open systems” organizational framework. In M. Minkler (Ed.), Community organizing and community building for health. New Brunswick, NJ: Rutgers University Press.

  • Weiss, E., Miller Anderson, R., & Lasker, R. (2002). Making the most of collaboration: Exploring the relationship between partnership synergy and partnership functioning. Health Education and Behavior, 29, 683–698.


Acknowledgments

Results included in this article were part of the lead author’s dissertation work. As such, Dr. Livet would like to thank the individuals who contributed to the successful completion of her dissertation through their continuous support and encouragement, including: dissertation committee members Abraham Wandersman, Keith Davis, Fred Medway, and Arlene Bowers Andrews; the South Carolina Center for Public Health Preparedness staff, in particular Jane Richter, Center Director, and Karen Pendleton, Stephanie Thompson, and Kathleen Leopard; friends and family, including her husband, Dorian Garcia, Ph.D.; and Duncan Meyers, University of South Carolina clinical-community psychology graduate student. Dr. Livet would also like to extend a special thanks to the two PIRE (Pacific Institute for Research and Evaluation) staff members responsible for the evaluation of the Ohio SIG (State Incentive Grant) project: Matt Courser and David Collins facilitated access to the study participants, assisted with data collection, and provided invaluable input and feedback on all aspects of the dissertation.

Author information

Correspondence to Melanie Livet.

Appendix

Organizational surveys

Overall organizational functioning survey (one version for provider agencies and one for boards)

| Organizational concept | Measure | Score range (for article data) | Psychometric properties |
|---|---|---|---|
| Staffing: expertise (related to program development) | PIRE items (9 items) | 4-point Likert scale, "poor" to "excellent" | Article study: alpha = .88 (SIG provider agencies), .73 (SIG boards) |
| Internal relationships: leadership | 10 items from the Partnership Self-Assessment Tool (Weiss et al. 2002) | 4-point Likert scale, "poor" to "excellent" | Factor loadings .85–.93 (Weiss et al. 2002); article study: alpha = .88 (provider agencies), .73 (boards) |
| Internal relationships: intraagency collaboration/problem solving | 12-item Collaboration and Problem Solving subscale of the Readiness for Organizational Learning and Evaluation (ROLE) instrument (Preskill and Torres 1998) | 5-point Likert scale, "strongly disagree" to "strongly agree" | Alpha = .88 (Preskill and Torres 1998); article study: alpha = .91 (provider agencies), .87 (boards) |
| Internal relationships: intraagency communication and decision making | 10-item Collaboration and Participatory Decision Making subscale of the ROLE instrument (Preskill and Torres 1998) | 5-point Likert scale, "strongly disagree" to "strongly agree" | Alpha = .89 (Preskill and Torres 1998); article study: alpha = .95 (provider agencies), .79 (boards) |
| Internal relationships: role clarity (i.e., firm role boundaries) | Adapted version of the 4-item role ambiguity measure (Caplan 1971) | 5-point Likert scale, "strongly disagree" to "strongly agree" (higher scores = less ambiguous or blurred role boundaries) | Average inter-item correlation of .46 and estimated reliability coefficient of .82 (Caplan 1971; Caplan and Jones 1975); article study: alpha = .96 (provider agencies), .77 (boards) |
| Internal relationships: shared vision | 4 items from the Coalition Self-Assessment Survey II (Allies Against Asthma) | 5-point Likert scale, "strongly disagree" to "strongly agree" | Article study: alpha = .93 (provider agencies), .94 (boards) |
| Internal relationships: turnover | Computed as (# of staff exiting jobs / average actual # of employees during the time period specified) × (12 / # of months in the time period specified); see the sketch after these tables | N/A | N/A |
| Internal structure: formalization (i.e., degree to which the organization keeps records and has written procedures/policies) | Adapted 5 items from Schminke et al.'s (2002) organizational assessment survey | 5-point Likert scale, "strongly disagree" to "strongly agree" | Alpha = .73 (Schminke et al. 2002); article study: alpha = .94 (provider agencies), .75 (boards) |
| Internal structure: flexibility/risk taking | 5-item Risk Taking subscale of the ROLE instrument (Preskill and Torres 1998) | 5-point Likert scale, "strongly disagree" to "strongly agree" | Alpha = .85 (Preskill and Torres 1998); article study: alpha = .86 (provider agencies), .82 (boards) |
| Internal structure: centralization (i.e., degree to which the power structure within the organization is hierarchical) | Adapted 5 items from Schminke et al.'s (2002) organizational assessment survey | 5-point Likert scale, "strongly disagree" to "strongly agree" | Alpha = .82 (Schminke et al. 2002); article study: alpha = .90 (provider agencies), .93 (boards) |
| External relationships: interagency relationships (i.e., extent to which the organization demonstrates positive collaboration and communication with others) | PIRE items (6 items) | 4-point scale, "never" to "many times" | Article study: alpha = .88 (provider agencies), .72 (boards) |

Process-specific organizational survey (one version for provider agencies and one for boards)

| Organizational concept | Measure | Score range (for article data) | Psychometric properties |
|---|---|---|---|
| Human capacity: perceived process knowledge | 1 item for boards (planning using CtC or PfS) and 4 items for provider agencies (implementation, program monitoring, use of evaluation results, and sustainability using the appropriate framework) | 4-point scale, "poor" to "excellent" | N/A |
| Human capacity: benefits and drawbacks of process/framework | 1 item from the Partnership Self-Assessment Tool (Weiss et al. 2002) (planning process for boards; implementation, program monitoring, use of evaluation results, and sustainability for provider agencies) | 6-point scale, "benefits greatly exceeded drawbacks" to "drawbacks greatly exceeded benefits" (higher scores = greater drawbacks) | N/A |
| Human capacity: presence of process champions/advocates | 2 items asking whether someone, either within or outside the organization, advocated for the use of a systematic approach to planning (boards) or to implementing, monitoring, improving, and sustaining (provider agencies) | Yes/No | N/A |
| Technical capacity: availability of information on process/framework | 2 items from the ROLE instrument (Preskill and Torres 1998) (planning process for boards; implementation, program monitoring, use of evaluation results, and sustainability for provider agencies) | 5-point Likert scale, "strongly disagree" to "strongly agree" | Alpha = .79 (Preskill and Torres 1998) |
| Technical capacity: process training | Number of training hours organization members participated in on the process steps (planning steps for boards; implementation, monitoring, improvement, and sustainability steps for provider agencies) | N/A | N/A |
| Technical capacity: process technical assistance (TA) | Number of TA hours organization members received on the process steps (planning steps for boards; implementation, monitoring, improvement, and sustainability steps for provider agencies) | N/A | N/A |
| Fiscal capacity: funding for process | 1 item asking whether or not the organization had the funding necessary to engage in programming activities (planning process for boards; implementation, program monitoring, use of evaluation results, and sustainability for provider agencies) | Yes/No | N/A |
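
The turnover formula and the alpha coefficients reported in the tables above are straightforward to compute. The sketch below shows both the annualized turnover calculation and a standard Cronbach's alpha; all numbers are hypothetical, and this is an illustration rather than the study's analysis code.

```python
# Sketch of the turnover formula and Cronbach's alpha values reported above;
# all numbers are hypothetical, not the study data.
import numpy as np

def annualized_turnover(n_exits: int, avg_headcount: float, months: int) -> float:
    """(# of staff exiting jobs / average actual # of employees in the period)
    x (12 / # of months in the period), per the formula in the table."""
    return (n_exits / avg_headcount) * (12 / months)

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items matrix of Likert responses."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# 3 staff left an agency averaging 20 employees over a 6-month window -> 30%/year.
print(annualized_turnover(n_exits=3, avg_headcount=20.0, months=6))

# 8 respondents x 4 items on a 5-point Likert scale.
responses = np.array([[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4], [2, 3, 2, 2],
                      [4, 4, 5, 4], [3, 2, 3, 3], [5, 4, 4, 5], [2, 2, 3, 2]])
print(round(cronbach_alpha(responses), 2))
```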


Cite this article

Livet, M., Courser, M. & Wandersman, A. The Prevention Delivery System: Organizational Context and Use of Comprehensive Programming Frameworks. Am J Community Psychol 41, 361–378 (2008). https://doi.org/10.1007/s10464-008-9164-1
