Annals of Behavioral Medicine, Volume 37, Issue 2, pp 218–227

Costing Behavioral Interventions: A Practical Guide to Enhance Translation

Authors

  • Debra P. Ritzwoller, Institute for Health Research, Kaiser Permanente Colorado
  • Anna Sukhanova, Institute for Health Research, Kaiser Permanente Colorado
  • Bridget Gaglio, Institute for Health Research, Kaiser Permanente Colorado
  • Russell E. Glasgow, Institute for Health Research, Kaiser Permanente Colorado
Original Article

DOI: 10.1007/s12160-009-9088-5

Cite this article as:
Ritzwoller, D.P., Sukhanova, A., Gaglio, B. et al. Ann. Behav. Med. (2009) 37: 218. doi:10.1007/s12160-009-9088-5

Abstract

Background/Purpose

Cost and cost effectiveness of behavioral interventions are critical to their dissemination and implementation in non-academic settings. Given the scarcity of such data and policy makers' increasing demands for evidence of both program effectiveness and efficiency, cost analyses can serve as valuable tools in the evaluation process.

Methods

To stimulate broader use of practical techniques for efficiently estimating the implementation costs of behavioral interventions, we propose a set of analytic steps that can be employed across a broad range of interventions.

Results/Conclusions

Intervention costs must be distinguished from research, development, and recruitment costs. The inclusion of sensitivity analyses is recommended to understand the implications of implementing the intervention in different settings with different intervention resources. To illustrate these procedures, we use data from a smoking reduction practical clinical trial to describe the techniques and methods used to estimate and evaluate the costs associated with the intervention. Estimated intervention costs per participant were $419, with a range of $276 to $703, depending on the number of participants.

Keywords

Cost effectiveness · Behavioral interventions · Intervention costs

Introduction

Multiple reasons have been given for the failure of many behavioral medicine research findings to translate into practice. One candidate is the lack of information on outcomes important to policy makers, including estimates of intervention costs and cost effectiveness [1]. The extent to which cost-effectiveness analyses (CEA) have influenced health care policy and resource allocation decisions has been debated in a number of reviews [2–5]. In this era of rapidly rising health care costs, behavioral medicine programs must compete for limited dollars with many different types of health care services. The willingness of decision makers to adopt effective interventions may be strongly related to the estimated costs associated with implementing behavioral medicine programs [6, 7].

To ensure future dissemination of research findings, it is often necessary not only to demonstrate clinical success but also to show the feasibility of the proposed intervention. However, assessing feasibility can be a challenge in itself. After reviewing numerous publications on the cost and cost effectiveness of behavioral interventions, we identified several challenges that can impede the wide acceptance and implementation of cost assessments [8–11].

First, the recommendations put forth by the Panel on Cost Effectiveness in Health and Medicine (PCHM) commissioned by the US Department of Health and Human Services [12] suggested that cost-effectiveness analyses should be conducted using a reference case, viewed from a societal perspective, and incorporate methods that capture benefits, harms, and costs to all. However, the scope, timeline, and budget constraints associated with many complex behavioral intervention projects often preclude the inclusion of a comprehensive CEA that completely captures all long-term social costs and benefits. Even if one determines that it is optimal to perform a full CEA, the identification of intervention costs associated with behavioral and public health interventions is a central and required component of these analyses [2, 12].

Second, many of the published CEAs rely on simulations that employ techniques such as Markov modeling coupled with Monte Carlo analyses, which focus on estimating long-term cost effectiveness rather than on the costs or resources needed to implement the intervention [13, 14]. Although long-term benefits should not be overlooked, more immediate information on the cost of the intervention is often more critical to decision makers who must decide about resource allocation. In addition, literature-derived data, however accurate, can be outdated or not specific to the project and are rarely used as a sole source for cost evaluation.

Third, while several studies have estimated actual intervention costs, few have performed sensitivity analyses associated with variations in the implementation setting, methods of recruitment, and delivery conditions and their associated labor inputs [15–17]. Such sensitivity analyses can be paramount in indicating possible cost variations that depend on factors such as intervention setting, target population, and technology adjustments.

Fourth, there are no clear standards for cost capture analysis procedures. Discrepancies in the methodology of cost analyses result in fragmented and often incomparable cost information [8]. Although cost analyses are frequently called for in literature reviews and lamented for their general absence from otherwise comprehensive outcome evaluations [18–20], there are still few good, practical, and broadly applicable examples of costing behavioral medicine interventions. There are several examples of behavioral intervention costs captured retrospectively [21]. Examples include the Diabetes Prevention Program Outcomes Study (DPPOS) [21, 22], an analysis of the Mediterranean Lifestyle Program performed by this research group, and an analysis of a hospital-based smoking intervention by Meenan et al. [23, 24]. The DPPOS cost analyses used a questionnaire that asked staff about time spent on direct clinical care or clinical case management, including all communication with participants and DPPOS staff; it covered a 12-month period selected to represent the period of the intervention occurring after recruitment but before close-out. Ritzwoller's and Meenan's analyses were based on data collected from project surveys, expense reports, and estimates of labor inputs. While retrospective cost capture may be a practical and low-burden method of capturing intervention costs, it may suffer from recall bias, and the lack of detail concerning types of costs may limit the scope of any associated sensitivity analyses.

Recall bias may arise when the quality of data pertaining to time allocation is significantly distorted by loss of detail or inaccurate recollection of events [25, 26]. Research suggests that accuracy of recall depends on the time interval between an event and its assessment; estimates suggest that up to 20% of critical details are forgotten within 1 year of the event [27]. These findings suggest that considerable inaccuracies in time collection reports can be induced or exacerbated by employing staff surveys like the one used in the DPPOS cost analyses [21, 22], which had a lag time of up to 12 months between the intervention events and data collection. Prospective or more real-time cost capture methods may reduce the time lag between the intervention and resource tracking.

In addition, strict reliance on budgeted records for estimating intervention costs can be misleading given the nature of research settings, creating significant discrepancies between budgeted and actual time allocations. The original project plans are also not always consistent with the events that actually transpire: exogenous factors may affect the timeline and resources associated with many aspects of the intervention. Since our main focus is to assess actual resource allocation, the prospective method may be the better approach to provide researchers with accurate insight into the intervention.

By reviewing various behavioral interventions and conducting numerous cost analyses, our team identified cost analysis techniques that proved to be informative, provided a high level of precision, and did not impose a significant burden on project resources or timelines. In this paper, we use data from a smoking reduction project to demonstrate feasible, practical methods of cost capture and basic cost analyses that are applicable to many areas of behavioral medicine.

This paper is organized as follows: first, we introduce the Smoking Less, Living More study that was used as a prototype for our cost assessment methods. Second, we present a five-step cost assessment guide designed for the evaluation of behavioral interventions. Third, we report our results pertaining to the Smoking Less, Living More study and the related sensitivity analyses.

Smoking Less, Living More Intervention

Our research group has recently developed a smoking reduction intervention that was designed to be broadly applicable and integrated into a large managed care organization [28]. The focus of the program is on behavioral approaches to reduce the number of cigarettes smoked and not on the use of alternative tobacco products [29]. A social-ecological theoretical approach, explained in detail elsewhere [28] and incorporating risk perceptions, self-efficacy, and environmental support, was used for intervention development. This project was designed as a "practical clinical trial" [30, 31] and used the RE-AIM framework [1, 32] to evaluate its potential for broader dissemination.

Study Description

The Smoking Less, Living More study was conducted at Kaiser Permanente Colorado (KPCO), targeting members who were identified as being 18 years of age or older, smoked ten or more cigarettes per day, were scheduled for an outpatient surgery or diagnostic procedure (e.g., mammography, sigmoidoscopy), and could read and write in English [28]. This study received KPCO Institutional Review Board (IRB) approval.

Potential participants were first identified using the KPCO electronic medical records system. Those who met the eligibility criteria were notified about the program by a personalized introductory letter, a descriptive flyer that provided additional details about the study, an informed consent form, a Health Insurance Portability and Accountability Act (HIPAA) statement, and an “opt-out” postcard. One to 2 weeks after receiving the letter, participants who did not decline via postcard were contacted by trained telephone interviewers on behalf of the KPCO Preventive Medicine Department. After smoking status, interest, and other eligibility criteria were confirmed, potential participants received a detailed description of the study [28]. A total of 320 subjects were enrolled: 164 were randomized to the intervention arm and 156 to the Usual Care (UC) arm.

The intervention was delivered across a 6-month time period and consisted of four telephone counseling sessions, four tailored newsletters, and one targeted newsletter [28]. The intervention components were sequenced and gradually faded over time with the majority of the components occurring in the first 3 months. Participants in the UC arm received generic health education mailings. Participants in both arms completed an assessment at baseline, 3 months, and 12 months. At the same time intervals, study participants were also asked to participate in biochemical assessments consisting of expired carbon monoxide readings and saliva cotinine assays, which were conducted in person with study staff.

We used telephone interviewers from AMC Cancer Research Center's computer-assisted telephone interviewing (CATI) unit. Interviewers were trained to recruit study participants, obtain verbal consent, complete the baseline assessment, and deliver the follow-up counseling call components of the intervention. Each counseling call was thoroughly scripted and programmed into the CATI system to anticipate any scenario that might occur with a study participant during the course of the call. The data collected by the CATI interviewers from the baseline assessment and counseling calls were used to tailor the newsletters received by participants over the course of the intervention.

Cost Analysis (Step-by-Step Guide)

Cost Analyses Objective

Cost analysis can provide researchers and policy makers with valuable information regarding the feasibility of a proposed intervention. Considering what information was most essential, as well as the challenges associated with cost assessment, our goal was to develop cost tracking and assessment techniques that would not be prohibitive given the scope of the Smoking Less, Living More study. The result is the five-step process below, which summarizes practical cost assessment techniques that should be applicable across a variety of behavioral interventions.

Step #1: Perspective of the Analyses

Before data collection and subsequent cost analyses, a key starting point is determining the perspective of the analyses: who are the decision makers that will be using this information? With respect to the adoption of many behavioral medicine interventions, the decision makers may include Medicare/Medicaid, a health plan, a community organization, or a public health entity. The specific perspective determines the types of cost that will be tracked throughout the cost assessment. In the context of future dissemination, intervention costs may include (but are not limited to) capital equipment, clinician or health educator time, prescription drugs, and the technology used to deliver the intervention (web-based, CD-ROM, interactive voice response (IVR), etc.).

Step #2: Identifying Intervention Components

Once the perspective is identified, the next task is to identify the essential intervention components required for replication. In determining what resources would be needed to implement or replicate the intervention, one can separate an intervention project into four categories: research, development, recruitment, and implementation or intervention. Care must be taken to accurately separate research and development costs from clinical replication or implementation costs, because only the latter would be required by others who might adopt the program [12, 23]. Research costs include labor and other inputs associated with grant administration, acquisition of Institutional Review Board approval, time spent administering and analyzing assessments, manuscript preparation, and physiologic testing that is not part of the intervention. Development costs include resources directed toward the overall development of protocols, assessments, website design, telephone script production, and other materials such as newsletters or handouts. We differentiated development components that do not have to be recreated in future settings from components that require future adjustment. The individually tailored newsletters and message libraries used in this study are prime examples of variable development factors whose content must be adjusted, and whose costs must be re-estimated, with each new implementation of the intervention.

Specific implementation or intervention costs may be further separated into two categories: recruitment costs versus all others. We focus on issues related to the capture and analysis of recruitment costs because they are often one of the largest and most variable types of cost, and the category most affected by research-related requirements such as subject identification and obtaining consent. Unfortunately, reports of intervention costs have often ignored the costs of recruitment [22]. The challenge is to include the recruitment costs that would be required in replication (e.g., promotions) while excluding costs, such as obtaining informed consent, that would not be required outside research applications. The components of the implementation cost capture include labor or personnel resources associated with staff recruitment and training and with participant recruitment, counseling, and monitoring. We also include all supply, printing, and mailing costs, along with the costs of collecting and manipulating the participant data needed for the tailoring component.
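To make this four-way separation concrete, the sketch below encodes it as a simple classification scheme in Python. This is a minimal illustration rather than part of the study protocol; the task lists are abridged examples drawn from the text, and the function name is ours.

# A minimal sketch of the four-way cost taxonomy described above.
# Category names follow the text; the task assignments are abridged
# examples, not an exhaustive coding scheme.
COST_CATEGORIES = {
    "research": [
        "grant administration",
        "IRB approval",
        "administering and analyzing assessments",
        "manuscript preparation",
    ],
    "development": [
        "protocol development",
        "telephone script production",
        "newsletter and message library design",  # re-adjusted per setting
    ],
    "recruitment": [
        "participant identification",
        "promotional mailings",
        "enrollment/eligibility calls",
    ],
    "intervention": [
        "staff training and supervision",
        "counseling calls",
        "tailored newsletter printing and mailing",
    ],
}

def replication_categories() -> dict:
    # Only recruitment and intervention costs would be borne by an
    # adopting organization; research and development are excluded.
    return {phase: tasks for phase, tasks in COST_CATEGORIES.items()
            if phase in ("recruitment", "intervention")}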

Step #3: Practical Methods for Capturing Intervention Costs

Data used for estimating intervention-related costs are usually captured either retrospectively or prospectively. As previously mentioned, the retrospective approach may suffer from lack of detail and inaccurate recollection. A more real-time, prospective method of capturing intervention-related resources may improve the accuracy and precision of cost assessments and allow for a broader array of sensitivity analyses. In the study described here and in six other ongoing studies, we collect data prospectively using Excel-based templates (see Table 1) that are completed and returned on a monthly basis. These templates provide a framework for capturing the costs associated with the various intervention components (project staff, materials, IT, etc.) without adding significant strain on project resources or burden on the individuals staffing the project. Individual cost capture templates are designed to record labor and material resource use, and ultimately cost, associated with the research, development, recruitment, and implementation phases of the project.
Table 1

Example of a cost capture template for the Smoking Less, Living More study project manager (hours per month)

Project manager related activities                   Sep 05  Oct 05  Nov 05  Dec 05
Meetings
  Entire team                                             -       -       -       -
  Recruitment                                             -       -       -       -
  Protocol related                                        5       -       -       -
  Conference calls                                        3       -       -       -
  Meeting preparation                                     -       6       8       9
Personnel
  Human resources (hiring staff)                         12       -       -       -
  Orientation and training for staff                      -      12       8      12
Grant administration
  Budget                                                 12      15      15      15
  IRB preparation                                        40       -       -       -
Project related
  Recruitment scripts and materials                       -       -       -       -
  Baseline assessments                                    -       6       -       -
  Travel to patient sample collections                    -       4       3       5
  Mailing and/or administering follow-up assessments      -       2      15      14
  Development of procedures for study tasks               -      21      10      12
  Monitoring                                              -       -       -       -
  Sample collection (prep and administration)             -       2       -       -
Miscellaneous
  E-mail correspondence                                   -       -       -       -
Data management
  Data collection                                         -       5       3       5
  Database development                                    4       -       -       -
  Data entry                                              -       -       -       -
  Data pulls                                              -       -       -       -
Other
  Travel                                                  -       -       -       -
  Off-site commute                                        -       9      12       7
  Scientific conferences                                  -       -       -       -
  Mailing: stuffing envelopes                             -       -       -       -
Total                                                    64      82      74      79

One of the major cost components in every phase of most projects is labor input. To determine the labor resources used, we evaluate the job descriptions of the staff members and then customize individual templates with categories reflective of each person's responsibilities. We instruct staff to edit or alter these templates to best capture the tasks in which each individual is involved. As the project proceeds, the tasks that an individual is involved with may change (e.g., moving from recruitment to intervention), and the categories noted in that individual's template change as well. For consistency in results and interpretation, we ask that tasks be captured in hours or full-time equivalents spent by each individual contributor. Table 1 shows the basic template employed by the project manager of the Smoking Less, Living More study; for example, 40 h was spent preparing IRB documents in the first month of the study. Many of the rows or categories were created by the project manager after the study began and reflect the day-to-day tasks associated with that particular position.

In addition to the labor component, templates can also capture project expenses such as supplies, printed materials, and project equipment. To minimize recall bias and issues related to missing data, we send reminder emails asking staff to record and submit their time captures on a monthly basis. Optimally, time capture begins at the earliest stage of the project and ends at the completion of the intervention. Data collected from the project described here suggest that staff spent approximately 5–10 min per month recording the labor resources and materials used.
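A monthly capture template like the one in Table 1 can be generated programmatically; below is a minimal sketch. The category list is abridged from Table 1, and the output file name is hypothetical; in practice, staff edit the rows to reflect their actual responsibilities.

import csv

# Abridged activity categories from the project manager template (Table 1).
CATEGORIES = [
    "Meetings: protocol related",
    "Meetings: conference calls",
    "Personnel: orientation and training",
    "Grant administration: budget",
    "Grant administration: IRB preparation",
    "Project related: baseline assessments",
    "Data management: data collection",
    "Other: off-site commute",
]
MONTHS = ["Sep 05", "Oct 05", "Nov 05", "Dec 05"]

# Write an empty template; staff record hours per month and may add rows
# to capture day-to-day tasks specific to their position.
with open("cost_capture_project_manager.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["Activity (h/month)"] + MONTHS)
    for category in CATEGORIES:
        writer.writerow([category] + [""] * len(MONTHS))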

Step #4: Data Analyses

We do not place the burden on staff to explicitly note whether the tasks they are involved with are "research" versus "development" versus "intervention." Instead, we aggregate the data within and across the individual cost capture templates based on the tasks and the timeline of the project. Knowing the details of the project allows us, in collaboration with the project leaders, to allocate the resources and costs to the various components of the project. Prospective data collection, as well as the detailed categories listed by the project staff, enabled us to differentiate intervention cost components from research, development, and recruitment costs. The key question we used in determining whether a resource was part of the intervention cost component was "would this resource be needed to deliver the intervention in practice?" For instance, the project manager's time capture report for the Smoking Less, Living More study included the following categories: travel to patient sample collections, mailing and/or administering follow-up assessments, and sample collection (prep and administration). Since all of these tasks were research related and would not be carried out in a future adaptation of the study, we did not include these categories in the overall intervention cost of the project. On the other hand, categories such as orientation and training of new intervention (not research) personnel, mailing of project materials to participants, and resolving staffing problems were added to the overall intervention cost.
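The allocation step reduces to tagging template rows by project phase and summing the rows that survive the replication question. The sketch below illustrates this; the rows, hours, and phase tags are illustrative, and the tags are assigned by the analysts rather than recorded by staff.

# A minimal sketch of the aggregation step. Each template row is tagged
# by project phase (in collaboration with project leaders); only rows
# answering "yes" to "would this resource be needed to deliver the
# intervention in practice?" count toward intervention costs.
template_rows = [  # hours are illustrative, not the study's actual data
    {"task": "travel to patient sample collections", "hours": 12, "phase": "research"},
    {"task": "IRB preparation",                      "hours": 40, "phase": "research"},
    {"task": "orientation and training of staff",    "hours": 32, "phase": "intervention"},
    {"task": "mailing project materials",            "hours": 17, "phase": "intervention"},
]

intervention_hours = sum(row["hours"] for row in template_rows
                         if row["phase"] == "intervention")
print(f"Intervention labor: {intervention_hours} h")  # 49 h in this toy example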

After intervention-specific resources are differentiated from research and development, total intervention costs and cost per intervention participant can be calculated. Industry- or occupation-specific wage and benefit data can provide solid estimates of national average labor costs, and multi-site projects may need to employ national average wage rates for each job classification to minimize the impact of geographical wage variation on site-specific intervention cost estimates. These wage rates may be obtained from the Bureau of Labor Statistics [33]. In this analysis, we used the actual salary and benefit data associated with the intervention staff because we felt it was important to understand the actual cost of the intervention as deployed at KPCO; these data would also be one of the most important metrics that health plan decision makers would use to decide whether to disseminate the intervention to the full KPCO membership. For the sensitivity analysis described below, we employed national average wage rates.
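Converting captured hours into labor costs is then a matter of applying loaded wage rates. A minimal sketch, with hypothetical wages and an assumed benefits rate (not the actual KPCO salary data):

# Hypothetical loaded-wage conversion; replace HOURLY_WAGE with actual
# salaries or Bureau of Labor Statistics national averages as needed.
HOURLY_WAGE = {"project manager": 38.00, "telephone interviewer": 16.50}
BENEFITS_RATE = 0.30  # fringe benefits as a fraction of wages (assumed)

def labor_cost(hours_by_role: dict) -> float:
    # Sum hours x wage x (1 + benefits rate) across roles.
    return sum(hours * HOURLY_WAGE[role] * (1 + BENEFITS_RATE)
               for role, hours in hours_by_role.items())

print(f"${labor_cost({'project manager': 49, 'telephone interviewer': 120}):,.2f}")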

The prospective cost capture methods described above also allow us to differentiate the various types of cost that can be associated with behavioral interventions. In this example, we separated the recruitment and intervention costs into fixed and variable costs. Fixed costs are constant regardless of the "level of production." In the Smoking Less, Living More project, fixed or overhead costs included resources not directly associated with the number of participants involved or with direct interaction with participants. For example, as long as additional counselors were not hired, the training costs for telephone counselors did not vary with the number of calls each counselor provided; likewise, costs associated with project meetings, email communication, etc., were required for the success of the intervention but did not vary with the number of participants. Variable costs are also incurred in delivering the intervention but vary in magnitude with the level of output or number of participants. In this study, costs for phone calls, newsletters, and mailings increased as the number of participants increased.
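This decomposition can be written compactly. With F the fixed costs, v the variable cost per participant, and n the number of intervention participants,

    C(n) = F + v * n                  (total cost)
    c(n) = C(n) / n = F / n + v       (cost per participant)

Only the F/n term declines as n grows, which is the economies-of-scale effect examined under Step #5.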

Step #5: Sensitivity Analyses

To address uncertainty in the adoption or dissemination of effective behavioral interventions, sensitivity analyses can be used to estimate the range of intervention costs in a variety of settings and circumstances. Sensitivity analyses can include re-estimating intervention costs by varying the labor inputs (physician versus nurse, or psychologist versus health educator), market wages and benefits (New York City versus Denver), the setting of the intervention (community provider group versus health maintenance organization), the purchase versus production of materials (e.g., manuals, CDs, websites), the size and characteristics of the target population (adults versus adolescents), and the effect size or outcome (quit rate, reduction rate). One can also perform simulations based on extrapolations from other published health-care-specific and non-health-care-specific resource costs [24].
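The simplest of these is a one-way analysis over labor inputs and market wages; the sketch below shows its structure. The roles, counseling hours, and loaded wage rates are hypothetical placeholders.

# A sketch of a one-way sensitivity analysis over delivery staff and wages.
# BASE_HOURS and the wage table are assumptions for illustration only.
BASE_HOURS = 500  # total counseling hours needed to deliver the program

loaded_wages = {  # assumed loaded (wage plus benefits) hourly rates
    "health educator": 25.0,
    "nurse": 38.0,
    "psychologist": 60.0,
}

for role, wage in loaded_wages.items():
    for adj in (0.9, 1.0, 1.1):  # vary wages by +/-10% for market variation
        cost = BASE_HOURS * wage * adj
        print(f"{role:>15} ({adj - 1:+.0%}): ${cost:,.0f}")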

In the Smoking Less, Living More study, our sensitivity analyses included five alternatives capturing variations in intervention inputs, along with variation in the size of the target population. Our primary focus was on the costs associated with training and supervision of the phone counselors, the phone counseling procedures, the tailoring of mailed materials, and the program's upkeep operations. These parameters were chosen for sensitivity testing because of their potential variability. Each alternative scenario comprised fixed and variable costs, and the resulting cost estimates were tested against the capacity and constraints of the project. Specifically, could the intervention be provided to a larger number of participants without increasing (and possibly while decreasing) the cost per intervention participant and the marginal or incremental cost per incremental improvement in outcome? In economic terms, given the resources used for the intervention, did we achieve the lowest marginal cost?

Smoking Reduction Results

Estimated Costs from the Smoking Reduction Project

Using cost capture templates consistent with Table 1, we estimated the various components of the intervention costs associated with the first 3 months of the smoking reduction intervention. Employing the perspective of an organization, such as a health plan, that may consider implementing a smoking reduction program, we estimated total intervention costs relative to the usual care condition and the incremental intervention cost per patient. Resources and costs associated with the research and development portions of the project were identified and separated from the overall intervention costs. While not reported here, research- and development-related information may be useful in answering specific questions that are outside the scope of this paper; for example, the resources and costs captured during the course of the project could be used to estimate the cost of translating the program into Spanish.

Table 2 describes the distribution of costs across the intervention components. Total variable and fixed costs (recruitment plus the first three intensive months of the intervention) were estimated at $112,848. Excluding the recruitment costs, total intervention costs were estimated at $68,642. Fixed costs (which did not vary by the number of participants) for both recruitment and the intervention were estimated to be almost twice as high as the variable costs. The largest recruitment cost component was attributed to overhead, which included activities associated with the day-to-day operation of the project (e.g., unscheduled meetings, staff communication). The largest intervention cost component was associated with interviewer training and supervision. UC costs were estimated at $3,900 over the initial 3-month period. This translates to a cost of $688 ($419 excluding recruitment cost) per intervention participant (see Fig. 1).
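The per-participant figures follow directly from Table 2 and the 164 participants randomized to the intervention arm:

    $112,848 / 164 participants ≈ $688 (recruitment plus 3-month intervention)
    $68,642 / 164 participants ≈ $419 (3-month intervention only)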
Fig. 1

Sensitivity analyses using alternative intervention resources

Table 2

Intervention and recruitment cost components

Cost element                                   Variable ($)  Fixed ($)  Total ($)
Recruitment
  Project staff
    Mailings                                          1,908          -      1,908
    E-mail                                            3,990          -      3,990
    Overhead*                                             -     24,912     24,912
    Subject identification                                -      1,470      1,470
  Telephone interviewers
    Training                                              -      3,046      3,046
    Enrollment/eligibility calls                      8,104          -      8,104
    Supplies                                            776          -        776
  Total recruitment                                  14,778     29,428     44,206
Intervention components
  Tailored newsletters                               10,102          -     10,102
  Interviewers training and supervision                   -     23,865     23,865
  Phone counseling/data management                   11,872          -     11,872
  Project meetings and email                              -      5,667      5,667
  Equipment and materials                                 -      2,890      2,890
  Personnel management                                    -      9,643      9,643
  Overhead*                                               -      4,603      4,603
  3-Month intervention total                         21,974     46,668     68,642
Total recruitment plus 3-month intervention                               112,848

* Overhead includes office tasks such as printing, copying, unscheduled staff meetings, phone conversations, intervention preparation time, commuting to the intervention site where calls are made and newsletters are produced, etc.

Sensitivity Analyses

Using published data as well as market price listings, we estimated the potential intervention or implementation costs under alternative scenarios with hypothetical variations in intervention resources. The goal of this simulation was to evaluate the potential cost offset from reducing the training costs associated with phone interviewer coaching. Specifically, we substituted trained health educators for the untrained telephone interviewers, used published cost data associated with multi-state Quitlines [31], and obtained cost estimates for replacing the phone counselors with interactive voice response. Because of the wide price range of Quitline counseling calls across states, varying from $6 to $342 per phone call as reported by Keller et al. [34], we used the median cost of $92.50 per call. IVR information was obtained through online vendor channels, and the average market price reported across 11 different vendors was estimated at $18,750, including installation and operation costs. We also estimated the costs associated with phone counseling sessions conducted by professional health educators using market wage rates. Because of the variation that exists in labor markets across the country, we also varied the estimated costs by 10% in both directions to demonstrate a potential range of total intervention costs for each alternative intervention scenario.
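The structure of this scenario comparison is straightforward; the sketch below shows it using the unit costs quoted above. The call volume and the residual (non-counseling) intervention costs are hypothetical placeholders, so the printed totals are not the figures reported in Fig. 1.

# A sketch of the alternative-resource comparison. Unit costs for Quitline
# calls ($92.50 median; range $6-$342 [34]) and IVR ($18,750 average vendor
# quote) come from the text; N_CALLS and OTHER_COSTS are assumptions.
N_CALLS = 656         # assumed: 164 participants x 4 counseling calls
OTHER_COSTS = 20_000  # assumed: newsletters, mailings, management, etc.

scenarios = {
    "Quitline (median, $92.50/call)":  N_CALLS * 92.50 + OTHER_COSTS,
    "Quitline (low, $6/call)":         N_CALLS * 6.00 + OTHER_COSTS,
    "Quitline (high, $342/call)":      N_CALLS * 342.00 + OTHER_COSTS,
    "IVR (installation + operation)":  18_750 + OTHER_COSTS,
}
for name, total in scenarios.items():
    print(f"{name}: ${total:,.0f}")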

During the recruitment and enrollment phase of the project, significant attention was placed on IRB- and HIPAA-related information and consent requirements. These requirements may have had a limiting effect on the number of subjects who were recruited and enrolled in the study. Also, reports from the intervention staff suggested that the study was not operating at capacity given the availability of the telephone interviewers. To address issues related to economies of scale, we re-estimated incremental per-participant intervention costs. In this simulation, we exploited potential economies of scale: we assumed that variable costs were evenly distributed across participants and increased or decreased linearly with the number of intervention participants, and that fixed costs would remain constant up to a doubling of the number of intervention participants.

Figure 1 presents aggregate cost summaries for the alternative scenarios comprising hypothetical input variations. Total intervention costs were estimated to be highest for the Quitline alternative, at $80,786, when we employed the median reported cost. However, given the range of cost estimates reported by Keller et al. [34], we estimated that this cost could vary from $4,723 ($6 per call) to $280,112 ($342 per call). The lowest estimate was derived from the IVR scenario, in which all telephone-based interviewers are replaced by a preprogrammed IVR system. The analysis in which we substituted trained health educators for the CATI-assisted, untrained interviewers resulted in an estimated total intervention cost of $56,208; this scenario assumes that health educators would require minimal training and supervision.

Figure 2 presents the results of the sensitivity analyses in which we either halved or doubled the number of participants. We estimated that the variable cost per participant would stay constant at $134 ($224 for intervention plus recruitment), while the fixed cost per participant would vary inversely with the number of subjects participating in the intervention. If the number of participants were halved, the total intervention cost per participant (variable plus fixed) would increase to $703; if the number were doubled, the cost would decline to $276. Similar changes were noted when we used intervention cost estimates derived from the sensitivity analyses in which health educators were substituted for the telephone interviewers. If the number of participants were doubled and health educators, requiring less supervision and training, were employed, the cost per participant could hypothetically be reduced to $255.
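Under the stated assumptions (variable costs linear in the number of participants, fixed costs constant up to a doubling), these scale results can be reproduced from the Table 2 intervention figures. A minimal sketch:

# Reproducing the scale analysis from the fixed and variable intervention
# costs reported in Table 2 (recruitment excluded).
FIXED_COST = 46_668     # 3-month intervention fixed costs ($)
VARIABLE_COST = 21_974  # 3-month intervention variable costs ($)
N_ENROLLED = 164        # participants randomized to the intervention arm

variable_per_participant = VARIABLE_COST / N_ENROLLED  # ~$134, constant

for scale in (0.5, 1.0, 2.0):  # halve, keep, or double enrollment
    n = N_ENROLLED * scale
    # Fixed costs are assumed constant over this range, so the fixed share
    # per participant varies inversely with n.
    per_participant = FIXED_COST / n + variable_per_participant
    print(f"n = {n:>3.0f}: ${per_participant:,.0f} per participant")
# Prints approximately $703, $419, and $276, matching the reported range.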
Fig. 2

Scale analyses employing varying numbers of participants

Discussion

For many behavioral interventions, including the smoking reduction study described here, the limited duration of post-intervention follow-up often precludes the observation of cost offsets associated with reductions in health care resource use (and costs) attributable to the intervention. For tobacco control interventions, medical care cost savings attributable to cessation or reduction are not estimated to occur for upwards of 5 years [35, 36]. Similar issues may hold for changes in biological or physiological measures such as glycosylated hemoglobin (HbA1c), body mass index (BMI), or blood pressure. In addition, care must be taken when estimating medical care cost changes attributable to the intervention. Will the improvement in a biological or physiological outcome reduce all utilization and costs, or will the reduction be condition specific? Care must also be taken to capture potential "spillover effects" into other diagnoses or conditions. Is there a substitution from high-priced hospital or specialty care to novel, and perhaps costly, pharmaceutical therapies? These types of analyses are not trivial and usually require relatively complex econometric modeling to adjust for seasonality and variation in comorbidities and health status, along with issues related to nonnormal distributions of the dependent variable. Because of these issues, we have excluded them from this paper, which focuses simply on the cost of the intervention and estimates of costs under different implementation scenarios.

While the intervention cost analysis described here was developed as part of the original project plan, the resources required and the level of personnel time employed to collect the cost data were minimal. This type of intervention cost analysis may also be feasible for most behavioral interventions. While use of the prospective cost capture templates provided more detail, which enhanced the flexibility of the sensitivity analyses, retrospective cost analyses also provide most of the key cost estimates necessary for decision makers.

Recruitment costs are an important area in which cost differentiation is needed. For any intervention implemented or disseminated in a "real-world" setting, resources may be required for participant identification, marketing and outreach, and/or targeted recruitment. In this study, we made a significant effort to separate out research costs associated with informed consent and related requirements. However, we suspect that the IRB and HIPAA requirements associated with this project may have decreased the number of subjects who agreed to participate and therefore increased the estimated recruitment costs.

Low mailing costs and the relatively inexpensive technique of participant identification stood out as the most cost-efficient intervention methods. The sensitivity analyses highlighted some areas for improvement as well as several potentially cost-efficient intervention techniques. For delivering phone calls, partial or complete substitution of untrained interviewers with trained health educators could significantly reduce training and supervision costs. While the use of IVR was estimated to be a cost-saving technique, little is known about how this mode of intervention delivery would affect participation rates. The scale analyses strongly suggested that the project was not operating at capacity. This finding is important with respect to dissemination and implementation: if a larger population were recruited, for example all smokers rather than just those undergoing an outpatient procedure, the cost per participant might decline significantly. While little published research exists regarding the use of nicotine replacement therapy (NRT) for smoking reduction, potential intervention cost savings may also be obtained through the use of pharmacotherapies, including NRT. The primary potential limitation of the methods described is that some of the more complex sensitivity analyses may require resources not available to some projects.

This study was limited to intervention cost analyses of behavioral interventions like the smoking reduction study described here, rather than a full cost-effectiveness analysis that includes changes in outcome measures attributable to the intervention. In deciding whether to implement a behavioral intervention, a health care organization may need to consider not only the costs of the program but also the relative health benefits. The cost per quality-adjusted life year saved (QALYS) is the standard outcome metric recommended by the PCHM [12]. QALYS combine estimates of both the change in life years and the quality of life gained attributable to the intervention; they provide a framework for valuing health gains, can be used for priority setting, allow comparison of the effectiveness of different interventions for the same problem, and allow comparisons across disease areas to help show which programs provide the greatest allocative efficiency. However, for prevention and behavioral medicine interventions whose impact on health outcomes may not occur for many years, QALYS are often difficult to estimate with precision. Also, for decision makers such as health plan managers or community health center administrators, QALYS may lack intuitive interpretation and may not translate into the direct dollars it will cost to implement a particular intervention that has been demonstrated to produce a certain outcome or change in behavior (e.g., smoking cessation, weight loss).

Future research on economic analyses of behavioral interventions should be directed toward understanding the resources needed to implement an intervention, in a variety of settings, given the expected outcome. Examples include estimates and sensitivity analyses associated with intermediate outcomes, such as cost per person who reduced or quit smoking or cost per unit change in BMI, blood pressure, lipids, HbA1c, etc. Given the widespread use of Healthcare Effectiveness Data and Information Set measures and pay-for-performance metrics by employer benefit managers, health plans, and the Centers for Medicare and Medicaid Services [37], physicians' groups and other health care providers may be able to leverage estimates of "incremental cost per incremental improvement in intermediate outcome" to justify deployment of these types of behavioral interventions.

Conclusions

This paper describes and illustrates the use of intervention costing methods for behavioral medicine, using a recently reported smoking reduction intervention as an example. Most of the methods presented should be broadly applicable across a wide range of behavioral medicine interventions and provide important information not often available to decision makers. Key issues discussed include the perspective of the cost analyses, the methodology of the cost assessment, the differentiation of research and development costs from recruitment and implementation costs, and sensitivity analyses. Behavioral medicine researchers are encouraged to make greater use of techniques such as those described in this paper to provide information needed by both policy makers and those considering adoption of interventions in different settings.

Acknowledgment

Funding was provided by the National Cancer Institute, grant #R01 CA 90974-01.

Copyright information

© The Society of Behavioral Medicine 2009