Background

Shared decision-making (SDM) is a process in which providers and patients exchange information, deliberate about available options together, identify the patient’s preferences, and incorporate those preferences in choosing the option or treatment plan [1]. In contraceptive care, SDM has the potential to improve the quality of patient-provider communication, promote patient autonomy, and enhance health and well-being by supporting the patient to choose the contraception option that matches their informed preferences.

SDM is also a model for operationalizing World Health Organization recommendations to offer “evidence-based, comprehensive information, education and counselling to ensure informed choice,” so that “every individual is ensured an opportunity for their own use of modern contraception…without discrimination” [2].

Patients who reported that “the provider and me together” decided what contraceptive method they would use were more satisfied with the decision-making process than those who reported other roles in decision-making [3]. Other research has found that SDM in contraceptive care is suited to the intimacy and complexity of this particular decision [4]. However, despite the suggested benefits of SDM interventions and increasing calls from policy makers to use SDM as a strategy to promote patient-centered care [5,6,7], SDM has not been widely implemented in contraceptive care or in other settings.

Implementation researchers have attempted to systematically identify the behaviors that may influence SDM adoption at the individual, organizational, and policy levels [7,8,9,10,11,12,13,14]. For instance, in an evaluation of the implementation of cancer screening SDM interventions in 12 California primary care practices, the clinician’s role was observed to be the most important factor for implementation, in combination with supportive infrastructure and the practice’s dedication to the goal of SDM [15]. However, in the absence of supportive infrastructure (such as an electronic system for automatically mailing decision aids to eligible patients), implementation may be more challenging and require behavioral interventions that encourage adoption at the individual or team level [16,17,18].

The aim of this qualitative study was to explore the feasibility and acceptability of two interventions for facilitating SDM about contraceptive methods with a particular focus on factors that influenced their implementation by clinical and administrative staff. The study was embedded within a 2 × 2 factorial cluster randomized controlled trial (RCT), the Right For Me study, and conducted in 16 primary care and reproductive healthcare clinics in the Northeast United States [19].

Methods

Design

This qualitative study involved semi-structured, one-on-one telephone interviews with staff at clinics involved in the Right For Me trial (ClinicalTrials.gov Identifier: NCT02759939). Methods for the trial are published elsewhere [19]. This study was approved by the Dartmouth College Committee for the Protection of Human Subjects (STUDY00029945).

Theoretical framework

This qualitative study was informed by the Theoretical Domains Framework (TDF), an integrative framework based on psychological theories and key theoretical constructs related to behavior change [20, 21]. In operationalizing the TDF for this research, the study team used the framework in the design, data collection, and analysis of the qualitative investigation of the implementation, acceptability, feasibility, and sustainability of the Right For Me interventions.

The TDF consists of 14 domains describing processes underlying successful behavior change. These domains map onto the capability, opportunity, motivation-behavior (COM-B) implementation model developed by Michie and colleagues [22]. In this “behavior system,” capability, opportunity, and motivation interact to generate behavior, which in turn feeds back and influences those system components; performing a behavior requires “the skills necessary to perform the behaviour, a strong intention to perform the behaviour, and no environmental constraints that make it impossible to perform the behaviour” [22].

Interventions

The two interventions comprised a patient-targeted intervention (video + prompt card) and a provider-targeted intervention (decision aids + training) in English and Spanish. These interventions are described in detail elsewhere [19] and summarized in Table 1.

Table 1 Right For Me shared decision-making (SDM) interventions

Implementation context and strategy

The 16 participating clinics were located in the northeastern United States, provided contraceptive counseling, and included Planned Parenthood-affiliated clinics. Four were randomly allocated to the control arm (usual care), four to the patient-targeted intervention, four to the provider-targeted intervention, and four to both interventions. The trial was conducted during a 9-month period, with interventions implemented during the final 6 months.
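To make the 2 × 2 factorial structure concrete, the sketch below shows one way a balanced allocation of 16 clinics across the four arms could be generated. It is a minimal illustration only: the clinic identifiers are hypothetical, a simple unstratified shuffle is assumed, and the trial’s actual randomization procedure is described in the protocol paper [19].

```python
import random

# Hypothetical clinic identifiers; the trial's 16 clinics are anonymized here.
clinics = [f"Clinic {i}" for i in range(1, 17)]

# The four arms of the 2 x 2 factorial design.
arms = [
    "usual care (control)",
    "video + prompt card (patient-targeted)",
    "decision aids + training (provider-targeted)",
    "both interventions",
]

random.seed(42)          # fixed seed so the illustration is reproducible
random.shuffle(clinics)  # simple random allocation (no stratification shown)

# Assign four clinics to each arm.
allocation = {arm: clinics[i * 4:(i + 1) * 4] for i, arm in enumerate(arms)}

for arm, assigned in allocation.items():
    print(f"{arm}: {', '.join(assigned)}")
```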

Each clinic had an identified contact whose role as a senior staff member was to liaise with the research team and facilitate implementation of the interventions in their clinic. At the outset of the trial, a member of the research team (KD) visited each clinic and provided a group orientation on the trial objectives, scope, and data collection procedures. After clinics were randomized to trial arms, those clinics assigned to deliver one or more interventions received a similar, follow-up group orientation about the intervention(s). This orientation was facilitated by a presentation slide deck (see Additional file 1) and included instruction on intervention objectives, target audience, supplies provided, parties responsible for implementation-related activities, and intervention maintenance. Rather than providing strong direction on how to integrate interventions into the clinic workflow, this orientation provided examples of possible approaches and encouraged each clinic to collaboratively develop their own implementation strategy, considering the clinic workflow and other routinely used patient or counseling materials. The slide deck remained accessible to clinics via the trial website.

Interview participants

Participants included clinical and administrative (e.g., front desk) staff aged 18 years or older who worked in one of the 12 intervention arm clinics, had email access, and consented to being audio-recorded. Depending on the clinic, physicians, physician assistants, nurse practitioners, and/or healthcare associates provided contraceptive counseling. KD developed a sampling frame identifying each study clinic; the names, demographics, and roles of staff in each clinic; and the project contact. The 12 clinics assigned to an intervention arm had approximately 70 staff potentially involved in implementation.

Each project contact approached their colleagues directly and via posters in the clinic spaces and asked permission to share their email addresses with the research staff. Interested staff were emailed the study information and invited to participate in one telephone interview. Recruitment and interviews were conducted by a qualitative researcher (SM), who sampled participants first based on their role in the clinic (clinical or administrative) by referencing the sampling frame, and then purposively to seek variation in age, gender, race, profession, and years of experience. We also sought extreme case examples of clinic staff who reported having no memory of seeing or using the interventions despite having a clinic role in which they would have been expected to use them as part of their work. We did so by returning to the sampling frame and identifying participants who had either left their position or joined the clinic during the study period and were therefore likely to be less familiar with the interventions.

Study materials

The interview guide was based on the TDF domains, constructs, and definitions provided by Michie et al. [23] and Cane et al. [20]. Development consisted of three stages: (1) drafting a preliminary list of open-ended questions that explored each of the 14 TDF domains and a 7-item demographic questionnaire, (2) adding probes that explored domain constructs, and (3) pilot testing with three health professionals from the Right For Me research team for comprehensibility and relevance. At each development stage, additional feedback was sought from the research team.

Data collection

Telephone interviews were conducted by SM from January to April 2017, immediately after the intervention implementation period. SM was not involved in the design or data collection of the trial or in the development of the interventions undergoing evaluation and had no relationships with the staff invited to participate. Participants were compensated with a $30 Amazon gift card. Data collection continued until (a) all participants in the sampling frame had an opportunity to respond to the request for an interview, (b) the sample demonstrated sufficient variation, and (c) data saturation was achieved (the interviews did not generate new insights regarding implementation factors). Interviews were audio-recorded and transcribed by a professional transcription service. Each was assigned a numeric identifier (e.g., “001”) and minor edits were made to remove potentially identifying information about staff and their clinics.

Data analysis

Thematic analysis principles [24] were used to guide data analysis, which sought to identify the domains most relevant to the implementation, feasibility, acceptability, and sustainability of the interventions in practice. SM led the analysis with support from two public health researchers (KD, RM) and a nurse-researcher (DA). Each researcher first reviewed one quarter of the transcripts while listening to the audio recordings to become immersed in the data, check the transcription for accuracy, and remove names and other potential identifiers. Verification strategies were pursued throughout to ensure validity and avoid subjective bias, including involving multiple researchers in codebook development and analysis and using constant comparison. We kept memos throughout to facilitate concurrent data collection and analysis, to maintain a data trail, and to document our interpretive choices.

Codebook development

Codebook development at the TDF domain level

At the outset of the study, concurrent with the development of the interview guide, we developed a codebook consisting of the a priori TDF domains and theoretical constructs [20]. We first coded a random sample of transcripts with the codebook and met to discuss our interpretations. Through discussion, we distilled the list of 84 constructs to the 39 that were most relevant in influencing clinic staff behavior. A construct was deemed relevant if (a) it appeared frequently, (b) participants demonstrated conflicting attitudes and beliefs about it, and/or (c) it was associated with strong attitudes and beliefs [21].

Operationalizing context-specific descriptions

To provide consistency in our deductive coding and interpretation, we operationalized the TDF constructs into context-specific descriptions [21]. These descriptions resulted from inductive analysis of the transcripts and were iteratively refined until reliability was achieved between the coders. Our final codebook consisted of both the TDF domains (themes) and context-specific descriptions (sub-themes).
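As a purely illustrative aid, the fragment below sketches how such a two-level codebook (TDF domains as themes and context-specific descriptions as sub-themes) could be represented during deductive coding. The domain names are drawn from the TDF, but the descriptions and the coded segment are hypothetical examples, not items from the study’s actual codebook.

```python
# A minimal, hypothetical representation of a two-level TDF-based codebook:
# each TDF domain (theme) maps to context-specific descriptions (sub-themes)
# that coders apply deductively to transcript segments.
codebook = {
    "Knowledge": [
        "Awareness of the interventions and their purpose",
        "Knowledge of the procedure for using the interventions",
    ],
    "Environmental Context and Resources": [
        "Clinic workflow and physical space support use of the interventions",
        "Competing demands or staff turnover limit use of the interventions",
    ],
}

def code_segment(segment: str, theme: str, sub_theme: str) -> dict:
    """Attach a TDF domain (theme) and a context-specific description
    (sub-theme) from the codebook to a transcript segment."""
    if sub_theme not in codebook.get(theme, []):
        raise ValueError("sub-theme must belong to the chosen theme")
    return {"segment": segment, "theme": theme, "sub_theme": sub_theme}

example = code_segment(
    "I didn't notice the prompt cards until partway through the study.",
    "Knowledge",
    "Awareness of the interventions and their purpose",
)
print(example)
```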

Coding the transcripts

The interviewer (SM) coded all transcripts using the finalized codebook, facilitated by Atlas.ti qualitative analysis software (version 1.6.0 for Mac). RM, KD, and DA each then received one-third of the coded transcripts, which they independently coded in duplicate to determine whether the TDF domains and constructs were interpreted consistently and to suggest additional codes. We compared coding and resolved disagreements through iterative discussion. Discrepancies in coding were generally due to different interpretations of constructs or lapses in attention. We met as a team through regular teleconferences to discuss, compare, synthesize, and map relationships between findings; compare our findings to our theoretical framework [20, 23]; and generate interpretive insights about the data. We discussed the data collection and analysis and sought feedback on results in progress with the larger research team via our recurring monthly teleconference, one-on-one phone calls, and a face-to-face workshop with patient and stakeholder partners.

Results

Overview

Interviews were conducted with 29 clinic staff from 11 of the 12 intervention clinics (see Table 2). All participating clinic staff identified as female and were predominantly White, reflecting the demographics of the region and the predominantly female contraceptive care workforce. The majority had a clinical role (n = 16, 55%) or both a clinical and administrative role (n = 4, 14%) that involved providing contraceptive counseling to patients. Interviews were 20 to 40 min in length. Staff invited to participate from Clinic 4 (decision aids + training) either did not respond to invitations or declined to participate. Reasons for declining included being too busy, feeling they had nothing meaningful to contribute, or being a recent hire to the clinic with no interaction with the interventions.

Table 2 Characteristics of the clinic staff participant sample (n = 29)

Each clinic chose to implement the interventions using a different approach (see Table 3). Some clinics chose to keep the decision aids in an easily accessible space, such as on a desk or in a wall display holder in exam rooms (Clinics 1 and 9–12). In most clinics, the video and prompt cards were handed to patients in the waiting room (Clinics 6, 11, and 12) or a private exam room (Clinics 7–10) prior to the visit.

Table 3 Characteristics of the clinic settings (n = 11)

When analyzed through the lens of the Theoretical Domains Framework, each implementation approach was the result of a dynamic interaction between multiple domains. Clinic staff first needed the capability to implement the interventions, primarily the knowledge, skills, memory, and behavioral regulation required to use them routinely. Their behavior was also motivated by their social/professional role and identity, beliefs about consequences, and goals. Implementation was further modified by factors that influenced the opportunity to engage in the implementation behavior: the social influences of patients and the environmental context and resources (see Table 4).

Table 4 Mapping of the Right For Me findings to the Theoretical Domains Framework (TDF)

Capability

A necessary antecedent to implementation was first being aware of the interventions (“Knowledge”) and how to use them (“Procedural Knowledge”) (see Table 5 for representative quotations). While most staff were aware of the decision aids and patient video, some reported that they never watched the video or viewed the prompt card, were absent during the study team orientation, or did not notice the video and prompt card until partway through the implementation period. This left some unclear about the purpose of these interventions. In contrast, all clinical staff knew the purpose of the decision aids and reported generally using them with fidelity, following the steps provided in the online training (“Explain it, Give it, Use it”). In Clinic 2, however, staff reported using the decision aid after the patient had made a choice, as a way to discuss the benefits and harms of the contraceptive method:

So they choose the method. We review the medical history with them. Then I’ll say, ‘Oh, we’re going to bring you upstairs to see the practitioner, or we’re going to bring you upstairs to see the nurse and I’m also going to give you our handout to go home with as well, which says a little bit more about the rare side effects and complications of the methods.’ (015, Clinic 2, Clinical role, Decision aids + Training)

Clinical staff perceived that the group orientation and decision aid training reinforced their proficiency in patient communication (“Skills”). There was considerable heterogeneity in how staff engaged with the training: some clinics had all staff, both clinical and administrative, complete the training as a team at the outset of implementation; others asked staff to complete it on their own time; and a minority of staff were unaware that they could have accessed the training themselves.

Table 5 Factors related to “Capability” and representative quotations

Clinical staff across all participating clinics had a strong perception that SDM was already a part of their contraceptive counseling approach (“Professional Role”), was appropriate for their clinical context (“Organizational Culture/Climate”), and was facilitated by their use of existing contraceptive counseling resources (“Resources/Material Resources”):

We didn't need a whole lot of training, because this is what I do all the time. So, I have different decision tools, I found them useful, and I didn't – I've been doing it for 20 years ... I didn't need a lot of training to know how to use these with women. Because I also have been trained in shared decision making, and motivational interviewing, and all of that. (002, Clinic 1, Clinical role, Decision aids + Training)

Descriptions of their contraceptive counseling approaches suggested that they successfully used the decision aids to engage in components of SDM. For instance, participants from reproductive health care clinics explained that they typically placed the decision aid providing an overview of all contraceptive methods on a desk between themselves and the patient and used it as a visual aid while asking open-ended questions to elicit the patient’s preferences and identify the features of a birth control method that would suit the patient’s needs. Despite this, health care providers did not always report taking the next step of asking patients to indicate their preferences in writing (“Procedural Knowledge”). For some, this was due to perceived patient disinterest or discomfort with this step (“Social Influences”) and, in these contexts, patient influences were a factor in staff use of the decision aids: “It makes me think that perhaps the staff stopped even utilizing that aspect of that suggestion over time because people weren’t taking them up on it.” (015, Clinic 2, Clinical, Decision aids + Training).

Clinical staff perceived that getting into the habit of using the decision aids (“Behavioral Regulation”) was relatively easy because of their similarity to existing contraceptive counseling resources. However, participants explained that providing a tablet computer to patients and asking them to watch a video before their clinical visit required a change to existing routines for administrative staff. Three clinics (7, 11, and 12) got into the habit of implementing the prompt cards routinely by handing them out to each patient prior to the visit, while other clinics (5, 6, and 10) chose to place a stack of cards in the waiting room or exam rooms and did not interact with the cards thereafter. Notably, staff from clinics that did not have a plan for using the prompt cards and did not make them part of staff routines were the same staff who were unaware that the prompt cards existed (see Table 3).

Participants who reported gaining proficiency in how to use the interventions typically perceived clear leadership from the project contact, who acted as a liaison with the research team. For instance, after experiencing significant staff turnover, the project contact in Clinic 9 began providing one-on-one, in-person training to staff in how to use the decision aids interactively, using a role-playing technique. Some project contacts had staff review all of the interventions to create a common understanding (“Knowledge”), while others worked with staff to develop a plan for how to use them (“Action Planning”).

Clinics implemented the interventions in a way aligned with their work style, contextual needs, and team dynamics (see Table 3). Most clinics developed a plan at the outset of the study for how, where, and when they would integrate the interventions into the clinic workflow, supported by mental reminders and talking points (“Action Planning”), as one participant described:

The biggest thing was I think creating the space for the [interventions], so like physically and mentally. So rearranging the rooms in a way so that we have those sort of stackable file holders, and putting all of the tear-off sheets in those in an order and in a way that made sense. (016, Clinic 9, Administrative role, Both interventions)

Handing the interventions directly to patients and explaining their purpose was more acceptable and feasible than relying on the patient to use them if interested. In contrast, in the context of Clinic 5, participants used a passive strategy for the video and prompt card because staff perceived it was the patient’s responsibility to engage with them:

We just kind of allowed them to watch the video without any intervention on our end to say, ‘Feel free to watch this or we’d like you to watch this beforehand,’ or ‘You have to watch this on your visit.’ It was kind of freeform. (003, Clinic 5, Administrative role, Video + prompt card)

Motivation

Clinic staff were motivated to implement the interventions if they felt that the interventions aligned with their existing roles and responsibilities (“Professional Role”) and added value to their work (“Reinforcement”) (see Table 6 for representative quotations). The content of the decision aids reflected what clinical staff would typically discuss with patients and provided a textual cue to reinforce their “talking points.” One key contextual factor in Clinic 10 was that the project contact changed her implementation approach and encouraged nurses to use the decision aids as part of their responsibilities after observing that they had not adopted them into their professional role.

Table 6 Factors related to “Motivation” and representative quotations

While decision aids were perceived to help clinical staff exercise their existing responsibilities for contraceptive counseling, handing out the videos and prompt cards did not help front desk staff complete their administrative tasks, and this misalignment was a barrier to implementation (“Reinforcement”).

Administrative staff did not know what took place between patients and clinical staff after the patient had watched the video, while clinical staff reflected that they did not know which patients had watched the video in the waiting room: “I’d be curious to know what the study shows as far as patients who took the survey who also said they watch the video. It’s just a tool that was much more hands off for me” (013, Clinic 11, Clinical role, Both interventions). Staff who watched the video had some criticisms of the content, namely that the featured patient was not representative of their patients and sounded “rehearsed,” negatively impacting their motivation to implement it. In contrast, clinical staff involved in contraceptive counseling illustrated the motivating value of directly observing or experiencing a positive effect as a result of using the decision aids (“Reinforcement”). For instance, some staff shared that, after using a decision aid, patients chose a method of contraception that seemed best aligned with their preferences.

Opportunity

The physical context of implementation (“Environmental Context and Resources”) influenced staff members’ motivation and plans to use the interventions. The implementation process was perceived to be easiest in clinics with a self-described “small team” and low caseload, and/or where staff felt they had flexible procedures and infrastructure to adapt the interventions to fit existing routines and their clinic environment (see Table 7 for representative quotations). Implementation was also facilitated in contexts where reproductive health clinics perceived that their organizational routines and priorities were aligned with the goals for the study (“Organizational Culture”). One participant clarified that because their organization follows the “same guidelines and expectations” for informed contraceptive choices “it was just easy and natural. It flow[ed] very naturally for us” (011, Clinic 3, Clinical role, Decision aids + Training).

Table 7 Factors related to “Opportunity” and representative quotations

Clinical staff typically felt that the decision aids fit easily into the clinic or counseling room space and helped to make appointments shorter by creating a more focused conversation. A minority of clinical staff perceived that decision aids “trigger[ed] more questions, and therefore a 15-minute visit would tend to be 20 or 25 minutes” (017, Clinic 10, Clinical role, Both interventions). However, these staff typically found the longer visits easy to adjust to (“Environmental Stressors”). Staff who received both interventions did not perceive that having multiple components added to their implementation “load,” in part because of the division of labor between clinical (decision aids + training) and administrative (video + prompt card) staff. Rather, comments about time pressure and workload were more common among staff who were already experiencing environmental stressors, such as participating in other research studies or having unexpected staff turnover.

Staff from clinics with existing approved counseling materials used these and the decision aids together, so that one supplemented the other (“Resources/Material Resources”). Few staff felt that using multiple resources was cumbersome in counseling, because the resources were so similar. As described above, a minority of staff at reproductive health care clinics perceived that they already practiced SDM (“Competence”), and this attitude led them to believe that the decision aids and accompanying training were redundant (“Motivation”). No staff identified existing materials that the video or prompt card replaced or supplemented.

Finally, one of the core domains that influenced implementation behavior was the “Social Influence” of patients on staff members’ routine use of the interventions. While staff perceived that patients found the decision aids acceptable, they felt that patients had minimal interest in the video and prompt cards, potentially because using them was inconsistent with typical waiting room behavior. Nonetheless, staff felt that both interventions were appropriate for their patient population, in particular for those with lower education and literacy, and those making their own health care decisions for the first time.

Sustainability

When asked if they would want to continue using the interventions now that the trial was complete, staff reported that they would like to continue using the decision aids but had mixed feelings about the video and prompt card. Without the tablets provided by the Right For Me study, most felt that their clinic would be unlikely to continue use of the video in its current format. While staff were keen to continue using the decision aids, those affiliated with a larger organization or network also felt that future implementation decisions would have to be made “high up,” for consistency of content and branding across the organization.

Discussion

Our findings suggest that the decision aids were more acceptable, feasible, and sustainable than the video and prompt cards. Awareness of these interventions, knowing how to use them correctly and competently, integrating the interventions into regular workflow, and having a professional role and organizational culture that supported using the interventions appeared to facilitate intervention implementation. Clinic environments, workflow, and physical space supported implementation of the decision aids, but did not facilitate use of the video and prompt card. While some facilitators are context-specific, our findings suggest that introducing interventions will not be successful without the resources required to modify existing routines and to monitor and sustain behavior change.

In clinics where implementation was described as relatively weak, it appears that the interventions were not considered an essential professional responsibility. While integrating a video via tablet into a busy waiting room may not be a feasible strategy for facilitating SDM, integrating paper-based decision aids into clinical routines may prove more successful. Doing so, however, requires negotiation and planning about what the task is, who does it, how it gets done, and whether it adds real value.

Elwyn and colleagues conducted a thematic analysis of qualitative interviews embedded in the intervention phase of a trial of similar clinical encounter decision aids for treatment of knee osteoarthritis [25]. Before using the decision aid, clinicians expressed concern about time pressures, patient resistance, and patient information overload [25]. After minimal training, the same clinicians perceived that the decision aid was acceptable and helpful, and that it had changed their usual way of communicating. In the USA, case studies of implementation of clinical encounter decision aids in routine care suggested that physicians may not perceive that the decision aids have utility, particularly for patients with low literacy [26]. Lack of suitability for patients was not a factor that emerged from analysis of our clinic staff interviews; rather, participants suggested that patients with low literacy or limited education would benefit most from interventions that facilitate SDM.

Our findings suggest that participants in clinics that implemented both interventions did not experience implementation overload compared with those exposed to only one. Similar to a study investigating implementation of a clinical encounter decision aid for circumcision, we observed that gaining the skills to use the decision aid through practice (a learning curve) was necessary [27].

The research team overestimated some clinics’ capacity to self-organize in designing and preparing for implementation even when the interventions were conceptually aligned. However, the solution for bridging this capacity gap is unclear. The MAGIC (making good decisions in collaboration) program, which sought to implement SDM into routine primary and secondary care, similarly observed that clinical teams feel they already involve patients in decisions about their care [13]. In that program, hands-on role-playing that promoted practical skills and exercises to change embedded attitudes helped to show clinicians how SDM differed from their current practice. Changing individual SDM behavior in contraceptive care may thus require more interactive training, such as role-playing, that emphasizes both skills (Capability) and the value of the skill to the individual and their organization (Motivation).

All participating clinics had a strong perception that SDM was already a part of their organizational culture, facilitated in some clinics by use of existing educational resources. The high acceptability of the decision aids may stem from our extensive provider consultation about their content [28]. Implementing the decision aids was perceived to be simple at the outset (e.g., swapping existing resources for the Right For Me interventions) but sometimes difficult to remember on a day-to-day basis. Not all staff randomized to the decision aid and training intervention reported completing the online modules or using the decision aids as intended (e.g., using them during the counseling encounter and writing on them). A meta-analysis of six randomized controlled trials conducted in US practice settings [29] similarly observed that few clinicians used clinical encounter decision aids with fidelity. The authors of the meta-analysis observed that, after implementing the interventions, clinicians used them as intended only partially and inconsistently, and that higher fidelity was associated with increased patient knowledge and patient involvement in decision-making [29]. Such findings have led Montori and colleagues to suggest that “the answer is not in” regarding the effect of decision aids on SDM [30].

Staff in clinics exposed to the video and prompt cards had limited awareness of the cards, and perceived that the videos were difficult to integrate into routine workflow and of limited interest to patients. The implementation process was seen to be easiest in smaller clinics, or where staff felt they had flexible procedures and infrastructure to adapt the interventions to fit routine practice. The success of implementing these new routines was also dependent on the actions of clinic patients, who may either accept or decline to use the Right For Me interventions. Survey responses from patient participants in the trial will provide further insight into the number of patients who reported using the Right For Me interventions and what proportion would recommend them to a friend (i.e., acceptability). Participants felt that both interventions were most appropriate for their patients with low health literacy. However, these attitudes may not be supported by emerging literature. Recent investigations from Australia suggest that generic question sets alone, like those used in our video and prompt card, are not sufficient to support shared decision-making among adults with low literacy [31], and additional strategies may be required to improve understanding of SDM terms and probability concepts [32].

The video and prompt card used in this study were adapted from the “Ask, Share, Know” program previously tested and implemented in an Australian primary care setting [33]. A systematic review of the use of question prompt lists in routine practice highlighted the importance of clinician “endorsement”: when the list is not given to or mentioned by the clinician, studies demonstrate inconsistent findings with respect to patients’ question asking [34]. In our study, this construct was reflected in clinic staff “motivation.” Clinical staff were largely unaware of which patients had been exposed to the video and prompt card, and wished to know so they could respond to patients’ questions and observe whether or not the interventions were useful to contraceptive counseling. Implementation of the video and prompt card may thus require organization-wide or team-based training that increases clinician awareness of, and motivation to engage with, these tools.

Our study findings also suggest that there may be differences in implementation practices for patient-targeted interventions implemented by administrative staff (video + prompt card) versus those that are intended for the provider to use with the patient (decision aids). The organizational or institutional context may also play an important role. Sexual and reproductive health clinics and organizations have well-established norms, such as organization-wide counseling protocols and branding. These norms may represent a double-edged sword—they provide the capability, opportunity, and motivation for staff to engage in SDM, but may be inflexible to change.

Limitations of this study include the lack of accompanying observational strategies for assessing implementation success, which meant we were unable to investigate the relationships between staff perceptions and actual implementation. We took measures to minimize social desirability bias, but some participants may have over-reported positive and under-reported negative perceptions, attitudes, or experiences. In spite of our partnerships with and recruitment support from clinic staff at each study clinic, we received limited interest in participation from some clinicians, leading to no data for Clinic 4 and only one participating staff member each for Clinics 8 and 11. Findings from those clinics should be interpreted in relation to the other settings, not individually.

The strengths of this study include our systematic application of a theoretical framework for behavior change [22] to develop SDM interventions, comprehensive evaluation of their use in routine contraceptive care, and identification of factors that influence their use. Having an independent researcher conduct and analyze the interviews mitigated potential interviewer and reporting bias. Finally, by including administrative staff in the study sample, we gathered data on the implementation experience of stakeholders across clinic organizations. Interviews with administrative staff provided critical data on the feasibility of the video and prompt card interventions, which would not have been collected through interviews with clinical staff alone.

Conclusion

Our results suggest that clinical and administrative staff perceived the clinical encounter decision aids to be more acceptable and feasible to implement than the patient video and prompt card questions. Interventions that align with existing roles, tasks, and workflow may have greater acceptability, feasibility, and sustainability than those that require new procedures and infrastructure. We demonstrated how the Theoretical Domains Framework can be used to understand the factors that influence implementation of SDM and to create interventions that are theoretically and behaviorally informed. Future studies could build on our findings of the factors that influence implementation of SDM and use the Behavior Change Wheel and COM-B frameworks to characterize and design strategies for implementing our study interventions in different settings [22].