The following practical guidance for using the online modified-Delphi approach covers three stages of stakeholder engagement—preparation, implementation, and evaluation and dissemination—and includes examples from our recent study (Fig. 2).
Preparing for Research
Co-Develop an Engagement Approach with Relevant Patient Representatives
Guideline developers should determine who should be engaged in the CPG process and work with patients, caregivers, and their representatives to design all engagement activities and data collection protocols. At this stage, developers should also consider whether patients may have substantively different perspectives than caregivers and, therefore, whether patients should be engaged independently from, or together with, caregivers in the CPG development process. Forming an advisory board (AB) could also be useful. Research suggests it is important to engage relevant stakeholders early on and ask for their input often [27]. Working with a patient advocacy organization can help locate patients, caregivers, and others with relevant perspectives who can provide input on patient needs, the feasibility of proposed engagement activities, appropriate participation burden, and acceptable remuneration for participation. Patient representatives can be instrumental in helping operationalize the engagement tasks, define key concepts, translate scientific information, and finalize research protocols [28, 29]. All research-related activities should be reviewed and approved by an institutional review board.
Examples We worked with the Duchenne Registry to identify key patient and caregiver partners and assembled a multi-stakeholder AB that included one adult with DMD, two caregivers, two clinicians, two genetic counselors, three researchers, and two guideline developers. The AB was co-led by a caregiver and a Delphi expert, who ensured that all decisions were made jointly and that patient and caregiver voices were heard, valued, and given more weight than those of other AB members in discussions about decisions that may have affected what the panelists were asked to do and how the panel results were interpreted. We found patient and caregiver input particularly useful for helping us define, measure, and operationalize patient-centeredness in the guideline context (see Box). Caregivers and patients on the AB also helped us identify the recommendations most likely to be of interest to patients and caregivers. To ensure participants understood the complex medical information, we developed plain language explanations of each recommendation; patients and caregivers worked with clinicians to finalize these descriptions. Using AB input, we also included the clinical rationale for each care consideration, a description of the process for following the guidance, and other relevant information, such as treatment burden.
Mirror Methods Used for Expert and Stakeholder Engagement
One way to increase the scientific rigor and legitimacy of patient engagement in CPG development is to adapt the methods that clinical experts use to develop guidelines. Because CPG development is labor intensive and time consuming, it is crucial to ensure that participants do not feel overburdened [30]. Finding a balance between rigor and ease of participation is key.
Examples To mirror the methods clinicians used for the 2018 DMD care considerations [26], we began Round 1 by providing study participants with data we collected in Round 0 on the reasons for, and the barriers and facilitators associated with, seeking care. We then asked participants to rate the patient-centeredness of guideline recommendations (Fig. 3). This corresponded to the step of providing clinical experts with a literature review before asking them to rate the appropriateness of different treatments. We also adopted a three-round modified-Delphi format and used a nine-point rating scale, which mirrored the appropriateness and necessity scales that clinicians used to develop the 2018 DMD care considerations. Finally, we adopted the RAM approach to determine consensus [25].
Pilot Test the Engagement Approach
It is best practice to pilot test any data collection instrument with a small sample of qualified participants [31]. A pilot is particularly important for online modified-Delphi approaches [32] because the task is novel for a typical patient and there are nuances to using online platforms. It is also important for ensuring participants can actually use the online tool, especially if they have disabilities: because guideline developers and panel participants are not in the same room, developers cannot provide assistance in real time. Pilot testers should not be counted as study participants.
Examples Based on our experiences [33], we recommend testing the clarity of participation instructions, recommendation wording, and rating criteria. A pilot allowed us to estimate the time that participation in each round was likely to take, which helps determine the amount of remuneration, if any. Asking testers for feedback at the end of the pilot via a survey or brief telephone interview can help identify how the wording of recommendations should be changed, what information to add or delete, or how to improve the engagement process. Based on feedback we received during the pilot, we reduced the number of recommendations that participants had to rate.
Recruit Participants with Diverse Perspectives
Expert panels are often criticized for not including diverse perspectives. For example, a panel on the clinical appropriateness of carotid endarterectomy that includes only surgeons will arrive at different recommendations than a panel of surgeons, neurologists, primary care physicians, and radiologists [34]. The same can be true of patient panels. It is important to ensure that patient representatives have relevant experiences and to help them think about the experiences of a typical patient, especially if patient-only panels follow a methodology designed for clinical panels.
Examples We found that using an established and curated patient registry was helpful for recruiting a panel with diverse views. While it may be difficult to know in advance which types of patients hold different views on a given issue, we achieved our diversity goal by drawing on previous research on patient preferences, recruiting demographically and geographically diverse panelists, and including panelists at different stages of disease progression. If recruitment via registries is not possible, screening should be used to confirm a participant's experience with the condition.
Assemble a Panel of Adequate Size and Composition
Assembling panels of adequate size and composition helps ensure effective and productive online discussion and compensates for attrition in online modified-Delphi panels. Research suggests empaneling approximately 40 participants; larger panels may increase participation burden during the discussion round, and smaller panels may shrink too much due to attrition [35]. Attrition is typical for all Delphi panels because they rely on iterative data collection [36]. It is not uncommon for online Delphi panels with only two rating rounds to have participation rates of around 50%, calculated by dividing the number of participants completing all rounds by the number invited to participate [37].
Examples To account for attrition, we included 61 participants in each panel. To reduce attrition, we asked participants during recruitment to confirm their interest and intention to participate. We made sure both panels consisted of patients and caregivers to ensure diversity of perspectives. Because DMD is a rare pediatric disorder, most participants were parents of, or caregivers to, individuals with DMD, but we also included adults with DMD.
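The arithmetic behind panel sizing can be made explicit. The minimal sketch below (Python) assumes the roughly 50% completion rate reported for some online Delphi panels [37] and works backward from a target number of completers; the function name and retention figure are illustrative assumptions, not part of the study protocol.

```python
import math

def panelists_to_empanel(target_completers: int, expected_retention: float) -> int:
    """Estimate how many panelists to empanel so that, after expected
    attrition across rounds, roughly `target_completers` finish all rounds."""
    return math.ceil(target_completers / expected_retention)

# With a ~50% completion rate [37], targeting ~30 completers suggests
# empaneling about 60 participants -- close to the 61 per panel used above.
print(panelists_to_empanel(30, 0.5))  # 60
```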
Implementation and Continuous Participant Engagement
Build Participant Research and Engagement Capacity
CPG groups typically require patients and their representatives to undergo extensive training on the CPG development process, which can make patients unwilling to engage [22]. Although an online platform can help reduce perceived participation burden, it is important to ensure that participation instructions and task descriptions are self-explanatory. Because some participants are more comfortable than others with online technologies and with sharing their disease experiences, CPG developers should try to put all participants on a level playing field.
Examples To build their capacity, we provided participants with instructions on how to participate in the online process and use the online platform. The instructions were modified based on the pilot results. We included instructional videos on how to log into ExpertLens and participate in each round. Because Round 2 used charts showing the distribution of participants’ responses, we provided explanations of what each chart showed, included tooltips that explained statistical terms, and color-coded group responses/decisions (i.e., green text identified recommendations that participants agreed were important or acceptable) (Fig. 4). In case participants had questions or technical issues, they received contact information for study staff, including the principal investigator, caregiver representative, clinician, and technical support personnel.
Build Two-Way Interaction
Although face-to-face interaction may be more engaging than online discussion boards, threaded discussion boards allow participants to engage in more thoughtful conversations and explore other participants' ideas [38]. That is why encouraging two-way information exchange and lively discussion is particularly important for online modified-Delphi panels. Discussion boards should have a clear structure and allow participants to keep track of comments made by others. As with in-person expert panels, an experienced discussion facilitator is crucial. The facilitator's role is to encourage discussion, solicit comments from all participants, and ensure that no single participant dominates the conversation [25, 39].
Examples In our experience, providing the distribution of Round 1 responses and a summary of participants' rationales in Round 2 helps promote discussion because participants see how their responses compare with those of others. A threaded discussion board structure makes it easier for participants to find the right place to share their opinions (Fig. 4). Using participant IDs ensures that all comments made by a given participant can be attributed to that participant, while anonymity facilitates an open exchange of information. We found it useful for the user ID to show whether a participant was a caregiver or a patient, which helped participants contextualize one another's comments [49].
To ensure active discussion engagement, three trained discussion moderators (a caregiver, a genetic counselor, and a modified-Delphi expert) facilitated the discussions by reviewing and posting comments at least once a day. Moderators followed a guide (see Appendix A) and were instructed to focus on group dynamics, ask non-leading clarifying questions, promote direct engagement among participants, and answer factual questions about the study. They also provided access to additional informational resources as needed.
Ensure Continuous Engagement and Retention of Participants
Because participant attrition is common in Delphi panels [32, 36], it is important to keep panelists engaged throughout all study rounds. Unlike one-time surveys, the Delphi method relies on iterative data collection and may be unfamiliar to panelists. Panelists can participate at any time while each round is open but are expected to contribute to every round. Because of the time gap between rounds, reminding panelists about their participation is critical.
Examples To encourage continuous engagement, we informed participants about expected time commitments and paid them $US50 for completing each round. We sent personalized email invitations when each round opened and emailed up to three reminders to lagging participants during each round. We extended round deadlines as needed. If requested, we allowed participants to complete Round 1 after Round 2 had opened but before they saw other participants' responses and comments. Such flexibility may be required when the condition of interest causes significant impairment or treatment burden. During Round 2, participants also received daily discussion digests informing them when others posted new comments or responded to their own comments.
Conduct Scientifically Rigorous Data Analysis
Research shows that the methods used to measure consensus can have a significant impact on study findings [40] and calls for specifying how Delphi data will be analyzed before they are collected [41]. The RAM manual offers a validated and frequently used measure of consensus for nine-point Likert scales [25]. Moreover, Delphi panels have been criticized for low replicability of their findings [42]. Therefore, it is prudent to conduct more than one panel using the same protocol, balance panel composition on key variables that might affect outcomes, and include data from all panels in the a priori determination of group consensus [43]. Because the Delphi technique is based on a mixed-methods approach to data collection, thematic analysis of qualitative comments can help explain why consensus was or was not reached [44].
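To make the consensus logic concrete, the sketch below (Python) implements one simplified operationalization of a RAM-style rule on a nine-point scale: the median determines the group's position, and "disagreement" is flagged when a sizable share of panelists rate in each extreme tertile. The function name and the 30% threshold are illustrative assumptions; actual analyses should follow the RAM manual's exact definitions, including the IPRAS disagreement index [25].

```python
import statistics

def classify_recommendation(ratings, extreme_share=0.3):
    """Classify a set of 1-9 ratings using a simplified RAM-style rule:
    the median determines the group's position, and 'disagreement' is
    flagged when at least `extreme_share` of panelists rate in each
    extreme tertile (1-3 and 7-9). Illustrative only; see the RAM
    manual for the exact definitions, including IPRAS."""
    n = len(ratings)
    low = sum(1 for r in ratings if r <= 3)
    high = sum(1 for r in ratings if r >= 7)
    if low >= extreme_share * n and high >= extreme_share * n:
        return "uncertain (disagreement)"
    median = statistics.median(ratings)
    if median >= 7:
        return "agree (rated important/acceptable)"
    if median <= 3:
        return "disagree"
    return "uncertain"

# One hypothetical panel's ratings for a single recommendation
print(classify_recommendation([8, 9, 7, 8, 6, 9, 8, 7, 9, 8]))  # agree
```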
Examples To ensure the rigor of our panel findings, we published our research protocol at the beginning of the project [18] and used the RAM to measure consensus [25]. We also ran two concurrent panels using the same protocol to assess the replicability of panel findings. We randomly assigned selected participants to one of two panels and balanced the panels in terms of caregiver educational attainment, the ambulatory status of the individual with DMD, and the distance to the closest PPMD Certified Duchenne Care Center [45], which we considered key variables that might affect determinations of patient-centeredness [46]. Our a priori criterion for patient-centeredness was that both panels had to agree that a recommendation was important and acceptable. Finally, we qualitatively analyzed all comments made by participants throughout the panel to determine points of agreement and disagreement and any differences in perspectives between patients and caregivers.
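As a rough illustration of the balanced assignment step, the following sketch (Python, with hypothetical field names and data, not the study's actual variables or procedure) randomly splits participants into two panels while balancing composition within strata defined by the key variables.

```python
import random
from collections import defaultdict

def assign_balanced_panels(participants, strata_keys, seed=0):
    """Randomly assign participants to two panels, balancing composition
    by shuffling and alternating assignment within each stratum (each
    unique combination of the balancing variables)."""
    rng = random.Random(seed)  # fixed seed for a reproducible assignment
    strata = defaultdict(list)
    for person in participants:
        strata[tuple(person[k] for k in strata_keys)].append(person)
    panels = {"A": [], "B": []}
    for members in strata.values():
        rng.shuffle(members)
        for i, person in enumerate(members):
            panels["A" if i % 2 == 0 else "B"].append(person)
    return panels

# Hypothetical records; field names are illustrative.
participants = [
    {"id": 1, "education": "college", "ambulatory": True, "near_center": True},
    {"id": 2, "education": "college", "ambulatory": True, "near_center": True},
    {"id": 3, "education": "high school", "ambulatory": False, "near_center": False},
    {"id": 4, "education": "high school", "ambulatory": False, "near_center": False},
]
panels = assign_balanced_panels(participants, ["education", "ambulatory", "near_center"])
print({name: [p["id"] for p in members] for name, members in panels.items()})
```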
Evaluation and Dissemination
Evaluate Engagement Activities
Participant experiences with the Delphi process are not typically evaluated as part of each panel. Yet understanding what works and what does not is important for measuring the quality of panel findings and the engagement process, as well as for retaining participants during iterative data collection [47].
Examples All panels conducted using the ExpertLens system include questions that measure participant experiences and satisfaction with the platform [48]. For our study, we slightly modified these questions and asked them after Rounds 1 and 3. We also interviewed a diverse sample of individuals with DMD and their caregivers after the modified-Delphi process was completed [49].
Disseminate Results
Sharing results with participants [50] is a key principle of participant-centered research [51], and sharing individual results and overall study findings can help enroll and retain participants in longitudinal projects [52, 53]. Disseminating study findings to wider audiences, including patients, caregivers, clinicians, and guideline developers, is important not only for the conduct of rigorous and transparent research but also for improving care quality and helping develop future guidelines [2, 54].
Examples Feedback on Round 1 results provided to participants can serve as an important incentive to participate and engage in Delphi panels. In Round 2 of our study, we not only provided statistical summaries of Round 1 ratings, but also thematically analyzed the reasons behind participant ratings. We also emailed copies of Round 2 discussion comments to participants who requested them after the panels were completed. We presented preliminary study findings to our panelists using a webinar format that has been posted on the PPMD’s YouTube channel (https://www.youtube.com/watch?v=aps_E08C4fg). To reach a wider audience, we presented our results at the annual PPMD and G-I-N conferences, as well as at the Centers for Disease Control and Prevention, which was responsible for developing the 2018 DMD care considerations. In addition, we gave a G-I-N webinar, which was recorded and posted on the G-I-N North America’s website (https://g-i-n.net/library/webinars/g-i-n-n-a-webinars/a-new-online-approach-to-engaging-patients-andcaregivers-in-guideline-development/?searchterm=khodyakov). Finally, we published the results in peer-reviewed journals [19, 46, 49].