Background

Clinical trial recruitment is an active area of study due to its importance in contributing to the success of clinical trials as well as its many practical challenges [1]. Clinical trials with ineffective recruitment efforts can lead to underpowered or failed studies [2] and can have significant financial and ethical implications [3]. Clinical trials often have difficulty recruiting underrepresented patient groups, resulting in study populations that do not reflect the targeted populations [4,5,6]. Chronic pain clinical trials, in particular, often have difficulty recruiting sufficient sample sizes and recruiting underrepresented patient groups, yet very few studies have investigated the success of different recruitment methods for chronic pain clinical trials [7].

In recent years, digital approaches to clinical trial recruitment (e.g., email, text messages, websites, and social media) have been compared to more traditional approaches (e.g., mailing, phone calls, newspaper advertisements, and media campaigns). The results in terms of response rates, costs, time to recruit participants, and access have been mixed, depending on the specific details of how the digital recruitment tools were implemented [8, 9]. However, several studies have shown that combining digital and traditional recruitment tools may have the potential to improve recruitment outcomes [10, 11].

The Learning to Apply Mindfulness to Pain (LAMP) study is a three-site clinical trial to test the effectiveness of mindfulness-based interventions (MBIs) for chronic pain. Patients with moderate to severe chronic pain were recruited from three U.S. Veterans Affairs facilities and were randomly assigned to two intervention groups (Group MBI and Self-paced MBI), which were compared against usual care. The primary outcome was change in the Brief Pain Inventory (BPI) interference score at 10 weeks, 6 months, and 12 months. Secondary outcomes include changes in pain intensity, global improvement in pain, anxiety, depression, fatigue, post-traumatic stress disorder, physical function, sleep disturbance, and participation in social roles and activities. Additional details can be found in the study protocol paper [12].

Partway through the LAMP study, we switched from traditional postal recruitment to email recruitment, which allowed us to compare the two recruitment modalities in terms of equity, efficiency, and effectiveness in a United States Department of Veterans Affairs (VA) population.

Methods

Patients were recruited from the Minneapolis VA Health Care System (MVAHCS), Durham VA Health Care System (DVAHCS), and VA Greater Los Angeles Healthcare System (VAGLAHS) if their electronic health record (EHR) showed qualifying pain diagnoses on at least two occasions within the same pain category, at least 90 days apart, during the previous two years [12]. The qualifying pain categories were defined using the International Classification of Diseases, Tenth Revision, Clinical Modification (ICD-10-CM) diagnostic codes for common pain conditions [13]. To ensure generalizability of the pragmatic clinical trial, minimal exclusion criteria were used. This study is part of the LAMP trial, which was approved by the VA Central Institutional Review Board.

Recruitment materials were sent to six separate waves of patients, recruited at different times between 2020 and 2022. Each wave was a random sample of potentially eligible men and women from the included recruitment sites at that time. Part of the recruitment occurred during the COVID-19 pandemic. Recruitment was postal only for the first two waves, but we decided to try email recruitment for the remaining waves due to an increase in printing and mailing costs. We also felt that email recruitment might integrate better with the technology focus of the study. Women were oversampled from the identified population to achieve approximately even numbers of men and women randomized into the trial. Patients were sent recruitment materials by either postal mail or email, depending on their recruitment wave. Their postal and email addresses were obtained from the VA EHR. The number of recruitment packets sent for each wave was based on an estimated response rate that would efficiently fill the intervention session times. We also made sure that the group sizes for each wave would not exceed the capacity of the Group MBI intervention facilitators.

Patients in the postal recruitment group, waves 1 and 2, were mailed a packet of documents that included information about the study and instructions for accessing the study website. The mailed packet included an optional quick response (QR) code that could be used to simplify the process of accessing the study website. They were also given contact information and a prefilled postcard to opt out of the study if they wanted. Patients who logged into the study website using a study-specific identifier were prompted to complete the study screener. The patients in the postal recruitment group were from the MVAHCS site and not the other sites.

Patients in the email recruitment group, waves 3–6, were first mailed an introductory postcard, a requirement of our Institutional Review Board (IRB). The postcard notified patients that they would receive an email about participation in the study. They were also given contact information and a website link to opt out of the study if they wanted. Approximately a week after they were sent a postcard, they were sent an email that contained the same information as the packet of documents sent to the postal recruitment group. No one who was sent email recruitment materials requested paper documents. Waves 3 and 4 were patients from the MVAHCS and DVAHCS sites. Waves 5 and 6 were mainly patients from the DVAHCS and VAGLAHS sites.

Reminder postcards were mailed to non-responders in waves 1–3, and reminder emails were sent to non-responders in waves 3, 5, and 6. Reminder emails were not sent to patients in wave 4 because the maximum number of participants who could be included for that wave had already been reached. In line with the pragmatic nature of the study, the reminder methods changed over the different waves as we tried to improve recruitment strategies.

Effectiveness was measured by the response rate of patients to the recruitment materials, where a response was defined as a patient logging into the study website using their study-specific identifier. No more than a single response per patient was recorded in the dataset. Logging into the study website was chosen as a response, as opposed to completing the study screener, because patients may exit the screener early for reasons that do not reflect their engagement with the recruitment method, such as not meeting inclusion or exclusion criteria. We then calculated the ratio of email to postal response rates by dividing the email response rate by the postal response rate.
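As a concrete illustration of this calculation, the response rates and their ratio can be computed from the counts reported in the Results section; the variable names below are ours and not part of the study's analysis code:

```python
# Unadjusted response rates and their ratio, using counts from the Results.
email_sent, email_responded = 19333, 3662
postal_sent, postal_responded = 7986, 506

email_rate = email_responded / email_sent      # ~18.9%
postal_rate = postal_responded / postal_sent   # ~6.3%
rate_ratio = email_rate / postal_rate          # ~3.0 (unadjusted)

print(f"email: {email_rate:.1%}, postal: {postal_rate:.1%}, ratio: {rate_ratio:.1f}")
```

Note that this is the unadjusted ratio; the adjusted rate ratio (RR = 3.5) reported in the Results additionally controls for demographics and site.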

Efficiency was measured by the time for patients to respond to the recruitment materials as well as the difference in estimated costs. Response time was calculated as the number of days from sending the recruitment materials (mailing the packet of study documents for the postal group; sending the email for the email group) to logging into the study website. The postcard mailing date was not included in this calculation because patients were unable to log into the study website until they received either the mailed packet or the email. To compare response times between email and postal recruitment, we generated box and whisker plots by recruitment strategy. The cost for postal recruitment materials included printing and mailing ten different items in the mailed recruitment packet. The cost for email recruitment materials was primarily the cost of the introductory postcards.
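The response-time definition above amounts to a simple date difference. A minimal sketch, assuming send and login dates are available as calendar dates (the function name and example dates are ours, for illustration only):

```python
from datetime import date

def response_time_days(sent: date, logged_in: date) -> int:
    """Days from sending the materials (mailed packet or email) to website login.
    The introductory-postcard mailing date is deliberately excluded, per the Methods."""
    return (logged_in - sent).days

# Hypothetical example records (not study data):
postal_days = response_time_days(date(2020, 9, 1), date(2020, 9, 9))   # 8 days
email_days = response_time_days(date(2021, 3, 15), date(2021, 3, 16))  # 1 day
print(postal_days, email_days)
```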

Equity was based on an analysis of response rates by recruitment method across key demographic groups. We coded age, gender, race, ethnicity, and rurality based on the patient’s entry in the VA EHR. VA rurality data is based on the Rural–Urban Commuting Areas (RUCA) system, which classifies United States census tracts using measures of urbanization, population density, and daily commuting. We also conducted a multivariable analysis of response rates controlling for age, gender, race, ethnicity, rurality, and site.

Results

We identified 121,441 potential participants from the VA EHR and sent postal mail recruitment materials to 7986 patients and email recruitment materials to 19,333 patients (Fig. 1). Table 1 shows the demographic information for the patients sent recruitment materials. Due to the demographics of the Veterans at the different recruitment sites used for the different waves, the patients sent email materials were younger, more often female, more racially and ethnically diverse, and less rural.

Fig. 1
figure 1

Recruitment flow diagram

Table 1 Baseline characteristics of patients sent recruitment materials

Effectiveness

Unadjusted response rates were higher for email recruitment (18.9%) compared to postal mail recruitment (6.3%). Additionally, in a multivariable analysis controlling for age, gender, race, ethnicity, rurality, and site, the adjusted response rates were over three times greater for email recruitment (RR = 3.5, 95% CI 3.1–3.8).

Most non-responders did not contact the study team. However, 1240 (15.5%) patients in the postal group actively refused the screener, mostly using opt-out postcards, compared with 289 (1.2%) of patients in the email group. Recruitment materials were returned to the study team due to a bad address for 314 (3.9%) patients in the postal group and 521 (2.7%) patients in the email group. A small number of patients in both groups, or their family members, contacted the study team to inform us that the patient was ineligible or deceased. Additionally, a few patients were determined to be ineligible or deceased based on a chart review performed by the study team approximately six weeks after the recruitment materials were sent.

Following initial recruitment, 1524 of the 3662 (42%) responders in the email group and 213 of the 506 (42%) responders in the postal group completed the baseline survey, showing that recruitment method did not negatively affect engagement in other study activities. We ultimately randomized 667 (18%) responders from the email group and 144 (28%) responders from the postal mail group into the trial. Due to the unexpectedly high effectiveness of email recruiting, the maximum capacity of the intervention sessions was reached in later waves, and some eligible patients were not randomized to an intervention group in that wave but were included in the next recruitment wave.

Efficiency

The time to respond to the recruitment materials was much shorter for email than postal mail recruitment (Fig. 2). The median time to respond was 1 day for the email group compared to 8 days for the postal group. Many people in the email group responded the same day that the email recruitment materials were sent. Printing and mailing the recruitment materials to the 19,333 patients in the email group would have cost approximately $2.33 per patient, corresponding to total savings of approximately $45,000. Additionally, the personnel time required to prepare and send the recruitment materials was estimated to have taken 130–200 h for the 7986 people in the postal group and 5–20 h for the 19,333 people in the email group.
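The savings figure follows directly from the per-packet cost and the size of the email group; a back-of-the-envelope check using the figures reported above:

```python
# Estimated postal costs avoided by emailing, using the reported figures.
cost_per_packet = 2.33      # printing + postage per patient, USD
email_group_size = 19333    # patients sent email recruitment materials

avoided_cost = cost_per_packet * email_group_size  # ~= $45,000, as reported
print(f"approximate savings: ${avoided_cost:,.0f}")
```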

Fig. 2
figure 2

Response time following email or postal mailed recruitment materials. The median is indicated by the vertical line, the interquartile range by the box, and the 2.5th and 97.5th percentiles by the whiskers

Equity

Email recruitment had a higher unadjusted response rate than postal mail recruitment for every age, gender, race, ethnicity, and rurality category (Table 2). The unadjusted ratio of email to postal response rates was at least 2.2 for each subpopulation, and the highest unadjusted ratio was 7.1 for Black patients.

Table 2 Response rates

Discussion

We found email recruitment to be an effective, efficient, and equitable way to recruit VA patients to the LAMP study. The time to respond was consistently shorter for patients in the email group. The median response time of 1 day was shorter than the minimum estimated time to deliver postal mail. Response rates were higher for email recruitment overall and across individual subpopulations, including for older, non-White, Hispanic, rural, and female Veterans. Additionally, Black and multiracial patients had the largest ratios of email to postal response rates, highlighting the potential of email to recruit populations often underrepresented in research.

The introductory postcards mailed to patients in the email recruitment group may have increased response rates by combining the benefits of digital and traditional recruitment methods, which has also been reported in other non-chronic-pain clinical trials [10, 11]. We heard from members of the study’s Veteran Engagement Panel, a diverse group of Veterans with chronic pain, that the introductory postcards lent credibility to the recruitment email, making it less likely to be disregarded, deleted, or marked as spam. Additionally, the recruitment email allowed patients to access the study website by clicking a link, which required less effort than manually entering the login information or using a QR code, as patients who received the mailed documents had to do. Overall, email recruitment combined with introductory postcards improved recruitment outcomes and reduced burden for both study staff and potential participants.

There were limitations to this study. We were not able to conduct a randomized controlled trial, which would have been the gold standard method to evaluate postal versus email recruitment. Due to the nature of the pragmatic clinical trial, different waves had different recruitment methods and were recruited at different times from different sites, resulting in groups with different demographics. Response rates may have been affected by the different phases of the pandemic and other external factors at the time that each wave was recruited. Additionally, postal mail recruitment was only tested with patients from the Minneapolis VA site. Nevertheless, the multivariable analysis showed that response rates were greater for email recruitment after controlling for site (as well as age, gender, race, ethnicity, and rurality). Email-only recruitment (i.e., without postcards) was not tried with any of the waves, as this was not permitted by our Institutional Review Board (IRB). This study examined only VA patients, and the recruitment outcomes of email and mail recruitment might be different for non-VA populations. Other demographic factors that could impact recruitment, such as education status, socio-economic status, and household income, were not available for analysis. Also, the study required interested participants to sign into a website, which may have been easier for those who received recruitment materials by email. We did not track the time required for email and postal recruitment and instead used an estimate. Finally, recruitment materials were sent during the COVID-19 pandemic, when people spent more time at home and might have been more likely to respond to recruitment materials.

Conclusions

Email recruitment is an effective, efficient, and equitable way to recruit VA patients to large-scale, chronic pain clinical trials. Postal costs and personnel time were also much less for email recruitment. Future studies are needed to further explore how email recruitment affects groups who do not have regular access to email via computer or smartphone. As more VA studies consider using electronic recruitment and data collection, it will be important to ensure that all Veterans have access to resources that enable them to participate in VA research.