Increasing Patient Engagement in Pharmacovigilance Through Online Community Outreach and Mobile Reporting Applications: An Analysis of Adverse Event Reporting for the Essure Device in the US
- Cite this article as: Bahk, C.Y., Goshgarian, M., Donahue, K. et al. Pharm Med (2015) 29: 331. doi:10.1007/s40290-015-0106-6
Preparing and submitting a voluntary adverse event (AE) report to the US Food and Drug Administration (FDA) for a medical device typically takes 40 min. User-friendly Web and mobile reporting apps may increase efficiency. Further, coupled with strategies for direct patient involvement, patient engagement in AE reporting may be improved. In 2012, the FDA Center for Devices and Radiological Health (CDRH) launched a free, public mobile AE reporting app, MedWatcher, for patients and clinicians. During the same year, a patient community on Facebook adopted the app to submit reports involving a hysteroscopic sterilization device, brand name Essure®.
Patient community outreach was conducted to administrators of the group “Essure Problems” (approximately 18,000 members as of June 2015) to gather individual case safety reports (ICSRs). After agreeing on key reporting principles, group administrators encouraged members to report via the app. Semi-structured forms in the app mirrored fields of the MedWatch 3500 form. ICSRs were transmitted to CDRH via an electronic gateway, and anonymized versions were posted in the app. Data collected from May 11, 2013 to December 7, 2014 were analyzed. Narrative texts were coded by trained and certified MedDRA coders (version 17). Descriptive statistics and metrics, including VigiGrade completeness scores, were analyzed. Various incentives and motivations to report in the Facebook group were observed.
The average Essure AE report took 11.4 min (±10) to complete. Submissions from 1349 women, average age 34 years, were analyzed. Serious events, including hospitalization, disability, and permanent damage after implantation, were reported by 1047 women (77.6 %). A total of 13,135 product–event pairs were reported, comprising 327 unique preferred terms, most frequently fatigue (n = 491), back pain (468), and pelvic pain (459). Important medical events (IMEs), most frequently mental impairment (142), device dislocation (108), and salpingectomy (62), were reported by 598 women (44.3 %). Other events of interest included loss of libido (n = 115); allergy to metals (109), primarily nickel; and alopecia (252). VigiGrade completeness scores were high, averaging 0.80 (±0.15). Reports received via the mobile app were considered “well documented” 55.9 % of the time, compared with an international average of 13 % for all medical products. On average, there were 15 times more reports submitted per month via the app with patient community support versus traditional pharmacovigilance portals.
Outreach via an online patient community, coupled with an easy-to-use app, allowed rapid and detailed ICSRs to be submitted, with gains in efficiency. Two-way communication and public posting of narratives led to successful engagement within a Motivation-Incentive-Activation-Behavior framework, a conceptual model for successful crowdsourcing. Reports submitted by patients were considerably more complete than those submitted by physicians in routine spontaneous reports. Further research is needed to understand how biases in patient-initiated reporting differ from those seen in traditional pharmacovigilance.
Spontaneous adverse event reporting to the US FDA was encouraged using an easy-to-use Web and mobile app along with engagement of a Facebook patient group, specifically for Essure, a hysteroscopic sterilization device.
A total of 1349 valid reports were received through the app over approximately 19 months, equivalent to 15 times more reports than through traditional channels, with high completeness scores.
The reports were characterized including symptoms and outcomes reported, and the motivations and incentives in this engagement model for pharmacovigilance are discussed.
It is widely acknowledged that once medical products are on the market, adverse events (AEs) are severely underreported [1, 2]. While legally binding reporting from manufacturers to regulatory agencies and formal post-marketing studies are conducted, capturing complete information continues to be a challenge. The challenge is exacerbated for implantable medical devices, since pharmacoepidemiology practice developed largely for drugs and biologics. For example, in the USA, nearly a third of mandatory reports from device manufacturers are delivered later than the 5-day requirement, creating bottlenecks in which information becomes lost, misinterpreted, or delayed.
Meanwhile, patient-reported outcomes are now accepted for clinical trials with new medical products, and we have seen renewed focus on patient-reported outcomes in comparative effectiveness research [6, 7]. However, only 2 % of post-marketing reports about medical devices received by the US Food and Drug Administration (FDA) come from patients. Part of the reason for low patient participation is likely reporting burden; the FDA estimates it takes 40 and 73 min to complete a report for a medical device using the MedWatch 3500 (voluntary) and MedWatch 3500A (mandatory) forms, respectively. While patient reporting for drugs has been shown to be a valuable addition, with many countries encouraging this practice [10, 11, 12, 13], the same has not been seen for medical devices; whether patient reporting for implantable medical devices can provide high-quality, complete, and novel information remains an open question.
Despite limited patient engagement in device AE reporting, online tools have served to expand participation in public health reporting, generally termed “digital disease detection.” These tools have been applied in monitoring infectious diseases, and “crowdsourcing” or “participatory epidemiology” efforts in this domain have proven especially successful. Crowdsourcing consists of systematic efforts to collect information from a wide audience, particularly through the use of online tools, that are mutually beneficial to the participants and activity sponsors. Keating and Furberg’s conceptual framework for crowdsourcing, known as the Motivation-Incentive-Activation-Behavior (MIAB) model, deconstructs crowdsourcing into components that are required for its successful implementation. While successful crowdsourcing requires open and active communication between participants and sponsors, existing pharmacovigilance practices typically yield limited feedback and communication.
The objective of this study was to assess the potential for participatory epidemiology in post-marketing medical device surveillance, specifically by engaging an online patient community to encourage submission of individual case safety reports (ICSRs) through an online tool. We evaluated the quality of data collected through a Web and mobile app called MedWatcher and applied the MIAB framework to characterize successful patient engagement. The device of interest is Essure® (Bayer HealthCare Pharmaceuticals, Inc., Parsippany, NJ, USA), the first hysteroscopic sterilization device approved in the USA [18, 19]. This class III device contains an inner coil of stainless steel with polyethylene fibers and an outer coil of titanium-nickel. The coil is placed into each fallopian tube during an outpatient visit and is promoted as “permanent birth control.” Subsequent tissue growth around the coil occludes the fallopian tubes, with a test 3 months post-procedure to confirm correct placement and blockage.
2.1 Data Sources
MedWatcher is a Web and mobile app developed by Epidemico (Boston, MA, USA) and launched in September 2012, freely available to the US public for streamlined and user-friendly AE reporting to the FDA. MedWatcher was developed in partnership with the FDA Center for Devices and Radiological Health (CDRH) to overcome the limitations posed by traditional reporting methods. It is available in English on iOS or Android devices as well as on a mobile-optimized website.
Using MedWatcher, patients and physicians can submit AEs for medical devices, drugs, vaccines, and biologics. The app’s report form corresponds to the fields of the MedWatch 3500 form and requires an event description and email address. Optional fields include sex, age, event outcome, and image file. Users receive an email confirmation with the content of their report, formatted to allow printing and sharing with their care providers. In parallel, the system automatically prepares the ICSR in an E2B format. ICSRs are processed in a secure cloud computing environment, manually reviewed to remove spam and test submissions, and transmitted electronically to CDRH using a dedicated voluntary reporting gateway, where automatic consistency, formatting, and completeness checks are conducted before each report is entered into the Manufacturer and User Facility Device Experience (MAUDE) database.
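The required/optional field split described above can be illustrated with a minimal validation step of the kind a reporting pipeline might run before queueing a report for E2B conversion. This is a hedged sketch, not the actual MedWatcher code; the field names (`event_description`, `email`, etc.) are assumptions for illustration.

```python
# Illustrative sketch of pre-submission validation; NOT the actual MedWatcher
# implementation. Field names are assumed for illustration only.
REQUIRED = ("event_description", "email")
OPTIONAL = ("sex", "age", "outcome", "image")

def validate_report(report: dict) -> list:
    """Return a list of problems; an empty list means the report can be queued."""
    problems = []
    for field in REQUIRED:
        if not report.get(field):
            problems.append("missing required field: " + field)
    # Flag any fields that do not correspond to a known form element.
    for field in sorted(set(report) - set(REQUIRED) - set(OPTIONAL)):
        problems.append("unrecognized field: " + field)
    return problems
```

In this sketch a report passes only when both mandatory MedWatch 3500-derived fields are present, mirroring the app's requirement of an event description and email address.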
Two-way communication through the gateway allows for FDA case report numbers to be provided back to the patient via the app, enabling consolidation of follow-up reports. As stated in the terms of service for the app, the narrative text of each report is de-identified and shared publicly with other app users, fostering a sense of community and creating a source for safety information. For this analysis, we obtained public, redacted narratives involving Essure from the MedWatcher website, starting from the first submitted report (May 11, 2013) through to December 7, 2014.
2.2 Product Selection
We selected Essure as the product of analysis because it was the most frequently reported product via the MedWatcher app: of the 3290 MedWatcher reports received as of June 2015, 2600 involved Essure. We also had significant and successful patient engagement with Essure users through social media.
Essure was originally approved in the USA in 2002. The “Summary of Safety and Effectiveness” reported AEs during pivotal trials, differentiating between those that happened on the day of implantation and those that occurred during 1 year of follow-up. The top five events on the day were cramping, pain, nausea/vomiting, dizziness/light headedness, and bleeding/spotting. The top five events in the first year of follow-up were (lower) back pain, abdominal pain/cramps, dyspareunia (painful sexual intercourse), pain/discomfort, and dysmenorrhea (menstrual cramps). While allergy to metals was not observed in clinical trials, it was mentioned as a possibility in the “Warnings” section of the label.
A more recent FDA review of the 943 Essure AEs received from November 4, 2002 to October 25, 2013 found: “The most frequently reported adverse events were pain (606), haemorrhage [bleeding] (140), headache (130), menstrual irregularities (95), fatigue (88), and weight fluctuations (77). The most frequent device problems reported were the migration of the device or device component (116), patient device incompatibility (113) (e.g., possible nickel allergy), device operating differently than expected (73), malposition of the device (46), and device breakage (37)”. The FDA received one report from a physician about a death due to necrotizing Streptococcus spp. infection associated with the device, although the manufacturer explains that “the medical opinion of the attending physician was that the cause of death was not directly related to the essure [sic] inserts or procedure”. Academic researchers have noted tubal perforation, pain [30, 31], and placement failures in routine clinical practice.
2.3 Patient Community Outreach
“Essure Problems” is a Facebook group launched in March 2011 by Angie Firmalino, a patient experiencing severe AEs following Essure implantation. Since launch, patients organically joined the Facebook group, and as of June 2015, there were 17,850 members, managed by 11 volunteer administrators. The group provides an environment where patients can share information and experiences regarding Essure, including an organized directory of files, such as a list of doctors offering device removal, a list of symptoms experienced by members, and a collection of publications and articles about the product. Discussions of benefits also occur, but the group was formed largely in the context of harm. In October 2013, a representative from the MedWatcher app development team (co-author CYB) joined the group to provide technical support to patients filing AE reports to the FDA.
Through active engagement in the patient community, factors that contributed to participation in reporting were observed by applying the MIAB model. In this model, “motivation” is the reason for interest, and “incentive” is what leads someone to act. “Activation” is the set of factors that lead to actual participation, and “behavior” is the activity of interest and its outcome, in this case, submitting an AE report. During engagement of the Facebook patient group, factors acting as motivations and incentives that were distinct from those of traditional reporting channels were specifically noted.
In preparation for the present study, a series of discussions were initiated with the Facebook group administrators, starting February 2014, to explain the intent for a research publication. Two administrators of the Facebook group were elected to participate in the research process (co-authors MG and KD), and additional discussions clarified the use of a regulatory coding ontology, appropriate interpretation of spontaneous data, and expectations for the peer-review process. Concepts of legitimacy and integrity were clarified on both sides, and there was agreement that the results would be prepared for publication regardless of the content of the data received. The outcomes of discussions were communicated back to the Facebook group administrators or the entire group.
2.4 Coding Adverse Events (AEs)
AE symptoms reported in mobile app reports were tagged by two certified MedDRA coders (co-authors CYB and CEP) using MedDRA (Medical Dictionary for Regulatory Activities) version 17 at the preferred term (PT) level. All reports were in English. The two coders jointly tagged the first 20 reports to establish coding guidelines, then proceeded to code the remaining reports independently, maintaining a living document of codes and coding guidelines, which were iteratively discussed and updated.
2.5 VigiGrade Completeness Scores
VigiGrade completeness scores, developed by the Uppsala Monitoring Centre of the World Health Organization (WHO-UMC), were calculated based on rules outlined by Bergvall et al. VigiGrade completeness scores have been used routinely in the WHO’s safety report database, VigiBase, since 2010. Dimensions accounted for in the VigiGrade completeness score include time-to-onset, indication, outcome, sex, age, dose, country, primary reporter’s occupation, report type, and the presence of informative free-text information. Possible scores range from 0.07 to 1.0, with each report starting at 1.0 and then subsequently penalized for each dimension lacking or containing limited information.
Since Essure is a device with only one indication (permanent female birth control) and no off-label use was noted, there was limited opportunity to penalize for indication or dose. Because the MedWatcher app is intended for submission to the US FDA, the assumption was made that the patients resided in the USA unless otherwise stated (respondents providing non-US addresses were notified of a failure to submit to the FDA and then guided to report to their national authorities). Free-text information was required on the app’s report form. With these assumptions and requirements, the lowest score possible for Essure reports submitted through MedWatcher was 0.139.
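The scoring logic described above (start at 1.0, then penalize for each missing or limited dimension) can be sketched as a multiplicative score. The penalty weights below are placeholders chosen for illustration, not the published Bergvall et al. values; consult the original VigiGrade paper for the actual weights.

```python
# Illustrative multiplicative completeness score in the spirit of VigiGrade.
# Penalty weights are ASSUMED placeholders, not the published values.
PENALTIES = {
    "time_to_onset": 0.5,        # assumed heaviest penalty when absent
    "indication": 0.9,
    "outcome": 0.9,
    "sex": 0.9,
    "age": 0.9,
    "dose": 0.9,
    "country": 0.9,
    "reporter_occupation": 0.9,
    "report_type": 0.9,
    "free_text": 0.9,            # informative narrative present?
}

def completeness(report: dict) -> float:
    """Start at 1.0 and multiply in a penalty for each missing dimension."""
    score = 1.0
    for dimension, penalty in PENALTIES.items():
        if not report.get(dimension):
            score *= penalty
    return round(score, 3)
```

Under this model, a report supplying every dimension scores 1.0, and each omission compounds multiplicatively, which is why required free text and a single fixed indication (as with Essure reports via the app) raise the floor of attainable scores.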
2.6 Data Analysis
Consistency and logic checks were used to identify input and coding errors. Narrative fields were cross-checked for consistency with structured data elements to correct errors where possible by directly emailing the reporter for clarification. Events requiring medical care were coded as those that were life threatening, resulted in a hospital stay or prolongation of one, or resulted in a visit to the emergency department. The PT “nonspecific reaction” was not included in analyses. Important medical events (IMEs) were identified using the European Medicines Agency list for MedDRA 17; the IME list contained 7605 PTs. Summary and descriptive statistics were calculated in Stata version 13 (StataCorp, College Station, TX, USA) or visualized in DataGraph 4 beta (Chapel Hill, NC, USA).
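The IME identification step amounts to intersecting each report's coded PTs with the EMA list. A minimal sketch, assuming a small example subset of PTs taken from the results rather than the full EMA list of 7605 terms:

```python
# Sketch of IME flagging: intersect a report's MedDRA preferred terms with the
# EMA IME list. The three PTs below are examples from this study's results,
# standing in for the full list of 7605 terms.
IME_PTS = {"mental impairment", "device dislocation", "salpingectomy"}

def flag_imes(report_pts: set) -> set:
    """Return the subset of a report's preferred terms that are IMEs."""
    return report_pts & IME_PTS
```

A report is then counted toward the IME tally (598 women in this study) whenever this intersection is non-empty.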
3.1 App Usage
3.2 Nature of Reported AEs
Statutorily defined serious events were reported in 1047 cases (77.6 %). The events were serious enough to require hospitalization and other medical attention in 475 cases (35.2 %), and 382 reports (28.3 %) indicated the patient had experienced disability or permanent damage after implantation of the device.
Some variation in age distribution was observed among the more common AEs and IMEs (Figs. 4, 5). Average age was 33.9 years. PTs occurring in women older than average were device dislocation (35.0 years), arthralgia (joint pain) (35.2 years), uterine perforation (35.2 years), and endometrial ablation (scarring of uterus to stop bleeding) (36.0 years). Younger women more frequently reported post-procedural haemorrhage (bleeding) (31.9 years) and spontaneous abortions (miscarriage) (29.5 years).
3.3 Completeness of Reporting
The average length of the free-text narrative field was 104 words, ranging from 3 to 1557; no restrictions on length were imposed by the app. VigiGrade completeness scores ranged from 0.2 to 1.0, with an average of 0.80 (SD 0.15). Time-to-onset information was provided in 858 reports (63.6 %), with high precision (less than 1-month uncertainty) for 33.0 %. Using the WHO’s threshold for “well-documented reports” (0.80), 55.9 % (n = 754) of reports were well documented. Completeness scores did not vary by age (data not shown). However, average narrative completeness scores were higher among women who also reported serious AEs using checkboxes (0.83, SD 0.13) compared with those who did not use this feature (0.61, SD 0.13), suggesting the need to balance structured and semi-structured fields.
This analysis presents an approach for encouraging patient AE reporting via a crowdsourcing tool in collaboration with online patient community outreach. The inverse relationship between survey length and response rate is well studied [38, 39], and it is likely that the efficiency of reporting, from 40 min via traditional routes to 11.4 min via the MedWatcher app, contributed to higher volumes of reports being submitted. During the 132 months after marketing authorization, CDRH received 943 reports in MAUDE for Essure (an average of seven per month). By comparison, there were 1349 reports received via the app during the 19 months (103 per month) of the study period (ratio 14.7:1), acknowledging that some reports may have been submitted via both channels, and that some events may have occurred years before. For drug AEs in general, Hoffman et al. have found increasing reports during the first three quarters after approval and relatively constant counts after that. While it is unclear how this pattern may apply for consumer-oriented medical devices, the sudden increase in reports submitted via the app that occurred a decade after initial marketing is likely related to the outreach conducted by the Facebook group administrators and use of the MedWatcher tool.
Second, new incentives arise from positive communications within the patient community, endorsing the app with reassurance that a group member’s participation will help other women. This is evidenced by the following posts in the Facebook group: “We have an obligation to our daughters to report it, if we don’t, this could be a viable birth control option for them” and “If you have had any problems after being implanted with Essure, please file a report with the FDA […] That is the only way the FDA knows what is going on […]. File here. [URL to MedWatcher app]”. The incentive of receiving external validation and empathy for one’s experience is supported by public encouragement in the patient community. For example, when a group member posted a screen capture of the email confirmation received from the app, others responded with many Facebook “likes,” a mechanism of quick, positive response to a post, or comments such as “Wootwoot! AWESOME!!”
The third difference with traditional pharmacovigilance is the presence of feedback loops based on bidirectional communication. One feedback loop resulting from a positive user experience reinforces the community’s endorsement of the technology. A second feedback loop operates between the end user and the patient community when anonymized and redacted reports are publicly posted in the app. In summary, the collaboration between the mobile app and Facebook group yielded new incentive structures and feedback mechanisms by allowing patients to communicate confidentially with each other and by making anonymized reports public.
Leverage-salience theory further corroborates that individual participation in survey research is greater when community involvement is present. As Keating and Furberg point out, motivation “is greater if members of an individual’s social network indicate the importance of participating in an event”. The motivational support pathways emerged naturally without interference by app developers or the FDA; deliberate alignment of incentives is an area for consideration in future efforts. Another potential enhancement is providing personalized feedback. Previous research has shown that personalized feedback from healthcare providers is a key factor in submitting another AE in the future. While the publicly posted reports may go some distance in generating feedback content, further steps can be taken to customize this information for the reporter.
The reports received via the app were more complete on average than reports received by regulatory agencies worldwide. The average VigiGrade completeness score is 0.45 for the 7.0 million reports in VigiBase through to January 2012, while the reports received via the app averaged 0.80. Reports received via the app were considered “well documented” 55.9 % of the time, while the international average is 13 %; in other words, app reports were more than four times as likely to be well documented. Further, reports from patients via the app were more complete than reports completed by physicians worldwide (for all medical products); only 24 % of reports from physicians are considered well documented. Since VigiGrade scores vary by country, it is worth noting that US physicians rank near the bottom of the list globally in completeness of reports. The information provided by patients, in 11.4 min on average via the app, was of much higher quality than anticipated.
While this study highlights the usefulness and value of online tools in patient reporting, its greatest limitations are generalizability and replicability. Spontaneous report data are limited in general by the lack of a patient exposure denominator. Further, it is unclear what biases may exist in reports submitted by a particularly active and motivated group of patients, and how their access to technology may play a role in those biases. Traditional spontaneous report data originating from patients versus healthcare providers have typically involved a different breadth of body organ systems, yet convey a similar overall picture of drug problems. Similar patterns may be present in data reported via consumer-oriented apps. It will be important in the future to explore what kinds of biases may operate due to social stigma, which may lead to differential submissions of certain types of medical events (e.g., those less stigmatized) or by certain types of patients (e.g., those with more normative patient identities) at the intersection of social media and apps. Until such studies are undertaken, there will be limited direction for how to incorporate stimulated data into quantitative signal detection methods, especially in regard to signal-to-noise determination. It also remains to be seen whether a co-promotion model with patient communities is a sustainable and scalable enterprise. With the MedWatcher app, similar efforts are already emerging with other Facebook patient groups.
In addition, the nature of the product (narrow indication, no notable off-label use) allowed us to make assumptions that led to higher VigiGrade completeness scores. Due to the lack of a device-specific completeness metric, VigiGrade was used as the best alternative. While VigiGrade scores are intended for multi-national comparisons, global data on medical device reports are not currently available, making relative completeness of the reports somewhat difficult to interpret.
Despite these limitations, the FDA has been responsive to this reporting population. In the June 2014 report, FDA stated that they “reviewed Essure patient reports of problems (including Web-based testimonials) and reports of problems submitted to the FDA from other sources, including doctors, patients, and the manufacturer”. We applaud the Agency’s efforts to extend the conversational space for collecting AEs. The Agency went as far as to meet with representatives of the Facebook group in February 2014, as a venue to communicate and express the group’s concerns. Most recently, on June 24, 2015, the Agency announced that an Advisory Committee Meeting would be held in September 2015 to discuss risks and benefits of Essure, citing that “the majority of reports received since 2013 have been voluntary reports, mostly from women who received Essure implants”.
Creating a mobile app that mimics ICSRs provides a technological solution to the greater medical problem of underreported AEs. A more holistic solution demonstrated in the present study is the engagement of patient groups to responsibly promote use of the app, grounded equally in social theory and medical informatics. At the heart of the user experience in this coupled engagement approach is the sharing of information, a traditionally sensitive subject in pharmacovigilance. Impending technological advances such as automated anonymization can support broader liberation of ICSR narratives, bringing pharmacovigilance closer to the “lively, engaging, dynamic, collaborative, humane enterprise” that it has the potential to be.
The authors thank all the women who shared their experiences via MedWatcher. The views expressed in this paper are solely those of the authors and not of the funding source. The authors acknowledge past and present CDRH collaborators: Fei Wu, John-William Declaris, Doug Wood, Mary Beth Ritchey, Isaac Chang, Benjamin Eloff, and Allison Huffman, as well as FDA scientists Bob Ball and Skip Francis. The authors appreciate reviews of portions of this manuscript by Kristen Bibeau and Roxanne Saucier. The authors thank the mobile development team for improving the app: Katelynn O’Brien, Kyra McKenna, and Lucas Baptista. The authors thank Eric Xu and Carly Winokur for assistance in preparing this manuscript for submission.
Compliance with Ethical Standards
This research was funded in part by the Center for Devices and Radiological Health of the US Food and Drug Administration (HHSF223201210016C).
Conflict of interest
CYB, CCF, CMM, CEP, HR, and ND are employees, and JSB is a consultant of Epidemico, Inc., a company attempting to commercialize the information technology aspects of this research. MG, KD, and RB have no conflict of interest to report. The manufacturer(s) of Essure® are not clients of Epidemico, nor were they contacted about this research prior to submission. Epidemico is a wholly owned subsidiary of Booz Allen Hamilton.
Open Access This article is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 International License (http://creativecommons.org/licenses/by-nc/4.0/), which permits any noncommercial use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.