Background

As access to digital technology continues to expand across the globe, evidence is mounting that digital tools can increase knowledge about family planning and reproductive health (FP/RH), facilitate informed decision-making processes and access to FP methods, and enable individuals to self-screen to promote and protect their reproductive health—all important components of self-care [1,2,3,4]. WHO’s Guideline on Self-Care Interventions for Health and Well-Being recognizes that digital health can provide “accurate and tailored information on specific healthcare interventions and technologies” [1]. The number of digital health interventions that provide access to FP/RH information and care to various population groups in low- and middle-income countries has grown tremendously over the past decade. With the addition of artificial intelligence-driven tools such as ChatGPT, it is likely that these interventions will expand at an even faster pace.

Just as access to high-quality contraceptive services is a human right, so is access to high-quality information about contraception. However, to our knowledge, there is currently no resource that assesses the content quality of user-facing digital FP tools. Our primary objective was to systematically assess the content of digital FP tools for comprehensiveness and accuracy across 12 content areas reflecting core elements of counseling for informed choice, and to provide actionable insights to strengthen this content. Following this analysis, we recommended a list of vetted digital FP tools for adaptation and/or scale-up and created a checklist that can be applied to assess and improve digital FP content quality.

Methods

Tool landscaping

To source user-facing digital tools for family planning, we implemented a purposive sampling approach. We reviewed existing compendiums and resources, performed web searches, and conducted targeted outreach with family planning collaborators (Table 1). Through this process, we compiled a list of 29 tools. To be included in the analysis, tools were required to meet the following eligibility criteria:

  • Contain detailed FP content with the (stated or implied) aim of increasing individuals’ knowledge about FP

  • Designed for and deployed to individual users in low- and middle-income countries

  • Content available in English or French

  • Include fixed or static content (e.g., content on a website or in an app that does not change as opposed to a social media campaign with short-lived and rotating messages or other dynamic, user-generated, or social-media based content)

  • In use at the time of the search (2020–2021) or in use since 2017

  • Delivered via a widely accessible platform such as SMS, chatbot, smartphone application, or website

  • Tool owner was willing to provide access to content in a format that facilitated review, such as (but not limited to) MS Word or MS Excel

Table 1 Digital tool sources

Development of rubric to assess content quality

Based on a review of global guidance documents such as the High Impact Practice Briefs, the Family Planning Handbook, and the Medical Eligibility Criteria for Contraceptive Use, we decided to evaluate the content of user-facing tools for family planning across 12 key content areas (Table 2) that reflect essential elements of informed choice counseling. For the criterion that tools should cite all modern methods, we adopted the widely used categorization proposed by Hubacher and Trussell, though we acknowledge that these categories do not necessarily align with those of contraceptive users [5, 6]. We consulted with our medical advisor—who has clinical FP knowledge and experience contributing to the development of key content resources for FP—to assign each key content area a weighted value based on relative importance for informed choice counseling, with a weighted value of 1.0 representing the highest importance (see Table 2). We incorporated the key content areas, as well as general information about each tool (i.e., level of operation (global, country, etc.), objectives, target population, evidence of effectiveness), into an original rubric for content review and analysis. Our medical advisor tested, iterated on, and tailored the rubric prior to initiating analysis.

Table 2 Content area assessment criteria and assigned weights

Content review and analysis

Our medical advisor conducted a primary content review of each of the 11 tools, scoring each tool by content area and noting any omissions, inaccuracies, or incomplete descriptions, by FP method where applicable. Some content areas could be scored as “not applicable.” For example, a tool providing general information about FP methods does not verify reproductive intentions by design.

Two analysts with backgrounds in public health and expertise in FP and digital tools in low- and middle-income countries extracted information from the rubric applied by the medical advisor, quantified the omissions, inaccuracies, and incomplete descriptions by family planning method for each key content area, and summarized the information in a master table (Table 3). Key content areas were assigned a categorical score of “green” (numerical value of +1), “yellow” (numerical value of 0), or “red” (numerical value of -1) based on the following classification:

  • Green (+1): Contains ≤ 1 omission, inaccuracy, or incomplete description

  • Yellow (0): Contains 2–3 omissions, inaccuracies, or incomplete descriptions

  • Red (-1): Contains ≥ 4 omissions, inaccuracies, or incomplete descriptions
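As a minimal sketch, the classification rule above can be expressed as a simple threshold function (the function and variable names here are illustrative, not taken from the study's analysis code):

```python
def classify(error_count):
    """Map a content area's count of omissions, inaccuracies, and
    incomplete descriptions to its categorical and numerical score."""
    if error_count <= 1:
        return ("green", 1)    # <= 1 issue
    elif error_count <= 3:
        return ("yellow", 0)   # 2-3 issues
    else:
        return ("red", -1)     # >= 4 issues
```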

Table 3 Omissions and errors in 11 tools by key content area

This process was performed by the two reviewers for each tool. The reviewers discussed any inconsistencies in order to achieve consensus. One reviewer then performed a quality control audit by returning to the tool content to verify that the score matched the content. Using our final classifications, we then calculated numerical scores to answer two questions:

  • How accurate and comprehensive was the content of each of the tools?

  • How accurate and comprehensive was the content in each content area across all tools?

To calculate tool scores, we multiplied each key content area weight (Table 2) by the numerical assessment value, calculated the sum of these products, and divided this value by the total number of key content areas (i.e., 12) (Eq. 1).

$$\text{tool score} = \sum \left(\text{key content area weight} \times \text{assessment value}\right) \,/\, \left(\#\ \text{content areas}\right)$$
(1)

To score key content areas, we calculated the sum of numerical assessment values across tools and divided this value by the number of tools that received a score for that content area (Eq. 2).

$$\text{content area score} = \sum \left(\text{assessment value}\right) \,/\, \left(\#\ \text{tools}\right)$$
(2)
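Both equations can be sketched in a few lines of Python. This is an illustrative reconstruction only; the weights and assessment values passed in are hypothetical, not the actual values from Table 2 or Table 3:

```python
def tool_score(weights, values):
    """Eq. 1: weighted sum of assessment values (+1/0/-1) across the
    key content areas, divided by the number of content areas."""
    return sum(w * v for w, v in zip(weights, values)) / len(weights)

def content_area_score(values):
    """Eq. 2: sum of one content area's assessment values across tools,
    divided by the number of tools scored on that area."""
    return sum(values) / len(values)
```

Note that under Eq. 1, a tool scored green (+1) in every content area earns the mean of the weights, which is consistent with the maximum possible tool score of 0.82 reported in the Results.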

These numerical scores were then used to identify the most accurate and comprehensive content areas across the tools and to identify the strongest overall user-facing digital tools. Based on expert opinion and the pattern we observed across tools, we set a quality threshold: to be recommended for adaptation and scale-up, tools could not have four or more errors in more than six content areas. In practice, this meant that tools scoring below -0.1 were deemed of insufficient quality and were not recommended for adaptation and scale-up.

One analyst reviewed the summary of omissions and inaccuracies included in the master table to populate insights shared in Table 5 relating to common omissions or inaccuracies. To provide examples of specific content, the analyst referred back to each tool.

Results

We contacted 29 tool owners about participating in the review and received content for 24 (83%) of the user-facing tools (see Fig. 1). The aforementioned team of two analysts reviewed each tool and applied the above-listed eligibility criteria, determining 11 tools were eligible for inclusion.

Fig. 1
figure 1

Landscaping user-facing tools

Of the tools included in the review, 8 were designed with youth in mind, while 3 were oriented to people of reproductive age. Table 3 synthesizes the findings from the analysis of key content areas across the tools.

Of the 12 key content areas we identified, five were included in each of the 11 tools; the tools addressed the remaining six content areas inconsistently or, in the case of side effect management, not at all (see Table 3). Four content areas were the most accurate and comprehensive: complete list of modern methods, duration of protection, dual method use, and return to fertility. The lowest scoring content areas were side effect management (not addressed in any of the tools), non-contraceptive benefits, effectiveness, side effects, and instructions for use.

The FP content of digital tools varied greatly in accuracy and comprehensiveness, with overall tool scores ranging from 0.28 to -0.52. Possible tool scores ranged from 0.82 to -0.82. Table 4 synthesizes the results from the analysis of the quality of each tool’s family planning content.

Table 4 Scoring of FP content by tool

We developed a table that included details of specific omissions and inaccuracies by content area for each tool. This information is synthesized in Table 5, which outlines trends in the inaccuracies and omissions across content areas in the tools we reviewed.

Table 5 Qualitative content area trends

Discussion

Quality assurance is a foundational element of quality care for FP [6]. As FP self-care strategies, including digital ones, become more common, the FP community must determine how to translate principles of quality of care into these new contexts. Both the Digital Self-care Framework and the Quality of Care Framework for Clients and Providers in the Delivery of Self Care lay out standards for client safety, indicating that client communication tools should be medically accurate and aligned with national and international guidelines [7, 8]. While this work focuses on accuracy and comprehensiveness, additional factors in the development of tool content are critical, including design and content tailored to the user’s context, literacy level, and preferences [9]. Because quality of care is an important factor influencing uptake and continued use of family planning methods [10], the potential implications of inadequate or incorrect information in digital tools are far-reaching.

We assessed 11 digital, user-facing tools for the quality of their FP content (by tool and by content area). Quality varied significantly by content area, with the most accurate and comprehensive areas being naming all available modern FP methods and duration of protection. The content areas with the lowest scores included side effect management and non-contraceptive benefits. Content areas that required more technical or clinical information, such as side effects, effectiveness, and instructions for use, were more likely to include errors in the tools we assessed. These findings resonate with other assessments that have found the FP/RH content of digital applications insufficiently complete or accurate [11, 12].

Misinformation or incomplete information about FP can contribute to dire consequences for those exposed to it, including unintended pregnancy and sexually transmitted infection. Accurate, comprehensive information is an essential component of informed decision-making about RH [13]. For example, when we consider effectiveness, it is important for potential users to know that the effectiveness of some methods depends heavily on their ability to use those methods consistently and correctly. Therefore, tool content that presents perfect-use effectiveness statistics without also including typical-use statistics can be misleading. The tiered presentation of methods as most effective, moderately effective, and least effective can be considered directive, subjective, or inaccurate when presented out of the context of all methods [14].

The absence of some information in these digital tools represents a missed opportunity to educate individuals on important considerations and to address prevalent concerns. For example, three tools lacked comprehensive and accurate guidance related to dual method use. In the case of fertility awareness methods, users should be aware that they may still be at risk of STI/HIV (considering these methods include defined periods of unprotected sex when pregnancy is unlikely). As another example, no tools provided accurate information about non-contraceptive benefits such as improved menstrual regularity, reduced cramping and pain, and protection from certain types of gynecological cancers [15]. Provision of this knowledge is highly pertinent, as many users are drawn to non-contraceptive benefits when choosing a method and widespread misconceptions persist surrounding hormonal methods and cancer risk [15,16,17]. Non-contraceptive benefits should be covered systematically to reduce harmful myths and misconceptions and to help those who are deciding on an FP method to consider all advantages.

Side effects, both perceived and experienced, can lead to discontinuation or can prevent users from adopting a method [18,19,20]. Digital tools offer an opportunity to support informed choice and contraceptive continuation by providing users with accurate and comprehensive information on side effects before and after they adopt a method, so users know what to expect and are prepared if side effects occur. While discussion of rare complications can be an important component of comprehensive digital FP content (and perhaps its own content area), presenting these as side effects is incorrect and can unnecessarily frighten users. Digital tools are well positioned to support self-care, particularly when they include after-care instructions and reassurance regarding common side effects (many of which can be managed with simple, over-the-counter medications or through other self-managed approaches).

Despite the emphasis on quality in existing frameworks for digital self-care, to our knowledge, no resources are available to assess whether the FP content of a given digital tool is of high quality. Global guidance documents [21,22,23] provide up-to-date family planning information with a high level of detail but they are not designed to serve as quality assurance tools, or to be easily converted into a user-friendly, digital format. Additionally, FP information is dynamic: new contraceptives are being added to the method mix and new evidence leads to changes in recommendations (e.g., the duration of protection of a method is extended, or drug interaction guidance is updated, or timing of initiation is changed). It can therefore be challenging to source and maintain accurate, sufficient, up-to-date content that is also relevant to the intended audience.

Our exercise resulted in a list of existing (as of April 2022) user-facing digital tools (see Appendix 1) which have high-quality FP content and can therefore be promoted or used by governments, donors, and FP implementers. This exercise also led us to develop a simplified checklist from our original rubric (see Appendix 2) that can help these same stakeholders to assess and improve the FP content of user-facing digital tools that were not assessed as part of this exercise. The checklist can also be used as an outline for those developing new digital tools to ensure that key content areas are included; global FP resource documents can then serve as a source for the actual, up-to-date technical content. While none of the tools we reviewed cited their content sources, we recommend doing so, as providing citations can reassure both users and clinicians, thereby making the tool more likely to be used as intended.

Limitations

While we conducted an in-depth search of multiple databases to identify eligible tools to include in our review, we may have missed some existing tools. Our criteria also limited the tools that were eligible for inclusion. For example, we did not consider tools available in languages other than English or French. However, the tools we reviewed are deployed in multiple contexts and nine languages in total, which extends the possible benefits of this analysis to populations speaking those languages. In addition, more complex tools, such as those with dynamic, user-generated, or influencer-generated content, were not eligible for inclusion, yet these are an important source of FP information for some users and their content should also be reviewed and assessed. Tool owners had to agree to participate in the assessment and share their content with the study team. The assessment is based on the content that was present at the time of review; tool content may have changed since. While a trained medical advisor assigned weights to content areas based on experience and evidence regarding the essential components of informed choice counseling and their relative importance from a biomedical perspective, this could be considered subjective and likely differs from individual user priorities.

Conclusion

There has been rapid growth over the past decade in the number of digital tools that support FP knowledge and access. There has been similarly rapid growth in enthusiasm for these types of tools—including as part of guidelines and efforts to support self-care—which has led to resources that can help interested governments, donors, and program implementers to understand the current evidence base and to identify existing digital FP tools. However, to our knowledge, there has not been similar attention paid to assessing the quality of the FP content that these digital tools provide. Supervision and quality assurance are routine parts of assessing and strengthening in-person FP services. Now, the global FP community needs to incorporate ways to do the same for digital FP tools.