Hysteroscopy in the new media: quality and reliability analysis of hysteroscopy procedures on YouTube™

Background: Hysteroscopy plays a crucial role in diagnosing and managing various intrauterine pathologies. However, its execution can be influenced by patients' perception and understanding, which are often shaped by digital resources such as YouTube™. Given its popularity and accessibility, YouTube™ has the potential to greatly influence patients' knowledge and expectations about this procedure, highlighting the need for accurate and reliable information.

Purpose: This study aims to assess the reliability and quality of hysteroscopy information available to patients on YouTube™. Understanding the nature of the information patients access can help address their fears and potential misunderstandings about the procedure, consequently reducing the likelihood of suspension or postponement due to anxiety.

Methods: A comprehensive analysis of YouTube™ was conducted, simulating the search process of a patient seeking information about hysteroscopy. The study evaluated the reliability and quality of 90 of the first 100 hysteroscopy-related videos on YouTube™, scored by four gynecologists: two experienced hysteroscopists and two trainees. The videos were assessed for reliability and quality using the mDISCERN and Global Quality Scale (GQS) scores.

Results: The average mDISCERN and GQS scores for the evaluated videos were below the optimal three points, highlighting the lack of fluency, comprehensiveness, and reliability of the available information. Notably, while videos produced by experts, including doctors and professional channels, had higher scores, they still fell short of the minimum score of 3 and were not considered more suitable for either patients or trainees. Videos assessed as reliable (mDISCERN ≥ 3) were longer, were more frequently produced by doctors, and were recommended more often to trainees than to patients. Similarly, videos deemed fluent and comprehensive (GQS ≥ 3) were longer and were more often recommended to patients.

Conclusions: While YouTube™ is a widely used source of medical information, the quality and reliability of hysteroscopy videos on the platform are poor. The strategic use of selected, high-quality hysteroscopy videos can enhance procedure success and alleviate patient fears. However, the unsupervised discovery of information by patients could lead to procedure failure or elevated stress due to misleading or incorrect information.


Introduction
The advent and widespread use of the internet have facilitated easy access to a vast array of online resources, transforming them into a significant wellspring of health information for patients and caregivers [1][2][3][4][5][6][7]. Evidence from the Health Information National Trends Survey (HINTS) underscores this trend, highlighting an exponential increase in individuals seeking health-related information online [8]. Pew Research Center surveys corroborate this, reporting that three-quarters of online health information seekers had treatment choices shaped by the insights gleaned from their online findings [3]. Similarly, in 2020, 55% of EU citizens reported that they had sought online health information relating to injury, disease, nutrition, health improvement, or similar topics [9].
YouTube™ has emerged as a predominant platform among the myriad of online resources. In 2022, YouTube™ counted over 2.56 billion users accessing its video content worldwide [10]. However, YouTube™'s model does not regulate the credibility of content creators, potentially leading to the dissemination of unverified or non-expert content. This is further exacerbated by the lack of a peer-review process for content uploaded on YouTube™, allowing registered users to post content at their discretion.
In light of this scenario, concerns have been raised by healthcare providers and regulatory bodies regarding the accuracy and quality of the accessible information, particularly given the prevalent sharing of anecdotal experiences and personal viewpoints [11]. This becomes particularly significant when considering specific medical procedures, such as hysteroscopy, susceptible to misinformation and misunderstanding.
Initially performed only in the operating room under general anesthesia, hysteroscopy has gradually transitioned to an office setting due to advancements in technology, such as the miniaturization of endoscopes and improvements in optics and surgical techniques [22,23]. This transition has helped mitigate the need for hospital admission, preoperative tests, and general or regional anesthesia, reducing the postsurgical recovery period, overall procedure cost, and complication rate [24]. However, managing patient anxiety remains a critical challenge in completing office hysteroscopy, given that it can intensify pain perception and limit procedural tolerance [25].
Despite the widespread application of hysteroscopy and the common use of YouTube™ as a health information resource, no published studies have analyzed the information available on YouTube™ regarding this procedure.
Therefore, our study aims to examine the reliability and quality of hysteroscopy-related information on YouTube™. We analyzed YouTube™ content, simulating the information-seeking process of a patient preparing to undergo a hysteroscopy. The goal is to understand how patients gather information about the procedure and how to alleviate their fears before undergoing hysteroscopy to avoid the necessity of procedure suspension or postponement. Our analysis can shed light on the influence of YouTube™ as a medical information dissemination platform and contribute valuable insights to enhance patient preparedness for hysteroscopy.

Video selection
The YouTube™ search was conducted on June 3, 2023, using the term "hysteroscopy".
The YouTube™ setting was "global", and no filters were used, in order to faithfully reproduce the search of a hypothetical patient offered the hysteroscopy procedure. We limited our search to the first 100 videos in line with existing literature suggesting that only 8% of internet users continue their search beyond this number [26]. Our selection process was based on several inclusion and exclusion criteria. To qualify for inclusion, videos had to be in English or without audio and had to primarily concern hysteroscopy. Conversely, exclusion criteria encompassed videos in languages other than English, videos unrelated to hysteroscopy, and duplicates. Of the 100 videos evaluated, 90 met the inclusion criteria and were selected for further analysis. The remaining ten videos were eliminated from our review for the following reasons: 7 videos were in a language other than English, 2 were duplicates, and 1 video was unrelated to hysteroscopy (Fig. 1). The complete playlist of the 90 selected videos is available at the following link: https://www.youtube.com/playlist?list=PLPGE3LF2gQpk-9U2qYbN259_aLqgl96Qv.
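The screening step above can be sketched as a simple filter over the search results. This is an illustrative sketch only, not the authors' actual workflow; the field names (`video_id`, `language`, `topic`) are hypothetical.

```python
# Sketch of the video screening step: of the first 100 search results, keep
# English-language (or silent) videos primarily about hysteroscopy; exclude
# other-language videos, off-topic videos, and duplicates.
# All record fields here are illustrative assumptions.

def screen_videos(results):
    """Apply the study's inclusion/exclusion criteria to a list of results."""
    selected, seen_ids = [], set()
    for video in results[:100]:                    # only the first 100 results
        if video["video_id"] in seen_ids:          # exclude duplicates
            continue
        if video["language"] not in ("en", None):  # English or no audio only
            continue
        if video["topic"] != "hysteroscopy":       # must concern hysteroscopy
            continue
        seen_ids.add(video["video_id"])
        selected.append(video)
    return selected
```

Applied to the study's search, such a filter would reduce the initial 100 results to the 90 analyzed videos.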
Ethics committee approval was not required as this study included no human participants, and the videos were publicly accessible.

Data collection
A data extraction process was conducted on the selected 90 videos to gather a wide array of information. The specific parameters extracted from each video included:

- Language: the language in which the video was presented was documented. In alignment with our inclusion criteria, all the selected videos were in English.
- Production source: videos were classified based on their production source, including professional entities (doctors, professional channels, hospitals) and non-professional sources (internet or television platforms, or videos created by patients without professional backing).
- Purpose of the video: each video was categorized as either 'informative' or 'sponsor'. The latter category pertained to videos with an apparent commercial purpose.
- Presenter's gender: the gender of the video presenter was recorded.
- Time on YouTube™: the period for which each video had been accessible on YouTube™ was determined, measured in months.
- Video duration: the length of each video was documented, measured in seconds.
- Viewer engagement: several metrics were evaluated to assess viewer engagement, including the total number of views, number of likes, number of comments, and the number of subscribers of the uploader.

Evaluation of video reliability and content quality
The quality and reliability of the videos were evaluated by a team composed of two experienced gynecologists (SGV and VR), who have each performed more than 100 diagnostic and operative hysteroscopic procedures, and two trainees in Obstetrics and Gynecology (AL and SS). To ensure the scientific reliability of the videos, we used the modified DISCERN scale [27]. This scale was initially designed by Charnock et al. to evaluate written health information [28]. The modified version (mDISCERN) consists of a five-question questionnaire (Table 1), with each affirmative response garnering one point towards a maximum of five [27]. Videos achieving an mDISCERN total score of three or above were deemed to provide reliable health information.
In addition, we used the Global Quality Scale (GQS) to grade the overall quality of the videos [29]. The GQS, a 5-point scale (Table 2), was developed to evaluate the fluency and comprehensiveness of information on the web [29]. A higher GQS score denotes content of better quality and informative value. Any video that scored three or more on the GQS was deemed to offer higher-quality health information.
The team independently and simultaneously watched and scored the videos over five days. Each team member also responded to two final questions: "Would you recommend this video to patients?" and "Would you recommend this video to resident doctors?"
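The scoring rules described above can be expressed as a short helper. This is a hypothetical illustration of the thresholds, not the authors' code: mDISCERN counts affirmative answers to five questions (reliable if ≥ 3), while GQS is a single 1-5 rating (higher quality if ≥ 3).

```python
# Illustrative sketch of the study's scoring rules (not the authors' code).

def mdiscern_score(answers):
    """answers: five booleans, one per mDISCERN question; score = number of yeses."""
    assert len(answers) == 5
    return sum(bool(a) for a in answers)

def classify_video(mdiscern_answers, gqs):
    """Return (reliable, high_quality) flags using the study's cut-off of 3."""
    assert 1 <= gqs <= 5
    return mdiscern_score(mdiscern_answers) >= 3, gqs >= 3
```

For example, a video with three affirmative mDISCERN answers but a GQS of 2 would be classed as reliable yet of low fluency and comprehensiveness.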

Statistical analysis
The normal distribution of the data was tested with the Shapiro-Wilk test. Continuous variables are presented as medians (interquartile range, IQR) and categorical variables as frequencies (proportions). The general features and assessment results of the videos are presented as the mean, standard deviation (SD), minimum (min), and maximum (max) for each variable. Data are presented stratified according to production source (experts vs. others), mean mDISCERN, mean GQS, and aim of the video (informative vs. sponsor). General features of the videos, reliability of the video content, and global quality were compared among the groups with the Wilcoxon-Mann-Whitney test for continuous variables and the Chi-square test for categorical variables, respectively. Statistical analyses were performed in the RStudio integrated development environment (RStudio Inc., Boston, MA, USA) for R software v. 3.5.3. All tests were two-sided, and the statistical significance level was set at p < 0.05.
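The analysis pipeline above can be sketched with SciPy equivalents of the R tests. The data below are invented for illustration (not the study's dataset), and the variable names are hypothetical.

```python
# Sketch of the study's statistical comparisons using SciPy equivalents of the
# R tests: Shapiro-Wilk for normality, Wilcoxon-Mann-Whitney for continuous
# variables across groups, and chi-square for categorical variables.
# All numbers below are illustrative, not the study's data.
from scipy import stats

# Hypothetical video durations (seconds) for two production-source groups
expert_durations = [120, 250, 300, 410, 520, 610, 700]
other_durations = [90, 150, 200, 230, 280]

# Normality check motivating the non-parametric comparison
_, p_normal = stats.shapiro(expert_durations)

# Wilcoxon-Mann-Whitney (Mann-Whitney U) test for a continuous variable
_, p_duration = stats.mannwhitneyu(expert_durations, other_durations,
                                   alternative="two-sided")

# Chi-square test for a categorical variable (e.g. recommended: yes/no)
contingency = [[10, 73],   # experts: recommended vs. not recommended
               [1, 6]]     # patients: recommended vs. not recommended
chi2, p_recommend, dof, _ = stats.chi2_contingency(contingency)

significant = p_duration < 0.05  # two-sided, alpha = 0.05
```

`wilcox.test` and `chisq.test` in R play the same roles; the two-sided p < 0.05 criterion matches the study's significance level.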

Results
The general characteristics of the 90 videos selected for the study, including their upload duration, length, views, likes, and the number of subscribers to the uploader, as well as the evaluation results (mDISCERN and GQS scores), are provided in Table 3. The mean scores for the two scales (mDISCERN and GQS) did not reach the minimum of 3 points, attesting to the low fluency and comprehensiveness (GQS) and reliability (mDISCERN) of the information presented in the videos, as judged by both residents and specialists. Nevertheless, specialists assigned higher average scores than trainees on both scales. Specifically, the overall mean mDISCERN score was 2.43 ± 0.79, with means of 2.38 ± 1.01 for residents and 2.47 ± 0.77 for specialists. Mean scores ranged from 0.25 to 4.00 overall, 0.50 to 4.50 for residents, and 0 to 4.00 for specialists.
In terms of GQS scores, the overall mean was 2.64 ± 0.83. Residents gave a mean score of 2.53 ± 0.93, and specialists gave a mean of 2.74 ± 0.90. Mean scores ranged from 1 to 4.75 overall, 1 to 5.00 for residents, and 1 to 4.50 for specialists.
We divided the video sources into two categories: Experts, comprising Doctors (n = 39), Professional Channels (n = 26), and Hospitals (n = 18); and Patients (n = 7). The features and assessment results were compared across these categories, as presented in Table 4.
For most metrics related to video characteristics (duration of upload, length, number of views, comments, likes, and subscribers), the median values were generally higher for the patient group than the expert group. However, the p-values associated with these comparisons were relatively high (ranging from 0.19 to 0.76), indicating no statistically significant differences between the two groups in these respects.
Videos produced by experts received significantly higher scores from both specialists and trainees on the two scales of information reliability (mDISCERN) and of fluency and comprehensiveness (GQS). Despite this, the average GQS and mDISCERN scores remained below 3 points even for expert-produced content, signifying low reliability and comprehensiveness. Regarding recommendation of videos to patients and resident doctors, most videos in both groups (92% for experts and 89% for patients) were not recommended.
The comparison of the general features of the videos according to mDISCERN is presented in Table 5. The videos with a higher mDISCERN score (≥ 3) tended to have a significantly longer median duration (459 s compared to 259 s).
There was no statistically significant difference between the group with an mDISCERN score ≥ 3 and the group with an mDISCERN score < 3 concerning video upload duration on YouTube™, number of views, comments, likes, and subscribers. The videos with higher mDISCERN scores were significantly more likely to be produced by doctors (66% vs 33%) compared to those with lower mDISCERN scores.

Table 1 mDISCERN questionnaire used to score video reliability [27]
2 Are reliable sources of information used? (i.e., publication cited, speaker is a specialist)
3 Is the information presented balanced and unbiased?
4 Are additional sources of information listed for patient reference?
5 Are areas of uncertainty mentioned?

Table 2 Global Quality Scale (GQS) criteria used to score video quality [29]
1 Low quality, poor flow, most information missing; not beneficial for patients
2 Generally low quality and poor flow of information; some information listed, but many important issues missing; very limited use for patients
3 Moderate quality, suboptimal flow of information; some important information sufficiently discussed, some poorly discussed; only somewhat useful for patients
4 Good quality and generally good information flow; most of the relevant information listed, but some topics not covered; useful for patients
5 Excellent quality and information flow; very useful for patients
There was not a significant difference between the two groups regarding the video being suggested to patients (p-value = 0.23). However, a higher percentage of videos with mDISCERN ≥ 3 (21%) was suggested for resident doctors compared to those with mDISCERN < 3 (2%). This difference was statistically significant (p-value = 0.006).
Notably, none of the videos with a higher mDISCERN score were produced by patients, while 11% of the videos with a lower score were patient-produced.

Table 6 shows the general features of videos based on their Global Quality Scale (GQS) scores, divided into two categories: score < 3 and score ≥ 3. The median upload duration on YouTube™ was longer for videos with a GQS score below 3, whereas the median video length was longer for those scoring three or higher. However, the two groups showed no statistically significant differences in these aspects or in the number of views, comments, likes, and subscribers, as suggested by the p-values.
When considering the production source, there was a larger percentage of doctor-produced videos in the group with GQS scores of 3 or higher (48%), although this difference was not statistically significant (p-value = 0.25). Notably, patient-produced videos only appeared in the group with GQS scores of less than three.
Importantly, all videos with a GQS score less than three were not recommended to patients, while 29% of videos with a GQS score of 3 or more were, representing a significant difference (p-value < 0.001). A similar pattern was observed for recommendations to resident doctors, with 2% of the videos with a GQS score of less than 3 and 19% with a GQS score of 3 or more recommended. This difference was also statistically significant (p-value = 0.01).

Discussion
This study is the first to focus on YouTube™ videos about hysteroscopy. We scrutinized 90 videos, each averaging nearly 96,000 views. Considering the increasing reliance of patients on online resources for health information, it is plausible that a significant portion of these views came from individuals seeking insight into hysteroscopy procedures [1][2][3][4][5][6][7]. Despite 83 of these 90 videos being produced by medical experts (hospitals, doctors, or professional medical channels), our evaluation found them lacking in terms of scientific reliability, clarity, and comprehensiveness, a significant revelation considering the growing role of YouTube™ in health information dissemination [3,5,7].
This finding aligns with similar analyses conducted on YouTube™ videos regarding other surgical procedures, such as robotic myomectomy [30], uterine leiomyoma surgeries [31] and hysterectomy [32]. These investigations, involving the assessment of 150, 137, and 66 videos respectively, underscored that YouTube™ might not be an optimal platform for disseminating accurate and comprehensible medical information to the public.
Although the focus of this study was restricted to YouTube™, a broader examination spanning Google™, Facebook™, LinkedIn, Instagram, and YouTube™ demonstrated a similar tendency [33]. A study exploring treatment options for overactive bladder syndrome revealed a discernible information gap across these platforms, with search results predominantly occupied by homeopathic and alternative medicine. This scenario underscores the need for trustworthy digital health information [33].
Furthermore, our analysis found that only a minimal fraction of the videos (7 out of 90) offered value to trainees, echoing an earlier study on obstetric and gynecological physical examinations, which found only 29 out of 176 analyzed videos useful for self-guided learning among medical students [34]. This implies that YouTube™ might not be an optimal learning tool for medical students and early-career practitioners, particularly in gynecology.

A troubling observation from our study was that the mean scores of the scrutinized videos did not reach the minimum threshold on the mDISCERN and GQS scales. This lack of quality was consistent regardless of whether the videos were produced by experts, underscoring that being an expert does not guarantee the delivery of reliable, clear, and comprehensive information. This could lead to misinformation or misunderstanding among patients seeking information about hysteroscopy online, subsequently affecting their decision-making process and causing unwarranted anxiety.
Preoperative anxiety, linked to a lack of information or misinformation, can influence patients' emotional states, potentially exacerbating pain perception during the procedure [35][36][37]. Despite the minimally invasive nature of hysteroscopy, it is essential to acknowledge that anxiety is not always avoidable and can be notably intense in women [38]. This anxiety, occasionally manifesting as increased pain sensitivity, can be triggered by various factors, including cervical dilatation, intrauterine pressure, manipulation, and emotional states [38][39][40].

Technological advancements have ushered in an era of new and smaller surgical devices, significantly contributing to the field of hysteroscopy. In particular, miniaturized mechanical instruments have been developed to optimize precision and efficacy across hysteroscopic procedures [41]. These advancements, alongside the diffusion of the vaginoscopic 'no touch' technique, which obviates the need for cervical manipulation with a speculum and tenaculum, alleviate potential pain and discomfort [42].

Pharmacological and non-pharmacological measures can be strategically employed in selected cases to manage pain and facilitate the examination process [43][44][45][46]. Pharmacological interventions such as local anesthetics, non-steroidal anti-inflammatory drugs (NSAIDs), and cyclooxygenase-2 inhibitors have been found to be efficacious when discomfort arises [47]. Non-pharmacological strategies include warming the distension medium, music diffusion, continuous procedural updates, or enabling the patient to view the procedure on a monitor [43][44][45][46][47]. Furthermore, enhancing doctor-patient communication and comprehensive patient education are practical non-pharmacological interventions to reduce preoperative anxiety and improve patient satisfaction [48,49].
However, the efficacy of these strategies could be undermined if the information available online, as our study indicates, is either insufficient or misleading. Therefore, providing accurate information from experts becomes paramount to prevent patients from seeking potentially incorrect information online and ensure a successful hysteroscopy procedure.

Conclusions
While our study provides valuable insights, we must acknowledge its limitations. It did not evaluate all available videos about hysteroscopy, and YouTube™'s dynamic nature means the pool of videos may have changed since our research was conducted. We only considered videos in English, potentially limiting the generalizability of our findings. Moreover, we did not explore the effect of video content on patient outcomes or satisfaction, which should be the focus of future research. Despite these limitations, the study underscores the need for accurate, reliable, high-quality online medical information. Relying solely on YouTube™ for patient education can potentially lead to misconceptions and unnecessary anxiety. As such, hysteroscopy experts need to harness digital platforms' power and deliver reliable, high-quality information to improve patient understanding and overall procedure success.
Guiding patients to selected, expert-approved videos can enhance their understanding, alleviate anxiety, and increase the success rate of procedures. Conversely, leaving patients uninformed or allowing them to find potentially misleading information can lead to stress and potential procedure failure.

Funding
Open access funding provided by Università degli Studi di Cagliari within the CRUI-CARE Agreement.

Data availability
The data that support the findings of this study are available on request from the corresponding author [SGV].

Conflict of interest
The authors have no relevant financial or non-financial interests to disclose.

Open Access
This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.