Abstract
Background
This study aimed to determine the quality and reliability of DCR-related YouTube videos as patient education resources and to identify any factors predictive of video quality.
Methods
A YouTube search was conducted using the terms “Dacryocystorhinostomy, DCR, surgery” on 12 January 2022, with the first 50 relevant videos selected for inclusion. For each video, the following were collected: video hyperlink, title, total views, months since the video was posted, video length, total likes/dislikes, authorship (i.e. surgeon, patient experience or media company) and number of comments. The videos were graded independently by a resident, a registrar and an oculoplastic surgeon using three validated scoring systems: the Journal of the American Medical Association (JAMA) benchmark, DISCERN, and Health on the Net (HON).
Results
The average number of video views was 22,992, the mean video length was 488.12 s, and videos received an average of 18 comments each. The consensus JAMA, DISCERN and HON scores were 2.1 ± 0.6, 29.1 ± 8.8 and 2.7 ± 1.0, respectively, indicating that the included videos were of low quality; however, only the DISCERN scores showed good interobserver agreement. Videos posted by surgeons had higher mean JAMA and HON scores than those posted by non-surgeons. No other factors were associated with the quality of educational content.
Conclusion
The quality and reliability of DCR-related content for patient education are relatively low. Based on this study’s findings, patients should be encouraged to view videos created by surgeons or specialists in preference to other sources on YouTube.
Introduction
Dacryocystorhinostomy (DCR) is a common surgical procedure to relieve symptoms of epiphora (watery eyes) secondary to nasolacrimal duct obstruction (NLDO) [1]. The incidence of NLDO has been reported to be 20.24 per 100,000 [2].
Medical information relating to surgical procedures is often complicated, which can leave patients feeling overwhelmed after being provided information by their doctor [3]. A systematic review of the literature on patients’ understanding of their surgical procedures found that only 29% (6/21) of studies noted an acceptable understanding of procedure-related information by participants [4]. The lack of understanding in the case of DCR is likely due to the complexity of the steps involved in the procedure and the intricate anatomy. This may result in patients seeking additional information from multimedia platforms such as YouTube.
Online information searches are a common source of medical information for patients, with YouTube being a popular resource [5,6,7]. At present, no regulation or peer review process is required before health-related videos are uploaded to the platform. Therefore, if the reliability and quality of uploaded videos are of a low standard, there is a risk of patients absorbing incorrect or biased information. The educational quality of YouTube videos for patients has previously been explored, with most studies reporting content to be of low quality across a diverse range of medical and surgical procedures [8,9,10,11].
To our knowledge, this is the first study to assess the educational quality of DCR-related videos on YouTube. We aim to determine the quality and reliability of DCR surgery videos using validated scoring systems. In addition, we aim to determine whether any factors are predictive of higher quality educational content for patients.
Methods
A YouTube search was conducted using the words “Dacryocystorhinostomy, DCR, surgery” on 12 January 2022. The search was conducted in English (United States), with no filters applied to the search criteria. The first 50 relevant videos that were in English, at least 6 months old, and not duplicated were selected for inclusion in this study.
For each video, the following were collected: video hyperlink, title, total views, months since the video was posted, video length in seconds, total likes, total dislikes, authorship (i.e. surgeon, patient experience or media company) and number of comments.
The videos were graded independently by an ophthalmology resident, a registrar and an oculoplastics specialist surgeon. Each assessor scored the 50 videos using three ranking systems: the Journal of the American Medical Association (JAMA) benchmark, DISCERN, and Health on the Net (HON). A sample size of 50 videos was used to maintain consistency with previously published literature assessing the quality of YouTube videos as a source of patient education [8, 12,13,14,15]. The JAMA benchmark comprises four criteria (i.e., authorship, attribution, currency, and disclosure), each scored as present or absent, giving a total score out of four [16]. DISCERN is a detailed questionnaire consisting of 16 questions, each rated on a 5-point scale, with the question ratings summed to arrive at a final score [17]. The HON system consists of eight criteria, such as authoritativeness, complementarity, privacy and advertising policy [18]; each criterion is scored either 0 or 1.
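As a concrete illustration of how the three instruments are tallied, the following Python sketch scores a single hypothetical video. All ratings shown are invented for illustration and do not come from the study data.

```python
# Hypothetical ratings for one video, showing how each instrument is tallied.

# JAMA: four criteria, each scored 1 if satisfied, for a total out of 4.
jama = {"authorship": 1, "attribution": 0, "currency": 1, "disclosure": 0}
jama_score = sum(jama.values())

# DISCERN: 16 questions, each rated 1-5; summing gives a total of 16-80.
discern_answers = [3, 2, 1, 4, 2, 3, 1, 2, 2, 1, 3, 1, 2, 2, 3, 2]
discern_total = sum(discern_answers)

# HON: eight criteria (authoritativeness, complementarity, privacy, ...),
# each scored 0 or 1, for a total out of 8.
hon = [1, 1, 0, 0, 1, 0, 0, 0]
hon_score = sum(hon)

print(jama_score, discern_total, hon_score)  # → 2 34 3
```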
In this study the consensus JAMA, DISCERN and HON scores were calculated as means ± SD. To determine interobserver reliability, the intraclass correlation coefficient (ICC) was calculated for each scoring system, with a value of < 0.40 considered poor, 0.40–0.59 fair, 0.60–0.74 good and > 0.75 excellent. The mean JAMA, HON and DISCERN scores were then used as the consensus scores for all further analyses. Linear regression was employed to determine whether scores were associated with view count, months online, video length, positivity (likes / (likes + dislikes)) and number of comments. The association between authorship (i.e., surgeon versus non-surgeon) and JAMA, DISCERN and HON scores was assessed using independent-sample t-tests. In all statistical analyses, P < 0.05 was considered statistically significant.
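The regression and group-comparison steps described above can be sketched as follows. The per-video numbers are synthetic stand-ins (the real dataset is not reproduced here), and SciPy’s `linregress` and `ttest_ind` are assumed as implementations of simple linear regression and the independent-samples t-test.

```python
import numpy as np
from scipy import stats

# Synthetic stand-in data for five videos (the study used 50).
likes    = np.array([120.0, 45.0, 300.0, 10.0, 80.0])
dislikes = np.array([5.0, 2.0, 20.0, 1.0, 3.0])
views    = np.array([5000.0, 1200.0, 90000.0, 300.0, 7500.0])
discern  = np.array([31.0, 24.5, 42.0, 18.0, 29.5])    # consensus scores
surgeon  = np.array([True, True, False, True, False])  # authorship

# Positivity as defined in the Methods: likes / (likes + dislikes).
positivity = likes / (likes + dislikes)

# Linear regression of consensus DISCERN score on view count.
reg = stats.linregress(views, discern)

# Independent-samples t-test: surgeon- vs non-surgeon-authored videos.
t_stat, p_val = stats.ttest_ind(discern[surgeon], discern[~surgeon])

print(positivity.round(3), reg.pvalue, p_val)
```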
Results
The total number of views was 708,406 across the 50 selected videos, with a mean of 22,992 ± 37,396 views per video (range 305–198,477). The mean number of months the videos had been online was 64.87 ± 27.22 (range 20.65–137.52). The average length of the included videos was 488.12 ± 472 s (range 15–2035), and videos received a mean of 18 ± 76 comments (range 0–539). The overall positivity (i.e., the proportion of likes out of total likes and dislikes) across all selected videos was approximately 95.3%, with 76% of videos created by surgeons or ophthalmologists (see Fig. 1).
The consensus JAMA, DISCERN and HON scores were 2.1 ± 0.6, 29.1 ± 8.8 and 2.7 ± 1.0, respectively. In this study, only the DISCERN scoring system had an ICC value > 0.75 (0.852). The JAMA and HON systems correlations produced scores of 0.544 and 0.578, respectively.
The highest mean score for the included videos on the JAMA criteria was for “currency” and the lowest was for “attribution” (see Table 1 for mean scores of each JAMA criterion). For the HON criteria, “confidentiality” had the highest mean score and “justifiability” had the lowest (see Table 2). Of the 16 DISCERN questions, question 2 (“Does the video achieve its aims?”) had the highest mean score for included videos and question 12 (“Does it describe what would happen if no treatment is used?”) had the lowest (Table 3).
The following variables demonstrated no association with JAMA, DISCERN or HON scores: view count, months online, video length, positivity, and number of comments. The only variable demonstrating an association was video authorship (surgeons versus non-surgeons). Videos uploaded by surgeons had higher JAMA and HON scores, with p-values < 0.05 (see Fig. 2). Interestingly, DISCERN scores demonstrated no significant difference between videos posted by surgeons and those posted by non-surgeons (P = 0.367).
Discussion
In this study, the average quality of DCR surgery educational videos, using three validated assessment tools, was poor. We found 76% of videos were authored by surgeons, and two assessment tools (JAMA and HON) demonstrated these videos were superior to those posted by non-surgeons.
In ophthalmology, several studies have assessed the quality of YouTube educational resources, concluding that the overall quality is low [12, 19,20,21,22]. In cataract surgery, Bae et al.’s [12] findings were similar to our study, with 71% of videos produced by medical professionals and the overall educational quality of videos deemed low. Mangan et al. [21] focussed on strabismus educational content and found that only 28.5% of videos were produced by ophthalmologists, with videos published by academic institutions having the highest quality and reliability scores. In the context of oculoplastics, the quality of eyelid blepharoplasty videos has been assessed. Karataş et al. [22] reported that 79% of videos were uploaded by physicians, with higher DISCERN scores than in our study (45.06 ± 12.88) and lower JAMA scores (1.39 ± 1.06). Garip et al. [23] evaluated eyelid ptosis surgery videos and likewise concluded that the available videos were not a reliable source of educational information. Similar to our study, videos created by physicians or surgeons were of higher quality and educational value.
Notably, the benefit of certain videos is not considered within the scope of the assessment tools employed. Although some videos may score well on the basis of the scoring criteria, these tools do not assess the specific content within the videos. Therefore, whilst a video may contain all technical aspects (e.g. date, author), it may lack the correct information (e.g. the surgical steps involved). The tools also do not consider the relevance to the specific user, i.e. medical professional versus patient. Furthermore, the tools do not account for the emotional impact a video may have on a patient; for example, if a reported experience is negative, it may disproportionately increase a patient’s concerns, anxiety or perceived risk of complications related to a surgical procedure. Another consideration is that none of the assessment tools used consider whether all the information related to a procedure is present within the specific video. Given this, there may be a need to develop a dedicated assessment tool for surgical education videos which incorporates all of the factors above. Additionally, the DISCERN and JAMA scoring systems were specifically designed for assessing the quality of written materials; the validity of these tools when applied to multimedia educational material (i.e. YouTube videos) is therefore not known.
Winker et al. [24] have previously published guidelines for authors uploading medical and health-related information on the internet. Many of the assessment tools’ criteria are included in these guidelines, including the date content was posted and the source of funding or ownership. When considering the quality of content, the guidelines recommend some form of external peer review by experts in the field; this is a particular aspect lacking in the assessment tools employed. In future, these guidelines may be used as the basis for the development of an assessment tool to precisely evaluate the quality of surgical education videos. An adaptation of the Ensuring Quality Information for Patients (EQIP) tool, used for assessing the quality of written medical information (e.g. pamphlets), could be employed for online educational resources [25]. This would have to be with the caveat of the tool being validated prior to its widespread implementation.
A limitation of our study is that it only assessed the first 50 videos related to DCR surgery. Therefore, our findings regarding the quality and reliability of DCR related videos as patient education content may not necessarily be able to be extrapolated to videos beyond the first 50. It is, however, most likely for a patient to select one of the first 50 videos to view as a source of further information on their DCR procedure. Furthermore, YouTube’s algorithm for displaying the first 50 videos on each search of DCR surgery is not entirely predictable or consistent. As a result, some videos assessed in this study may not appear in a randomly selected patient’s top 50 searches. Moreover, only videos in English were reviewed; hence it is impossible to conclude the quality of educational videos in other languages. This limits the utility of our findings in non-English speaking regions, as searches in these geographic locations may yield a significantly different set of videos.
In this study only the DISCERN ICC was considered excellent, suggesting that scores from all three assessors were very comparable. The ICCs for the JAMA and HON scores were considered fair, suggesting more inter-assessor variability when scoring according to these systems and thereby limiting their reliability in the context of this study. Previous studies have also demonstrated lower ICC scores for HON and JAMA [8, 9]. A possible explanation may be that these criteria have only two options (yes or no) and fewer questions than DISCERN, so any disagreement between assessors has a proportionally larger impact on the final score. Furthermore, the DISCERN tool provides more context and explanation for each criterion, limiting the impact of assessors’ individual interpretation of questions on the overall score.
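For readers interested in the agreement statistic itself, a two-way random-effects ICC can be computed directly from an n-videos × k-raters matrix. The sketch below implements the Shrout–Fleiss ICC(2,1) form, which is one common choice; the paper does not specify which ICC variant was used, and the ratings matrix is invented for illustration.

```python
import numpy as np

def icc2_1(x: np.ndarray) -> float:
    """Shrout-Fleiss ICC(2,1): two-way random effects, single rater,
    absolute agreement. x has shape (n targets, k raters)."""
    n, k = x.shape
    grand = x.mean()
    row_means = x.mean(axis=1)  # per-video means
    col_means = x.mean(axis=0)  # per-rater means

    ss_total = ((x - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()  # between-video
    ss_cols = n * ((col_means - grand) ** 2).sum()  # between-rater
    ss_err = ss_total - ss_rows - ss_cols           # residual

    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))

    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Invented ratings: 4 videos scored by 3 raters.
ratings = np.array([[2.0, 3.0, 2.0],
                    [4.0, 4.0, 5.0],
                    [1.0, 2.0, 1.0],
                    [3.0, 3.0, 4.0]])
print(round(icc2_1(ratings), 3))
```

Perfect agreement between raters yields an ICC of 1.0, and values fall as rater disagreement grows relative to the spread between videos.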
Conclusion
YouTube provides patients with easy access to a range of DCR surgery related videos. The quality and reliability of DCR-related content for patient education are relatively low. Given the current trend for patients to seek additional information on the internet, specialists should consider counselling patients on the limited quality of DCR surgery content on YouTube and providing factual information through pamphlets or college-instituted videos. Furthermore, based on this study’s findings, patients should be encouraged to view videos created by specialists in preference to other sources of content. Finally, there is a strong need to devise a consensus on a dedicated scoring tool for online educational videos. Such a tool should assess the appropriateness, accuracy and completeness of the information provided about the specific treatment being discussed.
References
Yakopson VS, Flanagan JC, Ahn D et al (2011) Dacryocystorhinostomy: history, evolution and future directions. Saudi J Ophthalmol 25(1):37–49. https://doi.org/10.1016/j.sjopt.2010.10.012
Woog JJ (2007) The incidence of symptomatic acquired lacrimal outflow obstruction among residents of Olmsted County, Minnesota, 1976–2000 (an American Ophthalmological Society thesis). Trans Am Ophthalmol Soc 105:649–666
Goldblum K (1992) Knowledge deficit in the ophthalmic surgical patient. Nurs Clin North Am 27(3):715–725
Falagas ME, Korbila IP, Giannopoulou KP et al (2009) Informed consent: how much and what do patients understand? Am J Surg 198(3):420–435. https://doi.org/10.1016/j.amjsurg.2009.02.010
Van Riel N, Auwerx K, Debbaut P et al (2017) The effect of Dr Google on doctor-patient encounters in primary care: a quantitative, observational, cross-sectional study. BJGP Open 1(2):bjgpopen17X100833. https://doi.org/10.3399/bjgpopen17X100833
Murray E, Lo B, Pollack L et al (2003) The impact of health information on the internet on the physician-patient relationship: patient perceptions. Arch Intern Med 163(14):1727–1734. https://doi.org/10.1001/archinte.163.14.1727
Osman W, Mohamed F, Elhassan M et al (2022) Is YouTube a reliable source of health-related information? A systematic review. BMC Med Educ 22(1):382. https://doi.org/10.1186/s12909-022-03446-z
Ovenden CD, Brooks FM (2018) Anterior cervical discectomy and fusion youtube videos as a source of patient education. Asian Spine J 12(6):987–991. https://doi.org/10.31616/asj.2018.12.6.987
Gupta AK, Kovoor JG, Ovenden CD et al (2022) Paradigm shift: Beyond the COVID-19 era, is YouTube the future of education for CABG patients? J Card Surg 37(8):2292–2296. https://doi.org/10.1111/jocs.16617
Fischer J, Geurts J, Valderrabano V et al (2013) Educational quality of YouTube videos on knee arthrocentesis. J Clin Rheumatol 19(7):373–376. https://doi.org/10.1097/RHU.0b013e3182a69fb2
Ho M, Stothers L, Lazare D et al (2015) Evaluation of educational content of YouTube videos relating to neurogenic bladder and intermittent catheterization. Can Urol Assoc J 9(9–10):320–354. https://doi.org/10.5489/cuaj.2955
Bae SS, Baxter S (2018) YouTube videos in the English language as a patient education resource for cataract surgery. Int Ophthalmol 38(5):1941–1945. https://doi.org/10.1007/s10792-017-0681-5
Starks C, Akkera M, Shalaby M et al (2021) Evaluation of YouTube videos as a patient education source for novel surgical techniques in thyroid surgery. Gland Surg 10(2):697–705. https://doi.org/10.21037/gs-20-734
Jamleh A, Nassar M, Alissa H et al (2021) Evaluation of YouTube videos for patients’ education on periradicular surgery. PLoS ONE 16(12):e0261309. https://doi.org/10.1371/journal.pone.0261309
Green L, Noll D, Barbaro A et al (2022) YouTube-friend or Foe? A closer look at videos on inguinal hernia surgery as a source for patient education. J Surg Res 280:510–514. https://doi.org/10.1016/j.jss.2022.07.024
Corcelles R, Daigle CR, Talamas HR et al (2015) Assessment of the quality of Internet information on sleeve gastrectomy. Surg Obes Rel Dis 11(3):539–544. https://doi.org/10.1016/j.soard.2014.08.014
Charnock D, Shepperd S, Needham G et al (1999) DISCERN: an instrument for judging the quality of written consumer health information on treatment choices. J Epidemiol Community Health 53(2):105–111. https://doi.org/10.1136/jech.53.2.105
Boyer C, Selby M, Scherrer JR et al (1998) The Health On the Net Code of Conduct for medical and health Websites. Comput Biol Med 28(5):603–610. https://doi.org/10.1016/s0010-4825(98)00037-7
Altunel O, Sirakaya E (2021) Evaluation of YouTube videos as sources of information about multifocal intraocular lens. Semin Ophthalmol 36(5–6):423–428. https://doi.org/10.1080/08820538.2021.1900281
Yildiz MB, Yildiz E, Balci S et al (2021) Evaluation of the Quality, Reliability, and Educational Content of YouTube Videos as an Information Source for Soft Contact Lenses. Eye Contact Lens 47(11):617–621. https://doi.org/10.1097/icl.0000000000000795
Mangan MS, Cakir A, Yurttaser Ocak S et al (2020) Analysis of the quality, reliability, and popularity of information on strabismus on YouTube. Strabismus 28(4):175–180. https://doi.org/10.1080/09273972.2020.1836002
Karataş ME, Karataş G (2022) Evaluating the reliability and quality of the upper eyelid blepharoplasty videos on YouTube. Aesthetic Plast Surg 46(2):754–759. https://doi.org/10.1007/s00266-021-02504-z
Garip R, Sakallioğlu AK (2022) Evaluation of the educational quality and reliability of YouTube videos addressing eyelid ptosis surgery. Orbit 41(5):598–604. https://doi.org/10.1080/01676830.2021.1989467
Winker MA, Flanagin A, Chi-Lum B et al (2000) Guidelines for medical and health information sites on the internet: principles governing AMA web sites. JAMA 283(12):1600–1606. https://doi.org/10.1001/jama.283.12.1600
Moult B, Franck LS, Brady H (2004) Ensuring quality information for patients: development and preliminary validation of a new instrument to improve the quality of written health care information. Health Expect 7(2):165–175. https://doi.org/10.1111/j.1369-7625.2004.00273.x
Funding
Open Access funding enabled and organized by CAUL and its Member Institutions. The authors declare that no funds, grants, or other support were received during the preparation of this manuscript.
Author information
Authors and Affiliations
Contributions
Dinesh Selva and Christopher Ovenden contributed to the study conception and design. Material preparation, data collection were performed by Gurfarmaan Singh, Raghav Goel and Yinon Shapira. Data analysis was performed by Joesph Hewitt. The first draft of the manuscript was written by Gurfarmaan Singh and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.
Corresponding author
Ethics declarations
Competing interests
The authors have no relevant financial or non-financial interests to disclose.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Singh, G., Goel, R., Shapira, Y. et al. Dacryocystorhinostomy videos on YouTube as a source of patient education. Int Ophthalmol 44, 192 (2024). https://doi.org/10.1007/s10792-024-03139-0