Pathology is considered the “gold standard” of diagnostic medicine. The importance of radiology-pathology correlation is reflected in interdepartmental patient conferences such as “tumor boards” and in the tradition of immersing radiology residents in a radiologic-pathology course at the American Institute for Radiologic Pathology. In practice, however, consistent pathology follow-up can be difficult due to time constraints and cumbersome electronic medical records. We present a radiology-pathology correlation dashboard that presents radiologists with pathology reports matched to their dictations, for both diagnostic imaging and image-guided procedures. In creating our dashboard, we used the RadLex ontology and the National Center for Biomedical Ontology (NCBO) Annotator to identify anatomic concepts in pathology reports that could subsequently be mapped to relevant radiology reports, providing an automated method of matching related radiology and pathology reports. Radiology-pathology matches are presented to the radiologist on a web-based dashboard. We found that our algorithm was highly specific in detecting matches. Sensitivity was slightly lower than expected, which we attribute to anatomy concepts missing from the RadLex ontology as well as to limitations in our parent-term hierarchical mapping and synonym recognition algorithms. By automating radiology-pathology correlation and presenting matches in a user-friendly dashboard format, we hope to encourage pathology follow-up in clinical radiology practice for self-education and to augment peer review. We also hope to provide a tool that facilitates the production of quality teaching files, lectures, and publications. Diagnostic images have richer educational value when they are backed by the gold standard of pathology.
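To illustrate the matching idea described above, here is a minimal sketch in Python. The actual system annotates reports with the NCBO Annotator against the full RadLex ontology; in this sketch, a tiny hand-made term table (`RADLEX_ANATOMY`, with hypothetical synonym and parent entries) stands in for both, and two reports are considered a candidate match when they share an anatomy concept directly or through a one-level parent-term hierarchy.

```python
# Sketch of anatomy-concept matching between radiology and pathology
# reports. RADLEX_ANATOMY is a hypothetical stand-in for RadLex plus
# the NCBO Annotator; the real system queries those services.
RADLEX_ANATOMY = {
    "liver":  {"synonyms": {"hepatic"},   "parent": "abdomen"},
    "kidney": {"synonyms": {"renal"},     "parent": "abdomen"},
    "lung":   {"synonyms": {"pulmonary"}, "parent": "chest"},
}

def annotate(text):
    """Return the set of anatomy concepts found in free text,
    resolving synonyms to a preferred RadLex-style label."""
    words = set(text.lower().replace(".", " ").replace(",", " ").split())
    concepts = set()
    for concept, entry in RADLEX_ANATOMY.items():
        if concept in words or entry["synonyms"] & words:
            concepts.add(concept)
    return concepts

def match(radiology_report, pathology_report):
    """A radiology-pathology pair is a candidate match when the two
    reports share an anatomy concept, either directly or via the
    parent-term hierarchy."""
    rad, path = annotate(radiology_report), annotate(pathology_report)
    if rad & path:
        return True
    rad_parents = {RADLEX_ANATOMY[c]["parent"] for c in rad}
    path_parents = {RADLEX_ANATOMY[c]["parent"] for c in path}
    return bool(rad_parents & path_parents)
```

As the abstract notes, this style of matching is highly specific but can miss pairs when a concept or synonym is absent from the term table, which is the sensitivity limitation observed in practice.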
Keywords: Dashboard · Medical records systems · Radiology-pathology correlation · Radiology teaching files · Radiology workflow · RadLex
The authors would like to acknowledge and thank Kristen Kalaria for her help with manuscript preparation.
Compliance with Ethical Standards
Our study met the criteria for exemption from approval by the institutional review board at our institution.