Journal of Digital Imaging, Volume 31, Issue 1, pp 32–41

Characterizing Diagnostic Search Patterns in Digital Breast Pathology: Scanners and Drillers

  • Ezgi Mercan
  • Linda G. Shapiro
  • Tad T. Brunyé
  • Donald L. Weaver
  • Joann G. Elmore


Following a baseline demographic survey, 87 pathologists interpreted 240 digital whole slide images of breast biopsy specimens representing a range of diagnostic categories from benign to atypia, ductal carcinoma in situ, and invasive cancer. A web-based viewer recorded pathologists’ behaviors while they interpreted a subset of 60 randomly selected and randomly ordered slides. To characterize diagnostic search patterns, we used the viewport location, time stamp, and zoom level data to calculate four variables: average zoom level, maximum zoom level, zoom level variance, and scanning percentage. Two distinct search strategies were confirmed: scanning, characterized by panning at a constant zoom level, and drilling, which involves zooming in and out at various locations. Statistical analysis examined the associations between visual interpretive strategies and pathologist characteristics, diagnostic accuracy, and efficiency. We found that female pathologists scanned more than males, and that age was positively correlated with scanning percentage while facility size was negatively correlated with it. Over the course of the 60 cases, both the scanning percentage and the total interpretation time per slide decreased, and the two variables were positively correlated. Scanning percentage was not predictive of diagnostic accuracy, whereas higher average zoom level, maximum zoom level, and zoom variance were correlated with over-interpretation.
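The four variables above can be derived directly from a viewport log. The sketch below is a minimal illustration, assuming a hypothetical log format of `(timestamp, x, y, zoom)` tuples and a simple definition of a "scan" move as a pan at an unchanged zoom level; the study's viewer and the paper's exact operational definitions may differ.

```python
from statistics import mean, pvariance

def search_pattern_features(log):
    """Compute zoom and scanning features from a viewport log.

    `log` is a list of (timestamp, x, y, zoom) tuples, one entry per
    viewport change -- an assumed format, not the study's actual schema.
    """
    zooms = [z for _, _, _, z in log]
    avg_zoom = mean(zooms)          # average zoom level
    max_zoom = max(zooms)           # maximum zoom level
    zoom_var = pvariance(zooms)     # zoom level variance

    # Classify each transition between consecutive viewports: a "scan"
    # move pans at a constant zoom level; a "drill" move changes zoom.
    moves = list(zip(log, log[1:]))
    scans = sum(1 for (_, _, _, z0), (_, _, _, z1) in moves if z0 == z1)
    scan_pct = scans / len(moves) if moves else 0.0  # scanning percentage

    return {"avg_zoom": avg_zoom, "max_zoom": max_zoom,
            "zoom_var": zoom_var, "scan_pct": scan_pct}
```

A slide interpreted mostly by panning at one magnification yields a scanning percentage near 1.0, while frequent zoom changes (drilling) push it toward 0.0.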


Keywords: Digital pathology · Diagnostic decision-making · Breast cancer · Breast histopathology · Whole slide imaging · Diagnostic interpretation



Research reported in this publication was supported by the National Cancer Institute awards R01 CA172343, R01 CA140560, and K05 CA104699. The content is solely the responsibility of the authors and does not necessarily represent the views of the National Cancer Institute or the National Institutes of Health. We thank Ventana Medical Systems, Inc. (Tucson, AZ, USA), a member of the Roche Group, for the use of the iScan Coreo Au™ whole slide imaging system, and HD View SL for the source code used to build our digital viewer. For a full description of HD View SL, please see

Compliance with Ethical Standards

Conflict of Interest

The authors declare that they have no conflict of interest.

Supplementary material

10278_2017_9990_Fig4_ESM.gif (14 kb)
Figure 1 (GIF 14 kb).

10278_2017_9990_MOESM1_ESM.tif (1.7 mb)
High-resolution image (TIFF 1742 kb).
10278_2017_9990_MOESM2_ESM.docx (13 kb)
Table 1 (DOCX 13 kb).
10278_2017_9990_MOESM3_ESM.docx (18 kb)
Table 2 (DOCX 17 kb).



Copyright information

© Society for Imaging Informatics in Medicine 2017

Authors and Affiliations

  1. Paul G. Allen School of Computer Science and Engineering, University of Washington, Seattle, USA
  2. Department of Psychology, Tufts University, Medford, USA
  3. Department of Pathology and UVM Cancer Center, University of Vermont, Burlington, USA
  4. Department of Medicine, University of Washington School of Medicine, Seattle, USA
