Review of learning opportunity rates: correlation with radiologist assignment, patient type and exam priority

  • Original Article
  • Published in Pediatric Radiology

Abstract

Background

Common cause analysis of learning opportunities identified in a peer collaborative improvement process can gauge the potential risk to patients and opportunities to improve.

Objective

To study rates of learning opportunities based on radiologist assignment, patient type and exam priority at an academic children’s hospital with 24/7 in-house attending coverage.

Materials and methods

Actively submitted peer collaborative improvement learning opportunities from July 2, 2016, to July 31, 2018, were identified. Learning opportunity rates (number of learning opportunities divided by number of exams in each category) were calculated for the following variables: radiologist assignment at the time of dictation (daytime weekday, daytime weekend and holiday, evening, and night), patient type (inpatient, outpatient or emergency center) and exam priority (stat, urgent or routine). Differences in rates were tested with a chi-square test, and pairwise comparisons were made using Bonferroni-adjusted P-values.
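The following is a minimal sketch of this type of analysis, not the authors' code: the per-category counts are hypothetical placeholders, and only the method (per-category rates, an overall chi-square test on the counts, and Bonferroni-adjusted pairwise comparisons) follows the description above.

```python
from itertools import combinations
from scipy.stats import chi2_contingency

# Hypothetical counts: {assignment: (learning opportunities, total exams)}.
# Real values would come from the peer learning database and RIS exam volumes.
counts = {
    "weekday day": (400, 170_000),
    "weekend/holiday day": (150, 58_000),
    "evening": (500, 160_000),
    "night": (320, 170_000),
}

# Learning opportunity rate = opportunities / exams in each category
for name, (lo, exams) in counts.items():
    print(f"{name}: {lo / exams:.2%}")

# Overall chi-square test on the 2 x k table of (opportunities, non-opportunities)
table = [[lo, exams - lo] for lo, exams in counts.values()]
chi2, p, dof, _ = chi2_contingency(table)
print(f"overall chi-square P = {p:.3g}")

# Pairwise 2 x 2 comparisons with Bonferroni adjustment
pairs = list(combinations(counts, 2))
for a, b in pairs:
    sub = [[counts[a][0], counts[a][1] - counts[a][0]],
           [counts[b][0], counts[b][1] - counts[b][0]]]
    _, p_pair, _, _ = chi2_contingency(sub)
    p_adj = min(1.0, p_pair * len(pairs))  # Bonferroni: multiply by number of comparisons
    print(f"{a} vs {b}: Bonferroni-adjusted P = {p_adj:.3g}")
```

The same structure applies to the patient type and exam priority comparisons, with the categories swapped accordingly.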

Results

There were 1,370 learning opportunities submitted on 559,584 studies (overall rate: 0.25%). The differences in rates by assignment were statistically significant (P<0.0001), with the highest rate on exams dictated in the evening (0.31%) and the lowest on those dictated at night (0.19%). Weekend and holiday daytime (0.26%) and weekday daytime (0.24%) rates fell in between. Rates were significantly higher for inpatient exams (0.33%) than for outpatient (0.22%, P<0.0001) or emergency center exams (0.16%, P<0.0001). There were no significant differences based on exam priority (stat 0.24%, urgent 0.26%, routine 0.24%; P=0.55).
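As a back-of-the-envelope check, the overall rate follows directly from the counts reported above:

\[
\frac{1{,}370\ \text{learning opportunities}}{559{,}584\ \text{exams}} \approx 0.245\%,
\]

consistent with the reported overall rate of roughly 0.25% after rounding.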

Conclusion

In this study, the highest learning opportunity rates occurred on the evening rotation and inpatient studies, which could indicate an increased risk for patient harm and potential opportunities for improvement.



Author information

Corresponding author

Correspondence to Marla B. K. Sammer.

Ethics declarations

Conflicts of interest

None

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Sammer, M.B.K., Sammer, M.D. & Donnelly, L.F. Review of learning opportunity rates: correlation with radiologist assignment, patient type and exam priority. Pediatr Radiol 49, 1269–1275 (2019). https://doi.org/10.1007/s00247-019-04466-6

