Transforming from Radiologist Peer Review Audits to Peer Learning and Improvement Approaches

Quality and Safety in Imaging

Part of the book series: Medical Radiology (Med Radiol Diagn Imaging)

Abstract

All radiologists actively practicing in the United States are required to undergo some form of periodic performance evaluation. This process should provide an unbiased, fair, and balanced evaluation of radiologist performance to identify opportunities for additional education, error reduction, and self-improvement. By far the most common peer review audit system currently used in radiology is RADPEER, developed almost 15 years ago by the American College of Radiology (ACR), in which originally interpreted images are randomly selected and reviewed by a peer radiologist. However, studies have shown that this time-consuming process has inherent sampling bias, has limited value as an educational tool, and is performed primarily to meet accreditation and hospital credentialing requirements. Moreover, it evaluates the performance of a radiologist solely in terms of a diagnostic discrepancy score, excluding the myriad other functions and roles that radiologists fulfill, including teaching, consulting, and communicating abnormal results. Consequently, an increasing number of radiology practices are embracing simple scoring systems that either agree with the prior read or score the interpretation as an "apparent learning case."

Rather than scoring-based peer review audits of random cases to evaluate radiologist performance, this chapter recommends the adoption of a system based on "peer learning," which consists of peer feedback, learning, and improvement. The goal is not to identify poorly performing physicians, but to improve the performance of all members of the group by analyzing the potential contributors to errors through a self-reflection process, as well as through peer discussion in a constructive, nonpunitive quality improvement meeting.

Author information

Corresponding author: Ronald Eisenberg, M.D., J.D.

Copyright information

© 2017 Springer International Publishing AG

About this chapter

Cite this chapter

Eisenberg, R., Kruskal, J. (2017). Transforming from Radiologist Peer Review Audits to Peer Learning and Improvement Approaches. In: Donoso-Bach, L., Boland, G. (eds) Quality and Safety in Imaging. Medical Radiology. Springer, Cham. https://doi.org/10.1007/174_2017_114

  • DOI: https://doi.org/10.1007/174_2017_114

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-42576-4

  • Online ISBN: 978-3-319-42578-8

  • eBook Packages: Medicine, Medicine (R0)
