The reliability of workplace-based assessment in postgraduate medical education and training: a national evaluation in general practice in the United Kingdom
To investigate the reliability and feasibility of six potential workplace-based assessment methods in general practice training: criterion audit, multi-source feedback from clinical and non-clinical colleagues, patient feedback (the CARE Measure), referral letters, significant event analysis, and video analysis of consultations. The performance of GP registrars (trainees) was evaluated with each tool to assess its reliability and its feasibility, given the raters and number of assessments needed; participants' experience of the process was determined by questionnaire. 171 GP registrars and their trainers, drawn from nine deaneries (representing all four countries of the UK), participated. The ability of each tool to differentiate between doctors (reliability) was assessed using generalisability theory. Decision studies were then conducted to determine the number of observations required to achieve an acceptably high reliability for "high-stakes assessment" with each instrument. Finally, descriptive statistics were used to summarise participants' ratings of their experience of using these tools. Multi-source feedback from colleagues and patient feedback on consultations emerged as the two methods most likely to offer a reliable and feasible opinion of workplace performance. Reliability coefficients of 0.8 were attainable with 41 CARE Measure patient questionnaires, and with six clinical and/or five non-clinical colleagues per doctor when assessed on two occasions. For the other four methods tested, 10 or more assessors were required per doctor to achieve a reliable assessment, making their feasibility for high-stakes assessment extremely low. Participant feedback did not raise any major concerns regarding the acceptability, feasibility, or educational impact of the tools.
The combination of patient and colleague views of doctors’ performance, coupled with reliable competence measures, may offer a suitable evidence-base on which to monitor progress and completion of doctors’ training in general practice.
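The decision studies described above rest on a standard piece of generalisability-theory arithmetic: averaging over more observations shrinks the error variance, so the projected reliability rises with the number of raters or questionnaires per doctor. A minimal sketch of that calculation, using hypothetical variance components for illustration (the study's actual variance components are not reported here):

```python
def d_study_reliability(var_doctor: float, var_error: float, n_obs: int) -> float:
    """Projected generalisability coefficient when n_obs observations per
    doctor are averaged: Ep^2 = s2_doctor / (s2_doctor + s2_error / n)."""
    return var_doctor / (var_doctor + var_error / n_obs)

def min_observations(var_doctor: float, var_error: float, target: float = 0.8) -> int:
    """Smallest number of observations per doctor that reaches the target
    reliability (0.8 is the usual threshold for high-stakes assessment)."""
    n = 1
    while d_study_reliability(var_doctor, var_error, n) < target:
        n += 1
    return n

# Hypothetical variance components, for illustration only:
# doctor-to-doctor variance 1.0, residual (error) variance 4.0.
print(min_observations(1.0, 4.0))  # 16 observations needed for 0.8
```

Instruments differ in how much of the total variance is attributable to the doctor rather than to raters and occasions, which is why the CARE Measure and MSF reached 0.8 with feasible numbers while the other four tools did not.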
Keywords: Workplace-based assessment · Multi-source feedback · Patient satisfaction · Medical education · Physician assessment
The completion of the pilot was made possible thanks to the help and enthusiasm of the 171 GP registrars and staff from the Wales, Northern Ireland, Mersey, KSS, East Scotland, North and North East Scotland, South East Scotland and West Midlands Deaneries.
The authors would like to thank Mrs. Angela Inglis (Team Leader and Personal Assistant to Dr. David Bruce, GP Director in the East of Scotland Deanery) and her team, (Lee-Ann Troup, Linda Kirkcaldy, Susan Smith, Carol Ironside and Gill Ward) for their help, support, and contribution to the work contained in this paper.
© CARE SW Mercer, Scottish Executive 2004: The CARE Measure was originally developed by Dr. Stewart Mercer and colleagues as part of a Health Services Research Fellowship funded by the Chief Scientist Office of the Scottish Executive (2000–2003). The intellectual property rights of the measure belong to the Scottish Ministers. The measure is available for use free of charge for staff of the NHS and for research purposes, but cannot be used for commercial purposes. Anyone wishing to use the measure should contact and register with Stewart Mercer (email: email@example.com).
© MSF Tool—NHS Education for Scotland 2005–2006: This two question Multi-Source Feedback (MSF) was developed by Drs. Douglas Murphy, David Bruce, and Kevin Eva on behalf of NHS Education Scotland (2005–2006). The measure is available for use free of charge for staff of the NHS and for research purposes, but cannot be used for commercial purposes. Anyone wishing to use the measure should contact and register with Douglas Murphy firstname.lastname@example.org or David Bruce email@example.com.
Ethical approval: The research proposal was formally submitted to, and ethical approval for all of the work contained in this paper was granted by, the NHS Ethics Committee (Glasgow West).
Conflict of interest and source of funding statement
NHS Education Scotland and The Royal College of General Practitioners (RCGP) funded this study. DM was, and DB is, employed by NHS Education Scotland. DM and SWM are supported by a Primary Care Research Career Award from the Chief Scientist Office, Scottish Executive Health Department. The RCGP had no role in study design, data analysis, data interpretation, or writing of the report. The corresponding author had full access to all the data and had final responsibility for the decision to submit for publication.

Contributors: D. Murphy and K. Eva designed the studies. Data collection was done by D. Murphy and D. Bruce. Data were analysed by D. Murphy and K. Eva. Data were interpreted by D. Murphy, D. Bruce, S. Mercer and K. Eva. The manuscript was written by D. Murphy, D. Bruce, S. Mercer and K. Eva. All authors were involved in the decision to submit the manuscript for publication.