mini-PAT (Peer Assessment Tool): A Valid Component of a National Assessment Programme in the UK?
This study aimed to design, implement and evaluate a multisource feedback instrument for assessing Foundation trainees across the UK.
mini-PAT (Peer Assessment Tool) was adapted from SPRAT (Sheffield Peer Review Assessment Tool), an established multisource feedback (360°) instrument for assessing more senior doctors, as part of a blueprinting exercise to identify instruments suitable for assessment in Foundation programmes (the first two years after graduation). mini-PAT's content validity was assured by a mapping exercise against the Foundation Curriculum. Trainees' clinical performance was then assessed on two occasions during the pilot period, using 16 questions rated on a six-point scale. Responses were analysed to determine the instrument's internal structure, potential sources of bias and measurement characteristics.
Six hundred and ninety-three mini-PAT assessments were undertaken for 553 trainees across 12 Deaneries in England, Wales and Northern Ireland; 219 trainees were F1s or PRHOs and 334 were F2s. Trainees identified 5544 assessors, of whom 67% responded. The mean score for F2 trainees was 4.61 (SD = 0.43) and for F1s was 4.44 (SD = 0.56); an independent t test showed that the mean scores of these two groups differed significantly (t = −4.59, df = 390, p < 0.001). Forty-three F1s (19.6%) and 19 F2s (5.6%) were assessed as being below the expectations for F2 completion. Factor analysis produced two main factors, one concerning clinical performance, the other humanistic qualities. Seventy-four percent of F2 trainees could have been assessed by as few as 8 assessors (95% CI ±0.6), as they scored an overall mean of either 4.4 or above or 3.6 or below; 53% of F1 trainees could have been assessed by as few as 8 assessors (95% CI ±0.5), as they scored an overall mean of either 4.5 or above or 3.5 or below. Hierarchical regression showed that, after controlling for the year of the Foundation Programme, bias related to the length of the working relationship, the occupation of the assessor and the working environment explained 7% of the variation in mean scores (R² change = 0.06, F change = 8.5, p for F change < 0.001).
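The "as few as 8 assessors" finding rests on standard-error logic: the 95% confidence interval around a trainee's mean score narrows with the square root of the number of assessors, so a trainee whose mean sits clearly above or below the decision band can be classified reliably with fewer raters. A minimal sketch of that calculation follows; the between-assessor SD of 0.85 used here is illustrative only (the study itself derived its confidence intervals from its own reliability analysis):

```python
import math

def ci_half_width(sd_between: float, n_assessors: int, z: float = 1.96) -> float:
    """Half-width of the 95% CI around a trainee's mean over n assessors."""
    return z * sd_between / math.sqrt(n_assessors)

def assessors_needed(sd_between: float, target_half_width: float,
                     z: float = 1.96) -> int:
    """Smallest number of assessors giving a 95% CI no wider than the target."""
    return math.ceil((z * sd_between / target_half_width) ** 2)

# Illustrative between-assessor SD (assumed, not taken from the study):
sd = 0.85
print(ci_half_width(sd, 8))       # CI half-width achieved with 8 assessors
print(assessors_needed(sd, 0.6))  # assessors needed for a ±0.6 CI
```

Under these assumed numbers, 8 assessors bring the 95% CI half-width to roughly ±0.6, which is why only trainees whose means fall near the pass/fail boundary would need additional raters.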
As part of an assessment programme, mini-PAT appears to provide a valid way of collating colleague opinions to help reliably assess Foundation trainees.
Keywords: Foundation programme, multisource feedback, reliability, validity, work-based assessment