Overview: what’s worked and what hasn’t as a guide towards predictive admissions tool development

Article

DOI: 10.1007/s10459-009-9160-8

Cite this article as:
Siu, E. & Reiter, H.I. Adv in Health Sci Educ (2009) 14: 759. doi:10.1007/s10459-009-9160-8

Abstract

Admissions committees and researchers around the globe have used diligence and imagination to develop and implement various screening measures with the ultimate goal of predicting future clinical and professional performance. What works for predicting future job performance in the human resources world and in most of the academic world may not, however, work for the highly competitive world of medical school applicants. For the job of differentiating within the highly range-restricted pool of medical school aspirants, only the most reliable assessment tools need apply. The tools that have generally shown predictive validity in future performance include academic scores like grade point average, aptitude tests like the Medical College Admissions Test, and non-cognitive testing like the multiple mini-interview. The list of assessment tools that have not robustly met that mark is longer, including personal interview, personal statement, letters of reference, personality testing, emotional intelligence and (so far) situational judgment tests. When seen purely from the standpoint of predictive validity, the trends over time towards success or failure of these measures provide insight into future tool development.

Keywords

Predictive validity · Admissions · Overview · Grade point average · Medical College Admissions Test · Multiple mini-interview

Copyright information

© Springer Science+Business Media B.V. 2009

Authors and Affiliations

  1. McMaster University, Hamilton, Canada
  2. Department of Oncology, McMaster University, Hamilton, Canada
