Advances in Health Sciences Education, Volume 10, Issue 1, pp 15–22

Does Blueprint Publication Affect Students’ Perception of Validity of the Evaluation Process?

  • Kevin McLaughlin
  • Sylvain Coderre
  • Wayne Woloschuk
  • Henry Mandin

Abstract

Context: A major goal of any evaluation is to demonstrate content validity, which considers both the curricular content and the abilities expected of learners. Whether evaluation blueprints should be published, and how transparent they should be, remains controversial.

Objectives: To examine the effect of blueprint publication on students’ perceptions of the validity of the evaluation process.

Methods: This study examined students’ attitudes towards the Renal Course evaluation before and after blueprint publication. There was no significant change in the course objectives, blueprint or evaluation between the two time periods. Students’ attitudes were assessed using a questionnaire containing four items related to evaluation. Overall course ratings, the minimum performance level (MPL) for the evaluations and students’ performance on each exam were also collected.

Results: There were no significant differences in the MPL or evaluation scores between the two time periods. After blueprint publication, a significantly greater proportion of students perceived that the Renal Course evaluation was a fair test and reflected both important subject matter and the delivered curriculum. This increased satisfaction with the evaluation process did not appear to reflect overall satisfaction with the course, as there was a trend towards reduced overall course satisfaction.

Conclusions: Publication of the evaluation blueprint appears to improve students’ perceptions of the validity of the evaluation process. Further studies are required to identify the reasons for this attitude change. We propose that blueprint transparency drives both instructors’ teaching and students’ learning towards key educational elements.

Keywords

content validity, education, evaluation, evaluation blueprint, medical, undergraduate



Copyright information

© Springer 2005

Authors and Affiliations

  • Kevin McLaughlin (2)
  • Sylvain Coderre (1)
  • Wayne Woloschuk (1)
  • Henry Mandin (1)

  1. University of Calgary, Calgary, Canada
  2. Division of Nephrology, Foothills Hospital, Calgary, Canada
