
Computer-Based Administration and Grading of Free-Response Practical Examination Items: a Comparison of Assessment Programs and Case Study

  • Health Science Case Study
  • Medical Science Educator

Abstract

This descriptive article compares the functionality of frequently used assessment platforms and details, through a case study, the process of using an iPad application to administer and grade free-response practical examinations. The approach presented here resolves reported issues, such as the cueing effects inherent to other computer-based assessment formats. In a human gross anatomy course, medical students used their iPads during a traditional practical examination to type answers directly from free recall. Grading procedures and the efficiencies of this system are described, along with the benefits and limitations of administering practical examinations in this electronic format.
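The full grading procedure sits behind the paywall, but the kind of efficiency the abstract alludes to can be illustrated with a short sketch. The Python snippet below is a minimal illustration of the general technique, not the authors' or ExamSoft®'s actual implementation: it normalizes typed free-response answers and groups students by unique answer so a grader scores each distinct response only once. All student IDs, answers, and function names are invented for the example.

```python
from collections import defaultdict
import string

def normalize(answer: str) -> str:
    """Collapse case, punctuation, and surplus whitespace so trivially
    different typed answers ('Ulnar nerve.' vs 'ulnar  nerve') match."""
    cleaned = answer.lower().translate(str.maketrans("", "", string.punctuation))
    return " ".join(cleaned.split())

def group_responses(responses: dict) -> dict:
    """Map each unique normalized answer to the students who gave it,
    so each distinct answer is judged once, not once per student."""
    groups = defaultdict(list)
    for student_id, answer in responses.items():
        groups[normalize(answer)].append(student_id)
    return dict(groups)

def apply_scores(groups: dict, rubric: dict) -> dict:
    """Propagate the grader's per-answer score back to every student
    in that answer's group."""
    return {sid: rubric.get(ans, 0.0)
            for ans, sids in groups.items()
            for sid in sids}

# Hypothetical practical-exam item; all IDs and answers are invented.
responses = {"s01": "Ulnar nerve.", "s02": "ulnar  nerve", "s03": "median nerve"}
groups = group_responses(responses)   # 2 distinct answers to judge, not 3
scores = apply_scores(groups, {"ulnar nerve": 1.0, "median nerve": 0.0})
print(scores)                         # {'s01': 1.0, 's02': 1.0, 's03': 0.0}
```

Whether the per-answer scores come from a prepared key or a grader's on-the-spot judgment, grouping reduces the workload from one decision per student to one decision per distinct answer, which is the usual source of grading efficiency in systems of this kind.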



Acknowledgments

The authors wish to thank the Rush University Medical Class of 2019 for their willingness to switch to this form of computer-based assessment for practical examinations.

Notes on Contributors

ADAM B. WILSON, Ph.D., is assistant professor of anatomy in the Department of Cell and Molecular Medicine at Rush University Medical Center, Chicago, IL. With an interest in measurement and evaluation, his research within anatomy and surgical education focuses primarily on instrument development and validation, programmatic evaluation, and the evaluation of teaching pedagogies.

MARK GRICHANIK, M.A., is director of student assessment in the Office of Medical Student Programs at Rush University Medical Center, Chicago, IL. His research focuses on technology-supported learning and assessment in the service of exceptional patient care.

JAMES M. WILLIAMS, Ph.D., is professor of anatomy in the Department of Cell and Molecular Medicine at Rush University Medical Center, Chicago, IL. His research interests include models of cartilage injury and repair, development of pedagogies, and evaluation of instructional methods.

Author information


Corresponding author

Correspondence to Adam B. Wilson.

Ethics declarations

Disclosures/Disclaimers

The authors have no affiliation with, involvement in, or financial interest in ExamSoft®.

Ethical Approval

This study was exempt from institutional review board approval because it is purely descriptive.

Previous Presentations

None.

Funding/Support

None.

About this article

Cite this article

Wilson, A.B., Grichanik, M. & Williams, J.M. Computer-Based Administration and Grading of Free-Response Practical Examination Items: a Comparison of Assessment Programs and Case Study. Med. Sci. Educ. 27, 847–853 (2017). https://doi.org/10.1007/s40670-017-0458-5
