
Using an Instructional Design Model to Teach Medical Procedures


Educators are often tasked with developing courses and curricula that teach learners how to perform medical procedures. This instruction must provide an optimal, uniform learning experience for all learners. If not well designed, this instruction risks being unstructured, informal, variable amongst learners, or incomplete. This article shows how an instructional design model can help craft courses and curricula to optimize instruction in performing medical procedures. Educators can use this as a guide to developing their own course instruction.


When educators design courses to teach learners how to perform medical procedures, they risk providing instruction that may be informal and unstructured [1], taught by supervisors who may lack competence in the procedure themselves [2], and based on instructional methods that are not supported by the medical education literature [3]. This, in turn, can lead to learners attempting procedures with which they are unfamiliar [4] and which they may perform incorrectly [5–7].

To reduce these pitfalls, educators can use an instructional design model when designing a course or curriculum. An instructional design model helps ensure that the learning objectives are clear, the instruction and learning experiences align with the objectives, the learning activities are similar amongst the learners, and the assessments used to determine competence are appropriate [8]. It serves as a blueprint that specifies the type, amount, and order of learning events that will occur [9].

This article will show educators how an instructional design model—in this case, Gagne’s theory of instructional design—can be used to design courses to teach procedures in medicine, using percutaneous chest tube insertion as an example.

Gagne’s Theory of Instructional Design

In Gagne’s theory of instructional design [10], developers of the lesson plan must first determine the type of outcome that the learners must achieve; then, they construct and tailor the instructional events necessary to achieve this outcome [9]. This model has been used to develop instructional plans to teach a variety of procedural [11–13] and cognitive skills [14–18]. Gagne’s theory of instructional design posits five learning outcomes and nine events of instruction.

Gagne’s Five Learning Outcomes

Gagne proposed five types of learning outcomes: attitudes, motor skills, memory or recall, complex or procedural knowledge, and learning strategies [19]. The latter three involve cognitive outcomes, while attitudes and motor skills involve affective and psychomotor outcomes, respectively. Complex or procedural knowledge, in turn, encompasses five subcategories: discriminations, concrete concepts, defined concepts, rules, and higher order rules or problem solving [20].

Using the example of percutaneous chest tube insertion, the learning outcomes would be motor skills (that is, procedural technique) and memory or recall (learning the indications, contraindications, and immediate complications of the procedure).

Gagne’s Nine Events of Instruction

Once educators have identified the learning outcomes, they must construct and organize the instructional events to achieve these learning outcomes. Gagne proposed nine events of instruction including gaining attention, informing the learner of the objectives, stimulating recall of prerequisite learning, presenting the stimulus material, providing learning guidance, eliciting the performance, providing feedback about performance correctness, assessing the performance, and enhancing retention and transfer [20].

Adopting Gagne’s nine events of instruction, we use the following instructional blueprint when teaching our learners (that is, residents in our respirology subspecialty residency program) percutaneous chest tube insertion.

Gaining Attention

Educators first need to gain, and maintain, the learners’ attention so that learners can focus on the requisite learning.

We use a pre-test (and subsequent post-test near the end of instruction) to capture their attention and promote participation, as learners tend to view this approach favorably [21]. Relating their learning to the workplace also helps gain their attention [22], and we emphasize that learning this procedure is a practical skill required to manage patients during their training and clinical practice. This reinforcement also stimulates their intrinsic motivation and enhances their learning autonomy. Our instructors also judiciously use humor, anecdotes, and case-based examples to emphasize their teaching points as these have been shown to capture attention [23, 24].

Informing the Learners of the Objectives

After gaining the learners’ attention, educators must state the learning objectives. Akin to what DeSilets [25] calls a “road map” showing the educational destination, learning objectives specify to the learners and instructors the skills that should be achieved and the outcomes that will be assessed [26]. Stated from the learner’s perspective, objectives should use action verbs [27, 28] that describe observable behaviors that the learners need to demonstrate [29–32].

For example, some of our objectives include “List the equipment needed for percutaneous chest tube insertion,” “Insert the introducer needle into the pleural space,” and “Insert the chest tube over the guide wire.” We also ask learners to state their learning objective(s)—that is, what they intend to learn from the session. By focusing activities on their needs, we can engage and empower them during the learning process [33].

Stimulating Recall of Prerequisite Learning

Stimulating recall of prerequisite knowledge serves many functions. It establishes what the learners already know and reveals deficits in pre-existing knowledge that instructors must fill before further learning occurs. It helps the learners organize this pre-existing knowledge into conceptual schemas that can facilitate learning of new material [34] and activates prior knowledge to improve the information processing that will occur in the subsequent learning [35].

To do this, we review the answers to the initial pre-test that we administered while we were gaining the learners’ attention—this act of retrieving information enhances subsequent learning and recall. Our pre-test covers the anatomy of the chest wall, lungs and pleural space, appearance of a pleural effusion on ultrasound, the diseases that can cause a pleural effusion, and the diagnostic tests needed to ascertain the cause. In addition to lower order questions that test recall (for example, draw the anatomy of the chest wall, lungs, and pleural space), we use higher order questions as well. For example, given the relevant anatomy, we ask questions on the complications that can occur during the procedure and the ways to avoid them—these higher order questions can promote deep thinking and learner engagement. We also ask individuals to explain their answers to the group as this mindful use of prior knowledge facilitates the learning of the person who formulates the explanation [36].

Presenting the Stimulus Material

Here, new information is presented to the learner. Instructors must emphasize important learning points. In the case of teaching a procedural skill, instructors should not only emphasize the actions needed to perform the procedure correctly, but also highlight the actions to avoid in order to decrease the risk of adverse events.

When teaching percutaneous chest tube insertion, we use small groups of up to five members as this has been shown to increase learning gains and learner satisfaction compared to larger groups [37]. We review the prerequisites for the procedure—this includes obtaining informed consent and ensuring availability of all the necessary equipment, space, and personnel. Then, using photographs of each step of the procedure, we explain each of these steps, such as appropriately positioning the patient for the procedure, localizing the effusion with ultrasound, donning the appropriate gowns, masks, and gloves, opening the sterile tray and organizing the equipment, and so on. For each step, we emphasize the correct actions to perform and the incorrect actions to avoid. Before the teaching session, we give each learner a handout outlining the procedure so that they can prepare themselves. To promote deeper learning, we encourage the learners to ask questions and we invite group members to share their own procedural tips that they have observed in the past.

Providing Learning Guidance

Providing learning guidance involves modeling or showing the learner the correct performance. In some cases, instruction during this phase might be very similar to that provided during presentation of the stimulus material [9]. In our case of procedural learning, providing learning guidance involves a demonstration of the whole procedure—uninterrupted—from start to finish. This integrates all of the individual procedural steps that were taught during presentation of the stimulus.

We first play a demonstrational video in its entirety without interruption; then, we replay the video, pausing intermittently during the second viewing to give the learners the opportunity to ask questions and offer comments.

Eliciting the Performance

To elicit their performance, learners must be given a chance to practice and demonstrate the skill they are required to learn. Before eventually performing the procedure on patients, each learner is given the opportunity to perform the procedure on a manikin that emulates the chest wall anatomy with a pleural effusion. Learners can view practice on a simulator as being as effective as practice on real patients [38], and simulators help learners achieve a variety of performance skills without compromising patient safety [39–46].

Each of our learners takes a turn performing the procedure while the instructor and other learners observe. Organizing the learners into pairs, with each learner taking a turn performing and critiquing the procedure, can decrease the instructor-to-learner ratio [47]. We let each learner practice the procedure once, while recognizing that more complex procedures benefit from serial deliberate practice and feedback over several sessions [48, 49].

Providing Feedback About Performance Correctness

Practice by itself, without feedback, does not necessarily improve performance, as learners may be unable to accurately assess themselves and determine the improvements they need to make [50–54]. And feedback, without the learner’s reflection on how to incorporate that feedback, is unlikely to lead to improvement either [55]. Thus, coaching and feedback, coupled with the learner’s self-appraisal, are needed to improve performance [56–59] and enhance future self-assessment ability [60]. This is especially true when the verbal feedback is given by an expert instructor already proficient with the procedural technique [61] and is tailored to the learners’ needs [62].

When providing feedback, our instructors ensure that their feedback includes components that help the learner improve, rather than providing feedback that is vague or unhelpful. For example, our feedback tries to follow an established pattern in which the instructor observes the learner’s performance, provides advice, and compares this assessment with the learner’s own assessment [63]. We allow our learners to reflect on the feedback during the process, and the feedback is provided in a safe, non-judgemental learning environment. Based on the experience of others, we aim to provide feedback that is specific and timely [64, 65], describes task performance [66–68], and incorporates the learner’s goals and anticipated outcomes [69]. Other learners in the group are also invited to supply feedback.

Assessing the Performance

After the learners have had a chance to improve their performance with feedback and reflection, they must then demonstrate this skill from start to finish, on the manikin, without interruption. Our goal is not to assess final competence as this relies on more formal standard setting methods. Rather, our purpose is to determine whether it is appropriate to allow the learner to perform the procedure on patients with ongoing supervision. Here, competence is not a one-time achievement; instead, it is a process or what Leach [70] refers to as a “habit” of life-long learning, and the learner needs to continue to demonstrate the procedure in the clinical context with supervision [71, 72].

Learners must demonstrate a checklist of components created by our faculty who have expertise in the procedure. If they satisfy the items in the proper sequence, they have satisfied our curricular standards for the procedure and can go on to perform the procedure, with supervision, in the clinical setting. Learners who fail to demonstrate the skills in the checklist redo the individual components, with feedback and reflection, before trying again.

Enhancing Retention and Transfer

In this instructional event, retention refers to the learner’s ability to repeat the skill in future settings and transfer refers to the learner’s ability to adapt these skills to different situations [9].

Much of the groundwork for this instructional event will have been established by the preceding instructional events. For example, retention will be enhanced when active learning occurs while presenting the stimulus material and providing learning guidance. Retention is also enhanced when the learner’s own self-appraisal is incorporated while providing feedback, and the learner’s educational goals are accommodated while reviewing the learning objectives [55, 73, 74].

At this time, we administer the post-test and compare learners’ performance with their pre-test results to reinforce what they have learned. Furthermore, we give the learners access to the demonstrational video to review in the future before performing the procedure again. This blended learning allows them to review the procedure in the proper clinical context, enhancing retention [75].

To promote transfer of knowledge and skills, we discuss how to modify the procedural technique in a variety of situations, such as when the patient has limited mobility and cannot maneuver into the proper position. We also try to instill an attitude of what Fraser and Greenhalgh [76] refer to as “capability”—that clinicians must adapt to situations which are novel and with which they may feel some uncertainty [77–79].

Program Evaluation

Our program evaluation supports ongoing use of this instructional method. Before we structured our teaching, learners were asked whether the educational material was presented effectively and whether they felt they had acquired adequate procedural skills. Responses ranged from 2 (Disagree) to 5 (Strongly Agree) on a Likert scale from 1 to 5 for these two questions. After implementing our structured teaching, subsequent cohorts of learners consistently reported scores of 4 (Agree) and 5 for both questions. As well, we began to assess the learners’ procedural performance on an objective structured clinical examination (OSCE) station. On global ratings of performance—with possible scores from 1 to 5 and a score of “3” or higher needed to pass the station—all learners have received scores between 3 and 5.


Using an instructional design model to craft components of the curriculum enables educators to structure the teaching so that all learners have a comparable learning experience. This structure, in turn, helps identify the specific program components that are effective and those that require improvement. While this article uses Gagne’s theory of instruction, there are many other educational paradigms that could be used for procedural instruction, such as Mayer’s instruction based on cognitive load theory [80], Peyton’s four-step approach to procedural instruction [81], and Merrill’s First Principles of Instruction [82], to name a few.

Also, while this article has used one procedural skill as an example, an instructional design model can be used to program instruction for a variety of cognitive and procedural skills. Further study is needed to assess the effect that implementing an instructional design model into a teaching program has on workplace (that is, clinical) performance, such as the effect on procedural complication rates or speed.


References

1. Mason WT, Strike PW. See one, do one, teach one—is this still how it works? A comparison of the medical and nursing professions in the teaching of practical procedures. Med Teach. 2003;25:664–6.
2. Wickstrom GC, Kelley DK, Keyserling TC, Kolar MM, Dixon JG, Xie SX, et al. Confidence of academic general internists and family physicians to teach ambulatory procedures. J Gen Intern Med. 2000;15:353–60.
3. Levinson AJ. Where is evidence-based instructional design in medical education curriculum development? Med Educ. 2010;44:536–7.
4. Yadla S, Rattigan EM. See one, do one, teach one: competence versus confidence in performing procedures. Virtual Mentor. 2003;5:1–4.
5. Davis JS, Garcia GD, Jouria JM, Wyckoff MM, Alsafran S, Graygo JM, et al. Identifying pitfalls in chest tube insertion: improving teaching and performance. J Surg Educ. 2013;70:334–9.
6. Elsayed H, Roberts R, Emadi M, Whittle I, Shackcloth M. Chest drain insertion is not a harmless procedure—are we doing it safely? Interact Cardiovasc Thorac Surg. 2010;11:745–8.
7. Griffiths JR, Roberts N. Do junior doctors know where to insert chest drains safely? Postgrad Med J. 2005;81:456–8.
8. Piskurich GM. What is this instructional design stuff anyway? In: Piskurich GM, editor. Rapid instructional design: learning ID fast and right. 3rd ed. Hoboken: Wiley; 2015.
9. Okey JR. Procedures of lesson design. In: Briggs LJ, editor. Instructional design: principles and application. 2nd ed. Educational Technology Publications; 1991. p. 192–208.
10. Gagné RM. The conditions of learning. New York: Holt, Rinehart and Winston; 1965.
11. Ng JY. Combining Peyton’s four-step approach and Gagne’s instructional model in teaching slit-lamp examination. Perspect Med Educ. 2014;3:480–5.
12. Khadjooi K, Rostami K, Ishaq S. How to use Gagne’s model of instructional design in teaching psychomotor skills. Gastroenterol Hepatol Bed Bench. 2011;4:116–9.
13. Buscombe C. Using Gagne’s theory to teach procedural skills. Clin Teach. 2013;10:302–7.
14. Condell SL, Elliott N. Gagne’s theory of instruction—its relevance to nurse education. Nurse Educ Today. 1989;9:281–4.
15. Coulter MA. A review of two theories of learning and their application in the practice of nurse education. Nurse Educ Today. 1990;10:333–8.
16. Duan Y. Selecting and applying taxonomies for learning outcomes: a nursing example. Int J Nurs Educ Scholarsh. 2006;3:Article 10.
17. Miner A, Mallow J, Theeke L, Barnes E. Using Gagne’s 9 events of instruction to enhance student performance and course evaluations in undergraduate nursing course. Nurse Educ. 2015;40:152–4.
18. Belfield J. Using Gagne’s theory to teach chest X-ray interpretation. Clin Teach. 2010;7:5–8.
19. Gagné RM. The conditions of learning and theory of instruction. 4th ed. New York: Holt, Rinehart and Winston; 1985.
20. Gagne RM. Mastery learning and instructional design. Perform Improv Q. 1988;1:7–18.
21. Cao L, McInnes MD, Ryan JO. What makes a great radiology review course lecture: the Ottawa radiology resident review course experience. BMC Med Educ. 2014;14:22.
22. Jokinen P, Mikkonen I. Teachers’ experiences of teaching in a blended learning environment. Nurse Educ Pract. 2013;13:524–8.
23. Naftulin DH, Ware JEJ, Donnelly FA. The Doctor Fox lecture: a paradigm of educational seduction. J Med Educ. 1973;48:630–5.
24. Collins J. Education techniques for lifelong learning. RadioGraphics. 2004;24:1185–92.
25. DeSilets LD. Using objectives as a road map. J Contin Educ Nurs. 2007;38:196–7.
26. Grant J. Principles of curriculum design. In: Swanwick T, editor. Understanding medical education: evidence, theory and practice. 2nd ed. West Sussex, UK: Wiley-Blackwell; 2013. p. 31–46.
27. Houlden RL, Frid PJ, Collier CP. Learning outcome objectives. Annals RCPSC. 1998;31:327–32.
28. Mager RF. The qualities of useful objectives. In: Mager RF, editor. Preparing instructional objectives: a critical tool in the development of effective instruction. 3rd ed. Atlanta, GA: The Center for Effective Performance Inc; 1997. p. 43–50.
29. Ballard AL. Getting started: writing behavioral objectives. J Nurs Staff Dev. 1990;6:40–4.
30. Beitz JM. Developing behavioral objectives for perioperative staff development. AORN J. 1996;64:87–8, 92–5.
31. Ferguson LM. Writing learning objectives. J Nurs Staff Dev. 1998;14:87–94.
32. Wintergalen B, Skupien MB. Writing behavioral objectives for continuing education. Ariz Nurse. 1987;40:6, 15.
33. Beckert L, Wilkinson TJ, Sainsbury R. A needs-based study and examination skills course improves students’ performance. Med Educ. 2003;37:424–8.
34. van Kesteren MT, Rijpkema M, Ruiter DJ, Morris RG, Fernandez G. Building on prior knowledge: schema-dependent encoding processes relate to academic performance. J Cogn Neurosci. 2014;26:2250–61.
35. Verkoeijen PP, Rikers RM, Schmidt HG. The effects of prior knowledge on study-time allocation and free recall: investigating the discrepancy reduction model. J Psychol. 2005;139:67–79.
36. Pressley M, Wood E, Woloshyn VE, Martin V, King A, Menke D. Encouraging mindful use of prior knowledge: attempting to construct explanatory answers facilitates learning. Educ Psychol. 1992;27:91–109.
37. Kooloos JGM, Klaassen T, Vereijken M, Van Kuppeveld S, Bolhuis S, Vorstenbosch M. Collaborative group work: effects of group size and assignment structure on learning gain, student satisfaction and perceived participation. Med Teach. 2011;33:983–8.
38. Bokken L, Rethans JJ, van Heurn L, Duvivier R, Scherpbier A, van der Vleuten C. Students’ views on the use of real patients and simulated patients in undergraduate medical education. Acad Med. 2009;84:958–63.
39. Hammoud MM, Nuthalapaty FS, Goepfert AR, Casey PM, Emmons S, Espey EL, et al. To the point: medical education review of the role of simulators in surgical training. Am J Obstet Gynecol. 2008;199:338–43.
40. Hsu JL, Korndorffer JR Jr, Brown KM. Design of vessel ligation simulator for deliberate practice. J Surg Res. 2015.
41. Joyce KM, Byrne D, O’Connor P, Lydon SM, Kerin MJ. An evaluation of the use of deliberate practice and simulation to train interns in requesting blood products. Simul Healthc. 2015;10:92–7.
42. Kalaniti K, Campbell DM. Simulation-based medical education: time for a pedagogical shift. Indian Pediatr. 2015;52:41–5.
43. Lopreiato JO, Sawyer T. Simulation-based medical education in pediatrics. Acad Pediatr. 2015;15:134–42.
44. Michael M, Abboudi H, Ker J, Shamim Khan M, Dasgupta P, Ahmed K. Performance of technology-driven simulators for medical students—a systematic review. 2014;192:531–3.
45. Thomas GW, Johns BD, Marsh JL, Anderson DD. A review of the role of simulation in developing and assessing orthopaedic surgical skills. Iowa Orthop J. 2014;34:181–9.
46. Udani AD, Macario A, Nandagopal K, Tanaka MA, Tanaka PP. Simulation-based mastery learning with deliberate practice improves clinical performance in spinal anesthesia. Anesthesiol Res Pract. 2014;2014:659160.
47. Cason ML, Gilbert GE, Schmoll HH, Dolinar SM, Anderson J, Nickles BM, et al. Cooperative learning using simulation to achieve mastery of nasogastric tube insertion. J Nurs Educ. 2015;54:S47–51.
48. Bosse HM, Mohr J, Buss B, Krautter M, Weyrich P, Herzog W, et al. The benefit of repetitive skills training and frequency of expert feedback in the early acquisition of procedural skills. BMC Med Educ. 2015;15:22.
49. Dul J, Pieters JM, Dijkstra S. Instructional feedback in motor skill learning. Innovations in Education & Training International. 1987;24:71–6.
50. Burson KA, Larrick RP, Klayman J. Skilled or unskilled, but still unaware of it: how perceptions of difficulty drive miscalibration in relative comparisons. J Pers Soc Psychol. 2006;90:60–77.
51. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296:1094–102.
52. Eva KW, Cunnington JP, Reiter HI, Keane DR, Norman GR. How can I know what I don’t know? Poor self assessment in a well-defined domain. Adv Health Sci Educ Theory Pract. 2004;9:211–24.
53. Hodges B, Regehr G, Martin D. Difficulties in recognizing one’s own incompetence: novice physicians who are unskilled and unaware of it. Acad Med. 2001;76:S87–9.
54. Kruger J, Dunning D. Unskilled and unaware of it: how difficulties in recognizing one’s own incompetence lead to inflated self-assessments. J Pers Soc Psychol. 1999;77:1121–34.
55. Brookhart SM. Successful students’ formative and summative uses of assessment information. Assessment in Education: Principles, Policy & Practice. 2001;8:153–69.
56. Bonrath EM, Dedy NJ, Gordon LE, Grantcharov TP. Comprehensive surgical coaching enhances surgical skill in the operating room: a randomized controlled trial. Ann Surg. 2015.
57. Hamid Y, Mahmood S. Understanding constructive feedback: a commitment between teachers and students for academic and professional development. J Pak Med Assoc. 2010;60:224–7.
58. Strandbygaard J, Bjerrum F, Maagaard M, Winkel P, Larsen CR, Ringsted C, et al. Instructor feedback versus no instructor feedback on performance in a laparoscopic virtual reality simulator: a randomized trial. Ann Surg. 2013;257:839–44.
59. Kruglikova I, Grantcharov TP, Drewes AM, Funch-Jensen P. The impact of constructive feedback on training in gastrointestinal endoscopy using high-fidelity virtual-reality simulation: a randomised controlled trial. Gut. 2010;59:181–5.
60. Srinivasan M, Hauer KE, Der-Martirosian C, Wilkes M, Gesundheit N. Does feedback matter? Practice-based learning for medical students after a multi-institutional clinical performance examination. Med Educ. 2007;41:857–65.
61. Porte MC, Xeroulis G, Reznick RK, Dubrowski A. Verbal feedback from an expert is more effective than self-accessed feedback about motion efficiency in learning new surgical skills. Am J Surg. 2007;193:105–10.
62. Paschold M, Huber T, Zeissig SR, Lang H, Kneist W. Tailored instructor feedback leads to more effective virtual-reality laparoscopic training. Surg Endosc. 2014;28:967–73.
63. Alves de Lima AE. [Constructive feedback. A strategy to enhance learning]. Medicina (B Aires). 2008;68:88–92.
64. Bienstock JL, Katz NT, Cox SM, Hueppchen N, Erickson S, Puscheck EE. To the point: medical education reviews—providing feedback. Am J Obstet Gynecol. 2007;196:508–13.
65. Duffy K. Providing constructive feedback to students during mentoring. Nurs Stand. 2013;27:50–6.
66. Hills L. Giving and receiving constructive feedback: a staff training tool. J Med Pract Manage. 2010;25:356–9.
67. James IA. The rightful demise of the sh*t sandwich: providing effective feedback. Behav Cogn Psychother. 2014;43:1–8.
68. Kilminster S, Cottrell D, Grant J, Jolly B. AMEE Guide No. 27: effective educational and clinical supervision. Med Teach. 2007;29:2–19.
69. Nicol DJ, Macfarlane-Dick D. Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Studies in Higher Education. 2006;31:199–218.
70. Leach DC. Competence is a habit. JAMA. 2002;287:243–4.
71. Epstein RM, Hundert EM. Defining and assessing professional competence. JAMA. 2002;287:226–35.
72. Epstein RM. Assessment in medical education. N Engl J Med. 2007;356:387–96.
73. Chang A, Chou CL, Teherani A, Hauer KE. Clinical skills-related learning goals of senior medical students after performance feedback. Med Educ. 2011;45:878–85.
74. Kvam PH. The effect of active learning methods on student retention in engineering statistics. The American Statistician. 2000;54:136–40.
75. Hughes G. Using blended learning to increase learner support and improve retention. Teaching in Higher Education. 2007;12:349–63.
76. Fraser SW, Greenhalgh T. Coping with complexity: educating for capability. BMJ. 2001;323:799–803.
77. Rees C, Richards L. Outcomes-based education versus coping with complexity: should we be educating for capability? Med Educ. 2004;38:1203.
78. Rees CE. The problem with outcomes-based curricula in medical education: insights from educational theory. Med Educ. 2004;38:593–8.
79. Plsek PE, Greenhalgh T. The challenge of complexity in health care. BMJ. 2001;323:625–8.
80. Mayer RE. Applying the science of learning to medical education. Med Educ. 2010;44:543–9.
81. Walker M, Peyton JWR. Teaching in the theatre. In: Peyton JWR, editor. Teaching and learning in medical practice. Rickmansworth, UK: Manticore Publishers Europe Limited; 1998. p. 171–80.
82. Merrill MD. First principles of instruction. Educ Technol Res Dev. 2002;50:43–59.


Author information



Corresponding author

Correspondence to Lawrence Cheung.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article


Cite this article

Cheung L. Using an instructional design model to teach medical procedures. Med Sci Educ. 2016;26:175–80.



Keywords

  • Instructional design model
  • Gagne’s theory of instruction
  • Medical procedures
  • Percutaneous chest tube insertion