
On the Design of Instruction and Assessment

  • Chwee Beng Lee
  • Jimmie Leppink
  • José Hanham

Abstract

One of the greatest mistakes instructional designers make is to build instruction on simplistic strategies without giving much thought to a systematic overarching framework. The benefit of an overarching framework for any instruction is that it brings coherence and consistency to the design, planning, implementation, and evaluation. In this chapter, we argue for the importance of placing problem solving at the centre of instructional design in the identified high-stakes learning contexts, and we provide a variety of instructional design guidelines. Moreover, assessing learners’ performance is one of the most important components of instruction, if not the most important. To assess performance meaningfully, instructional designers need to clearly identify descriptors of the required actions or thoughts and align these with the learning outcomes. These descriptors usually take the form of rubrics, which can be used both to observe learners’ performance and to assess how learners articulate their thoughts. In this chapter, we discuss the key elements and components of rubrics in relation to the conditions of various high-stakes learning environments.
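To make the idea of outcome-aligned descriptors concrete, the following is a minimal sketch in Python, assuming a simple analytic rubric in which each criterion is aligned with a learning outcome and carries observable descriptors for a small set of performance levels. The criterion names, outcomes, levels, and the `Rubric.score` helper are illustrative assumptions made for this sketch, not elements specified in the chapter.

```python
from dataclasses import dataclass


@dataclass
class Criterion:
    """One rubric criterion, aligned with a stated learning outcome."""
    name: str              # e.g. "Problem representation" (illustrative)
    learning_outcome: str  # the outcome this criterion is aligned with
    descriptors: dict      # performance level (int) -> observable descriptor (str)


@dataclass
class Rubric:
    """An analytic rubric: a set of criteria, each scored independently."""
    criteria: list

    def score(self, observations: dict) -> dict:
        """Record the judged performance level for each criterion.

        `observations` maps criterion names to the level an assessor judged;
        a level with no descriptor in the rubric is rejected.
        """
        scores = {}
        for criterion in self.criteria:
            level = observations[criterion.name]
            if level not in criterion.descriptors:
                raise ValueError(
                    f"No descriptor for level {level} on '{criterion.name}'")
            scores[criterion.name] = level
        return scores


# Illustrative two-criterion rubric for a problem-solving task.
rubric = Rubric(criteria=[
    Criterion(
        name="Problem representation",
        learning_outcome="Represent an ill-structured problem accurately",
        descriptors={1: "Restates surface features only",
                     2: "Identifies key variables and constraints",
                     3: "Builds an integrated model of the problem"}),
    Criterion(
        name="Justification of decisions",
        learning_outcome="Articulate and defend solution decisions",
        descriptors={1: "Decisions stated without reasons",
                     2: "Reasons given but not linked to evidence",
                     3: "Decisions justified with evidence and trade-offs"}),
])

print(rubric.score({"Problem representation": 2,
                    "Justification of decisions": 3}))
# -> {'Problem representation': 2, 'Justification of decisions': 3}
```

Keeping the learning outcome stored alongside each criterion makes the alignment between descriptors and outcomes explicit and easy to audit when the rubric is reviewed or revised.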


Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. Western Sydney University, Penrith, Australia
  2. Maastricht University, Maastricht, The Netherlands
