Revisiting Assertion-Reason Question Format: Case of Information Security Course

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10473)

Abstract

Technology-enhanced learning is shaping the face of teaching and learning in innovative ways more than ever before. A number of higher education institutions, especially in sub-Saharan Africa, are fast-tracking the adoption of blended learning with a renewed focus on web-based learning, and the pressure on lecturers and faculties to deliver keeps increasing. In the area of assessment, multiple-choice questions have held sway and remain the de facto format where psychometric rigour and validity are of the essence. Assertion-reason question types, the higher-order variant of multiple-choice questions, have not received the same level of adoption and scrutiny. By revisiting Williams (2006), this paper contributes to the discourse on assertion-reason question types and to the body of knowledge in the domain of information security training and summative assessment. The paper affirms that assertion-reason questions are indeed challenging, that they align with the learning outcomes of an information security course, and that they contribute to aspects of sustainable assessment.
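For readers unfamiliar with the format: an assertion-reason question pairs an assertion with a reason and asks the candidate to judge the truth of each statement and whether the reason correctly explains the assertion. The following minimal Python sketch illustrates the conventional five-option ARQ answer scheme discussed in Williams (2006); the sample information-security item, its wording, and the helper names are hypothetical illustrations and are not drawn from the paper.

```python
# Minimal sketch of the conventional five-option assertion-reason (ARQ) scheme.
# The sample information-security item below is hypothetical, for illustration only.

OPTIONS = {
    "A": "Both assertion and reason are true, and the reason correctly explains the assertion.",
    "B": "Both assertion and reason are true, but the reason does not explain the assertion.",
    "C": "The assertion is true, but the reason is false.",
    "D": "The assertion is false, but the reason is true.",
    "E": "Both assertion and reason are false.",
}

# Hypothetical item: both statements are true, and the reason explains the assertion.
item = {
    "assertion": "Hashing passwords with a per-user salt mitigates rainbow-table attacks.",
    "reason": "A salt forces an attacker to recompute the hash table for each distinct salt value.",
    "key": "A",
}

def mark(response: str, key: str) -> bool:
    """Return True if the candidate's response matches the answer key."""
    return response.strip().upper() == key

if __name__ == "__main__":
    print(item["assertion"])
    print("BECAUSE")
    print(item["reason"])
    for letter, text in OPTIONS.items():
        print(f"{letter}. {text}")
    print("Correct:", mark("a", item["key"]))
```

The point of the format, as the paper argues, is that a candidate cannot answer by recognition alone: each option demands a judgement about two statements and the causal link between them, pushing the item above simple recall.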

Keywords

ARQ · Assertion reason · Question type · Sustainable assessment · Higher order · Information security

References

  1. Adesemowo, A.K., et al.: Text-based sustainable assessment: a case of first-year information and communication technology networking students. Stud. Educ. Eval. 55, 1–8 (2017)
  2. Adesemowo, A.K., et al.: The experience of introducing secure e-assessment in a South African university first-year foundational ICT networking course. Africa Educ. Rev. 13(1), 67–86 (2016)
  3. Beller, M.: Technologies in large-scale assessments: new directions, challenges, and opportunities. In: von Davier, M., et al. (eds.) The Role of International Large-Scale Assessments: Perspectives from Technology, Economy, and Educational Research, pp. 25–45. Springer Netherlands, Dordrecht (2013)
  4. Boud, D., Soler, R.: Sustainable assessment revisited. Assess. Eval. High. Educ. 41(3), 400–413 (2016)
  5. Clarke-Midura, J., Dede, C.: Assessment, technology, and change. J. Res. Technol. Educ. 42(3), 309–328 (2010)
  6. Dermo, J.: e-Assessment and the student learning experience: a survey of student perceptions of e-assessment. Br. J. Educ. Technol. 40(2), 203–214 (2009)
  7. Hassan, S., Wium, W.: Quality lies in the eyes of the beholder: a mismatch between student evaluation and peer observation of teaching. Africa Educ. Rev. 11(4), 491–511 (2014)
  8.
  9. Newhouse, C.P.: Using digital technologies to improve the authenticity of performance assessment for high-stakes purposes. Technol. Pedagog. Educ. 24(1), 17–33 (2013)
  10. Van Niekerk, J., von Solms, R.: Using Bloom’s taxonomy for information security education. In: Dodge, R.C., Futcher, L. (eds.) Information Assurance and Security Education and Training, pp. 280–287. Springer, Heidelberg (2013)
  11. Ogude, A.N., Bradley, J.D.: Ionic conduction and electrical neutrality in operating electrochemical cells: pre-college and college student interpretations. J. Chem. Educ. 71(1), 29 (1994)
  12. Paul, R.W.: Bloom’s taxonomy and critical thinking instruction: recall is not knowledge. In: Willsen, J., Binker, A.J. (eds.) Critical Thinking: What Every Person Needs To Survive in a Rapidly Changing World, pp. 519–526. Foundation for Critical Thinking, California (2012)
  13. Rust, C.: The unscholarly use of numbers in our assessment practices: what will make us change? Int. J. Scholarsh. Teach. Learn. 5(1), 1–6 (2011)
  14. Shute, V.J., et al.: Advances in the science of assessment. Educ. Assess. 21(1), 34–59 (2016)
  15. Sim, G.: Evidence based design of heuristics: usability and computer assisted assessment. University of Central Lancashire (2009)
  16. Sircar, S.S., Tandon, O.P.: Involving students in question writing: a unique feedback with fringe benefits. Am. J. Physiol. 277(6 Pt 2), S84–S91 (1999)
  17. Wiggins, G.P.: Assessing Student Performance: Exploring the Purpose and Limits of Testing. Wiley, New York (1993)
  18. Williams, J.B.: Assertion-reason multiple-choice testing as a tool for deep learning: a qualitative analysis. Assess. Eval. High. Educ. 31(3), 287–301 (2006)

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. School of ICT, Nelson Mandela University, Port Elizabeth, South Africa