Effectively Using Classroom Response Systems for Improving Student Content Retention

  • Robert Collier
  • Jalal Kawash
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 865)


Classroom response systems are widely recognized as an effective tool for providing formative feedback and engaging students, but our research supports the hypothesis that these systems also provide opportunities for improving content retention. This is evidenced by an experiment we conducted on two distinct sections of an introductory computer science course, in which large collections of classroom response system questions were presented to the two sections at different stages. Questions offered immediately after the corresponding material served the express purpose of providing formative feedback, while questions presented later were expected to improve content retention. We then reviewed participant performance on the corresponding final examination questions, and statistical analyses indicate that participants performed better on the questions that corresponded to the classroom response system questions provided for content retention.



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Department of Computer Science, Carleton University, Ottawa, Canada
  2. Department of Computer Science, University of Calgary, Calgary, Canada
