Integrating Instant Response System (IRS) as an In-Class Assessment Tool into Undergraduate Chemistry Learning Experience: Student Perceptions and Performance

  • Tzy-Ling Chen
  • Yan-Fu Lin
  • Yi-Lin Liu
  • Hsiu-Ping Yueh
  • Horn-Jiunn Sheen
  • Wei-Jane Lin


Since their introduction nearly a decade ago, instant response systems (IRS), also referred to as "clickers," have been widely adopted on college campuses and are particularly popular among instructors of large lecture classes. The available evidence suggests that IRS offers a promising avenue for future developments in pedagogy, though findings on whether effective IRS use improves or enhances student learning remain inconclusive. Given this gap, the main purpose of the present study is to examine the degree to which students perceive that using IRS in class affects their understanding of course content, engagement in classroom learning, and preparation for class tests. In addition, multiple student performance measures are used to explore correlations between student perceptions of IRS and actual learning outcomes. This chapter presents the learning experiences of 151 undergraduate students in a basic chemistry class at National Chung Hsing University in Taiwan that incorporated IRS as an in-class assessment tool. Overall, student perceptions of the in-class use of IRS were positive, and certain interactions between students' perceptions of IRS use and their performance in learning basic chemistry were identified. Although students reported perceived benefits and effectiveness of IRS use, the findings indicate that further studies are needed to probe what specifically about IRS use contributes to particular learning outcomes in a large chemistry class in higher education.


Keywords: Instructional design · Student perception · Classroom learning · Test tool · Interactive technology



Copyright information

© Springer Science+Business Media Dordrecht 2013

Authors and Affiliations

  • Tzy-Ling Chen (1)
  • Yan-Fu Lin (2)
  • Yi-Lin Liu (3)
  • Hsiu-Ping Yueh (3)
  • Horn-Jiunn Sheen (4)
  • Wei-Jane Lin (5)

  1. Graduate Institute of Bio-Industry Management, National Chung Hsing University, Taichung, Republic of China
  2. Department of Chemistry, National Chung Hsing University, Taichung, Republic of China
  3. Department of Bio-Industry Communication and Development, National Taiwan University, Taipei, Republic of China
  4. Institute of Applied Mechanics, National Taiwan University, Taipei, Republic of China
  5. Department of Library and Information Science, National Taiwan University, Taipei, Republic of China
