Journal of Science Education and Technology, Volume 24, Issue 4, pp 432–447

CRiSP: An Instrument for Assessing Student Perceptions of Classroom Response Systems

  • Alice M. Richardson
  • Peter K. Dunn (Email author)
  • Christine McDonald
  • Florin Oprescu


Abstract

This paper describes the development and validation of an instrument for evaluating classroom response systems (CRS). Although a number of studies evaluating CRS have been published to date, no standardised instrument exists for evaluating the impact of using a CRS. As Kay and LeSage (2009) indicate, this makes it very difficult to compare different systems, or to evaluate the benefits of using a CRS in different ways or settings, despite the number of published reports. We developed an instrument, the Classroom Response System Perceptions (CRiSP) questionnaire, which allows varied CRS to be evaluated on three scales: usability, impact on student engagement, and impact on student learning. The development of CRiSP was undertaken in three universities using different CRS, and evaluated through focus groups, one-on-one interviews and a factor analysis of the survey responses. We found no evidence of differences on the scales by gender or age group. The final CRiSP questionnaire consists of 26 base questions, with additional optional questions available. We propose that the CRiSP questionnaire could, in its current state or with minor changes, also be used to evaluate the impact of other classroom technologies on learning.


Keywords: CRS · Clickers · Learning · Teaching · Instrument development · Scale validation
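The scale validation summarised in the abstract rests on classical reliability analysis alongside factor analysis (see Cronbach 1951 and the R packages psych and lavaan in the reference list). As a minimal illustrative sketch, not the authors' actual analysis, the internal consistency of one CRiSP scale could be checked with Cronbach's alpha; the function name and the synthetic Likert-style responses below are hypothetical:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a (respondents x items) matrix of scale scores."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items on the scale
    item_vars = items.var(axis=0, ddof=1)      # sample variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1.0 - item_vars.sum() / total_var)

# Hypothetical responses: 5 students x 4 items on one CRiSP-style scale
responses = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
])
print(round(cronbach_alpha(responses), 2))  # → 0.93
```

Values near 1 indicate that the items move together and plausibly measure one construct; in practice a dedicated tool (such as the psych package in R, cited above) would also report confidence intervals and item-dropped alphas.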



Acknowledgments

The authors acknowledge financial support from the USC’s Open Learning and Teaching Grants Scheme (OLTGP2011/7) and thank Mr Frank Muller and Ms Zanubia Hussain for assistance with the data collection. The contributions of the reviewers are also gratefully acknowledged.


References

  1. Alexander CJ, Crescini WM, Juskewitch JE, Lachman N, Pawlina W (2009) Assessing the integration of audience response system technology in teaching of anatomical sciences. Anat Sci Educ 2:160–166
  2. Ayu MA, Taylor K, Mantoro T (2009) Active learning: engaging students in the classroom using mobile phones. In: IEEE Symposium on Industrial Electronics and Applications, ISIEA, pp 711–715
  3. Bachman L, Bachman C (2011) A study of classroom response system clickers: increasing student engagement and performance in a large undergraduate lecture class on architectural research. J Interact Learn Res 22(1):5–21
  4. Barnett J (2006) Implementation of personal response units in very large lecture classes: student perceptions. Australasian J Educ Technol 22:474–494
  5. Barraguérs JI, Morias A, Manterola J, Guisasola J (2011) Use of a classroom response system (CRS) for teaching mathematics in engineering with large groups. In: Mendez-Vilas A (ed) Education in a technological world: communicating current and emerging research and technological efforts. Formatex Research Center, pp 572–580
  6. Beekes W (2006) The ‘millionaire’ method for encouraging participation. Act Learn High Educ 7:25–36
  7. Bernaards CA, Sijtsma K (2010) Influence of imputation and EM methods on factor analysis when item nonresponse in questionnaire data is nonignorable. Multivar Behav Res 35(3):321–364
  8. Berry J (2009) Technology support in nursing education: clickers in the classroom. Nurs Educ Res 30:295–298
  9. Bode M, Drane D, Kolikant YBD, Schuller M (2009) A clicker approach to teaching calculus. Not Am Math Soc 56(2):253–256
  10. Bruff D (2009) Teaching with classroom response systems. Jossey-Bass, San Francisco
  11. Bunce DM, VandenPlas JR, Havanki KL (2006) Comparing the effectiveness on student achievement of a student response system versus online WebCT quizzes. J Chem Educ 83(3):488–493
  12. Caldwell JE (2007) Clickers in the large classroom: current research and best-practice tips. CBE Life Sci Educ 6:9–20
  13. Chan KC, Snavely J (2009) Do clickers ‘click’ in the classroom? J Financ Educ 35(2):25–40
  14. Cronbach LJ (1951) Coefficient alpha and the internal structure of tests. Psychometrika 16(3):297–334
  15. Crossgrove K, Curran KL (2008) Using clickers in nonmajors- and majors-level biology courses: student opinion, learning, and long-term retention of course material. CBE Life Sci Educ 7:146–154
  16. Davis F (1989) Perceived usefulness, perceived ease of use and user acceptance of information technology. MIS Q 13(3):319–340
  17. Davis F, Bagozzi RP, Warshaw PR (1989) User acceptance of computer technology: a comparison of two theoretical models. Manag Sci 35(8):982–1003
  18. Draper SW, Brown MI (2004) Increasing interactivity in lectures using an electronic voting system. J Comput Assist Learn 20:81–94
  19. Duggan PM, Palmer E, Devitt P (2007) Electronic voting to encourage interactive lectures: a randomised trial. BMC Med Educ 7:25
  20. Dunn PK, Richardson A, McDonald C, Oprescu F (2012) Instructor perceptions of using a mobile-phone-based, free classroom response system in first-year statistics undergraduate courses. Int J Math Educ Sci Technol 43(8):1041–1056
  21. Dunn PK, Richardson A, McDonald C, Oprescu F (2013) Mobile-phone-based classroom response systems: students’ perceptions of engagement and learning in a large undergraduate course. Int J Math Educ Sci Technol. doi:10.1080/0020739X.2012.756548
  22. Elliott C (2003) Using a personal response system in economics teaching. Int Rev Econ Educ 1(1):80–86
  23. Graham CR, Tripp TR, Seawright L, Joeckel GL III (2007) Empowering or compelling reluctant participators using audience response systems. Act Learn High Educ 8(3):233–258
  24. Guthrie RW, Carlin A (2004) Waking the dead: using interactive technology to engage passive listeners in the classroom. In: Proceedings of the Tenth Americas Conference on Information Systems, New York
  25. Guttman L (1945) A basis for analyzing test-retest reliability. Psychometrika 10(4):255–282
  26. Han JH, Finkelstein A (2013) Understanding the effects of professors’ pedagogical development with clicker assessment and feedback technologies and the impact on students’ engagement and learning in higher education. Comput Educ 65:64–76
  27. Hoekstra A (2008) Vibrant student voices: exploring effects of the use of clickers in large college courses. Learn Media Technol 33:329–341
  28. Holm S (1979) A simple sequentially rejective multiple test procedure. Scand J Stat 6:65–70
  29. Horn JL (1965) A rationale and test for the number of factors in factor analysis. Psychometrika 30:179–185
  30. Ismail K (2008) Unravelling factor analysis. Evid Based Ment Health 11(4):99–102
  31. Kaleta R, Joosten T (2007) Student response systems: a University of Wisconsin system study of clickers. Educause Cent Appl Res Res Bull 2007:1–12
  32. Kay RH, LeSage A (2009) Examining the benefits and challenges of using audience response systems: a review of the literature. Comput Educ 53:819–827
  33. Koppel N, Berenson M (2009) Ask the audience: using clickers to enhance introductory business statistics courses. Inf Syst Educ J 7(92):1–18
  34. Kundisch D, Magenheim J, Beutner M, Hermann P, Reinhardt W, Zokye A (2013) Classroom response systems. Inform Spektrum 36(4):389–393
  35. Kyei-Blankson L, Cheesman E, Blankson J (2009) The value added effect of using clickers in a graduate research methods and statistics course. In: Gibson I (ed) Proceedings of the society for information technology and teacher education international conference. AACE, Chesapeake, pp 1947–1952
  36. Lantz ME (2010) The use of ‘clickers’ in the classroom: teaching innovation or merely an amusing novelty? Comput Hum Behav 26:556–561
  37. Li P (2007) Creating and evaluating a new clicker methodology. PhD thesis, Ohio State University
  38. Lozanovski C, Haeusler C, Tobin P (2011) Incorporating student response systems in mathematics classes. In: Hannah J, Thomas M (eds) Te ara mokoroa: the long abiding path of knowledge: proceedings of volcanic delta. University of Canterbury and The University of Auckland, Rotorua, pp 228–237
  39. Lucke T, Dunn P, Keyssner U (2013) The use of a classroom response system to more effectively flip the classroom. In: Frontiers in Education Conference: energizing the future. IEEE, Oklahoma City, pp 103–104
  40. Matsunaga M (2010) How to factor-analyze your data right: do’s, don’ts and how-to’s. Int J Psychol Res 3(1):97–110
  41. Mayer RE, Stull A, DeLeeuw K, Almeroth K, Bimber B, Chun D, Bulger M, Campbell J, Knight A, Zhang H (2009) Clickers in college classrooms: fostering learning with questioning methods in large lecture classes. Contemp Educ Psychol 34:51–57
  42. McGowan HM, Gunderson BK (2010) A randomized experiment exploring how certain features of clicker use affect undergraduate students’ engagement and learning in statistics. Technol Innov Stat Educ 4(1):1–29
  43. Pallant J (2002) SPSS survival manual: a step by step guide to data analysis using SPSS. Allen and Unwin, Crows Nest
  44. Palmer EJ, Devitt PG, De Young NJ, Morris D (2005) Assessment of an electronic voting system within the tutorial setting: a randomised controlled trial. BMC Med Educ 5(1):1–8
  45. Pekrun R, Goetz T, Titz W, Perry RP (2002) Academic emotions in students’ self-regulated learning and achievement: a program of qualitative and quantitative research. Educ Psychol 37:91–105
  46. Penuel WR, Boscardin CK, Masyn K, Crawford VM (2007) Teaching with student response systems in elementary and secondary education settings: a survey study. Educ Tech Res Dev 55:315–346
  47. R Core Team (2013) R: a language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. ISBN 3-900051-07-0
  48. Revelle W (2013) psych: procedures for psychological, psychometric, and personality research. R package version 1.3.2
  49. Rosseel Y (2012) lavaan: an R package for structural equation modeling. J Stat Softw 48(2):1–36
  50. Schackow TE, Chavez M, Loya L, Friedman M (2004) Audience response system: effect on learning in family medicine residents. Fam Med 36(7):496–504
  51. Schau C, Stevens J, Dauphinee T, Del Vecchio A (1995) The development and validation of the survey of attitudes toward statistics. Educ Psychol Meas 55:868–875
  52. Schreiber JB, Nora A, Stage FK, Barlow EA, King J (2006) Reporting structural equation modeling and confirmatory factor analysis results: a review. J Educ Res 99(6):323–338
  53. Scornavacca E, Huff S, Marshall S (2009) Mobile phones in the classroom: if you can’t beat them, join them. Commun ACM 52(4):142–148
  54. Siau K, Sheng H, Nah FF-H (2006) Use of a classroom response system to enhance classroom interactivity. IEEE Trans Educ 49(3):398–403
  55. Stowell JR, Nelson JM (2007) Benefits of electronic audience response systems on student participation, learning, and emotion. Teach Psychol 34(4):253–258
  56. Titman AC, Lancaster GA (2011) Personal response systems for teaching postgraduate statistics to small groups. J Stat Educ 19(2):1–20
  57. Trees AR, Jackson MH (2007) The learning environment in clicker classrooms: student processes of learning and involvement in large university-level courses using student response systems. Learn Media Technol 32:21–40
  58. Trowler V, Trowler P (2010) Student engagement evidence summary. Commissioned technical report. Higher Education Academy, New York
  59. Watkins J, Mazur E (2013) Retaining students in science, technology, engineering, and mathematics (STEM) majors. J Coll Sci Teach 42(5):36–41
  60. Williams B, Lewis B, Boyle M, Brown T (2011) The impact of wireless keypads in an interprofessional education context with health science students. Br J Educ Technol 42(2):337–350
  61. Wood WB (2004) Clickers: a teaching gimmick that works. Dev Cell 7:796–798

Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  • Alice M. Richardson (1)
  • Peter K. Dunn (2, Email author)
  • Christine McDonald (3)
  • Florin Oprescu (2)

  1. Faculty of Information Sciences and Engineering, University of Canberra, Canberra, Australia
  2. Faculty of Science, Health, Education and Engineering, University of the Sunshine Coast, Sippy Downs, Australia
  3. Department of Mathematics and Computing, University of Southern Queensland, Toowoomba, Australia
