Abstract
This paper describes the development and validation of an instrument for evaluating classroom response systems (CRS). Although a number of studies evaluating CRS have been published, no standardised instrument exists for evaluating the impact of using a CRS. As Kay and LeSage (2009) note, this makes it difficult to compare different systems, or to evaluate the benefits of using a CRS in different ways or settings, despite the number of published reports. We developed the classroom response system perceptions (CRiSP) questionnaire, which allows varied CRS to be evaluated on three scales: usability; impact on student engagement; and impact on student learning. The development of CRiSP was undertaken in three universities, using different CRS, and evaluated through focus groups, one-on-one interviews and a factor analysis of the survey responses. We found no evidence that responses on the scales differed by gender or age group. The final CRiSP questionnaire consists of 26 base questions, with additional optional questions available. We propose that the CRiSP questionnaire could, in its current form or with minor changes, also be used to evaluate the impact of other classroom technologies on learning.
References
Alexander CJ, Crescini WM, Juskewitch JE, Lachman N, Pawlina W (2009) Assessing the integration of audience response system technology in teaching of anatomical sciences. Anat Sci Educ 2:160–166
Ayu MA, Taylor K, Mantoro T (2009) Active learning: engaging students in the classroom using mobile phones. In: IEEE Symposium on Industrial Electronics and Applications, ISIEA, pp 711–715
Bachman L, Bachman C (2011) A study of classroom response system clickers: increasing student engagement and performance in a large undergraduate lecture class on architectural research. J Interact Learn Res 22(1):5–21
Barnett J (2006) Implementation of personal response units in very large lecture classes: student perceptions. Australasian J Educ Technol 22:474–494
Barraguérs JI, Morias A, Manterola J, Guisasola J (2011) Use of a classroom response system (CRS) for teaching mathematics in engineering with large groups. In: Mendez-Vilas A (ed) Education in a technological world: communicating current and emerging research and technological efforts. Formatex Research Center, pp 572–580
Beekes W (2006) The ‘millionaire’ method for encouraging participation. Act Learn High Educ 7:25–36
Bernaards CA, Sijtsma K (2010) Influence of imputation and EM methods on factor analysis when item nonresponse in questionnaire data is nonignorable. Multivar Behav Res 35(3):321–364
Berry J (2009) Technology support in nursing education: clickers in the classroom. Nurs Educ Res 30:295–298
Bode M, Drane D, Kolikant YBD, Schuller M (2009) A clicker approach to teaching calculus. Not Am Math Soc 56(2):253–256
Bruff D (2009) Teaching with classroom response systems. Jossey-Bass, San Francisco
Bunce DM, VandenPlas JR, Havanki KL (2006) Comparing the effectiveness on student achievement of a student response system versus online WebCT quizzes. J Chem Educ 83(3):488–493
Caldwell JE (2007) Clickers in the large classroom: current research and best-practice tips. CBE Life Sci Educ 6:9–20
Chan KC, Snavely J (2009) Do clickers ‘click’ in the classroom? J Financ Educ 35(2):25–40
Cronbach LJ (1951) Coefficient alpha and the internal structure of tests. Psychometrika 16(3):297–334
Crossgrove K, Curran KL (2008) Using clickers in nonmajors- and majors-level biology courses: student opinion, learning, and long-term retention of course material. CBE Life Sci Educ 7:146–154
Davis F (1989) Perceived usefulness, perceived ease of use and user acceptance of information technology. MIS Q 13(3):319–340
Davis F, Bagozzi RP, Warshaw PR (1989) User acceptance of computer technology: a comparison of two theoretical models. Manag Sci 35(8):982–1003
Draper SW, Brown MI (2004) Increasing interactivity in lectures using an electronic voting system. J Comput Assist Learn 20:81–94
Duggan PM, Palmer E, Devitt P (2007) Electronic voting to encourage interactive lectures: a randomised trial. BMC Med Educ 7:25
Dunn PK, Richardson A, McDonald C, Oprescu F (2012) Instructor perceptions of using a mobile-phone-based, free classroom response system in first-year statistics undergraduate courses. Int J Math Educ Sci Technol 43(8):1041–1056
Dunn PK, Richardson A, McDonald C, Oprescu F (2013) Mobile-phone-based classroom response systems: students’ perceptions of engagement and learning in a large undergraduate course. Int J Math Educ Sci Technol. doi:10.1080/0020739X.2012.756548
Elliott C (2003) Using a personal response system in economics teaching. Int Rev Econ Educ 1(1):80–86
Graham CR, Tripp TR, Seawright L, Joeckel GL III (2007) Empowering or compelling reluctant participators using audience response systems. Act Learn High Educ 8(3):233–258
Guthrie RW, Carlin A (2004) Waking the dead: using interactive technology to engage passive listeners in the classroom. In: Proceedings of the Tenth Americas Conference on Information Systems, New York
Guttman L (1945) A basis for analyzing test-retest reliability. Psychometrika 10(4):255–282
Han JH, Finkelstein A (2013) Understanding the effects of professors’ pedagogical development with clicker assessment and feedback technologies and the impact on students’ engagement and learning in higher education. Comput Educ 65:64–76
Hoekstra A (2008) Vibrant student voices: exploring effects of the use of clickers in large college courses. Learn Media Technol 33:329–341
Holm S (1979) A simple sequentially rejective multiple test procedure. Scand J Stat 6:65–70
Horn JL (1965) A rationale and test for the number of factors in factor analysis. Psychometrika 30:179–185
Ismail K (2008) Unravelling factor analysis. Evid Based Ment Health 11(4):99–102
Kaleta R, Joosten T (2007) Student response systems: a University of Wisconsin system study of clickers. Educause Cent Appl Res Res Bull 2007:1–12
Kay RH, LeSage A (2009) Examining the benefits and challenges of using audience response systems: a review of the literature. Comput Educ 53:819–827
Koppel N, Berenson M (2009) Ask the audience—Using clickers to enhance introductory business statistics courses. Inf Syst Educ J 7(92):1–18
Kundisch D, Magenheim J, Beutner M, Hermann P, Reinhardt W, Zokye A (2013) Classroom response systems. Inform Spektrum 36(4):389–393
Kyei-Blankson L, Cheesman E, Blankson J (2009) The value added effect of using clickers in a graduate research methods and statistics course. In: Gibson I (ed) Proceedings of the society for information technology and teacher education international conference. AACE, Chesapeake, pp 1947–1952
Lantz ME (2010) The use of ‘Clickers’ in the classroom: teaching innovation or merely an amusing novelty? Comput Hum Behav 26:556–561
Li P (2007) Creating and evaluating a new clicker methodology. PhD thesis, Ohio State University
Lozanovski C, Haeusler C, Tobin P (2011) Incorporating student response systems in mathematics classes. In: Hannah J, Thomas M (eds) Te ara mokoroa: the long abiding path of knowledge: proceedings of volcanic delta. University of Canterbury and The University of Auckland, Rotorua, pp 228–237
Lucke T, Dunn P, Keyssner U (2013) The use of a classroom response system to more effectively flip the classroom. Frontiers in education conference: energizing the future. IEEE, Oklahoma City, pp 103–104
Matsunaga M (2010) How to factor-analyze your data right: do’s, don’ts and how-to’s. Int J Psychol Res 3(1):97–110
Mayer RE, Stull A, DeLeeuw K, Almeroth K, Bimber B, Chun D, Bulger M, Campbell J, Knight A, Zhang H (2009) Clickers in college classrooms: fostering learning with questioning methods in large lecture classes. Contemp Educ Psychol 34:51–57
McGowan HM, Gunderson BK (2010) A randomized experiment exploring how certain features of clicker use effect undergraduate students’ engagement and learning in statistics. Technol Innov Stat Educ 4(1):1–29
Pallant J (2002) SPSS Survival manual: a step by step guide to data analysis using SPSS. Allen and Unwin, Crows Nest
Palmer EJ, Devitt PG, De Young NJ, Morris D (2005) Assessment of an electronic voting system within the tutorial setting: a randomised controlled trial. BMC Med Educ 5(1):1–8
Pekrun R, Goetz T, Titz W, Perry RP (2002) Academic emotions in students’ self-regulated learning and achievement: a program of qualitative and quantitative research. Educ Psychol 37:91–105
Penuel WR, Boscardin CK, Masyn K, Crawford VM (2007) Teaching with student response systems in elementary and secondary education settings: a survey study. Educ Tech Res Dev 55:315–346
R Core Team (2013) R: A language and environment for statistical computing. R Foundation for Statistical Computing, Vienna, Austria. ISBN 3-900051-07-0. http://www.R-project.org/
Revelle W (2013) psych: procedures for psychological, psychometric, and personality research. R package version 1.3.2
Rosseel Y (2012) lavaan: an R package for structural equation modeling. J Stat Softw 48(2):1–36
Schackow TE, Chavez M, Loya L, Friedman M (2004) Audience response system: effect on learning in family medicine residents. Fam Med 36(7):496–504
Schau C, Stevens J, Dauphinee T, Del Vecchio A (1995) The development and validation of the survey of attitudes toward statistics. Educ Psychol Meas 55:868–875
Schreiber JB, Nora A, Stage FK, Barlow EA, King J (2006) Reporting structural equation modeling and confirmatory factor analysis results: a review. J Educ Res 99(6):323–338
Scornavacca E, Huff S, Marshall S (2009) Mobile phones in the classroom: if you can’t beat them, join them. Commun ACM 52(4):142–148
Siau K, Sheng H, Nah FF-H (2006) Use of a classroom response system to enhance classroom interactivity. IEEE Trans Educ 49(3):398–403
Stowell JR, Nelson JM (2007) Benefits of electronic audience response systems on student participation, learning, and emotion. Teach Psychol 34(4):253–258
Titman AC, Lancaster GA (2011) Personal response systems for teaching postgraduate statistics to small groups. J Stat Educ 19(2):1–20
Trees AR, Jackson MH (2007) The learning environment in clicker classrooms: student processes of learning and involvement in large university-level courses using student response systems. Learn Media Technol 32:21–40
Trowler V, Trowler P (2010) Student engagement evidence summary. Commissioned technical report. Higher Education Academy, New York
Watkins J, Mazur E (2013) Retaining students in science, technology, engineering, and mathematics (STEM) majors. J Coll Sci Teach 42(5):36–41
Williams B, Lewis B, Boyle M, Brown T (2011) The impact of wireless keypads in an interprofessional education context with health science students. Br J Educ Technol 42(2):337–350
Wood WB (2004) Clickers: a teaching gimmick that works. Dev Cell 7:796–798
Acknowledgments
The authors acknowledge financial support from the USC’s Open Learning and Teaching Grants Scheme (OLTGP2011/7) and thank Mr Frank Muller and Ms Zanubia Hussain for assistance with the data collection. The contributions of the reviewers are also gratefully acknowledged.
Appendix: The items in the revised questionnaire
The 26 ordinal scale items used in the revised questionnaire. All items are answered on a five-point ordinal scale (from Strongly Disagree to Strongly Agree).
| Short description | Complete wording |
|---|---|
| Wasted time | Using [CRS] in lectures wasted too much time |
| Recommend use | I would recommend that the lecturer continue to use [CRS] |
| Overall value | The use of [CRS] helped increase the classes’ overall value |
| Motivation | [CRS] used in this unit motivated me to learn |
| Interaction | I found this method of interaction between students and lecturer effective |
| Instant feedback | [CRS] helped me get instant feedback on what I knew and didn’t know |
| Peer awareness | The use of [CRS] helped increase my awareness of my peers’ opinions and attitudes |
| Understand concepts | [CRS] allows me to better understand key concepts |
| Instructors used results | My instructor used the results from [CRS] questions to gauge class understanding and reinforce material that was not understood |
| Enhanced learning | Using [CRS] questions enhanced my learning of the subject |
| Control over learning | I believe that [CRS] provided me with more control over my learning than in units that do not use [CRS] |
| Think deeply | Using [CRS] helped me think more deeply about course materials |
| Correct but not understand | I often voted for the right answer without really understanding |
| More confident | Using [CRS] made me more confident to participate in class |
| Mostly used | I used [CRS] most times when it was used in class |
| Increased participation | [CRS] increased the frequency of my direct participation in the course |
| Active | The use of [CRS] helped me to be active in class |
| Pay attention | Using [CRS] helped me pay more attention in class |
| Concentrate | Using [CRS] has helped my concentration levels in lectures |
| Attendance | Using [CRS] has encouraged me to attend lectures |
| Easy to use | For me it was easy to use the [CRS] voting system |
| Too difficult | For me [CRS] was too difficult to use |
| Expectations hard | It was too hard to know what was expected of me using [CRS] |
| Tech problems | There were too many technological problems using [CRS] |
| Increased enjoyment | Using [CRS] has increased my enjoyment of lectures |
| Anonymity good | Other students could not see my answers, which encouraged me to be an active participant in the class |
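The items above are scored 1 (Strongly Disagree) to 5 (Strongly Agree) and grouped into scales whose internal consistency the paper assesses with Cronbach's alpha (Cronbach 1951). As a minimal sketch of that reliability check, the following Python function computes alpha for a respondents-by-items matrix; the response matrix shown is hypothetical, not data from the study:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha (Cronbach 1951) for an (n_respondents, n_items)
    matrix of Likert responses coded 1-5."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items in the scale
    item_vars = scores.var(axis=0, ddof=1)       # variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 6 students answering a 4-item scale
responses = [
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
]
print(round(cronbach_alpha(responses), 2))  # values near or above 0.7 usually indicate acceptable reliability
```

In practice the paper's analysis was done in R (R Core Team 2013) with the psych package (Revelle 2013), whose `alpha` function performs the same calculation with additional diagnostics.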
Other questions
| Short description | Complete wording |
|---|---|
| Why vote | When you did choose to vote in a [CRS] question, what was your reason for choosing to participate? Please select all that apply. |
| Why not vote | When you did not vote in a [CRS] question, what was your reason for choosing not to participate? Please select all that apply. |
| Age | Age |
| Gender | Gender |
The following items are all answered on the five-point ordinal scale (from Strongly Disagree to Strongly Agree) unless noted otherwise.
Items for use with CRS that use phone-based technology:

- Do you have access to a phone in class? (Possible responses: all classes; most classes; some classes; never.)
- What company is your phone provider? (As used by Scornavacca et al. 2009.) Students should select from a closed list of options, plus an “Other (specify)” option.
- I was distracted from the class after using my phone in class.
- I was distracted by other people using their phones in class after a [CRS] question was asked.
- I often failed to get a sufficiently strong signal to be able to use my phone.
- I didn’t like using my phone to vote.
For instructors who have used, or who are considering using, CRS technology to contribute towards grades:
- I would like to receive credit for the correct responses chosen while using [CRS]. That is, I believe that responses should be graded in some fashion.
- I would like to receive credit for using [CRS]. That is, I believe that participation in the voting using [CRS] should contribute towards grades in this course.
For CRS that require students to register before they can use the technology:

- Registering to use the [CRS] was too much hassle for me.
- Registering to use the [CRS] was too difficult for me.
- Registering to use the [CRS] was too time-consuming for me.
- I did not want to register to use the [CRS] as I was required to supply some personal details.
For situations where students’ use of the CRS is anonymous to the instructor:

- I was encouraged to vote because the lecturer did not know how I voted.
Cite this article
Richardson, A.M., Dunn, P.K., McDonald, C. et al. CRiSP: An Instrument for Assessing Student Perceptions of Classroom Response Systems. J Sci Educ Technol 24, 432–447 (2015). https://doi.org/10.1007/s10956-014-9528-2