
Learning Environments Research, Volume 19, Issue 2, pp 153–167

Effectiveness of student response systems in terms of learning environment, attitudes and achievement

  • Stephen T. Cohn
  • Barry J. Fraser
Original Paper

Abstract

To investigate the effectiveness of using Student Response Systems (SRS) among grade 7 and 8 science students in New York, the How Do You Feel About This Class? (HDYFATC) questionnaire was administered to 1097 students (532 who used SRS and 565 who did not). Data analyses attested to the sound factorial validity and internal consistency reliability of the HDYFATC, as well as its ability to differentiate between the perceptions of students in different classrooms. Very large differences between users and non-users of SRS, ranging from 1.17 to 2.45 standard deviations across the learning environment scales, attitudes and achievement, supported the efficacy of using SRS.
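The differences of 1.17 to 2.45 standard deviations reported above are standardised effect sizes. As a rough illustration only (not the authors' analysis code), the sketch below shows how such an effect size (Cohen's d) can be computed from the scale scores of two groups; the scores used here are synthetic placeholders, not the study's data.

```python
# Minimal sketch, assuming scale scores for each group are available as arrays.
# The numbers below are synthetic placeholders, not data from the study.
import numpy as np

def cohens_d(users: np.ndarray, non_users: np.ndarray) -> float:
    """Difference in group means divided by the pooled standard deviation."""
    n1, n2 = len(users), len(non_users)
    pooled_var = ((n1 - 1) * users.var(ddof=1) +
                  (n2 - 1) * non_users.var(ddof=1)) / (n1 + n2 - 2)
    return (users.mean() - non_users.mean()) / np.sqrt(pooled_var)

rng = np.random.default_rng(0)
srs_scores = rng.normal(4.0, 0.6, 532)     # hypothetical scale scores, SRS users
no_srs_scores = rng.normal(3.2, 0.6, 565)  # hypothetical scale scores, non-users
print(f"Effect size: {cohens_d(srs_scores, no_srs_scores):.2f} standard deviations")
```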

Keywords

Achievement · Attitudes · Learning environment · Middle-school science · Student Response Systems (SRS)


Copyright information

© Springer Science+Business Media Dordrecht 2015

Authors and Affiliations

  1. Curtin University, Perth, Australia
