Journal of Science Education and Technology, Volume 21, Issue 3, pp 392–402

Assessing Multimedia Influences on Student Responses Using a Personal Response System

  • Kyle Gray
  • Katharine Owens
  • Xin Liang
  • David Steer


To date, research on personal response systems (clickers) has focused on external issues pertaining to the implementation of this technology, or has broadly measured student learning gains, rather than investigating differences in the responses themselves. Multimedia learning makes use of both words and pictures, and research from cognitive psychology suggests that combining words with illustrations improves student learning. This study analyzed response data from 561 students taking an introductory earth science course to determine whether including an illustration in a clicker question resulted in a higher percentage of correct responses than posing the question with text alone. Questions on topics pertaining to the solid earth were categorized as illustrated if they contained a picture or graph, and as text-only if they contained only text. For each question type, we calculated the percentage of correct responses for each student and compared the results to student ACT-reading, -math, and -science scores. A within-groups, repeated-measures analysis of covariance with instructor as the covariate yielded no significant difference between the percentages of correct responses to the text-only and the illustrated questions. Similar non-significant differences were obtained when students were grouped into quartiles according to their ACT-reading, -math, and -science scores. These results suggest that the way in which a conceptest question is written does not affect student responses and support the claim that conceptest questions are a valid formative assessment tool.
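The per-student scoring described in the abstract can be sketched in a few lines. The code below is an illustrative reconstruction, not the authors' analysis script: the student identifiers, question-type labels, and sample responses are hypothetical, and it shows only the first step (computing each student's percentage of correct answers for each question type), not the subsequent analysis of covariance.

```python
from collections import defaultdict

# Hypothetical response records: (student_id, question_type, correct_flag).
# Question types mirror the study's categories: "text" vs "illustrated".
responses = [
    ("s1", "text", 1), ("s1", "text", 0), ("s1", "illustrated", 1),
    ("s2", "text", 1), ("s2", "illustrated", 0), ("s2", "illustrated", 1),
    ("s3", "text", 0), ("s3", "illustrated", 1), ("s3", "text", 1),
]

def percent_correct(records):
    """Per-student percentage of correct answers for each question type."""
    totals = defaultdict(lambda: [0, 0])  # (student, type) -> [correct, attempts]
    for student, qtype, correct in records:
        cell = totals[(student, qtype)]
        cell[0] += correct
        cell[1] += 1
    return {key: 100.0 * c / n for key, (c, n) in totals.items()}

scores = percent_correct(responses)
print(scores[("s1", "text")])         # 50.0 (1 of 2 correct)
print(scores[("s3", "illustrated")])  # 100.0 (1 of 1 correct)
```

The resulting pairs of percentages (one text-only and one illustrated score per student) would then serve as the repeated measures in the study's analysis of covariance.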


Personal response systems · Clickers · Multimedia · Conceptest · ACT



Partial support for this work was provided by the National Science Foundation’s Division of Undergraduate Education program under Award No. 0716397. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation. We would also like to thank Dawn Del Carlo and Jeff Morgan for helpful suggestions on an earlier draft of the manuscript.



Copyright information

© Springer Science+Business Media, LLC 2011

Authors and Affiliations

  • Kyle Gray (1)
  • Katharine Owens (2)
  • Xin Liang (3)
  • David Steer (4)

  1. Department of Earth Science, University of Northern Iowa, Cedar Falls, USA
  2. Department of Curricular and Instructional Studies, University of Akron, Akron, USA
  3. Department of Educational Foundations and Leadership, University of Akron, Akron, USA
  4. Department of Geology and Environmental Science, University of Akron, Akron, USA
