Designing online software for teaching the concept of variable that facilitates mental interaction with the material: a systemic approach
Our case study responds to the need for research and reporting on the specific strategies software designers employ to produce effective multimedia instructional solutions. We present a systemic approach for identifying appropriate software features and for conducting a formative evaluation that assesses both the overall effectiveness of the multimedia instructional design and the effectiveness of specific features of the software. The instructional software for teaching the concept of variable was designed as a research platform and tested with 90 undergraduate students at a Midwestern university. Behavior tracking and data collection instruments (a pre-test, surveys, and a delayed post-test) were embedded in the software. As part of the design-engineering-development approach, two potential types of feedback, single try versus two tries, were tested in two experimental conditions. The results of the formative evaluation provided preliminary evidence of the effectiveness of the designed software with either type of feedback, for both high and low prior knowledge students. An innovative instructional strategy for helping the learner mindfully process the program feedback is described.
Keywords: Formative evaluation · Instructional design · Web-based interactive multimedia · Computer-assisted instruction · Software development · Design experiment
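The two experimental feedback conditions named in the abstract (single try versus two tries) can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the study's actual implementation: the function names, messages, and item structure are all assumptions; only the distinction between revealing the correct answer after one attempt versus allowing one retry first comes from the abstract.

```python
def give_feedback(answer: str, correct: str, attempt: int, max_tries: int) -> str:
    """Return a feedback message for one attempt at an item."""
    if answer == correct:
        return "Correct!"
    if attempt < max_tries:
        # Only reachable in the two-tries condition (max_tries == 2).
        return "Not quite -- try again."
    # Knowledge of correct response once attempts are exhausted.
    return f"The correct answer is: {correct}"

def run_item(responses: list, correct: str, max_tries: int) -> list:
    """Simulate a learner's attempts at one item; return the feedback transcript."""
    transcript = []
    for attempt, answer in enumerate(responses, start=1):
        msg = give_feedback(answer, correct, attempt, max_tries)
        transcript.append(msg)
        if answer == correct or attempt == max_tries:
            break
    return transcript
```

In the single-try condition (`max_tries=1`) a wrong first answer immediately yields the correct response, whereas in the two-tries condition (`max_tries=2`) the learner is prompted to retry before the answer is revealed.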