A La Recherche du Temps Perdu, or As Time Goes By: Where Does the Time Go in a Reading Tutor That Listens?
Analyzing the time allocation of students’ activities in a school-deployed mixed-initiative tutor can be illuminating but surprisingly tricky. We discuss complementary methods we have used to understand how tutoring time is spent, such as analyzing sample videotaped sessions by hand and querying a database generated from session logs. We identify issues, methods, and lessons that may be relevant to other tutors. One theme is that iterative design of “non-tutoring” components can enhance a tutor’s effectiveness, not by improved teaching, but by reducing the time wasted on non-learning activities. Another is that it is possible to relate students’ time allocation to improvements in various outcome measures.
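To illustrate the second method, the sketch below shows how per-activity time shares might be computed from logged session events. This is a minimal, hypothetical example: the table name, column names, and sample durations are illustrative assumptions, not Project LISTEN’s actual database schema.

```python
import sqlite3

# Hypothetical schema: each row is one logged event with its duration.
# Column and table names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        student_id TEXT,
        activity   TEXT,   -- e.g. 'reading', 'picking_story', 'logging_in'
        seconds    REAL    -- duration of the logged event
    )
""")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [("s1", "reading", 120.0), ("s1", "picking_story", 45.0),
     ("s2", "reading", 200.0), ("s2", "logging_in", 30.0)],
)

# Time allocation: total and percentage of session time per activity.
rows = conn.execute("""
    SELECT activity,
           SUM(seconds) AS total,
           ROUND(100.0 * SUM(seconds) / (SELECT SUM(seconds) FROM events), 1)
               AS pct
    FROM events
    GROUP BY activity
    ORDER BY total DESC
""").fetchall()

for activity, total, pct in rows:
    print(f"{activity:15s} {total:7.1f}s {pct:5.1f}%")
```

A query of this shape makes it easy to see how much session time goes to non-learning activities (logging in, picking stories) versus actual reading, which is the kind of breakdown the hand-coded video analysis can then be checked against.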