Lattice-Based Approach to Building Templates for Natural Language Understanding in Intelligent Tutoring Systems

  • Shrenik Devasani
  • Gregory Aist
  • Stephen B. Blessing
  • Stephen Gilbert
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6738)

Abstract

We describe ConceptGrid, a domain-independent authoring tool that helps non-programmers develop intelligent tutoring systems (ITSs) that perform natural language processing. The approach uses a lattice-style, table-driven interface to build templates that describe both the required concepts expected in a student’s response to a question and the incorrect concepts that reflect a student’s misunderstanding. The tool also helps provide customized just-in-time feedback based on the concepts present in or absent from the student’s response. It has been integrated and tested with xPST, a browser-based ITS authoring tool.
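To make the template idea concrete, the following is a minimal Python sketch of matching a student response against required and incorrect concepts and collecting just-in-time feedback. The names (Concept, check_response) and the example question are illustrative assumptions, not ConceptGrid’s actual API; the real tool builds such templates through its lattice-style table interface rather than code.

```python
# Hypothetical sketch of concept-template checking; not the ConceptGrid implementation.
from dataclasses import dataclass

@dataclass
class Concept:
    name: str
    phrasings: list[str]  # alternative wordings that count as expressing the concept
    feedback: str         # shown when a required concept is missing or an incorrect one appears

def contains(response: str, concept: Concept) -> bool:
    """Return True if any phrasing of the concept appears in the response."""
    text = response.lower()
    return any(p.lower() in text for p in concept.phrasings)

def check_response(response: str, required: list[Concept], incorrect: list[Concept]) -> list[str]:
    """Collect feedback for missing required concepts and for incorrect concepts that are present."""
    feedback = []
    for c in required:
        if not contains(response, c):
            feedback.append(c.feedback)
    for c in incorrect:
        if contains(response, c):
            feedback.append(c.feedback)
    return feedback

# Example usage for a hypothetical statistics question about choosing a test.
required = [Concept("sample-size", ["small sample", "n < 30"],
                    "Mention why the sample size matters for the choice of test.")]
incorrect = [Concept("z-test", ["z-test", "z test"],
                     "A z-test is not appropriate here because the population variance is unknown.")]

print(check_response("I would use a z-test since the data look normal.", required, incorrect))
```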

Keywords

Natural language processing · Intelligent tutoring system · Authoring tool

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Shrenik Devasani (1)
  • Gregory Aist (1)
  • Stephen B. Blessing (2)
  • Stephen Gilbert (1)

  1. VRAC, Iowa State University, Ames, USA
  2. University of Tampa, Tampa, USA
