ARGarden: Augmented Edutainment System with a Learning Companion
Recently, many researchers have studied agent-based edutainment systems to improve students’ learning experiences. In this paper, we present ARGarden, which lets users experience interactive flower gardening with a learning companion squatting on an augmented picture. The companion perceives users’ actions as well as situations in the learning environment and autonomously appraises the perceived information. It then provides peer support for participants’ problem-solving through anthropomorphic expressions. We developed our system on a mobile device and visualized the learning companion as an animated bluebird. We also demonstrated the implemented system at an exhibition and evaluated its effectiveness by observing participants’ responses to the demonstration. In this evaluation, we found that the bluebird, as a learning companion, helped users learn how to properly grow the flower in our edutainment setting. Finally, we expect that an augmented learning companion can be one of the key factors in developing effective edutainment applications.
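The companion's behavior described above follows a perceive-appraise-express cycle. The following minimal sketch illustrates one way such a cycle could be structured; it is not the authors' implementation, and all class names, appraisal rules, and animation cues are hypothetical:

```python
# Minimal sketch (assumed structure, not the paper's implementation) of a
# perceive-appraise-express loop for an augmented learning companion.
# All names and rules below are hypothetical illustrations.

from dataclasses import dataclass


@dataclass
class Observation:
    """A perceived user action plus the garden's current state."""
    action: str           # e.g. "water", "fertilize", "idle"
    soil_moisture: float  # 0.0 (dry) .. 1.0 (saturated)


class LearningCompanion:
    """Appraises observations and selects an anthropomorphic expression."""

    def appraise(self, obs: Observation) -> str:
        # Hypothetical appraisal rules: acknowledge helpful actions,
        # gently flag mistakes, and prompt the user when the flower is neglected.
        if obs.action == "water" and obs.soil_moisture > 0.8:
            return "concerned"     # over-watering
        if obs.action == "water":
            return "happy"
        if obs.action == "idle" and obs.soil_moisture < 0.2:
            return "prompting"     # the flower needs attention
        return "neutral"

    def express(self, appraisal: str) -> str:
        # Map the appraisal onto a bluebird animation cue (assumed asset names).
        animations = {
            "happy": "bluebird_chirp",
            "concerned": "bluebird_head_shake",
            "prompting": "bluebird_point_at_flower",
            "neutral": "bluebird_idle",
        }
        return animations[appraisal]


companion = LearningCompanion()
# Over-watering a saturated flower should trigger a cautionary animation.
cue = companion.express(companion.appraise(Observation("water", 0.9)))
print(cue)
```

The key design point this sketch captures is that appraisal (interpreting the situation) is decoupled from expression (choosing an animation), so new feedback behaviors can be added without changing perception logic.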