Advancing a Multimodal Real-Time Affective Sensing Research Platform
Abstract
Expert human tutors focus approximately half of their interactions on the affective and motivational engagement of their students (Lepper, Woolverton, Mumme, & Gurtner, 1993). In stark contrast, the vast majority of Intelligent Tutoring Systems (ITS) pay little or no attention to students’ emotional experiences. To redress this, the Affective Agent Research Platform has been advanced, demonstrating the ability to sense elements of student frustration and respond in real time with affective support. While the work has been carried out in the context of a challenging educational activity, the architecture and the lessons from its implementation are broadly applicable and readily deployable in a wide range of settings, such as workplace, automotive, and assistive-care environments.
Keywords
Skin Conductance · Goal Mastery · Intelligent Tutor System · Support Dialogue · Character Engine
References
- Bailenson, J. N. (2005). Personal conversation with W. Burleson.
- Bickmore, T., & Picard, R. W. (2004). Establishing and maintaining long-term human-computer relationships. ACM Transactions on Computer-Human Interaction, 12(2), 293–327.
- Boucsein, W. (1992). Electrodermal activity. New York: Plenum Press.
- Burleson, W., Picard, R. W., & Perlin, K. (2004). A platform for affective agent research. International Conference on Autonomous Agents and Multiagent Systems, Columbia University, New York.
- Dennerlein, J., Becker, T., Johnson, P., Reynolds, C., & Picard, R. W. (2003). Frustrating computer users increases exposure to physical factors. Proceedings of the International Ergonomics Association, August 24–29, Seoul, Korea. http://affect.media.mit.edu/pdfs/03.dennerlein-etal.pdf
- Cooper, D. G., Arroyo, I., & Woolf, B. P. (2011). Actionable affective processing for automatic tutor interventions. In R. Calvo & S. D’Mello (Eds.), Explorations in the learning sciences, instructional systems and performance technologies. New York: Springer.
- Dweck, C. S. (1999). Self-theories: Their role in motivation, personality and development. Philadelphia: Psychology Press.
- Dweck, C. S. (2004). Personal conversation.
- Haro, A., Essa, I., & Flickner, M. (2000). Detecting and tracking eyes by using their physiological properties, dynamics and appearance. In Proceedings of the IEEE Computer Vision and Pattern Recognition 2000 Conference, Hilton Head, SC, June 2000. http://www.cc.gatech.edu/cpl/projects/pupil/pupil.pdf. Also available as Georgia Tech GVU Center Tech Report No. GIT-GVU-TR-99-46.
- Kapoor, A., Burleson, W., & Picard, R. (2007). Automatic prediction of frustration. International Journal of Human Computer Studies, 65(8), 724–736.
- Kapoor, A., Mota, S., & Picard, R. (2001). Towards a learning companion that recognizes affect. AAAI Fall Symposium, North Falmouth, MA.
- Kapoor, A., Picard, R. W., & Ivanov, Y. (2004). Probabilistic combination of multiple modalities to detect interest. International Conference on Pattern Recognition, Cambridge.
- Kapoor, A., & Picard, R. W. (2005). Multimodal affect recognition in learning environments. ACM MM’05, November 6–11, Singapore. http://web.media.mit.edu/~ash/papers/ACM2005.pdf
- Kapoor, A., Qi, Y., & Picard, R. W. (2003). Fully automatic upper facial action recognition. IEEE International Workshop on Analysis and Modeling of Faces and Gestures (AMFG 2003), held in conjunction with ICCV 2003, October 2003, Nice, France. http://vismod.media.mit.edu/pub/tech-reports/TR-571-ABSTRACT.html
- Klein, J., Moon, Y., & Picard, R. W. (2002). This computer responds to user frustration: Theory, design, results, and implications. Interacting with Computers, 14, 119–140.
- Lepper, M. R., Woolverton, M., Mumme, D. L., & Gurtner, J. L. (1993). Motivational techniques of expert human tutors: Lessons for the design of computer-based tutors. In S. P. Lajoie & S. J. Derry (Eds.), Computers as cognitive tools (pp. 75–105). Hillsdale: Erlbaum.
- Morency, L. P., Rahimi, A., & Darrell, T. (2003). Adaptive view-based appearance model. CVPR 2003, Wisconsin.
- Morency, L. P., Sundberg, P., & Darrell, T. (2003). Pose estimation using 3D view-based eigenspaces. ICCV Workshop on Analysis and Modeling of Face and Gesture, Nice.
- Mota, S., & Picard, R. W. (2003). Automated posture analysis for detecting learner’s interest level. In Workshop on Computer Vision and Pattern Recognition for Human-Computer Interaction, June 2003.
- Perlin, K. (1997). Layered compositing of facial expression. ACM SIGGRAPH 97, New York.
- Reynolds, C. (1999). The sensing and measurement of frustration with computers. Master’s thesis, Massachusetts Institute of Technology.
- Rueda, C. (2004). Personal conversation.
- Strauss, M., Reynolds, C., Hughes, S., Park, K., McDarby, G., & Picard, R. (2005). The HandWave Bluetooth skin conductance sensor. 1st International Conference on Affective Computing and Intelligent Interaction, Beijing.
- Tekscan. (1997). Tekscan body pressure measurement system user’s manual. South Boston, MA: Tekscan Inc.
- Videre Design. (2010). MEGA-DCS stereo camera website, SVS version 3.2 (2004). http://www.videredesign.com/