Abstract
This article describes the design of an experimental protocol (its elements, considerations, and formalization) for creating databases of physiological and behavioral signals captured from college students while they perform a learning activity. The main goal is to define a formal data-capture protocol that yields a database suitable for the study of learning-centered emotions. Recognizing emotions in specific contexts is a fundamental task, and it is generally part of the data-treatment stage in research aimed at automatically identifying emotions in educational environments (such as interest, boredom, confusion, and frustration, according to [1]).
For the capture itself, we propose merging data from several technologies for acquiring physiological and behavioral signals, with the aim of integrating a broad and diverse data set.
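Merging streams recorded by different devices requires aligning them on a common time base, since each sensor samples at its own rate and clock. As an illustration only (the function and parameter names below are hypothetical, not part of the protocol described in the paper), a minimal Python sketch of pairing each sample of a base stream with the nearest-in-time sample of a second stream:

```python
from bisect import bisect_left

def align_nearest(base_times, other_times, other_values, max_skew=0.05):
    """For each timestamp in a base stream (e.g. webcam frames),
    pick the nearest sample from another stream (e.g. skin conductance),
    discarding pairs whose time difference exceeds max_skew seconds.
    other_times must be sorted ascending."""
    aligned = []
    for t in base_times:
        i = bisect_left(other_times, t)
        # candidates: the sample just before and just after t
        candidates = [j for j in (i - 1, i) if 0 <= j < len(other_times)]
        j = min(candidates, key=lambda k: abs(other_times[k] - t))
        if abs(other_times[j] - t) <= max_skew:
            aligned.append((t, other_values[j]))
        else:
            aligned.append((t, None))  # no sample close enough in time
    return aligned
```

A nearest-timestamp join of this kind is one simple way to fuse low-rate behavioral annotations with higher-rate physiological signals before feature extraction; in practice a dedicated tool (e.g. a database join on timestamps) may be preferable.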
References
D’Mello, S., Graesser, A.: Dynamics of affective states during complex learning. Learn. Instr. 22(2), 145–157 (2012). https://doi.org/10.1016/j.learninstruc.2011.10.001
Fuentes, C., Herskovic, V., Rodríguez, I., Gerea, C., Marques, M., Rossel, P.O.: A systematic literature review about technologies for self-reporting emotional information. J. Ambient Intell. Humaniz. Comput. 1–14 (2016). https://doi.org/10.1007/s12652-016-0430-z
Ekman, P.: Emotions Revealed. Recognizing Faces and Feelings to Improve Communication and Emotional Life, 1st edn. Henry Holt and Company, New York (2003)
Russell, J.A.: A circumplex model of affect. J. Pers. Soc. Psychol. 39(6), 1161–1178 (1980). https://doi.org/10.1037/h0077714
Ekman, P., Friesen, W., Hager, J.: Facial Action Coding System. The Manual, Salt Lake City (2002)
Soleymani, M., Lee, J.: DEAP: a database for emotion analysis using physiological signals. IEEE Trans. Affect. Comput. 3(1), 18–31 (2012)
el Kaliouby, R., Picard, R.W.: Affectiva Database. MIT Media Laboratory (2019). https://www.affectiva.com. Accessed 08 Apr 2019
Arroyo, I., Cooper, D.G., Burleson, W., Woolf, B.P., Muldner, K., Christopherson, R.: Emotion sensors go to school. Front. Artif. Intell. Appl. 200(1), 17–24 (2009). https://doi.org/10.3233/978-1-60750-028-5-17
Livingstone, S.R., Russo, F.A.: The Ryerson audio-visual database of emotional speech and song (RAVDESS): a dynamic, multimodal set of facial and vocal expressions in North American English. PLoS ONE 13(5), 14–18 (2018). https://doi.org/10.1371/journal.pone.0196391
Freitas-Magalhães, A.: Facial Action Coding System 3.0: Manual of Scientific Codification of the Human Face (English edition). FEELab Science Books, Porto (2018)
Lucey, P., Cohn, J.F., Kanade, T., Saragih, J., Ambadar, Z.: The extended Cohn-Kanade dataset (CK+): a complete facial expression dataset for action unit and emotion-specified expression. In: 2010 IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 94–101 (2010)
Lyons, M.J., Gyoba, J., Kamachi, M.: Japanese Female Facial Expressions (JAFFE), Database of Digital Images (1997)
Valstar, M.F., Pantic, M.: Induced disgust, happiness and surprise: an addition to the MMI facial expression database. In: Seventh International Conference on Language Resources and Evaluation (2010)
Sneddon, I., McRorie, M., McKeown, G., Hanratty, J.: The Belfast induced natural emotion database. IEEE Trans. Affect. Comput. 3(1), 32–41 (2012)
Mavadati, S.M., Mahoor, M.H., Bartlett, K., Trinh, P., Cohn, J.F.: DISFA: a spontaneous facial action intensity database. IEEE Trans. Affect. Comput. 6(1), 1–13 (2013)
Aifanti, N., Papachristou, C., Delopoulos, A.: The MUG facial expression database. In: 11th International Workshop on Image Analysis for Multimedia Interactive Services WIAMIS 2010, pp. 1–4 (2010). https://doi.org/10.1371/journal.pone.0009715
Happy, S.L., Patnaik, P., Routray, A., Guha, R.: The Indian spontaneous expression database for emotion recognition. IEEE Trans. Affect. Comput. 8(1), 131–142 (2017). https://doi.org/10.1109/TAFFC.2015.2498174
Langner, O., Dotsch, R., Bijlstra, G., Wigboldus, D.H.J., Hawk, S.T., van Knippenberg, A.: Presentation and validation of the Radboud Faces Database. Cogn. Emot. 24(8), 1377–1388 (2010). https://doi.org/10.1080/02699930903485076
Zhao, G.: Oulu-CASIA NIR&VIS facial expression database. Center for Machine Vision and Signal Analysis, University of Oulu, Oulu, Finland. http://www.cse.oulu.fi/wsgi/CMV/Downloads/Oulu-CASIA. Accessed 04 Apr 2019
Aneja, D., Colburn, A., Faigin, G., Shapiro, L., Mones, B.: Modeling stylized character expressions via deep learning. In: Lai, S.-H., Lepetit, V., Nishino, K., Sato, Y. (eds.) ACCV 2016. LNCS, vol. 10112, pp. 136–153. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-54184-6_9
Mollahosseini, A., Hasani, B., Mahoor, M.H.: AffectNet: a database for facial expression, valence, and arousal computing in the wild. IEEE Trans. Affect. Comput. 10(1), 18–31 (2017). https://doi.org/10.1109/TAFFC.2017.2740923
Mena-Chalco, R., Marcondes, J., Velho, L.: Banco de Dados de Faces 3D: IMPA-FACE3D, Brasil (2008)
Thomaz, C.E.: FEI face database. Department of Electrical Engineering. Centro Universitario da FEI, São Bernardo do Campo, São Paulo, Brazil (2012). https://fei.edu.br/~cet/facedatabase.html. Accessed 04 Apr 2019
Zafeiriou, S., Kollias, D., Nicolaou, M.A., Papaioannou, A., Kotsia, I.: Aff-Wild: valence and arousal ‘in-the-wild’ challenge. In: 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), pp. 1980–1987 (2017)
Nye, B., et al.: Analyzing learner affect in a scenario-based intelligent tutoring system. In: André, E., Baker, R., Hu, X., Rodrigo, Ma.Mercedes T., du Boulay, B. (eds.) AIED 2017. LNCS (LNAI), vol. 10331, pp. 544–547. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-61425-0_60
Xiao, X., Pham, P., Wang, J.: Dynamics of affective states during MOOC learning. In: André, E., Baker, R., Hu, X., Rodrigo, Ma.Mercedes T., du Boulay, B. (eds.) AIED 2017. LNCS (LNAI), vol. 10331, pp. 586–589. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-61425-0_70
Zataraín, R., Barrón, M.L., González, F., Reyes-García, C.A.: An affective and web 3.0 based learning environment for a programming language. Telemat. Inform. (2017). https://doi.org/10.1016/j.tele.2017.03.005
Zatarain-Cabada, R., Barrón-Estrada, M.L., González-Hernández, F., Oramas-Bustillos, R., Alor-Hernández, G., Reyes-García, C.A.: Building a corpus and a local binary pattern recognizer for learning-centered emotions. Adv. Comput. Intell. II, 524–535 (2017)
Zatarain-Cabada, R., Barron-Estrada, M.L., González-Hernández, F., Rodríguez-Rangel, H.: Building a face expression recognizer and a face expression database for an intelligent tutoring system. In: Proceedings of the IEEE 17th International Conference on Advanced Learning Technologies (ICALT 2017), pp. 391–393 (2017). https://doi.org/10.1109/icalt.2017.141
Bosch, N., D’Mello, S.: The affective experience of novice computer programmers. Int. J. Artif. Intell. Educ. 27(1), 181–206 (2015). https://doi.org/10.1007/s40593-015-0069-5
Harley, J.M., Bouchet, F., Azevedo, R.: Aligning and comparing data on emotions experienced during learning with MetaTutor. In: Lane, H.C., Yacef, K., Mostow, J., Pavlik, P. (eds.) AIED 2013. LNCS (LNAI), vol. 7926, pp. 61–70. Springer, Heidelberg (2013). https://doi.org/10.1007/978-3-642-39112-5_7
Graesser, A.C., D’Mello, S.: Emotions during the learning of difficult material. In: Psychology of Learning and Motivation, vol. 57 (2012). https://doi.org/10.1016/b978-0-12-394293-7.00005-4
Arana-Llanes, J.Y., González-Serna, G., Pineda-Tapia, R., Olivares-Peregrino, V., Ricarte-Trives, J.J., Latorre-Postigo, J.M.: EEG lecture on recommended activities for the induction of attention and concentration mental states on e-learning students. J. Intell. Fuzzy Syst. 34, 3359–3371 (2017)
Bosch, N., et al.: Detecting student emotions in computer-enabled classrooms. In: Proceedings of the 25th International Joint Conference on Artificial Intelligence (IJCAI 2016), pp. 4125–4129 (2016)
Bosch, N., D’mello, S.K., Ocumpaugh, J., Baker, R.S., Shute, V.: Using video to automatically detect learner affect in computer-enabled classrooms. ACM Trans. Interact. Intell. Syst. 6(2), 1–26 (2016). https://doi.org/10.1145/2946837
Bosch, N., et al.: Automatic detection of learning-centered affective states in the wild. In: Proceedings of the 20th International Conference on Intelligent User Interfaces - IUI 2015, pp. 379–388 (2015). https://doi.org/10.1145/2678025.2701397
Monkaresi, H., Bosch, N., Calvo, R.A., D’Mello, S.K.: Automated detection of engagement using video-based estimation of facial expressions and heart rate. IEEE Trans. Affect. Comput. 1–14 (2016). https://doi.org/10.1109/taffc.2016.2515084
Almohammadi, K., Hagras, H., Yao, B., Alzahrani, A., Alghazzawi, D., Aldabbagh, G.: A type-2 fuzzy logic recommendation system for adaptive teaching. Soft. Comput. 21(4), 965–979 (2017). https://doi.org/10.1007/s00500-015-1826-y
Bixler, R., D’Mello, S.: Towards automated detection and regulation of affective states during academic writing. In: Lane, H.C., Yacef, K., Mostow, J., Pavlik, P. (eds.) AIED 2013. LNCS (LNAI), vol. 7926, pp. 904–907. Springer, Heidelberg (2013). https://doi.org/10.1007/978-3-642-39112-5_142
González-Hernández, F., Zatarain-Cabada, R., Barrón-Estrada, M.L., Rodríguez-Rangel, H.: Recognition of learning-centered emotions using a convolutional neural network. J. Intell. Fuzzy Syst. 34, 3325–3336 (2017)
Steidl, S.: Automatic Classification of Emotion-Related User States in Spontaneous Children’s Speech. Universität Erlangen-Nürnberg, Erlangen (2009)
Cowie, R., et al.: Emotion recognition in human computer interaction. IEEE Signal Process. Mag. 18(1), 32–80 (2001). https://doi.org/10.1109/79.911197
Picard, R.W.: Affective Computing. MIT Press, Cambridge (1997)
Schlosberg, H.: Three dimensions of emotion. Psychol. Rev. 61(2), 81–88 (1954)
Wilson, B.G.: Constructivist Learning Environments: Case Studies in Instructional Design, p. 4. Educational Technology, New York City (1996)
Picard, R.W.: Affective computing: challenges. Int. J. Hum Comput Stud. 59(1–2), 55–64 (2003). https://doi.org/10.1016/S1071-5819(03)00052-1
Nijhawan, L., et al.: Informed consent: issues and challenges. J. Adv. Pharm. Technol. Res. 4(3), 134–140 (2013). https://doi.org/10.4103/2231-4040.116779
© 2019 Springer Nature Switzerland AG
Cite this paper
González-Meneses, Y.N., Guerrero-García, J., Reyes-García, C.A., Olmos-Pineda, I., González-Calleros, J.M. (2019). Formal Protocol for the Creation of a Database of Physiological and Behavioral Signals for the Automatic Recognition of Emotions. In: Ruiz, P., Agredo-Delgado, V. (eds) Human-Computer Interaction. HCI-COLLAB 2019. Communications in Computer and Information Science, vol 1114. Springer, Cham. https://doi.org/10.1007/978-3-030-37386-3_16
Print ISBN: 978-3-030-37385-6
Online ISBN: 978-3-030-37386-3