Abstract
Recent scientific evidence suggests that emotions are a ubiquitous element of human–computer interaction and should be considered when designing usable and intelligent systems. The Internet of Things is now used in many spheres of public life; its chief benefits are the automation of processes and the acceleration of activities. To achieve this, various intelligent devices are integrated, making the Internet of Things a complex system. Today, considerable effort is devoted to using these devices to increase the quality of human–computer interaction, and much attention is paid to examining a person's emotional state, with various tests carried out for this purpose. Comprehensive systems capable of evaluating such states are used in many areas. The process of emotion perception is divided into two levels, sensory and intellectual, and both levels develop and improve in each person. Sensor networks and the Internet of Things are therefore a good tool for assessing a person's emotional state, and, on the basis of the obtained data, the surroundings can be adapted to the resulting emotional state.
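The two-level pipeline sketched in the abstract can be illustrated with a minimal example: a sensory level that collects raw physiological readings from IoT devices, and an intellectual level that interprets them as an emotional state and adapts the surroundings. The three-state model, the thresholds, and the preset names below are illustrative assumptions for this sketch only, not the authors' method or any specific device API.

```python
# A minimal sketch of the sensory/intellectual two-level idea.
# All thresholds and presets are hypothetical illustration values.

def classify_state(heart_rate_bpm: float, skin_conductance_us: float) -> str:
    """Intellectual level: map raw sensor readings to a coarse emotional state."""
    if heart_rate_bpm > 100 and skin_conductance_us > 10:
        return "stressed"
    if heart_rate_bpm < 70 and skin_conductance_us < 5:
        return "calm"
    return "neutral"

def adapt_environment(state: str) -> dict:
    """Adaptation step: choose room settings for the inferred state."""
    presets = {
        "stressed": {"lighting": "dim warm", "music": "slow tempo"},
        "calm":     {"lighting": "normal",   "music": "off"},
        "neutral":  {"lighting": "normal",   "music": "ambient"},
    }
    return presets[state]

# Sensory level stand-in: one reading from a hypothetical wrist device.
state = classify_state(heart_rate_bpm=112, skin_conductance_us=14.2)
print(state, adapt_environment(state))  # → stressed {'lighting': 'dim warm', 'music': 'slow tempo'}
```

In a real deployment the hard-coded reading would be replaced by a stream from wearable sensors, and the rule-based classifier by a trained model, but the two-level structure stays the same.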
Acknowledgements
This research has been supported by the University Grant Agency under contract No. VII/6/2018.
Copyright information
© 2019 Springer Nature Switzerland AG
Cite this paper
Francisti, J., Balogh, Z. (2019). An Overview of Solutions to the Issue of Exploring Emotions Using the Internet of Things. In: Ntalianis, K., Vachtsevanos, G., Borne, P., Croitoru, A. (eds) Applied Physics, System Science and Computers III. APSAC 2018. Lecture Notes in Electrical Engineering, vol 574. Springer, Cham. https://doi.org/10.1007/978-3-030-21507-1_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-21506-4
Online ISBN: 978-3-030-21507-1
eBook Packages: Physics and Astronomy (R0)