Emotional Domotics: Inhabitable Home Automation System for Emotion Modulation Through Facial Analysis

  • Sergio A. Navarro-Tuch
  • M. Rogelio Bustamante-Bello
  • Javier Izquierdo-Reyes
  • Roberto Avila-Vazquez
  • Ricardo Ramirez-Mendoza
  • Jose Luis Pablos-Hach
  • Yadira Gutierrez-Martinez
Conference paper
Part of the Studies in Computational Intelligence book series (SCI, volume 751)

Abstract

This research explores influencing a subject's mood, presenting an approach for analyzing subjects' emotional states as the light hue of their environment is varied. The experimental results revealed the time dynamics of the emotional response; these dynamics are important for the future design and implementation of control loops in home automation systems for emotion modulation. This document details the progress of research in emotional domotics, whose aim is to develop a control algorithm for a living space based on the user's emotional state. The project is centered on domotics (home automation) systems, that is, sets of elements installed, interconnected, and controlled by a computer system. After introducing the core of the investigation and giving a general overview, the document describes the experiment conducted with light hue variation and the analysis of the subjects' responses.
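The loop the abstract describes closes around the occupant: a facial-analysis stage (FACS-based in this line of work) estimates the emotional state, and the automation system adjusts the light hue in response, paced by the measured emotional response-time dynamics. As a minimal sketch only, the Python fragment below illustrates the shape of such a loop; the read_emotion_valence stub, the valence scale, the setpoint, the proportional gain, and the sampling period are all hypothetical placeholders, not the authors' implementation.

    import time
    import random  # stands in for a real facial-analysis backend

    def read_emotion_valence() -> float:
        """Hypothetical sensor stub. A real system would run FACS-based
        facial analysis on camera frames and map facial action units to a
        valence score; here we just return a random value in [-1, 1]."""
        return random.uniform(-1.0, 1.0)

    def emotion_modulation_loop(target_valence: float = 0.5,
                                gain: float = 20.0,
                                period_s: float = 5.0,
                                steps: int = 10) -> None:
        """Minimal proportional loop: nudge the light hue according to the
        error between the desired and the estimated emotional valence."""
        hue = 180.0  # current light hue, degrees on a 0-360 color wheel
        for _ in range(steps):
            valence = read_emotion_valence()
            error = target_valence - valence
            hue = (hue + gain * error) % 360.0  # proportional correction
            # A real system would command the lighting actuator here.
            print(f"valence={valence:+.2f}  error={error:+.2f}  hue={hue:6.1f} deg")
            time.sleep(period_s)  # pace the loop to the response dynamics

    if __name__ == "__main__":
        emotion_modulation_loop(steps=3, period_s=0.1)

The proportional form is only a stand-in; the point carried by the abstract is that the loop's sampling and actuation must be paced to the emotional response-time dynamics found experimentally.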

Keywords

Emotional domotics · Intelligent ambient · Facial analysis · Facial action coding system


Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • Sergio A. Navarro-Tuch (1)
  • M. Rogelio Bustamante-Bello (1)
  • Javier Izquierdo-Reyes (1)
  • Roberto Avila-Vazquez (1)
  • Ricardo Ramirez-Mendoza (1)
  • Jose Luis Pablos-Hach (1)
  • Yadira Gutierrez-Martinez (1)
  1. Escuela de Ingeniería y Ciencias, Tecnológico de Monterrey, Mexico City, Mexico
