Can Eye Movement Improve Prediction Performance on Human Emotions Toward Images Classification?

  • Kitsuchart Pasupa
  • Wisuwat Sunhem
  • Chu Kiong Loo
  • Yoshimitsu Kuroki
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10637)

Abstract

Image sentiment analysis has recently attracted many researchers because of the growing number of applications that need to understand images, e.g. image retrieval systems and social networks, and many studies try to improve classifier performance through a variety of approaches. This work aims to predict the emotional response of a person who is exposed to an image. The prediction model makes use of eye movement data captured while users look at the images, since an image can stimulate different emotions in different users depending on where and how their eyes move over it. Two image datasets were used, i.e. abstract images and images with context information, evaluated with leave-one-user-out and leave-one-image-out cross-validation. It was found that eye movement data is useful and improves prediction performance only under leave-one-image-out cross-validation.
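
The two evaluation protocols named in the abstract can be illustrated with a short, hypothetical Python sketch (not the authors' code): leave-one-user-out holds out all samples from one user per fold, while leave-one-image-out holds out all samples for one image. The feature matrix, emotion labels, and SVM classifier below are illustrative assumptions only.

import numpy as np
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n_users, n_images = 10, 20

# One row per (user, image) viewing: hypothetical image + eye-movement features.
X = rng.normal(size=(n_users * n_images, 16))
y = rng.integers(0, 2, size=n_users * n_images)      # emotion label (illustrative)
user_ids = np.repeat(np.arange(n_users), n_images)   # user that produced each row
image_ids = np.tile(np.arange(n_images), n_users)    # image each row refers to

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
logo = LeaveOneGroupOut()

# Leave-one-user-out: each fold tests on all samples from one unseen user.
acc_user = cross_val_score(clf, X, y, groups=user_ids, cv=logo).mean()
# Leave-one-image-out: each fold tests on all samples for one unseen image.
acc_image = cross_val_score(clf, X, y, groups=image_ids, cv=logo).mean()

print(f"leave-one-user-out  accuracy: {acc_user:.3f}")
print(f"leave-one-image-out accuracy: {acc_image:.3f}")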

Keywords

Emotion classification · Eye movement · Abstract image · Image with context information

Notes

Acknowledgments

This work was supported by the Faculty of Information Technology, King Mongkut’s Institute of Technology Ladkrabang under grant agreement number 2560-06-002.

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Kitsuchart Pasupa (1)
  • Wisuwat Sunhem (1)
  • Chu Kiong Loo (2)
  • Yoshimitsu Kuroki (3)
  1. Faculty of Information Technology, King Mongkut’s Institute of Technology Ladkrabang, Bangkok, Thailand
  2. Faculty of Computer Science and Information Technology, University of Malaya, Kuala Lumpur, Malaysia
  3. Department of Control and Information Systems Engineering, Kurume National College of Technology, Fukuoka, Japan