Abstract
Quantitatively evaluating the psychological and perceptual effects of objects is important but difficult. In cognitive studies, the psychological potential field (PPF), which represents psychological intensity in vision and can be computed by applying computational algorithms to digital images, may help address this problem. Although studies have used the PPF to evaluate psychological effects such as impressions, detailed investigations of how well the PPF represents psychological perception, and of its limitations, have not yet been performed. Another relevant tool is the fixation map, which visualizes human eye fixations; it is generated from actual eye-tracking measurements and does not directly represent psychological effects. Although both the PPF and the fixation map are based on visual imaging, they have never been compared. In this paper, we compare them for the first time, using the psychological and perceptual properties of line-drawing images. The results demonstrate the differences between the two methods, including that they represent different properties of visual perception. Moreover, their similarities suggest the possibility of assessing perceptual phenomena such as the categorization and cognition of objects based on human vision.
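Comparisons of this kind typically treat the PPF and the fixation map as two intensity images over the same stimulus and score their spatial agreement, for example with the Pearson correlation coefficient (CC), a standard saliency-evaluation metric. The following is a minimal sketch of that idea, not the authors' implementation; the function `compare_maps` and the toy maps are hypothetical illustrations.

```python
import numpy as np

def compare_maps(ppf: np.ndarray, fixation: np.ndarray) -> float:
    """Pearson correlation coefficient (CC) between two intensity maps.

    Each map is normalized to zero mean and unit variance first, so
    only the spatial patterns are compared, not their absolute scales.
    """
    a = (ppf - ppf.mean()) / ppf.std()
    b = (fixation - fixation.mean()) / fixation.std()
    return float((a * b).mean())

# Toy example: two noisy maps sharing a high-intensity region
# in the upper-left corner, standing in for a PPF and a fixation map.
rng = np.random.default_rng(0)
map_a = rng.random((64, 64))
map_a[:16, :16] += 2.0
map_b = rng.random((64, 64))
map_b[:16, :16] += 2.0

cc = compare_maps(map_a, map_b)  # high positive CC for similar maps
```

A CC near +1 indicates that the two maps emphasize the same image regions; a CC near 0 indicates unrelated spatial patterns.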
Acknowledgements
We would like to express our great appreciation to the experimental participants and the staff of Osaka University of Economics.
Author information
Naoyuki Awano received his Doctor of Information Science degree from Osaka Institute of Technology in 2012. He was a research associate at Osaka Institute of Technology from 2009 to 2013 and an assistant professor at Seikei University from 2013 to 2018. He has been a lecturer at Osaka University of Economics since 2018. His research interests include image processing, computer graphics, and perceptual information processing.
Yuki Hayashi received his Ph.D. degree in information science from Nagoya University in 2012. From 2009 to 2012, he was a recipient of a JSPS research fellowship for young scientists (DC1). From 2012 to 2014, he was an assistant professor at Seikei University. He is currently an associate professor in the College of Sustainable System Sciences and the Graduate School of Humanities and Sustainable System Sciences, Osaka Prefecture University. His research interests include computer-supported collaborative learning and human-computer interaction.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made.
The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Awano, N., Hayashi, Y. Psychological potential field and human eye fixation on binary line-drawing images: A comparative experimental study. Comp. Visual Media 6, 205–214 (2020). https://doi.org/10.1007/s41095-020-0169-5