Abstract
Vision Buy is a tool for enhancing the user experience of e-commerce websites. It enables consumers to purchase anything, at any time, from anywhere, using only their visual attention and eye movements. The analysis examines the characteristics and patterns of visual attention during the online shopping process. Product selection is driven by the principle of attention distribution, i.e., the percentage of viewing time visually allocated to each category of product available on the webpage. Gaze-point data are analyzed to determine these selections, which in turn allow users to navigate through the webpages. Vision Buy is therefore a promising addition to any e-commerce web application.
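The attention-distribution principle described above can be sketched in code. The following is an illustrative Python example, not the authors' implementation: gaze samples are mapped to product areas of interest (AOIs), dwell time per product category is accumulated, normalized to a percentage distribution, and the category with the largest share of attention is selected. All names, data structures, and the sampling interval are assumptions for illustration.

```python
# Illustrative sketch of dwell-time-based selection from gaze data.
# AOI boxes, the 60 Hz sampling interval, and all names are hypothetical.
from dataclasses import dataclass

@dataclass
class AOI:
    """A rectangular area of interest for one product category."""
    category: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def attention_distribution(gaze_points, aois, sample_interval_ms=16.7):
    """Return each category's percentage share of total on-AOI gaze time.

    gaze_points: iterable of (x, y) gaze samples at a fixed sampling rate.
    """
    dwell = {a.category: 0.0 for a in aois}
    for x, y in gaze_points:
        for aoi in aois:
            if aoi.contains(x, y):
                dwell[aoi.category] += sample_interval_ms
                break  # each sample counts toward at most one AOI
    total = sum(dwell.values())
    return {c: (t / total * 100.0 if total else 0.0) for c, t in dwell.items()}

def select_category(distribution):
    """Select the category that received the largest share of attention."""
    return max(distribution, key=distribution.get)
```

For example, with two AOIs and ten gaze samples of which six land on the first AOI, the distribution is 60%/40% and the first category is selected.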
Copyright information
© 2019 Springer Nature Singapore Pte Ltd.
Cite this paper
Lakshmi Pavani, M., Bhanu Prakash, A.V., Shwetha Koushik, M.S., Amudha, J., Jyotsna, C. (2019). Navigation Through Eye-Tracking for Human–Computer Interface. In: Satapathy, S., Joshi, A. (eds) Information and Communication Technology for Intelligent Systems. Smart Innovation, Systems and Technologies, vol 107. Springer, Singapore. https://doi.org/10.1007/978-981-13-1747-7_56
DOI: https://doi.org/10.1007/978-981-13-1747-7_56
Publisher Name: Springer, Singapore
Print ISBN: 978-981-13-1746-0
Online ISBN: 978-981-13-1747-7
eBook Packages: Intelligent Technologies and Robotics