
Detecting users’ usage intentions for websites employing deep learning on eye-tracking data

Abstract

We proposed a method that applies deep learning (DL) to eye-tracking data, and we used it to detect users' intentions to use apparel websites that differed in the depth, breadth, and location of their navigation. The results showed that users' intentions could be predicted by combining a deep neural network with metrics recorded by an eye-tracker, and that using all of the eye-tracking metric features yielded the best accuracy in classifying usage versus non-usage intention toward a website. The results also suggest that, for apparel websites of the same depth, designers can increase usage intention by offering a larger number of navigation items and placing the navigation at the top and left of the homepage. Building intelligent usage intention-detection systems is therefore feasible for the range of websites we examined, and is computationally practical. Hence, the study motivates future investigations that focus on the design of such systems.
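To make the classification setup concrete: the sketch below is a minimal, self-contained illustration (not the paper's actual model) of a one-hidden-layer neural network trained by gradient descent to map a vector of eye-tracking metrics to a binary usage/non-usage label. The feature set, architecture, and hyperparameters here are all illustrative assumptions, not those reported in the study.

```python
import math
import random

# Hypothetical feature vector per browsing trial -- stand-ins for eye-tracking
# metrics such as fixation count, mean fixation duration, saccade amplitude,
# and blink rate. The study's real feature set is not reproduced here.

def sigmoid(z: float) -> float:
    return 1.0 / (1.0 + math.exp(-z))

def train_intention_mlp(X, y, hidden=8, lr=0.5, epochs=500, seed=0):
    """Train a one-hidden-layer sigmoid MLP with per-sample gradient descent
    to classify usage (1) vs. non-usage (0) intention. Returns a predictor."""
    rng = random.Random(seed)
    n_in = len(X[0])
    W1 = [[rng.uniform(-0.5, 0.5) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    W2 = [rng.uniform(-0.5, 0.5) for _ in range(hidden)]
    b2 = 0.0
    for _ in range(epochs):
        for x, t in zip(X, y):
            # Forward pass: hidden activations, then output probability.
            h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
                 for row, b in zip(W1, b1)]
            o = sigmoid(sum(w * hi for w, hi in zip(W2, h)) + b2)
            # Backward pass: with cross-entropy loss and a sigmoid output,
            # the output-layer error signal simplifies to (o - t).
            d_o = o - t
            for j in range(hidden):
                d_h = d_o * W2[j] * h[j] * (1.0 - h[j])  # computed before W2 update
                W2[j] -= lr * d_o * h[j]
                for i in range(n_in):
                    W1[j][i] -= lr * d_h * x[i]
                b1[j] -= lr * d_h
            b2 -= lr * d_o

    def predict(x):
        h = [sigmoid(sum(w * xi for w, xi in zip(row, x)) + b)
             for row, b in zip(W1, b1)]
        return 1 if sigmoid(sum(w * hi for w, hi in zip(W2, h)) + b2) >= 0.5 else 0

    return predict
```

In practice a DL framework and a much larger labeled dataset would be used; this toy version only shows the shape of the pipeline the abstract describes: eye-tracking features in, a learned binary intention label out.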




Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant Numbers 71701003, 71801002, and 71802002), the Ministry of Education Industry-University Cooperation Collaborative Education Project (Grant No. 201901024006), and the University Natural Science Research Key Project of Anhui Province (Grant No. KJ2017A108). We thank all the participants who took part in the experiments.

Author information

Corresponding author

Correspondence to Yaqin Cao.



About this article


Cite this article

Cao, Y., Ding, Y., Proctor, R.W. et al. Detecting users’ usage intentions for websites employing deep learning on eye-tracking data. Inf Technol Manag 22, 281–292 (2021). https://doi.org/10.1007/s10799-021-00336-6


Keywords

  • Behavioral intention
  • Deep learning
  • Eye-tracking
  • Website