
High-order attentive graph neural network for session-based recommendation

Published in Applied Intelligence (2022)

Abstract

Recommender systems have become a crucial component of many websites. Session-based recommendation aims to predict the next item a user will click based on the user's interaction behavior within a session. Recent research on session-based recommendation focuses on using graph neural networks to model transition relationships between items. However, when the interactions capturing low-order relationships between adjacent items are insufficient, learning the high-order relationships between non-adjacent items becomes a challenge. In addition, to distinguish the importance of nodes in the graph, each edge should be assigned a different weight. We therefore propose a novel high-order attentive graph neural network (HA-GNN) model for session-based recommendation. In the proposed method, we first model sessions as graph-structured data. We then use a self-attention mechanism to capture the dependencies between items, and a soft-attention mechanism to learn high-order relationships in the graph. Finally, we update the item embeddings with a simple fully connected layer. Experiments on two public e-commerce datasets show that HA-GNN achieves excellent performance.
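The pipeline sketched in the abstract (self-attention over session items, soft-attention aggregation into a session representation, then a fully connected update) can be illustrated with a minimal numpy sketch. All dimensions, weight matrices, and the exact scoring function below are illustrative assumptions, not the paper's precise formulation:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, d_k):
    # Scaled dot-product self-attention over the session's item embeddings:
    # each item attends to every other item, capturing pairwise dependencies.
    scores = X @ X.T / np.sqrt(d_k)
    return softmax(scores, axis=-1) @ X

def soft_attention(H, W1, W2, q):
    # Soft attention: score each node against a query vector q (here, the
    # last item's representation), then aggregate nodes into one vector.
    alpha = softmax(np.tanh(H @ W1 + q @ W2) @ np.ones((W1.shape[1], 1)), axis=0)
    return (alpha * H).sum(axis=0)

rng = np.random.default_rng(0)
n_items, d = 5, 8                      # hypothetical: 5 session items, dim 8
X = rng.normal(size=(n_items, d))      # hypothetical item embeddings

H = self_attention(X, d_k=d)           # dependencies between items
W1 = rng.normal(size=(d, d))
W2 = rng.normal(size=(d, d))
s = soft_attention(H, W1, W2, H[-1])   # high-order session representation

# final update through a simple fully connected layer
W_fc = rng.normal(size=(d, d))
b = np.zeros(d)
session_emb = s @ W_fc + b
print(session_emb.shape)               # (8,)
```

In a trained model, `session_emb` would be scored against every candidate item embedding (e.g. by dot product followed by softmax) to rank the next click; here the weights are random, so only the shapes and data flow are meaningful.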


Figures 1–8 appear in the full-text article.


Notes

  1. http://2015.recsyschallenge.com/challenge.html

  2. http://cikm2016.cs.iupui.edu/cikm-cup


Acknowledgements

This work was supported by the Natural Science Foundation of Shandong Province (ZR2021MF099), the Jinan Science and Technology Project (201704065), the PhD Funding Project of Shandong Jianzhu University (No. X19044Z), the Science and Technology Program Project of Shandong Colleges and Universities (No. J17KA070), the Undergraduate Education Reform Project of Shandong Province (M2021130), and the National Natural Science Foundation of China (61902221, 62177031).

Author information


Corresponding author

Correspondence to Weihua Yuan.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Sang, S., Liu, N., Li, W. et al. High-order attentive graph neural network for session-based recommendation. Appl Intell 52, 16975–16989 (2022). https://doi.org/10.1007/s10489-022-03170-7

