Abstract
Many problems in computer vision and machine learning can be cast as learning on hypergraphs that represent higher-order relations. Recent approaches to hypergraph learning extend graph neural networks based on message passing, which is simple yet fundamentally limited in expressive power and in modeling long-range dependencies. On the other hand, tensor-based equivariant neural networks enjoy maximal expressiveness, but their application to hypergraphs has been limited by heavy computation and strict assumptions on fixed-order hyperedges. We resolve these problems and present the Equivariant Hypergraph Neural Network (EHNN), the first attempt to realize maximally expressive equivariant layers for general hypergraph learning. We also present two practical realizations of our framework based on hypernetworks (EHNN-MLP) and self-attention (EHNN-Transformer), which are easy to implement and theoretically more expressive than most message passing approaches. We demonstrate their capability on a range of hypergraph learning problems, including synthetic k-edge identification, semi-supervised classification, and visual keypoint matching, and report improved performance over strong message passing baselines. Our implementation is available at https://github.com/jw9730/ehnn.
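To make the hypernetwork-based realization concrete, the sketch below shows one way a single layer can serve hyperedges of arbitrary order: a small MLP hypernetwork generates the layer's weight matrix conditioned on each hyperedge's order, so no fixed-order weight tensor is needed. This is a minimal illustration under our own assumptions about module names and shapes, not the authors' implementation; see the linked repository for the actual EHNN-MLP code.

```python
import torch
import torch.nn as nn

class OrderConditionedLinear(nn.Module):
    """Illustrative order-conditioned layer in the spirit of EHNN-MLP.

    A small MLP hypernetwork maps a hyperedge's order (its number of
    nodes) to the weight matrix applied to that hyperedge's features,
    so one layer handles hyperedges of any size. Names and shapes are
    our assumptions, not the authors' implementation.
    """

    def __init__(self, dim_in: int, dim_out: int, hyper_dim: int = 64):
        super().__init__()
        self.dim_in, self.dim_out = dim_in, dim_out
        self.hypernet = nn.Sequential(      # order -> flattened weight matrix
            nn.Linear(1, hyper_dim),
            nn.ReLU(),
            nn.Linear(hyper_dim, dim_in * dim_out),
        )

    def forward(self, x: torch.Tensor, order: torch.Tensor) -> torch.Tensor:
        # x: (E, dim_in) hyperedge features; order: (E,) hyperedge sizes.
        w = self.hypernet(order.float().unsqueeze(-1))   # (E, dim_in * dim_out)
        w = w.view(-1, self.dim_in, self.dim_out)        # (E, dim_in, dim_out)
        return torch.einsum('ei,eio->eo', x, w)          # per-edge linear map

# Five hyperedges of mixed orders, transformed by one shared layer:
layer = OrderConditionedLinear(dim_in=16, dim_out=32)
out = layer(torch.randn(5, 16), torch.tensor([2, 3, 3, 4, 2]))  # -> (5, 32)
```

Consistent with Note 2 below, each layer in such a design would hold its own hypernetwork rather than sharing one across layers.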
Notes
1. Higher-order tensors can in principle represent directed hypergraphs as well; we constrain them to be symmetric to specifically represent undirected hypergraphs (see the sketch after these notes).
2. Note that we do not share the hypernetworks across layers at different depths.
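To make Note 1 concrete, the snippet below (our illustration, not code from the paper) encodes a single undirected hyperedge as a symmetric higher-order tensor: every permutation of the node indices is set, so the encoding is invariant to node ordering, whereas a directed hypergraph would populate only one ordering.

```python
import itertools
import torch

def symmetric_hyperedge_tensor(num_nodes: int, hyperedge: tuple) -> torch.Tensor:
    """Encode one undirected k-node hyperedge as a symmetric order-k tensor.
    (Illustrative only; dense order-k tensors are impractical at scale.)"""
    k = len(hyperedge)
    A = torch.zeros((num_nodes,) * k)
    for perm in itertools.permutations(hyperedge):
        A[perm] = 1.0  # all orderings set -> symmetric -> undirected
    return A

# The hyperedge {0, 1, 3} in a 4-node hypergraph, as an order-3 tensor:
A = symmetric_hyperedge_tensor(4, (0, 1, 3))
assert torch.equal(A, A.permute(1, 0, 2))  # invariant to swapping the first two axes
```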
Acknowledgement
This work was supported by the Institute of Information & Communications Technology Planning & Evaluation (IITP) (Nos. 2021-0-00537, 2019-0-00075, and 2021-0-02068), the National Research Foundation of Korea (NRF) (Nos. 2021R1C1C1012540 and 2021R1A4A3032834), and the Korea Meteorological Administration Research and Development Program "Development of AI Techniques for Weather Forecasting" under Grant KMA2021-00121.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Kim, J., Oh, S., Cho, S., Hong, S. (2022). Equivariant Hypergraph Neural Networks. In: Avidan, S., Brostow, G., Cissé, M., Farinella, G.M., Hassner, T. (eds) Computer Vision – ECCV 2022. ECCV 2022. Lecture Notes in Computer Science, vol 13681. Springer, Cham. https://doi.org/10.1007/978-3-031-19803-8_6
DOI: https://doi.org/10.1007/978-3-031-19803-8_6
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-19802-1
Online ISBN: 978-3-031-19803-8