Abstract
This chapter introduces the DeepHypergraph library, which bridges hypergraph theory and hypergraph applications. The library provides the generation of multiple low-order structures (such as graphs and directed graphs) and high-order structures (such as hypergraphs and directed hypergraphs), as well as datasets, operations, learning methods, visualizations, etc. We first introduce the design motivation and the overall architecture of the library. Then, we introduce the “correlation structures” and the “function library” of the DeepHypergraph library, respectively.
12.1 Introduction
We have designed DeepHypergraph (DHG), a deep learning library built upon PyTorch for hypergraph computation. It is a general framework that supports both low-order and high-order message passing, such as from vertex to vertex, from vertex in one domain to vertex in another domain, from vertex to hyperedge, from hyperedge to vertex, and from vertex set to vertex set. It supports the generation of a wide variety of structures, including low-order structures (graph, directed graph, bipartite graph, etc.) and high-order structures (hypergraph, etc.). Various spectral-based operations (such as Laplacian-based smoothing) and spatial-based operations (such as message passing from domain to domain) are integrated into the different structures. It also provides multiple common metrics for performance evaluation on different tasks. A group of state-of-the-art models has also been implemented and can be easily used for research. We also provide several visualization tools for demonstrating both low-order and high-order structures. In addition, the dhg.experiments module (which implements Auto-ML upon Optuna) can automatically tune the hyperparameters of a model during training and return the model with the best performance. In this chapter, we first introduce the correlation structures in DHG and then introduce the function library in DHG.
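To make the high-order message passing concrete, the following is a minimal numpy sketch of the vertex→hyperedge and hyperedge→vertex steps described above, written against a plain incidence matrix. This is our own illustration of the mechanism, not DHG's API.

```python
import numpy as np

# A toy hypergraph with 4 vertices and 2 hyperedges, encoded as a
# vertex-hyperedge incidence matrix H (|V| x |E|):
# hyperedge e0 = {v0, v1, v2}, hyperedge e1 = {v2, v3}.
H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)

X = np.array([[1.0], [2.0], [3.0], [4.0]])  # one feature per vertex

# Vertex -> hyperedge: each hyperedge averages the features of its vertices.
edge_deg = H.sum(axis=0)                    # vertices per hyperedge
Y = (H.T @ X) / edge_deg[:, None]           # (|E| x d) hyperedge features

# Hyperedge -> vertex: each vertex averages the features of its hyperedges.
vertex_deg = H.sum(axis=1)
X_new = (H @ Y) / vertex_deg[:, None]       # (|V| x d) updated vertex features

print(Y.ravel())      # hyperedge features: [2.0, 3.5]
print(X_new.ravel())  # updated vertex features: [2.0, 2.0, 2.75, 3.5]
```

Chaining the two steps yields the vertex-set-to-vertex-set passing mentioned above; swapping the mean for a sum or max gives other aggregation functions.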
12.2 The Correlation Structures in DHG
The core motivation behind the design of the DHG library is to attach the spectral-based and spatial-based operations to each specified structure. Once a structure has been created, the related Laplacian matrices and message passing operations with different aggregation functions can be called and combined to manipulate different input features. Figure 12.1 illustrates the architecture of the “correlation structure” in DHG. Currently, the implemented correlation structures in DHG include the graph, directed graph, bipartite graph, and hypergraph. For each correlation structure, DHG provides the corresponding basic operations, such as construction and structure modification functions, related structure transformation functions, and learning functions.
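As one example of the structure transformation functions mentioned above, a hypergraph can be reduced to an ordinary graph by clique expansion. The sketch below is our own plain-Python rendering of the idea (the function name and representation are ours, not DHG's): a hypergraph is taken as a vertex count plus a list of hyperedges, and each hyperedge becomes a clique.

```python
from itertools import combinations

def clique_expansion(num_v, hyperedges):
    """Expand each hyperedge into a clique; return the resulting graph
    as (vertex count, sorted list of undirected edges)."""
    edges = set()
    for e in hyperedges:
        for u, v in combinations(sorted(e), 2):
            edges.add((u, v))
    return num_v, sorted(edges)

num_v, edges = clique_expansion(4, [{0, 1, 2}, {2, 3}])
print(edges)  # [(0, 1), (0, 2), (1, 2), (2, 3)]
```

The reverse direction (building a hypergraph from a graph, e.g. from vertex neighborhoods) is another common transformation between low-order and high-order structures.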
Most computation processes on these correlation structures (graph, hypergraph, etc.) can be divided into two categories: spectral-based convolution and spatial-based message passing. Spectral-based convolution methods, such as the typical GCN [1] and HGNN [2], derive a Laplacian matrix from a given structure and perform vertex feature smoothing with the generated Laplacian matrix to embed low-order and high-order structures into vertex features. Spatial-based message passing methods, such as the typical GraphSAGE [3], GAT [4], and HGNN+ [5], perform vertex-to-vertex, vertex-to-hyperedge, hyperedge-to-vertex, and vertex-set-to-vertex-set message passing to embed the low-order and high-order structures into vertex features. The learned vertex features can also be pooled to generate a unified structure feature. Finally, the learned vertex features or structure features can be fed into many downstream tasks, such as classification, retrieval, regression, and link prediction, and applications including paper classification, movie recommendation, drug discovery, etc.
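The spectral-based branch can be illustrated with the hypergraph smoothing operator of HGNN [2], Θ = D_v^{-1/2} H W D_e^{-1} H^T D_v^{-1/2}, where H is the incidence matrix, W the hyperedge weights, and D_v, D_e the vertex and hyperedge degree matrices. The following is our own numpy rendering of that formula on a toy hypergraph, not DHG's implementation:

```python
import numpy as np

H = np.array([[1, 0],
              [1, 0],
              [1, 1],
              [0, 1]], dtype=float)   # incidence matrix: 4 vertices, 2 hyperedges
W = np.eye(2)                          # hyperedge weight matrix (unit weights here)

dv = H @ np.diag(W)                    # weighted vertex degrees
de = H.sum(axis=0)                     # hyperedge degrees
Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
De_inv = np.diag(1.0 / de)

# The HGNN smoothing operator: Theta = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2}
Theta = Dv_inv_sqrt @ H @ W @ De_inv @ H.T @ Dv_inv_sqrt

X = np.array([[1.0], [2.0], [3.0], [4.0]])   # vertex features
X_smooth = Theta @ X                          # one Laplacian-based smoothing step
```

Note that Θ is symmetric and leaves the degree-scaled constant vector D_v^{1/2}·1 fixed, which is what makes repeated smoothing pull connected vertices' features together.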
12.3 The Function Library in DHG
To spare users the complex and repetitive code of learning on correlation structures, DHG further provides a function library. As shown in Fig. 12.2, the function library includes five parts: the data module, metric module, visualization module, auto-ML module, and structure generator module.
In the data module, DHG integrates more than 20 public graph/bipartite graph/hypergraph datasets and some commonly used preprocessing functions, such as file loading and normalization. By default, DHG automatically downloads the integrated datasets and checks the integrity of the downloaded files. You can also manually construct your own DHG-style dataset with the existing Datapipe functions in DHG.
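As a flavor of the normalization step such datapipes perform, here is a hypothetical numpy sketch (our own, not DHG's API) that L1-normalizes each vertex feature row, a common preprocessing step before smoothing:

```python
import numpy as np

def normalize_features(X):
    """L1-normalize each feature row; all-zero rows are left unchanged."""
    row_sum = X.sum(axis=1, keepdims=True)
    row_sum[row_sum == 0] = 1.0   # avoid division by zero for empty rows
    return X / row_sum

X = np.array([[1.0, 3.0],
              [0.0, 0.0],
              [2.0, 2.0]])
print(normalize_features(X))  # rows sum to 1 (or stay all-zero)
```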
In the metric module, DHG provides many widely used metrics, such as accuracy, recall, and mAP, for different tasks. Encapsulated evaluators for different tasks, such as classification, retrieval, and recommendation, have also been implemented. In addition, DHG provides structure and feature visualization functions, an automatic hyperparameter search function, and random structure generation functions for different applications.
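For reference, two of the metrics named above can be sketched in a few lines of plain Python (a conceptual illustration, not DHG's evaluator API): accuracy for classification and recall for retrieval.

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the ground-truth labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def recall(relevant, retrieved):
    """Fraction of relevant items that appear in the retrieved set."""
    relevant, retrieved = set(relevant), set(retrieved)
    return len(relevant & retrieved) / len(relevant)

print(accuracy([0, 1, 1, 0], [0, 1, 0, 0]))  # 0.75
print(recall([1, 2, 3, 4], [2, 4, 5]))       # 0.5
```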
12.4 Summary
In this chapter, we have introduced the DHG library for hypergraph computation. It simultaneously supports the generation of, and learning on, low-order and high-order structures. In addition, many commonly used functions have been integrated into the library.
References
T.N. Kipf, M. Welling, Semi-supervised classification with graph convolutional networks, in Proceedings of the International Conference on Learning Representations (2016)
Y. Feng, H. You, Z. Zhang, R. Ji, Y. Gao, Hypergraph neural networks, in Proceedings of the AAAI Conference on Artificial Intelligence (2019), pp. 3558–3565
W. Hamilton, Z. Ying, J. Leskovec, Inductive representation learning on large graphs, in Proceedings of the Advances in Neural Information Processing Systems, vol. 30 (2017)
P. Veličković, G. Cucurull, A. Casanova, A. Romero, P. Liò, Y. Bengio, Graph attention networks, in Proceedings of the International Conference on Learning Representations (2018)
Y. Gao, Y. Feng, S. Ji, R. Ji, HGNN+: General hypergraph neural networks. IEEE Trans. Pattern Anal. Mach. Intell. 45(3), 3181–3199 (2023)
Open Access This chapter is licensed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license and indicate if changes were made.
The images or other third party material in this chapter are included in the chapter's Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the chapter's Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder.
© 2023 The Author(s)
Dai, Q., Gao, Y. (2023). The DeepHypergraph Library. In: Hypergraph Computation. Artificial Intelligence: Foundations, Theory, and Algorithms. Springer, Singapore. https://doi.org/10.1007/978-981-99-0185-2_12