12.1 Introduction

We have designed DeepHypergraph (DHG), a deep learning library built upon PyTorch for hypergraph computation. It is a general framework that supports both low-order and high-order message passing, such as vertex to vertex, vertex in one domain to vertex in another domain, vertex to hyperedge, hyperedge to vertex, and vertex set to vertex set. It supports the generation of a wide variety of structures, including low-order structures (graph, directed graph, bipartite graph, etc.) and high-order structures (hypergraph, etc.). Various spectral-based operations (such as Laplacian-based smoothing) and spatial-based operations (such as message passing from domain to domain) are integrated into the different structures. DHG also provides common metrics for performance evaluation on different tasks, and a group of state-of-the-art models has been implemented and can easily be used for research. We also provide several visualization tools for both low-order and high-order structures. In addition, the dhg.experiments module (which implements Auto-ML upon Optuna) can automatically tune the hyperparameters of models during training and return the model with the best performance. In this chapter, we first introduce the correlation structures in DHG and then the function library.

12.2 The Correlation Structures in DHG

The core motivation behind the design of the DHG library is to attach the spectral-based and spatial-based operations to each specified structure. Once a structure has been created, the related Laplacian matrices and message passing operations with different aggregation functions can be called and combined to manipulate different input features. Figure 12.1 illustrates the architecture of the “correlation structure” in DHG. Currently, the implemented correlation structures in DHG include graph, directed graph, bipartite graph, and hypergraph. For each correlation structure, DHG provides the corresponding basic operations, such as construction and structure modification functions, related structure transformation functions, and learning functions.
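As a conceptual illustration of what such a structure holds, the following plain-NumPy sketch (not the DHG API; all names here are illustrative) describes a hypergraph by a vertex count and a hyperedge list, builds its incidence matrix, and performs one simple structure transformation, the clique expansion to a graph:

```python
import numpy as np

# Conceptual sketch: a hypergraph is described by a vertex count
# and a list of hyperedges, each a set of vertices.
num_v = 5
e_list = [(0, 1, 2), (2, 3), (1, 3, 4)]

# Incidence matrix H: H[v, e] = 1 iff vertex v belongs to hyperedge e.
H = np.zeros((num_v, len(e_list)))
for e_idx, e in enumerate(e_list):
    for v in e:
        H[v, e_idx] = 1.0

# A simple structure transformation: the clique expansion connects every
# pair of vertices that share at least one hyperedge, yielding a graph.
A = (H @ H.T > 0).astype(float)
np.fill_diagonal(A, 0.0)
```

Transformations like this one are what allow low-order methods to be applied to high-order data, at the cost of losing the identity of each hyperedge.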

Fig. 12.1 The architecture of the “correlation structures” in DHG

Most computation processes on these correlation structures (graph, hypergraph, etc.) can be divided into two categories: spectral-based convolution and spatial-based message passing. Spectral-based convolution methods, such as the typical GCN [1] and HGNN [2], learn a Laplacian matrix for a given structure and perform vertex feature smoothing with the generated Laplacian matrix to embed low-order and high-order structures into vertex features. Spatial-based message passing methods, such as the typical GraphSAGE [3], GAT [4], and HGNN+ [5], perform vertex-to-vertex, vertex-to-hyperedge, hyperedge-to-vertex, and vertex-set-to-vertex-set message passing to embed low-order and high-order structures into vertex features. The learned vertex features can also be pooled to generate a unified structure feature. Finally, the learned vertex or structure features can be fed into many downstream tasks, such as classification, retrieval, regression, and link prediction, and applications including paper classification, movie recommendation, drug discovery, etc.
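The two categories can be contrasted with a minimal NumPy sketch (illustrative only, not DHG's own classes): the spectral branch smooths vertex features with an HGNN-style Laplacian (edge weights taken as 1 here), while the spatial branch performs two-stage vertex-to-hyperedge and hyperedge-to-vertex message passing with mean aggregation.

```python
import numpy as np

num_v, num_e = 4, 2
H = np.array([[1, 0],   # incidence: vertex 0 in hyperedge 0
              [1, 1],   # vertex 1 in both hyperedges
              [1, 0],
              [0, 1]], dtype=float)
X = np.random.rand(num_v, 8)          # vertex features

# Spectral-based: smooth vertex features with the HGNN-style operator
# Dv^{-1/2} H De^{-1} H^T Dv^{-1/2}, with all hyperedge weights set to 1.
Dv = np.diag(H.sum(1) ** -0.5)        # vertex degrees
De = np.diag(H.sum(0) ** -1.0)        # hyperedge degrees
X_spec = Dv @ H @ De @ H.T @ Dv @ X

# Spatial-based: two-stage message passing with mean aggregation,
# vertex -> hyperedge (v2e) then hyperedge -> vertex (e2v).
Y = (H.T @ X) / H.sum(0)[:, None]     # v2e: average features per hyperedge
X_spat = (H @ Y) / H.sum(1)[:, None]  # e2v: average messages per vertex
```

Both branches output a new feature matrix of the same shape as the input, so they can be stacked into multi-layer networks in the usual way.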

12.3 The Function Library in DHG

To reduce the complex and repetitive code involved in learning on correlation structures, DHG further provides a function library. As shown in Fig. 12.2, the function library includes five parts: the data module, the metric module, the visualization module, the auto-ML module, and the structure generators module.

Fig. 12.2 The architecture of the “function library” in DHG

In the data module, DHG integrates more than 20 public graph/bipartite graph/hypergraph datasets and some commonly used pre-processing functions such as file loading and normalization. By default, DHG automatically downloads the integrated datasets and checks the integrity of the downloaded files. You can also manually construct your own DHG-style dataset with the existing Datapipe functions in DHG.
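As a sketch of how such composable pre-processing can look, the following plain-Python snippet chains a normalization step in the spirit of the Datapipe functions; the names `normalize_rows` and `compose` are illustrative and not part of DHG's API.

```python
def normalize_rows(features):
    """L1-normalize each feature row; leave all-zero rows unchanged."""
    out = []
    for row in features:
        s = sum(row)
        out.append([x / s for x in row] if s else list(row))
    return out

def compose(*steps):
    """Chain pre-processing steps left to right into one callable."""
    def pipeline(data):
        for step in steps:
            data = step(data)
        return data
    return pipeline

preprocess = compose(normalize_rows)
feats = preprocess([[2.0, 2.0], [0.0, 0.0]])
```

Keeping each step a plain function makes a dataset definition declarative: the dataset lists which steps apply to which field, and the loader runs them on demand.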

In the metric module, DHG provides many widely used metrics, such as accuracy, recall, and mAP, for different tasks. Encapsulated evaluators for tasks such as classification, retrieval, and recommendation have also been implemented. In addition, DHG provides structure and feature visualization functions, an automatic hyperparameter search function, and random structure generation functions for different applications.
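For illustration, two of these metrics can be sketched in a few lines of plain Python (DHG's own metric module provides vectorized versions of these and more):

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the ground-truth labels."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def recall(y_true, y_pred, positive=1):
    """Fraction of true positives that the predictions recover."""
    tp = sum(t == positive and p == positive
             for t, p in zip(y_true, y_pred))
    pos = sum(t == positive for t in y_true)
    return tp / pos if pos else 0.0

acc = accuracy([1, 0, 1, 1], [1, 0, 0, 1])  # 3 of 4 correct -> 0.75
rec = recall([1, 0, 1, 1], [1, 0, 0, 1])    # 2 of 3 positives -> 2/3
```

An evaluator then simply bundles the metrics relevant to one task so that a single call scores a model's predictions consistently across experiments.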

12.4 Summary

In this chapter, we introduce the DHG library for hypergraph computation. It simultaneously supports the generation of, and learning on, both low-order and high-order structures. In addition, many commonly used functions have been integrated into the library.