Reference Work Entry

Encyclopedia of Parallel Computing

pp 1479-1487

PaToH (Partitioning Tool for Hypergraphs)

  • Ümit Çatalyürek, Department of Biomedical Informatics, The Ohio State University
  • Cevdet Aykanat, Computer Engineering Department, Bilkent University

Synonyms

Partitioning tool for hypergraphs (PaToH)

Definition

PaToH is a sequential, multilevel hypergraph partitioning tool that can be used to solve various combinatorial scientific computing problems that can be modeled as hypergraph partitioning problems, including sparse matrix partitioning, ordering, and load balancing for parallel processing.

Discussion

Introduction

Hypergraph partitioning has long been an important problem widely encountered in VLSI layout design [22]. Works since the late 1990s have introduced new application areas, including one-dimensional and two-dimensional partitioning of sparse matrices for parallel sparse matrix–vector multiplication [6, 8, 12], sparse matrix reordering [6, 11], permuting sparse rectangular matrices into singly bordered block-diagonal form for the parallel solution of LP problems [3], and static and dynamic load balancing for parallel processing [5]. PaToH [9] has been developed to provide fast and high-quality solutions for these motivating applications.

In simple terms, the hypergraph partitioning problem can be defined as the task of dividing a hypergraph into two or more roughly equal-sized parts such that a cost function on the hyperedges connecting vertices in different parts is minimized. The hypergraph partitioning problem is known to be NP-hard [22]; therefore, a wide variety of heuristic algorithms have been developed in the literature to solve this complex problem [1, 15, 21, 23, 25]. Following the success of multilevel partitioning schemes in ordering and graph partitioning [4, 16, 18], PaToH [9] was developed as one of the first multilevel hypergraph partitioning tools.

Preliminaries

A hypergraph \(\mathcal{H}\! =\! (\mathcal{V},\mathcal{N})\) is defined as a set of vertices (also called cells) \(\mathcal{V}\) and a set of nets (hyperedges) \(\mathcal{N}\) among those vertices. Every net \(n \in \mathcal{N}\) is a subset of vertices, that is, \(n\! \subseteq \!\mathcal{V}\). The vertices in a net n are called its pins in PaToH. The size of a net, s[n], is equal to the number of its pins. The degree of a vertex is equal to the number of nets it is connected to. A graph is a special instance of a hypergraph in which each net has exactly two pins. Vertices and nets of a hypergraph can be associated with weights. For simplicity of presentation, net weights are referred to as costs here and denoted with c[⋅], whereas w[⋅] will be used for vertex weights.

\(\Pi \! =\!\{ {\mathcal{V}}_{1},{\mathcal{V}}_{2},\ldots, {\mathcal{V}}_{K}\}\) is a K-way partition of \(\mathcal{H}\) if the following conditions hold:

  • Each part \({\mathcal{V}}_{k}\) is a nonempty subset of \(\mathcal{V}\), that is, \({\mathcal{V}}_{k}\,\subseteq \,\mathcal{V}\) and \({\mathcal{V}}_{k}\neq \varnothing \) for \(1 \leq k \leq K\).

  • Parts are pairwise disjoint, that is, \({\mathcal{V}}_{k} \cap {\mathcal{V}}_{\mathcal{l}} = \varnothing \) for all \(1 \leq k < \mathcal{l} \leq K\).

  • Union of K parts is equal to \(\mathcal{V}\), that is, \({\bigcup \nolimits }_{k=1}^{K}{\mathcal{V}}_{k}\! =\! \mathcal{V}\).

In a partition Π of \(\mathcal{H}\), a net that has at least one pin (vertex) in a part is said to connect that part. Connectivity \({\lambda }_{n}\) of a net n denotes the number of parts connected by n. A net n is said to be cut (external) if it connects more than one part (i.e., \({\lambda }_{n}> 1\) ), and uncut (internal) otherwise (i.e., \({\lambda }_{n} = 1\) ). In a partition Π of \(\mathcal{H}\), a vertex is said to be a boundary vertex if it is incident to a cut net. A K-way partition is also called a multiway partition if K > 2 and a bipartition if K = 2. A partition is said to be balanced if each part \({\mathcal{V}}_{k}\) satisfies the balance criterion:
$${W}_{k} \leq {W}_{avg}(1 + \epsilon )\quad \mbox{for } k = 1,2,\ldots, K.$$
(1)
In (1), the weight \({W}_{k}\) of a part \({\mathcal{V}}_{k}\) is defined as the sum of the weights of the vertices in that part (i.e., \({W}_{k}\! =\!{ \sum \nolimits }_{v\in {\mathcal{V}}_{k}}w[v]\) ), \({W}_{avg}\) denotes the weight of each part under the perfect load balance condition (i.e., \({W}_{avg}\! =\! ({\sum \nolimits }_{v\in \mathcal{V}}w[v])/K\) ), and ε represents the predetermined maximum imbalance ratio allowed.
The set of external nets of a partition Π is denoted as \({\mathcal{N}}_{E}\). There are various cutsize definitions [13, 27] for representing the cost \(\chi (\Pi )\) of a partition Π. Two relevant definitions are:
$$\begin{array}{rcl} \mbox{(a)}\ \ \ \ \chi (\Pi )& =& {\sum \limits _{n\in {\mathcal{N}}_{E}}}c[n]\quad \mbox{and} \\ \mbox{(b)}\ \ \ \ \chi (\Pi )& =& {\sum \limits _{n\in {\mathcal{N}}_{E}}}c[n]({\lambda }_{n} - 1). \\ \end{array}$$
(2)

In (2a), the cutsize is equal to the sum of the costs of the cut nets. In (2b), each cut net n contributes \(c[n]({\lambda }_{n} - 1)\) to the cutsize. The cutsize metrics given in (2a) and (2b) will be referred to here as cut-net and connectivity metrics, respectively. The hypergraph partitioning problem can be defined as the task of dividing a hypergraph into two or more parts such that the cutsize is minimized, while a given balance criterion (1) among part weights is maintained.

A recent variant of the above problem is multi-constraint hypergraph partitioning [2, 6, 10, 19, 24], in which each vertex has a vector of weights associated with it. The partitioning objective is the same as above, and the partitioning constraint is to satisfy a balancing constraint associated with each weight. Let w[v, i] denote the ith weight of a vertex v, for \(i = 1,\ldots, C\). Then balance criterion (1) can be rewritten as:
$${W}_{k, i} \leq {W}_{\mathit{avg}, i}\,(1 + \epsilon )\;\mbox{ for}\ k = 1,\ldots, K\mbox{ and }\ i = 1,\ldots, C\;,$$
(3)
where the ith weight \({W}_{k, i}\) of a part \({\mathcal{V}}_{k}\) is defined as the sum of the ith weights of the vertices in that part (i.e., \({W}_{k, i}\! =\!{ \sum \nolimits }_{v\in {\mathcal{V}}_{k}}w[v, i]\) ), \({W}_{\mathit{avg}, i}\) is the average part weight for the ith weight (i.e., \({W}_{\mathit{avg}, i}\! =\! ({\sum \nolimits }_{v\in \mathcal{V}}w[v, i])/K\) ), and ε again represents the allowed imbalance ratio.

Another variant is the hypergraph partitioning with fixed vertices, in which some of the vertices are fixed in some parts before partitioning. In other words, in this problem, a fixed-part function is provided as an input to the problem. A vertex is said to be free if it is allowed to be in any part in the final partition, and it is said to be fixed in part k if it is required to be in \({\mathcal{V}}_{k}\) in the final partition Π.

Using PaToH

PaToH provides a set of functions to read, write, and partition a given hypergraph, and to evaluate the quality of a given partition. In terms of partitioning, PaToH provides user-customizable hypergraph partitioning via a multilevel partitioning scheme. In addition, PaToH provides hypergraph partitioning with fixed cells and multi-constraint hypergraph partitioning.

Application developers who would like to use PaToH can either use it directly through a simple, easy-to-use C library interface in their applications, or they can use the stand-alone executable.

PaToH Library Interface

The PaToH library interface consists of two files: a header file, patoh.h, which contains constants, structure definitions, and function prototypes, and a library file, libpatoh.a.

Before discussing the details, it is instructive to look at a simple C program that partitions an input hypergraph using PaToH functions. The program is displayed in Fig. 1. The first statement is a function call to read the input hypergraph file, which is given by the first command-line argument.

PaToH partitioning functions are customizable through a set of parameters. Although the application user can set each of these parameters one by one, it is good practice to call PaToH_Initialize_Parameters to set all parameters to one of three presets by specifying PATOH_SUGPARAM_<preset>, where <preset> is DEFAULT, SPEED, or QUALITY. After this call, the user may modify the parameters as needed before calling PaToH_Alloc. All memory that will be used by the PaToH partitioning functions is allocated by PaToH_Alloc; that is, there will be no further dynamic memory allocation inside the partitioning functions.

Now everything is set to partition the hypergraph using PaToH's multilevel hypergraph partitioning functions. A call to PaToH_Partition (or PaToH_MultiConst_Partition) will partition the hypergraph, and the resulting partition vector, part weights, and cutsize will be returned in the parameters. Here, the variable cut will hold the cutsize of the computed partition according to cutsize definition (2b), since this metric is specified by initializing the parameters with the constant PATOH_CONPART. The user may call the partitioning functions as many times as desired before calling PaToH_Free. There is no need to reallocate memory before each partitioning call, unless the hypergraph or the desired customization (such as the coarsening algorithm or the number of parts) changes.
PaToH (Partitioning Tool for Hypergraphs). Fig. 1

A simple C program that partitions an input hypergraph using PaToH functions
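Where the figure is not reproduced, the flow it depicts can be sketched as follows. This is an outline only: the function names are those cited in the text, but the argument lists are abbreviated from the PaToH manual [9] and may differ between versions; it must be compiled against patoh.h and libpatoh.a.

```c
/* Sketch of the flow in Fig. 1; not a drop-in program. */
#include <stdio.h>
#include <stdlib.h>
#include "patoh.h"

int main(int argc, char *argv[])
{
    PaToH_Parameters args;
    int _c, _n, _nconst, *cwghts, *nwghts, *xpins, *pins;
    int *partvec, *partweights, cut;

    /* Read the hypergraph named on the command line. */
    PaToH_Read_Hypergraph(argv[1], &_c, &_n, &_nconst,
                          &cwghts, &nwghts, &xpins, &pins);

    /* Set all parameters to a preset; PATOH_CONPART selects the
       connectivity cutsize metric (2b). */
    PaToH_Initialize_Parameters(&args, PATOH_CONPART,
                                PATOH_SUGPARAM_DEFAULT);
    args._k = atoi(argv[2]);          /* number of parts */

    partvec     = (int *) malloc(_c * sizeof(int));
    partweights = (int *) malloc(args._k * sizeof(int));

    /* One-time allocation; no dynamic allocation inside partitioning. */
    PaToH_Alloc(&args, _c, _n, _nconst, cwghts, nwghts, xpins, pins);

    PaToH_Partition(&args, _c, _n, cwghts, nwghts, xpins, pins,
                    partvec, partweights, &cut);
    printf("cutsize: %d\n", cut);     /* cutsize per definition (2b) */

    PaToH_Free();
    free(partvec); free(partweights);
    return 0;
}
```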

A hypergraph and its representation can be seen in Fig. 2. In the figure, large circles are cells (vertices) of the hypergraph, and small circles are nets. The xpins and pins arrays store, respectively, the beginning index of the pins (cells) connected to each net, and the IDs of those pins. Hence, xpins is an array of size equal to the number of nets plus one (11 in this example), and pins is an array of size equal to the number of pins in the hypergraph (31 in this example). Cells connected to net n_j are stored in pins[xpins[j]] through pins[xpins[j+1]-1].
PaToH (Partitioning Tool for Hypergraphs). Fig. 2

A sample hypergraph and its representation

Stand-Alone Program

The distribution includes a stand-alone program, called patoh, for single-constraint partitioning (this executable does not work with multiple vertex weights; for multi-constraint partitioning there is a library interface and some sample source codes). The program patoh gets its parameters from command-line arguments. PaToH can be run from the command line as follows:

> patoh <hypergraph-file> <number-of-parts> [[parameter1] [parameter2] ...]

Partitioning can be customized by using the optional [parameter] arguments. The syntax of these optional parameters is as follows: a two-letter abbreviation of a parameter is followed by an equals sign and a value. For example, if the user wishes to change the refinement algorithm (abbreviated as “RA”) to “Kernighan–Lin with dynamic locking” (the sixth of the 12 algorithms implemented in PaToH), the user should specify “RA=6.” For a complete example, consider the sample hypergraph displayed in Fig. 2. In order to partition this hypergraph into three parts by using the Kernighan–Lin refinement algorithm with the cut-net metric (the default is the connectivity metric (2b)), one has to issue the following command, whose output is shown next:

This output shows that the cutsize (cut cost) according to the cut-net metric is 2. The final imbalance ratios (in parentheses) for the least loaded and the most loaded parts are both 0% (perfect balance, with four vertices in each part), and partitioning took only about 1 ms. The input hypergraph and the resulting partition are displayed in Fig. 3. A quick summary of the input file format (the details are provided in the PaToH manual [9]) is as follows: the first non-comment line of the file is a header containing the index base (0 or 1) and the size of the hypergraph; information for each net (only pins in this case) and for each cell (none in this example) follows.
PaToH (Partitioning Tool for Hypergraphs). Fig. 3

Text file representation of the sample hypergraph in Fig. 2 and illustration of a partition found by PaToH

All of the PaToH customization parameters that are available through the library interface are also available as command-line options. The PaToH manual [9] contains details of each of those customization parameters.

Customizing PaToH’s Hypergraph Partitioning

PaToH achieves K-way hypergraph partitioning through recursive bisection (two-way partitioning); at each bisection step, it uses a multilevel hypergraph bisection algorithm. In recursive bisection, first a bisection of \(\mathcal{H}\) is obtained, and then each part of this bipartition is further partitioned recursively. After \({\log }_{2}K\) levels of recursion, hypergraph \(\mathcal{H}\) is partitioned into K parts. Note that K is not restricted to be a power of 2: for any K > 1, one can achieve K-way hypergraph partitioning through recursive bisection by first partitioning \(\mathcal{H}\) into two parts with a load ratio of \(\lfloor K/2\rfloor \) to \((K -\lfloor K/2\rfloor )\), and then recursively partitioning those parts into \(\lfloor K/2\rfloor \) and \((K -\lfloor K/2\rfloor )\) parts, respectively, using the same approach.

The multilevel hypergraph bisection algorithm used in PaToH has three main phases: coarsening, initial partitioning, and uncoarsening. In the first phase, bottom-up multilevel clustering is successively applied, starting from the original hypergraph, until either the number of vertices in the coarsened hypergraph drops below a predetermined threshold or clustering fails to reduce the size of the hypergraph significantly. In the second phase, the coarsest hypergraph is bipartitioned using one of 12 initial partitioning techniques. In the third phase, the partition found in the second phase is successively projected back towards the original hypergraph while being improved by one of the iterative refinement heuristics. These three phases are summarized below.

1. Coarsening Phase: In this phase, the given hypergraph \(\mathcal{H}\! =\! {\mathcal{H}}_{0}\! =\! ({\mathcal{V}}_{0},{\mathcal{N}}_{0})\) is coarsened into a sequence of smaller hypergraphs \({\mathcal{H}}_{1}\! =\! ({\mathcal{V}}_{1},{\mathcal{N}}_{1})\), \({\mathcal{H}}_{2}\! =\! ({\mathcal{V}}_{2},{\mathcal{N}}_{2})\), \(\ldots \), \({\mathcal{H}}_{\mathcal{l}}\! =\! ({\mathcal{V}}_{\mathcal{l}},{\mathcal{N}}_{\mathcal{l}})\) satisfying \(\vert {\mathcal{V}}_{0}\vert \!>\! \vert {\mathcal{V}}_{1}\vert \!\! >\! \vert {\mathcal{V}}_{2}\vert \! > \ldots >\! \vert {\mathcal{V}}_{\mathcal{l}}\vert \). This coarsening is achieved by coalescing disjoint subsets of vertices of hypergraph \({\mathcal{H}}_{i}\) into clusters such that each cluster in \({\mathcal{H}}_{i}\) forms a single vertex of \({\mathcal{H}}_{i+1}\). The weight of each vertex of \({\mathcal{H}}_{i+1}\) becomes equal to the sum of the weights of the constituent vertices of the respective cluster in \({\mathcal{H}}_{i}\). The net set of each vertex of \({\mathcal{H}}_{i+1}\) becomes equal to the union of the net sets of the constituent vertices of the respective cluster in \({\mathcal{H}}_{i}\). Here, multiple pins of a net \(n\! \in \!{\mathcal{N}}_{i}\) in a cluster of \({\mathcal{H}}_{i}\) are contracted to a single pin of the respective net \(n^\prime\! \in \!{\mathcal{N}}_{i+1}\) of \({\mathcal{H}}_{i+1}\). Furthermore, single-pin nets obtained during this contraction are discarded. The coarsening phase terminates when the number of vertices in the coarsened hypergraph drops below the predetermined threshold or clustering fails to reduce the size of the hypergraph significantly.

In PaToH, two types of clustering are implemented: matching-based, where each cluster contains at most two vertices, and agglomerative, where clusters can have more than two vertices. The former is simply called matching in PaToH, and the latter is called clustering.

The matching-based clustering works as follows. Vertices of \({\mathcal{H}}_{i}\) are visited in a user-specified order (could be random, degree sorted, etc.). If a vertex \(u\! \in \!{\mathcal{V}}_{i}\) has not been matched yet, one of its unmatched adjacent vertices is selected according to a criterion. If such a vertex v exists, the matched pair u and v are merged into a cluster. If there is no unmatched adjacent vertex of u, then vertex u remains unmatched, that is, u remains as a singleton cluster. Here, two vertices u and v are said to be adjacent if they share at least one net, that is, \(nets[u] \cap nets[v]\neq \varnothing \).

In the agglomerative clustering schemes, each vertex u is assumed to constitute a singleton cluster \({C}_{u}\! =\!\{ u\}\) at the beginning of each coarsening level. Then, vertices are again visited in a user-specified order. If a vertex u has already been clustered (i.e., \(\vert {C}_{u}\vert \!>\! 1\) ), it is not considered as the source of a new clustering. However, an unclustered vertex u can choose to join a multi-vertex cluster as well as a singleton cluster. That is, all adjacent vertices of an unclustered vertex u are considered for selection according to a criterion. The selection of a vertex v adjacent to u corresponds to including vertex u in cluster C_v to grow a new multi-vertex cluster \({C}_{u}\! =\! {C}_{v}\! =\! {C}_{v} \cup \{ u\}\).

PaToH includes a total of 17 coarsening algorithms: eight matching and nine clustering algorithms; the default is a clustering algorithm that uses the absorption metric. In this method, when selecting an adjacent vertex v to cluster with vertex u, vertex v is selected to maximize \({\sum \nolimits }_{n\in nets[u]\cap nets[{C}_{v}]}\frac{\vert {C}_{v}\cap n\vert } {s[n]-1}\), where \(nets[{C}_{v}] = {\cup }_{w\in {C}_{v}}nets[w]\).

2. Initial Partitioning Phase: The goal in this phase is to find a bipartition of the coarsest hypergraph \({\mathcal{H}}_{\mathcal{l}}\). PaToH includes various random partitioning methods as well as variations of the Greedy Hypergraph Growing (GHG) algorithm for bisecting \({\mathcal{H}}_{\mathcal{l}}\). In GHG, a cluster is grown around a randomly selected vertex. During the course of the algorithm, the selected and unselected vertices induce a bipartition of \({\mathcal{H}}_{\mathcal{l}}\). The unselected vertices connected to the growing cluster are inserted into a priority queue according to their move gains [15], where the gain of an unselected vertex corresponds to the decrease in the cutsize of the current bipartition if the vertex moves to the growing cluster. Then, a vertex with the highest gain is selected from the priority queue. After a vertex moves to the growing cluster, the gains of its unselected adjacent vertices that are currently in the priority queue are updated, and those not in the priority queue are inserted. This cluster-growing operation continues until a predetermined bipartition balance criterion is reached. The quality of this algorithm is sensitive to the choice of the initial random vertex. Since the coarsest hypergraph \({\mathcal{H}}_{\mathcal{l}}\) is small, the initial partitioning heuristics can be run multiple times, and the best bipartition can be selected for refinement during the uncoarsening phase. By default, PaToH runs 11 different initial partitioning algorithms and selects the bipartition with the lowest cost.

3. Uncoarsening Phase: At each level i (for \(i = \mathcal{l},\mathcal{l}\! -\! 1,\ldots, 1\) ), the bipartition \({\Pi }_{i}\) found on \({\mathcal{H}}_{i}\) is projected back to a bipartition \({\Pi }_{i-1}\) on \({\mathcal{H}}_{i-1}\): the constituent vertices of each cluster in \({\mathcal{H}}_{i-1}\) are assigned to the part of the respective vertex in \({\mathcal{H}}_{i}\). Obviously, \({\Pi }_{i-1}\) of \({\mathcal{H}}_{i-1}\) has the same cutsize as \({\Pi }_{i}\) of \({\mathcal{H}}_{i}\). This bipartition is then refined by running a KL/FM-based iterative improvement heuristic on \({\mathcal{H}}_{i-1}\), starting from the initial bipartition \({\Pi }_{i-1}\). PaToH provides 12 refinement algorithms that are based on the well-known Kernighan–Lin (KL) [20] and Fiduccia–Mattheyses (FM) [15] algorithms. These iterative algorithms try to improve the given partition by either swapping vertices between parts or moving vertices from one part to the other, while not violating the balance criterion, and they provide heuristic mechanisms to avoid local minima. The algorithms operate in passes. In each pass, a sequence of unmoved/unswapped vertices with the highest gains is selected for move/swap, one by one. At the end of a pass, the prefix of this move/swap sequence with the maximum prefix gain sum, that is, the prefix incurring the maximum decrease in the cutsize, is identified; realizing only this prefix allows the method to jump over local minima. The moves/swaps in this maximum prefix are made permanent efficiently by rolling back the moves that follow it in the overall sequence. The refinement process at a level terminates when the maximum prefix sum of a pass is not positive.

PaToH includes original KL and FM implementations and hybrid versions, such as one pass of FM followed by one pass of KL, as well as improvements such as the multilevel-gain concept of Krishnamurthy [21], which adds a look-ahead ability, and the dynamic locking of Hoffman [17] and of Dasdan and Aykanat [14], which relaxes vertex locking by allowing a vertex to be moved multiple times in the same pass. PaToH also provides heuristic trade-offs to speed up refinement, such as early termination of a KL/FM pass, or boundary KL/FM, which only considers vertices on the boundary. The default refinement scheme is boundary FM+KL.

Related Entries

Chaco

Data Distribution

Graph Algorithms

Graph Partitioning

Hypergraph Partitioning

Linear Algebra, Numerical

Preconditioners for Sparse Iterative Methods

Bibliographic Notes and Further Reading

The latest PaToH binary distributions, including the recently developed MATLAB interface [26], and related papers can be found on the Web site listed in [9]. The “Hypergraph Partitioning” entry contains some use cases of hypergraph partitioning.

Copyright information

© Springer Science+Business Media, LLC 2011