Joint Partitioning and Sampling Algorithm for Scaling Graph Neural Network


Abstract:

Graph Neural Network (GNN) has emerged as a popular toolbox for solving complex problems on graph data structures. Graph neural networks use machine learning techniques to learn vector representations of nodes and/or edges. Learning these representations demands a huge amount of memory and computing power. Traditional shared-memory multiprocessors are insufficient to meet the computing requirements of real-world data; hence, research has gained momentum toward distributed GNNs. Scaling distributed GNNs poses the following challenges: (1) the input graph needs to be partitioned efficiently, (2) the cost of communication between compute nodes should be reduced, and (3) the sampling strategy should be chosen carefully to minimize the loss in accuracy. To address these challenges, we propose a joint partitioning and sampling algorithm, which partitions the input graph with weighted METIS and uses a biased sampling strategy to minimize total communication costs. We implemented our approach using the DistDGL framework and evaluated it on several real-world datasets. Compared to the state-of-the-art DistDGL implementation, our approach (1) reduces communication overhead by 53% on average, (2) requires less time to partition a graph, (3) shows improved accuracy, and (4) achieves a speedup of 1.5x on the OGB-Arxiv dataset.
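The two ingredients named in the abstract, weighted METIS partitioning and biased neighbor sampling, can be illustrated with a minimal sketch. The sketch below is not the paper's implementation: it assumes pymetis for the weighted METIS step and DGL's sample_neighbors with an edge-probability feature for the biased sampling step, and the comm_cost and prob edge attributes are hypothetical stand-ins for whatever cost and bias model the paper actually uses.

    # Hedged sketch: weighted METIS partitioning plus biased neighbor sampling.
    # NOT the paper's implementation; 'comm_cost' and 'prob' are placeholder
    # edge attributes standing in for the paper's communication-cost model.
    import dgl
    import pymetis
    import torch

    # Toy undirected 6-node graph (each edge listed in both directions for DGL).
    src = torch.tensor([0, 0, 1, 1, 1, 2, 2, 3, 3, 4, 4, 4, 5, 5])
    dst = torch.tensor([1, 4, 0, 2, 4, 1, 3, 2, 5, 0, 1, 5, 3, 4])
    g = dgl.graph((src, dst), num_nodes=6)

    # Symmetric integer edge weights approximating the communication cost of
    # cutting each edge; METIS tries to keep heavy edges inside one partition.
    comm_cost = [3, 1, 3, 2, 1, 2, 4, 4, 1, 1, 1, 2, 1, 2]

    # CSR layout of the same graph for pymetis (xadj / adjncy / eweights).
    xadj = [0, 2, 5, 7, 9, 12, 14]
    adjncy = dst.tolist()
    n_cuts, parts = pymetis.part_graph(
        nparts=2, xadj=xadj, adjncy=adjncy, eweights=comm_cost)
    print("edge cut:", n_cuts, "partition id per node:", parts)

    # Biased sampling: DGL draws neighbors with probability proportional to an
    # edge feature, so sampling can be skewed toward edges whose endpoints fall
    # in the same partition (i.e., toward cheaper, local neighbors).
    g.edata["prob"] = torch.tensor(
        [1.0 if parts[s] == parts[d] else 0.2
         for s, d in zip(src.tolist(), dst.tolist())])
    seed_nodes = torch.tensor([0, 3])
    frontier = dgl.sampling.sample_neighbors(g, seed_nodes, fanout=2, prob="prob")
    print(frontier.edges())

Under these assumptions, METIS keeps heavily weighted edges inside a partition and the sampling bias keeps most sampled neighbors on the local machine, which is the intuition behind the reduction in communication overhead that the abstract reports.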
Date of Conference: 18-21 December 2022
Date Added to IEEE Xplore: 26 April 2023
Conference Location: Bengaluru, India
