
MQ-GNN: A Multi-Queue Pipelined Architecture for Scalable and Efficient GNN Training


Abstract:

Graph Neural Networks (GNNs) are powerful tools for learning graph-structured data, but their scalability is hindered by inefficient mini-batch generation, data transfer bottlenecks, and costly inter-GPU synchronization. Existing training frameworks fail to overlap these stages, leading to suboptimal resource utilization. This paper proposes MQ-GNN, a multi-queue pipelined framework that maximizes training efficiency by interleaving GNN training stages and optimizing resource utilization. MQ-GNN introduces the Ready-to-Update Asynchronous Consistent Model (RaCoM), which enables asynchronous gradient sharing and model updates while ensuring global consistency through adaptive periodic synchronization. Additionally, it employs global neighbor sampling with caching to reduce data transfer overhead and an adaptive queue-sizing strategy to balance computation and memory efficiency. Experiments on four large-scale datasets and ten baseline models demonstrate that MQ-GNN achieves up to 4.6× faster training time and 30% improved GPU utilization while maintaining competitive accuracy. These results establish MQ-GNN as a scalable and efficient solution for multi-GPU GNN training. The code is available at MQ-GNN.
Graphical abstract: This figure illustrates the MQ-GNN framework, which integrates multi-queue pipelining to optimize mini-batch generation, data transfer, computation, and gradient updates.
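
To make the pipelining idea concrete, the sketch below shows a minimal multi-queue training loop in PyTorch that loosely mirrors the stages named in the abstract: CPU-side mini-batch generation, host-to-GPU transfer on a separate CUDA stream, computation with local (asynchronous-style) optimizer updates, and a placeholder hook for periodic global synchronization. This is an illustrative sketch under assumed names (sample_minibatch, QUEUE_SIZE, SYNC_PERIOD), not the authors' MQ-GNN implementation, which additionally uses global neighbor sampling with caching and adaptive queue sizing.

# Illustrative sketch only: a minimal multi-queue training pipeline in PyTorch,
# loosely following the stages described in the abstract. All names here
# (sample_minibatch, QUEUE_SIZE, SYNC_PERIOD) are hypothetical and do not
# come from the MQ-GNN code base.
import queue
import threading
import torch

QUEUE_SIZE = 4    # bounded queues decouple stages (the paper sizes these adaptively)
SYNC_PERIOD = 8   # steps between periodic global synchronizations (RaCoM-style)

def sample_minibatch(step):
    """Hypothetical CPU-side sampler producing node features and labels."""
    x = torch.randn(1024, 128)           # stand-in for sampled neighbor features
    y = torch.randint(0, 10, (1024,))    # stand-in for node labels
    return x, y

def producer(batch_q, num_steps):
    """Stage 1: mini-batch generation on the CPU."""
    for step in range(num_steps):
        batch_q.put(sample_minibatch(step))
    batch_q.put(None)  # sentinel marks the end of the epoch

def transfer(batch_q, gpu_q, device):
    """Stage 2: host-to-device copies on a dedicated CUDA stream."""
    stream = torch.cuda.Stream(device) if device.type == "cuda" else None
    while True:
        item = batch_q.get()
        if item is None:
            gpu_q.put(None)
            break
        x, y = item
        if stream is not None:
            with torch.cuda.stream(stream):
                x = x.to(device, non_blocking=True)
                y = y.to(device, non_blocking=True)
            stream.synchronize()
        gpu_q.put((x, y))

def train(gpu_q, model, optimizer, device):
    """Stage 3: computation with local updates and periodic synchronization."""
    step = 0
    while True:
        item = gpu_q.get()
        if item is None:
            break
        x, y = item
        loss = torch.nn.functional.cross_entropy(model(x), y)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()          # local update without waiting on other workers
        step += 1
        if step % SYNC_PERIOD == 0:
            pass                  # placeholder: average parameters across GPUs here

if __name__ == "__main__":
    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    model = torch.nn.Linear(128, 10).to(device)   # stand-in for a GNN layer stack
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

    batch_q, gpu_q = queue.Queue(QUEUE_SIZE), queue.Queue(QUEUE_SIZE)
    threads = [
        threading.Thread(target=producer, args=(batch_q, 32)),
        threading.Thread(target=transfer, args=(batch_q, gpu_q, device)),
    ]
    for t in threads:
        t.start()
    train(gpu_q, model, optimizer, device)
    for t in threads:
        t.join()

Because the bounded queues let sampling, transfer, and computation proceed concurrently, the GPU is not left idle while the next mini-batch is being prepared, which is the overlap the paper exploits.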
Published in: IEEE Access (Volume: 13)
Page(s): 27550 - 27569
Date of Publication: 07 February 2025
Electronic ISSN: 2169-3536
