Abstract:
Graph Neural Networks (GNNs) have drawn great research attention for graph machine learning. However, graph learning techniques are difficult to deploy in industrial applications owing to the scalability challenges incurred by data dependency. Although some works attempt to bridge efficient MLPs with GNNs via knowledge distillation (KD), they are primarily designed for graphs in Euclidean spaces, which cannot provide the most suitable geometry for graph representation, as numerous real-world graphs display a combination of Euclidean and hyperbolic structure. To achieve comprehensive expression for complex graph data with high efficiency, in this paper we propose a novel Advanced Graph-MLPs Distillation Framework (AGMDF) based on global and local hyperbolic geometry learning. The key idea of our method is to fully exploit complex graphs with additional hyperbolic properties through knowledge distillation. Specifically, global cross-geometric knowledge fusion exploits complementary information learned from the Euclidean and hyperbolic views. Then, local hyperbolic knowledge enhancement employs prominent tree-like components of the graph data to improve representation ability. Thus, the distilled MLP model enjoys high expressiveness and graph context-awareness based on global and local hyperbolic geometry learning. Extensive experiments show that AGMDF achieves accuracy competitive with GNNs and improves over stand-alone MLPs by 21.84% on average, while inferring faster than GNNs across five benchmark datasets.
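The abstract builds on two ingredients whose generic forms are standard: (1) distilling a GNN teacher's soft predictions into an MLP student, and (2) measuring distances in hyperbolic space (e.g., the Poincaré ball), whose negative curvature suits tree-like graph structure. AGMDF itself is not reproduced here; the following is a minimal numpy sketch of these two generic building blocks, with all function names and the temperature/curvature settings being illustrative assumptions rather than the paper's implementation.

```python
import numpy as np

def softmax(z, T=1.0):
    # Temperature-scaled softmax over the last axis (numerically stabilized).
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def kd_loss(student_logits, teacher_logits, T=2.0, eps=1e-12):
    # Standard soft-label distillation: KL(teacher || student) on
    # temperature-softened distributions, scaled by T^2 (Hinton et al.).
    # Here the teacher would be a GNN and the student a plain MLP.
    p = softmax(teacher_logits, T)  # soft targets from the teacher
    q = softmax(student_logits, T)  # student predictions
    kl = (p * (np.log(p + eps) - np.log(q + eps))).sum(axis=-1)
    return float(kl.mean() * T * T)

def poincare_dist(u, v, eps=1e-12):
    # Geodesic distance on the Poincaré ball (curvature -1):
    # d(u, v) = arccosh(1 + 2 ||u-v||^2 / ((1 - ||u||^2)(1 - ||v||^2))).
    # Distances blow up near the boundary, which is what lets hyperbolic
    # embeddings represent tree-like hierarchies with low distortion.
    u, v = np.asarray(u, float), np.asarray(v, float)
    sq = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return float(np.arccosh(1.0 + 2.0 * sq / (denom + eps)))

# Illustration: an MLP matching its GNN teacher incurs ~zero KD loss,
# and hyperbolic distance from the origin grows faster than Euclidean
# distance as a point approaches the ball's boundary.
teacher = np.array([[2.0, 0.5, -1.0], [0.1, 1.5, 0.3]])
student = teacher.copy()
print(kd_loss(student, teacher))                  # ~0 for a perfect match
print(poincare_dist([0.0, 0.0], [0.9, 0.0]))      # >> Euclidean 0.9
```

For a point at Euclidean radius r from the origin, the Poincaré distance is 2·artanh(r), so it diverges as r → 1; this is the geometric property the abstract appeals to for "tree-likeness" in graphs.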
Published in: ICASSP 2025 - 2025 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Date of Conference: 06-11 April 2025
Date Added to IEEE Xplore: 07 March 2025