
HGNAS++: Efficient Architecture Search for Heterogeneous Graph Neural Networks



Abstract:

Heterogeneous graphs are commonly used to describe networked data with multiple types of nodes and edges. Heterogeneous Graph Neural Networks (HGNNs) are powerful tools for analyzing heterogeneous graphs. However, designing neural architectures for HGNNs requires extensive domain knowledge and time-consuming manual work. Recently, neural architecture search algorithms have become popular for automatically designing neural architectures for homogeneous graph neural networks. In this paper, we present a Heterogeneous Graph Neural Architecture Search algorithm (HGNAS) that automatically designs heterogeneous graph neural architectures. Specifically, HGNAS first defines a new search space based on existing popular HGNNs. Then, HGNAS uses a policy network as a controller to sample architectures and find the best one in the search space by maximizing the expected accuracy of the selected architectures on a given validation dataset. Moreover, we design a new method, HGNAS++, that improves the efficiency of HGNAS by training the RNN controller within a generative adversarial learning framework. The basic idea of HGNAS++ is to embed a pairwise ranker into the reinforcement-learning-based architecture search algorithm. The pairwise ranker acts as a discriminator that selects the more accurate architecture from each pair of candidates. The RNN controller can then be updated more efficiently using only the relatively small number of candidate architectures selected by the pairwise ranker. Experiments on real-world heterogeneous graph datasets show that HGNAS is capable of designing novel HGNNs that outperform the best human-invented HGNNs. On the benchmark datasets, HGNAS++ improves on HGNAS in terms of evaluation cost, evaluating 50% fewer candidate architectures and reducing search time by 24% on average. As a byproduct, HGNAS++ can find sparse yet powerful neural architectures for HGNNs.
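The pairwise-filtering idea behind HGNAS++ can be sketched as follows. This is an illustrative toy, not the paper's implementation: a simple oracle comparator stands in for the learned pairwise ranker, and the "architectures" are plain dictionaries with a hidden accuracy field. The point it demonstrates is that a single elimination pass over candidate pairs halves the number of architectures that must be fully evaluated before the controller update.

```python
import random

def filter_candidates(candidates, ranker):
    """Single-elimination pass: compare candidates in pairs and keep
    only the predicted winner of each pair, roughly halving the number
    of architectures that need full training and evaluation."""
    winners = []
    for i in range(0, len(candidates) - 1, 2):
        a, b = candidates[i], candidates[i + 1]
        winners.append(a if ranker(a, b) else b)
    if len(candidates) % 2:  # an odd leftover advances unchallenged
        winners.append(candidates[-1])
    return winners

# Toy stand-in: each "architecture" carries a hidden true accuracy,
# and the ranker is an oracle that compares accuracies directly.
# In HGNAS++ the ranker is a learned discriminator instead.
random.seed(0)
candidates = [{"id": i, "acc": random.random()} for i in range(8)]
oracle = lambda a, b: a["acc"] >= b["acc"]

survivors = filter_candidates(candidates, oracle)
print(len(candidates), "->", len(survivors))  # 8 -> 4
```

With a perfect comparator, the globally best candidate always survives its pair, so the controller's reward signal is computed over a smaller but higher-quality pool of architectures.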
Published in: IEEE Transactions on Knowledge and Data Engineering ( Volume: 35, Issue: 9, 01 September 2023)
Page(s): 9448 - 9461
Date of Publication: 07 February 2023


