Abstract:
Graph Neural Networks (GNNs) have demonstrated superior performance in many challenging applications, including few-shot learning tasks. Despite their powerful capacity to learn and generalize a model from a few samples, GNNs usually suffer from severe over-fitting and over-smoothing as the model becomes deep, which limits their scalability. In this work, we propose a novel Attentive GNN (AGNN) to tackle these challenges by incorporating a triple-attention mechanism, i.e., node self-attention, neighborhood attention, and layer memory attention. We explain why the proposed attentive modules can improve GNN for few-shot learning with theoretical analysis and illustrations. Extensive experiments demonstrate that the proposed AGNN model achieves promising results compared to state-of-the-art GNN- and CNN-based methods for few-shot learning tasks on the mini-ImageNet and tiered-ImageNet benchmarks, using a ConvNet-4 backbone under both inductive and transductive settings.
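
To make the triple-attention idea concrete, below is a minimal PyTorch sketch of one layer combining the three attention modules named in the abstract. The abstract does not give the paper's formulations, so every design choice here (channel-gating self-attention, dense softmax edge scores, dot-product attention over a list of past-layer outputs) is an assumption for illustration only, not the authors' method.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TripleAttentionGNNLayer(nn.Module):
    """Hypothetical AGNN-style layer: node self-attention,
    neighborhood attention, and layer memory attention.
    All module designs are assumptions; the paper may differ."""

    def __init__(self, dim):
        super().__init__()
        # Node self-attention: gate each node's own feature channels.
        self.self_attn = nn.Sequential(nn.Linear(dim, dim), nn.Sigmoid())
        # Neighborhood attention: score each (sender, receiver) pair.
        self.edge_score = nn.Linear(2 * dim, 1)
        self.update = nn.Linear(2 * dim, dim)

    def forward(self, x, memory):
        # x: (batch, num_nodes, dim); memory: list of earlier layers' outputs.
        n = x.size(1)
        # 1) Node self-attention (channel gating per node).
        x = x * self.self_attn(x)
        # 2) Neighborhood attention: dense pairwise scores, softmax over neighbors.
        xi = x.unsqueeze(2).expand(-1, n, n, -1)  # sender features
        xj = x.unsqueeze(1).expand(-1, n, n, -1)  # receiver features
        a = F.softmax(self.edge_score(torch.cat([xi, xj], dim=-1)).squeeze(-1), dim=-1)
        agg = torch.bmm(a, x)  # attention-weighted neighbor aggregation
        h = F.relu(self.update(torch.cat([x, agg], dim=-1)))
        # 3) Layer memory attention: attend over this layer's output plus all
        #    earlier layers' outputs, so deep stacks retain early-layer signal
        #    (one plausible way to mitigate over-smoothing).
        stack = torch.stack(memory + [h], dim=0)              # (L, batch, nodes, dim)
        w = F.softmax((stack * h.unsqueeze(0)).sum(-1, keepdim=True), dim=0)
        return (w * stack).sum(0)

# Usage sketch: a 5-way 1-shot episode graph with 6 nodes (5 support + 1 query).
layer = TripleAttentionGNNLayer(dim=64)
x = torch.randn(4, 6, 64)          # batch of 4 episodes
out = layer(x, memory=[])          # first layer starts with empty memory

The dense pairwise edge scoring is practical here only because few-shot episode graphs are tiny (one node per support/query sample); a sparse formulation would be needed for large graphs.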
Published in: 2022 IEEE 5th International Conference on Multimedia Information Processing and Retrieval (MIPR)
Date of Conference: 02-04 August 2022
Date Added to IEEE Xplore: 08 September 2022