Attentive Graph Neural Networks for Few-Shot Learning | IEEE Conference Publication | IEEE Xplore

Attentive Graph Neural Networks for Few-Shot Learning



Abstract:

Graph Neural Networks (GNNs) have demonstrated superior performance in many challenging applications, including few-shot learning tasks. Despite their powerful capacity to learn and generalize from a few samples, GNNs usually suffer from severe over-fitting and over-smoothing as the model becomes deep, which limits their scalability. In this work, we propose a novel Attentive GNN (AGNN) to tackle these challenges by incorporating a triple-attention mechanism, i.e., node self-attention, neighborhood attention, and layer memory attention. We explain, with theoretical analysis and illustrations, why the proposed attentive modules improve GNNs for few-shot learning. Extensive experiments demonstrate that the proposed AGNN model achieves promising results compared with state-of-the-art GNN- and CNN-based few-shot learning methods on the mini-ImageNet and tiered-ImageNet benchmarks, using a ConvNet-4 backbone under both inductive and transductive settings.
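The abstract names the mechanism but not its equations. As a rough illustration only (the paper's exact formulation is not given here, and all weight names below are hypothetical), attention-weighted neighbor aggregation in a GNN layer can be sketched as masked scaled dot-product attention restricted to graph edges:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attentive_gnn_layer(H, A, Wq, Wk, Wv):
    """One attention-weighted message-passing step (illustrative sketch).

    H: (N, d) node features; A: (N, N) adjacency with self-loops.
    Pairwise attention scores are masked to graph edges, normalized,
    and used to weight neighbor aggregation.
    """
    Q, K, V = H @ Wq, H @ Wk, H @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    scores = np.where(A > 0, scores, -1e9)  # attend only to neighbors
    alpha = softmax(scores, axis=1)         # neighborhood attention weights
    return alpha @ V                        # attention-weighted aggregation

rng = np.random.default_rng(0)
N, d = 4, 8
H = rng.normal(size=(N, d))
# Path graph on 4 nodes, plus self-loops.
A = np.eye(N) + np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
H_next = attentive_gnn_layer(H, A, Wq, Wk, Wv)
print(H_next.shape)  # (4, 8)
```

The masking step is what distinguishes graph attention from dense self-attention: non-adjacent node pairs receive (near-)zero weight, so aggregation respects the graph topology.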
Date of Conference: 02-04 August 2022
Date Added to IEEE Xplore: 08 September 2022
Conference Location: CA, USA

