
Graph Evolving and Embedding in Transformer


Abstract:

This paper presents a novel graph representation that tightly integrates two information sources, the node embedding matrix and the weight matrix, in a graph learning representation. A new parameter-updating method is proposed to dynamically represent the graph network using a specialized transformer. This graph evolving and embedding transformer is built from the weights and node embeddings of graph-structured data, and an attention-based graph learning machine is implemented. In the proposed method, each transformer layer is composed of two attention layers. The first layer calculates the weight matrix of the graph convolutional network together with the self-attention within that matrix itself. The second layer estimates the node embedding and weight matrix together with the cross-attention between them. Graph learning representation is enhanced by these two attention layers. Experiments on three financial prediction tasks demonstrate that this transformer captures temporal information and improves the F1 score and the mean reciprocal rank.
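The abstract's two-attention-layer design can be sketched roughly as follows: a self-attention pass over the weight matrix, then a cross-attention pass where the node embeddings attend to the refined weights. This is a minimal illustrative sketch, not the authors' exact formulation; the matrix shapes, the single-head attention, and the random placeholder data are all assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(Q, K, V):
    # scaled dot-product attention: softmax(QK^T / sqrt(d)) V
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    return softmax(scores) @ V

rng = np.random.default_rng(0)
n, d = 5, 8                       # n nodes, d-dimensional features (assumed sizes)
W = rng.standard_normal((n, d))   # stand-in for the GCN weight matrix
H = rng.standard_normal((n, d))   # stand-in for the node embedding matrix

# Layer 1: self-attention within the weight matrix itself
W_refined = attention(W, W, W)

# Layer 2: cross-attention between node embeddings and the refined weights
H_refined = attention(H, W_refined, W_refined)

print(H_refined.shape)  # (5, 8)
```

Stacking several such two-layer blocks would correspond to the paper's notion of one transformer layer containing both attention operations.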
Date of Conference: 07-10 November 2022
Date Added to IEEE Xplore: 21 December 2022

Conference Location: Chiang Mai, Thailand

