In this paper we introduce the Graph Echo State Network (GraphESN) model, a generalization of the Echo State Network (ESN) approach to graph domains. GraphESNs provide an efficient approach to Recursive Neural Network (RecNN) modeling, extended to deal with cyclic/acyclic, directed/undirected, labeled graphs. The recurrent reservoir of the network computes a fixed contractive encoding function over graphs and is left untrained after initialization, while a feed-forward readout implements an adaptive linear output function. Contractivity of the state transition function implies a Markovian characterization of the state dynamics and stability of the state computation in the presence of cycles. Due to the use of a fixed (untrained) encoding, the model represents both an extremely efficient version of, and a performance baseline for, recursive models with trained connections. Performance is shown on standard benchmark tasks from chemical domains, allowing comparison with both neural network and kernel-based approaches for graphs.
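The fixed contractive encoding described in the abstract can be sketched as follows. The reservoir state of each node is computed from its input label and its neighbors' states, and the update is iterated to a fixed point; rescaling the recurrent weights so that their spectral norm times the maximum node degree stays below 1 makes the map contractive, which guarantees convergence even on cyclic graphs. The reservoir size, weight scales, and mean-pooling graph-state mapping below are illustrative assumptions for this sketch, not the paper's exact configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

N_U, N_R = 4, 30   # input label size and reservoir size (hypothetical values)
MAX_DEG = 3        # assumed maximum node degree over the input graphs

# Fixed, untrained weights. W_hat is rescaled so that the state transition
# is contractive: spectral_norm(W_hat) * MAX_DEG < 1, with tanh 1-Lipschitz,
# gives Markovian dynamics and a unique fixed point even on cyclic graphs.
W_in = rng.uniform(-0.1, 0.1, (N_R, N_U))
W_hat = rng.uniform(-1.0, 1.0, (N_R, N_R))
W_hat *= 0.9 / (MAX_DEG * np.linalg.norm(W_hat, 2))

def encode(labels, adj, tol=1e-6, max_iter=200):
    """Iterate the contractive state transition to its fixed point.
    labels: (n_nodes, N_U) array of node labels; adj: list of neighbor lists."""
    n = len(labels)
    x = np.zeros((n, N_R))
    for _ in range(max_iter):
        x_new = np.stack([
            np.tanh(W_in @ labels[v]
                    + sum((W_hat @ x[w] for w in adj[v]), np.zeros(N_R)))
            for v in range(n)
        ])
        if np.max(np.abs(x_new - x)) < tol:  # fixed point reached
            return x_new
        x = x_new
    return x

# A 3-node cycle (triangle): the cyclic structure poses no stability problem
# because the encoding is contractive.
labels = rng.uniform(-1.0, 1.0, (3, N_U))
adj = [[1, 2], [0, 2], [0, 1]]
states = encode(labels, adj)
graph_state = states.mean(axis=0)  # graph-level state via mean pooling
```

The node states (or the pooled graph state) would then feed a linear readout, which is the only trained part of the model; the readout weights can be fit by any linear method, e.g. ridge regression or pseudo-inverse.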
Date of Conference: 18-23 July 2010