Abstract:
Graph convolutional neural networks (GCNN) have been widely used in graph learning and related applications. It has been identified that the filters in the state-of-the-art spectral graph convolutional networks (SGCNs) are essentially low-pass filters that enforce smoothness across the graph and use functions of the graph Laplacian as a tool that injects graph structure into the learning algorithm. There have been research findings that connect the smoothness functional on graphs, the graph Laplacian, and regularization operators. We review the existing SGCNs in this context and propose a framework from which the state-of-the-art filter designs can be deduced as special cases. We design new filters associated with well-defined low-pass behavior and test their performance on semisupervised node classification tasks, where they are found to outperform other state-of-the-art techniques. We further investigate the representation capability of low-pass features and make useful observations. In this context, we discuss a few points for further optimizing the network, new strategies for designing SGCNs, their challenges, and some recent related developments. Based on our framework, we also deduce the connection between support vector kernels and SGCN filters.
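As a point of reference for the low-pass behavior discussed above, the following is a minimal sketch (not the paper's specific filter designs) using the standard renormalized adjacency D^{-1/2}(A + I)D^{-1/2} common to many SGCNs; the toy graph and variable names are purely illustrative.

```python
import numpy as np

# Toy undirected graph (4 nodes) given by its adjacency matrix (illustrative only).
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

# Renormalized adjacency used by many SGCNs: P = D^{-1/2} (A + I) D^{-1/2}.
A_hat = A + np.eye(len(A))
D_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
P = D_inv_sqrt @ A_hat @ D_inv_sqrt

# Spectral view: with L = I - P the corresponding normalized Laplacian,
# P has frequency response h(lambda) = 1 - lambda, which decays with lambda,
# i.e., high-frequency (non-smooth) graph signal components are attenuated.
L = np.eye(len(A)) - P
eigvals = np.linalg.eigvalsh(L)
print("Laplacian eigenvalues:", np.round(eigvals, 3))
print("Filter response 1 - lambda:", np.round(1 - eigvals, 3))

# Repeated propagation P @ x smooths a graph signal toward its low-frequency part.
x = np.array([1.0, -1.0, 1.0, -1.0])  # a highly non-smooth signal
for _ in range(3):
    x = P @ x
print("Smoothed signal:", np.round(x, 3))
```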
Published in: IEEE Transactions on Neural Networks and Learning Systems (Volume: 35, Issue: 4, April 2024)