Abstract
Graph representation learning with the family of graph convolutional networks (GCNs) provides powerful tools for prediction on graphs. As graphs grow with more edges, the GCN family suffers from sub-optimal generalization performance due to task-irrelevant connections. Recent studies address this problem with graph sparsification inside neural networks. However, existing graph sparsification methods cannot generate ultra-sparse graphs while simultaneously maintaining the performance of the GCN family. To address this limitation, we propose Graph Ultra-sparsifier, a semi-supervised graph sparsifier with dynamically-updated regularization terms based on the graph convolution. Graph Ultra-sparsifier produces ultra-sparse graphs on which the GCN family still performs well. In experiments, compared with state-of-the-art graph sparsifiers, our Graph Ultra-sparsifier generates ultra-sparse graphs that, when used as inputs, maintain the performance of GCN and its variants on node classification tasks.
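The paper itself is not reproduced here beyond the abstract, so the snippet below is only a rough, hypothetical sketch of the general idea the abstract and keywords suggest: a learnable edge mask trained jointly with a GCN under a reweighted-L1 sparsity penalty (cf. the "reweighted optimization" keyword), then thresholded to obtain a sparse graph. All names (`EdgeMaskedGCN`, `reweighted_l1`, `train`), the mask parameterization, and the hyperparameters are assumptions for illustration, not the authors' implementation.

```python
# Hypothetical sketch: learnable edge mask + two-layer GCN, trained with a
# reweighted L1 penalty so most mask entries are driven to zero, then
# thresholded to yield an ultra-sparse graph. Not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EdgeMaskedGCN(nn.Module):
    """Two-layer GCN whose dense adjacency is gated by a learnable edge mask."""

    def __init__(self, adj: torch.Tensor, in_dim: int, hid_dim: int, n_cls: int):
        super().__init__()
        self.register_buffer("adj", adj)                         # original dense adjacency
        self.mask_logits = nn.Parameter(torch.zeros_like(adj))   # one logit per edge slot
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, n_cls)

    def masked_adj(self) -> torch.Tensor:
        # Sigmoid gate restricted to existing edges; symmetrized.
        m = torch.sigmoid(self.mask_logits) * self.adj
        return (m + m.t()) / 2

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        a = self.masked_adj()
        a_hat = a + torch.eye(a.size(0), device=a.device)        # add self-loops
        d_inv_sqrt = a_hat.sum(1).clamp(min=1e-6).pow(-0.5)      # D^{-1/2}
        a_norm = d_inv_sqrt[:, None] * a_hat * d_inv_sqrt[None, :]
        h = F.relu(a_norm @ self.lin1(x))
        return a_norm @ self.lin2(h)


def reweighted_l1(mask: torch.Tensor, eps: float = 1e-3) -> torch.Tensor:
    """Reweighted L1: small entries get larger weights, pushing them toward zero."""
    w = 1.0 / (mask.detach().abs() + eps)
    return (w * mask.abs()).sum()


def train(model, x, labels, train_idx, lam=1e-4, epochs=200):
    """Semi-supervised training: cross-entropy on labeled nodes + sparsity penalty."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    for _ in range(epochs):
        opt.zero_grad()
        logits = model(x)
        loss = F.cross_entropy(logits[train_idx], labels[train_idx])
        loss = loss + lam * reweighted_l1(model.masked_adj())    # sparsity regularizer
        loss.backward()
        opt.step()
    # Threshold the learned mask to obtain the ultra-sparse graph.
    return (model.masked_adj() > 0.5).float()
```

In this sketch, the reweighted penalty up-weights small mask entries on each step, pushing most gates toward zero more aggressively than a plain L1 term, and the final threshold converts the soft mask into an explicit sparse adjacency that could be fed back into a GCN variant as input.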
Original language | English (US)
---|---
Journal | ICASSP, IEEE International Conference on Acoustics, Speech and Signal Processing - Proceedings
State | Published - 2023
Event | 48th IEEE International Conference on Acoustics, Speech and Signal Processing, ICASSP 2023 - Rhodes Island, Greece. Duration: Jun 4 2023 → Jun 10 2023
Keywords
- Graph neural network
- Graph sparsifier
- Reweighted optimization
ASJC Scopus subject areas
- Software
- Signal Processing
- Electrical and Electronic Engineering