TY - JOUR
T1 - Graph sparsification with graph convolutional networks
AU - Li, Jiayu
AU - Zhang, Tianyun
AU - Tian, Hao
AU - Jin, Shengmin
AU - Fardad, Makan
AU - Zafarani, Reza
N1 - Funding Information:
This research was supported in part by the National Science Foundation under awards CAREER IIS-1942929, CAREER CMMI-1750531, and ECCS-1609916.
Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer Nature Switzerland AG.
PY - 2022/1
Y1 - 2022/1
N2 - Graphs are ubiquitous across the globe and within science and engineering. Powerful classifiers, such as Graph Convolutional Networks (GCNs), have been proposed to classify nodes in graphs. However, as graphs grow in size, node classification on large graphs can become space- and time-consuming because it operates on the whole graph. This raises the questions of whether one can prune some of the edges of a graph while maintaining prediction performance for node classification, or train classifiers on specific subgraphs instead of the whole graph with limited loss in node classification performance. To address these questions, we propose the Sparsified Graph Convolutional Network (SGCN), a neural network graph sparsifier that sparsifies a graph by pruning some of its edges. We formulate sparsification as an optimization problem and solve it with the Alternating Direction Method of Multipliers (ADMM). Our experiments illustrate that SGCN identifies more effective subgraphs for node classification in GCNs than other sparsifiers such as Random Pruning, Spectral Sparsifier, and DropEdge. We also show that sparsified graphs produced by SGCN can serve as inputs to GCN, yielding node classification performance better than or comparable to that of the original graphs in GCN, DeepWalk, GraphSAGE, and GAT. We provide insights into why SGCN performs well by analyzing its behavior from the perspective of a low-pass filter.
AB - Graphs are ubiquitous across the globe and within science and engineering. Powerful classifiers, such as Graph Convolutional Networks (GCNs), have been proposed to classify nodes in graphs. However, as graphs grow in size, node classification on large graphs can become space- and time-consuming because it operates on the whole graph. This raises the questions of whether one can prune some of the edges of a graph while maintaining prediction performance for node classification, or train classifiers on specific subgraphs instead of the whole graph with limited loss in node classification performance. To address these questions, we propose the Sparsified Graph Convolutional Network (SGCN), a neural network graph sparsifier that sparsifies a graph by pruning some of its edges. We formulate sparsification as an optimization problem and solve it with the Alternating Direction Method of Multipliers (ADMM). Our experiments illustrate that SGCN identifies more effective subgraphs for node classification in GCNs than other sparsifiers such as Random Pruning, Spectral Sparsifier, and DropEdge. We also show that sparsified graphs produced by SGCN can serve as inputs to GCN, yielding node classification performance better than or comparable to that of the original graphs in GCN, DeepWalk, GraphSAGE, and GAT. We provide insights into why SGCN performs well by analyzing its behavior from the perspective of a low-pass filter.
KW - Graph convolutional network
KW - Graph sparsification
KW - Node classification
UR - http://www.scopus.com/inward/record.url?scp=85116981187&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85116981187&partnerID=8YFLogxK
U2 - 10.1007/s41060-021-00288-8
DO - 10.1007/s41060-021-00288-8
M3 - Article
AN - SCOPUS:85116981187
SN - 2364-415X
VL - 13
SP - 33
EP - 46
JO - International Journal of Data Science and Analytics
JF - International Journal of Data Science and Analytics
IS - 1
ER -