TY - GEN
T1 - Attentive graph-based recursive neural network for collective vertex classification
AU - Xu, Qiongkai
AU - Wang, Qing
AU - Xu, Chenchen
AU - Qu, Lizhen
N1 - Publisher Copyright:
© 2017 ACM.
PY - 2017/11/6
Y1 - 2017/11/6
N2 - Vertex classification is a critical task in graph analysis, where both the content and linkage of vertices are incorporated during classification. Recently, researchers have proposed using deep neural networks to build end-to-end frameworks that capture both local content and structural information. These approaches proved effective at incorporating the semantic meaning of neighbouring vertices, but the usefulness of this information was not properly considered. In this paper, we propose an Attentive Graph-based Recursive Neural Network (AGRNN), which applies attention within the neural network so that our model focuses on vertices with more relevant semantic information. We evaluated our approach on three real-world datasets as well as datasets with synthetic noise. Our experimental results show that AGRNN achieves state-of-the-art performance in terms of both effectiveness and robustness. We also illustrate sample attention weights to demonstrate the rationality of our model.
AB - Vertex classification is a critical task in graph analysis, where both the content and linkage of vertices are incorporated during classification. Recently, researchers have proposed using deep neural networks to build end-to-end frameworks that capture both local content and structural information. These approaches proved effective at incorporating the semantic meaning of neighbouring vertices, but the usefulness of this information was not properly considered. In this paper, we propose an Attentive Graph-based Recursive Neural Network (AGRNN), which applies attention within the neural network so that our model focuses on vertices with more relevant semantic information. We evaluated our approach on three real-world datasets as well as datasets with synthetic noise. Our experimental results show that AGRNN achieves state-of-the-art performance in terms of both effectiveness and robustness. We also illustrate sample attention weights to demonstrate the rationality of our model.
KW - Attention model
KW - Collective vertex classification
KW - Recursive neural network
UR - http://www.scopus.com/inward/record.url?scp=85037357986&partnerID=8YFLogxK
U2 - 10.1145/3132847.3133081
DO - 10.1145/3132847.3133081
M3 - Conference contribution
T3 - International Conference on Information and Knowledge Management, Proceedings
SP - 2403
EP - 2406
BT - CIKM 2017 - Proceedings of the 2017 ACM Conference on Information and Knowledge Management
PB - Association for Computing Machinery (ACM)
T2 - 26th ACM International Conference on Information and Knowledge Management, CIKM 2017
Y2 - 6 November 2017 through 10 November 2017
ER -