TY - GEN
T1 - Ensemble application of convolutional and recurrent neural networks for multi-label text categorization
AU - Chen, Guibin
AU - Ye, Deheng
AU - Xing, Zhenchang
AU - Chen, Jieshan
AU - Cambria, Erik
N1 - Publisher Copyright:
© 2017 IEEE.
PY - 2017/6/30
Y1 - 2017/6/30
N2 - Text categorization, or text classification, is one of the key tasks for representing the semantic information of documents. Multi-label text categorization is a finer-grained approach to text categorization in which multiple target labels are assigned to each document. It is more challenging than multi-class text categorization due to the exponential growth of label combinations. Existing approaches to multi-label text categorization fall short in extracting local semantic information and in modeling label correlations. In this paper, we propose an ensemble application of convolutional and recurrent neural networks to capture both the global and the local textual semantics and to model high-order label correlations while maintaining tractable computational complexity. Extensive experiments show that our approach achieves state-of-the-art performance when the CNN-RNN model is trained on a large-sized dataset.
AB - Text categorization, or text classification, is one of the key tasks for representing the semantic information of documents. Multi-label text categorization is a finer-grained approach to text categorization in which multiple target labels are assigned to each document. It is more challenging than multi-class text categorization due to the exponential growth of label combinations. Existing approaches to multi-label text categorization fall short in extracting local semantic information and in modeling label correlations. In this paper, we propose an ensemble application of convolutional and recurrent neural networks to capture both the global and the local textual semantics and to model high-order label correlations while maintaining tractable computational complexity. Extensive experiments show that our approach achieves state-of-the-art performance when the CNN-RNN model is trained on a large-sized dataset.
UR - http://www.scopus.com/inward/record.url?scp=85029529704&partnerID=8YFLogxK
U2 - 10.1109/IJCNN.2017.7966144
DO - 10.1109/IJCNN.2017.7966144
M3 - Conference contribution
T3 - Proceedings of the International Joint Conference on Neural Networks
SP - 2377
EP - 2383
BT - 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2017 International Joint Conference on Neural Networks, IJCNN 2017
Y2 - 14 May 2017 through 19 May 2017
ER -