Relation classification is a crucial component in many Natural Language Processing (NLP) systems. In this paper, we propose a novel bidirectional recurrent neural network architecture, using Long Short-Term Memory (LSTM) cells, for relation classification, with an attention layer for organizing context information at the word level and a tensor layer for detecting complex connections between two entities. These two feature extraction operations are built on top of the LSTM networks and use their outputs. Our model allows end-to-end learning from the raw sentences in the dataset, without trimming or reconstructing them. Experiments on the SemEval-2010 Task 8 dataset show that our model outperforms most state-of-the-art methods.
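The abstract does not give the model's exact equations, but the two feature extractors it names have standard forms: word-level attention computes a softmax-weighted sum of the BiLSTM hidden states, and a tensor layer scores bilinear interactions between the two entity representations. The NumPy sketch below illustrates both under assumed shapes; the hidden states, attention vector, tensor, and entity positions are all hypothetical stand-ins, not the paper's learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Hypothetical sizes: T words, hidden dimension d, K tensor slices.
T, d, K = 6, 8, 4

# Stand-in for the bidirectional LSTM outputs (one vector per word).
H = rng.standard_normal((T, d))

# Word-level attention: score each hidden state against a learned
# attention vector, normalize with softmax, then take the weighted sum.
w = rng.standard_normal(d)           # learned attention vector (assumed)
alpha = softmax(H @ w)               # (T,) attention weights over words
context = alpha @ H                  # (d,) attended sentence representation

# Tensor layer: K bilinear interaction scores between the two entities.
e1, e2 = H[1], H[4]                  # hypothetical entity positions
W = rng.standard_normal((K, d, d))   # learned third-order tensor (assumed)
tensor_feats = np.array([e1 @ W[k] @ e2 for k in range(K)])  # (K,)

# Both feature sets would then feed the final relation classifier.
features = np.concatenate([context, tensor_feats])
print(features.shape)  # (12,) i.e. d + K
```

The attention weights sum to one by construction, so `context` is a convex combination of the per-word hidden states; the tensor slices each capture a different multiplicative interaction between the entity pair.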
Funding: supported by the National Natural Science Foundation of China (No. 61572505) and the ChanXueYan Prospective Project of Jiangsu Province (No. BY201502305).