
ENT-BERT: Entity Relation Classification Model Combining BERT and Entity Information (cited by 12)
Abstract  Deep learning techniques that combine convolutional neural networks and recurrent neural networks with attention mechanisms have long been the mainstream, state-of-the-art approach to relation extraction and classification. The recently proposed BERT (Bidirectional Encoder Representations from Transformers) model achieves the best results on many natural language processing tasks; for text classification in particular, simply feeding the sentence vector obtained from the model into a fully connected layer yields a large improvement. Entity relation classification differs from text classification chiefly in that it must emphasize entity information. This paper therefore proposes ENT-BERT, an entity relation classification model that first obtains a sentence vector and character vectors from BERT, averages the character vectors within each entity to obtain entity vectors, combines the sentence vector and entity vectors through an attention mechanism, and finally applies a fully connected layer and a softmax function to determine the relation between the entity pair. Experimental results show that the model effectively highlights entity information and achieves strong results on both Chinese and English datasets.
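The abstract's pipeline (sentence vector + averaged entity vectors, attention fusion, then a fully connected layer with softmax) can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the bilinear attention form, all parameter names, and the tensor shapes are assumptions, since the exact fusion details are not given here.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def classify_relation(token_vecs, sent_vec, e1_span, e2_span, W_att, W_out, b_out):
    """Hypothetical ENT-BERT-style classification head.

    token_vecs: (seq_len, h) BERT character/token vectors
    sent_vec:   (h,) BERT sentence vector (e.g. the [CLS] output)
    e1_span, e2_span: (start, end) index ranges of the two entities
    W_att: (h, h) attention parameters; W_out: (h, n_classes); b_out: (n_classes,)
    """
    # Entity vectors: average of the character vectors inside each entity span
    e1 = token_vecs[e1_span[0]:e1_span[1]].mean(axis=0)
    e2 = token_vecs[e2_span[0]:e2_span[1]].mean(axis=0)

    # Attention: score the sentence and entity vectors, then fuse them
    feats = np.stack([sent_vec, e1, e2])      # (3, h)
    scores = feats @ W_att @ sent_vec         # (3,) bilinear attention scores
    alpha = softmax(scores)                   # attention weights over the three vectors
    fused = alpha @ feats                     # (h,) fused representation

    # Fully connected layer + softmax over relation classes
    logits = fused @ W_out + b_out
    return softmax(logits)
```

In practice `token_vecs` and `sent_vec` would come from a pretrained BERT encoder, and `W_att`, `W_out`, `b_out` would be trained jointly with fine-tuning; random parameters here only demonstrate the data flow.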
Authors  ZHANG Dong-dong; PENG Dun-lu (School of Optical-Electrical and Computer Engineering, University of Shanghai for Science and Technology, Shanghai 200093, China)
Source  Journal of Chinese Computer Systems (《小型微型计算机系统》), CSCD, Peking University core journal, 2020, Issue 12, pp. 2557-2562 (6 pages)
Funding  Supported by the National Natural Science Foundation of China (61772342, 61703278).
Keywords  deep learning; entity relation classification; BERT; attention mechanism