
A Relation Extraction Model Incorporating Entity Attention and Semantic Information
Abstract: A knowledge graph maps the real world onto the data world through a semantic network and supports many concrete applications across industries; entity relation extraction is a core step in knowledge graph construction. To address three problems in relation extraction tasks, namely low utilization of entity-related features, insufficient text feature extraction, and the weakness of some pretrained models at capturing sequential features, this paper proposes a new model built on the BERT pretrained model: downstream, it exploits the ability of a long short-term memory (LSTM) network to handle long-range dependencies, and combines this with an entity-position self-aware attention mechanism. The model is evaluated on two public datasets. Experimental results show that it achieves F1 scores of 67.1% on the TACRED dataset and 87.8% on the SemEval 2020 Task 8 dataset, outperforming several previous models.
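The abstract describes a pipeline of a BERT encoder, a downstream LSTM, and an entity-position self-aware attention mechanism, but gives no implementation details. The following is a minimal NumPy sketch of just the position-aware attention pooling step, in the spirit of position-aware attention for relation extraction: every token's attention score depends on its hidden state and on its distance to the subject and object entities. All names here (`position_aware_pool`, the weight matrices `Wh`, `Wq`, `Wp`, the vector `v`) and the exact parameterization are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def relative_positions(n, span):
    """Signed distance of each of n tokens to an entity span (0 inside the span)."""
    s, e = span
    idx = np.arange(n)
    return np.where(idx < s, idx - s, np.where(idx > e, idx - e, 0))

def position_aware_pool(H, q, subj_span, obj_span, P, Wh, Wq, Wp, v, max_dist=50):
    """Pool token states H (n, d) into one sentence vector, attending more to
    tokens near the subject/object entities.
      q: summary/query vector (d,), e.g. the final LSTM state
      P: position-embedding table, (2*max_dist + 1, dp)
    Returns the pooled vector (d,) and the attention weights (n,)."""
    n, d = H.shape
    # Clip distances and shift them into valid row indices of P.
    ps = np.clip(relative_positions(n, subj_span), -max_dist, max_dist) + max_dist
    po = np.clip(relative_positions(n, obj_span), -max_dist, max_dist) + max_dist
    pos = np.concatenate([P[ps], P[po]], axis=1)        # (n, 2*dp)
    scores = np.tanh(H @ Wh + q @ Wq + pos @ Wp) @ v    # (n,)
    a = softmax(scores)
    return a @ H, a

# Usage with random toy weights (stand-ins for learned parameters):
rng = np.random.default_rng(0)
n, d, dp, da, md = 7, 8, 4, 6, 5
H = rng.normal(size=(n, d))                 # stand-in for BERT+LSTM token states
q = rng.normal(size=d)                      # stand-in for a sentence summary vector
P = rng.normal(size=(2 * md + 1, dp))
Wh, Wq = rng.normal(size=(d, da)), rng.normal(size=(d, da))
Wp, v = rng.normal(size=(2 * dp, da)), rng.normal(size=da)
pooled, weights = position_aware_pool(H, q, (1, 2), (4, 5), P, Wh, Wq, Wp, v, max_dist=md)
```

The pooled vector would then feed a softmax classifier over relation labels; the point of the position features is that the same token embedding attends differently depending on where the entity pair sits.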
Author: LIU Yunteng (School of Internet of Things Engineering, Jiangnan University, Wuxi 214000)
Source: Computer & Digital Engineering, 2024, Issue 2, pp. 487-491, 520 (6 pages)
Keywords: pretrained model; semantic relation extraction; attention mechanism; long short-term memory network; natural language processing