
Research on the Construction of Chinese Forestry Knowledge Graph Based on BERT and Bi-directional RNN

Cited by: 9
Abstract: To integrate fragmented forestry text data and address the scattered, disordered, and weakly linked state of online forestry knowledge, deep learning is combined with knowledge graph (KG) techniques, and a model based on an improved BERT and bidirectional RNNs is proposed for forestry entity recognition and entity relation extraction. Through entity-mask BERT word-vector processing, word-level and semantic features relevant to the research domain are extracted from each sequence automatically, and the resulting word vectors are fed into the named entity recognition model and the entity relation extraction model for training. On a general-domain dataset, the BERT-BiGRU-Attention relation extraction model performs best, raising the F1-score by 1% with accuracy above 90%; every metric of the BERT-BiLSTM-CRF entity recognition model improves by 2% over traditional models. On the forestry dataset, model accuracy exceeds 80%. These results show that constructing a Chinese forestry KG based on BERT and bidirectional RNN models is feasible. An intelligent Chinese forestry KG system built on this model can improve forestry information management and promote forestry development.
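The paper itself publishes no code. As a rough, hypothetical illustration of the decoding step a CRF layer performs on top of the BiLSTM in a BERT-BiLSTM-CRF tagger, the following pure-Python sketch runs Viterbi decoding over made-up emission scores (stand-ins for the BiLSTM's per-token outputs) with an invented toy tag set; neither the tag names nor the scores come from the paper:

```python
# Hypothetical illustration only: a minimal Viterbi decoder, the inference
# step of a CRF layer over per-token emission scores.
TAGS = ["O", "B-SPECIES", "I-SPECIES"]  # toy tag set, not the paper's

def viterbi_decode(emissions, transitions):
    """emissions[t][j]: score of tag j at token t.
    transitions[i][j]: score of moving from tag i to tag j.
    Returns the highest-scoring tag sequence."""
    n_tags = len(emissions[0])
    score = list(emissions[0])  # best path score ending in each tag so far
    backptr = []                # back-pointers for path recovery
    for t in range(1, len(emissions)):
        new_score, ptr = [], []
        for j in range(n_tags):
            cand = [score[i] + transitions[i][j] for i in range(n_tags)]
            best_i = max(range(n_tags), key=lambda i: cand[i])
            ptr.append(best_i)
            new_score.append(cand[best_i] + emissions[t][j])
        backptr.append(ptr)
        score = new_score
    # Follow back-pointers from the best final tag.
    best = [max(range(n_tags), key=lambda j: score[j])]
    for ptr in reversed(backptr):
        best.append(ptr[best[-1]])
    return [TAGS[j] for j in reversed(best)]

# Made-up transition scores: I-SPECIES is heavily penalized unless it
# follows B-SPECIES or I-SPECIES, enforcing well-formed BIO spans.
trans = [[0.0,   0.0, -10.0],
         [0.0, -10.0,   1.0],
         [0.0, -10.0,   1.0]]
# Four tokens; the middle two form one species mention.
emis = [[2.0, 0.0, 0.0],
        [0.0, 2.0, 1.0],
        [0.0, 1.5, 1.4],
        [2.0, 0.0, 0.0]]
print(viterbi_decode(emis, trans))  # → ['O', 'B-SPECIES', 'I-SPECIES', 'O']
```

Note how the transition scores, not the emissions alone, force the third token to `I-SPECIES` even though its raw emission slightly favors `B-SPECIES`; this span-consistency constraint is the usual motivation for placing a CRF over BiLSTM outputs.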
Authors: YUE Qi; LI Xiang (College of Information and Computer Engineering, Northeast Forestry University, Harbin 150040, China)
Source: Journal of Inner Mongolia University (Natural Science Edition), 2021, No. 2, pp. 176-184 (9 pages). Indexed in: CAS; Peking University Core Journals.
Funding: Natural Science Foundation of Heilongjiang Province (LH2019G001).
Keywords: deep learning; BERT; named entity recognition; entity relation extraction
Related literature:
References: 7
Secondary references: 149
Co-cited articles: 352
Co-citing articles: 131
Citing articles: 9
Secondary citing articles: 40
