
Ensemble Training Model Integrating Knowledge (Cited by: 1)
Abstract: In the field of Internet-based healthcare, AI-assisted triage is a key step: it assigns each patient to the appropriate department according to the patient's condition description, disease attributes, medications, and other information. The BERT pre-trained language model, built on a deep bidirectional Transformer, can be used to enrich character-level semantics; however, patients' textual condition descriptions are information-sparse, which prevents BERT from fully learning their features. This paper proposes DNNBERT, a joint training model that integrates knowledge. By combining the strengths of a Deep Neural Network (DNN) and the Transformer model, DNNBERT learns richer semantics from text. Experiments show that DNNBERT is 1.7 times faster than BERT-large in computation time, and its F1 score is 0.12 higher than ALBERT's and 0.17 higher than TextCNN's. This work offers new ideas for learning from sparse features and for deploying deep Transformer-based models in production.
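The abstract describes fusing a Transformer text encoder with a DNN branch over auxiliary knowledge (disease attributes, medications) for department classification. The following is a minimal PyTorch sketch of that general fusion pattern; all class names, dimensions, and the mean-pooling/concatenation choices are illustrative assumptions, not the authors' actual DNNBERT architecture.

```python
# Hypothetical sketch: fuse a small Transformer encoder over the patient's
# text with a DNN over structured knowledge features, then classify into
# departments. Dimensions and pooling are assumptions for illustration.
import torch
import torch.nn as nn

class JointTextKnowledgeClassifier(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64, n_heads=4,
                 n_layers=2, knowledge_dim=16, n_departments=10):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        # DNN branch over knowledge features (e.g. encoded disease/drug info)
        self.dnn = nn.Sequential(
            nn.Linear(knowledge_dim, 64), nn.ReLU(),
            nn.Linear(64, d_model), nn.ReLU())
        # Concatenate both representations, then map to department logits
        self.classifier = nn.Linear(2 * d_model, n_departments)

    def forward(self, token_ids, knowledge_feats):
        h_text = self.encoder(self.embed(token_ids)).mean(dim=1)  # pooled text
        h_know = self.dnn(knowledge_feats)
        return self.classifier(torch.cat([h_text, h_know], dim=-1))

model = JointTextKnowledgeClassifier()
tokens = torch.randint(0, 1000, (2, 20))  # batch of 2 tokenized descriptions
feats = torch.randn(2, 16)                # auxiliary knowledge vectors
logits = model(tokens, feats)             # shape: (2, 10) department logits
```

In the paper's setting the text branch would be a pre-trained BERT rather than a randomly initialized encoder; the sketch only shows how the two branches could be joined for joint training.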
Authors: WANG Yong-Peng, ZHOU Xiao-Lei, MA Hui-Min, CAO Ji-Long (Shenyang Institute of Computing Technology, Chinese Academy of Sciences, Shenyang 110168, China; University of Chinese Academy of Sciences, Beijing 100049, China; Neusoft Group Co. Ltd., Shenyang 110179, China; The Fourth Affiliated Hospital of China Medical University, Shenyang 110032, China)
Source: Computer Systems & Applications (《计算机系统应用》), 2021, No. 7, pp. 50–56.
Funding: National Major Science and Technology Program for Water Pollution Control and Treatment (2012ZX07505).
Keywords: knowledge fusion; medical short text; BERT model; joint training; text classification