Dual-Channel Sentiment Analysis and Application Based on Gated Attention
Abstract: Traditional deep learning-based text sentiment classification models often fail to extract features comprehensively and cannot distinguish between the multiple senses of a polysemous word. To resolve these problems, a dual-channel sentiment classification model based on gated attention, named BGA-DNet, is proposed. The model uses the pre-trained BERT model to encode the text and then extracts features through a dual-channel network: channel one uses TextCNN to extract local features, while channel two uses BiLSTM-Attention to extract global features. A gated attention unit is introduced to filter out useless attention information, and, following the idea of residual networks, the dual-channel output retains the original encoding information even after network learning reaches saturation. BGA-DNet is evaluated on two public datasets of hotel reviews and restaurant reviews and compared with recent sentiment classification methods, achieving the best results with accuracies of 94.09% and 91.82%, respectively. Finally, BGA-DNet is applied to a real task of evaluating students' experiment reports, where its accuracy and F1 score are also the highest among the compared methods.
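The abstract fully specifies the data flow of BGA-DNet: BERT encoding, a TextCNN channel for local features, a BiLSTM-Attention channel for global features, a gating step that filters the fused attention information, and a residual connection back to the original encoding. Below is a minimal PyTorch sketch of that pipeline. All layer sizes, the sigmoid-gate formulation, the residual projection, and the class name BGADNet are assumptions for illustration; the paper's exact implementation may differ.

```python
# Minimal sketch of the BGA-DNet architecture described in the abstract.
# Hyperparameters and the gating/residual details are assumptions.
import torch
import torch.nn as nn
from transformers import BertModel

class BGADNet(nn.Module):
    def __init__(self, bert_name="bert-base-chinese", hidden=768,
                 num_filters=128, kernel_sizes=(3, 4, 5), lstm_hidden=384,
                 num_classes=2):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        # Channel 1: TextCNN over BERT token embeddings (local features).
        self.convs = nn.ModuleList(
            nn.Conv1d(hidden, num_filters, k) for k in kernel_sizes)
        # Channel 2: BiLSTM with additive attention (global features).
        self.bilstm = nn.LSTM(hidden, lstm_hidden, batch_first=True,
                              bidirectional=True)
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        feat_dim = num_filters * len(kernel_sizes) + 2 * lstm_hidden
        # Gated attention unit: a learned sigmoid gate that suppresses
        # less useful attention information (assumed formulation).
        self.gate = nn.Sequential(nn.Linear(feat_dim, feat_dim), nn.Sigmoid())
        # Residual projection so the original BERT encoding is preserved.
        self.res_proj = nn.Linear(hidden, feat_dim)
        self.classifier = nn.Linear(feat_dim, num_classes)

    def forward(self, input_ids, attention_mask):
        out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
        tokens = out.last_hidden_state                 # (B, T, H)
        cls = tokens[:, 0]                             # (B, H)
        # Channel 1: convolutions + max-pooling over time.
        x = tokens.transpose(1, 2)                     # (B, H, T)
        local = torch.cat([conv(x).relu().max(dim=2).values
                           for conv in self.convs], dim=1)
        # Channel 2: BiLSTM followed by attention pooling.
        h, _ = self.bilstm(tokens)                     # (B, T, 2*lstm_hidden)
        scores = self.attn(h).squeeze(-1)              # (B, T)
        scores = scores.masked_fill(attention_mask == 0, -1e9)
        weights = scores.softmax(dim=1).unsqueeze(-1)  # (B, T, 1)
        global_feat = (weights * h).sum(dim=1)         # (B, 2*lstm_hidden)
        # Fuse the two channels, gate, and add the residual of the
        # original [CLS] encoding before classification.
        fused = torch.cat([local, global_feat], dim=1)
        gated = self.gate(fused) * fused
        feat = gated + self.res_proj(cls)
        return self.classifier(feat)

# Usage (shapes only): logits = BGADNet()(input_ids, attention_mask)
```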
Authors: WEI Long, HU Jianpeng, ZHANG Geng (School of Electronic and Electrical Engineering, Shanghai University of Engineering Science, Shanghai 201620, China)
Source: Computer Engineering and Applications (《计算机工程与应用》, CSCD, Peking University Core Journal), 2023, No. 10, pp. 134-141 (8 pages)
Funding: 2021 Project of the Shanghai Collaborative Innovation Center of Data Intelligence Technology and Its Applications (E1-8938-21-0101); Science and Technology Innovation 2030 "New Generation Artificial Intelligence" Major Project (2020AAA0109302)
Keywords: gated attention; dual-channel; sentiment classification; BERT; BiLSTM-Attention