
Semantic Fragment-based Structured Self-attention for Targeted Sentiment Analysis
(面向语义片段结构化自注意力的目标情感分析)
Abstract: In the targeted sentiment analysis task, most current methods model sentences with recurrent neural networks (RNNs) or attention mechanisms. However, RNNs are difficult to parallelize and cannot fully capture long-distance semantic information, while attention mechanisms focus on the correlation between words and ignore the importance of semantic fragments. To address these problems, this paper proposes a targeted sentiment analysis method with semantic fragment-based structured self-attention. First, embeddings of the target, the context, and the whole sentence are obtained with Bidirectional Encoder Representations from Transformers (BERT), and an attentional encoder network is used to model these three semantic representations. Second, fused semantic features of the target and the context are obtained through a multi-head attention mechanism, and fragment-level semantic features of the whole sentence are obtained through a structured self-attention mechanism. Finally, the sentiment polarity of the target is classified on the basis of the fused semantic features. Experiments on the SemEval 2014 Task 4 and SemEval 2015 Task 12 benchmark datasets show that the method improves over the baseline methods.
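The abstract describes the model as a pipeline: BERT embeddings for the target, the context, and the whole sentence; an attentional encoder stage; multi-head attention to fuse target and context; structured self-attention to extract fragment-level sentence features; and a polarity classifier over the fused features. The sketch below is a minimal PyTorch rendering of that pipeline, not the authors' implementation: the attentional encoder stage is collapsed into a single multi-head attention layer, and all names and hyperparameters (hidden_dim, attn_dim, hops, num_classes, mean pooling) are illustrative assumptions.

```python
# Minimal sketch of the pipeline described in the abstract (not the authors' released code).
import torch
import torch.nn as nn
import torch.nn.functional as F

class StructuredSelfAttention(nn.Module):
    """Structured self-attention (Lin et al., 2017): A = softmax(W2 * tanh(W1 * H^T))
    yields `hops` weighted views of the sentence, read here as semantic fragments."""
    def __init__(self, hidden_dim: int, attn_dim: int = 128, hops: int = 4):
        super().__init__()
        self.w1 = nn.Linear(hidden_dim, attn_dim, bias=False)
        self.w2 = nn.Linear(attn_dim, hops, bias=False)

    def forward(self, h):                        # h: (batch, seq_len, hidden_dim)
        a = self.w2(torch.tanh(self.w1(h)))      # (batch, seq_len, hops)
        a = F.softmax(a, dim=1).transpose(1, 2)  # (batch, hops, seq_len)
        return torch.bmm(a, h)                   # (batch, hops, hidden_dim)

class FragmentSentimentClassifier(nn.Module):
    def __init__(self, bert, hidden_dim=768, num_heads=8, hops=4, num_classes=3):
        super().__init__()
        self.bert = bert                         # e.g. a HuggingFace transformers BertModel
        self.target_ctx_attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.fragment_attn = StructuredSelfAttention(hidden_dim, hops=hops)
        self.classifier = nn.Linear(hidden_dim * 2, num_classes)

    def forward(self, ctx_ids, ctx_mask, tgt_ids, tgt_mask, sent_ids, sent_mask):
        ctx = self.bert(ctx_ids, attention_mask=ctx_mask).last_hidden_state
        tgt = self.bert(tgt_ids, attention_mask=tgt_mask).last_hidden_state
        sent = self.bert(sent_ids, attention_mask=sent_mask).last_hidden_state
        # Target-context fusion: target tokens attend over the context tokens.
        fused, _ = self.target_ctx_attn(tgt, ctx, ctx)   # (batch, tgt_len, hidden_dim)
        fused = fused.mean(dim=1)                        # pooled target-context feature
        # Fragment-level semantics of the whole sentence.
        frag = self.fragment_attn(sent).mean(dim=1)      # pooled over hops
        return self.classifier(torch.cat([fused, frag], dim=-1))
```

A training loop would tokenize (sentence, target) pairs from SemEval 2014 Task 4 or SemEval 2015 Task 12, pass them through this module, and minimize cross-entropy over the polarity classes; that loop is omitted here.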
Authors: DENG Hang (邓航), CHEN Yu (陈渝), ZHAO Rong-mei (赵容梅), JU Sheng-gen (琚生根) (College of Computer Science, Sichuan University, Chengdu 610000, China; College of Science & Technology, Sichuan Minzu College, Kangding 626000, China)
Source: Journal of Chinese Computer Systems (《小型微型计算机系统》), CSCD, Peking University Core Journal, 2022, No. 12, pp. 2499-2505 (7 pages)
Fund: Supported by the Key Program of the National Natural Science Foundation of China (62137001).
Keywords: targeted sentiment analysis; attentional encoder network; structured self-attention; BERT