Text sentiment analysis with a deep learning model incorporating a double recurrent mechanism (Cited by: 2)
Abstract: Deep neural network models usually rely on an attention mechanism, or on a fused convolutional neural network, to extract features; however, because the features captured by the attention mechanism alone are too narrow, the extracted features are incomplete. To improve feature extraction, a recurrent mechanism was introduced into the convolutional neural network, yielding a network with a double recurrent structure (DRCNN). Combining DRCNN with a bidirectional long short-term memory network produces a hybrid model with an attention mechanism and stronger feature-extraction capability (BiLSTM-DRCNN), which was applied to sentiment classification tasks. Experiments on sentiment classification show that the BiLSTM-DRCNN model performs well: compared with the convolutional neural network (CNN) and bidirectional long short-term memory (BiLSTM) models, the comprehensive evaluation index improves by more than 2%; compared with the BiLSTM-CNN and Fusion Model baselines, it improves by nearly 1%, and the model converges faster.
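The paper itself is not reproduced on this page, but the core idea the abstract describes — feeding a convolutional layer's output back into itself for a few unfolding steps, then stacking two such recurrent blocks — can be illustrated with a minimal NumPy sketch. All layer sizes, kernel widths, and the number of unfolding steps below are illustrative assumptions, not the authors' configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w, b):
    """Same-padded 1D convolution. x: (T, d_in), w: (k, d_in, d_out), b: (d_out,)."""
    k = w.shape[0]
    pad = k // 2
    xp = np.pad(x, ((pad, pad), (0, 0)))
    return np.stack([np.tensordot(xp[t:t + k], w, axes=([0, 1], [0, 1])) + b
                     for t in range(x.shape[0])])

def recurrent_conv_block(x, w_ff, w_rec, b, steps=3):
    """Recurrent convolution: the feed-forward response is repeatedly
    refined by feeding the block's own output back through a second kernel."""
    h = np.maximum(conv1d(x, w_ff, b), 0.0)            # initial feed-forward pass
    for _ in range(steps):                              # recurrent refinement
        h = np.maximum(conv1d(x, w_ff, b) + conv1d(h, w_rec, b), 0.0)
    return h

# Toy setup: 10 tokens with 8-dim embeddings; two stacked recurrent blocks
# (the "double recurrent" structure), kernel width 3, 16 feature maps each.
T, d_in, d_hid = 10, 8, 16
x = rng.standard_normal((T, d_in))

w1_ff = rng.standard_normal((3, d_in, d_hid)) * 0.1
w1_rec = rng.standard_normal((3, d_hid, d_hid)) * 0.1
w2_ff = rng.standard_normal((3, d_hid, d_hid)) * 0.1
w2_rec = rng.standard_normal((3, d_hid, d_hid)) * 0.1
b = np.zeros(d_hid)

h1 = recurrent_conv_block(x, w1_ff, w1_rec, b)
h2 = recurrent_conv_block(h1, w2_ff, w2_rec, b)
print(h2.shape)  # one 16-dim feature vector per token
```

In the full BiLSTM-DRCNN model, features of this kind would be combined with a BiLSTM's contextual representations under an attention mechanism before classification; the sketch covers only the double recurrent convolution component.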
Authors: HU Renyuan; LIU Jianhua; WANG Xuan; LUO Yixuan; LIN Honghui (School of Information Science and Engineering, Fujian University of Technology, Fuzhou 350118, China)
Source: Journal of Fujian University of Technology (福建工程学院学报), CAS, 2022, No. 4, pp. 383-390
Keywords: bidirectional long short-term memory network (BiLSTM); double recurrent convolutional neural network (DRCNN); attention mechanism; text sentiment analysis

References: 4 · Secondary references: 33


Co-cited references: 268 · Shared citing references: 16 · Citing articles: 2 · Secondary citing articles: 1
