Abstract
To address the problems that convolutional neural networks extract features insufficiently and have difficulty handling the structural information of long texts and capturing semantic relations between sentences, a parallel neural network model, TC-ABlstm, combining a CNN with a self-attention BiLSTM was proposed. The traditional convolutional neural network was improved to strengthen the extraction of local text features; a bidirectional long short-term memory network combined with an attention mechanism was designed to capture context-dependent global features of the text; and the complementary strengths of the two models in feature extraction were fused to improve classification accuracy. Experimental results on the Sogou corpus and the Fudan University Chinese corpus show that the proposed model effectively improves text classification accuracy.
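For illustration, the following is a minimal PyTorch sketch of the parallel CNN / attention-BiLSTM structure described in the abstract. It is not the authors' TC-ABlstm implementation: the embedding size, kernel widths, hidden size, additive-attention form, and fusion by concatenation are all assumptions made for the sketch.

# Minimal sketch of a parallel CNN / attention-BiLSTM text classifier.
# Illustrative reconstruction only; hyperparameters and attention form are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ParallelCnnAttBiLstm(nn.Module):
    def __init__(self, vocab_size, num_classes, embed_dim=300,
                 kernel_sizes=(2, 3, 4), num_filters=100, lstm_hidden=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # CNN branch: 1-D convolutions of several widths extract local n-gram features.
        self.convs = nn.ModuleList(
            [nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes])
        # BiLSTM branch: reads the sequence in both directions for global context.
        self.bilstm = nn.LSTM(embed_dim, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Simple additive attention over the BiLSTM hidden states (assumed form).
        self.att = nn.Linear(2 * lstm_hidden, 1)
        fused_dim = num_filters * len(kernel_sizes) + 2 * lstm_hidden
        self.classifier = nn.Linear(fused_dim, num_classes)

    def forward(self, token_ids):                       # (batch, seq_len)
        x = self.embed(token_ids)                       # (batch, seq_len, embed_dim)
        # CNN branch: convolution + global max pooling per kernel width.
        c = x.transpose(1, 2)                           # (batch, embed_dim, seq_len)
        cnn_feats = torch.cat(
            [F.relu(conv(c)).max(dim=2).values for conv in self.convs], dim=1)
        # BiLSTM + attention branch: attention-weighted sum of hidden states.
        h, _ = self.bilstm(x)                           # (batch, seq_len, 2*hidden)
        weights = torch.softmax(self.att(h), dim=1)     # softmax over time steps
        rnn_feats = (weights * h).sum(dim=1)            # (batch, 2*hidden)
        # Fuse the two parallel branches and classify.
        fused = torch.cat([cnn_feats, rnn_feats], dim=1)
        return self.classifier(fused)

# Hypothetical usage with an assumed vocabulary size and class count:
# model = ParallelCnnAttBiLstm(vocab_size=50000, num_classes=9)
# logits = model(torch.randint(1, 50000, (8, 120)))   # batch of 8 padded texts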
Authors
梁顺攀
豆明明
于洪涛
郑智中
LIANG Shun-pan; DOU Ming-ming; YU Hong-tao; ZHENG Zhi-zhong (School of Information Science and Engineering, Yanshan University, Qinhuangdao 066004, China)
Source
《计算机工程与设计》
Peking University Core Journal (北大核心)
2022, Issue 2, pp. 573-579 (7 pages)
Computer Engineering and Design
Funding
National Natural Science Foundation of China (61772451)
Natural Science Foundation of Hebei Province, General Program (G2021203010).