
Design of a Translation Robot Error Correction System Based on Improved Seq2Seq
Abstract: Traditional grammatical error correction systems for English translation suffer from low error detection accuracy and poor correction performance. To address this, a grammatical error correction model based on a Seq2Seq neural network is proposed. First, the Encoder of the Seq2Seq network models the input sequence and outputs its semantic vector; then an Attention mechanism is introduced into the Decoder to realize a direct mapping from the source sequence to the target sequence, thereby completing English grammatical error correction. Experimental results on the CoNLL2018 dataset show that the model achieves a precision of 35.44%, a recall of 40.68%, and an F0.5 of 32.56%, all higher than the traditional CAMB error correction model. For English article errors, the method's F0.5 is 32.36%, which is 7.02% and 2.76% higher than the traditional UIUC method and the Corpus GEC method, respectively; for preposition errors, it is 5.91% and 13.15% higher than the same two methods. Overall, the model corrects English translation grammar with higher accuracy and precision than traditional error correction models, and it can be widely applied and promoted in grammar error correction systems for English translation robots.
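The abstract describes the architecture only at a high level and the paper's implementation is not reproduced here. The sketch below is an illustrative, minimal PyTorch reconstruction of the described design: an encoder that compresses the (possibly erroneous) source sentence into semantic vectors, and a decoder with an attention mechanism that maps the source sequence directly to the corrected target sequence. The GRU cells, dot-product attention, and all names and dimensions are assumptions for illustration, not the authors' code.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Encodes the (possibly erroneous) source sentence into per-token
    hidden states and a final summary state -- the 'semantic vector'."""
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                       # src: (batch, src_len)
        outputs, hidden = self.rnn(self.embed(src))
        return outputs, hidden                    # outputs: (batch, src_len, hid_dim)

class AttnDecoder(nn.Module):
    """Decoder with dot-product attention over the encoder outputs,
    producing a distribution over corrected target tokens at each step."""
    def __init__(self, vocab_size, emb_dim=256, hid_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim * 2, vocab_size)

    def forward(self, tgt, hidden, enc_outputs):  # tgt: (batch, tgt_len)
        dec_states, hidden = self.rnn(self.embed(tgt), hidden)
        # Attention: score each decoder state against every encoder state.
        scores = torch.bmm(dec_states, enc_outputs.transpose(1, 2))
        weights = torch.softmax(scores, dim=-1)   # (batch, tgt_len, src_len)
        context = torch.bmm(weights, enc_outputs) # attention-weighted source info
        logits = self.out(torch.cat([dec_states, context], dim=-1))
        return logits, hidden

# Toy run: one teacher-forced training step on random token ids.
VOCAB = 1000
enc, dec = Encoder(VOCAB), AttnDecoder(VOCAB)
src = torch.randint(0, VOCAB, (2, 12))   # two "erroneous" sentences
tgt = torch.randint(0, VOCAB, (2, 10))   # their "corrected" counterparts
enc_out, h = enc(src)
logits, _ = dec(tgt, h, enc_out)         # (2, 10, VOCAB)
loss = nn.CrossEntropyLoss()(logits.reshape(-1, VOCAB), tgt.reshape(-1))
loss.backward()
```

In a real grammatical error correction setup the decoder would be run autoregressively at inference time, feeding back its own predictions (typically with beam search) rather than teacher-forced as in this toy step.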
Author: LIU Xiaojuan (Xi'an Siyuan University, Xi'an 710038, China)
Institution: Xi'an Siyuan University
Source: Automation & Instrumentation (《自动化与仪器仪表》), 2023, No. 4, pp. 201-205
Keywords: deep learning; Seq2Seq; English translation robot; grammatical error correction; Attention mechanism