Abstract
Recurrent neural networks (RNNs) applied to intent detection and slot filling have achieved good recognition results. The traditional Slot-Gated model aims to fuse intent features into slot recognition, but it does not feed the text's label information into the model as prior knowledge during training. Building on the Slot-Gated model, this work constructs an attention-based intent list query module from the intent label information and applies global optimization to improve the model's intent recognition accuracy and the joint accuracy of intent detection and slot filling. In comparative experiments against the Slot-Gated model, the method improves intent and joint accuracy by 1.1% and 1.5% respectively on the ATIS dataset, and by 0.3% and 0.4% respectively on the Snips dataset. The experimental results show that adding intent class label information to training as prior knowledge improves model performance.
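The intent list query module described above attends over the intent label set with the utterance encoding as the query. The paper does not give implementation details, so the following is only a minimal sketch of that kind of attention-based label lookup; the function name `intent_label_attention`, the dot-product scoring, and all dimensions are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def intent_label_attention(hidden, label_embs):
    """Attend over intent-label embeddings using the utterance encoding as query.

    hidden:     (d,)              final encoder state for the utterance
    label_embs: (num_intents, d)  one learned embedding per intent label
    Returns the attention distribution over labels and the label context vector.
    """
    scores = label_embs @ hidden       # dot-product score per intent label
    weights = softmax(scores)          # distribution over the intent list
    context = weights @ label_embs     # (d,) weighted summary of label knowledge
    return weights, context

# Toy example: 3 intent labels, 4-dimensional encoder state.
rng = np.random.default_rng(0)
h = rng.normal(size=4)
labels = rng.normal(size=(3, 4))
w, c = intent_label_attention(h, labels)
```

In a full model, the context vector `c` would be concatenated with the encoder state before the intent classifier, so the label embeddings act as trainable prior knowledge rather than only as output targets.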
Authors
HU Guang-min; JIANG Li (School of Physics and Optoelectronic Engineering, Xiangtan University, Xiangtan 411100, China)
Source
Software Guide, 2021, No. 9, pp. 51-55 (5 pages)
Keywords
natural language understanding
neural network
pre-training model
slot filling
attention mechanism