Abstract
Existing convolutional neural networks keep growing in scale, which leads to excessive parameter counts and structures that are not lightweight. Moreover, current networks struggle to recognize subtle changes in facial expressions and cannot extract facial expression features precisely, so recognition performance still needs improvement. To address these problems, a convolutional neural network expression recognition method based on an attention mechanism is proposed. The method designs a new network structure that adds residual identity blocks on top of the convolutional layers and introduces a Spatial Group-wise Enhance (SGE) attention module. This effectively alleviates overfitting, enriches facial expression feature learning, and uses the similarity between global and local features to guide the spatial distribution of semantic features, so that each feature group autonomously enhances the learning of facial expression features. The resulting network is relatively lightweight with a small number of parameters. Experimental results on the RAF-DB and CK+ datasets show that the proposed method effectively improves facial expression recognition performance.
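To make the group-wise attention described above concrete, the following is a minimal PyTorch-style sketch of a Spatial Group-wise Enhance block: channels are split into groups, each group's global (average-pooled) descriptor is compared with every spatial position, and the normalized similarity map gates the group's features. This is an illustrative reconstruction, not the authors' released code; the class name, group count, and epsilon value are assumptions.

```python
import torch
import torch.nn as nn


class SpatialGroupEnhance(nn.Module):
    """Sketch of an SGE attention block: per-group spatial gating driven by
    the similarity between the group's global descriptor and local features."""

    def __init__(self, groups: int = 8):
        super().__init__()
        self.groups = groups
        # Per-group learnable scale and shift for the normalized similarity map
        # (initialization is a design choice in this sketch).
        self.weight = nn.Parameter(torch.zeros(1, groups, 1, 1))
        self.bias = nn.Parameter(torch.ones(1, groups, 1, 1))
        self.sig = nn.Sigmoid()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.size()
        # Split channels into groups: (b * groups, c // groups, h, w).
        x = x.view(b * self.groups, -1, h, w)
        # Similarity between the group's global descriptor and each position.
        xn = (x * self.avg_pool(x)).sum(dim=1, keepdim=True)  # (b*g, 1, h, w)
        # Normalize the similarity map over spatial positions.
        t = xn.view(b * self.groups, -1)
        t = t - t.mean(dim=1, keepdim=True)
        t = t / (t.std(dim=1, keepdim=True) + 1e-5)
        t = t.view(b, self.groups, h, w)
        # Learnable scale/shift per group, then sigmoid gating of the features.
        t = (t * self.weight + self.bias).view(b * self.groups, 1, h, w)
        x = x * self.sig(t)
        return x.view(b, c, h, w)


if __name__ == "__main__":
    # Toy check: 64 channels split into 8 groups, spatial size preserved.
    feat = torch.randn(2, 64, 28, 28)
    print(SpatialGroupEnhance(groups=8)(feat).shape)  # torch.Size([2, 64, 28, 28])
```

Because the gating is computed independently for each channel group, the module adds only 2 x groups learnable parameters per insertion point, which is consistent with the lightweight design emphasized in the abstract.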
Authors
KANG Jie; LI Si-yu (School of Electrical and Control Engineering, Shaanxi University of Science & Technology, Xi'an 710021, China)
Source
Journal of Shaanxi University of Science & Technology (indexed in CAS), 2020, No. 4, pp. 159-165, 171 (8 pages).
Funding
Social Development Science and Technology Research Project of Shaanxi Provincial Department of Science and Technology (2016SF-410)
Xi'an Science and Technology Plan Project (2019216514GXRC001CG002-GXYD1.7)
China Scholarship Council Fund (201708615011).
Keywords
convolutional neural network
facial expression recognition
attention mechanism
residual identity block