Abstract
To improve recommendation performance, a sequence recommendation algorithm based on a hybrid neural network is proposed. It addresses two problems: in sequence modeling, the recurrent neural network (RNN) cannot compute in parallel, which leads to low modeling speed and accuracy; and in preference prediction, user preferences from different stages are not dynamically fused. In the user interaction sequence modeling stage, because users' recent preferences change frequently and therefore place higher demands on time efficiency and recommendation accuracy, a temporal convolutional network (TCN) is introduced to model the recent interaction sequence, overcoming the RNN's low modeling speed and accuracy. In the preference prediction stage, building on both the user's recent and long-term preferences, an attention mechanism dynamically fuses the preferences of the two interaction stages, thereby improving recommendation performance. Experiments on the public datasets MovieLens10M and LastFM demonstrate the effectiveness of the model.
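The dynamic fusion step described above can be illustrated with a minimal sketch: two stage-level preference vectors (recent and long-term) are scored against a query vector via dot-product attention, and the softmax weights decide how much each stage contributes to the fused preference. This is an assumption-laden toy example, not the paper's actual model: the function and variable names (`fuse_preferences`, `short_pref`, `long_pref`, `query`) are hypothetical, and the paper's attention scoring may differ.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - np.max(x))
    return e / e.sum()

def fuse_preferences(short_pref, long_pref, query):
    # Stack the two stage preferences into a (2, d) matrix,
    # score each against the query, and take the weighted sum.
    prefs = np.stack([short_pref, long_pref])   # (2, d)
    scores = prefs @ query                      # dot-product attention scores, (2,)
    weights = softmax(scores)                   # attention weights over the two stages
    fused = weights @ prefs                     # fused preference vector, (d,)
    return fused, weights

# Toy 4-dimensional embeddings (hypothetical values for illustration only).
short_pref = np.array([1.0, 0.0, 0.0, 0.0])
long_pref  = np.array([0.0, 1.0, 0.0, 0.0])
query      = np.array([1.0, 0.0, 0.0, 0.0])

fused, w = fuse_preferences(short_pref, long_pref, query)
```

Because the query here aligns with the recent preference, the recent stage receives the larger attention weight; with a different query the balance shifts, which is what "dynamic" fusion means in contrast to a fixed weighted average.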
Authors
LIU Zongheng; WANG Haitao; JIANG Ying; CHEN Xing (Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, P.R. China)
Source
Journal of Chongqing University of Posts and Telecommunications (Natural Science Edition)
Indexed in CSCD and the Peking University Core Journals list
2021, No. 3, pp. 466-474 (9 pages)
Funding
National Natural Science Foundation of China (61462049).
Keywords
sequence recommendation
convolutional neural network
long short-term memory network
attention mechanism