Sign Language Animation Splicing Model Based on LpTransformer Network
Abstract  Sign language animation splicing is an active research topic. With the continuous development of machine learning, and especially the growing maturity of deep learning techniques, the speed and quality of sign language animation splicing keep improving. When sign language words are spliced into sentences, the corresponding animations must be spliced as well. Traditional algorithms search for the best splicing position with a distance loss and generate transition frames by linear or spherical interpolation; this approach not only has obvious shortcomings in efficiency and flexibility but also produces unnatural transition frames. To solve these problems, the LpTransformer model is proposed to predict the splicing position and generate the transition frames. Experiments show that LpTransformer achieves 99% transition-frame prediction accuracy, outperforming ConvS2S, LSTM and Transformer, and splices five times faster than Transformer, so the proposed model can achieve real-time splicing.
Authors  HUANG Hanqiang, XING Yunbing, SHEN Jianfei, FAN Feiyi (Henan Institute of Advanced Technology, Zhengzhou University, Zhengzhou 450000, China; Institute of Computing Technology, Chinese Academy of Sciences, Beijing 100000, China; Shandong Industrial Technology Research Institute Intelligent Computing Research Institute, Jinan 250000, China)
Source  Computer Science (计算机科学), CSCD, Peking University Core Journal, 2023, No. 9, pp. 184-191 (8 pages)
Funding  National Key Research and Development Program of China (2018YFC2002603).
Keywords  Sign language animation splicing; Deep learning; LpTransformer; Splicing position; Transition frames
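
The traditional pipeline described in the abstract, which picks a splice point with a distance loss and fills the gap with spherical interpolation, can be sketched as below. This is a minimal illustration only: it assumes each frame is stored as per-joint unit quaternions, and the function names, search window, and number of transition frames are assumptions for the example, not details taken from the paper.

import numpy as np

def find_splice_point(anim_a, anim_b, window=10):
    """Search the tail of animation A and the head of animation B for the
    frame pair with the smallest pose distance (the 'distance loss').
    Frames are assumed to be arrays of per-joint unit quaternions."""
    best_i, best_j, best_d = len(anim_a) - 1, 0, np.inf
    for i in range(max(0, len(anim_a) - window), len(anim_a)):
        for j in range(min(window, len(anim_b))):
            d = np.linalg.norm(anim_a[i] - anim_b[j])   # per-frame pose distance
            if d < best_d:
                best_i, best_j, best_d = i, j, d
    return best_i, best_j

def slerp(q0, q1, t):
    """Spherical linear interpolation between two unit quaternions."""
    q0, q1 = q0 / np.linalg.norm(q0), q1 / np.linalg.norm(q1)
    dot = float(np.clip(np.dot(q0, q1), -1.0, 1.0))
    if dot < 0.0:                      # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:                   # nearly parallel: linear fallback
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(dot)
    return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def make_transition(pose_a, pose_b, n_frames=8):
    """Generate transition frames by slerping every joint rotation between
    the splice pose of word A and the splice pose of word B."""
    ts = np.linspace(0.0, 1.0, n_frames + 2)[1:-1]     # drop the two endpoints
    return [np.stack([slerp(qa, qb, t) for qa, qb in zip(pose_a, pose_b)])
            for t in ts]

The exhaustive pairwise search and the fixed interpolation scheme are precisely the efficiency and naturalness bottlenecks the abstract points to, which the proposed LpTransformer model replaces by predicting the splicing position and the transition frames directly.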