
A Binary Quantization Method Incorporating W-regularization and Variational Cosine Momentum
Abstract: Current binary quantization extracts only the sign of each parameter and completely ignores its position information, which makes it difficult to improve the accuracy of binary neural networks. To address this problem, a new training module based on W-regularization and variational cosine momentum is proposed. W-regularization adjusts the network weights according to the characteristics of binary quantization, so that parameters at different positions are optimized by different functions; this design makes full use of parameter position information and improves binary quantization accuracy. In addition, a continuously differentiable variational cosine momentum is introduced, which applies momentum of different magnitudes according to the position distribution of the weights, allowing parameters far from the ±1 interval to approach the zero point more quickly. This momentum substantially accelerates convergence and further increases quantization accuracy without affecting inference speed. Experimental results on the CIFAR-10, CIFAR-100, SVHN, and Tiny ImageNet datasets show that the method achieves prediction accuracies of 84.74%, 56.58%, 96.33%, and 42.41%, respectively, outperforming existing state-of-the-art algorithms.
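The abstract does not give the exact functional forms of the W-regularization penalty or the variational cosine momentum. The sketch below is only a minimal illustration of the general idea in PyTorch, under two assumptions of mine: that the penalty pulls each latent weight toward the nearer of the two binary targets ±1 (which yields a W-shaped curve over the real line), and that the per-parameter momentum factor is a smooth cosine function of the distance to ±1. The names w_regularization and cosine_momentum_scale and all constants are hypothetical, not the paper's definitions.

```python
import torch

def w_regularization(weights: torch.Tensor, lam: float = 1e-4) -> torch.Tensor:
    # Distance of each latent weight to the nearer binary target (-1 or +1);
    # the squared distance gives a penalty surface shaped like a "W".
    # Illustrative assumption only; the paper's exact form is not given here.
    dist = torch.minimum((weights - 1.0).abs(), (weights + 1.0).abs())
    return lam * (dist ** 2).sum()

def cosine_momentum_scale(weights: torch.Tensor,
                          base: float = 0.9, amp: float = 0.09) -> torch.Tensor:
    # Continuously differentiable, position-dependent momentum factor:
    # weights farther from the ±1 targets receive a larger factor,
    # so they are updated more aggressively. Hypothetical formula.
    dist = torch.minimum((weights - 1.0).abs(), (weights + 1.0).abs()).clamp(max=1.0)
    return base + amp * (1.0 - torch.cos(torch.pi * dist)) / 2.0

# Toy usage: add the penalty to a task loss and scale per-parameter velocities.
if __name__ == "__main__":
    w = torch.randn(8, requires_grad=True)
    loss = (w ** 2).mean() + w_regularization(w, lam=1e-3)
    loss.backward()
    velocity = torch.zeros_like(w)
    with torch.no_grad():
        velocity = cosine_momentum_scale(w) * velocity - 0.01 * w.grad
        w += velocity
```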
Authors: Liu Chang; Chen Ying (Key Laboratory of Advanced Process Control for Light Industry, Ministry of Education, Jiangnan University, Wuxi 214122, China)
Source: Acta Scientiarum Naturalium Universitatis Nankaiensis (《南开大学学报(自然科学版)》), indexed in CAS, CSCD, and the Peking University Core journal list, 2023, No. 2, pp. 22-30, 36 (10 pages in total)
Funding: National Natural Science Foundation of China (62173160).
Keywords: neural networks; model compression; binary quantization; W-regularization; variational cosine momentum