
Accelerated compression method for convolutional neural network combining pruning and stream merging

Cited by: 6
Abstract: Deep convolutional neural networks are generally large in scale and computationally complex, which limits their application in environments with strict real-time requirements or constrained resources. It is therefore necessary to compress and accelerate existing convolutional neural network structures. To address this problem, a hybrid compression method combining pruning and stream merging was proposed. The method compresses the model from different angles, further reducing the memory consumption and time consumption caused by parameter redundancy and structural redundancy. First, the redundant parameters within each layer are pruned away; then, at the structural level, non-essential layers are merged with important layers via stream merging; finally, the accuracy of the model is restored by retraining. Experimental results on the MNIST dataset show that the proposed hybrid compression method compresses LeNet-5 to 1/20 of its original size and increases its running speed by a factor of 8 without reducing model accuracy.
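The paper itself ships no code, but the prune-then-retrain loop the abstract describes can be illustrated with a short PyTorch sketch. Everything below is an assumption for illustration: the LeNet-5 definition, the magnitude-based threshold rule, and the helper names (magnitude_prune, retrain_step) are hypothetical, not the authors' implementation; the stream-merging step is omitted because the abstract gives no details of how layers are merged.

# Minimal, hypothetical sketch of per-layer magnitude pruning followed by
# mask-preserving retraining, in the spirit of the method described in the
# abstract. Not the authors' code; threshold rule and sparsity are assumptions.
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    """Illustrative LeNet-5 variant for 28x28 MNIST inputs."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, 5), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(6, 16, 5), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(16 * 4 * 4, 120), nn.ReLU(),
            nn.Linear(120, 84), nn.ReLU(),
            nn.Linear(84, 10),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

def magnitude_prune(model: nn.Module, sparsity: float = 0.8) -> dict:
    """Zero out the smallest-magnitude weights in each conv/linear layer and
    return boolean masks so retraining can keep pruned weights at zero."""
    masks = {}
    for name, module in model.named_modules():
        if isinstance(module, (nn.Conv2d, nn.Linear)):
            w = module.weight.data
            k = int(w.numel() * sparsity)
            if k == 0:
                continue
            threshold = w.abs().flatten().kthvalue(k).values
            mask = w.abs() > threshold
            w.mul_(mask)          # prune in place
            masks[name] = mask
    return masks

def retrain_step(model, masks, batch, optimizer,
                 criterion=nn.CrossEntropyLoss()):
    """One retraining step that re-applies the masks after the update, so
    pruned connections stay removed while accuracy recovers."""
    x, y = batch
    optimizer.zero_grad()
    loss = criterion(model(x), y)
    loss.backward()
    optimizer.step()
    for name, module in model.named_modules():
        if name in masks:
            module.weight.data.mul_(masks[name])
    return loss.item()

# Usage (illustrative):
#   model = LeNet5()
#   masks = magnitude_prune(model, sparsity=0.8)
#   optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
#   loss = retrain_step(model, masks, (images, labels), optimizer)

Re-applying the masks after each optimizer step is one common way to keep pruned weights at zero during the accuracy-recovery phase; whether the paper uses this exact scheme, or a different sparsity per layer, is not stated in the abstract.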
Authors: XIE Binhong, ZHONG Rixin, PAN Lihu, ZHANG Yingjun (Department of Computer Science and Technology, Taiyuan University of Science and Technology, Taiyuan, Shanxi 030024, China; Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences, Beijing 100101, China)
Source: Journal of Computer Applications (《计算机应用》, CSCD, Peking University Core Journal), 2020, No. 3, pp. 621-625 (5 pages)
Funding: Science and Technology Major Project of Shanxi Province (20141101001); Key Research and Development Program of Shanxi Province (201803D121048)
Keywords: Convolutional Neural Network (CNN); model compression; network pruning; stream merging; redundancy
