Global Residual Recursive Network for Image Super-resolution
Abstract: Deep network models have achieved great success in image super-resolution, and it has been shown that they generally reconstruct low-resolution images into high-resolution images with better quality than traditional algorithms. To further improve reconstruction quality, this paper proposes a global residual recursive network. By optimizing the classical residual network, global residual-block feature fusion and local residual-block feature fusion are introduced, enabling the model to update its weights adaptively and improving information flow. Combined with the L1 cost function, the ADAM optimizer further improves training stability, and the model is trained on the DIV2K training set. Reconstruction quality is evaluated with the PSNR/SSIM metrics; on SSIM, the proposed model reaches up to 0.94, exceeding the 0.92 of the latest deep learning model (EDSR). The global residual recursive network effectively improves image reconstruction quality, reduces training time, avoids gradient attenuation, and improves learning efficiency.
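The abstract describes the key ingredients only at a high level: local feature fusion inside each residual block, global fusion of all block outputs, a long residual path, an L1 cost function, the ADAM optimizer, and DIV2K training data. The paper's exact architecture is not reproduced here; the following PyTorch sketch is only an illustration under those stated assumptions, with all module names, block counts, channel widths, and hyperparameters chosen for demonstration rather than taken from the paper.

```python
# Minimal sketch of a super-resolution network with local and global residual
# feature fusion, trained with an L1 loss and the Adam optimizer, as the
# abstract describes. Block/channel counts and kernel sizes are illustrative
# assumptions, not the paper's settings.
import torch
import torch.nn as nn


class LocalFusionBlock(nn.Module):
    """Residual block whose intermediate features are fused by a 1x1 conv."""

    def __init__(self, channels: int = 64):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.relu = nn.ReLU(inplace=True)
        # Local feature fusion: concatenate intermediate features, reduce back to `channels`.
        self.fuse = nn.Conv2d(2 * channels, channels, 1)

    def forward(self, x):
        f1 = self.relu(self.conv1(x))
        f2 = self.relu(self.conv2(f1))
        fused = self.fuse(torch.cat([f1, f2], dim=1))
        return x + fused  # local residual connection


class GlobalResidualRecursiveNet(nn.Module):
    """Stack of fusion blocks with global feature fusion and a long skip connection."""

    def __init__(self, num_blocks: int = 8, channels: int = 64, scale: int = 2):
        super().__init__()
        self.head = nn.Conv2d(3, channels, 3, padding=1)
        self.blocks = nn.ModuleList(
            [LocalFusionBlock(channels) for _ in range(num_blocks)]
        )
        # Global feature fusion: concatenate every block output, reduce to `channels`.
        self.global_fuse = nn.Conv2d(num_blocks * channels, channels, 1)
        self.upsample = nn.Sequential(
            nn.Conv2d(channels, channels * scale * scale, 3, padding=1),
            nn.PixelShuffle(scale),
            nn.Conv2d(channels, 3, 3, padding=1),
        )

    def forward(self, lr):
        shallow = self.head(lr)
        feats, x = [], shallow
        for block in self.blocks:
            x = block(x)
            feats.append(x)
        # Global residual: fused deep features plus the shallow features.
        deep = self.global_fuse(torch.cat(feats, dim=1)) + shallow
        return self.upsample(deep)


# Training-step outline with the L1 cost function and the Adam optimizer;
# DIV2K data loading is omitted, random tensors stand in for one patch pair.
model = GlobalResidualRecursiveNet()
criterion = nn.L1Loss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

lr_patch = torch.randn(1, 3, 48, 48)   # stand-in for a DIV2K low-resolution crop
hr_patch = torch.randn(1, 3, 96, 96)   # corresponding high-resolution crop (x2)

optimizer.zero_grad()
loss = criterion(model(lr_patch), hr_patch)
loss.backward()
optimizer.step()
```

The point of the two fusion levels is that each block fuses its own intermediate features before its local skip connection, and the network fuses all block outputs before the global skip connection; the resulting short gradient paths are the standard way such designs mitigate the gradient attenuation the abstract mentions.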
Authors: ZHANG Lei; HU Bo-wen; ZHANG Ning; WANG Mao-sen (College of Electronic Information Engineering, Shenyang Aerospace University, Shenyang 110136, China; Space Electronic Technology Research Institute in Shanghai, Shanghai 201109, China)
Source: Computer Science (《计算机科学》), CSCD, Peking University Core Journal, 2019, No. B06, pp. 230-233 (4 pages)
Funding: National Natural Science Foundation of China (61671037); Shanghai Aerospace Science and Technology Innovation Fund (SAST2016090)
Keywords: Image super-resolution; L1 cost function; Global residual recursive network; ADAM optimizer; DIV2K training set