
A Double Knowledge Distillation Model for Remote Sensing Image Scene Classification (cited by 2)
Abstract: To improve the accuracy of lightweight Convolutional Neural Networks (CNNs) on Remote Sensing Image (RSI) scene classification, this paper designs a Double Knowledge Distillation (DKD) model that fuses Dual Attention (DA) with Spatial Structure (SS). First, a new DA module is constructed and embedded into ResNet101 and a purpose-designed lightweight CNN, which serve as the teacher and student networks respectively. Then, a DA distillation loss function is constructed to transfer the DA knowledge of the teacher network to the student network, strengthening its ability to extract local features from RSIs. Finally, an SS distillation loss function is constructed to transfer the semantic extraction ability of the teacher network to the student network in the form of spatial structure, enhancing its ability to represent the high-level semantics of RSIs. Comparative experiments on two standard datasets, AID and NWPU-45, show that with a 20% training ratio the distilled student network improves in accuracy by 7.69% and 7.39% respectively, and that it outperforms other methods while using fewer parameters.
Authors: 李大湘 (LI Daxiang), 南艺璇 (NAN Yixuan), 刘颖 (LIU Ying), School of Telecommunication and Information Engineering, Xi'an University of Posts and Telecommunications, Xi'an 710121, China
Source: Journal of Electronics & Information Technology (《电子与信息学报》, indexed in EI, CSCD, Peking University Core), 2023, Issue 10, pp. 3558-3567 (10 pages)
Funding: National Natural Science Foundation of China (62071379); Natural Science Foundation of Shaanxi Province (2017KW-013); Innovation Fund of Xi'an University of Posts and Telecommunications (CXJJYL2021055, YJGJ201902).
Keywords: Remote sensing image classification; Knowledge distillation; Dual Attention (DA); Spatial Structure (SS)
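
The abstract outlines two distillation terms, a dual-attention loss and a spatial-structure loss, transferred from a ResNet101 teacher to a lightweight student alongside the usual classification loss. The following is a minimal PyTorch sketch of that general idea only, assuming a common attention-transfer form for the DA loss and a pairwise position-similarity form for the SS loss; the TinyNet stand-ins, the helper names, and the weights alpha and beta are illustrative assumptions, not the paper's published definitions.

```python
# Hypothetical sketch of double knowledge distillation: a frozen teacher supervises a
# lightweight student through (1) an attention-map distillation loss and (2) a pairwise
# spatial-structure loss, in addition to cross-entropy on the labels.
import torch
import torch.nn as nn
import torch.nn.functional as F


def attention_map(feat: torch.Tensor) -> torch.Tensor:
    """Collapse a feature map (B, C, H, W) into a normalized spatial attention map (B, H*W)."""
    att = feat.pow(2).mean(dim=1)            # channel-wise energy, (B, H, W)
    return F.normalize(att.flatten(1), dim=1)


def attention_distill_loss(f_s: torch.Tensor, f_t: torch.Tensor) -> torch.Tensor:
    """L2 distance between student and teacher attention maps (teacher resized if needed)."""
    if f_s.shape[-2:] != f_t.shape[-2:]:
        f_t = F.adaptive_avg_pool2d(f_t, f_s.shape[-2:])
    return F.mse_loss(attention_map(f_s), attention_map(f_t))


def spatial_structure_loss(f_s: torch.Tensor, f_t: torch.Tensor) -> torch.Tensor:
    """Match pairwise position-position similarity, one relational view of spatial structure."""
    def structure(feat: torch.Tensor) -> torch.Tensor:
        x = feat.flatten(2)                  # (B, C, H*W)
        g = torch.bmm(x.transpose(1, 2), x)  # (B, H*W, H*W) similarity between positions
        return F.normalize(g.flatten(1), dim=1)
    if f_s.shape[-2:] != f_t.shape[-2:]:
        f_t = F.adaptive_avg_pool2d(f_t, f_s.shape[-2:])
    return F.mse_loss(structure(f_s), structure(f_t))


def distillation_step(student, teacher, images, labels, alpha=1.0, beta=1.0):
    """One training step: cross-entropy plus the two (assumed) distillation terms."""
    with torch.no_grad():
        t_feat, _ = teacher(images)          # teacher is frozen
    s_feat, s_logits = student(images)
    ce = F.cross_entropy(s_logits, labels)
    da = attention_distill_loss(s_feat, t_feat)
    ss = spatial_structure_loss(s_feat, t_feat)
    return ce + alpha * da + beta * ss


if __name__ == "__main__":
    class TinyNet(nn.Module):
        """Stand-in backbone returning (feature map, logits); the paper uses ResNet101 with a DA module as teacher."""
        def __init__(self, channels: int, num_classes: int = 45):
            super().__init__()
            self.backbone = nn.Sequential(
                nn.Conv2d(3, channels, 3, stride=2, padding=1), nn.ReLU(),
                nn.Conv2d(channels, channels, 3, stride=2, padding=1), nn.ReLU(),
            )
            self.head = nn.Linear(channels, num_classes)

        def forward(self, x):
            feat = self.backbone(x)
            return feat, self.head(feat.mean(dim=(2, 3)))

    teacher, student = TinyNet(64).eval(), TinyNet(16)
    images = torch.randn(4, 3, 64, 64)
    labels = torch.randint(0, 45, (4,))
    loss = distillation_step(student, teacher, images, labels)
    loss.backward()
    print(f"total loss: {loss.item():.4f}")
```

In practice the teacher would be a pretrained ResNet101 carrying the paper's DA module, and the losses would be applied at matched intermediate stages; the sketch uses a single feature map per network only to keep the example self-contained.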