Abstract
Existing methods for detecting apparent cracks in concrete structures suffer from low accuracy, low precision, and loss of detail information. To address these problems, this paper proposes an apparent crack detection method for concrete based on the HU-ResNet convolutional neural network (CNN). The HU-ResNet model is built on an improved U-Net, using a ResNet34 residual network pretrained on ImageNet as the encoder to retain crack detail information and accelerate network convergence. An scSE attention module is introduced to recalibrate the output features of the encoder and decoder blocks in both the spatial and channel dimensions, and a hypercolumn module fuses the feature maps output at each decoder stage to obtain more accurate semantic information and localization of crack images; a combined loss function further improves segmentation precision. Experimental results show that the pixel accuracy, Intersection over Union, and F1 score of the proposed model reach 0.9904, 0.6933, and 0.8166 respectively, outperforming traditional digital image models such as Canny and region growing as well as deep learning models such as FCN8s, U-Net, and U-ResNet, and yielding more accurate crack detection results.
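The abstract mentions a combined loss function without specifying its components. A common choice for crack segmentation, where crack pixels are heavily outnumbered by background, is a weighted sum of pixel-wise binary cross-entropy and a region-level soft Dice loss. The following is a minimal NumPy sketch under that assumption; the BCE + Dice combination and the `alpha` weight are illustrative, not taken from the paper:

```python
import numpy as np

def dice_loss(pred, target, eps=1e-7):
    # Soft Dice loss on probability maps: 1 - 2|P∩T| / (|P| + |T|).
    # Insensitive to class imbalance, so thin cracks are not swamped
    # by the dominant background class.
    inter = np.sum(pred * target)
    return 1.0 - (2.0 * inter + eps) / (np.sum(pred) + np.sum(target) + eps)

def bce_loss(pred, target, eps=1e-7):
    # Pixel-wise binary cross-entropy; clip to avoid log(0).
    pred = np.clip(pred, eps, 1.0 - eps)
    return -np.mean(target * np.log(pred) + (1 - target) * np.log(1 - pred))

def combined_loss(pred, target, alpha=0.5):
    # Weighted sum: BCE gives smooth per-pixel gradients,
    # Dice directly optimizes region overlap (related to IoU).
    return alpha * bce_loss(pred, target) + (1 - alpha) * dice_loss(pred, target)
```

A perfect prediction drives both terms toward zero, while an inverted prediction is penalized by both, which is the behavior a segmentation loss needs under severe foreground/background imbalance.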
Authors
XU Guozheng; LIAO Chencong; CHEN Jinjian; DONG Bin; ZHOU Yue
(School of Naval Architecture, Ocean and Civil Engineering, Shanghai Jiao Tong University, Shanghai 200240, China; School of Electronic Information and Electrical Engineering, Shanghai Jiao Tong University, Shanghai 200240, China; School of Civil Engineering, Southeast University, Nanjing 211189, China)
Source
Computer Engineering (《计算机工程》)
Indexed in: CAS; CSCD; Peking University Core Journals (北大核心)
2020, No. 11, pp. 279-285 (7 pages)
Funding
National Natural Science Foundation of China (51978399).