
Lane Detection Method Based on ResNet-ViT and Attention Mechanism
Abstract: Lane detection is a crucial perception task in autonomous driving. Current lane detection methods based on Convolutional Neural Networks (CNNs) suffer from slow inference and a limited ability to model the slender structure of lane lines. To overcome these limitations, this paper proposes a lane detection method based on ResNet-ViT and an attention mechanism. Specifically, the method first builds a ResNet backbone for feature extraction and incorporates the Vision Transformer (ViT) encoder structure into the backbone to improve the network's ability to model the slender structure of lane lines. Second, an auxiliary segmentation network is designed, into which a channel attention module is embedded to strengthen the network's learning of important channels; the auxiliary segmentation network and the backbone share some parameters to achieve weight sharing, improving the model's efficiency and generalization ability. Finally, the feature decoding stage adopts the idea of row anchor classification, predicting the position coordinates of lane lines along the row direction of the feature map and outputting images annotated with lane marker points. Experiments on the TuSimple dataset show that the proposed method achieves an accuracy of 96.04% and an inference speed of 98 frames per second, verifying its effectiveness.
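The abstract does not specify the exact form of the channel attention module embedded in the auxiliary segmentation network; a common choice for this role is a Squeeze-and-Excitation style block. The sketch below is a minimal NumPy illustration under that assumption; the function name and weight shapes (`w1`, `w2`, reduction ratio) are illustrative, not the paper's implementation.

```python
import numpy as np

def channel_attention(feat, w1, w2):
    """Squeeze-and-Excitation style channel attention on a (C, H, W) feature map.

    w1: (C//r, C) channel-reduction weights; w2: (C, C//r) expansion weights.
    Returns the feature map reweighted by a learned per-channel gate.
    """
    squeeze = feat.mean(axis=(1, 2))                 # global average pool -> (C,)
    hidden = np.maximum(w1 @ squeeze, 0.0)           # ReLU bottleneck -> (C//r,)
    gate = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))      # sigmoid gate -> (C,)
    return feat * gate[:, None, None]                # rescale each channel
```

With zero-initialized weights the gate is sigmoid(0) = 0.5 for every channel, so the block starts as a uniform down-scaling and learns to emphasize informative channels during training.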
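The row anchor classification idea in the decoding stage can be sketched as follows: for each predefined row of the feature map, the network scores a set of horizontal grid cells plus an extra "no lane" class, and the lane's x-coordinate is recovered per row. This NumPy sketch is an assumption-laden illustration (function name, tensor shapes, and the softmax-expectation decoding are not taken from the paper):

```python
import numpy as np

def decode_row_anchors(logits, img_width):
    """Decode per-row lane x-coordinates from row-anchor classification logits.

    logits: (num_rows, num_cols + 1) array; for each row anchor, scores over
    num_cols horizontal grid cells plus one trailing "no lane" class.
    Returns x coordinates in pixels (np.nan where no lane is predicted).
    """
    num_cols = logits.shape[1] - 1
    loc = logits[:, :num_cols]
    # softmax over the location bins only (excluding the "no lane" class)
    loc = loc - loc.max(axis=1, keepdims=True)
    prob = np.exp(loc) / np.exp(loc).sum(axis=1, keepdims=True)
    # expected bin index gives sub-cell precision; scale to pixel coordinates
    expected = (prob * np.arange(num_cols)).sum(axis=1)
    xs = (expected + 0.5) * img_width / num_cols
    # mark a row as empty when the "no lane" class wins the overall argmax
    xs[np.argmax(logits, axis=1) == num_cols] = np.nan
    return xs
```

Because each row reduces to a single classification over column cells rather than a dense per-pixel segmentation, this decoding is cheap, which is consistent with the high inference speed reported in the abstract.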
Authors: 何飞 (He Fei), 唐春晖 (Tang Chunhui)
Affiliation: University of Shanghai for Science and Technology (上海理工大学)
Source: Software Engineering and Applications (《软件工程与应用》), 2023, Issue 3, pp. 381-392 (12 pages)