Abstract
A material positioning algorithm based on binocular vision was proposed to address the problem of three-dimensional spatial positioning when a mobile manipulator grasps materials. First, based on the principle of binocular vision imaging, a hand-eye matrix was constructed to complete the transformation from the binocular camera coordinate system to the robot coordinate system. Second, an attention mechanism was embedded into the CSP (cross stage partial) structure of the backbone network, yielding an improved YOLOv5s+CBAM (you only look once + convolutional block attention module) architecture; on a self-made material dataset it achieved an mAP (mean average precision) of 99.1% at 71 FPS (frames per second), enabling accurate detection of the material grasp-point position. Finally, to address erroneous depth information caused by stereo-matching errors at the grasp point, an improved SGBM (semi-global block matching) algorithm based on gray-value ranging was proposed to obtain depth information. Experiments show an average ranging error of 5.16 mm; combined with target detection, the three-dimensional coordinates of the material grasp point can be located.
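The geometric pipeline the abstract describes — recovering depth from stereo disparity, back-projecting the detected grasp point to a 3D camera-frame coordinate, then applying the hand-eye matrix to express it in the robot frame — can be sketched as below. This is a minimal illustration of the standard pinhole stereo model; the intrinsics, baseline, and hand-eye matrix are hypothetical placeholders, not the paper's calibration results or its improved SGBM method.

```python
import numpy as np

# Hypothetical left-camera intrinsics and stereo baseline (not the paper's values).
fx, fy = 800.0, 800.0      # focal lengths in pixels
cx, cy = 320.0, 240.0      # principal point in pixels
baseline = 0.06            # baseline between the two cameras, in metres

# Hypothetical 4x4 homogeneous hand-eye matrix mapping camera frame -> robot frame.
T_cam_to_robot = np.array([
    [1.0, 0.0, 0.0, 0.10],
    [0.0, 1.0, 0.0, 0.00],
    [0.0, 0.0, 1.0, 0.50],
    [0.0, 0.0, 0.0, 1.00],
])

def grasp_point_in_robot_frame(u, v, disparity):
    """Locate a grasp point detected at pixel (u, v) in the left image,
    given its stereo disparity in pixels, as a 3D robot-frame coordinate."""
    z = fx * baseline / disparity          # depth from disparity: Z = f*B/d
    x = (u - cx) * z / fx                  # back-project to the camera frame
    y = (v - cy) * z / fy
    p_cam = np.array([x, y, z, 1.0])       # homogeneous camera-frame point
    return (T_cam_to_robot @ p_cam)[:3]    # express in the robot frame

point = grasp_point_in_robot_frame(u=400.0, v=260.0, disparity=40.0)
```

In the paper's pipeline the pixel (u, v) would come from the YOLOv5s+CBAM detector and the disparity from the improved SGBM matching; here both are supplied as literal inputs for illustration.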
Authors
ZHAO Jie; WANG Zhi-cheng; HUANG Nan-hai; WANG Zhe (Mechanical and Electronic Engineering Institute, East China University of Technology, Nanchang 330013, China; Jiangxi Province New Energy Technology and Equipment Engineering Technology Research Center, Nanchang 330013, China)
Source
《科学技术与工程》 (Science Technology and Engineering)
Peking University Core Journal (北大核心)
2023, No. 18, pp. 7861-7867 (7 pages)
Funding
Open Fund of the Jiangxi Province New Energy Technology and Equipment Engineering Technology Research Center (JXNE2019-04)
Science and Technology Program of the Jiangxi Provincial Department of Science and Technology (20181BBE58006)
Key Project of the Science and Technology Cooperation Program of the Jiangxi Provincial Department of Science and Technology (20212BDH80008)
Keywords
binocular vision
material positioning
three-dimensional coordinates
grid structure
depth information