Abstract
Traditional visual odometry algorithms lack robustness and accuracy in camera pose estimation in dynamic scenes. To address this, a robust stereo visual odometry algorithm for dynamic scenes is proposed, based on scene flow and the principle that the relative spatial distances between static points are invariant. The algorithm first uses the initial camera pose estimate to compute the scene flow of matched point pairs between adjacent frames, builds a Gaussian mixture model over it, and preliminarily separates the feature points of dynamic and static objects in the scene. It then further separates the feature points on dynamic and static objects by exploiting the invariance of the relative spatial distances between static points and by matching virtual map point sets. Finally, a local map composed of the map points corresponding to the static feature points is generated, and, taking the matching of the virtual map point sets into account, the reprojection error is minimized to obtain an accurate camera pose estimate. Experiments on the TUM dataset and in a real-world scene show that the proposed algorithm robustly estimates high-precision camera poses in dynamic scenes.
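The two separation stages summarized in the abstract can be illustrated with a minimal sketch: first, a two-component Gaussian mixture over scene-flow magnitudes gives a preliminary static/dynamic split; second, candidate static points are refined by checking that their pairwise 3-D distances are preserved between frames. This is not the paper's implementation; all function names, the 1-D reduction of the scene flow, the EM initialization, and the thresholds are illustrative assumptions.

```python
import math

def fit_two_gaussians(values, iters=50):
    """Tiny 1-D EM for a two-component Gaussian mixture model.
    Returns (means, stds, weights). Initialization and iteration count
    are arbitrary choices for this sketch."""
    lo, hi = min(values), max(values)
    mu = [lo, hi]
    sigma = [max((hi - lo) / 4, 1e-6)] * 2
    w = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each value
        resp = []
        for v in values:
            p = [w[k] / (sigma[k] * math.sqrt(2 * math.pi))
                 * math.exp(-0.5 * ((v - mu[k]) / sigma[k]) ** 2)
                 for k in range(2)]
            s = sum(p) or 1e-12
            resp.append([pk / s for pk in p])
        # M-step: re-estimate means, stds, and weights
        for k in range(2):
            nk = sum(r[k] for r in resp) or 1e-12
            mu[k] = sum(r[k] * v for r, v in zip(resp, values)) / nk
            var = sum(r[k] * (v - mu[k]) ** 2 for r, v in zip(resp, values)) / nk
            sigma[k] = max(math.sqrt(var), 1e-6)
            w[k] = nk / len(values)
    return mu, sigma, w

def split_by_scene_flow(flow_mags):
    """Preliminary separation: label a point static (True) when the
    low-magnitude mixture component explains its scene flow better."""
    mu, sigma, w = fit_two_gaussians(flow_mags)
    static_k = 0 if mu[0] < mu[1] else 1
    labels = []
    for v in flow_mags:
        p = [w[k] / sigma[k] * math.exp(-0.5 * ((v - mu[k]) / sigma[k]) ** 2)
             for k in range(2)]
        labels.append(p[static_k] >= p[1 - static_k])
    return labels

def distance_invariance_check(pts_prev, pts_curr, candidates, tol=0.05):
    """Refinement: keep a candidate static point only if a majority of its
    pairwise 3-D distances to the other candidates are preserved between
    frames (static-point relative spatial distance invariance)."""
    kept = []
    for i in candidates:
        others = [j for j in candidates if j != i]
        preserved = sum(
            1 for j in others
            if abs(math.dist(pts_prev[i], pts_prev[j])
                   - math.dist(pts_curr[i], pts_curr[j])) < tol)
        if others and preserved >= len(others) / 2:
            kept.append(i)
    return kept
```

In a full pipeline the surviving static points would then feed the local map and the reprojection-error minimization; here the sketch stops at the separation, which is the part the abstract specifies in enough detail to reconstruct.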
Authors
Zhang Hexin, Xu Hui, Yao Erliang, Song Haitao, Zhao Xin (Missile Engineering College, Rocket Force University of Engineering, Xi'an 710025, China)
Source
Chinese Journal of Scientific Instrument (《仪器仪表学报》)
Indexed in: EI, CAS, CSCD, Peking University Core (北大核心)
2018, No. 9, pp. 246-254 (9 pages)
Funding
Supported by the Young Scientists Fund (Grant No. 61503393)
Keywords
dynamic scenes
stereo visual odometry (VO)
scene flow
static point relative spatial distance invariance
virtual map point