Abstract
In an indoor visual positioning system, a Visual Map database is built in the offline phase to store image information, and the user's position is estimated in the online phase by comparing a query image with the images in the database. The offline database can be built by point-by-point sampling or by video-stream sampling. In either case, because adjacent database images are highly similar, a database built in the traditional way stores considerable redundancy, which increases the positioning time in the online phase. Exploiting the similarity between adjacent images in the Visual Map, this paper proposes a keyframe-based method for building a Visual-Depth Map that effectively reduces the size of the offline database. In the offline phase, a Kinect sensor acquires image information and depth information simultaneously; a keyframe-selection algorithm based on image similarity then filters the original image sequence into a keyframe sequence, from which the Visual-Depth Map is built. In the online phase, the user's query image is retrieved and matched against the image sequence in the Visual-Depth Map to find the most similar database image, and the EPnP algorithm is then applied for 2D-3D pose estimation to compute the user's position. Experimental results show that, compared with traditional methods, the proposed method effectively reduces the offline database size and the online positioning time while maintaining high positioning accuracy.
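The two technical steps named in the abstract (similarity-based keyframe filtering offline, and 2D-3D pose estimation with EPnP online) can be illustrated with a minimal sketch, assuming OpenCV. The ORB match ratio used as the similarity measure and the threshold `sim_threshold=0.6` are hypothetical placeholders, since the abstract does not specify the paper's actual similarity metric; `cv2.solvePnP` with `flags=cv2.SOLVEPNP_EPNP` is OpenCV's standard EPnP solver, used here in place of whatever implementation the authors employed.

```python
import cv2
import numpy as np

def select_keyframes(frames, sim_threshold=0.6):
    """Filter an image sequence into keyframes: a frame becomes a new
    keyframe when its feature-match similarity to the last keyframe
    drops below sim_threshold (hypothetical value and metric)."""
    orb = cv2.ORB_create()
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    keyframes = [frames[0]]
    kp_ref, des_ref = orb.detectAndCompute(frames[0], None)
    for frame in frames[1:]:
        kp, des = orb.detectAndCompute(frame, None)
        if des is None or des_ref is None:
            continue
        matches = matcher.match(des_ref, des)
        # Similarity: fraction of reference keypoints that found a match.
        similarity = len(matches) / max(len(kp_ref), 1)
        if similarity < sim_threshold:
            keyframes.append(frame)
            kp_ref, des_ref = kp, des
    return keyframes

def estimate_user_position(points_3d, points_2d, camera_matrix):
    """2D-3D pose estimation with OpenCV's EPnP solver (needs >= 4 points).
    points_3d: Nx3 world points recovered from the Visual-Depth Map
    (image features backed by Kinect depth); points_2d: Nx2 pixel
    coordinates of the corresponding features in the query image."""
    ok, rvec, tvec = cv2.solvePnP(
        points_3d.astype(np.float64),
        points_2d.astype(np.float64),
        camera_matrix, None,
        flags=cv2.SOLVEPNP_EPNP)
    if not ok:
        return None
    R, _ = cv2.Rodrigues(rvec)
    # Camera (user) position in world coordinates: C = -R^T t.
    return (-R.T @ tvec).ravel()
```

The keyframe filter is what shrinks the database: frames that remain similar to the last stored keyframe are discarded, so online retrieval has fewer candidates to search while coverage of the environment is preserved.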
Authors
MA Lin (马琳)
YANG Hao (杨浩)
TAN Xuezhi (谭学治)
FENG Guanyuan (冯冠元)
School of Electronics and Information Engineering, Harbin Institute of Technology, Harbin 150001, China
Source
Journal of Harbin Institute of Technology (《哈尔滨工业大学学报》)
Indexed in: EI, CAS, CSCD, Peking University Core Journals (北大核心)
2018, No. 11, pp. 23-31 (9 pages)
Funding
National Natural Science Foundation of China (61571162)
Ministry of Education - China Mobile Research Fund (MCM20170106)
Natural Science Foundation of Heilongjiang Province (F2016019)