Abstract
This paper proposes a spatio-temporal stereo matching method for stereo video of dynamic scenes. Affine-transformation motion models of consecutive video frames establish the temporal link between frames. On top of colour segmentation, a smoothness constraint on the disparity variation of overlapped segments across consecutive frames, combined with the geometric coherence between the left and right views, selectively refines the disparities. This suppresses the mismatches and disparity jumps caused by unstable colour segments during frame-by-frame matching, so the resulting disparity video is more accurate and stable over time. Experimental results show that the algorithm effectively reduces disparity jitter, corrects mismatches introduced by colour segmentation, and yields accurate disparity videos.
This paper presents a novel method for estimating temporally consistent disparity maps for stereo video sequences. The proposed method incorporates the color constancy constraint and the geometric coherence constraint in a non-iterative stereo matching process. It uses the disparity smoothness of overlapped segments between adjacent frames, based on an affine transformation, to enforce temporal consistency, and a similarity check based on spatial geometric coherence refines the disparity result. Experimental results show that the proposed algorithm noticeably reduces "popping artifacts" and over-segmentation errors, and yields satisfactory disparity video results.
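The temporal refinement described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a known 2x3 affine motion matrix per frame, a precomputed segmentation label map, and a hypothetical per-segment tolerance `tol` standing in for the paper's smoothness constraint.

```python
import numpy as np

def warp_disparity(prev_disp, affine):
    """Warp the previous frame's disparity into the current frame using a
    2x3 affine motion matrix (nearest-neighbour inverse mapping)."""
    h, w = prev_disp.shape
    ys, xs = np.mgrid[0:h, 0:w]
    A = np.vstack([affine, [0.0, 0.0, 1.0]])          # lift to 3x3
    src = np.linalg.inv(A) @ np.stack(
        [xs.ravel(), ys.ravel(), np.ones(h * w)])     # inverse map (x, y)
    sx = np.clip(np.round(src[0]).astype(int), 0, w - 1)
    sy = np.clip(np.round(src[1]).astype(int), 0, h - 1)
    return prev_disp[sy, sx].reshape(h, w)

def temporal_refine(cur_disp, prev_disp, segments, affine, tol=1.0):
    """Per-segment temporal smoothing: where the warped previous disparity of
    a colour segment agrees with the current estimate (within `tol`), average
    the two to damp frame-to-frame jitter; otherwise trust the current map."""
    warped = warp_disparity(prev_disp, affine)
    out = cur_disp.astype(float).copy()
    for s in np.unique(segments):
        mask = segments == s
        if abs(np.median(out[mask]) - np.median(warped[mask])) <= tol:
            out[mask] = 0.5 * (out[mask] + warped[mask])  # smoothness holds
    return out
```

In this sketch, segments whose disparity changed more than `tol` between frames (genuine motion or a segmentation change) are left untouched, while stable segments are temporally averaged, which is one simple way to realise the selective optimisation the abstract describes.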
Source
《电路与系统学报》
CSCD
Peking University Core Journals
2012, No. 6, pp. 6-13 (8 pages)
Journal of Circuits and Systems
Funding
National Natural Science Foundation of China (61271339)
Zhejiang Provincial Public Welfare Technology Application Research Program (2010C31108)
Zhejiang Provincial Natural Science Foundation (LY12F01020)
Keywords
stereo matching
disparity
affine transformation
temporal consistency