
3D Reconstruction Based on Optical Diffusion
Abstract: 3D reconstruction based on optical diffusion offers significant advantages: high-precision depth estimation with a small lens, depth estimation over a wide range including distant objects, the use of a single (monocular) camera, and no need to adjust camera parameters or the distance between camera and scene. However, no rigorous mathematical model has yet been proposed that relates depth variation to the intensity (energy) distribution during optical diffusion. In this paper, the heat diffusion equation from physics is introduced into optical diffusion, and a continuous model of the intensity distribution during the diffusion process is constructed; on this basis, a high-precision monocular method for recovering the global 3D depth information of a scene is proposed. First, the heat diffusion mechanism in physics is analyzed in detail, and a continuous energy-distribution model describing the diffusion imaging process is established. Second, combining the mathematical relationship between the image blur caused by optical diffusion (the spread of the point-spread function) and scene depth, a 3D shape reconstruction method based on optical diffusion is proposed. Finally, simulations with synthetic images and experiments with real images (five playing cards) verify the effectiveness and accuracy of the proposed method.
Source: Scientia Sinica (Technologica) [《中国科学:技术科学》], indexed in EI and CSCD (Peking University core journal), 2016, No. 2, pp. 204–212 (9 pages)
Funding: National Natural Science Foundation of China (Grant Nos. 61305025, 61473282); Fundamental Research Funds for the Central Universities (Grant No. N130504011)
Keywords: 3D reconstruction; optical diffusion; heat diffusion; energy distribution
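The abstract's core modeling idea — that defocus blur can be described by the heat diffusion equation, with the blur width (PSF spread) tied to diffusion time and hence to depth — can be illustrated numerically. The sketch below (not the paper's implementation; all parameter names and the thin-lens relation in the comments are illustrative assumptions) checks that evolving the 2-D heat equation for time t spreads intensity like convolution with a Gaussian of sigma = sqrt(2*c*t):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Heat-kernel view of defocus blur: evolving the heat equation
#   u_t = c * laplacian(u)
# for time t spreads intensity like convolution with a Gaussian
# of width sigma = sqrt(2*c*t). Estimating sigma from an image
# therefore recovers the diffusion time t = sigma**2 / (2*c).
#
# Illustrative thin-lens link (notation assumed, not from the paper):
# blur radius r = (D*s/2) * |1/f - 1/s - 1/u| and sigma ~ k*r,
# so the recovered sigma maps to the object depth u.

def heat_diffuse(img, c, t, dt=0.1):
    """Explicit finite-difference solution of the 2-D heat equation
    with periodic boundaries (stable for dt*c <= 0.25 on a unit grid)."""
    u = img.astype(float).copy()
    for _ in range(int(round(t / dt))):
        lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0)
               + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
        u += dt * c * lap
    return u

# Synthetic scene: a bright square patch on a dark background.
img = np.zeros((64, 64))
img[28:36, 28:36] = 1.0

c, t = 1.0, 2.0
sigma = np.sqrt(2.0 * c * t)              # equivalent Gaussian width (= 2.0)
diffused = heat_diffuse(img, c, t)        # heat-equation evolution
blurred = gaussian_filter(img, sigma, mode='wrap')  # Gaussian convolution

err = np.abs(diffused - blurred).max()    # agreement between the two models
```

The match between `diffused` and `blurred` is the numerical content of the continuous energy-distribution model: defocus imaging and heat diffusion share the same Gaussian kernel, so blur estimation and diffusion-time (depth) estimation are two views of one quantity.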
