
K-nearest neighborhood based integration of time-of-flight cameras and passive stereo for high-accuracy depth maps (cited by: 1)
Abstract: Both time-of-flight (ToF) cameras and passive stereo can provide depth information for the real scenes they capture, but each has innate limitations. ToF cameras and passive stereo are intrinsically complementary for certain tasks, so it is desirable to appropriately leverage all the information available from both. Although some fusion methods have been presented recently, they fail to consider ToF reliability detection and ToF-based improvement of passive stereo. This study therefore proposes an approach to integrating ToF cameras and passive stereo to obtain high-accuracy depth maps. The main contributions are: (1) an energy cost function is devised that uses data from ToF cameras to boost the stereo matching of passive stereo; (2) a fusion method is used to combine the depth information from both ToF cameras and passive stereo to obtain high-accuracy depth maps. Experiments show that the proposed approach achieves improved results with high accuracy and robustness.
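The two stated contributions can be illustrated with a minimal sketch. The function names, the quadratic ToF penalty term, and the confidence-weighted fusion rule below are illustrative assumptions, not the paper's exact energy function or KNN formulation:

```python
import numpy as np

def stereo_cost_with_tof(cost_volume, tof_disp, tof_conf, lam=0.5):
    """Augment a passive-stereo matching cost volume with a ToF prior.

    cost_volume: (H, W, D) photometric matching costs per candidate disparity.
    tof_disp:    (H, W) disparities predicted from the upsampled ToF depth.
    tof_conf:    (H, W) per-pixel ToF reliability in [0, 1].

    A quadratic penalty (an assumed form, not the paper's exact energy)
    pulls the stereo solution toward the ToF disparity wherever the
    ToF measurement is judged reliable.
    """
    H, W, D = cost_volume.shape
    d = np.arange(D)[None, None, :]                       # candidate disparities
    tof_term = tof_conf[..., None] * (d - tof_disp[..., None]) ** 2
    return cost_volume + lam * tof_term

def fuse_depths(stereo_depth, stereo_conf, tof_depth, tof_conf):
    """Per-pixel confidence-weighted fusion of the two depth maps
    (a simplified stand-in for the KNN-based fusion)."""
    w = tof_conf / (tof_conf + stereo_conf + 1e-8)        # ToF weight in [0, 1]
    return w * tof_depth + (1.0 - w) * stereo_depth

# Toy usage: with zero photometric cost everywhere, the ToF prior alone
# decides the winner-take-all disparity.
cv = np.zeros((4, 4, 5))
aug = stereo_cost_with_tof(cv, np.full((4, 4), 2.0), np.ones((4, 4)))
best = aug.argmin(axis=2)                                 # all pixels pick disparity 2
```

In practice `tof_conf` would come from a ToF reliability-detection step (e.g. based on amplitude and local depth variance), which is exactly the component the abstract says earlier fusion methods omit.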
Source: Journal of Zhejiang University-Science C (Computers and Electronics) (SCIE, EI), 2014, No. 3, pp. 174-186 (13 pages)
Funding: Project supported by the National Natural Science Foundation of China (Nos. 61072081 and 61271338); the National High-Tech R&D Program (863) of China (No. 2012AA011505); the National Science and Technology Major Project of the Ministry of Science and Technology of China (No. 2009ZX01033-001-007); the Key Science and Technology Innovation Team of Zhejiang Province (No. 2009R50003); and the China Postdoctoral Science Foundation (No. 2012T50545)
Keywords: Depth map; Passive stereo; Time-of-flight camera; Fusion

