Abstract
As a landmark geometric active contour model (GACM), the Chan-Vese (C-V) model has attracted attention in image segmentation for its robustness to target occlusion and edge noise. However, the model usually handles complex heterogeneous images poorly, is sensitive to the initial position of the evolving curve, and has high computational complexity. Based on the property that the larger the absolute difference between the average gray values inside and outside the evolving curve, the closer the curve lies to the true target edge, an adaptive C-V image segmentation model guided by a gray-difference energy function is proposed. The model adaptively adjusts the movement trend of the evolving curve through a guidance function constructed from the average gray difference between the regions inside and outside the contour, so that the curve evolution proceeds within an effective narrow band. This ensures the local homogeneity of the gray-value computation in the regions inside and outside the contour and enhances the ability to capture target details, while also improving, to a certain extent, the computational speed of the model and its adaptability to the initial position of the contour. Extensive simulation experiments verify the effectiveness of the proposed model.
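The core idea in the abstract, that the larger the absolute difference between the mean gray values inside and outside the evolving contour, the closer the contour is to the true edge, can be sketched as follows. This is a hypothetical illustration of that guidance quantity only, not the paper's actual energy functional; the function name and the sign convention for the level set `phi` are assumptions.

```python
import numpy as np

def gray_difference_guidance(image, phi):
    """Hypothetical sketch: measure |c1 - c2|, the absolute difference
    between the mean intensities of the regions inside (phi >= 0) and
    outside (phi < 0) the evolving contour. Per the abstract's premise,
    a larger value indicates the contour is closer to the true edge."""
    inside = phi >= 0          # region enclosed by the zero level set
    outside = ~inside
    c1 = image[inside].mean() if inside.any() else 0.0   # mean inside
    c2 = image[outside].mean() if outside.any() else 0.0  # mean outside
    return abs(c1 - c2)
```

On a synthetic two-region image, a contour aligned with the true boundary yields a larger guidance value than a misaligned one, which is the property the proposed model exploits to steer the curve's motion.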
Source
《模式识别与人工智能》 (Pattern Recognition and Artificial Intelligence)
Indexed in: EI, CSCD, Peking University Core Journals (北大核心)
2015, No. 3, pp. 214-222 (9 pages)
Funding
National Natural Science Foundation of China (No. 41271422, 61402214)
Specialized Research Fund for the Doctoral Program of Higher Education (No. 20132136110002)
Doctoral Scientific Research Starting Foundation of Liaoning Province (No. 20121076)
General Scientific Research Project of the Education Department of Liaoning Province (No. L2011192, L2013405, L2013406)
Open Fund of the State Key Laboratory for Novel Software Technology (No. KFKT2011B11)
Open Project of the Key Laboratory of Intelligent Computing and Information Processing, Ministry of Education (Xiangtan University) (No. 2011ICIP06)
Keywords
C-V Active Contour Model, Gray Difference Energy Guidance Function, Image Segmentation, Contour Curve