Abstract
Multiple object tracking (MOT) poses many difficulties for conventional, well-studied single object tracking (SOT) algorithms, such as severe expansion of the configuration space, highly complex motion conditions, and visual ambiguities among nearby targets, among which the visual ambiguity problem is the central challenge. In this paper, we address this problem by embedding adaptive mixture observation models (AMOM) into a mixture tracker implemented in the particle filter framework. In AMOM, the multiple features extracted for appearance description are combined according to their discriminative power between ambiguity-prone objects, where the discriminability of each feature is evaluated by online entropy-based feature selection techniques. The introduction of AMOM helps overcome the inability of the conventional mixture tracker to handle object occlusions, while retaining its merits of flexibility and high efficiency. Experiments show significant improvement in MOT scenarios compared with other methods.
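The core idea of the abstract, scoring each appearance feature by how well it separates ambiguity-prone targets and weighting the observation-model mixture accordingly, can be sketched as follows. This is a minimal illustration, not the paper's exact criterion: the `feature_discriminability` helper, the Jensen-Shannon-style entropy score, and the normalization in `adaptive_weights` are all assumptions standing in for the paper's online entropy-based feature selection.

```python
import numpy as np

def feature_discriminability(samples_a, samples_b, bins=16):
    """Score one feature channel by how well it separates two
    ambiguity-prone targets. Uses an entropy-based measure
    (Jensen-Shannon divergence between the two sample histograms);
    the paper's actual selection criterion may differ."""
    lo = min(samples_a.min(), samples_b.min())
    hi = max(samples_a.max(), samples_b.max())
    p, _ = np.histogram(samples_a, bins=bins, range=(lo, hi))
    q, _ = np.histogram(samples_b, bins=bins, range=(lo, hi))
    # Smooth and normalize the histograms into distributions.
    p = (p + 1e-6) / (p + 1e-6).sum()
    q = (q + 1e-6) / (q + 1e-6).sum()
    m = 0.5 * (p + q)
    entropy = lambda d: -np.sum(d * np.log2(d))
    # JSD = entropy of the mixture minus mean of individual entropies;
    # larger means the feature separates the two targets better.
    return entropy(m) - 0.5 * (entropy(p) + entropy(q))

def adaptive_weights(scores):
    """Normalize per-feature discriminability scores into mixture
    weights for the observation model."""
    s = np.asarray(scores, dtype=float)
    return s / s.sum()
```

In an online setting, these weights would be recomputed each frame from samples drawn near the tracked targets, so the observation model adapts as the discriminative power of each feature changes.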
Funding
Supported by the National Natural Science Foundation of China (Grant No. 60573167), the National High-Tech Research and Development Program of China (Grant No. 2006AA01Z118), and the National Basic Research Program of China (Grant No. 2006CB303103).