Although sensor fusion techniques based on particle filters have been applied to object tracking, their implementations have typically combined measurements from multiple sensors by taking the simple product of the individual likelihoods. As a result, the number of observations grows in proportion to the number of sensors, and the combined observation may become unreliable through blind integration of sensor observations, especially when some sensors are too noisy and non-discriminative. We describe a methodology that models the interactions between multiple sensors and estimates the current state with a mixture of Bayesian filters, one filter per sensor, where each filter contributes at a different level so that the combined posterior is estimated reliably. In this framework, an adaptive particle arrangement system is constructed in which each particle is allocated to exactly one sensor for observation, and a different number of samples is assigned to each sensor based on the prior distribution and partial observations. We apply this technique to visual tracking in both logical and physical sensor fusion frameworks, and demonstrate its effectiveness through tracking results.
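The core idea, a mixture of per-sensor particle filters with adaptive sample allocation, can be sketched as follows. This is an illustrative toy on a 1-D state, not the authors' algorithm: the function name `mixture_filter_step`, the Gaussian likelihood model, and the rule that allocates samples in proportion to the mixture weights are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def gaussian_likelihood(z, x, sigma):
    """Toy per-sensor observation model (an assumption for this sketch)."""
    return np.exp(-0.5 * ((z - x) / sigma) ** 2)

def mixture_filter_step(particles, mix_weights, measurements, sigmas,
                        motion_std=0.3, n_total=1000):
    """One update of a mixture of per-sensor particle filters.

    Each sensor owns one mixture component; particles are allocated to
    exactly one sensor, with the per-sensor sample count drawn in
    proportion to the current mixture weights, and each particle is
    scored by its own sensor's likelihood only (no blind product over
    all sensors).
    """
    # Adaptive sample allocation: a different number of particles per sensor.
    counts = rng.multinomial(n_total, mix_weights)
    comps, masses = [], []
    for z, sigma, n_k in zip(measurements, sigmas, counts):
        if n_k == 0:
            comps.append(np.empty(0))
            masses.append(0.0)
            continue
        # Resample this component's share from the joint prior, then propagate.
        idx = rng.integers(0, len(particles), n_k)
        pred = particles[idx] + rng.normal(0.0, motion_std, n_k)
        w = gaussian_likelihood(z, pred, sigma)  # this sensor's likelihood only
        masses.append(w.sum())
        # Resample within the component according to its own weights.
        if w.sum() > 0:
            keep = rng.choice(n_k, n_k, p=w / w.sum())
            comps.append(pred[keep])
        else:
            comps.append(pred)
    # Update mixture weights from each component's likelihood mass
    # (a simple stand-in for the paper's prior/partial-observation rule).
    masses = np.asarray(masses)
    new_mix = masses / masses.sum() if masses.sum() > 0 else np.asarray(mix_weights)
    new_particles = np.concatenate(comps)
    estimate = new_particles.mean()  # point estimate of the combined posterior
    return new_particles, new_mix, estimate

# Example: two sensors observing the same (hypothetical) state, one sharp
# and one much noisier, starting from an equal-weight mixture.
particles = rng.normal(0.0, 1.0, 1000)
p, mix, est = mixture_filter_step(particles, np.array([0.5, 0.5]),
                                  measurements=[2.0, 2.0], sigmas=[0.2, 5.0])
```

After the step, the mixture weights differ between the two sensors, so the next allocation of particles is no longer uniform; in the paper this adaptation is driven by the prior distribution and partial observations rather than by this toy mass rule.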