Sensor data fusion from multiple cameras is an important problem for machine vision systems operating in complex, natural environments. We address the problem of fusing information from different sensors for 3D object tracking. We embed the democratic integration approach into a probabilistic framework and perform the fusion hierarchically: the information sources (cues) derived from each sensor are fused first, and the resulting per-sensor estimates are then fused across sensors. We compare different fusion architectures and different adaptation schemes. Experiments on 3D object tracking with three calibrated cameras show that adaptive hierarchical fusion improves tracking robustness and accuracy compared to a flat fusion strategy.
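A minimal sketch of the adaptive hierarchical fusion idea: cues within each camera are combined by weighted averaging, per-camera estimates are then fused across cameras, and, in the spirit of democratic integration, each weight adapts toward the cue's agreement with the fused result. The concrete numbers, the `1/(1+d)` agreement score, and the time constant `tau` are illustrative assumptions, not the paper's actual probabilistic formulation.

```python
import numpy as np

def fuse(estimates, weights):
    # Weighted average of 3D position estimates.
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return w @ np.asarray(estimates, dtype=float)

def update_weights(weights, estimates, fused, tau=5.0):
    # Democratic-integration-style adaptation: each weight is pulled
    # toward the cue's normalized agreement with the fused estimate
    # (leaky integration with time constant tau). The 1/(1+d) score
    # is an illustrative choice of agreement measure.
    dists = np.linalg.norm(np.asarray(estimates, dtype=float) - fused, axis=1)
    quality = 1.0 / (1.0 + dists)
    quality = quality / quality.sum()
    return weights + (quality - weights) / tau

# Hierarchical fusion: fuse cues within each camera first, then fuse
# the per-camera estimates across the three cameras. Toy data: the
# true target sits near (1, 0, 2); one cue in cam1 is an outlier.
cameras = {
    "cam0": [np.array([1.0, 0.0, 2.0]), np.array([1.1, 0.1, 2.0]),
             np.array([0.95, 0.0, 2.05])],
    "cam1": [np.array([0.9, 0.0, 2.1]), np.array([1.0, 0.05, 2.0]),
             np.array([3.0, 2.0, 0.0])],   # outlier cue
    "cam2": [np.array([1.0, 0.1, 1.9]), np.array([1.0, 0.0, 2.1]),
             np.array([1.05, 0.0, 2.0])],
}
cue_weights = {cam: np.full(len(cues), 1.0 / len(cues))
               for cam, cues in cameras.items()}
cam_weights = np.full(len(cameras), 1.0 / len(cameras))

for _ in range(20):                         # simulated tracking steps
    per_cam = []
    for cam, cues in cameras.items():
        est = fuse(cues, cue_weights[cam])              # cue-level fusion
        cue_weights[cam] = update_weights(cue_weights[cam], cues, est)
        per_cam.append(est)
    target = fuse(per_cam, cam_weights)                 # sensor-level fusion
    cam_weights = update_weights(cam_weights, per_cam, target)

print(target)               # fused 3D target position
print(cue_weights["cam1"])  # the outlier cue should be down-weighted
```

The hierarchy lets adaptation act at two levels: a single bad cue is suppressed inside its camera before it can corrupt the cross-camera estimate, which a flat (single-level) weighting cannot do as selectively.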