This paper focuses on two aspects of a human-robot interaction scenario: detection and tracking of moving objects (e.g., persons), which is necessary for localizing possible interaction partners, and reconstruction of the surroundings, which can be used for navigation and room categorization. Although these processes can be addressed independently of each other, we show that exchanging the available data between them enables a more accurate reconstruction of the static scene. A 6D data representation consisting of 3D Time-of-Flight (ToF) sensor data and computed 3D velocities allows segmenting the scene into clusters with consistent velocities. A weak object model is applied to localize and track objects within a particle filter framework. As a consequence, points belonging to moving objects can be excluded during reconstruction. Experiments demonstrate improved reconstruction results in comparison to pure bottom-up methods, especially for very short image sequences.
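As a rough illustration of the idea of excluding dynamic points from reconstruction, the following minimal sketch partitions a 6D point cloud (3D position plus 3D velocity) into static and dynamic subsets by thresholding the velocity magnitude. The function name, the threshold value, and the simple thresholding rule are assumptions for illustration only; the paper itself segments velocity-consistent clusters and tracks them with a particle filter, which this sketch does not implement.

```python
import numpy as np

def split_static_dynamic(points_6d, speed_thresh=0.05):
    """Partition 6D points (x, y, z, vx, vy, vz) into static and
    dynamic subsets by thresholding the velocity magnitude.
    speed_thresh is in the same units as the velocities (e.g. m/s).
    Hypothetical helper, not the paper's actual method."""
    points_6d = np.asarray(points_6d, dtype=float)
    speeds = np.linalg.norm(points_6d[:, 3:6], axis=1)
    static_mask = speeds < speed_thresh
    return points_6d[static_mask], points_6d[~static_mask]

# Example: two nearly static points and one moving point (a person).
cloud = np.array([
    [0.0, 0.0, 1.0, 0.00, 0.01, 0.00],   # static background
    [0.5, 0.2, 1.1, 0.01, 0.00, 0.00],   # static background
    [1.0, 0.0, 2.0, 0.40, 0.00, 0.10],   # moving object
])
static_pts, moving_pts = split_static_dynamic(cloud)
print(len(static_pts), len(moving_pts))  # 2 1
```

Only the static subset would then be passed on to the scene reconstruction, which is what allows short sequences containing moving persons to still yield a clean static map.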