Here we present a hybrid sensor system consisting of a time-of-flight depth-imaging camera combined with a traditional stereo camera system. The unit can be mounted at the ceiling and delivers a 2 1/2-dimensional dataset that describes the scene beneath it. The combination of two different depth-imaging principles enables the system to compensate for each individual sensor's weaknesses and thus improves overall reliability under a wide range of operating conditions. The system aims at the supervision of work areas, which is a necessary prerequisite for safe human-robot interaction in a collaborative workspace. In this contribution, we briefly describe application scenarios that benefit from employing our device. We also discuss the weaknesses of both time-of-flight and stereo imaging. Based on that, we describe our hybrid-sensor approach and provide a description of a suitable method for system calibration. Finally, we derive an algorithm for an adaptive combination of data from both sensors and describe it in more detail.
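To make the idea of adaptively combining the two depth sources concrete, the following is a minimal sketch of per-pixel, confidence-weighted depth fusion. It is not the authors' algorithm: the function name, the weighting scheme, and the convention that a confidence of zero marks an invalid measurement are all assumptions made for illustration.

```python
import numpy as np

def fuse_depth(tof_depth, stereo_depth, tof_conf, stereo_conf, eps=1e-6):
    """Fuse two depth maps by per-pixel confidence weighting.

    All inputs are 2-D arrays of the same shape. Confidences are
    non-negative; a confidence of 0 marks an invalid measurement
    (assumed convention for this sketch).
    """
    total = tof_conf + stereo_conf
    fused = np.where(
        total > eps,
        # weighted average where at least one sensor is valid
        (tof_conf * tof_depth + stereo_conf * stereo_depth)
        / np.maximum(total, eps),
        # neither sensor valid: mark the pixel as missing (0)
        0.0,
    )
    return fused

# Example: the ToF camera fails on the top-right pixel, the stereo
# system on the bottom-left; each gap is filled by the other sensor.
tof = np.array([[2.0, 0.0], [1.0, 3.0]])
stereo = np.array([[2.2, 1.5], [0.0, 3.0]])
tof_c = np.array([[1.0, 0.0], [1.0, 1.0]])
stereo_c = np.array([[1.0, 1.0], [0.0, 1.0]])
fused = fuse_depth(tof, stereo, tof_c, stereo_c)
```

Where both sensors agree and are confident, the result is simply their average; where one sensor drops out (e.g. stereo failing on texture-poor surfaces, or time-of-flight on strongly reflective ones), the other takes over. The adaptive scheme described in the paper would determine these confidences from the operating conditions rather than take them as given.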