Abstract:
Event cameras are a new technology that can enable low-latency, fast visual sensing in dynamic environments towards faster robotic vision, as they respond only to changes in the scene and have a very high temporal resolution (< 1 μs). Moving targets produce dense spatio-temporal streams of events that do not suffer from the information loss “between frames” that can occur when traditional cameras are used to track fast-moving targets. Event-based tracking algorithms must follow the target position within the spatio-temporal data while rejecting the clutter events that arise as a robot moves in a typical office setting. We introduce a particle filter designed to be robust to the temporal variation that occurs as the camera and the target move with different relative velocities, which can lead to a loss of visual information and missed detections. The proposed system provides more persistent tracking than the prior state of the art, especially when the robot is actively following a target with its gaze. Experiments are performed on the iCub humanoid robot performing ball tracking and gaze following.
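To make the tracking idea concrete, the sketch below shows one generic particle-filter iteration over a short window of events: particles are diffused, weighted by nearby event density, and resampled. This is a minimal illustration, not the authors' algorithm; the function names, noise model, event-density likelihood, and all numbers are assumptions for the example.

```python
# Illustrative particle-filter step for event-based target tracking.
# NOT the paper's method; the likelihood and parameters are assumed.
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, sigma_pos=3.0):
    """Diffuse particle positions (pixels) to model target motion."""
    return particles + rng.normal(0.0, sigma_pos, particles.shape)

def weight(particles, events_xy, radius=5.0):
    """Score each particle by the number of recent events nearby."""
    # events_xy: (M, 2) array of (x, y) event coordinates in a short time window
    d = np.linalg.norm(particles[:, None, :] - events_xy[None, :, :], axis=2)
    w = (d < radius).sum(axis=1).astype(float) + 1e-9  # avoid all-zero weights
    return w / w.sum()

def resample(particles, w):
    """Resample particles in proportion to their weights."""
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# One filter iteration on hypothetical data:
particles = rng.uniform(0, 240, size=(200, 2))    # 200 particles on a 240x240 sensor
events_xy = rng.uniform(100, 140, size=(50, 2))   # events clustered near the target
particles = predict(particles)
w = weight(particles, events_xy)
estimate = (particles * w[:, None]).sum(axis=0)   # weighted mean as target estimate
particles = resample(particles, w)
print("estimated target position:", estimate)
```

In practice the likelihood and resampling scheme would need to handle clutter events and the temporal variation discussed in the abstract; the point of the sketch is only to show where event data enters a particle-filter update.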
Date of Conference: 24-28 September 2017
Date Added to IEEE Xplore: 14 December 2017
Electronic ISSN: 2153-0866