Abstract:
For a robot to compete at the game of air hockey, it must track the fast-moving puck and react quickly through its control system. Event-cameras can be used to solve the visual tracking task, overcoming the motion blur and high processing requirements that arise when using traditional RGB cameras. Each pixel of an event-camera responds independently to changes in light, resulting in high-frequency, low-latency updates of the puck position. A vision-in-the-loop robot controller can then maintain stability at much faster movements. In this paper, we introduce a control loop for an iCub robot to follow the position of the puck with its head motion. We evaluate the accuracy and stability of the iCub's motion as the latency of the tracked position is varied from 1 ms to 30 ms (achievable in real time with the event-camera), eventually resulting in control failure. We conclude that the event-driven tracking paradigm is an enabling technology for unlocking smooth, dynamic robot motions from vision, for air hockey and tasks beyond.
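To illustrate the low-latency update idea described in the abstract, the following is a minimal sketch of estimating a puck position from an event stream. It assumes a simple centroid over events in a short time window; the paper's actual tracking algorithm is not specified here, and `track_puck`, the `(x, y, t)` event tuples, and the window length are all hypothetical.

```python
def track_puck(events, window_us=1000):
    """Hypothetical sketch: estimate puck position as the centroid of
    the (x, y) coordinates of events within the most recent time window.
    `events` is a list of (x, y, t) tuples with t in microseconds,
    ordered by time. Returns (x, y) or None if there are no events."""
    if not events:
        return None
    t_latest = events[-1][2]
    # Keep only events close in time to the newest one; each incoming
    # event can trigger a new estimate, giving high-frequency updates.
    recent = [(x, y) for (x, y, t) in events if t_latest - t <= window_us]
    n = len(recent)
    return (sum(x for x, _ in recent) / n,
            sum(y for _, y in recent) / n)

# Synthetic events scattered around a puck centred at (120, 80)
events = [(120 + dx, 80 + dy, t)
          for t, (dx, dy) in enumerate([(-1, 0), (1, 0), (0, -1), (0, 1)])]
pos = track_puck(events)  # → (120.0, 80.0)
```

Because each pixel fires asynchronously, an estimate like this can be refreshed per event rather than per frame, which is what allows the millisecond-scale position latencies the paper evaluates.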
Published in: 2022 8th International Conference on Event-Based Control, Communication, and Signal Processing (EBCCSP)
Date of Conference: 22-24 June 2022
Date Added to IEEE Xplore: 18 August 2022