Micromanipulation systems have recently been receiving increased attention. Teleoperated or automated micromanipulation is a challenging task because high-frequency position or force feedback is needed to guarantee stability. In addition, integrating sensors within micromanipulation platforms is complex. Vision is a commonly used sensing solution; unfortunately, the update rate of the frame-based acquisition process of currently available cameras cannot, at reasonable cost, ensure stable automated or teleoperated control at the microscale, where low inertia produces highly dynamic phenomena. This paper presents a novel vision-based microrobotic system combining an asynchronous address-event representation silicon retina and a conventional frame-based camera. Unlike frame-based cameras, recent artificial retinas transmit their output as a continuous stream of asynchronous temporal events, in a manner similar to the output cells of a biological retina, enabling high update rates. This paper introduces an event-based iterative closest point (ICP) algorithm that tracks a microgripper's position at a frequency of 4 kHz. The temporal precision of the asynchronous silicon retina provides haptic feedback to assist users during manipulation tasks, whereas the frame-based camera retrieves the position of the object to be manipulated. The paper reports an experiment in which a sphere of about 50 μm in diameter is teleoperated with a piezoelectric gripper in a pick-and-place task.
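The core idea of event-based ICP can be illustrated with a simplified 2D sketch: each incoming retina event is matched to the nearest point of a shape model under the current pose estimate, and the pose is re-estimated from small batches of matches via a least-squares rigid fit. This is an illustrative assumption, not the paper's implementation; the function names, the batch-based update, and the Kabsch-style fit are choices made here for clarity.

```python
import numpy as np

def estimate_rigid_2d(src, dst):
    """Least-squares 2D rigid transform (R, t) mapping src onto dst (Kabsch method)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)          # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:                     # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def event_based_icp(model, events, batch=20):
    """Track a 2D shape from an event stream (illustrative sketch).

    model  : (M, 2) contour points of the tracked shape (e.g. a gripper tip)
    events : iterable of (x, y) event coordinates from the silicon retina
    batch  : number of events accumulated before each pose update
    """
    pose_R, pose_t = np.eye(2), np.zeros(2)
    matched_model, matched_events = [], []
    for ex, ey in events:
        ev = np.array([ex, ey], dtype=float)
        moved = model @ pose_R.T + pose_t        # model under the running pose
        i = np.argmin(np.sum((moved - ev) ** 2, axis=1))  # nearest model point
        matched_model.append(model[i])
        matched_events.append(ev)
        if len(matched_events) >= batch:         # periodic rigid re-fit
            pose_R, pose_t = estimate_rigid_2d(np.array(matched_model),
                                               np.array(matched_events))
            matched_model, matched_events = [], []
    return pose_R, pose_t
```

Because each event triggers only a nearest-neighbor lookup, and the expensive rigid fit runs once per small batch, this style of update can sustain far higher rates than frame-based ICP, which is the property the paper exploits to reach kilohertz tracking.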