In this paper, we present a new method to automatically translate touch stimulation into controllable reflex reactions for articulated robots, such as a humanoid robot. Our work is motivated by the need to automate the reaction setup for an increasing number of our multi-modal artificial sensor skin units (HEX-O-SKIN). Our method evaluates the effect of isolated, sinusoidal degree-of-freedom (DoF) movements, executed around the robot's current pose, on the motion of each sensor unit. We do this by exploiting data from a 3-axis accelerometer, one of the multiple modalities located on each of our sensor units (SUs). The direction and amplitude measured along all three axes enable us to generate signed weights, one per DoF and SU, leading to the partial construction of a sensory-motor map. In this paper, we focus on reactions to lateral sensory modalities, such as a distance sensor. A higher acceleration along the surface normal therefore leads to a higher lateral weight, while unwanted sideways movement decreases it. We then define a reaction controller at the level of each sensor unit. The sensory-motor map is used to project these local reactions, generated by units distributed all over the robot, into the robot's motor space. Through activation, inhibition, or inversion, these low-level controller instances can be adapted to a given context. Here, we show experiments with a KUKA lightweight robot arm reacting evasively or aggressively to contact sensed by a lateral distance sensor, emulating the sensation of light touch. In contrast to related work, our method does not suffer from occlusion, complex touch situations, or long calibration times.
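The weight-generation and reaction-mapping steps described above can be sketched as follows. This is a minimal, hypothetical illustration, not the authors' implementation: it assumes a recorded 3-axis acceleration time series per sensor unit during each sinusoidal DoF excitation, a known surface normal per unit, and a simple penalty for sideways motion; all function names and the exact weighting formula are our own assumptions.

```python
import numpy as np

def lateral_weight(acc, normal, excitation):
    """Signed weight for one (DoF, sensor-unit) pair.

    acc:        (T, 3) accelerations of the sensor unit, recorded while
                the DoF performs an isolated sinusoidal movement.
    normal:     (3,) unit surface normal of the sensor unit.
    excitation: (T,) sinusoidal DoF command, used to sign the response.
    """
    along = acc @ normal                        # motion along the surface normal
    sideways = acc - np.outer(along, normal)    # residual sideways motion
    # Correlate with the excitation to obtain a signed amplitude.
    signed_amp = along @ excitation / (np.linalg.norm(excitation) + 1e-9)
    # Unwanted sideways movement shrinks the weight toward zero.
    penalty = np.linalg.norm(sideways, axis=1).mean()
    return signed_amp - np.sign(signed_amp) * penalty

def reflex_command(weights, activations, gain=1.0):
    """Project local reaction activations into the robot's motor space.

    weights:     (n_dof, n_units) sensory-motor map of signed weights.
    activations: (n_units,) local controller outputs (e.g. distance sensor).
    gain:        > 0 for an evasive reaction, < 0 to invert it (aggressive),
                 0 to inhibit it entirely.
    """
    return gain * weights @ activations
```

In this sketch, activation, inhibition, and inversion of a reaction reduce to choosing the sign and magnitude of `gain`, while the map `weights` itself stays fixed once identified.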