Spatial Asymmetry in Tactile Sensor Skin Deformation Aids Perception of Edge Orientation During Haptic Exploration


Abstract:

Upper-limb amputees rely primarily on visual feedback when using their prostheses to interact with others or objects in their environment. A constant reliance upon visual feedback can be mentally exhausting and does not suffice for many activities when line-of-sight is unavailable. Upper-limb amputees could greatly benefit from the ability to perceive edges, one of the most salient features of 3D shape, through touch alone. We present an approach for estimating edge orientation with respect to an artificial fingertip through haptic exploration using a multimodal tactile sensor on a robot hand. Key parameters from the tactile signals for each of four exploratory procedures were used as inputs to a support vector regression model. Edge orientation angles ranging from −90 to 90 degrees were estimated with an 85-input model having an R² of 0.99 and RMS error of 5.08 degrees. Electrode impedance signals provided the most useful inputs by encoding spatially asymmetric skin deformation across the entire fingertip. Interestingly, sensor regions that were not in direct contact with the stimulus provided particularly useful information. Methods described here could pave the way for semi-autonomous capabilities in prosthetic or robotic hands during haptic exploration, especially when visual feedback is unavailable.
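The regression setup described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it uses scikit-learn's SVR on synthetic stand-in data, since the paper's 85 tactile features (e.g., electrode impedance parameters from the four exploratory procedures) and trial data are not reproduced here.

```python
# Hypothetical sketch of edge-orientation estimation via support vector
# regression. All data below are synthetic stand-ins for the paper's
# 85 tactile-signal features and measured edge angles.
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Synthetic feature matrix: 200 exploration trials x 85 tactile features.
n_trials, n_features = 200, 85
X = rng.normal(size=(n_trials, n_features))

# Synthetic edge-orientation targets in [-90, 90] degrees, built as a
# noisy function of the features purely for demonstration.
w = rng.normal(size=n_features)
angles = np.clip(X @ w * 10.0, -90.0, 90.0)

# Standardize features, then fit an RBF-kernel SVR to predict the angle.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, epsilon=1.0))
model.fit(X, angles)

pred = model.predict(X)
rmse = np.sqrt(np.mean((pred - angles) ** 2))
print(f"training RMSE: {rmse:.2f} degrees")
```

In practice the model would be evaluated on held-out trials (the paper reports an R² of 0.99 and RMS error of 5.08 degrees for its 85-input model), and the feature vector would come from the tactile sensor rather than a random generator.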
Published in: IEEE Transactions on Haptics ( Volume: 7, Issue: 2, April-June 2014)
Page(s): 191 - 202
Date of Publication: 21 October 2013

PubMed ID: 24960552

1. Introduction

The intimate connection between an amputee and his or her upper-limb prosthesis brings together two complex systems that speak different languages at different timescales. Communication delays inherent to human-machine systems result from the necessary translation between the biological and artificial systems for both afferent and efferent signals [1]. The cognitive burden on an amputee can be minimized by making the prosthesis more intuitive to use and minimizing the details that the amputee must consider in light of such delays. Subtle details of control include determining which of the multitude of joints to actuate, when and how hard to grasp an object, and how to adjust fingertip forces to maintain stability during object grasp and tool use.
