Abstract:
Fine-grained tactile perception of objects is significant for robots exploring unstructured environments. Recent years have seen the success of Convolutional Neural Network (CNN)-based methods for tactile perception using high-resolution optical tactile sensors. However, CNN-based approaches may not be efficient for processing tactile image data and have limited interpretability. To this end, we propose a Graph Neural Network (GNN)-based approach for tactile recognition using a soft biomimetic optical tactile sensor. The obtained tactile images can be transformed into graphs, and a GNN can then be used to analyse the implicit tactile information within these tactile graphs. The experimental results indicate that with the proposed GNN-based method, the maximum tactile recognition accuracy can reach 99.53%. In addition, Gradient-weighted Class Activation Mapping (Grad-CAM) and Unsigned Grad-CAM (UGrad-CAM) methods are used for visual explanations of the models. Compared to traditional CNNs, we demonstrate that the features generated by the GNN-based model are more intuitive and interpretable.
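The image-to-graph-to-GNN pipeline summarised above can be illustrated with a minimal sketch. The snippet below assumes a PyTorch Geometric implementation in which marker positions detected in a tactile image become graph nodes connected by k-nearest-neighbour edges, and a small GCN classifies the resulting graph; the function names, layer sizes, and k-NN connectivity are illustrative assumptions, not the authors' released code.

```python
# Hypothetical sketch of the tactile-graph pipeline (not the paper's code).
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv, global_mean_pool, knn_graph


def tactile_image_to_graph(marker_xy: torch.Tensor, k: int = 6) -> Data:
    """Build a graph from (N, 2) marker coordinates detected in a tactile image."""
    edge_index = knn_graph(marker_xy, k=k)           # k-NN connectivity (assumption)
    return Data(x=marker_xy, edge_index=edge_index)  # node features = 2-D positions


class TactileGNN(torch.nn.Module):
    """Two GCN layers followed by mean pooling and a linear classification head."""

    def __init__(self, num_classes: int, hidden: int = 64):
        super().__init__()
        self.conv1 = GCNConv(2, hidden)
        self.conv2 = GCNConv(hidden, hidden)
        self.head = torch.nn.Linear(hidden, num_classes)

    def forward(self, data: Data) -> torch.Tensor:
        x = F.relu(self.conv1(data.x, data.edge_index))
        x = F.relu(self.conv2(x, data.edge_index))
        # Single-graph batch vector; pool node embeddings into one graph embedding.
        batch = torch.zeros(x.size(0), dtype=torch.long)
        return self.head(global_mean_pool(x, batch))


# Example: classify a single tactile frame with 100 detected markers.
graph = tactile_image_to_graph(torch.rand(100, 2))
logits = TactileGNN(num_classes=10)(graph)
```

Because the graph only retains marker nodes and their neighbourhood structure, Grad-CAM-style attributions computed on the node features map directly back to physical contact locations, which is consistent with the interpretability claim in the abstract.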
Date of Conference: 01-03 September 2022
Date Added to IEEE Xplore: 10 October 2022