Soft Tactile Sensor With Multimodal Data Processing for Texture Recognition

In this letter, a soft tactile device with an alternative design approach is presented for texture recognition tasks using multimodal data processing. The device integrates multiple layers and sensing elements in a soft structure. The top layer is covered with a flexible piezoresistive material, while the bottom layer comprises a soft case with an embedded seven-axis chip capable of measuring acceleration, angular velocity, and pressure data. The soft tactile sensor is validated on texture recognition tasks using data collected from five textures slid over the sensor by a robotic arm. These experiments are key to validating and characterizing the sensor design, analyzing both individual and combined piezoresistive, accelerometer, and angular velocity signals with Bayesian methods. The results show that the recognition accuracy achieved by the sensor depends on the type and combination of data modalities. The highest accuracy of 99.43% is achieved by combining piezoresistive and accelerometer data, while the lowest accuracy of 90.12% is obtained with angular velocity data alone. Overall, this work shows that the proposed multimodal soft tactile sensor can improve the performance of recognition tasks through the systematic use of multimodal data.


I. INTRODUCTION
An artificial sense of touch is an essential component for developing robots able to safely interact with, perceive, and explore their surrounding environment [1], [2]. To achieve robust perception, living creatures use a plethora of sensors that together provide a sophisticated, robust, and complex sensory system spanning vision, touch, taste, and hearing [3], [4]. For example, specialized sensors such as mechanoreceptors individually provide limited and highly specific signals in response to particular stimuli; by combining the outputs of multiple sensing modalities, however, richer information can be inferred. This biological multimodal sensing process represents an engineering challenge to replicate artificially, given the variety and quantity of information and the number of sensors needed to reproduce the capabilities of human skin [5], [6].
Tactile systems can be broadly divided into two categories: artificial tactile fingertips and artificial skin. Tactile fingertips are localized and predominantly used for manipulation tasks, whereas artificial skin is distributed and normally characterized by lower sensitivity owing to the limited number of sensors. Technologies and strategies for both categories vary, and depending on the specific sensing system there are advantages and disadvantages to balance. Advances in materials and electronics have allowed researchers to develop tactile systems that integrate different sensing elements, including vibration, pressure, temperature, orientation, and even embedded vision sensors [7], [8], [9], [10], [11], [12], providing an effective medium for processing tactile information.
This letter presents a soft tactile device comprising sensing elements that provide angular velocity, accelerometer, piezoresistive, and pressure data. The sensing elements are integrated in a soft material and interfaced to a microcontroller for data acquisition. The tactile device is evaluated with texture recognition experiments in which individual and combined sensor data are analyzed to determine whether multiple signals from different sensor modalities (multimodal sensing) can lead to better performance in terms of accuracy and speed. A group of five textures (plastic bubbles, foam, paper, soft material, and fabric, see Fig. 1) attached to a robot arm are slid in one direction over the soft sensor for systematic data collection and analysis. The results show that the piezoresistive element achieves the highest accuracy of 99.38% when individual sensors are used. This performance is improved by combining piezoresistive and accelerometer data, achieving an accuracy of 99.43%. The proposed tactile sensor is capable of exploiting the benefits of multimodal data processing for texture recognition, and it also has potential for object exploration, grasping, human-robot interaction, and biomedical applications.
The rest of this letter is organized as follows. The sensor design and data collection are described in Section II. The experiments and results are presented in Section III. Finally, Section IV concludes this letter.

II. METHODS
The multimodal soft tactile sensor is designed to measure acceleration, angular velocity, and pressure data. The sensing elements are embedded and mounted on a soft structure, as shown in Fig. 2.

A. Multimodal Soft Tactile Sensor Design
Soft structure: The tactile device uses a soft structure built with Ecoflex 00-50 rubber material, which has become popular for the development of soft sensors [13], [14], [15]. The soft component has a rectangular and rounded shape of dimensions 35 mm × 25 mm × 15 mm, as shown in Fig. 2(a). The soft structure has a rectangular cavity in its center to embed an inertial measurement unit (IMU), which is mounted on a small printed circuit board (PCB). The top and lateral sides of the soft structure have a series of trails to mount conductive threads required for the piezoresistive sensing element.
Sensing elements: The soft structure of the sensor embeds a seven-axis IMU (ICM-20789, TDK InvenSense) that can measure acceleration and angular velocity data in the x-, y-, and z-axes, together with barometric pressure data, when the sensor interacts with objects [16], [17], [18]. This sensing element, of dimensions 4 mm × 4 mm × 1.4 mm, is located in the center of the soft tactile device, as shown in Fig. 2(a). The tactile sensor is covered with a pressure-sensitive conductive sheet (Velostat [19], [20], [21]) and conductive threads (4 rows × 4 columns) forming a matrix of 16 piezoresistive contact points [see Fig. 2(b)]. The conductive threads are connected to a PCB that converts the resistance changes measured from the Velostat material into voltages and sends them to an Arduino Mega2560 microcontroller.
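The resistance-to-voltage conversion for the piezoresistive matrix can be sketched as below. This is a minimal illustration only: the letter does not give the circuit details, so the voltage-divider topology, the 10 kΩ reference resistor, and the 10-bit ADC range are all assumptions for the example.

```python
# Hypothetical sketch of reading the 4x4 Velostat matrix. The 5 V supply
# comes from the letter; the divider topology, reference resistor value,
# and 10-bit ADC resolution are assumptions, not the authors' circuit.
V_IN = 5.0        # supply voltage applied to the piezoresistive matrix
R_REF = 10_000.0  # assumed fixed reference resistor (ohms)
ADC_MAX = 1023    # assumed 10-bit Arduino ADC full-scale count

def adc_to_resistance(adc_value: int) -> float:
    """Estimate the resistance of one Velostat contact point.

    Assumes the taxel and R_REF form a voltage divider with the ADC
    measuring across R_REF: V_out = V_IN * R_REF / (R_REF + R_taxel).
    """
    v_out = V_IN * adc_value / ADC_MAX
    if v_out <= 0:
        return float("inf")  # open circuit: no contact pressure
    return R_REF * (V_IN - v_out) / v_out

def scan_matrix(readings):
    """Convert a 4x4 grid of raw ADC counts to resistances (ohms)."""
    return [[adc_to_resistance(v) for v in row] for row in readings]
```

Lower resistance then corresponds to higher contact pressure on the Velostat sheet, which is the signal change exploited during the sliding experiments.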

B. Signal Conditioning Interface
All the signals from the multimodal sensing elements are routed to a PCB designed for signal conditioning. This interface, shown in Fig. 2(c), supplies the 5 V applied to the piezoresistive matrix and the 1.8 V needed by the IMU module. The blue and green regions of the PCB contain the conditioning circuitry and input voltages for the two sensing modalities. The yellow region is not used in this work but has been included for future use of piezoelectric materials. The signal conditioning board sends all the data to an Arduino Mega2560 microcontroller using the I2C communication protocol together with analog and digital pins. These data are systematically collected and used for subsequent processing and analysis. The complete multimodal tactile sensor connected to the signal conditioning interface is shown in Fig. 2(d).

III. EXPERIMENTS AND RESULTS

A. Sensor Data Collection
The multimodal soft tactile sensor is placed on a table to collect data from textures manually attached to a UR3 robot arm. The robot is programmed to perform a sliding exploratory procedure along the y-axis (length of 35 mm) on the soft sensor at a speed of 35 mm/s for a duration of 20 s, collecting multimodal data from each texture (see Figs. 1 and 3). This sliding process is repeated five times for each texture to create three training datasets and two testing datasets for subsequent analysis. The plots in Fig. 3 show examples of piezoresistive, accelerometer, and gyroscope data collected from the five textures used for recognition tasks (plastic bubbles, foam, paper, soft material, and fabric) [22]. Barometric data are not used in this work, since visual inspection showed that this signal did not change while the textures were slid on the sensor. However, barometric-based pressure data play a key role in a variety of robotic applications such as grasping, manipulation, and human-robot interaction.
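The organization of the five sliding repetitions per texture into three training and two testing datasets can be sketched as follows. The split ratio comes from the letter; the per-slide sample count and the 22-channel layout (16 taxels + 3 accelerometer + 3 gyroscope axes) are illustrative assumptions, and the synthetic arrays merely stand in for the real sensor streams.

```python
import numpy as np

rng = np.random.default_rng(0)

TEXTURES = ["plastic bubbles", "foam", "paper", "soft material", "fabric"]
SAMPLES_PER_SLIDE = 200  # assumed number of samples per 20 s slide
N_CHANNELS = 22          # assumed: 16 taxels + 3 accel + 3 gyro axes

def split_repetitions(recordings):
    """Split 5 repetitions per texture into 3 training and 2 testing sets.

    recordings: dict mapping texture name -> list of 5 arrays,
    one array per sliding repetition.
    """
    train, test = {}, {}
    for texture, reps in recordings.items():
        assert len(reps) == 5, "expected five sliding repetitions"
        train[texture] = reps[:3]  # three training datasets
        test[texture] = reps[3:]   # two testing datasets
    return train, test

# Synthetic placeholder data standing in for the recorded sensor streams
recordings = {t: [rng.normal(size=(SAMPLES_PER_SLIDE, N_CHANNELS))
                  for _ in range(5)]
              for t in TEXTURES}
train, test = split_repetitions(recordings)
```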

B. Sensor Data Analysis for Texture Recognition
Tactile recognition of textures with individual and combined data modalities is implemented using a probabilistic Bayesian network approach, which has been used for texture recognition with a variety of sensors and robots [23], [24], [25], [26]. The Bayesian network iteratively combines the prior probability and the likelihood from sensor measurements to estimate the posterior probability of the texture explored by the soft sensor, as follows:

P(c_n | z_t) = P(z_t | c_n) P(c_n) / P(z_t | z_{t-1})    (1)

where P(c_n|z_t) is the posterior probability of class c_n ∈ C = {plastic bubbles, foam, paper, soft material, fabric}, with N = 5 classes (or textures), and P(z_t|c_n) is the likelihood of a sensor measurement z_t belonging to a particular class c_n at time t. The sensor measurement z_t is formed according to two scenarios: 1) individual sensor data (e.g., piezoresistive, three-axis accelerometer, or three-axis gyroscope data) and 2) concatenated sensor data (e.g., piezoresistive, three-axis accelerometer, and three-axis gyroscope data). Uniform prior probabilities, P(c_n) = P(c_n|t_0) = 1/N, are used at t = 0, assigning the same probability to each class or texture. For sensor measurements at t > 0, the prior probability is updated by the posterior probability from time t − 1 as P(c_n) = P(c_n|z_{t−1}), and the histogram of z_t is used as the likelihood. Normalized posterior probabilities are ensured using the marginal probability, as follows:

P(z_t | z_{t-1}) = Σ_{n=1}^{N} P(z_t | c_n) P(c_n | z_{t-1})    (2)

where P(z_t|z_{t-1}) is the marginal probability.

Fig. 3. Examples of piezoresistive, accelerometer, and gyroscope data from five different textures for recognition tasks. Each texture is attached on the robot arm to perform a sliding exploratory procedure along one direction (y-axis) on the soft tactile sensor for systematic data collection.
The iterative updating process implemented by (1) stops when the posterior probability P(c_n|z_t) exceeds a belief threshold β_threshold, as follows:

if any P(c_n|z_t) > β_threshold, then ĉ = argmax_{c_n} P(c_n|z_t)    (3)

where ĉ is the estimated texture, and β_threshold ∈ [0, 1] is set manually in 0.1 steps to identify the threshold value that achieves the highest texture recognition accuracy with the soft tactile sensor.
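The iterative Bayesian update with histogram likelihoods and threshold-based stopping described above can be sketched as follows. This is a minimal scalar illustration: the bin count, Laplace smoothing, and synthetic training data are our assumptions, not the letter's exact implementation.

```python
import numpy as np

rng = np.random.default_rng(1)
CLASSES = ["plastic bubbles", "foam", "paper", "soft material", "fabric"]
N_BINS = 20  # assumed number of histogram bins
BETA = 0.9   # belief threshold reported in the letter

def fit_histograms(train, lo, hi):
    """Per-class histogram likelihoods P(z|c) over a shared bin range."""
    edges = np.linspace(lo, hi, N_BINS + 1)
    hists = {}
    for c, samples in train.items():
        counts, _ = np.histogram(samples, bins=edges)
        hists[c] = (counts + 1) / (counts + 1).sum()  # Laplace smoothing
    return edges, hists

def recognize(stream, edges, hists, beta=BETA):
    """Update the posterior sample by sample until it exceeds beta.

    Returns (estimated texture, number of samples used); the second
    value corresponds to the reaction time in data samples.
    """
    posterior = {c: 1.0 / len(hists) for c in hists}  # uniform prior
    for t, z in enumerate(stream, start=1):
        b = int(np.clip(np.searchsorted(edges, z) - 1, 0, N_BINS - 1))
        for c in posterior:
            posterior[c] *= hists[c][b]       # prior x likelihood
        total = sum(posterior.values())       # marginal P(z_t | z_{t-1})
        posterior = {c: p / total for c, p in posterior.items()}
        if max(posterior.values()) > beta:    # stopping rule
            break
    return max(posterior, key=posterior.get), t
```

With well-separated class distributions, the posterior typically crosses the threshold after only a few samples, which mirrors the short reaction times reported for the velostat data.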

C. Texture Recognition With Multimodal Tactile Sensing
The multimodal soft tactile sensor is tested in two scenarios for texture recognition using piezoresistive (velostat), three-axis accelerometer, and three-axis gyroscope data. In the first scenario, the sensor measurement z_t is prepared using data from the velostat, accelerometer, and gyroscope individually. The bar plot in Fig. 4(a) shows that velostat data alone achieve a high recognition accuracy of 99.38% (0.616% error), compared with accuracies of 93.02% (6.98% error) and 90.12% (9.87% error) using the y-axis component of the accelerometer and gyroscope data alone, respectively. In the second scenario, the four sensor data combinations velostat-accel, velostat-gyro, velostat-accel-gyro, and accel-gyro are used for the measurement z_t. The highest recognition accuracy of 99.43% is achieved by combining velostat and y-axis accelerometer (velostat-accel) data [see Fig. 4(a)], which slightly improves on velostat data alone. The velostat-gyro and velostat-accel-gyro combinations also achieve high accuracies of 99.37% and 99.35%, respectively, above the use of the accelerometer and gyroscope alone. These high accuracies are obtained using the y-axis component of the accelerometer and gyroscope. The only combination of sensor data that does not show improvement is accelerometer and gyroscope (accel-gyro). In all cases, data from the x- and z-axes of the accelerometer and gyroscope do not improve the recognition process, which might be related to the sliding exploratory procedure being performed along the y-axis only. The results from the experiments are obtained by setting β_threshold = 0.9 in the probabilistic method. The high performance of the velostat might be due to its location on top of the soft sensor, which allows direct contact with the slid textures to extract relevant data, whereas the reduced performance of the gyroscope and accelerometer might be due to their being embedded in a lower layer of the soft sensor.
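One way to form the likelihood for a concatenated measurement z_t (e.g., velostat-accel) is to treat its channels as conditionally independent and multiply their per-channel histogram likelihoods. This naive-Bayes fusion is our assumption for illustration; the letter states that the modalities are concatenated but does not specify the exact fusion rule.

```python
import numpy as np

N_BINS = 20  # assumed number of histogram bins per channel

def channel_hist(samples, lo, hi):
    """Smoothed histogram likelihood for one channel of one class."""
    counts, edges = np.histogram(samples, bins=N_BINS, range=(lo, hi))
    return (counts + 1) / (counts + 1).sum(), edges  # Laplace smoothing

def fused_likelihood(z, channel_models):
    """Likelihood of a multi-channel measurement under one class.

    z: 1-D array with one value per channel (e.g., velostat + y-accel).
    channel_models: list of (probs, edges) pairs, one per channel.
    Assumes conditional independence of channels (naive Bayes).
    """
    like = 1.0
    for value, (probs, edges) in zip(z, channel_models):
        b = int(np.clip(np.searchsorted(edges, value) - 1, 0, N_BINS - 1))
        like *= probs[b]
    return like
```

The fused likelihood then plugs directly into the same recursive posterior update and stopping rule used for the individual modalities.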
The mean reaction times (mean number of data samples) needed by the tactile sensor to make a decision about the texture being explored, using individual and combined sensor data, are shown in Fig. 4(b). The results show that the accelerometer and gyroscope alone require a mean of 41.88 and 69.61 data samples, respectively, to achieve their corresponding recognition accuracies. The combination of these two signals (accel-gyro) improves the reaction time to a mean of 15.22 data samples. Velostat alone recognizes textures quickly and accurately, with a mean of 1.3 data samples. However, this reaction time is still slightly improved by the velostat-accel, velostat-gyro, and velostat-accel-gyro combinations, which require a mean of 1.28, 1.26, and 1.26 data samples, respectively, to accurately recognize the five textures used in the experiments.
The highest and lowest recognition accuracies for each texture, achieved by velostat-accel combined data and gyroscope data alone, respectively, are presented in the confusion matrices in Fig. 4(c) and (d). The combination of velostat and accelerometer (velostat-accel) sensor data provides the best recognition accuracy for the belief threshold β_threshold = 0.9. The confusion matrix shows that the foam and paper textures are recognized with 100% accuracy and are not confused with any other texture. Soft material is recognized with 99.75% accuracy and confused with paper 0.25% of the time. Fabric is recognized with 99.55% accuracy and confused with plastic bubbles 0.45% of the time. The lowest accuracy of 97.02% is observed with the plastic bubbles texture, which is confused with fabric 2.98% of the time. Using gyroscope data alone, the tactile sensor cannot reliably recognize the foam (52.29% accuracy) and fabric (1.19% accuracy) textures.
The experiments show that velostat data alone achieves high recognition accuracy, which is related to its location on the sensor (top of the soft structure) that allows direct contact with the explored textures. Both the accuracy and reaction time have been improved by combining velostat and accelerometer (velostat-accel) data, while combining velostat and gyroscope (velostat-gyro) improved the reaction time only. Overall, combining different data modalities offered by the proposed soft tactile sensor can improve the performance of texture recognition in terms of accuracy and speed.

IV. CONCLUSION
This letter presented a multimodal soft tactile sensor that can provide accelerometer, gyroscope, piezoresistive, and pressure data in a single device. The sensor and combination of data modalities were tested with texture recognition tasks using a sliding exploratory approach with a UR3 robot arm. The results showed that the combination of velostat and accelerometer sensor data offers the optimal texture recognition performance with the proposed soft tactile device (99.43% accuracy and mean reaction time of 1.28 data samples). Overall, the proposed soft tactile sensor offers an alternative device capable of measuring and combining multimodal data for robotic applications, such as object exploration and recognition, human-robot interaction, grasping, manipulation, and biomedical applications.

DATA AVAILABILITY
Data created during this research work is openly available from the University of Bath Research Data Archive at https://doi.org/10.15125/BATH-01303.