I. Introduction
The future intelligent factory will be built on the harmonious coexistence of humans and robots, which requires robots to complete different tasks together with humans. As a result, humans face the threat of potential collisions with robots, and it is therefore necessary to address the safety issues of human-robot collaboration [1]. Industrial robots require various sensing systems to monitor potential collision objects in their surroundings and to collaborate with humans in a safe workspace, as shown in Fig. 1. Currently, the commonly used safety perception systems mainly include visual, proximity, and tactile sensing systems [2]–[9]. Visual safety systems based on visual sensors and depth cameras [10]–[15] can effectively detect collisions between robot and human using 3D models and depth images in an augmented reality environment. However, current systems are either too sensitive to collisions, which may cause false alarms, or too complex and expensive to install. Compared with visual sensors, tactile sensors made of flexible materials can be integrated on robot arms to detect contact forces. For instance, Mu et al. proposed a micro-electro-mechanical-system-based acoustic wave sensor with dual-mode (Lamb wave mode and surface acoustic wave mode) behavior for reading out pressure changes [8]. Ji et al. proposed a large-area flexible tactile sensing array (TSA) based on the self-powered triboelectric sensing effect [9]. The TSA is attached to the robot arm to trigger a safety emergency stop whenever sudden unexpected contact occurs. However, it is more effective and attractive to detect unknown objects with proximity sensors before a collision happens.
Fig. 1. Human-robot collaboration working environment and application scenarios of human-robot safety sensors.