This paper proposes a multimodal communication method for human-friendly robot partners based on various types of sensors. First, we explain an informationally structured space that extends the cognitive capabilities of robot partners through environmental systems. Next, we discuss the suitable measurement ranges for recognition technologies such as touch interfaces, voice recognition, human detection, and gesture recognition. Based on these measurement ranges, we propose an integration method that estimates human behaviors by combining human detection using color images and 3-D distance information with gesture recognition by a multilayered spiking neural network applied to the time series of human-hand positions. Furthermore, we propose a conversation system that realizes multimodal communication with a person. Finally, we present several experimental results of the proposed method and discuss the future direction of this research.
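The abstract's gesture-recognition component feeds a time series of hand positions into a multilayered spiking neural network. The paper's actual network architecture is not reproduced here; the following is only a minimal sketch of the leaky integrate-and-fire mechanism that spiking networks of this kind build on, with a hypothetical input sequence standing in for features derived from hand motion.

```python
# Minimal leaky integrate-and-fire (LIF) spiking-neuron sketch.
# Assumption: this illustrates only the basic spiking mechanism, not the
# paper's multilayered network. The membrane potential integrates inputs,
# decays over time, and emits a spike when a threshold is crossed.

def lif_spikes(inputs, decay=0.9, threshold=1.0):
    """Return a 0/1 spike train for a sequence of input currents."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = decay * potential + current  # leaky integration
        if potential >= threshold:
            spikes.append(1)      # fire
            potential = 0.0       # reset after firing
        else:
            spikes.append(0)
    return spikes

# Hypothetical input: magnitudes of successive hand displacements,
# as might be derived from a 3-D hand-position time series.
train = lif_spikes([0.2, 0.3, 0.6, 0.1, 0.9, 0.4])
```

In a full network, the spike trains of one layer become the inputs of the next, so temporal patterns in the hand trajectory are encoded as patterns of spike timing rather than raw coordinates.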
Note: In the print edition, this paper appears with the incorrect publication title in the running head. The correct publication title is IEEE TRANSACTIONS ON SYSTEMS, MAN, AND CYBERNETICS-PART C: APPLICATIONS AND REVIEWS.
IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews (Volume: 42, Issue: 6)
Date of Publication: Nov. 2012