Robot technology has advanced remarkably in recent years, yet robots still cannot perform emotional tasks, and the range of work they can do remains limited. From this perspective, it is important for machines to understand human emotion; emotion recognition is also necessary for building robots that interact intimately with people. This paper presents simulation results that classify emotions by learning four pitch patterns, together with the results of several analyses. The pitch contour carries emotional information, which is why pitch has been widely used for emotion recognition. However, because the pitch contour alone is not sufficient for recognizing emotion, other elements must be added. We therefore perform several analyses; for convenience, the analyzed elements are called acoustic elements. These elements form the basis for more accurate recognition. In addition, we analyze the relation between emotion and the acoustic elements. The brain is a high-dimensional nonlinear dynamical system, so it is essential to use a system capable of storing internal states and implementing complex dynamics; a DRNN (dynamic recurrent neural network) fits this requirement. The simulator is composed of the DRNN and a feature-extraction stage.
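To make the idea concrete, the following is a minimal sketch (an illustrative assumption, not the paper's actual model or training procedure) of how a recurrent network can read a pitch contour frame by frame, carry an internal state across frames, and output scores over four emotion classes. The weights here are random; the paper's DRNN would be trained on the four pitch patterns.

```python
import numpy as np

# Hypothetical Elman-style recurrent classifier for pitch contours.
# One input per frame (the F0 value), a recurrent hidden state that
# stores context across frames, and 4 output classes (emotions).
rng = np.random.default_rng(0)
n_in, n_hidden, n_classes = 1, 8, 4

W_in = rng.standard_normal((n_hidden, n_in)) * 0.1
W_rec = rng.standard_normal((n_hidden, n_hidden)) * 0.1
W_out = rng.standard_normal((n_classes, n_hidden)) * 0.1

def classify_contour(pitch_contour):
    """Run the recurrent net over a pitch contour; return class probabilities."""
    h = np.zeros(n_hidden)                   # internal state kept between frames
    for f0 in pitch_contour:                 # one pitch (F0) value per frame
        h = np.tanh(W_in @ np.array([f0]) + W_rec @ h)
    logits = W_out @ h
    exp = np.exp(logits - logits.max())      # softmax over the 4 emotion classes
    return exp / exp.sum()

contour = np.sin(np.linspace(0, np.pi, 20))  # toy rising-then-falling pitch pattern
probs = classify_contour(contour)
print(probs.shape, float(probs.sum()))
```

The point of the recurrence is that the hidden state `h` accumulates the shape of the whole contour, which is what lets the network distinguish pitch patterns rather than single pitch values.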