This video illustrates some of the behavioral capabilities achieved using an implementation of the proposed GSM architecture on the conversational robot Ripley, as introduced in the video and as further described in the accompanying paper. The situation model acts as a "theatrical stage" in the robot's mind, populated with present, past, or imagined situations. A special triple-layered design enables bidirectional translation between sensory data and linguistic descriptions. In the video, you will see the robot:

1) Answering questions, executing motor commands, and verbalizing uncertainty: "What color are the objects?"

2) "Imagining" objects when informed of their existence through language, and talking about them without having seen them. Later, you will see it matching its sensory expectations against existing objects, and integrating sensory-derived information about the objects with language-derived information: "Imagine a blue object on the left!"

3) Remembering past events, resolving temporal referents, and answering questions about the past: "How big was the blue object when your head started moving?"

More videos of the system are available at the author's site.
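To make the idea of bidirectional translation between sensory data and language more concrete, here is a minimal, purely illustrative Python sketch of one attribute (color) of a situation-model object. All names and values here are assumptions for illustration; they are not taken from the paper's actual implementation. The key point it demonstrates is that the same categorical layer mediates both directions: classifying a sensed value into a hedged verbal description, and "imagining" a sensory expectation from a purely linguistic input.

```python
import math

# Categorical layer: color prototypes in a toy RGB space (illustrative values,
# not from the paper).
COLOR_PROTOTYPES = {
    "red": (0.9, 0.1, 0.1),
    "green": (0.1, 0.8, 0.2),
    "blue": (0.1, 0.2, 0.9),
}

def classify_color(rgb):
    """Sensory -> categorical: nearest prototype plus a confidence score."""
    label = min(COLOR_PROTOTYPES,
                key=lambda c: math.dist(rgb, COLOR_PROTOTYPES[c]))
    # Confidence falls off with distance to the winning prototype.
    confidence = 1.0 / (1.0 + math.dist(rgb, COLOR_PROTOTYPES[label]))
    return label, confidence

def describe(label, confidence):
    """Categorical -> linguistic: hedge the description when uncertain."""
    if confidence < 0.6:
        return f"It might be {label}, but I am not sure."
    return f"It is {label}."

def imagine(label):
    """Linguistic -> sensory expectation: instantiate the prototype so an
    object known only through language can later be matched against real
    sensor readings."""
    return COLOR_PROTOTYPES[label]

label, conf = classify_color((0.15, 0.25, 0.85))  # a noisy measured blue
print(describe(label, conf))
expected = imagine("blue")  # object introduced only through language
```

In this toy version, a noisy measurement near the blue prototype yields a confident "It is blue," while an ambiguous measurement midway between prototypes would produce the hedged form, loosely mirroring how the robot verbalizes uncertainty in the video.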