Abstract:
With rapid advances in social robotics, humanoids, and autonomy, robot assistants appear to be within reach. However, robots are still unable to effectively interact with humans in activities of daily living. One of the challenges is the frequent use of multiple communication modalities when humans engage in collaborative activities. In this paper, we propose a Multimodal Interaction Manager, a framework for an assistive robot that maintains an active multimodal interaction with a human partner while performing physical collaborative tasks. The heart of our framework is a Hierarchical Bipartite Action-Transition Network (HBATN), which allows the robot to infer the state of the task and the dialogue given spoken utterances and observed pointing gestures from a human partner, and to plan its next actions. Finally, we implemented this framework on a robot to provide preliminary evidence that the robot can successfully participate in a task-oriented multimodal interaction.
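To make the HBATN idea concrete, the following is a minimal toy sketch of a bipartite state/action graph that advances a task state from multimodal input (a spoken utterance plus an optional pointed-at object). All names here (ToyHBATN, TaskState, infer_and_advance) are hypothetical illustrations, not the authors' implementation; the paper's actual network is hierarchical and handles dialogue state as well.

```python
# Illustrative sketch only: a flat, toy bipartite state/action graph in the
# spirit of an HBATN. All identifiers are hypothetical, not from the paper.
from dataclasses import dataclass, field

@dataclass
class TaskState:
    name: str
    # Actions available from this state, each mapping to a successor state;
    # alternating state -> action -> state edges make the graph bipartite.
    actions: dict = field(default_factory=dict)  # action label -> next state name

class ToyHBATN:
    def __init__(self):
        self.states = {}
        self.current = None

    def add_state(self, state):
        self.states[state.name] = state
        if self.current is None:
            self.current = state.name  # first state added is the start state

    def infer_and_advance(self, utterance, pointed_object=None):
        """Pick the outgoing action best supported by the multimodal input:
        a keyword match on the utterance, or grounding via a pointing gesture.
        Transition to the successor state and return the chosen action."""
        words = utterance.lower().split()
        state = self.states[self.current]
        for action, nxt in state.actions.items():
            spoken = all(w in words for w in action.split())
            pointed = pointed_object is not None and pointed_object in action.split()
            if spoken or pointed:
                self.current = nxt
                return action
        return None  # no match; a real system might ask a clarifying question

# Example: a two-step assembly subtask.
net = ToyHBATN()
net.add_state(TaskState("await_part", {"pick screw": "await_tool"}))
net.add_state(TaskState("await_tool", {"hand screwdriver": "done"}))
net.add_state(TaskState("done"))

print(net.infer_and_advance("please pick the screw", pointed_object="screw"))  # -> "pick screw"
print(net.infer_and_advance("now hand me the screwdriver"))                    # -> "hand screwdriver"
```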
Date of Conference: 03-08 November 2019
Date Added to IEEE Xplore: 28 January 2020