Learning Actions from Observations

5 Author(s): Kruger, V. (Computer Vision and Machine Intelligence Lab (CVMI), Copenhagen Institute of Technology (CIT), Aalborg University); Herzog, D.; Baby, S.; Ude, A.; et al.

In the area of imitation learning, one of the important research problems is action representation. There has been a growing interest in expressing actions as a combination of meaningful subparts called action primitives. Action primitives can be thought of as elementary building blocks for action representation. In this article, we present a complete concept of learning action primitives to recognize and synthesize actions. One of the main novelties in this work is the detection of primitives in a unified framework that takes into account both objects and the actions applied to them. As the first major contribution, we propose an unsupervised learning approach for action primitives that makes use of human movements as well as object state changes. As the second major contribution, we propose using parametric hidden Markov models (PHMMs) for representing the discovered action primitives. PHMMs represent movement trajectories as a function of their desired effect on the object, and we discuss 1) how these PHMMs can be trained in an unsupervised manner, 2) how they can be used to synthesize movements that achieve a desired effect, and 3) how they can be used to recognize an action primitive and its effect from an observed acting human.
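The core PHMM idea in the abstract — a movement model whose emissions are a function of a desired-effect parameter, usable both to synthesize a movement for a target effect and to recognize the effect from an observed movement — can be sketched minimally. Everything below is illustrative, not the authors' implementation: the class name, the linear parameterization of the state means, and the left-to-right single-pass structure are assumptions, in the spirit of Wilson and Bobick's parametric HMMs.

```python
import numpy as np

class ToyPHMM:
    """Toy left-to-right Gaussian HMM whose emission means depend
    linearly on an effect parameter theta (illustrative sketch)."""

    def __init__(self, base_means, theta_dirs, sigma=0.1):
        self.base = np.asarray(base_means, float)  # mean trajectory for theta = 0
        self.dirs = np.asarray(theta_dirs, float)  # per-state variation direction
        self.sigma = sigma                         # shared emission noise

    def means(self, theta):
        # Emission means as a linear function of the effect parameter.
        return self.base + theta * self.dirs

    def synthesize(self, theta, seed=0):
        # Synthesis: generate a movement achieving the desired effect theta
        # by visiting each state once (left-to-right) and sampling its emission.
        mu = self.means(theta)
        rng = np.random.default_rng(seed)
        return mu + rng.normal(0.0, self.sigma, mu.shape)

    def log_likelihood(self, traj, theta):
        # One observation per state (no self-loops) keeps the likelihood
        # a simple sum of Gaussian log-densities, up to a constant.
        mu = self.means(theta)
        return -0.5 * np.sum((traj - mu) ** 2) / self.sigma ** 2

    def recognize(self, traj, candidates):
        # Recognition: recover the effect parameter by maximizing the
        # likelihood over a grid of candidate theta values.
        return max(candidates, key=lambda t: self.log_likelihood(traj, t))

phmm = ToyPHMM(base_means=np.linspace(0.0, 1.0, 5), theta_dirs=np.ones(5))
traj = phmm.synthesize(theta=0.4)
theta_hat = phmm.recognize(traj, candidates=np.linspace(0.0, 1.0, 101))
```

In this sketch, synthesis and recognition are two uses of the same parameterized emission model, mirroring points 2) and 3) of the abstract; the unsupervised training of point 1) (fitting `base_means` and `theta_dirs` from demonstrations) is omitted for brevity.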

Published in: IEEE Robotics & Automation Magazine (Volume: 17, Issue: 2)