This video presents a new method for mobile, unobtrusive interaction with bipedal robots using the Orient on-body, fully wireless motion capture system. During the robot's learning phase, data for motions such as waving the hands, standing on one leg, and performing sit-ups and squats is captured from a human operator wearing the Orient specks. Key features are extracted from the captured motion data using unsupervised learning algorithms. During subsequent interactions with the robot, the motion of the human operator, speckled with Orients, is classified on-line and the robot plays back the closest matching motion. This approach is particularly useful when the robot operates with a well-defined vocabulary of motions; its advantages are the speed with which new robot motion behaviour can be programmed (a matter of minutes, compared to a heuristics-based approach) and the mobility that this mode of interaction affords compared to a camera-based method. Related papers have compared the performance of three unsupervised learning algorithms, c-means, k-means and expectation maximisation (EM), for the four motion scenarios described above and for walking. The video is also available at http://www.specknet.org/about/edinburgh/DKArvindMBartosik_RO-MAN2009.mpeg.
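The on-line classification step described above (selecting the motion in the vocabulary closest to the operator's current movement) can be illustrated with a minimal nearest-centroid sketch. This is not the authors' implementation: the feature vectors, motion labels, and function names below are hypothetical stand-ins for the descriptors extracted from the Orient data, and a single mean vector per motion class is used in place of the clusterings produced by c-means, k-means, or EM.

```python
def squared_dist(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def learn_centroids(training):
    """Compute one centroid (mean feature vector) per motion class.

    `training` maps a motion label to a list of feature vectors;
    the format is a hypothetical stand-in for features extracted
    from captured Orient motion data.
    """
    centroids = {}
    for label, vectors in training.items():
        n = len(vectors)
        centroids[label] = tuple(sum(col) / n for col in zip(*vectors))
    return centroids

def classify(sample, centroids):
    """Return the motion label whose centroid is closest to `sample`."""
    return min(centroids, key=lambda lbl: squared_dist(sample, centroids[lbl]))

# Synthetic two-dimensional features standing in for real motion descriptors.
training = {
    "wave":  [(0.9, 0.1), (1.1, 0.0), (1.0, 0.2)],
    "squat": [(0.1, 0.9), (0.0, 1.1), (0.2, 1.0)],
}
centroids = learn_centroids(training)
print(classify((0.95, 0.05), centroids))  # a wave-like sample classifies as "wave"
```

In the actual system the learning phase would replace `learn_centroids` with one of the compared unsupervised algorithms, but the decision rule at interaction time has the same shape: score the incoming feature vector against each learned motion model and play back the nearest one.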