Data fusion for robotic assembly tasks based on human skills

4 Author(s)
Cortesao, R. (Inst. of Syst. & Robotics, Coimbra Univ., Portugal); Koeppe, R.; Nunes, U.; Hirzinger, G.

This work describes a data fusion architecture for robotic assembly tasks based on human sensory-motor skills. These skills are transferred to the robot through geometric and dynamic perception signals, with artificial neural networks used in the learning process. The data fusion paradigm consists of two independent modules for optimal fusion and filtering; Kalman techniques linked to stochastic signal evolutions are used in the fusion algorithm. Compliant motion signals obtained from vision and pose sensing are fused, enhancing task performance. Simulations and peg-in-hole experiments are reported.
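The abstract does not give the fusion equations, but a Kalman-style fusion of two compliant-motion measurements of the same quantity (e.g. a vision signal and a pose signal) can be sketched as below. This is a minimal illustrative example, not the authors' architecture: the random-walk state model, the noise values, and the function name `kalman_fuse` are all assumptions made for the sketch.

```python
import numpy as np

def kalman_fuse(x, P, z_vision, z_pose, R_vision, R_pose, Q=1e-4):
    """One predict/update cycle of a scalar Kalman filter that fuses
    two noisy measurements (vision and pose) of the same quantity.

    Illustrative sketch only: assumes a random-walk state model with
    process noise Q and independent sensor noises R_vision, R_pose.
    """
    # Predict step: state assumed constant plus process noise.
    x_pred = x
    P_pred = P + Q

    # Update step with a stacked measurement vector [z_vision, z_pose].
    H = np.array([[1.0], [1.0]])     # both sensors observe x directly
    R = np.diag([R_vision, R_pose])  # independent sensor noise covariance
    z = np.array([z_vision, z_pose])

    S = H @ np.array([[P_pred]]) @ H.T + R          # innovation covariance
    K = np.array([[P_pred]]) @ H.T @ np.linalg.inv(S)  # Kalman gain, shape (1, 2)

    innov = z - H[:, 0] * x_pred                    # innovation (residual)
    x_new = x_pred + (K @ innov).item()
    P_new = (1.0 - (K @ H).item()) * P_pred
    return x_new, P_new

# Usage: both sensors report the same true value with different noise levels;
# the fused estimate converges with lower variance than either sensor alone.
x, P = 0.0, 1.0
for _ in range(50):
    x, P = kalman_fuse(x, P, z_vision=1.0, z_pose=1.0,
                       R_vision=0.1, R_pose=0.1)
```

Fusing both measurements in a single stacked update (rather than sequentially) weights each sensor by the inverse of its noise covariance, which is the standard optimal-fusion property the abstract alludes to.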

Published in:

IEEE Transactions on Robotics (Volume 20, Issue 6)