Abstract:
Developing artificial agents able to autonomously discover interesting states of the environment, set them as goals, and learn the related skills and curricula is a paramount challenge for the deployment of robotic systems in real-world scenarios. Here robots must adapt to situations not foreseen at design time and eventually learn new skills to handle unexpected changes in the environment. In this work we present and test a cognitive architecture for robotic control that integrates different features and mechanisms from a developmental perspective. In particular, we propose to treat goal discovery as a specific motivation and to let the system autonomously select it through a motivation selector. Moreover, we propose to use intrinsic motivations (specifically, a measure of competence) to let the system autonomously regulate its exploration while learning curricula of interrelated goals. The robotic experiments presented here show the advantages of our approach in learning to achieve different goals with non-stationary interdependencies.
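The competence-based regulation of exploration mentioned above can be illustrated with a minimal sketch. The goal representation, the moving-average competence estimates, and the learning-progress selection rule below are assumptions chosen for illustration; they are not the architecture's actual implementation.

```python
import random

class CompetenceBasedSelector:
    """Illustrative sketch of competence-based intrinsic motivation:
    goals are selected by learning progress, i.e. how quickly the
    agent's competence on each goal is changing. All names and update
    rules here are assumptions, not the paper's implementation."""

    def __init__(self, goals, alpha=0.1):
        self.alpha = alpha                           # smoothing factor
        self.fast = {g: 0.0 for g in goals}          # fast success estimate
        self.slow = {g: 0.0 for g in goals}          # slow success estimate

    def update(self, goal, success):
        # Fast and slow exponential moving averages of success (0.0 or 1.0).
        self.fast[goal] += self.alpha * (success - self.fast[goal])
        self.slow[goal] += (self.alpha / 4) * (success - self.slow[goal])

    def learning_progress(self, goal):
        # Intrinsic signal: the gap between fast and slow estimates
        # approximates the current rate of competence change.
        return abs(self.fast[goal] - self.slow[goal])

    def select_goal(self, epsilon=0.1):
        # Mostly pick the goal with the highest learning progress,
        # with occasional random exploration.
        if random.random() < epsilon:
            return random.choice(list(self.fast))
        return max(self.fast, key=self.learning_progress)
```

With non-stationary goal interdependencies, a signal of this kind lets the agent abandon goals whose competence has plateaued and return to them when the environment changes and competence starts shifting again.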
Date of Conference: 20-23 May 2024
Date Added to IEEE Xplore: 27 August 2024